WorldWideScience

Sample records for evolving software reengineering

  1. Evolving software reengineering technology for the emerging innovative-competitive era

    Science.gov (United States)

    Hwang, Phillip Q.; Lock, Evan; Prywes, Noah

    1994-01-01

    This paper reports on a multi-tool commercial/military environment combining software Domain Analysis techniques with Reusable Software and Reengineering of Legacy Software. It is based on the development of a military version for the Department of Defense (DOD). The integrated tools in the military version are: Software Specification Assistant (SSA) and Software Reengineering Environment (SRE), developed by Computer Command and Control Company (CCCC) for Naval Surface Warfare Center (NSWC) and Joint Logistics Commanders (JLC), and the Advanced Research Project Agency (ARPA) STARS Software Engineering Environment (SEE) developed by Boeing for NAVAIR PMA 205. The paper describes transitioning these integrated tools to commercial use. There is a critical need for the transition for the following reasons: First, to date, 70 percent of programmers' time is applied to software maintenance. The work of these users has not been facilitated by existing tools. The addition of Software Reengineering will also facilitate software maintenance and upgrading. In fact, the integrated tools will support the entire software life cycle. Second, the integrated tools are essential to Business Process Reengineering, which seeks radical process innovations to achieve breakthrough results. Done well, process reengineering delivers extraordinary gains in process speed, productivity and profitability. Most importantly, it discovers new opportunities for products and services in collaboration with other organizations. Legacy computer software must be changed rapidly to support innovative business processes. The integrated tools will provide commercial organizations important competitive advantages. This, in turn, will increase employment by creating new business opportunities. Third, the integrated system will produce much higher quality software than use of the tools separately. The reason for this is that producing or upgrading software requires keen understanding of extremely complex

  2. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost-effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost-effective manner than using older technologies. A beta version of the environment was released in March 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported re-engineering to be the primary concern of over four hundred of the top MIS executives.

  3. Hospital reengineering: an evolving management innovation: history, current status and future direction.

    Science.gov (United States)

    Walston, S L; Urden, L D; Sullivan, P

    2001-01-01

    This article summarizes six years of research on reengineering in hospitals and is the result of two national surveys and eighteen site visits to hospitals that engaged in reengineering in the 1990s. The research shows that actual hospital reengineering differs substantially from what was initially proposed by early promoters of reengineering. However, this evolved reengineering continues to be implemented by the majority of hospitals in the United States. The authors illustrate how extensive the reductions of managers and the changes of nursing models have been in the past six years. Data comparing changes in financial and cost competitiveness are also shown. The authors then explore the continued experiences of two early proponents of reengineering and find their competitive outcomes to be in contrast with their early statements. Finally, the authors suggest a number of factors that may bear on the success or failure of reengineering.

  4. The software-cycle model for re-engineering and reuse

    Science.gov (United States)

    Bailey, John W.; Basili, Victor R.

    1992-01-01

    This paper reports on the progress of a study which will contribute to our ability to perform high-level, component-based programming by describing means to obtain useful components, methods for the configuration and integration of those components, and an underlying economic model of the costs and benefits associated with this approach to reuse. One goal of the study is to develop and demonstrate methods to recover reusable components from domain-specific software through a combination of tools, which perform the identification, extraction, and re-engineering of components, and domain experts, who direct the application of those tools. A second goal of the study is to enable the reuse of those components by identifying techniques for configuring and recombining the re-engineered software. This component-recovery or software-cycle model addresses not only the selection and re-engineering of components, but also their recombination into new programs. Once a model of reuse activities has been developed, the quantification of the costs and benefits of various reuse options will enable the development of an adaptable economic model of reuse, which is the principal goal of the overall study. This paper reports on the conception of the software-cycle model and on several supporting techniques of software recovery, measurement, and reuse which will lead to the development of the desired economic model.

  5. IT & C Projects Duration Assessment Based on Audit and Software Reengineering

    Directory of Open Access Journals (Sweden)

    2009-01-01

    This paper analyses the effect of applying the core elements of software engineering and reengineering, probabilistic simulations and system development auditing to software development projects. Our main focus is reducing software development project duration. Due to the fast changing economy, the need for efficiency and productivity is greater than ever. Optimal allocation of resources has proved to be the main element contributing to an increase in efficiency.

  6. Pattern-Oriented Reengineering of a Network System

    Directory of Open Access Journals (Sweden)

    Chung-Horng Lung

    2004-08-01

    Reengineering reorganizes and modifies existing systems to enhance them or to make them more maintainable. Reengineering is usually necessary as systems evolve due to changes in requirements, technologies, and/or personnel. Design patterns capture recurring structures and dynamics among software participants to facilitate reuse of successful designs. Design patterns are common and well studied in network systems. In this project, we reengineer part of a network system with some design patterns to support future evolution and performance improvement. We start with a reverse engineering effort to understand the system and recover its high-level architecture. Then we apply concurrent and networked design patterns to restructure the main sub-system. Those patterns include Half-Sync/Half-Async, Monitor Object, and the Scoped Locking idiom. The resulting system is more maintainable and has better performance.
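
    The abstract names the patterns but not their realization. As a rough illustration (not the paper's actual code), here is a minimal Java sketch of a Monitor Object guarding a message queue, where synchronized methods play the role the Scoped Locking idiom plays in C++; all class and method names are invented for this sketch.

        import java.util.ArrayDeque;
        import java.util.Queue;

        // Illustrative Monitor Object: shared state is reached only through
        // synchronized methods, so callers never manage locks themselves.
        // Each synchronized method acquires the monitor on entry and releases
        // it on exit, which is what the Scoped Locking idiom does in C++.
        public class MessageMonitor {
            private final Queue<String> queue = new ArrayDeque<>();
            private final int capacity;

            public MessageMonitor(int capacity) { this.capacity = capacity; }

            // Synchronous half: worker threads block here until a message arrives.
            public synchronized String take() throws InterruptedException {
                while (queue.isEmpty()) {
                    wait();                  // releases the monitor while waiting
                }
                String msg = queue.remove();
                notifyAll();                 // wake producers waiting for space
                return msg;
            }

            // Asynchronous half: the I/O layer enqueues without doing the work.
            public synchronized void put(String msg) throws InterruptedException {
                while (queue.size() == capacity) {
                    wait();
                }
                queue.add(msg);
                notifyAll();                 // wake consumers waiting for data
            }
        }

    In the Half-Sync/Half-Async arrangement, an asynchronous I/O layer would call put() while a pool of synchronous worker threads blocks in take(); the monitor is the queueing layer between the two halves.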

  7. Using a Foundational Ontology for Reengineering a Software Enterprise Ontology

    Science.gov (United States)

    Perini Barcellos, Monalessa; de Almeida Falbo, Ricardo

    The knowledge about software organizations is considerably relevant to software engineers. The use of a common vocabulary for representing the useful knowledge about software organizations involved in software projects is important for several reasons, such as to support knowledge reuse and to allow communication and interoperability between tools. Domain ontologies can be used to define a common vocabulary for sharing and reuse of knowledge about some domain. Foundational ontologies can be used for evaluating and re-designing domain ontologies, giving them real-world semantics. This paper presents an evaluation of a Software Enterprise Ontology that was reengineered using the Unified Foundational Ontology (UFO) as a basis.

  8. REENGINEERING OF THE AIR SIMULATORS LEGACY SOFTWARE

    Directory of Open Access Journals (Sweden)

    Nikolay O. Sidorov

    2008-02-01

    There are technical complexes consisting of components, some of which are actively used while the rest have lost working capacity owing to moral and physical deterioration. An example of such a complex is the aviation-flight complex "plane-simulator". The high cost of the components which continue to be used (the plane) makes restoring and supporting the out-of-order components (the simulator) a pressing task. A considerable part of such complexes is the software, which, owing to the replacement of obsolete and physically worn-out hardware, requires rework. The rework method is reengineering.

  9. Evolvable Neural Software System

    Science.gov (United States)

    Curtis, Steven A.

    2009-01-01

    The Evolvable Neural Software System (ENSS) is composed of sets of Neural Basis Functions (NBFs), which can be totally autonomously created and removed according to the changing needs and requirements of the software system. The resulting structure is both hierarchical and self-similar in that a given set of NBFs may have a ruler NBF, which in turn communicates with other sets of NBFs. These sets of NBFs may function as nodes to a ruler node, which are also NBF constructs. In this manner, the synthetic neural system can exhibit the complexity, three-dimensional connectivity, and adaptability of biological neural systems. An added advantage of ENSS over a natural neural system is its ability to modify its core genetic code in response to environmental changes as reflected in needs and requirements. The neural system is fully adaptive and evolvable and is trainable before release. It continues to rewire itself while on the job. The NBF is a unique, bilevel intelligence neural system composed of a higher-level heuristic neural system (HNS) and a lower-level, autonomic neural system (ANS). Taken together, the HNS and the ANS give each NBF the complete capabilities of a biological neural system to match sensory inputs to actions. Another feature of the NBF is the Evolvable Neural Interface (ENI), which links the HNS and ANS. The ENI solves the interface problem between these two systems by actively adapting and evolving from a primitive initial state (a Neural Thread) to a complicated, operational ENI and successfully adapting to a training sequence of sensory input. This simulates the adaptation of a biological neural system in a developmental phase. Within the greater multi-NBF and multi-node ENSS, self-similar ENIs provide the basis for inter-NBF and inter-node connectivity.
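
    The record describes the architecture only in prose. The following minimal structural sketch, with every name invented here rather than taken from the NASA work, shows the self-similar containment the abstract describes: a ruler NBF coordinates a set of subordinate NBFs, each of which may itself rule another set.

        import java.util.ArrayList;
        import java.util.List;

        // Hypothetical sketch of the self-similar ENSS structure: each Neural
        // Basis Function couples a heuristic and an autonomic subsystem, and a
        // "ruler" NBF holds a set of subordinate NBFs, which may themselves be
        // rulers, so one type describes every level of the hierarchy.
        class NeuralBasisFunction {
            final String name;
            final Object heuristicSystem = new Object();   // stand-in for the HNS
            final Object autonomicSystem = new Object();   // stand-in for the ANS
            final List<NeuralBasisFunction> subordinates = new ArrayList<>();

            NeuralBasisFunction(String name) { this.name = name; }

            void addSubordinate(NeuralBasisFunction nbf) { subordinates.add(nbf); }

            // Self-similarity makes traversal uniformly recursive.
            void describe(String indent) {
                System.out.println(indent + name);
                for (NeuralBasisFunction nbf : subordinates) {
                    nbf.describe(indent + "  ");
                }
            }
        }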

  10. The present status of software engineering

    CERN Document Server

    Pressman, Roger S

    1991-01-01

    In this seminar, we will discuss the present status and future directions of software engineering and CASE. Key topics to be discussed include: new paradigms for software engineering; software metrics; process assessment; the current state of analysis and design methods; reusability and re-engineering; formal methods. Among the questions to be answered are: How will software engineering change as the 1990s progress? What are the "technology drivers"? What will analysis, design, coding, testing, quality assurance and software management look like in the year 2000? How will CASE tools evolve in the 1990s and will they be as "integrated" as many people believe? How can you position your organization to accommodate the coming changes?

  11. Evolvability as a Quality Attribute of Software Architectures

    NARCIS (Netherlands)

    Ciraci, S.; van den Broek, P.M.; Duchien, Laurence; D'Hondt, Maja; Mens, Tom

    We review the definition of evolvability as it appears in the literature. In particular, the concept of software evolvability is compared with other system quality attributes, such as adaptability, maintainability and modifiability.

  12. Software engineering a practitioner's approach

    CERN Document Server

    Pressman, Roger S

    1997-01-01

    This indispensable guide to software engineering exploration enables practitioners to navigate the ins and outs of this rapidly changing field. Pressman's fully revised and updated Fourth Edition provides in-depth coverage of every important management and technical topic in software engineering. Moreover, readers will find the inclusion of the hottest developments in the field such as: formal methods and cleanroom software engineering, business process reengineering, and software reengineering.

  13. Reengineering Hanford

    International Nuclear Information System (INIS)

    Badalamente, R.V.; Carson, M.L.; Rhoads, R.E.

    1995-03-01

    The Department of Energy Richland Operations Office is in the process of reengineering its Hanford Site operations. There is a need to fundamentally rethink and redesign environmental restoration and waste management processes to achieve dramatic improvements in the quality, cost-effectiveness, and timeliness of the environmental services and products that make cleanup possible. Hanford is facing the challenge of reengineering in a complex environment in which major processes cut across multiple government and contractor organizations and a variety of stakeholders and regulators have a great influence on cleanup activities. By doing the upfront work necessary to allow effective reengineering, Hanford is increasing the probability of its success.

  14. Reengineering Hanford

    Energy Technology Data Exchange (ETDEWEB)

    Badalamente, R.V.; Carson, M.L.; Rhoads, R.E.

    1995-03-01

    The Department of Energy Richland Operations Office is in the process of reengineering its Hanford Site operations. There is a need to fundamentally rethink and redesign environmental restoration and waste management processes to achieve dramatic improvements in the quality, cost-effectiveness, and timeliness of the environmental services and products that make cleanup possible. Hanford is facing the challenge of reengineering in a complex environment in which major processes cut across multiple government and contractor organizations and a variety of stakeholders and regulators have a great influence on cleanup activities. By doing the upfront work necessary to allow effective reengineering, Hanford is increasing the probability of its success.

  15. Evolving software products, the design of a water-related modeling software ecosystem

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    2017-01-01

    ...more than 50 years ago. However, radical changes in which software products evolve disruptively, in the software engineering as much as in the organizational and business aspects, are rather rare. In this paper, we report on the transformation of one of the market-leading product series in water-related calculation and modeling from a traditional business-as-usual series of products to an evolutionary software ecosystem. We do so by relying on existing concepts of software ecosystem analysis to analyze the future ecosystem. We report and elaborate on the main focus points necessary for this transition, and we argue for the generalization of our focus points to the transition from traditional business-as-usual software products to software ecosystems.

  16. REENGINEERING PROSES BISNIS : TINJAUAN KONSEPTUAL DAN METODOLOGI

    Directory of Open Access Journals (Sweden)

    Lena Ellitan

    1999-01-01

    Business process reengineering is the fundamental rethinking and radical redesign of an organization's business processes that leads the organization to achieve dramatic improvement in business performance. Many firms have successfully embraced this new innovation paradigm to achieve order-of-magnitude improvements in cost, efficiency, quality and value. Even more firms are seeking opportunities to apply reengineering and methodologies to assist them in doing so. The recognition of reengineering as a new management paradigm emerged in the 1990s, though it may be argued that the principle of reengineering had been applied well before then. The early 1990s saw worldwide interest in reengineering. Consequently, many organizations have reported their first-cycle experiences in reengineering. Reengineering practice in the 1990s was largely characterized by application to operational processes and an emphasis on operational measures of time, cost, and quality. Quite recently, a more strategic flavor of reengineering has been advocated. One of the hopes of this new thinking is that by transcending the microscopic concerns of operational strategy, it would help the organization derive significantly greater value out of the reengineering effort. This paper presents: 1. The concept of reengineering. 2. Various problems in business process reengineering. 3. A rigorous methodology for organizing reengineering activities.

  17. IDC Reengineering Phase 2 Project Scope.

    Energy Technology Data Exchange (ETDEWEB)

    Harris, James M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    This report provides a brief description of the scope of the IDC Reengineering Phase 2 project. It describes the goals and objectives of reengineering, the system definition, and the technical scope of the system. Revision history: version 1.0 (9/25/2014, SNL IDC Reengineering Team), Unlimited Release for I2, authorized by M. Harris; version 1.1 (28/01/2015, IDC Reengineering Team), aligned with the previous IDC scope document, authorized by E. Tomuta.

  18. The Topical Problems of Reengineering of Production Enterprises

    Directory of Open Access Journals (Sweden)

    Chumak Larysa F.

    2018-01-01

    The article is aimed at researching the problems of reengineering of industrial enterprises and determining efficient ways of solving them. The essence of the reengineering process and the conditions for and expediency of carrying out reengineering are researched. Elements of the system of reengineering principles, its stages, problems of implementation, and typical errors arising during reengineering are defined. It is determined that reengineering should be closely connected with the strategies of the industrial enterprise in order to achieve maximum efficiency of the enterprise's activity and to prevent additional costs. Reengineering of an industrial enterprise should be supported by an appropriate organizational structure, sound information technology, and contemporary strategic considerations.

  19. Software SCMS re-engineering for an object-oriented language (JAVA) for use in construction of segmented phantoms

    International Nuclear Information System (INIS)

    Possani, Rafael Guedes

    2012-01-01

    Recent treatment planning systems depend strongly on CT images, and the tendency is for internal dosimetry procedures in nuclear medicine therapy to also be based on images, such as magnetic resonance imaging (MRI) and computed tomography (CT), to extract anatomical and histological information, as well as on functional imaging or activity maps such as PET and SPECT. This information, associated with radiation transport simulation software, is used to estimate internal dose in patients undergoing treatment in nuclear medicine. This work aims to re-engineer the software SCMS, which is an interface between the Monte Carlo code MCNP and the medical images that carry information from the patient under treatment. In other words, the necessary information contained in the images is interpreted and presented in a specific format to the Monte Carlo MCNP code to perform the simulation of radiation transport. Therefore, the user does not need to understand the complex process of inputting data into MCNP, as the SCMS is responsible for automatically constructing the anatomical data of the patient, as well as the radioactive source data. The SCMS was originally developed in Fortran-77. In this work it was rewritten in an object-oriented language (Java). New features and data options have also been incorporated into the software. Thus, the new software has a number of improvements, such as an intuitive GUI and a menu for selecting the energy spectrum corresponding to a specific radioisotope stored in an XML data bank. The new version also supports new materials, and the user can specify an image region of interest for the calculation of absorbed dose. (author)
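
    The abstract mentions a menu that selects the energy spectrum of a radioisotope from an XML data bank. A minimal Java sketch of such a lookup follows; the file name, element names, and attributes are assumptions made for the sketch, not the actual SCMS format.

        import java.io.File;
        import javax.xml.parsers.DocumentBuilderFactory;
        import org.w3c.dom.Document;
        import org.w3c.dom.Element;
        import org.w3c.dom.NodeList;

        // Hypothetical reader for an XML bank of isotope energy spectra.
        // Assumed layout:
        //   <spectra>
        //     <isotope name="I-131">
        //       <line energy="0.364" intensity="0.815"/>
        //     </isotope>
        //   </spectra>
        public class SpectrumBank {
            public static void main(String[] args) throws Exception {
                Document doc = DocumentBuilderFactory.newInstance()
                        .newDocumentBuilder()
                        .parse(new File("spectra.xml"));
                String wanted = "I-131";
                NodeList isotopes = doc.getElementsByTagName("isotope");
                for (int i = 0; i < isotopes.getLength(); i++) {
                    Element iso = (Element) isotopes.item(i);
                    if (!wanted.equals(iso.getAttribute("name"))) continue;
                    NodeList lines = iso.getElementsByTagName("line");
                    for (int j = 0; j < lines.getLength(); j++) {
                        Element line = (Element) lines.item(j);
                        // Each emission line would feed the MCNP source definition.
                        System.out.printf("E=%s MeV, intensity=%s%n",
                                line.getAttribute("energy"),
                                line.getAttribute("intensity"));
                    }
                }
            }
        }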

  20. Reengineering of Analytical Data Management for the Environmental Restoration Project at Los Alamos National Laboratory

    International Nuclear Information System (INIS)

    Bolivar, S.; Dorries, A.; Nasser, K.; Scherma, S.

    2003-01-01

    The Environmental Restoration (ER) Project at Los Alamos National Laboratory (LANL) is responsible for the characterization, clean up, and monitoring of over 2,124 identified potential release sites (PRS). These PRSs have resulted from operations associated with weapons and energy related research which has been conducted at LANL since 1942. To accomplish mission goals, the ER Project conducts field sampling to determine possible types and levels of chemical contamination as well as their geographic extent. Last fiscal year, approximately 4000 samples were collected during ER Project field sampling campaigns. In the past, activities associated with field sampling such as sample campaign planning, paperwork, shipping and analytical laboratory tracking; verification and order fulfillment; and validation and data quality assurance were performed by multiple groups working with a variety of software applications, databases and hard copy reports. This resulted in significant management and communication difficulties, data delivery delays, and inconsistent processes; it also represented a potential threat to overall data integrity. Creation of an organization, software applications and a data process that could provide for cost-effective management of the activities and data mentioned above became a management priority, resulting in the development of a reengineering task. This reengineering effort (currently nearing completion) has resulted in personnel reorganization, the development of a centralized data repository, and a powerful web-based sample management system that allows for an appreciably streamlined and more efficient data process. These changes have collectively cut data delivery times, allowed for larger volumes of samples and data to be handled with fewer personnel, and resulted in significant cost savings. This paper will provide a case study of the reengineering effort undertaken by the ER Project of its analytical data management process. It includes

  1. Reengineering of Analytical Data Management for the Environmental Restoration Project at Los Alamos National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Bolivar, S.; Dorries, A.; Nasser, K.; Scherma, S.

    2003-02-27

    The Environmental Restoration (ER) Project at Los Alamos National Laboratory (LANL) is responsible for the characterization, clean up, and monitoring of over 2,124 identified potential release sites (PRS). These PRSs have resulted from operations associated with weapons and energy related research which has been conducted at LANL since 1942. To accomplish mission goals, the ER Project conducts field sampling to determine possible types and levels of chemical contamination as well as their geographic extent. Last fiscal year, approximately 4000 samples were collected during ER Project field sampling campaigns. In the past, activities associated with field sampling such as sample campaign planning, paperwork, shipping and analytical laboratory tracking; verification and order fulfillment; and validation and data quality assurance were performed by multiple groups working with a variety of software applications, databases and hard copy reports. This resulted in significant management and communication difficulties, data delivery delays, and inconsistent processes; it also represented a potential threat to overall data integrity. Creation of an organization, software applications and a data process that could provide for cost-effective management of the activities and data mentioned above became a management priority, resulting in the development of a reengineering task. This reengineering effort (currently nearing completion) has resulted in personnel reorganization, the development of a centralized data repository, and a powerful web-based sample management system that allows for an appreciably streamlined and more efficient data process. These changes have collectively cut data delivery times, allowed for larger volumes of samples and data to be handled with fewer personnel, and resulted in significant cost savings. This paper will provide a case study of the reengineering effort undertaken by the ER Project of its analytical data management process. It includes

  2. The Reengineering of Processes a Tool in the Administration of Business: Case Cereales "Santiago"

    Directory of Open Access Journals (Sweden)

    Roberto René Moreno-García

    2015-12-01

    The article presents research results on the application of process reengineering in the company Cereales Santiago and the introduction of information technologies through the PesajeVoz software. The research characterizes the main deficiencies of the strategic commercialization process that affect the economic results of the company and the satisfaction of its clients through losses and delays in the reception of raw materials. A study is also made of the evolution of the process-reengineering concept from its initial formulation, together with a characterization of some of the methodologies for its application. Reference is made to a generic methodology of the authors' own for applying process reengineering in the Cuban system of companies; its validation in the company under study allowed obtaining results with impacts in quantitative and qualitative benefits for the company and its clients.

  3. Environmental management compliance reengineering project, FY 1997 report

    International Nuclear Information System (INIS)

    VanVliet, J.A.; Davis, J.N.

    1997-09-01

    Through an integrated reengineering effort, the Idaho National Engineering and Environmental Laboratory (INEEL) is successfully implementing process improvements that will permit safe and compliant operations to continue during the next 5 years, even though $80 million was removed from the Environmental Management (EM) program budget. A 2-year analysis, design, and implementation project will reengineer compliance-related activities and reduce operating costs by approximately $17 million per year from Fiscal Year (FY) 1998 through 2002, while continuing to meet the INEEL's environment, safety, and health requirements and milestone commitments. Compliance reengineering's focus is improving processes, not avoiding full compliance with environmental, safety, and health laws. In FY 1997, compliance reengineering used a three-phase approach to analyze, design, and implement the changes that would decrease operating costs. Implementation for seven specific improvement projects was completed in FY 1997, while five projects will complete implementation in FY 1998. During FY 1998, the three-phase process will be repeated to continue reengineering the INEEL.

  4. Environmental management compliance reengineering project, FY 1997 report

    Energy Technology Data Exchange (ETDEWEB)

    VanVliet, J.A.; Davis, J.N.

    1997-09-01

    Through an integrated reengineering effort, the Idaho National Engineering and Environmental Laboratory (INEEL) is successfully implementing process improvements that will permit safe and compliant operations to continue during the next 5 years, even though $80 million was removed from the Environmental Management (EM) program budget. A 2-year analysis, design, and implementation project will reengineer compliance-related activities and reduce operating costs by approximately $17 million per year from Fiscal Year (FY) 1998 through 2002, while continuing to meet the INEEL's environment, safety, and health requirements and milestone commitments. Compliance reengineering's focus is improving processes, not avoiding full compliance with environmental, safety, and health laws. In FY 1997, compliance reengineering used a three-phase approach to analyze, design, and implement the changes that would decrease operating costs. Implementation for seven specific improvement projects was completed in FY 1997, while five projects will complete implementation in FY 1998. During FY 1998, the three-phase process will be repeated to continue reengineering the INEEL.

  5. The Systems Biology Research Tool: evolvable open-source software

    Directory of Open Access Journals (Sweden)

    Wright Jeremiah

    2008-06-01

    Background: Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system level. Results: We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion: The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.
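
    The abstract does not spell out the plug-in mechanism, so the following is a generic Java sketch of what a process plug-in contract could look like; the interface and class names are invented here and are not the SBRT API.

        import java.util.Map;

        // Hypothetical plug-in contract: each computational technique is a class
        // discovered and invoked by name at run time, so the core tool gains new
        // methods without being rebuilt.
        interface ProcessPlugin {
            String name();
            Map<String, Object> run(Map<String, Object> inputs) throws Exception;
        }

        class PluginRunner {
            // Loading the class reflectively means adding a technique is just
            // dropping a new class on the classpath.
            static Map<String, Object> invoke(String className,
                                              Map<String, Object> inputs) throws Exception {
                ProcessPlugin plugin = (ProcessPlugin) Class.forName(className)
                        .getDeclaredConstructor()
                        .newInstance();
                System.out.println("Running plug-in: " + plugin.name());
                return plugin.run(inputs);
            }
        }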

  6. Reengineering in Australia: factors affecting success

    Directory of Open Access Journals (Sweden)

    Felicity Murphy

    1998-11-01

    Business process reengineering (BPR) is being used in many organisations worldwide to realign operations. Most of the research undertaken has been focused on North American or European practices. The study reported here replicates a US reengineering study in an Australian context by surveying large public and private sector Australian organisations. The study makes three main contributions by: (1) presenting a picture of BPR practices in Australia, (2) clarifying factors critical to the success of reengineering projects in Australia, and (3) providing a comparison of factors leading to success in Australian BPR projects with those found in the US.

  7. Evolving impact of Ada on a production software environment

    Science.gov (United States)

    Mcgarry, F.; Esker, L.; Quimby, K.

    1988-01-01

    Many aspects of software development with Ada have evolved as our Ada development environment has matured and personnel have become more experienced in the use of Ada. The Software Engineering Laboratory (SEL) has seen differences in the areas of cost, reliability, reuse, size, and use of Ada features. A first Ada project can be expected to cost about 30 percent more than an equivalent FORTRAN project. However, the SEL has observed significant improvements over time as a development environment progresses to second and third uses of Ada. The reliability of Ada projects is initially similar to what is expected in a mature FORTRAN environment. However, with time, one can expect to gain improvements as experience with the language increases. Reuse is one of the most promising aspects of Ada. The proportion of reusable Ada software on our Ada projects exceeds the proportion of reusable FORTRAN software on our FORTRAN projects. This result was noted fairly early in our Ada projects, and experience shows an increasing trend over time.

  8. CONCEPT OF REENGINEERING AGAIN RETURNS IN ACTUALITY

    Directory of Open Access Journals (Sweden)

    Vasile Ionel POPESCU

    2014-06-01

    Although the concept was launched in the summer of 1990, reengineering is returning to relevance because, in the current social and economic conditions, more and more companies have to resort to redesigning their processes in order to face increasingly fierce competition. Throughout this article, after a brief introduction, we present the factors that contributed to the emergence of reengineering, trying to highlight what this concept involves, the characteristics of the processes resulting from reengineering, the importance of and methods for preparing a process map, and the method for launching a process redesign. Finally, we offer several opinions and make a number of recommendations that will lead to the qualitative leap targeted by companies that resort to reengineering.

  9. Customer configuration updating in a software supply network

    NARCIS (Netherlands)

    Jansen, S.R.L.

    2007-01-01

    Product software development is the activity of development, modification, reuse, re-engineering, maintenance, or any other activities that result in packaged configurations of software components or software-based services that are released for and traded in a specific market (Xu and Brinkkemper).

  10. Leadership processes for re-engineering changes to the health care industry.

    Science.gov (United States)

    Guo, Kristina L

    2004-01-01

    As health care organizations seek innovative ways to change financing and delivery mechanisms due to escalated health care costs and increased competition, drastic changes are being sought in the form of re-engineering. This study discusses the leader's role in re-engineering health care. It specifically addresses the reasons for failures in re-engineering and argues that success depends on senior-level leaders playing a critical role. Existing studies lack comprehensiveness in establishing models of re-engineering and management guidelines. This research focuses on integrating re-engineering and leadership processes in health care by creating a step-by-step model. In particular, it illustrates the four Es (Examination, Establishment, Execution and Evaluation) as a comprehensive re-engineering process that combines managerial roles and activities to produce successfully changed and re-engineered health care organizations.

  11. Business process re-engineering in service operations

    International Nuclear Information System (INIS)

    McClintock, J.W.

    1995-01-01

    The concept of business process re-engineering, and how it was applied to the operations of the Consumers Gas Company, was discussed. Business process re-engineering was defined as the improvement of the efficiency of the customer-service process and the overall improvement of practices and operations. The re-engineering project was said to involve a thorough analysis of information technology, current limitations, and business operational needs, undertaken on an enterprise-wide basis. Viewed generically, a re-engineering project was said to have six major components: (1) business drivers (i.e. the articulation of the Company's strategic issues); (2) benchmark measures; (3) future state process models; (4) cost/benefit analysis; (5) a change management plan; and (6) a development plan. Business improvements expected to result from the project include reduced cost of operation, reduction of waste, and a substantially complete re-design of the business process. Management of the project involved a team approach and the help of a consultant to identify the scope of the re-design, its limitations, and future state. A life expectancy of approximately 10 years was given for the re-engineering plan, with annual benefits (in terms of cost reduction) of $4.6 million by the year 2000.

  12. Reengineering a cardiovascular surgery service.

    Science.gov (United States)

    Tunick, P A; Etkin, S; Horrocks, A; Jeglinski, G; Kelly, J; Sutton, P

    1997-04-01

    Reengineering, involving the radical redesign of business processes, has been used successfully in a variety of health care settings. In 1994 New York University (NYU) Medical Center (MC) launched its first reengineering team, whose purpose was to redesign the entire process of caring for patients, from referral to discharge, on the cardiovascular (CV) surgery service. REENGINEERING TEAM: The multidisciplinary CV Surgery Reengineering Team was charged with two goals: improving customer (patient, family, and referring physician) satisfaction and improving profitability. The methodology to be used was based on a reengineering philosophy: discarding basic assumptions and designing the patient care process from the ground up. THE TRANSFER-IN INITIATIVE: A survey of NYU cardiologists, distributed in April 1994, suggested that the organization was considered a difficult place to transfer patients to. The team's recommendations led to a new, streamlined transfer-in policy. The average time between a referring physician's request for a patient transfer and an NYUMC physician's acceptance of the transfer decreased from 9 hours under the old system to immediate acceptance. Three customer satisfaction task forces implemented multiple programs to make the service more user friendly. In addition, referrals increased and length of stay decreased, without an adverse impact on the mortality rate. For the first time at NYUMC, a multidisciplinary team was given the mandate to achieve major changes in an entire patient care process. Similar projects are now underway.

  13. Defense programs business practices re-engineering QFD exercise

    International Nuclear Information System (INIS)

    Murray, C.; Halbleib, L.

    1996-03-01

    The end of the cold war has resulted in many changes for the Nuclear Weapons Complex (NWC). We now work in a smaller complex, with reduced resources, a smaller stockpile, and no new phase 3 weapons development programs. This new environment demands that we re-evaluate the way we design and produce nuclear weapons. The Defense Program (DP) Business Practices Re-engineering activity was initiated to improve the design and production efficiency of the DP Sector. The activity had six goals: (1) to identify DP business practices that are exercised by the Product Realization Process (PRP); (2) to determine the impact (positive, negative, or none) of these practices on defined, prioritized customer criteria; (3) to identify business practices that are candidates for elimination or re-engineering; (4) to select two or three business practices for re-engineering; (5) to re-engineer the selected business practices; and (6) to exercise the re-engineered practices on three pilot development projects. Business practices include technical as well as administrative procedures that are exercised by the PRP. A QFD exercise was performed to address goals (1) through (4). The customer that identified, defined, and prioritized the criteria used to rate the business practices was the Block Change Advisory Group. Five criteria were identified: cycle time, flexibility, cost, product performance/quality, and best practices. Forty-nine business practices were identified and rated against the criteria. From this analysis, the group made preliminary recommendations as to which practices would be addressed in the re-engineering activity. Sixteen practices will be addressed in the re-engineering activity. These practices will then be piloted on three projects: (1) the Electronic Component Assembly (ECA)/Radar Project, (2) the B61 Mod 11, and (3) the Warhead Protection Program (WPP).
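
    As an illustration of the arithmetic such a QFD exercise rests on, the sketch below scores practices against prioritized criteria; the weights and impact ratings are invented, not the Block Change Advisory Group's actual figures.

        // Toy QFD-style scoring: each practice is rated per criterion, and the
        // weighted sum ranks practices as candidates for re-engineering.
        // All weights and ratings are invented for illustration.
        public class QfdScore {
            public static void main(String[] args) {
                String[] criteria = {"cycle time", "flexibility", "cost",
                                     "performance/quality", "best practices"};
                double[] weights = {0.30, 0.15, 0.25, 0.20, 0.10};
                String[] practices = {"design review", "release sign-off"};
                double[][] impact = {     // rating per criterion, -5 (hurts) to +5 (helps)
                    { 3, 1,  2, 4, 2},
                    {-2, 0, -3, 1, 0},
                };
                for (int p = 0; p < practices.length; p++) {
                    double score = 0;
                    for (int c = 0; c < criteria.length; c++) {
                        score += weights[c] * impact[p][c];
                    }
                    System.out.printf("%s: %.2f%n", practices[p], score);
                }
            }
        }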

  14. Reengineering GSM/GPRS Towards a Dedicated Network for Massive Smart Metering

    DEFF Research Database (Denmark)

    Madueño, Germán Corrales; Stefanovic, Cedomir; Popovski, Petar

    2014-01-01

    GSM is a synonym for a major success in wireless technology, achieving widespread use and high technology maturity. However, its future is questionable, as many stakeholders indicate that the GSM spectrum should be re-farmed for LTE. On the other hand, the advent of the smart grid and the ubiquity of smart meters will require reliable, long-lived wide area connections. This motivates us to investigate the potential of GSM to be evolved into a dedicated network for smart metering. We introduce simple mechanisms to reengineer the access control in GSM. The result is a system that offers excellent support

  15. BUSINESS PROCESS REENGINEERING AS THE METHOD OF PROCESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    O. Honcharova

    2013-09-01

    The article is devoted to the analysis of the process management approach. The main interpretations of process management are surveyed, and definitions of process and process management are given. The methods of business process improvement are also analyzed, among them fast analysis solution technology (FAST), benchmarking, re-projecting, and reengineering. The main results of using business process improvement are described in figures for reduced cycle time, costs, and errors. The tasks of business process reengineering are noted, as are its main stages. The main efficiency results of business process reengineering and its success factors are determined.

  16. Analytical Design of Evolvable Software for High-Assurance Computing

    Science.gov (United States)

    2001-02-14

    [Recovered fragments] The thesis defines an external system size metric, $S_{ext} = \sum_{i=1}^{N}\left(\sum_{j=1}^{A_i} w_{ij} + \sum_{k=1}^{M_i} w_{ik}\right)$, in a chapter on the analytical partition of components, and asks whether the research approach yields evolvable components in less mathematically-oriented applications such as multimedia and e-commerce. An appendix presents a benchmark design for microwave oven software.
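
    Reading the recovered metric as a sum of coupling weights over N components, a direct transcription is straightforward; the ragged arrays and their values below are invented for illustration.

        // Transcription of the recovered size metric:
        // S_ext = sum over components i of (sum_j w[i][j] + sum_k w2[i][k]),
        // where the inner sums range over the A_i and M_i weighted couplings
        // attributed to component i.
        public class ExternalSize {
            static double sExt(double[][] a, double[][] m) {
                double total = 0.0;
                for (int i = 0; i < a.length; i++) {     // N components
                    for (double w : a[i]) total += w;    // A_i terms
                    for (double w : m[i]) total += w;    // M_i terms
                }
                return total;
            }

            public static void main(String[] args) {
                double[][] a = {{1.0, 0.5}, {2.0}};      // A_1 = 2, A_2 = 1
                double[][] m = {{0.25}, {0.75, 1.5}};    // M_1 = 1, M_2 = 2
                System.out.println(sExt(a, m));          // prints 6.0
            }
        }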

  17. Goal-Equivalent Secure Business Process Re-engineering

    DEFF Research Database (Denmark)

    Acosta, Hugo Andrés Lópes; Massacci, Fabio; Zannone, Nicola

    2008-01-01

    The introduction of information technologies in health care systems often requires re-engineering the business processes used to deliver care. Obviously, the new and re-engineered processes are observationally different, and thus we cannot use existing model-based techniques to argue that they are somehow "equivalent". In this paper we propose a method for passing from SI*, a modeling language for capturing and modeling functional, security, and trust organizational and system requirements, to business process specifications and vice versa. In particular, starting from an old secure business process, we reconstruct the functional and security requirements at the organizational level that such a business process was supposed to meet (including the trust relations that existed among the members of the organization). To ensure that the re-engineered business process meets the elicited requirements...

  18. Re-engineering production systems: the Royal Netherlands Naval Dockyard

    NARCIS (Netherlands)

    Zijm, Willem H.M.

    1996-01-01

    Reengineering production systems in an attempt to meet tight cost, quality and lead-time standards has received considerable attention in the last decade. In this paper, we discuss the reengineering process at the Royal Netherlands Naval Dockyard. The process starts with a characterisation and a

  19. Re-engineering pre-employment check-up systems: a model for improving health services.

    Science.gov (United States)

    Rateb, Said Abdel Hakim; El Nouman, Azza Abdel Razek; Rateb, Moshira Abdel Hakim; Asar, Mohamed Naguib; El Amin, Ayman Mohammed; Gad, Saad abdel Aziz; Mohamed, Mohamed Salah Eldin

    2011-01-01

    The purpose of this paper is to develop a model for improving health services provided by the pre-employment medical fitness check-up system affiliated to Egypt's Health Insurance Organization (HIO). Operations research, notably system re-engineering, is used in six randomly selected centers, and findings before and after re-engineering are compared. The re-engineering model follows a systems approach, focusing on three areas: structure, process and outcome. The model is based on six main components: electronic booking, standardized check-up processes, protected medical documents, advanced archiving through an electronic content management (ECM) system, infrastructure development, and capacity building. The model originates mainly from customer needs and expectations. The centers' monthly customer flow increased significantly after re-engineering. The mean time spent per customer cycle improved after re-engineering: 18.3 +/- 5.5 minutes, as compared to 48.8 +/- 14.5 minutes before. Appointment delay also decreased significantly, from an average of 18 days to 6.2 days. Both beneficiaries and service providers were significantly more satisfied with the services after re-engineering. The model proves that re-engineering program costs are exceeded by increased revenue. Re-engineering in this study involved multiple structure and process elements. The literature review did not reveal similar re-engineering healthcare packages; therefore, each element was compared separately. This model is highly recommended for improving service effectiveness and efficiency. This research is the first in Egypt to apply the re-engineering approach to public health systems. Developing user-friendly models for service improvement is an added value.

  20. A Process Re-engineering Framework for Reverse Logistics based on a Case Study

    Directory of Open Access Journals (Sweden)

    Hing Kai Chan

    2010-09-01

    Reverse logistics has gained increasing attention in recent years as a channel for companies to achieve operational excellence. The process involves the manipulation of returned materials, or even products, and plays a pivotal role in sustainable development throughout whole supply chains. To make reverse logistics possible, process re-engineering may need to be carried out. In practice, however, the processes involved in re-engineering are complicated. The objectives, benefits, and applicability of any process re-engineering require careful and detailed strategic planning. This paper proposes an easy-to-follow, step-by-step framework for practitioners to perform process re-engineering, to learn and identify the critical issues in each step, and to succeed in applying process re-engineering to enhance reverse logistics performance. A learner-centred approach is adopted, based on a case study of process re-engineering, which is demonstrated in the paper for explanation.

  1. Innovative model of business process reengineering at machine building enterprises

    Science.gov (United States)

    Nekrasov, R. Yu; Tempel, Yu A.; Tempel, O. A.

    2017-10-01

    The paper provides consideration of business process reengineering viewed as a managerial innovation accepted by present day machine building enterprises, as well as ways to improve its procedure. A developed innovative model of reengineering measures is described and is based on the process approach and other principles of company management.

  2. Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project

    Science.gov (United States)

    Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo

    2017-04-01

    The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in the 2.0 version (Endrizzi et al., 2014), which was released as a free software open-source project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. This weakness hindered its scientific potential and its use both as a standalone package and, more importantly, in an integrated way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes as its starting point the 2.0 version, scientifically tested and published. This version, together with several test cases based on recently published or available GEOtop applications (Cordano and Rigon, 2013, WRR; Kollet et al., 2016, WRR), provides the baseline code and a number of referenced results as benchmarks. Comparison and scientific validation can then be performed for each software re-engineering activity performed on the package. To keep track of every single change, the package is published in its own GitHub repository geotopmodel.github.io/geotop/ under the GPL v3.0 license. A Continuous Integration mechanism by means of Travis-CI has been enabled on the GitHub repository for the master and main development branches. The usage of the CMake configuration tool

  3. ORGANIZATIONAL CHANGE: BUSINESS PROCESS REENGINEERING OR OUTSOURCING?

    Directory of Open Access Journals (Sweden)

    Pellicelli Michela

    2012-12-01

    This article will analyze the logic behind the adoption of Business Process Reengineering and outsourcing. The first part analyzes Business Process Reengineering as a technique for analyzing and defining the business processes implemented by organizations in order to make the achievement of corporate objectives more efficient and effective. Nevertheless, this approach has some limits when the reengineering project aims solely at cost reduction. In any event, for several activities management must constantly evaluate the alternative of turning to outsourcing. In the second part we thus observe what the evaluations of management should be in order to pursue the objectives of maximum efficiency, economic efficiency, and productivity. Starting from the methodological assumptions that aid our understanding of the outsourcing of processes and that represent the operational and conceptual framework for the existence of this approach, several models held to be significant for determining those processes that can be outsourced from a "strategic" point of view, and useful for deciding on the shift from BPR to outsourcing, will be analyzed.

  4. BUSINESS PROCESS REENGINEERING

    Directory of Open Access Journals (Sweden)

    Magdalena LUCA (DEDIU)

    2014-06-01

    Business process reengineering determines the change of organizational functions from an orientation focused on operations to a multidimensional approach. Former employees who were mere executors now take their own decisions, and as a result the functional departments lose their reason to exist. Managers no longer act as supervisors but mainly as mentors, while employees focus more attention on customer needs and less on their superiors'. Under these conditions, new organizational paradigms are required, the most important being that of learning organizations. Information technology plays a decisive role in implementing the reengineering of economic processes and in promoting a new organizational paradigm. The article presents some results obtained in an ANSTI research theme funded under contract no. 501/2000. Economic and financial analysis is performed in order to know the current situation and to achieve better results in the future. One of its objectives is production, analyzed as a labour process, together with the interacting elements of that process. The indicators investigated in the analysis of the economic and financial activity of production reflect the development directions, the means and resources for accomplishing predetermined objectives, and express the results and effectiveness of what is expected.

  5. Re-Engineering Marketing (RM)

    Directory of Open Access Journals (Sweden)

    Bozhidar Iv. Hadzhiev

    2010-12-01

    Globalization, the rise of the economy, the progress of the e-net economy, and the high dynamics of business relationships are developing as permanently rising progressive functions, predetermining the use of a few new opportunities for increasing the effectiveness of industrial companies. At this point, through the prism of re-engineering methods, a few basic problems and opportunities for Re-engineering Marketing (RM) are presented in this paper.

  6. The development and technology transfer of software engineering technology at NASA. Johnson Space Center

    Science.gov (United States)

    Pitman, C. L.; Erb, D. M.; Izygon, M. E.; Fridge, E. M., III; Roush, G. B.; Braley, D. M.; Savely, R. T.

    1992-01-01

    The United States' big space projects of the next decades, such as Space Station and the Human Exploration Initiative, will need the development of many millions of lines of mission critical software. NASA-Johnson (JSC) is identifying and developing some of the Computer Aided Software Engineering (CASE) technology that NASA will need to build these future software systems. The goal is to improve the quality and the productivity of large software development projects. New trends in CASE technology are outlined, and how the Software Technology Branch (STB) at JSC is endeavoring to provide some of these CASE solutions for NASA is described. Key software technology components include knowledge-based systems, software reusability, user interface technology, reengineering environments, management systems for the software development process, software cost models, repository technology, and open, integrated CASE environment frameworks. The paper presents the status and long-term expectations for CASE products. The STB's Reengineering Application Project (REAP), Advanced Software Development Workstation (ASDW) project, and software development cost model (COSTMODL) project are then discussed. Some of the general difficulties of technology transfer are introduced, and a process developed by STB for CASE technology insertion is described.
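
    COSTMODL itself is not detailed in the abstract. As a hedged illustration of what a development cost model of this kind computes, the sketch below evaluates basic COCOMO in organic mode with Boehm's published coefficients, which are not necessarily COSTMODL's own calibration.

        // Basic COCOMO, organic mode: effort and schedule from size alone.
        // Coefficients 2.4/1.05 and 2.5/0.38 are Boehm's published values.
        public class CocomoSketch {
            public static void main(String[] args) {
                double kloc = 120.0;                            // size in KSLOC
                double effort = 2.4 * Math.pow(kloc, 1.05);     // person-months
                double schedule = 2.5 * Math.pow(effort, 0.38); // calendar months
                System.out.printf("Effort: %.0f person-months, schedule: %.1f months%n",
                        effort, schedule);
            }
        }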

  7. Software Technology for E-Commerce Era

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The rapid growth of Internet usage and electronic commerce (e-commerce) applications will push traditional industries to transform their business models and to re-engineer their information systems. This direction will give the software industry either great opportunities for their business growth or crucial challenges to their existence. This article describes two essential challenges the software industry will face and presents relevant new technologies that will be helpful for overcoming those challenges.

  8. Simulation software: engineer processes before reengineering.

    Science.gov (United States)

    Lepley, C J

    2001-01-01

    People make decisions all the time using intuition. But what happens when you are asked: "Are you sure your predictions are accurate? How much will a mistake cost? What are the risks associated with this change?" Once a new process is engineered, it is difficult to analyze what would have been different if other options had been chosen. Simulating a process can help senior clinical officers solve complex patient flow problems and avoid wasted effort. Simulation software can give you the data you need to make decisions. The author introduces the concepts, methodologies, and applications of computer-aided simulation to illustrate their use in making decisions to improve workflow design.

  9. Reengineering the Project Design Process

    Science.gov (United States)

    Casani, E.; Metzger, R.

    1994-01-01

    In response to NASA's goal of working faster, better and cheaper, JPL has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center and the Flight System Testbed. Reengineering at JPL implies a cultural change whereby the character of its design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and cost estimating.

  10. Application of information and communication technology in process reengineering

    Directory of Open Access Journals (Sweden)

    Đurović Aleksandar M.

    2014-01-01

    Full Text Available This paper examines the role of information and communication technologies in reengineering processes. A general analysis of processes shows that information and communication technologies improve their efficiency. A reengineering model based on the BPMN 2.0 standard is applied to the process by which students of the Faculty of Transport and Traffic Engineering seek internships and jobs. After defining the technical characteristics and required functionalities, the paper proposes a web/mobile application that gives traffic engineers better visibility to companies seeking that education profile.
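    A minimal sketch, not from the paper, of the kind of before/after comparison such a reengineering model supports; the activity names and durations below are hypothetical stand-ins for the as-is and to-be internship-seeking processes:

```python
# Illustrative comparison of cycle times for an as-is and a to-be process,
# each expressed as an ordered list of (activity, estimated hours).
AS_IS = [
    ("submit paper CV to faculty office", 2.0),
    ("manual matching with company requests", 72.0),
    ("notify student by phone", 24.0),
]

TO_BE = [
    ("upload CV via web/mobile application", 0.5),
    ("automatic matching against company postings", 1.0),
    ("push notification to student", 0.1),
]

def cycle_time(process):
    """Total elapsed time of a purely sequential process."""
    return sum(duration for _, duration in process)

if __name__ == "__main__":
    before, after = cycle_time(AS_IS), cycle_time(TO_BE)
    print(f"as-is: {before:.1f} h, to-be: {after:.1f} h, "
          f"reduction: {100 * (1 - after / before):.0f}%")
```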

  11. IDC Re-Engineering Phase 3 Development Plan.

    Energy Technology Data Exchange (ETDEWEB)

    Harris, James M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Burns, John F. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pollock, David L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-01-01

    Sandia National Laboratories has prepared a project development plan that proposes how the parties interested in the IDC Re-Engineering system will coordinate its development, testing and transition to operations.

  12. IDC Re-Engineering Phase 3 Development Plan

    International Nuclear Information System (INIS)

    Harris, James M.; Burns, John F.; Pollock, David L.

    2017-01-01

    Sandia National Laboratories has prepared a project development plan that proposes how the parties interested in the IDC Re-Engineering system will coordinate its development, testing and transition to operations.

  13. Reengineering health care materials management.

    Science.gov (United States)

    Connor, L R

    1998-01-01

    Health care executives across the country, faced with intense competition, are being forced to consider drastic cost cutting measures as a matter of survival. The entire health care industry is under siege from boards of directors, management and others who encourage health care systems to take actions ranging from strategic acquisitions and mergers to simple "downsizing" or "rightsizing," to improve their perceived competitive positions in terms of costs, revenues and market share. In some cases, management is poorly prepared to work within this new competitive paradigm and turns to consultants who promise that following their methodologies can result in competitive advantage. One favored methodology is reengineering. Frequently, cost cutting attention is focused on the materials management budget because it is relatively large and is viewed as being comprised mostly of controllable expenses. Also, materials management is seldom considered a core competency for the health care system and the organization performing these activities does not occupy a strongly defensible position. This paper focuses on the application of a reengineering methodology to healthcare materials management.

  14. IDC Re-Engineering Phase 2 Architecture Document.

    Energy Technology Data Exchange (ETDEWEB)

    Burns, John F.

    2015-12-01

    This document contains a description of the system architecture for the IDC Re-Engineering Phase 2 project. This is a draft version that primarily provides background information for understanding delivered Use Case Realizations.

  15. Distance Measures for Information System Reengineering

    NARCIS (Netherlands)

    Poels, G.; Viaene, S.; Dedene, G.; Wangler, B.; Bergman, L.

    2000-01-01

    We present an approach to assess the magnitude and impact of information system reengineering caused by business process change. This approach is based on two concepts: object-oriented business modeling and distance measurement. The former concept is used to visualize changes in the business layer

  16. Architecture-driven Migration of Legacy Systems to Cloud-enabled Software

    DEFF Research Database (Denmark)

    Ahmad, Aakash; Babar, Muhammad Ali

    2014-01-01

    of legacy systems to cloud computing. The framework leverages the software reengineering concepts that aim to recover the architecture from legacy source code. Then the framework exploits the software evolution concepts to support architecture-driven migration of legacy systems to cloud-based architectures....... The Legacy-to-Cloud Migration Horseshoe comprises of four processes: (i) architecture migration planning, (ii) architecture recovery and consistency, (iii) architecture transformation and (iv) architecture-based development of cloud-enabled software. We aim to discover, document and apply the migration...

  17. The Organizational-Economic Provision of Reengineering of Marketing Activity of Ukrainian Machine-Building Enterprises

    Directory of Open Access Journals (Sweden)

    Kobyzskyi Denys S.

    2018-02-01

    Full Text Available The article is aimed at developing an organizational mechanism for the reengineering of the marketing activities of a machine-building enterprise and at further developing the appropriate methodical recommendations. The meaning and role of organizational structure in the sphere of reengineering are disclosed, and the key aspects and principles of its construction are defined; the key elements, in particular business processes, and their role in organizational structure, as well as the properties of the organizational system, are researched; the content of the basic components of the organizational mechanism, their roles, and the peculiarities of communication between them are analyzed. A new attitude to the principles of construction and to the functional content of the constituents of enterprise organization makes it possible to realize the wide functional potential of organizational possibilities under the terms of reengineering, as well as to form an organizational mechanism for the post-reengineering company. Certain aspects of the development of the organizational mechanism create the preconditions for, and disclose a potential instrumentarium of, effective and efficient methodical recommendations for the reengineering of the marketing activities of Ukrainian machine-building enterprises.

  18. THE PROJECT MANAGEMENT OF INDUSTRIAL BUILDINGS REENGINEERING (RECONSTRUCTION AND COMPLETION)

    Directory of Open Access Journals (Sweden)

    K. Kolesnikova

    2017-06-01

    Full Text Available The creative component of any activity can never fall to zero: the turbulent environment in which the activity is carried out always prevents this. It is this environment that makes each building unique, that is, provides the main basis of the project. When we are talking about complex construction, the share of the creative component becomes very significant. First and foremost, this is explained by the duration of the construction work, during which risk events have time to happen. The article analyses construction processes from the point of view of their conformity to the concept of project activity. It is shown that as the complexity or duration of construction grows, so does the share of creative activities in the overall project. Recently, work on the re-engineering of complex systems, for example building structures, has become more and more widespread. This means repair of the building, but not a simple repair restoring the original elements and their interfaces as designed; rather, it is the partial or full replacement of elements that have failed or become outdated with new ones, which requires a new design of their structures and production technologies, as well as the design of accessories for installation and a technology for reshaping the object. Combining the two above-mentioned factors in the growth of the share of creative activities in the project management of the re-engineering of building structures, complexity and construction time, a cognitive model of this growth is obtained. The concept of "reengineering in construction" is introduced as a combination of the processes of repairing worn buildings and completing unfinished ones. It is proved that any re-engineering in construction is a project activity. The provisions are tested in a real reengineering of industrial buildings with a positive technical and economic effect.

  19. THE PROJECT MANAGEMENT OF INDUSTRIAL BUILDINGS REENGINEERING (RECONSTRUCTION AND COMPLETION)

    Directory of Open Access Journals (Sweden)

    Katerina Kolesikova

    2017-05-01

    Full Text Available The creative component of any activity can never fall to zero: the turbulent environment in which the activity is carried out always prevents this. It is this environment that makes each building unique, that is, provides the main basis of the project. When we are talking about complex construction, the share of the creative component becomes very significant. First and foremost, this is explained by the duration of the construction work, during which risk events have time to happen. The article analyses construction processes from the point of view of their conformity to the concept of project activity. It is shown that as the complexity or duration of construction grows, so does the share of creative activities in the overall project. Recently, work on the re-engineering of complex systems, for example building structures, has become more and more widespread. This means repair of the building, but not a simple repair restoring the original elements and their interfaces as designed; rather, it is the partial or full replacement of elements that have failed or become outdated with new ones, which requires a new design of their structures and production technologies, as well as the design of accessories for installation and a technology for reshaping the object. Combining the two above-mentioned factors in the growth of the share of creative activities in the project management of the re-engineering of building structures, complexity and construction time, a cognitive model of this growth is obtained. The concept of "reengineering in construction" is introduced as a combination of the processes of repairing worn buildings and completing unfinished ones. It is proved that any re-engineering in construction is a project activity. The provisions are tested in a real reengineering of industrial buildings with a positive technical and economic effect.

  20. ROMANIAN COMPANIES DILEMMAS - BUSINESS REENGINEERING OR KAIZEN

    Directory of Open Access Journals (Sweden)

    MIHAELA GHICAJANU

    2011-01-01

    Full Text Available This paper presents an analysis of two management strategies, the American reengineering strategy and the Japanese Kaizen strategy, which can be used successfully by Romanian companies, too. Reengineering is the fundamental rethinking and radical redesign of business processes to achieve dramatic improvements in critical contemporary measures of performance such as cost, quality, service and speed. Kaizen is a philosophy of life addressed to the best, who want to become more and better. It is a process of improvement that never ends, and it yields many advantages. The Japanese leadership model has shown that progress in small but fast steps is reliable and leads to long-term wins. The Kaizen method, implemented in Romania as well, has brought people satisfaction and more money in their pockets.

  1. Impact of peculiar features of construction of transport infrastructure on the choice of tools for reengineering of business processes

    Science.gov (United States)

    Khripko, Elena

    2017-10-01

    In the present article we study the issues of organizational resistance to the reengineering of business processes in the construction of transport infrastructure. Reengineering in a transport-sector company is, first and foremost, an innovative component of business strategy. We analyze the choice of forward and reverse reengineering tools and the terms of their application in connection with organizational resistance. Reengineering is defined taking into account four aspects: fundamentality, radicality, abruptness, and business process. We describe the stages of reengineering and analyze the key requirements for newly created business processes.

  2. Reengineering the project design process

    Science.gov (United States)

    Kane Casani, E.; Metzger, Robert M.

    1995-01-01

    In response to the National Aeronautics and Space Administration's goal of working faster, better, and cheaper, the Jet Propulsion Laboratory (JPL) has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Development Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center (PDC) and the Flight System Testbed (FST). Reengineering at JPL implies a cultural change whereby the character of the Laboratory's design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and more accurate cost estimating. These improvements signal JPL's commitment to meeting the challenges of space exploration in the next century.

  3. Maintaining Quality and Confidence in Open-Source, Evolving Software: Lessons Learned with PFLOTRAN

    Science.gov (United States)

    Frederick, J. M.; Hammond, G. E.

    2017-12-01

    Software evolution in an open-source framework poses a major challenge to a geoscientific simulator, but when properly managed, the pay-off can be enormous for both the developers and the community at large. Developers must juggle implementing new scientific process models, adopting increasingly efficient numerical methods and programming paradigms, and changing funding sources (or a total lack of funding), while also ensuring that legacy code remains functional and reported bugs are fixed in a timely manner. With robust software engineering and a plan for long-term maintenance, a simulator can evolve over time, incorporating and leveraging many advances in the computational and domain sciences. In this positive light, what practices in software engineering and code maintenance can be employed within open-source development to maximize the positive aspects of software evolution and community contributions while minimizing their negative side effects? This presentation discusses steps taken in the development of PFLOTRAN (www.pflotran.org), an open-source, massively parallel subsurface simulator for multiphase, multicomponent, and multiscale reactive flow and transport processes in porous media. As PFLOTRAN's user base and development team continue to grow, it has become increasingly important to implement strategies which ensure sustainable software development while maintaining software quality and community confidence. In this presentation, we share our experiences and "lessons learned" within the context of our open-source development framework and community engagement efforts. Topics discussed include how we've leveraged both standard software engineering principles, such as coding standards, version control, and automated testing, as well as the unique advantages of object-oriented design in process model coupling, to ensure software quality and confidence. We will also be prepared to discuss the major challenges faced by most open-source software teams, such
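    As an illustration of the automated-testing practice the abstract mentions, here is a minimal sketch of a gold-standard regression test; the observable names, tolerances, and the run_simulation entry point are hypothetical stand-ins, not PFLOTRAN's actual test harness:

```python
# Sketch of a regression test that compares simulator observables against
# stored "gold" values, the pattern commonly used to keep legacy code
# functional as an open-source simulator evolves.
import unittest

def run_simulation(input_deck):
    # Stand-in for invoking the simulator; returns a dict of observables.
    return {"tracer_concentration": 0.4213, "pressure": 101325.0}

GOLD = {"tracer_concentration": 0.4213, "pressure": 101325.0}

class RegressionTest(unittest.TestCase):
    def test_against_gold_standard(self):
        result = run_simulation("tracer_1d.in")
        for key, expected in GOLD.items():
            self.assertAlmostEqual(result[key], expected, places=6,
                                   msg=f"regression detected in {key}")

if __name__ == "__main__":
    unittest.main()
```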

  4. IDC Reengineering Phase 2 & 3 Project Scope

    Energy Technology Data Exchange (ETDEWEB)

    Harris, James M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Prescott, Ryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-10-01

    Sandia National Laboratories has prepared a budgetary planning cost estimate for the IDC Reengineering Phase 2 & 3 effort. This report provides the cost estimate and describes the methodology, assumptions, and cost model details used to create it.

  5. A Framework for Process Reengineering in Higher Education: A case study of distance learning exam scheduling and distribution

    Directory of Open Access Journals (Sweden)

    M'hammed Abdous

    2008-10-01

    Full Text Available In this paper, we propose a conceptual and operational framework for process reengineering (PR) in higher education (HE) institutions. Using a case study aimed at streamlining exam scheduling and distribution in a distance learning (DL) unit, we outline a sequential yet non-linear four-step framework designed to reengineer processes. The first two steps of this framework, initiating and analyzing, are used to initiate, document, and flowchart the process targeted for reengineering, and the last two steps, reengineering/implementing and evaluating, are intended to prototype, implement, and evaluate the reengineered process. Our early involvement of all stakeholders and our in-depth analysis and documentation of the existing process allowed us to avoid the traditional pitfalls associated with business process reengineering (BPR). Consequently, the outcome of our case study indicates a streamlined and efficient process with higher faculty satisfaction at a substantially reduced cost.
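    A minimal sketch of the four-step, non-linear framework as a validated state machine; the step names follow the abstract, while the allowed-transition table is an assumption added for illustration:

```python
# The framework is sequential but non-linear: evaluation can feed back
# into analysis or implementation rather than only terminating the cycle.
ALLOWED = {
    "initiating": {"analyzing"},
    "analyzing": {"reengineering/implementing", "initiating"},
    "reengineering/implementing": {"evaluating"},
    "evaluating": {"analyzing", "reengineering/implementing"},
}

def advance(current, proposed):
    """Validate a proposed step transition against the framework."""
    if proposed not in ALLOWED[current]:
        raise ValueError(f"cannot move from {current!r} to {proposed!r}")
    return proposed

state = advance("initiating", "analyzing")
state = advance(state, "reengineering/implementing")
state = advance(state, "evaluating")
state = advance(state, "analyzing")  # evaluation feeds back into analysis
```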

  6. A Longitudinal BPR Study in a Danish Manufacturing Company - From Reengineering to Process Management

    DEFF Research Database (Denmark)

    Larsen, Michael Holm; Wieth, Christian; Domsten, Zenia Vittarp

    1998-01-01

    , Business Process Reengineering (BPR) is the most applied method for planning and carrying out projects. Novo Nordisk A/S is one of the largest companies in Denmark and the world's largest producer of industrial enzymes with a market share of more than 50%.This paper is a longitudinal study of BPR...... initiatives at Enzyme Business carried out within the time frame of January 1994 to March 1998. The paper provides empirical insight from a number of BPR-projects and related BPR-initiatives, e.g. Business System Reengineering projects. The results of the paper suggest that reengineering with the means...

  7. Re-engineering change in higher education

    Directory of Open Access Journals (Sweden)

    David Allen

    1999-01-01

    Full Text Available Business Process Re-engineering (BPR) is being used in a number of UK Higher Education Institutions (HEIs) as a change management strategy. Whilst the focus of these HEIs is on re-engineering administrative services, there are also tentative attempts to redesign teaching and learning. This paper adopts a case study approach to determine the applicability of BPR to HEIs. The research started from a broad research question: How does organisational culture in HEIs impact on the implementation of BPR programmes? The conclusions drawn from the research are that the organisational culture and structure of HEIs limit the degree of change sought from a BPR project: the focus of the case study HEIs was on incremental process improvement of administrative services. The projects in these institutions were not about radical change. BPR techniques are shown to have something to offer HEIs in terms of co-ordinating administrative activities, but the emphasis on IT and processes in project design means the human resources change necessary for significant gains in efficiency is unlikely.

  8. Technologies and problems of reengineering of the business processes of company

    Science.gov (United States)

    Silka, Dmitriy

    2017-10-01

    Managing a company as a combination of business processes is a modern approach in the field of business management. Together with many other management approaches, business processes allow us to identify all the resultant actions. The article presents the modern view of the essence of business processes, as well as general approaches to their identification. Principles of business process construction and re-engineering are proposed. Recommendations on how to perform re-engineering under the highly cyclic dynamics of business activity are provided.

  9. Managing hospital supplies: process reengineering at Gujarat Cancer Research Institute, India.

    Science.gov (United States)

    Ramani, K V

    2006-01-01

    This paper aims to give an overview of the re-engineering of processes and structures at the Gujarat Cancer Research Institute (GCRI), Ahmedabad, through a general review of the design, development and implementation of reengineered systems built to address concerns about the existing systems. GCRI is a comprehensive cancer care center with 550 beds, well equipped with modern diagnostic and treatment facilities; it serves about 200,000 outpatients and 16,000 inpatients annually. The drive toward better management of hospital supplies led to the design, development, and implementation of an IT-based, reengineered and integrated purchase and inventory management system. The new system has given GCRI a saving of about 8 percent of its annual purchase costs and has improved the availability of materials to the user departments. The savings obtained are used not only to buy more hospital supplies but also to buy better-quality supplies, thereby helping GCRI satisfactorily meet its social obligations for cancer care.

  10. Re-engineering software systems in the Department of Defense using integrated computer aided software engineering tools

    OpenAIRE

    Jennings, Charles A.

    1992-01-01

    Approved for public release; distribution is unlimited. The Department of Defense (DoD) is plagued with severe cost overruns and delays in developing software systems. Existing software within the DoD, some of it developed 15 to 20 years ago, requires continual maintenance and modification. Major difficulties arise in maintaining older systems due to cryptic source code and a lack of adequate documentation. To remedy this situation, the DoD is pursuing the integrated computer aided software engi...

  11. Re-engineering of Products and Processes How to Achieve Global Success in the Changing Marketplace

    CERN Document Server

    Rotini, Federico; Cascini, Gaetano

    2012-01-01

    Whilst previous methods for business process re-engineering have focused on time and cost reduction policies to preserve competitive services and products, Re-engineering of Products and Processes: How to Achieve Global Success in the Changing Marketplace presents a new approach which aims to include aspects that impact the customer perceived value. This method supports business re-engineering initiatives by identifying process bottlenecks as well as new products and services available to overcome market competition. This original approach is described step-by-step, explaining the theory through examples of performable tasks and the selection of relevant tools according to the nature of the problem. Supported by illustrations, tables and diagrams, Re-engineering of Products and Processes: How to Achieve Global Success in the Changing Marketplace clearly explains a method which is then applied to several case studies across different industrial sectors. Re-engineering of Products and Processes: How to Achieve...

  12. Conceptual Framework of Business Process Reengineering for Civil ...

    African Journals Online (AJOL)

    Tesfaye Deb

    endorsed Business Process Reengineering (BPR) as a foundation for strengthening Result Based ... Ethiopian government recognized the importance of improving ...... finance process. 4. Project process. (research and consultancy). Low frequency, request arrival is random, time interval between two requests can be very.

  13. Reengineering a PC-based System into the Mobile Device Product Line

    DEFF Research Database (Denmark)

    Zhang, Weishan; Jarzabek, Stanislaw; Loughran, Neil

    2003-01-01

    There is a growing demand to port existing PC-based software systems to mobile device platforms. Systems running on mobile devices share basic characteristics with their PC-based counterparts, but differ from them in details of user interfaces, application models, etc. Systems running on mobile...... devices must also perform well using less memory than PC-based systems. Mobile devices themselves are different from each other in many ways, too. We describe how we made an existing PC-based City Guide System available on a wide range of mobile devices, in a cost-effective way. We applied "reengineering...... into a product line architecture" approach to achieve the goal. Our product line architecture facilitates reuse via generation. We generate specific City Guide Systems for target platforms including PC, Pocket PC and other mobile devices, from generic meta-components that form the City Guide System product line...

  14. Accountability-based reengineering of an order fulfillment process

    NARCIS (Netherlands)

    Zhang, L.; Jiao, J.; Ma, Q.

    2009-01-01

    In view of the dynamic changes in a supply chain network and the significance of order fulfillment processes (OFPs) for the successful implementation of supply chain management, this paper puts forward an accountability-based methodology for companies to reengineer OFPs while considering both

  15. Reengineering of waste management at the Oak Ridge National Laboratory. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Myrick, T.E.

    1997-08-01

    A reengineering evaluation of the waste management program at the Oak Ridge National Laboratory (ORNL) was conducted during the months of February through July 1997. The goal of the reengineering was to identify ways in which the waste management process could be streamlined and improved to reduce costs while maintaining full compliance and customer satisfaction. A Core Team conducted preliminary evaluations and determined that eight particular aspects of the ORNL waste management program warranted focused investigations during the reengineering. The eight areas included Pollution Prevention, Waste Characterization, Waste Certification/Verification, Hazardous/Mixed Waste Stream, Generator/WM Teaming, Reporting/Records, Disposal End Points, and On-Site Treatment/Storage. The Core Team commissioned and assembled Process Teams to conduct in-depth evaluations of each of these eight areas. The Core Team then evaluated the Process Team results and consolidated the 80 process-specific recommendations into 15 overall recommendations. Volume 2 consists of nine appendices which contain the Process Team reports and Benchmarking reports.

  16. Reengineering of waste management at the Oak Ridge National Laboratory. Volume 2

    International Nuclear Information System (INIS)

    Myrick, T.E.

    1997-08-01

    A reengineering evaluation of the waste management program at the Oak Ridge National Laboratory (ORNL) was conducted during the months of February through July 1997. The goal of the reengineering was to identify ways in which the waste management process could be streamlined and improved to reduce costs while maintaining full compliance and customer satisfaction. A Core Team conducted preliminary evaluations and determined that eight particular aspects of the ORNL waste management program warranted focused investigations during the reengineering. The eight areas included Pollution Prevention, Waste Characterization, Waste Certification/Verification, Hazardous/Mixed Waste Stream, Generator/WM Teaming, Reporting/Records, Disposal End Points, and On-Site Treatment/Storage. The Core Team commissioned and assembled Process Teams to conduct in-depth evaluations of each of these eight areas. The Core Team then evaluated the Process Team results and consolidated the 80 process-specific recommendations into 15 overall recommendations. Volume 2 consists of nine appendices which contain the Process Team reports and Benchmarking reports

  17. The complementariness of the business process reengineering and activity-based management

    Directory of Open Access Journals (Sweden)

    Violeta DOMANOVIC

    2010-05-01

    Full Text Available In order to sustain long-term growth and development, an enterprise has to envisage and implement contemporary management innovations altogether. In transition economies, like Serbia, it is of great importance to redesign business processes and activities, and to analyse activity profitability in order to select value-added activities and reduce non-value-added ones. This paper considers the possibility of complementary implementation of business process reengineering and activity-based management in the process of long-term efficiency improvement. Namely, the basic postulate of the business process reengineering concept might be established in the process of activity-based management implementation, and conversely.

  18. Business Process Reengineering, a Crisis Solution or a Necessity

    Directory of Open Access Journals (Sweden)

    Gabriela GHEORGHE

    2012-08-01

    Full Text Available This case study shows that the company decided to implement Business Process Reengineering (BPR) not only because the external environment had changed, but also due to its obsolete business processes and organizational structure. The article will highlight the importance of the organizations' focusing on sub-goals, in order to finally reach the desired result in the organization's main goals. When rapid evolution has become the fundamental contemporary coordinate, reengineering is a form of company innovative reaction in terms of intensifying competition and globalization. Remodeling the company in phases of crisis, when time pressure reduces the type and number of solutions that can be adopted, without effective leadership, can lead in most cases to failure. The effect of redesigning the business processes depends on how well it is implemented, coordinated and monitored.

  19. Business Process Reengineering, a Crises Solution or a Necessity

    Directory of Open Access Journals (Sweden)

    Gabriela GHEORGHE

    2011-11-01

    Full Text Available This case study shows that the company decided to implement Business Process Reengineering (BPR) not only because the external environment had changed, but also due to its obsolete business processes and organizational structure. The article will highlight the importance of the organizations' focusing on sub-goals, in order to finally reach the desired result in the organization's main goals. When rapid evolution has become the fundamental contemporary coordinate, reengineering is a form of company innovative reaction in terms of intensifying competition and globalization. Remodelling the company in phases of crisis, when time pressure reduces the type and number of solutions that can be adopted, without effective leadership, can lead in most cases to failure. The effect of redesigning the business processes depends on how well it is implemented, coordinated and monitored.

  20. Reengineering and health physics within the project Hanford management contract

    International Nuclear Information System (INIS)

    Atencio, E.M.

    1997-01-01

    The impending transition of the Hanford Site management and operations (M&O) contract to a management and integrating (M&I) contract format, together with weak radiological performance assessments by external organizations and reduced financial budgets, prompted the 're-engineering' of the previous Hanford prime contractor Radiological Control (Rad Con) organization. This paper presents the methodology, identified areas of improvement, and results of the re-engineering process. The conversion from the M&O to the M&I contract concept resulted in multiple independent Rad Con organizations reporting to separate major contractors who are managed by an integrating contractor. This brought significant challenges in establishing minimum site standards for sitewide consistency, developing roles and responsibilities, and maintaining site Rad Con goals. Championed by the previous contractor's Rad Con Director, Denny Newland, a five-month planning effort was executed to address the challenges of the M&I and the identified weaknesses. Fluor Daniel Hanford assumed responsibility as integrator of the Project Hanford Management Contract on October 1, 1996. The Fluor Daniel Hanford Radiation Protection Director, Jeff Foster, presents the results of the re-engineering effort, including the significant cost savings, process improvements, field support improvements, and clarification of roles and responsibilities that have been achieved

  1. Outsourcing the development of specific application software using the ESA software engineering standards the SPS software Interlock System

    CERN Document Server

    Denis, B

    1995-01-01

    CERN is considering outsourcing as a solution to staff reductions. The need to re-engineer the SPS Software Interlock System provided an opportunity to explore the applicability of outsourcing to our specific controls environment, and the ESA PSS-05 standards were selected for the requirements specification, the development, the control and monitoring, and the project management. The software produced by the contractor is now fully operational. After outlining the scope and complexity of the project, a discussion of ESA PSS-05 is presented: the choice, the way these standards improve the outsourcing process, the quality they induce, but also the need to adapt them and their limitations in defining the customer-supplier relationship. The success factors and the difficulties of development under contract are also discussed. The maintenance aspect and the impact on in-house developments are finally addressed.

  2. Spatial confidentiality and GIS: re-engineering mortality locations from published maps about Hurricane Katrina

    Directory of Open Access Journals (Sweden)

    Leitner Michael

    2006-10-01

    Full Text Available Abstract Background Geographic Information Systems (GIS) can provide valuable insight into patterns of human activity. Online spatial display applications, such as Google Earth, can democratise this information by disseminating it to the general public. Although this is a generally positive advance for society, there is a legitimate concern involving the disclosure of confidential information through spatial display. Although guidelines exist for aggregated data, little has been written concerning the display of point level information. The concern is that a map containing points representing cases of cancer or an infectious disease could be re-engineered back to identify an actual residence. This risk is investigated using point mortality locations from Hurricane Katrina re-engineered from a map published in the Baton Rouge Advocate newspaper, and a field team validating these residences using search and rescue building markings. Results We show that the residence of an individual, visualized as a generalized point covering approximately one and a half city blocks on a map, can be re-engineered back to identify the actual house location, or at least a close neighbour, even if the map contains little spatial reference information. The degree of re-engineering success is also shown to depend on the urban characteristics of the neighborhood. Conclusion The results in this paper suggest a need to re-evaluate current guidelines for the display of point (address level) data. Examples of other point maps displaying health data extracted from the academic literature are presented where a similar re-engineering approach might cause concern with respect to violating confidentiality. More research is also needed into the role urban structure plays in the accuracy of re-engineering. We suggest that health and spatial scientists should be proactive and suggest a series of point level spatial confidentiality guidelines before governmental decisions are made

  3. Spatial confidentiality and GIS: re-engineering mortality locations from published maps about Hurricane Katrina.

    Science.gov (United States)

    Curtis, Andrew J; Mills, Jacqueline W; Leitner, Michael

    2006-10-10

    Geographic Information Systems (GIS) can provide valuable insight into patterns of human activity. Online spatial display applications, such as Google Earth, can democratise this information by disseminating it to the general public. Although this is a generally positive advance for society, there is a legitimate concern involving the disclosure of confidential information through spatial display. Although guidelines exist for aggregated data, little has been written concerning the display of point level information. The concern is that a map containing points representing cases of cancer or an infectious disease, could be re-engineered back to identify an actual residence. This risk is investigated using point mortality locations from Hurricane Katrina re-engineered from a map published in the Baton Rouge Advocate newspaper, and a field team validating these residences using search and rescue building markings. We show that the residence of an individual, visualized as a generalized point covering approximately one and half city blocks on a map, can be re-engineered back to identify the actual house location, or at least a close neighbour, even if the map contains little spatial reference information. The degree of re-engineering success is also shown to depend on the urban characteristic of the neighborhood. The results in this paper suggest a need to re-evaluate current guidelines for the display of point (address level) data. Examples of other point maps displaying health data extracted from the academic literature are presented where a similar re-engineering approach might cause concern with respect to violating confidentiality. More research is also needed into the role urban structure plays in the accuracy of re-engineering. We suggest that health and spatial scientists should be proactive and suggest a series of point level spatial confidentiality guidelines before governmental decisions are made which may be reactionary toward the threat of revealing
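    To make the re-engineering step concrete, here is a minimal sketch, assumed rather than taken from the paper, of how pixel positions on a published map can be converted to geographic coordinates using an affine transform fitted to landmark reference points; all coordinates below are hypothetical:

```python
# Recover lon/lat from map pixels: fit an affine transform to three
# landmarks identified on the scanned map, then apply it to plotted points.
import numpy as np

# Pixel coordinates of three landmarks on the scanned map ...
pixels = np.array([[120, 340], [880, 310], [450, 760]], dtype=float)
# ... and their known longitude/latitude (hypothetical values).
lonlat = np.array([[-91.19, 30.46], [-91.02, 30.47], [-91.11, 30.38]])

# Solve for the 3x2 affine matrix A such that [x, y, 1] @ A = [lon, lat].
design = np.hstack([pixels, np.ones((3, 1))])
A, *_ = np.linalg.lstsq(design, lonlat, rcond=None)

def pixel_to_lonlat(x, y):
    """Map a pixel position on the published map to geographic coordinates."""
    return np.array([x, y, 1.0]) @ A

# A generalized mortality point plotted on the map re-engineers to:
print(pixel_to_lonlat(500, 500))
```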

  4. The effect of business process reengineering (BPR) on human ...

    African Journals Online (AJOL)

    The effect of business process reengineering (BPR) on human resource management in Addis Ababa City Administration.

  5. THEORETICAL ASPECTS OF REENGINEERING IN SMALL AND MEDIUM ENTERPRISES

    Directory of Open Access Journals (Sweden)

    Slobodan Stefanović

    2014-01-01

    Full Text Available Reengineering is a fundamental rethinking and radical redesign of business processes to achieve dramatic improvements in critical, important measures of performance, such as cost, quality, service and speed. This definition contains four keywords: fundamental, radical, dramatic and processes.

  6. E-learning and the Educational Organizations Structure Reengineering (EOSR)

    Directory of Open Access Journals (Sweden)

    Osama Alshara

    2007-06-01

    Full Text Available There are many calls for innovative learning methods that utilize advanced technologies. However, we will raise fundamental questions that look deep into the future of the educational organization. Can the educational institute survive without adopting learning technologies? Would the educational institute succeed in adopting new learning technologies without changing its organizational structure and processes? We claim that the answer to both questions is no. Our research presents the need for educational institutes to incorporate learning technologies and focuses on the demand for educational organization structure reengineering as a basic requirement for the success of incorporating learning technologies. Our study explores the faculty requirements and the policies and procedures of educational institutes in the UAE. The paper concludes with some discussion of findings from a case study of the need for educational organization structure reengineering as a basic requirement for incorporating learning technologies.

  7. Biocybrid systems and the re-engineering of life

    Science.gov (United States)

    Domingues, Diana; Ferreira da Rocha, Adson; Hamdan, Camila; Augusto, Leci; Miosso, Cristiano Jacques

    2011-03-01

    The re-engineering of life, expanded by perceptual experiences of the sense of presence in Virtual Reality and Augmented Reality, is the theme of our investigation into collaborative practices, confirming that artists' creativity is close to the inventiveness of scientists, with a mutual capacity for the generation of biocybrid systems. We consider the enactive bodily interfaces for human existence co-located in the continuum and symbiotic zone between body and flesh, cyberspace and data, and the hybrid properties of the physical world. That continuum generates a biocybrid zone (bio + cyber + hybrid) and life is reinvented. Results reaffirm the creative reality of the coupled body and its mutual influences with environmental information, enhancing James Gibson's ecological perception theory. The ecosystem of life, in its dynamic relations between humans, animals, plants, landscapes, urban life and objects, raises questions and challenges for artworks and the reengineering of life, discussed in our artworks in technoscience. Finally, we describe an implementation in which the immersion experience is enhanced by the data visualization of biological audio signals and by using wearable miniaturized devices for biofeedback.

  8. A Case Study: Business Process Reengineering at Raymond W. Bliss Army Community Hospital

    Science.gov (United States)

    1997-05-01

    Among the causes listed are: • Inadequate Management of Resistance • Attempting Painless Reengineering • Lack of Understanding About Reengineering. [The remainder of the available record consists of fragments of a DRG procedure cost table (e.g., thyroid and parathyroid procedures) and is not reproduced.]

  9. Testing methodology of embedded software in digital plant protection system

    International Nuclear Information System (INIS)

    Seong, Ah Young; Choi, Bong Joo; Lee, Na Young; Hwang, Il Soon

    2001-01-01

    It is necessary to assure the reliability of software in order to digitalize the RPS (Reactor Protection System). Since an RPS failure can cause fatal damage in accident cases, the system is classified as safety class 1E. Therefore, we propose an effective testing methodology to assure the reliability of the embedded software in the DPPS (Digital Plant Protection System). To test the embedded software in the DPPS effectively, our methodology consists of two steps. The first is a re-engineering step that extracts classes from the structural source program, and the second is a level-of-testing step composed of unit testing, integration testing and system testing. At each testing level, we test the embedded software with test cases selected after a test item identification step. Using this testing methodology, the embedded software can be tested effectively, reducing cost and time.
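    A minimal sketch of the two-level testing idea, assuming hypothetical classes extracted during the re-engineering step; the trip and voting logic below is an invented stand-in for real DPPS code, not the authors' implementation:

```python
# Unit tests exercise each extracted class in isolation; an integration
# test then exercises their combination, mirroring the level-of-testing step.
import unittest

class TripComparator:
    """Extracted class: trips when a sensor reading exceeds its setpoint."""
    def __init__(self, setpoint):
        self.setpoint = setpoint
    def trip(self, reading):
        return reading > self.setpoint

class Voter:
    """Extracted class: 2-out-of-3 voting over channel trip signals."""
    def vote(self, signals):
        return sum(signals) >= 2

class UnitTests(unittest.TestCase):
    def test_comparator(self):
        self.assertTrue(TripComparator(100.0).trip(100.1))
        self.assertFalse(TripComparator(100.0).trip(99.9))

class IntegrationTests(unittest.TestCase):
    def test_voting_over_channels(self):
        channels = [TripComparator(100.0) for _ in range(3)]
        readings = [101.0, 102.0, 95.0]  # two channels exceed the setpoint
        signals = [c.trip(r) for c, r in zip(channels, readings)]
        self.assertTrue(Voter().vote(signals))

if __name__ == "__main__":
    unittest.main()
```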

  10. Reengineering of waste management at the Oak Ridge National Laboratory. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Myrick, T.E.

    1997-08-01

    A reengineering evaluation of the waste management program at the Oak Ridge National Laboratory (ORNL) was conducted during the months of February through July 1997. The goal of the reengineering was to identify ways in which the waste management process could be streamlined and improved to reduce costs while maintaining full compliance and customer satisfaction. A Core Team conducted preliminary evaluations and determined that eight particular aspects of the ORNL waste management program warranted focused investigations during the reengineering. The eight areas included Pollution Prevention, Waste Characterization, Waste Certification/Verification, Hazardous/Mixed Waste Stream, Generator/WM Teaming, Reporting/Records, Disposal End Points, and On-Site Treatment/Storage. The Core Team commissioned and assembled Process Teams to conduct in-depth evaluations of each of these eight areas. The Core Team then evaluated the Process Team results and consolidated the 80 process-specific recommendations into 15 overall recommendations. Benchmarking of a commercial nuclear facility, a commercial research facility, and a DOE research facility was conducted to both validate the efficacy of these findings and seek additional ideas for improvement. The outcome of this evaluation is represented by the 15 final recommendations that are described in this report.

  11. Reengineering of waste management at the Oak Ridge National Laboratory. Volume 1

    International Nuclear Information System (INIS)

    Myrick, T.E.

    1997-08-01

    A reengineering evaluation of the waste management program at the Oak Ridge National Laboratory (ORNL) was conducted during the months of February through July 1997. The goal of the reengineering was to identify ways in which the waste management process could be streamlined and improved to reduce costs while maintaining full compliance and customer satisfaction. A Core Team conducted preliminary evaluations and determined that eight particular aspects of the ORNL waste management program warranted focused investigations during the reengineering. The eight areas included Pollution Prevention, Waste Characterization, Waste Certification/Verification, Hazardous/Mixed Waste Stream, Generator/WM Teaming, Reporting/Records, Disposal End Points, and On-Site Treatment/Storage. The Core Team commissioned and assembled Process Teams to conduct in-depth evaluations of each of these eight areas. The Core Team then evaluated the Process Team results and consolidated the 80 process-specific recommendations into 15 overall recommendations. Benchmarking of a commercial nuclear facility, a commercial research facility, and a DOE research facility was conducted to both validate the efficacy of these findings and seek additional ideas for improvement. The outcome of this evaluation is represented by the 15 final recommendations that are described in this report

  12. IDC Re-Engineering Phase 2 Glossary Version 1.3

    Energy Technology Data Exchange (ETDEWEB)

    Young, Christopher J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Harris, James M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-01-01

    This document contains the glossary of terms used for the IDC Re-Engineering Phase 2 project. This version was created for Iteration E3. The IDC applies automatic processing methods in order to produce, archive, and distribute standard IDC products on behalf of all States Parties.

  13. Educational Process Reengineering and Diffusion of Innovation in Formal Learning Environment

    DEFF Research Database (Denmark)

    Khalid, Md. Saifuddin; Hossain, Mohammad Shahadat; Rongbutsri, Nikorn

    2011-01-01

    administration and evaluation and assessment. Educational environments are flexible and not governed by standard operating procedures, making technology use lithe. The 'theory of diffusion of innovations' is recommended to be integrated to reason about and measure acceptance or rejection of EPR-selected technology......In technology-mediated learning, while the relative advantages of technologies are proven, lack of contextualization, lack of process-centric change, and lack of user-driven change have kept intervention and adoption of educational technologies among individuals and organizations as challenges. Reviewing...... the formal, informal and non-formal learning environments, this study focuses on the formal part. This paper coins the term 'Educational Process Reengineering' (EPR) based on the established concept of 'Business Process Reengineering' (BPR) for process improvement of teaching-learning activities, academic

  14. Business Process Reengineering: A Primer for the Marine Corps' Process Owner

    National Research Council Canada - National Science Library

    Brewster, Rollin

    1997-01-01

    .... Business Process Reengineering (BPR) is a technique used by the private sector to achieve order of magnitude improvements in organizational performance by leveraging information technology to enable the holistic redesign of business processes...

  15. Digital image processing in the nuclear field with ImaWin 5.0

    International Nuclear Information System (INIS)

    Marajofsky, A.; Trafelati, A.A.; Lavagnino, C.E.

    2000-01-01

    ImaWin is a software project designed to cover a broad set of applications of digital image processing in the nuclear field. Since 1994 the system has evolved into a complete tool that has helped to face problems like densitometry calculations, quality control of pellets, deposit administration and surveillance. A neural network kernel and the ImaScript scripting language are included within the package. The open and incremental development of the ImaWin software has allowed easy expansion upon a common re-engineering framework. (author)
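    As an illustration only, since ImaWin's API is not described here, a short numpy sketch of the kind of densitometry calculation mentioned, run on a synthetic image:

```python
# Densitometry sketch: average gray values along horizontal scan lines of
# a (synthetic) radiograph and locate the densest row.
import numpy as np

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(256, 256)).astype(float)  # fake radiograph

def line_density(img, row):
    """Mean gray value along one horizontal scan line."""
    return img[row, :].mean()

profile = [line_density(image, r) for r in range(image.shape[0])]
print(f"peak density at row {int(np.argmax(profile))}")
```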

  16. The Systems Biology Research Tool: evolvable open-source software

    OpenAIRE

    Wright, J; Wagner, A

    2008-01-01

    Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system-level. Results We introduce a free, easy-to-use, open-source, integrated software platform calle...

  17. Development of a testlet generator in re-engineering the Indonesian physics national-exams

    Science.gov (United States)

    Mindyarto, Budi Naini; Mardapi, Djemari; Bastari

    2017-08-01

    The Indonesian physics national exams are end-of-course summative assessments that could be utilized to support assessment for learning in physics education. This paper discusses the development and evaluation of a testlet generator based on a re-engineering of the Indonesian physics national exams. The exam problems were dissected and decomposed into testlets that reveal deeper understanding of the underlying physical concepts by inserting a qualitative question and its scientific-reasoning question. A template-based generator was built to help teachers generate testlet variants that conform better to the development of students' scientific attitudes than the original simple multiple-choice formats. The testlet generator was built using open-source software technologies and was evaluated with a focus on black-box testing, exploring the generator's execution, inputs and outputs. The results showed that the developed testlet generator correctly performed its functions of validating inputs, generating testlet variants, and accommodating polytomous item characteristics.
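    A minimal sketch of template-based testlet generation in the spirit described; the template text, parameter ranges, and field names below are invented for illustration and are not the authors' templates:

```python
# One stem template is instantiated with randomized parameters, and a
# qualitative reasoning question is attached to form a testlet variant.
import random

TEMPLATE = ("A car accelerates uniformly from rest to {v} m/s in {t} s. "
            "What is its acceleration?")
REASONING = ("Explain, without calculating, how the acceleration would "
             "change if the same speed were reached in twice the time.")

def generate_testlet(seed):
    rng = random.Random(seed)
    v, t = rng.choice([10, 20, 30]), rng.choice([2, 4, 5])
    return {
        "stem": TEMPLATE.format(v=v, t=t),
        "answer_m_s2": v / t,
        "reasoning_item": REASONING,
    }

for s in range(3):  # three testlet variants from one template
    print(generate_testlet(s)["stem"])
```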

  18. Software architecture evolution

    DEFF Research Database (Denmark)

    Barais, Olivier; Le Meur, Anne-Francoise; Duchien, Laurence

    2008-01-01

    Software architectures must frequently evolve to cope with changing requirements, and this evolution often implies integrating new concerns. Unfortunately, when the new concerns are crosscutting, existing architecture description languages provide little or no support for this kind of evolution....... The software architect must modify multiple elements of the architecture manually, which risks introducing inconsistencies. This chapter provides an overview, comparison and detailed treatment of the various state-of-the-art approaches to describing and evolving software architectures. Furthermore, we discuss...... one particular framework named TranSAT, which addresses the above problems of software architecture evolution. TranSAT provides a new element in the software architecture description language, called an architectural aspect, for describing new concerns and their integration into an existing

  19. Reshaping the Enterprise through an Information Architecture and Process Reengineering.

    Science.gov (United States)

    Laudato, Nicholas C.; DeSantis, Dennis J.

    1995-01-01

    The approach used by the University of Pittsburgh (Pennsylvania) in designing a campus-wide information architecture and a framework for reengineering the business process included building consensus on a general philosophy for information systems, using pattern-based abstraction techniques, applying data modeling and application prototyping, and…

  20. As Easy as ABC: Re-engineering the Cost Accounting System.

    Science.gov (United States)

    Trussel, John M.; Bitner, Larry N.

    1996-01-01

    To be useful for management decision making, the college or university's cost accounting system must capture and measure improvements. Activity-based costing (ABC), which determines more accurately the full costs of services and products, tracks improvements and should proceed alongside reengineering of institutional accounting. Guidelines are…

  1. Sculpting carbon bonds for allotropic transformation through solid-state re-engineering of sp2 carbon

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Hyun Young; Araujo, Paulo T.; Kim, Young Lae; Jung, Sung Mi; Jia, Xiaoting; Hong, Sanghyun; Ahn, Chi Won; Kong, Jing; Dresselhaus, Mildred S.; Kar, Swastik; Jung, Yung Joon

    2014-09-15

    Carbon forms one of nature’s strongest chemical bonds, and its allotropes have provided some of the most exciting scientific discoveries in recent times. The possibility of inter-allotropic transformations/hybridization of carbon is hence a topic of immense fundamental and technological interest. Such modifications usually require extreme conditions (high temperature, pressure and/or high-energy irradiations) and are usually not well controlled. Here we demonstrate inter-allotropic transformations/hybridizations of specific types that appear uniformly across large-area carbon networks, using moderate alternating voltage pulses. By controlling the pulse magnitude, small-diameter single-walled carbon nanotubes can be transformed predominantly into larger-diameter single-walled carbon nanotubes, multi-walled carbon nanotubes of different morphologies, multi-layered graphene nanoribbons or structures with sp3 bonds. This re-engineering of carbon bonds evolves via a coalescence-induced reconfiguration of sp2 hybridization, terminates with negligible introduction of defects and demonstrates remarkable reproducibility. This represents a potential step forward for large-scale engineering of nanocarbon allotropes and their junctions.

  2. Design implications for task-specific search utilities for retrieval and re-engineering of code

    Science.gov (United States)

    Iqbal, Rahat; Grzywaczewski, Adam; Halloran, John; Doctor, Faiyaz; Iqbal, Kashif

    2017-05-01

    The importance of information retrieval systems is unquestionable in modern society, and both individuals and enterprises recognise the benefits of being able to find information effectively. Current code-focused information retrieval systems such as Google Code Search, Codeplex or Koders produce results based on specific keywords. However, these systems do not take into account developers' context, such as development language, technology framework, goal of the project, project complexity and the developer's domain expertise. They also impose an additional cognitive burden on users in switching between different interfaces and clicking through to find the relevant code. Hence, they are not used by software developers. In this paper, we discuss how software engineers interact with information and general-purpose information retrieval systems (e.g. Google, Yahoo!) and investigate to what extent domain-specific search and recommendation utilities can be developed in order to support their work-related activities. In order to investigate this, we conducted a user study and found that software engineers followed many identifiable and repeatable work tasks and behaviours. These behaviours can be used to develop implicit relevance feedback-based systems based on the observed retention actions. Moreover, we discuss the implications for the development of task-specific search and collaborative recommendation utilities embedded with the Google standard search engine and Microsoft IntelliSense for retrieval and re-engineering of code. Based on implicit relevance feedback, we have implemented a prototype of the proposed collaborative recommendation system, which was evaluated in a controlled environment simulating the real-world situation of professional software engineers. The evaluation has achieved promising initial results on the precision and recall performance of the system.
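
    The notion of inferring relevance from observed retention actions can be sketched as follows; the action types and weights below are illustrative assumptions, not values from the study, which would learn such signals from real developer behaviour:

        from collections import defaultdict

        # Hypothetical weights for implicit-feedback signals ("retention
        # actions"): stronger actions suggest the snippet was genuinely useful.
        ACTION_WEIGHTS = {
            "viewed": 0.1,       # opened the result
            "dwelled": 0.3,      # stayed on it beyond a time threshold
            "copied": 0.8,       # copied code into the editor
            "bookmarked": 1.0,   # explicitly retained it
        }

        def score_snippets(events):
            """Aggregate (snippet_id, action) events into relevance scores
            usable for re-ranking code search results."""
            scores = defaultdict(float)
            for snippet_id, action in events:
                scores[snippet_id] += ACTION_WEIGHTS.get(action, 0.0)
            return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

        events = [("s1", "viewed"), ("s1", "copied"), ("s2", "viewed"),
                  ("s3", "viewed"), ("s3", "dwelled"), ("s1", "bookmarked")]
        print(score_snippets(events))   # s1 first: strongest retention actions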

  3. Reengineering of the i4 workflow engine

    OpenAIRE

    Likar, Tilen

    2013-01-01

    I4 is an enterprise resource planning system which allows businesses to manage their processes. Due to increasing demands for managing complex processes and adjusting those processes to global standards, a renewal of part of the system was required. In this thesis we address the reengineering of the workflow engine and the corresponding data model. We designed a business process diagram in Bizagi Process Modeler. The import to i4 and the export from i4 were developed on an XPDL file exported from the mo...

  4. Marketing Cooperatives' Re-engineering: Influences among Organizational Attributes, Strategic Attributes & Performance

    NARCIS (Netherlands)

    Benos, T.; Kalogeras, N.; Verhees, F.J.H.M.; Pennings, J.M.E.

    2009-01-01

    ABSTRACT In this paper we expand the agribusiness co-op literature by studying the re-engineering process of marketing cooperatives (co-ops). More specifically we discuss and empirically examine organizational innovations adopted by marketing co-ops in Greece. We hypothesize three types of

  5. Re-Engineering a High Performance Electrical Series Elastic Actuator for Low-Cost Industrial Applications

    Directory of Open Access Journals (Sweden)

    Kenan Isik

    2017-01-01

    Cost is an important consideration when transferring a technology from research to industrial and educational use. In this paper, we introduce the design of an industrial-grade series elastic actuator (SEA), performed via re-engineering a research-grade version of it. Cost-constrained design requires careful consideration of the key performance parameters for an optimal performance-to-cost component selection. To optimize the performance of the new design, we started by matching the capabilities of a high-performance SEA while cutting down its production cost significantly. Our posit was that performing a re-engineering design process on an existing high-end device would significantly reduce the cost without compromising the performance drastically. As a case study of design for manufacturability, we selected the University of Texas Series Elastic Actuator (UT-SEA), a high-performance SEA, for its high power density, compact design, high efficiency and high speed properties. We partnered with an industrial corporation in China to research the best pricing options and to exploit the retail and production facilities provided by the Shenzhen region. We succeeded in producing a low-cost industrial-grade actuator at one-third of the cost of the original device by re-engineering the UT-SEA with commercial off-the-shelf components and reducing the number of custom-made parts. Subsequently, we conducted performance tests to demonstrate that the re-engineered product achieves the same high-performance specifications found in the original device. With this paper, we aim to raise awareness in the robotics community of the possibility of low-cost realization of low-volume, high-performance, industrial-grade research and education hardware.

  6. BUSINESS PROCESS REENGINEERING: CONCEPTS CAUSES AND EFFECT

    Directory of Open Access Journals (Sweden)

    Bernardo Nugroho Yahya

    2002-01-01

    Some people hold mistaken ideas about Business Process Reengineering (BPR), and the term itself is often misunderstood. In response, many studies have been introduced to establish a better definition of BPR. Thinking through the concepts, causes, and effects of BPR yields a new perception of the term as a methodology in its own right, alongside other quality management methodologies such as Total Quality Management (TQM) and Just In Time (JIT). This paper discusses the context of BPR in several published case studies.

  7. Downsizing, reengineering and patient safety: numbers, newness and resultant risk.

    Science.gov (United States)

    Knox, G E; Kelley, M; Hodgson, S; Simpson, K R; Carrier, L; Berry, D

    1999-01-01

    Downsizing and reengineering are facts of life in contemporary healthcare organizations. In most instances, these organizational changes are undertaken in an attempt to increase productivity or cut operational costs, with results measured in those terms. Less often considered are the potential detrimental effects on patient safety, or the strategies that might be used to minimize these risks.

  8. A business process modeling experience in a complex information system re-engineering.

    Science.gov (United States)

    Bernonville, Stéphanie; Vantourout, Corinne; Fendeler, Geneviève; Beuscart, Régis

    2013-01-01

    This article aims to share a business process modeling experience in a re-engineering project of a medical records department in a 2,965-bed hospital. It presents the modeling strategy, an extract of the results and the feedback experience.

  9. Re-Engineering Complex Legacy Systems at NASA

    Science.gov (United States)

    Ruszkowski, James; Meshkat, Leila

    2010-01-01

    The Flight Production Process (FPP) Re-engineering project has established a Model-Based Systems Engineering (MBSE) methodology and the technological infrastructure for the design and development of a reference, product-line architecture as well as an integrated workflow model for the Mission Operations System (MOS) for human space exploration missions at NASA Johnson Space Center. The design and architectural artifacts have been developed based on the expertise and knowledge of numerous Subject Matter Experts (SMEs). The technological infrastructure developed by the FPP Re-engineering project has enabled the structured collection and integration of this knowledge and further provides simulation and analysis capabilities for optimization purposes. A key strength of this strategy has been the judicious combination of COTS products with custom coding. The lean management approach that has led to the success of this project is based on having a strong vision for the whole lifecycle of the project and its progress over time, a goal-based design and development approach, a small team of highly specialized people in areas that are critical to the project, and an interactive approach for infusing new technologies into existing processes. This project, which has had a relatively small amount of funding, is on the cutting edge with respect to the utilization of model-based design and systems engineering. An overarching challenge that was overcome by this project was to convince upper management of the needs and merits of giving up more conventional design methodologies (such as paper-based documents and unwieldy and unstructured flow diagrams and schedules) in favor of advanced model-based systems engineering approaches.

  10. Re-engineering caused by ISO-9000 certification

    DEFF Research Database (Denmark)

    Hvam, Lars; Nielsen, Anders Paarup; Bjarnø, Ole-Christian

    1997-01-01

    Based on a project performed at a medium-sized producer of medical utensils, reviews some of the problems which the company experienced in connection with the system built up during ISO 9001 certification, and the re-engineering efforts which were performed in order to relieve these problems. Focuses in particular on a re-structuring of the company’s system for production documentation and its relation to the traceability of their products. This system was radically altered during the project without the traceability requirements being violated or reduced. These changes resulted in a marked increase in productivity.

  11. Business Process Elicitation, Modeling, and Reengineering: Teaching and Learning with Simulated Environments

    Science.gov (United States)

    Jeyaraj, Anand

    2010-01-01

    The design of enterprise information systems requires students to master technical skills for elicitation, modeling, and reengineering business processes as well as soft skills for information gathering and communication. These tacit skills and behaviors cannot be effectively taught to students but rather must be experienced and learned by them. This…

  12. Gaming and simulation for transforming and reengineering government : Towards a research agenda

    NARCIS (Netherlands)

    Janssen, M.F.W.H.A.; Klievink, B.

    2010-01-01

    Purpose – In the process of transformation, governments have to deal with a host of stakeholders and complex organizational and technical issues. In this viewpoint paper, an argument is made in favour of using gaming and simulation as tools designed to aid the transformation and reengineering of

  13. Software metrics: Software quality metrics for distributed systems [reliability engineering]

    Science.gov (United States)

    Post, J. V.

    1981-01-01

    Software quality metrics were extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.

  14. Revisiting software ecosystems research

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    2016-01-01

    ‘Software ecosystems’ is argued to have first appeared as a concept more than 10 years ago, and software ecosystem research started to take off in 2010. We conduct a systematic literature study, based on the most extensive literature review in the field to date, with two primary aims: (a) to provide an updated overview of the field and (b) to document evolution in the field. In total, we analyze 231 papers from 2007 until 2014 and provide an overview of the research in software ecosystems. Our analysis reveals a field that is rapidly growing both in volume and empirical focus while becoming more mature… from evolving. We propose means for future research and the community to address them. Finally, our analysis shapes the view of the field as having evolved outside the existing definitions of software ecosystems, and we thus propose an update of the definition of software ecosystems…

  15. Reengineering the Innovation Culture through Social media Crowdsourcing

    DEFF Research Database (Denmark)

    Scupola, Ada; Nicolajsen, Hanne Westh

    2012-01-01

    In this article we investigate how social media-based crowdsourcing systems can be used to reengineer the innovation culture in an organization. Based on a case study of a large engineering consultancy’s use of a social media crowdsourcing system, we investigate the impact on the organization’s innovation culture using theory on organizational culture and crowdsourcing. The analysis shows that the organizational crowdsourcing event has supported an innovation culture change in the case company towards a more inclusive approach to innovation, creating a new and different awareness of innovation…

  17. Improving Software Engineering on NASA Projects

    Science.gov (United States)

    Crumbley, Tim; Kelly, John C.

    2010-01-01

    Software Engineering Initiative: reduces the risk of software failure and increases mission safety; produces more predictable software cost estimates and delivery schedules; makes the agency a smarter buyer of contracted-out software; finds and removes more defects earlier; reduces duplication of effort between projects; and increases the ability to meet the challenges of evolving software technology.

  18. RE-ENGINEERING PRIMARY HEALTHCARE NURSING AS A FIRST CAREER CHOICE.

    Science.gov (United States)

    Wheeler, Emily; Govan, Linda

    2016-08-01

    In line with international models and critical to the primary healthcare nursing workforce, the Australian Primary Health Care Nursing Association (APNA) has been funded by the Commonwealth Department of Health to develop an Education and Career Framework and Toolkit for primary healthcare nurses. The aim of the project is to improve the recruitment and retention of nurses and to re-engineer primary healthcare as a first choice career option.

  19. Re-engineering the urban drainage system for resource recovery and protection of drinking water supplies.

    Science.gov (United States)

    Gumbo, B

    2000-01-01

    The Harare metropolis in Zimbabwe, extending upstream from Manyame Dam in the Upper Manyame River Basin, consists of the City of Harare and its satellite towns: Chitungwiza, Norton, Epworth and Ruwa. The existing urban drainage system is typically a single-use-mixing system: water is used and discharged to "waste", excreta are flushed to sewers and eventually, after "treatment", the effluent is discharged to a drinking water supply source. Polluted urban storm water is evacuated as fast as possible. This system not only ignores the substantial value in "waste" materials, but it also exports problems to downstream communities and to vulnerable fresh-water sources. The question is how the Harare metropolis urban drainage system, which is complex and has evolved over time, can be rearranged to achieve sustainability (i.e. water conservation, pollution prevention at source, protection of the vulnerable drinking water sources and recovery of valuable materials). This paper reviews current concepts regarding the future development of the urban drainage system in line with the new vision of "Sustainable Cities of the Future". The Harare metropolis in Zimbabwe is taken as a case, and philosophical options for re-engineering the drainage system are discussed.

  20. Managing Software Process Evolution

    DEFF Research Database (Denmark)

    This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation…

  1. Software that meets its Intent

    NARCIS (Netherlands)

    Huisman, Marieke; Bos, Herbert; Brinkkemper, Sjaak; van Deursen, Arie; Groote, Jan Friso; Lago, Patricia; van de Pol, Jaco; Visser, Eelco; Margaria, Tiziana; Steffen, Bernhard

    2016-01-01

    Software is widely used, and society increasingly depends on its reliability. However, software has become so complex and it evolves so quickly that we fail to keep it under control. Therefore, we propose intents: fundamental laws that capture a software system’s intended behavior (resilient,

  2. How can usability measurement affect the re-engineering process of clinical software procedures?

    Science.gov (United States)

    Terazzi, A; Giordano, A; Minuco, G

    1998-01-01

    As a consequence of the dramatic improvements achieved in information technology standards in terms of single hardware and software components, evaluation efforts have been focused on the assessment of critical human factors, such as work-flow organisation, man-machine interaction and, in general, quality of use, or usability. This trend is particularly relevant in medical informatics, since the human component is the basis of the information processing system in the health care context. With the aim of establishing an action-research project on the evaluation and assessment of the clinical software procedures which constitute an integrated hospital information system, the authors adopted this strategy and made the measurement of perceived usability one of the main goals of the project itself; the paper reports the results of this experience.
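
    The record does not name the usability instrument used; the System Usability Scale (SUS) is a common choice for measuring perceived usability, and its standard scoring rule is sketched below as one plausible reading of "measurement of perceived usability":

        def sus_score(responses):
            """System Usability Scale score from ten 1-5 Likert responses:
            odd items contribute (response - 1), even items (5 - response);
            the sum is scaled by 2.5 onto a 0-100 range."""
            if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
                raise ValueError("SUS needs ten responses in the range 1-5")
            total = sum(r - 1 if i % 2 == 0 else 5 - r   # i=0 is item 1 (odd)
                        for i, r in enumerate(responses))
            return total * 2.5

        # Example: a fairly positive questionnaire scores above 68, the
        # commonly cited average/acceptability benchmark.
        print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))   # -> 85.0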

  3. Smartphone-based biosensing platform evolution: implementation of electrochemical analysis capabilities

    DEFF Research Database (Denmark)

    Patou, François; Dimaki, Maria; Svendsen, Winnie Edith

    2016-01-01

    Lab-on-Chip technologies offer great opportunities for the democratization of in-vitro medical diagnostics to the consumer market. Despite the limitations set by the strict instrumentation and control requirements of certain families of these devices, new solutions are emerging. Smartphones now routinely demonstrate their potential as an interface of choice for operating complex, instrumented Lab-on-Chips. The sporadic nature of home-based in-vitro medical diagnostics testing calls for the development of systems capable of evolving with new applications or new technologies for Lab-on-Chip devices. We present in this work how we evolved the first generation of a smartphone/Lab-on-Chip platform designed for evolvability. We demonstrate how reengineering efforts can be confined to the mobile-software layer and illustrate some of the benefits of building evolvable systems. We implement…

  4. Refactoring, reengineering and evolution: paths to Geant4 uncertainty quantification and performance improvement

    International Nuclear Information System (INIS)

    Batič, M; Hoff, G; Pia, M G; Saracco, P; Begalli, M; Han, M; Kim, C H; Seo, H; Hauf, S; Kuster, M; Weidenspointner, G; Zoglauer, A

    2012-01-01

    Ongoing investigations for the improvement of Geant4 accuracy and computational performance resulting by refactoring and reengineering parts of the code are discussed. Issues in refactoring that are specific to the domain of physics simulation are identified and their impact is elucidated. Preliminary quantitative results are reported.

  5. IDC reengineering Phase 2 & 3 US industry standard cost estimate summary

    Energy Technology Data Exchange (ETDEWEB)

    Harris, James M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Huelskamp, Robert M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    Sandia National Laboratories has prepared a ROM cost estimate for budgetary planning for the IDC Reengineering Phase 2 & 3 effort, using a commercial software cost estimation tool calibrated to US industry performance parameters. This is not a cost estimate for Sandia to perform the project. This report provides the ROM cost estimate and describes the methodology, assumptions, and cost model details used to create the ROM cost estimate. ROM Cost Estimate Disclaimer Contained herein is a Rough Order of Magnitude (ROM) cost estimate that has been provided to enable initial planning for this proposed project. This ROM cost estimate is submitted to facilitate informal discussions in relation to this project and is NOT intended to commit Sandia National Laboratories (Sandia) or its resources. Furthermore, as a Federally Funded Research and Development Center (FFRDC), Sandia must be compliant with the Anti-Deficiency Act and operate on a full-cost recovery basis. Therefore, while Sandia, in conjunction with the Sponsor, will use best judgment to execute work and to address the highest risks and most important issues in order to effectively manage within cost constraints, this ROM estimate and any subsequent approved cost estimates are on a 'full-cost recovery' basis. Thus, work can neither commence nor continue unless adequate funding has been accepted and certified by DOE.
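
    The record does not name the commercial estimation tool, but parametric software cost models generally share the same shape. As a hedged illustration, the sketch below uses the published COCOMO II post-architecture form with its standard calibration constants; the inputs are invented, not the IDC project's actual drivers:

        from math import prod

        A, B = 2.94, 0.91   # COCOMO II calibration constants

        def cocomo2_effort(ksloc, scale_factors, effort_multipliers):
            """COCOMO II post-architecture effort in person-months:
            PM = A * KSLOC**E * product(EM), with E = B + 0.01 * sum(SF)."""
            e = B + 0.01 * sum(scale_factors)
            return A * ksloc ** e * prod(effort_multipliers)

        # Illustrative inputs only: the five scale factors at their nominal
        # ratings and a few near-nominal effort multipliers; a real estimate
        # rates each cost driver from calibrated industry tables.
        pm = cocomo2_effort(ksloc=120,
                            scale_factors=[3.72, 3.04, 4.24, 3.29, 4.68],
                            effort_multipliers=[1.0, 1.10, 0.91, 1.0])
        print(f"estimated effort: {pm:.0f} person-months")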

  6. Greek Co-ops' Re-Engineering: Exploring the Influences among Organizational Attributes, Strategic Attributes, and Performance

    NARCIS (Netherlands)

    Benos, T.; Kalogeras, N.; Verhees, F.J.H.M.

    2007-01-01

    Abstract We develop an actual classification entailing traditional vs. reengineered cooperative organizational attributes. Using this classification, we conceptualize and empirically investigate three types of relationships: a) organizational (i.e., collective ownership, control and cost/benefit

  7. The role of business process reengineering in health care.

    Science.gov (United States)

    Kohn, D

    1994-02-01

    Business process reengineering (BPR) is a management philosophy capturing attention in health care. It combines some new, old, and recycled management philosophies, and, more often than not, is yielding positive results. BPR's emphasis is on the streamlining of cross-functional processes to significantly reduce time and/or cost, increase revenue, improve quality and service, and reduce risk. Therefore, it has many applications in health care. This article provides an introduction to the concept of BPR, including the definition of BPR, its origin, its champions, and factors for its success.

  8. Reforms in Education: The Need for Re-Engineering Teacher Education for Sustainable Development

    Science.gov (United States)

    Ofoego, O. C.; Ebebe, I. E.

    2016-01-01

    The paper is concerned with reforms in Education and the need for re-engineering Teacher education in Nigeria for better professionalism and National Development. In the process, key concepts like Teacher Education and professionalism were explained. A brief review of the state of Teacher Education and Development in Nigeria revealed the…

  9. WSC-07: Evolving the Web Services Challenge

    NARCIS (Netherlands)

    Blake, M. Brian; Cheung, William K.W.; Jaeger, Michael C.; Wombacher, Andreas

    Service-oriented architecture (SOA) is an evolving architectural paradigm where businesses can expose their capabilities as modular, network-accessible software services. By decomposing capabilities into modular services, organizations can share their offerings at multiple levels of granularity

  10. Reengineering the picture archiving and communication system (PACS) process for digital imaging networks PACS.

    Science.gov (United States)

    Horton, M C; Lewis, T E; Kinsey, T V

    1999-05-01

    Prior to June 1997, military picture archiving and communications systems (PACS) were planned, procured, and installed with key decisions on the system, equipment, and even funding sources made through a research and development office called Medical Diagnostic Imaging Systems (MDIS). Beginning in June 1997, the Joint Imaging Technology Project Office (JITPO) initiated a collaborative and consultative process for planning and implementing PACS in military treatment facilities through a new Department of Defense (DoD) contract vehicle called digital imaging networks (DIN)-PACS. The JITPO reengineered this process, incorporating multiple organizations and politics. The reengineered PACS process administered through the JITPO transformed the decision process and accountability from a single office to a consultative method that increased end-user knowledge, responsibility, and ownership in PACS. The JITPO continues to provide information and services that assist multiple groups and users in rendering PACS planning and implementation decisions. Local site project managers are involved from the outset, and this end-user collaboration has made the sometimes difficult transition to PACS an easier and more acceptable process for all involved. Corporately, this process saved DoD sites millions by having PACS plans developed within the government first and then having vendors respond specifically to those plans. The integrity and efficiency of the process have reduced the opportunity for implementing nonstandard systems while sharing resources and reducing wasted government dollars. This presentation will describe the chronology of changes, encountered obstacles, and lessons learned within the reengineering of the PACS process for DIN-PACS.

  11. A systematic approach for component-based software development

    NARCIS (Netherlands)

    Guareis de farias, Cléver; van Sinderen, Marten J.; Ferreira Pires, Luis

    2000-01-01

    Component-based software development enables the construction of software artefacts by assembling prefabricated, configurable and independently evolving building blocks, called software components. This paper presents an approach for the development of component-based software artefacts. This

  12. Re-Engineering Control Systems using Automatic Generation Tools and Process Simulation: the LHC Water Cooling Case

    CERN Document Server

    Booth, W; Bradu, B; Gomez Palacin, L; Quilichini, M; Willeman, D

    2014-01-01

    This paper presents the approach used at CERN (European Organization for Nuclear Research) to perform the re-engineering of the control systems dedicated to the LHC (Large Hadron Collider) water cooling systems.

  13. The Impact of the Dimensions of the Administrative Decision Support Systems on the Re-engineering of the Systems of the Palestinian universities in Gaza Strip from the Employees’ Perspective

    Directory of Open Access Journals (Sweden)

    Mazen Jehad I. Al Shobaki

    2017-08-01

    This study aimed to identify the impact of the dimensions of administrative decision support systems on the re-engineering of the systems of the Palestinian universities in Gaza Strip from the standpoint of employees. A descriptive approach was used, through which a questionnaire was developed and distributed to a stratified random sample: 500 questionnaires were distributed and 449 were returned, an 89.8% response rate. The study revealed these results: There was an effect of the potentials (physical, human, technical, and organizational design) available for the decision support systems on the re-engineering of the systems in the Palestinian higher education institutions in Gaza Strip. There were significant differences between the assessment means of the study sample about the impact of decision support systems on re-engineering the systems in the Palestinian higher education institutions in Gaza Strip due to the gender variable, in favor of males. There were also differences due to the university variable, in favor of the Islamic University, Al Azhar University, and Al Aqsa University, respectively. It was recommended that Palestinian higher education institutions which intend to start re-engineering their systems should be encouraged to begin the process immediately. These institutions should also develop the infrastructure of the decision support systems when re-engineering their operations. Keywords: Decision support systems, Re-engineering, Palestinian higher education institutions.

  14. Applying object technology principles to business reengineering in the oil, gas, and petrochemical industries

    International Nuclear Information System (INIS)

    Davis, J.M.

    1996-01-01

    The oil, gas, and petrochemical industries face a dilemma: to be financially competitive while complying with strict and expanding environmental, safety, and health regulation. Companies need new tools and techniques, indeed a completely new paradigm for organizing and performing work. They must build efficient and flexible business processes, ones that rely on advanced information systems for improved decision making and productivity. And they must adopt a culture of change and improvement to permit the business to change as the business climate changes. Fortunately, two industry developments are changing the traditional business paradigm in a dramatic way: business reengineering and object technology. Applying principles of object technology in the performance of business reengineering makes available a new form of business modeling that transforms the technique of modeling a business while directly supporting the development of its enabling information systems. This modeling technique is called Object Modeling and is becoming an important force in improving business competitiveness.

  15. An IoT Knowledge Reengineering Framework for Semantic Knowledge Analytics for BI-Services

    Directory of Open Access Journals (Sweden)

    Nilamadhab Mishra

    2015-01-01

    In a progressive business intelligence (BI) environment, IoT knowledge analytics are becoming an increasingly challenging problem: knowledge context scenarios change rapidly, and data production scales grow with business requirements, ultimately transforming a working knowledge base into a superseded state. Such a superseded knowledge base lacks adequate knowledge context scenarios, and its semantics, rules, frames, and ontology contents may not meet the latest requirements of contemporary BI services. Reengineering a superseded knowledge base into a renovated knowledge base system can therefore yield greater business value and is more cost-effective and feasible than standardising a new system for the same purpose. Thus, in this work, we propose an IoT knowledge reengineering (IKR) framework for implementation in a neurofuzzy system to build, organise, and reuse knowledge to provide BI services to the things (man, machines, places, and processes) involved in business through the network of IoT objects. The analysis and discussion show that the IKR framework can be well suited to creating improved anticipation in IoT-driven BI applications.

  16. The Clean Development Mechanism Re-engineered

    DEFF Research Database (Denmark)

    Lütken, Søren

    2016-01-01

    …for engineering such a mechanism, or indeed reengineering the CDM itself, to make it a viable mitigation financing tool, providing receipts for payments in the form of certified emission reductions (CERs). Two solutions are presented, both of which secure new financing for projects that deliver real and measurable emissions-reduction benefits on the basis of prospective revenues from emissions reduction: one introduces up-front securitization of the emissions reductions; the other builds on a defined value of the CERs without the need for a carbon price or a market for trading. Most of us use simple heuristics… time. Simply put, CERs are not project finance and do not address project capital needs when most needed: upfront. CER-based returns are available only after a project is operational. That is why only one third of registered CDM projects went as far as to get their carefully calculated CERs issued…

  17. PROPOSED BUSINESS PROCESS IMPROVEMENT USING THE BUSINESS PROCESS REENGINEERING CONCEPT (CASE STUDY: PERMATA GUEST HOUSE)

    Directory of Open Access Journals (Sweden)

    Bhaswara Adhitya Wardhana

    2013-04-01

    Permata Guest House Semarang is a business operating in lodging services. The rapid growth of Permata Guest House's business scale has not been accompanied by adequate structuring and management of its business processes, which has led to numerous complaints from stakeholders, namely customers and internal staff. The Key Performance Indicators (KPIs) used in this study of Permata Guest House are derived from Critical Success Factors (CSFs). Performance was measured against performance indicators defined during the initial observation, together with the measures and targets set for each indicator. The initial observation found gaps of 20% in the work-morale and loyalty indicator, 16.67% in hospitality complaints, 16.67% in appearance and attitude complaints, 30% in the marketing-sales occupancy rate, and 77.78% in service satisfaction. The analysis of performance measurement and business processes identified the factors that kept business-process performance below target; one way to improve performance is to redesign the business processes using the Business Process Reengineering (BPR) method. The BPR results call for rethinking, redesigning, and retooling the business performance model. The outcomes of the reengineering include standardized proposed business processes, a company vision and mission statement, a redesigned organizational structure with job descriptions, and Standard Operating Procedures. Keywords: critical success factor, key performance indicator, business process reengineering

  18. Proposal of an Embedded Methodology that uses Organizational Diagnosis and Reengineering: Case of bamboo panel company

    Directory of Open Access Journals (Sweden)

    Eva Selene Hernández Gress

    2017-08-01

    This work is an extension of the Proceedings of the International Conference on Industrial Engineering, Management Science and Applications, which presented some of the phases of Reengineering applied to a bamboo panel company; the results were strategic planning, systemic diagnosis, and performance indicators through the Balanced Scorecard. The main purpose of this article is to present a methodology embedding Organizational Diagnosis and Reengineering, one that emphasizes the incorporation of culture, context, management style, and knowledge, as well as inner and outer actors. The results of the proposed methodology applied to the case study, up to the moment of the writing of this article, are included. Future work consists of the development of strategies for innovation, as a strategy planned in the Balanced Scorecard and derived from the embedded methodology.

  19. Core Flight Software

    Data.gov (United States)

    National Aeronautics and Space Administration — The AES Core Flight Software (CFS) project purpose is to analyze applicability, and evolve and extend the reusability of the CFS system originally developed by...

  20. Hospital Registration Process Reengineering Using Simulation Method

    Directory of Open Access Journals (Sweden)

    Qiang Su

    2010-01-01

    With increasing competition, many healthcare organizations have undergone tremendous reform in the last decade, aiming to increase efficiency, decrease waste, and reshape the way that care is delivered. This study focuses on improving the operational efficiency of a hospital's registration process. The factors related to operational efficiency, including the service process, queue strategy, and queue parameters, were explored systematically and illustrated with a case study. Guided by the principles of business process reengineering (BPR), a simulation approach was employed for process redesign and performance optimization. As a result, the queue strategy was changed from multiple queues and multiple servers to a single queue and multiple servers with a prepare queue. Furthermore, through a series of simulation experiments, the length of the prepare queue and the corresponding registration process efficiency were quantitatively evaluated and optimized.
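
    The benefit of the single-queue redesign can be reproduced with a minimal discrete-event sketch comparing the two queue strategies for the same number of registration windows; the arrival and service rates here are invented for illustration and are not the study's parameters:

        import random

        def simulate(shared, n_servers=3, n_customers=50000,
                     lam=2.0, mu=0.8, seed=7):
            """Mean wait at registration with n_servers windows.
            shared=True: one common FCFS line feeding all windows.
            shared=False: each arrival joins a randomly chosen window."""
            rng = random.Random(seed)
            t = 0.0
            free_at = [0.0] * n_servers   # when each window next frees up
            total_wait = 0.0
            for _ in range(n_customers):
                t += rng.expovariate(lam)       # Poisson arrivals
                service = rng.expovariate(mu)   # exponential service times
                j = (free_at.index(min(free_at)) if shared
                     else rng.randrange(n_servers))
                start = max(t, free_at[j])
                total_wait += start - t
                free_at[j] = start + service
            return total_wait / n_customers

        print(f"separate queues: mean wait {simulate(shared=False):.2f}")
        print(f"single queue:    mean wait {simulate(shared=True):.2f}")

    With the same total load, the shared line yields a markedly lower mean wait because no window sits idle while another has customers waiting, which is the effect the paper's simulation experiments quantify.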

  1. Software Development and Test Methodology for a Distributed Ground System

    Science.gov (United States)

    Ritter, George; Guillebeau, Pat; McNair, Ann R. (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes in an effort to minimize unnecessary overhead while maximizing process benefits. The software processes that have evolved still emphasize requirements capture, software configuration management, design documentation, and making sure the products that have been developed are accountable to initial requirements. This paper will give an overview of how the software processes have evolved, highlighting the positives as well as the negatives. In addition, we will mention the COTS tools that have been integrated into the processes and how these COTS tools have provided value to the project.

  2. High-tech organizations: What can they tell us about reengineering (grow and reproduce, or die)

    Energy Technology Data Exchange (ETDEWEB)

    Norton, F.J.

    1996-06-10

    Change is the norm of the 1990s, and it will continue to be a major factor in running a company or organization as the coming decades unfold. The former cycle of change followed by stability is gone; change as a continuous reality is the new cycle. The necessity to be customer-driven implies a fundamental transformation of the way organizations and their managers choose to do business. Much has been learned about the way people interact with information systems/engineering information (IS/EI) technologies. The cultures of the Department of Energy's (DOE) National Laboratories are built on a research and development (R&D) mentality that greatly increases the difficulty of building an effective IS/EI cross-functional group for various organizations. Classical planning approaches ignore cultural and organizational factors; these factors, however, are crucial in devising meaningful and relevant plans. Also, as more and more organizations strive to become competitive, the philosophy and concepts of total quality management (TQM) are receiving increased attention. This paper: discusses the possibility of applying manufacturing reengineering techniques to other industries to help them overcome the risk of failure; provides a comprehensive look at the changes that have occurred in the business environment since the advent of reengineering; discusses why reengineering is so important and how people and executives of organizations can play even more pivotal roles as long-term strategists in their organizations; introduces the concept of the core mission to planning; and provides a business process redesign approach that takes into consideration the interaction of humans and technology.

  3. Enterprise Information Systems as a Service: Re-engineering Enterprise Software as Product-Service System

    NARCIS (Netherlands)

    Wortmann, Johan; Don, H.; Hasselman, J.; J., Wilbrink; Frick, Jan; Laugen, Bjørge Timenes

    2012-01-01

    This paper draws an analogy between developments in enterprise software and in capital goods manufacturing industry. Many branches of manufacturing industry, especially automotive industry, have grown in maturity by moving from craftsmanship to mass production. These industries subsequently move

  4. Software evolution with XVCL

    DEFF Research Database (Denmark)

    Zhang, Weishan; Jarzabek, Stan; Zhang, Hongyu

    2004-01-01

    This chapter introduces software evolution with XVCL (XML-based Variant Configuration Language), an XML-based metaprogramming technique. As software evolves, a large number of variants may arise, especially when the evolution spans multiple platforms, as shown in our case study. Handling variants and tracing the impact of variants across the development lifecycle is a challenge. This chapter shows how we can maintain different versions of software in a reuse-based way.
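
    The general idea of XML-based variant configuration can be pictured with the sketch below. This is not actual XVCL syntax: the frame format and element names are invented analogues, showing only how a shared 'frame' with variation points is instantiated into per-platform variants:

        import xml.etree.ElementTree as ET

        # An XVCL-style frame with invented element names: shared code plus
        # variation points that each platform fills in differently.
        FRAME = """
        <frame name="logger">
          <code>def open_log(path):</code>
          <vp name="open-call"/>
          <code>    return handle</code>
        </frame>
        """

        VARIANTS = {
            "win32": '    handle = win_open(path, "a")',   # hypothetical API
            "posix": '    handle = open(path, "a")',
        }

        def instantiate(frame_xml, platform):
            """Splice the platform-specific body into each variation point,
            yielding one concrete source variant."""
            lines = []
            for node in ET.fromstring(frame_xml):
                if node.tag == "code":
                    lines.append(node.text)
                elif node.tag == "vp":
                    lines.append(VARIANTS[platform])
            return "\n".join(lines)

        print(instantiate(FRAME, "posix"))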

  5. Applying Business Process Re-Engineering to Public Sector as A New Public Management Strategy

    Directory of Open Access Journals (Sweden)

    Ropinder Oberoi

    2013-08-01

    The introduction of Business Process Reengineering (BPR) to the public sector follows the much broader trend of New Public Management. BPR in the public sector mostly means amalgamation of business processes, computerization of various activities and removal of unnecessary ones. BPR combines a radical, premeditated scheme of business process reengineering with the more gradual technique of continuous process improvement, supported by adequate information technology (IT) and e-business infrastructure strategies. Public organizations have specific and exclusive features that differentiate them from private sector organizations. Based on the literature review and an examination of study findings, it is argued that a public sector organization can employ BPR to improve its processes and overall organizational performance if it (1) has accrued a collection of BPR-relevant resources and capabilities; (2) has embarked on BPR with adequate depth and breadth; (3) is developing a post-BPR complementary set of skills, systems and technologies, which are essential to further develop the organizational impact of the BPR; and (4) has successfully mitigated the effects of BPR implementation problems. In addition to its effect on administration and service delivery processes through reduction of the processing time, work steps and cost of government processes, BPR also contributes to enhancing citizen/customer and employee satisfaction and increasing organizational transparency and responsiveness, which have also become essential objectives of New Public Management. Therefore, public sector BPR is emerging as indispensable to the performance of organizations in the developing economy. The essential questions addressed in this paper are: What are the scenario and impending problems of reengineering applications in the public sector? Can it be functional for the public sector in attending to frequent problems blockading bureaucracies of developed and…

  6. Re-engineering closing watersheds: The negotiated expansion of a dam-based irrigation system in Bolivia

    NARCIS (Netherlands)

    Rocha Lopez, R.F.; Vincent, L.F.; Rap, E.R.

    2015-01-01

    The expansion of the Totora Khocha dam-based irrigation system in the Pucara watershed is a case of planned re-engineering of a closing watershed. This article shows how, when irrigation systems expand in space and across boundaries to capture new water, they also involve new claims by existing and

  7. A Component Based Approach to Scientific Workflow Management

    CERN Document Server

    Le Goff, Jean-Marie; Baker, Nigel; Brooks, Peter; McClatchey, Richard

    2001-01-01

    CRISTAL is a distributed scientific workflow system used in the manufacturing and production phases of HEP experiment construction at CERN. The CRISTAL project has studied the use of a description driven approach, using meta- modelling techniques, to manage the evolving needs of a large physics community. Interest from such diverse communities as bio-informatics and manufacturing has motivated the CRISTAL team to re-engineer the system to customize functionality according to end user requirements but maximize software reuse in the process. The next generation CRISTAL vision is to build a generic component architecture from which a complete software product line can be generated according to the particular needs of the target enterprise. This paper discusses the issues of adopting a component product line based approach and our experiences of software reuse.

  9. A Practical Software Architecture for Virtual Universities

    Science.gov (United States)

    Xiang, Peifeng; Shi, Yuanchun; Qin, Weijun

    2006-01-01

    This article introduces a practical software architecture called CUBES, which focuses on system integration and evolvement for online virtual universities. The key of CUBES is a supporting platform that helps to integrate and evolve heterogeneous educational applications developed by different organizations. Both standardized educational…

  10. Reengineering Real-Time Software Systems

    Science.gov (United States)

    1993-09-09

    Advisor: Yutaka Kanayama. Approved for public release; distribution is unlimited.

  11. Software for medical image based phantom modelling

    International Nuclear Information System (INIS)

    Possani, R.G.; Massicano, F.; Coelho, T.S.; Yoriyaz, H.

    2011-01-01

    The latest treatment planning systems depend strongly on CT images, so the tendency is for dosimetry procedures in nuclear medicine therapy to also be based on images, such as magnetic resonance imaging (MRI) or computed tomography (CT), to extract anatomical and histological information, as well as on functional imaging or activity maps such as PET or SPECT. This information, combined with radiation transport simulation software, is used to estimate the internal dose in patients undergoing treatment in nuclear medicine. This work aims to re-engineer the SCMS software, an interface between the Monte Carlo code MCNP and the medical images that carry information about the patient under treatment. In other words, the necessary information contained in the images is interpreted and presented in a specific format to the MCNP Monte Carlo code to perform the simulation of radiation transport. The user therefore does not need to understand the complex process of preparing MCNP input data, as the SCMS is responsible for automatically constructing the anatomical data for the patient, as well as the radioactive source data. The SCMS was originally developed in Fortran-77. In this work it was rewritten in an object-oriented language (Java). New features and data options have also been incorporated into the software. The new software thus has a number of improvements, such as an intuitive GUI and a menu for selecting the energy spectrum corresponding to a specific radioisotope stored in an XML data bank. The new version also supports new materials, and the user can specify an image region of interest for the calculation of absorbed dose. (author)
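
    A sketch of the kind of XML spectrum data bank and MCNP source cards involved; the XML schema and element names are invented for illustration (the record does not publish the actual SCMS format), while the SI/SP card shape follows standard MCNP discrete-source syntax:

        import xml.etree.ElementTree as ET

        # Invented schema for a radioisotope spectrum data bank; the real
        # SCMS XML format may differ.
        BANK = """
        <isotopes>
          <isotope name="I-131">
            <line energy_MeV="0.364" intensity="0.815"/>
            <line energy_MeV="0.637" intensity="0.072"/>
          </isotope>
        </isotopes>
        """

        def load_spectrum(xml_text, name):
            """Return (energy, intensity) pairs for one isotope."""
            root = ET.fromstring(xml_text)
            iso = root.find(f"isotope[@name='{name}']")
            return [(float(l.get("energy_MeV")), float(l.get("intensity")))
                    for l in iso.findall("line")]

        def mcnp_source_cards(lines):
            """Emit a schematic MCNP discrete-energy source distribution
            (SI/SP cards); formatting is illustrative only."""
            energies = " ".join(f"{e:.4f}" for e, _ in lines)
            probs = " ".join(f"{p:.4f}" for _, p in lines)
            return f"SI1 L {energies}\nSP1 {probs}"

        print(mcnp_source_cards(load_spectrum(BANK, "I-131")))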

  12. Reengineering the laboratory: strategic process and systems innovation to improve performance. Recreating our role on the health-care team.

    Science.gov (United States)

    Johnson, E

    1995-01-01

    The author describes reengineering efforts in the laboratory of a 550-bed hospital. Key benefits include reduced costs, improved turnaround time, and redirection of staff into new roles in information management and outreach.

  13. Graph Based Verification of Software Evolution Requirements

    NARCIS (Netherlands)

    Ciraci, S.

    2009-01-01

    Due to market demands and changes in the environment, software systems have to evolve. However, the size and complexity of the current software systems make it time consuming to incorporate changes. During our collaboration with the industry, we observed that the developers spend much time on the

  14. Evaluation Policy Alternatives for the Reengineering of the Department of Defense Personal Property Shipment and Storage Program - A Stakeholder Approach

    National Research Council Canada - National Science Library

    Lepson, Michael

    1999-01-01

    ...) to evaluate the personal property pilot programs as part of Management Reform Memorandum # 6. This thesis evaluates the policy alternatives for reengineering the DOD personal property program using a stakeholder approach...

  15. Process Reengineering of Cold Chain Logistics of Agricultural Products Based on Low-carbon Economy

    OpenAIRE

    Guo, Hong-xia; Shao, Ming

    2012-01-01

    Through the process analysis of cold chain logistics of agricultural products, we find that cold chain logistics of agricultural products contradict the development model of low-carbon economy to some extent. We apply the development idea of low-carbon economy, introduce the third-party logistics companies, establish distribution center of cold chain logistics of agricultural products, and strengthen information sharing, to reengineer the process of cold chain logistics of agricultural produc...

  16. Scope and prospects of re-engineering and retrofitting wind farms in India

    International Nuclear Information System (INIS)

    Rajsekhar, B.; Van Hulle, F.J.L.

    2001-09-01

    The paper starts with a brief analysis of the characteristics of the Indian wind energy programme, enumerating the developments that have taken place so far. In view of the large scope for renewable-energy-based power generation, and in order to boost the present upswing in wind farm development, the authors investigate the possibilities that lie in re-engineering existing wind farms. Existing wind farm entrepreneurs are showing interest in improving the performance of their wind farms. New initiatives are suggested addressing the technical and commercial concerns of both the state-run utility (the principal customer of wind-generated electricity) and the wind farm entrepreneur, to spur development of economically competitive wind-power plants. In addition, inferences are drawn from a recently conducted detailed case study at a 5-year-old large wind farm in the Muppandal area. The study involved detailed WAsP-based analysis drawing on remote land use and land cover details, interfaced with GIS. In addition, detailed site investigations were conducted to assess the health of the machines and the adequacy of the power evacuation facility, together with an analysis of machine down times. The paper highlights the benefits that can be expected from such undertakings for several parties, both in India and in the EU. The paper finally outlines the possible business opportunities and economic benefits that exist for retrofitting and re-engineering in the country, which has over 700 individually designed wind farms. 2 refs

  17. Agent-based re-engineering of ErbB signaling: a modeling pipeline for integrative systems biology.

    Science.gov (United States)

    Das, Arya A; Ajayakumar Darsana, T; Jacob, Elizabeth

    2017-03-01

    Experiments in systems biology are generally supported by a computational model which quantitatively estimates the parameters of the system by finding the best fit to the experiment. Mathematical models have proved to be successful in reverse engineering the system. The data generated are interpreted to understand the dynamics of the underlying phenomena. The question we have sought to answer is: is it possible to use an agent-based approach to re-engineer a biological process, making use of the available knowledge from experimental and modelling efforts? Can the bottom-up approach benefit from the top-down exercise so as to create an integrated modelling formalism for systems biology? We propose a modelling pipeline that learns from the data given by reverse engineering and uses it for re-engineering the system, to carry out in-silico experiments. A mathematical model that quantitatively predicts co-expression of EGFR-HER2 receptors in activation and trafficking has been taken for this study. The pipeline architecture takes cues from the population model, which gives the rates of biochemical reactions, to formulate knowledge-based rules for the particle model. Agent-based simulations using these rules support the existing facts on EGFR-HER2 dynamics. We conclude that re-engineering models built using the results of reverse engineering opens up the possibility of harnessing the power pack of data which now lies scattered in the literature. Virtual experiments could then become more realistic when empowered with the findings of empirical cell biology and modelling studies. Implemented on the Agent Modelling Framework developed in-house; C++ code templates are available in the Supplementary material. Contact: liz.csir@gmail.com. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
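
    The pipeline's central move, turning population-level rate constants from the reverse-engineered model into per-agent stochastic rules, can be sketched as follows; the reaction, rate constant, and time step are placeholders, not the paper's ErbB parameter values:

        import math, random

        def binding_probability(k_on, partners, volume, dt):
            """Convert a population-level association rate constant into a
            per-agent probability of binding within one time step dt:
            p = 1 - exp(-k_on * [partner] * dt)."""
            concentration = partners / volume
            return 1.0 - math.exp(-k_on * concentration * dt)

        def step(receptors, ligands, volume, k_on, dt, rng):
            """One agent-based update: each free receptor tests a rule
            derived from the reverse-engineered rate constant."""
            p = binding_probability(k_on, ligands, volume, dt)
            bound = sum(1 for _ in range(receptors) if rng.random() < p)
            return receptors - bound, ligands - bound, bound

        rng = random.Random(0)
        receptors, ligands = 1000, 5000
        for t in range(5):   # a few illustrative steps
            receptors, ligands, bound = step(receptors, ligands, volume=1e4,
                                             k_on=0.02, dt=1.0, rng=rng)
            print(f"t={t + 1}: newly bound {bound}, free receptors {receptors}")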

  18. Evolvable Smartphone-Based Platforms for Point-of-Care In-Vitro Diagnostics Applications

    Science.gov (United States)

    Patou, François; AlZahra’a Alatraktchi, Fatima; Kjægaard, Claus; Dimaki, Maria; Madsen, Jan; Svendsen, Winnie E.

    2016-01-01

    The association of smart mobile devices and lab-on-chip technologies offers unprecedented opportunities for the emergence of direct-to-consumer in vitro medical diagnostics applications. Despite their clear transformative potential, obstacles remain to the large-scale disruption and long-lasting success of these systems in the consumer market. For instance, the increasing level of complexity of instrumented lab-on-chip devices, coupled to the sporadic nature of point-of-care testing, threatens the viability of a business model mainly relying on disposable/consumable lab-on-chips. We argued recently that system evolvability, defined as the design characteristic that facilitates more manageable transitions between system generations via the modification of an inherited design, can help remedy these limitations. In this paper, we discuss how platform-based design can constitute a formal entry point to the design and implementation of evolvable smart device/lab-on-chip systems. We present both a hardware/software design framework and the implementation details of a platform prototype enabling at this stage the interfacing of several lab-on-chip variants relying on current- or impedance-based biosensors. Our findings suggest that several change-enabling mechanisms implemented in the higher abstraction software layers of the system can promote evolvability, together with the design of change-absorbing hardware/software interfaces. Our platform architecture is based on a mobile software application programming interface coupled to a modular hardware accessory. It allows the specification of lab-on-chip operation and post-analytic functions at the mobile software layer. We demonstrate its potential by operating a simple lab-on-chip to carry out the detection of dopamine using various electroanalytical methods. PMID:27598208
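
    The platform-based separation the authors describe can be sketched in a few lines: a stable, change-absorbing interface between the mobile software layer and interchangeable lab-on-chip variants. The sketch below is ours, not the paper's API; the class and method names are hypothetical and the measurements are stubs:

        from abc import ABC, abstractmethod

        class Biosensor(ABC):
            """Change-absorbing interface: new lab-on-chip variants
            implement it without touching the application layer."""
            @abstractmethod
            def measure(self) -> float: ...

        class AmperometricChip(Biosensor):   # current-based variant
            def measure(self) -> float:
                return 1.2e-6                # stub: current in amperes

        class ImpedanceChip(Biosensor):      # impedance-based variant
            def measure(self) -> float:
                return 950.0                 # stub: impedance in ohms

        def run_assay(chip: Biosensor, samples: int = 3) -> list:
            # Assay logic and post-analytic functions live at the
            # (mobile) software layer, independent of the chip variant.
            return [chip.measure() for _ in range(samples)]

        print(run_assay(AmperometricChip()))
        print(run_assay(ImpedanceChip()))

    Evolvability here comes from the fact that a new sensor generation only has to implement the interface; the higher software layers absorb the change.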

  19. Evolvable Smartphone-Based Platforms for Point-of-Care In-Vitro Diagnostics Applications.

    Science.gov (United States)

    Patou, François; AlZahra'a Alatraktchi, Fatima; Kjægaard, Claus; Dimaki, Maria; Madsen, Jan; Svendsen, Winnie E

    2016-09-03

    The association of smart mobile devices and lab-on-chip technologies offers unprecedented opportunities for the emergence of direct-to-consumer in vitro medical diagnostics applications. Despite their clear transformative potential, obstacles remain to the large-scale disruption and long-lasting success of these systems in the consumer market. For instance, the increasing level of complexity of instrumented lab-on-chip devices, coupled to the sporadic nature of point-of-care testing, threatens the viability of a business model mainly relying on disposable/consumable lab-on-chips. We argued recently that system evolvability, defined as the design characteristic that facilitates more manageable transitions between system generations via the modification of an inherited design, can help remedy these limitations. In this paper, we discuss how platform-based design can constitute a formal entry point to the design and implementation of evolvable smart device/lab-on-chip systems. We present both a hardware/software design framework and the implementation details of a platform prototype enabling at this stage the interfacing of several lab-on-chip variants relying on current- or impedance-based biosensors. Our findings suggest that several change-enabling mechanisms implemented in the higher abstraction software layers of the system can promote evolvability, together with the design of change-absorbing hardware/software interfaces. Our platform architecture is based on a mobile software application programming interface coupled to a modular hardware accessory. It allows the specification of lab-on-chip operation and post-analytic functions at the mobile software layer. We demonstrate its potential by operating a simple lab-on-chip to carry out the detection of dopamine using various electroanalytical methods.

  20. Evolvable Smartphone-Based Platforms for Point-of-Care In-Vitro Diagnostics Applications

    Directory of Open Access Journals (Sweden)

    François Patou

    2016-09-01

    The association of smart mobile devices and lab-on-chip technologies offers unprecedented opportunities for the emergence of direct-to-consumer in vitro medical diagnostics applications. Despite their clear transformative potential, obstacles remain to the large-scale disruption and long-lasting success of these systems in the consumer market. For instance, the increasing level of complexity of instrumented lab-on-chip devices, coupled to the sporadic nature of point-of-care testing, threatens the viability of a business model mainly relying on disposable/consumable lab-on-chips. We argued recently that system evolvability, defined as the design characteristic that facilitates more manageable transitions between system generations via the modification of an inherited design, can help remedy these limitations. In this paper, we discuss how platform-based design can constitute a formal entry point to the design and implementation of evolvable smart device/lab-on-chip systems. We present both a hardware/software design framework and the implementation details of a platform prototype enabling at this stage the interfacing of several lab-on-chip variants relying on current- or impedance-based biosensors. Our findings suggest that several change-enabling mechanisms implemented in the higher abstraction software layers of the system can promote evolvability, together with the design of change-absorbing hardware/software interfaces. Our platform architecture is based on a mobile software application programming interface coupled to a modular hardware accessory. It allows the specification of lab-on-chip operation and post-analytic functions at the mobile software layer. We demonstrate its potential by operating a simple lab-on-chip to carry out the detection of dopamine using various electroanalytical methods.

  1. Re-engineering the mission life cycle with ABC and IDEF

    Science.gov (United States)

    Mandl, Daniel; Rackley, Michael; Karlin, Jay

    1994-01-01

    The theory behind re-engineering a business process is to remove the non-value-added activities, thereby lowering the process cost. To achieve this, one must be able to identify where the non-value-added elements are located, which is not a trivial task, because they are often hidden in the form of overhead and/or pooled resources. In order to isolate these non-value-added processes from the other processes, one must first decompose the overall top-level process into lower layers of sub-processes. In addition, costing data must be assigned to each sub-process, along with the value the sub-process adds towards the final product. IDEF0 is a Federal Information Processing Standard (FIPS) process-modeling tool that allows for this functional decomposition through structured analysis; it also illustrates the relationship of the process and the value added to the product or service. The value-added portion is further defined in IDEF1X, an entity-relationship diagramming tool. The entity-relationship model is the blueprint of the product as it moves along the 'assembly line': it relates all of the parts to each other and to the final product, and it relates the parts to the tools that produce the product and to all of the paperwork used in their acquisition. The use of IDEF therefore facilitates the use of Activity Based Costing (ABC). ABC is an essential method in a high-variety, product-customizing environment for facilitating rapid response to externally caused change. This paper describes the work being done in the Mission Operations Division to re-engineer the development and operation life cycle of Mission Operations Centers using these tools.

  2. Virtual Immunology: Software for Teaching Basic Immunology

    Science.gov (United States)

    Berçot, Filipe Faria; Fidalgo-Neto, Antônio Augusto; Lopes, Renato Matos; Faggioni, Thais; Alves, Luiz Anastácio

    2013-01-01

    As immunology continues to evolve, many educational methods have found difficulty in conveying the degree of complexity inherent in its basic principles. Today, the teaching-learning process in such areas has been improved with tools such as educational software. This article introduces "Virtual Immunology," a software program available…

  3. Chemical Reactive Anchoring Lipids with Different Performance for Cell Surface Re-engineering Application.

    Science.gov (United States)

    Vabbilisetty, Pratima; Boron, Mallorie; Nie, Huan; Ozhegov, Evgeny; Sun, Xue-Long

    2018-02-28

    Introduction of selectively reactive chemical groups at the cell surface enables site-specific cell-surface labeling and modification, facilitating the study of cell-surface molecular structure and function and the molecular mechanisms underlying them. Further, it offers the opportunity to change or improve a cell's functionality for an application of choice. In this study, two chemically reactive anchor lipids, phosphatidylethanolamine-poly(ethylene glycol)-dibenzocyclooctyne (DSPE-PEG2000-DBCO) and cholesterol-PEG-dibenzocyclooctyne (CHOL-PEG2000-DBCO), were synthesized, and their potential application for cell surface re-engineering via lipid fusion was assessed with RAW 264.7 cells as a model cell line. Briefly, RAW 264.7 cells were incubated with the anchor lipids at various concentrations and for different incubation times. The successful incorporation of the chemically reactive anchor lipids was confirmed by biotinylation via copper-free click chemistry, followed by streptavidin-fluorescein isothiocyanate binding. In comparison, the cholesterol-based anchor lipid afforded higher cell membrane incorporation efficiency with less internalization than the phospholipid-based anchor lipid. Low cytotoxicity of both anchor lipids upon incorporation into the RAW 264.7 cells was observed. Further, the cell membrane residence time of the cholesterol-based anchor lipid was evaluated with confocal microscopy. This study suggests the potential cell surface re-engineering applications of the chemically reactive anchor lipids.

  4. Chemical Reactive Anchoring Lipids with Different Performance for Cell Surface Re-engineering Application

    Science.gov (United States)

    2018-01-01

    Introduction of selectively reactive chemical groups at the cell surface enables site-specific cell-surface labeling and modification, facilitating the study of cell-surface molecular structure and function and the molecular mechanisms underlying them. Further, it offers the opportunity to change or improve a cell's functionality for an application of choice. In this study, two chemically reactive anchor lipids, phosphatidylethanolamine–poly(ethylene glycol)–dibenzocyclooctyne (DSPE–PEG2000–DBCO) and cholesterol–PEG–dibenzocyclooctyne (CHOL–PEG2000–DBCO), were synthesized, and their potential application for cell surface re-engineering via lipid fusion was assessed with RAW 264.7 cells as a model cell line. Briefly, RAW 264.7 cells were incubated with the anchor lipids at various concentrations and for different incubation times. The successful incorporation of the chemically reactive anchor lipids was confirmed by biotinylation via copper-free click chemistry, followed by streptavidin-fluorescein isothiocyanate binding. In comparison, the cholesterol-based anchor lipid afforded higher cell membrane incorporation efficiency with less internalization than the phospholipid-based anchor lipid. Low cytotoxicity of both anchor lipids upon incorporation into the RAW 264.7 cells was observed. Further, the cell membrane residence time of the cholesterol-based anchor lipid was evaluated with confocal microscopy. This study suggests the potential cell surface re-engineering applications of the chemically reactive anchor lipids. PMID:29503972

  5. BUSINESS BUZZWORDS: RIGHTSIZING, DOWNSIZING, RE-ENGINEERING, DE-LAYERING

    Directory of Open Access Journals (Sweden)

    Pop Anamaria Mirabela

    2011-07-01

    The paper attempts to analyse the rise and use of a new vocabulary (economic buzzwords) related to staff dismissal in the new economy of the world. In this new economy, the organizational boundaries between states and firms become unclear, and a new vocabulary has been conceived to express the changes the firms are undergoing. The new rhetoric includes buzzwords like privatization, de-regulation, re-engineering, rightsizing, downsizing, de-layering, quality service and global sourcing. The research is based on the conclusions of bibliographical and direct study of the relevant literature in the field, and it tries to emphasise the importance of strategic language in human resources management. Concepts like freedom of speech, politically correct language and non-discriminatory language are brought to attention and analysed, focusing on their importance during the periods of change and uncertainty characterising the economic environment nowadays. Two trends are depicted in the paper: the first is that of the supporters of political correctness, who attempt to homogenize language and thought to enhance the self-esteem of minorities. One approach to reaching this goal is to eliminate discriminatory or offensive words and phrases and substitute harmless vocabulary at the expense of economy, clarity, and logic. Another approach is to deconstruct a word or phrase into its component parts, treat the component parts as wholes, and focus on secondary meanings of the component parts. On the other hand, reflecting upon the nature of large-scale organizational restructuring, there are the critics arguing that this type of language is a euphemistic form of phraseology. The analysis starts with the assumption that the economic lexis is not a rigid system of terms. Morphologically, there is a high degree of variety in productive types of compounding, which exceeds the possibilities that exist in the common English vocabulary. In this

  6. Downsizing, reengineering, and restructuring: long-term implications for healthcare organizations.

    Science.gov (United States)

    Leatt, P; Baker, G R; Halverson, P K; Aird, C

    1997-01-01

    This article provides a framework for analyzing how downsizing and reengineering have affected healthcare organizations. These approaches are reviewed, and key tools that have been used, such as across-the-board cuts, reorganizing, and redesigning, are described. Examples are drawn from healthcare as well as other business sectors. The consequences of cost reduction strategies for an organization's performance in terms of costs, quality of services, and satisfaction of consumers and employees are explored. The case is made that an organization's context--that is, its culture, level of trust, and leadership--is an important factor that influences the effect of cost-cutting strategies. Characteristics of organizations where downsizing has a better chance of succeeding also are described.

  7. Finding Security Patterns to Countermeasure Software Vulnerabilities

    OpenAIRE

    Borstad, Ole Gunnar

    2008-01-01

    Software security is an increasingly important part of software development as the risk from attackers is constantly evolving through increased exposure, threats and economic impact of security breaches. Emerging security literature describes expert knowledge such as secure development best practices. This knowledge is often not applied by software developers because they lack security awareness, security training and secure development methods and tools. Existing methods and tools require to...

  8. Fifteen years of Superfund at South Valley: Reengineering required

    International Nuclear Information System (INIS)

    Cormier, J.; Horak, F.

    1995-01-01

    It is no surprise to many of Superfund's practitioners that the law and its application are flawed. The South Valley Superfund Site in Albuquerque, New Mexico has not escaped Superfund's problems. The problems and issues arising out of the South Valley Superfund site have spurred the desire to seek a better way to administer and manage cleanup. This new method applies organizational and role changes that bring Superfund closer to an efficient business-like entity. This "Reengineered" Superfund strives for reorganization, contractor reduction, improved communication, reporting reduction, and teaming. In addition, modifications are made to the roles of regulators, potentially responsible parties (PRPs), and the public. Today the site encompasses roughly one square mile in area, includes six identified contaminant sources, and deals with solvent and petroleum by-product contamination.

  9. Investigating interoperability of the LSST data management software stack with Astropy

    Science.gov (United States)

    Jenness, Tim; Bosch, James; Owen, Russell; Parejko, John; Sick, Jonathan; Swinbank, John; de Val-Borro, Miguel; Dubois-Felsmann, Gregory; Lim, K.-T.; Lupton, Robert H.; Schellart, Pim; Krughoff, K. S.; Tollerud, Erik J.

    2016-07-01

    The Large Synoptic Survey Telescope (LSST) will be an 8.4m optical survey telescope sited in Chile and capable of imaging the entire sky twice a week. The data rate of approximately 15TB per night and the requirements to both issue alerts on transient sources within 60 seconds of observing and create annual data releases mean that automated data management systems and data processing pipelines are a key deliverable of the LSST construction project. The LSST data management software has been in development since 2004 and is based on a C++ core with a Python control layer. The software consists of nearly a quarter of a million lines of code covering the system from fundamental WCS and table libraries to pipeline environments and distributed process execution. The Astropy project began in 2011 as an attempt to bring together disparate open source Python projects and build a core standard infrastructure that can be used and built upon by the astronomy community. This project has been phenomenally successful in the years since it began and has grown to be the de facto standard for Python software in astronomy. Astropy brings with it considerable expectations from the community on how astronomy Python software should be developed, and it is clear that by the time LSST is fully operational in the 2020s many of the prospective users of the LSST software stack will expect it to be fully interoperable with Astropy. In this paper we describe the overlap between the LSST science pipeline software and Astropy software and investigate areas where the LSST software provides new functionality. We also discuss the possibilities of re-engineering the LSST science pipeline software to build upon Astropy, including the option of contributing affiliated packages.
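
    The kind of interoperability discussed can be illustrated with a short example using the public Astropy API (astropy.wcs and astropy.table); the WCS parameters and catalogue values below are invented, and this is a sketch of the general idea, not LSST pipeline code:

        from astropy import units as u
        from astropy.table import Table
        from astropy.wcs import WCS

        # A minimal gnomonic (TAN) sky projection, standing in for an
        # LSST-style pixel-to-sky mapping.
        w = WCS(naxis=2)
        w.wcs.crpix = [2048.0, 2048.0]        # reference pixel
        w.wcs.cdelt = [-5.5e-5, 5.5e-5]       # ~0.2 arcsec/pixel, in degrees
        w.wcs.crval = [53.1, -27.8]           # reference sky position (deg)
        w.wcs.ctype = ["RA---TAN", "DEC--TAN"]

        # A source catalogue as an Astropy Table, with units attached.
        cat = Table({"x": [1024.5, 3000.2], "y": [512.0, 2750.9]})
        ra, dec = w.wcs_pix2world(cat["x"], cat["y"], 0)
        cat["ra"] = ra * u.deg
        cat["dec"] = dec * u.deg
        print(cat)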

  10. Interactive scalable condensation of reverse engineered UML class diagrams for software comprehension

    NARCIS (Netherlands)

    Osman, Mohd Hafeez Bin

    2015-01-01

    Software design documentation is a valuable aid in software comprehension. However, keeping the software design up-to-date with evolving source code is challenging and time-consuming. Reverse engineering is one of the options for recovering software architecture from the implementation code.

  11. IDC Re-Engineering Phase 2 System Specification Document Version 1.5

    Energy Technology Data Exchange (ETDEWEB)

    Satpathi, Meara Allena [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Burns, John F. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Harris, James M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-01-01

    This document contains the system specifications derived to satisfy the system requirements found in the IDC System Requirements Document for the IDC Re-Engineering Phase 2 project. This System Specification Document (SSD) defines waveform data processing requirements for the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). The routine processing includes characterization of events with the objective of screening out events considered to be consistent with natural phenomena or non-nuclear, man-made phenomena. This document does not address requirements concerning the acquisition, processing and analysis of radionuclide data, but it does include requirements for the dissemination of radionuclide data and products.

  12. The Evolving Role of Open Source Software in Medicine and Health Services

    Directory of Open Access Journals (Sweden)

    Sevket Seref Arikan

    2013-01-01

    The past five decades have witnessed immense coevolution of methods and tools of information technology, and their practical and experimental application within the medical and healthcare domain. Healthcare itself continues to evolve in response to change in healthcare needs, progress in the scientific foundations of treatments, and in professional and managerial organization of affordable and effective services, in which patients and their families and carers increasingly participate. Taken together, these trends impose highly complex underlying challenges for the design, development, and sustainability of the quality of supporting information services and software infrastructure that are needed. The challenges are multidisciplinary and multiprofessional in scope, and they require deeper study and learning to inform policy and promote public awareness of the problems health services have faced in this area for many years. The repeating pattern of failure to live up to expectations of policy-driven national health IT initiatives has proved very costly and remains frustrating and unproductive for all involved. In this article, we highlight the barriers to progress and discuss the dangers of pursuing a standardization framework devoid of empirical testing and iterative development. We give the example of the openEHR Foundation, which was established at University College London (UCL) in London, England, with members in 80 countries. The Foundation is a not-for-profit company providing open specifications and working for generic standards for electronic records, informed directly by a wide range of implementation experience. We also introduce the Opereffa open source framework, which was developed at UCL based on these specifications and which has been downloaded in some 70 countries. We argue that such an approach is now essential to support good discipline, innovation, and governance at the heart of medicine and health services, in line with the

  13. Agile distributed software development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Aaen, Ivan

    2012-01-01

    While face-to-face interaction is fundamental in agile software development, distributed environments must rely extensively on mediated interactions. Practicing agile principles in distributed environments therefore poses particular control challenges related to balancing fixed vs. evolving quality requirements and people- vs. process-based collaboration. To investigate these challenges, we conducted an in-depth case study of a successful agile distributed software project with participants from a Russian firm and a Danish firm. Applying Kirsch's elements of control framework, we offer an analysis of how...

  14. Systematic profiling to monitor and specify the software refactoring process of the LHCb experiment

    CERN Document Server

    Couturier, Ben; Lohn, Stefan B

    2014-01-01

    The LHCb upgrade program implies a significant increase in data processing that will not be matched by additional computing resources. Furthermore, new architectures such as many-core platforms currently cannot be fully exploited due to memory and I/O bandwidth limitations. A considerable refactoring effort will therefore be needed to vectorize and parallelize the LHCb software, to minimize hotspots and to reduce the impact of bottlenecks. It is crucial to guide refactoring with a profiling system that points to regions in the source code where re-engineering is possible and necessary, and that indicates which kinds of optimization could lead to success. Software optimization is a sophisticated process in which all parts (compiler, operating system, external libraries and the chosen hardware) play a role. Intended improvements can have different effects on different platforms. To obtain precise information on the general performance, to make profiles comparable and reproducible, and to verify the progress of performance in the framewo...
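
    The requirement that profiles be comparable and reproducible can be illustrated with a minimal timing harness. This is an assumption-laden sketch in Python, not the LHCb profiling system (which targets C++ and hardware-level counters); the point is only that every measurement carries the metadata needed to compare it with runs on other platforms or software versions:

        import json
        import platform
        import statistics
        import time

        def profile(fn, *args, repeats=5):
            """Time fn several times and record environment metadata, so
            profiles from different machines or versions stay comparable."""
            samples = []
            for _ in range(repeats):
                t0 = time.perf_counter()
                fn(*args)
                samples.append(time.perf_counter() - t0)
            return {
                "median_s": statistics.median(samples),
                "stdev_s": statistics.stdev(samples),
                "python": platform.python_version(),
                "machine": platform.machine(),
            }

        if __name__ == "__main__":
            report = profile(sorted, list(range(100_000))[::-1])
            print(json.dumps(report, indent=2))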

  15. Evaluating the Implementation of the Re-Engineering Systems of Primary Care Treatment in the Military (RESPECT-Mil)

    Science.gov (United States)

    Wong, Eunice C.; Jaycox, Lisa H.; Ayer, Lynsay; Batka, Caroline; Harris, Racine; Naftel, Scott; Paddock, Susan M.

    2015-01-01

    A RAND team conducted an independent implementation evaluation of the Re-Engineering Systems of Primary Care Treatment in the Military (RESPECT-Mil) Program, a system of care designed to screen, assess, and treat posttraumatic stress disorder and depression among active duty service members in the Army's primary care settings. Evaluating the Implementation of the Re-Engineering Systems of Primary Care Treatment in the Military (RESPECT-Mil) presents the results from RAND's assessment of the implementation of RESPECT-Mil in military treatment facilities and makes recommendations to improve the delivery of mental health care in these settings. Analyses were based on existing program data used to monitor fidelity to RESPECT-Mil across the Army's primary care clinics, as well as discussions with key stakeholders. During the time of the evaluation, efforts were under way to implement the Patient Centered Medical Home, and uncertainties remained about the implications for the RESPECT-Mil program. Consideration of this transition was made in designing the evaluation and applying its findings more broadly to the implementation of collaborative care within military primary care settings. PMID:28083389

  16. IAEA safeguards information system re-engineering project (IRP)

    International Nuclear Information System (INIS)

    Whitaker, G.; Becar, J.-M.; Ifyland, N.; Kirkgoeze, R.; Koevesd, G.; Szamosi, L.

    2007-01-01

    The Safeguards Information System Re-engineering Project (IRP) was initiated to assist the IAEA in addressing current and future verification and analysis activities through the establishment of a new information technology framework for strengthened and integrated safeguards. The Project provides a unique opportunity to enhance all of the information services for the Department of Safeguards and will require project management 'best practices' to balance limited funds, available resources and Departmental priorities. To achieve its goals, the Project will require the participation of all stakeholders to create a comprehensive and cohesive plan that provides both a flexible and stable foundation for addressing changing business needs. The expectation is that high-quality integrated information systems will be developed that incorporate state-of-the-art technical architectural standards, improved business processes and consistent user interfaces to store various data types in an enterprise data repository which is accessible on-line in a secure environment. (author)

  17. Change Impact Analysis of Crosscutting in Software Architectural Design

    NARCIS (Netherlands)

    van den Berg, Klaas

    2006-01-01

    Software architectures should be amenable to changes in user requirements and implementation technology. The analysis of the impact of these changes can be based on traceability of architectural design elements. Design elements have dependencies with other software artifacts but also evolve in time.

  18. Modeling Student Software Testing Processes: Attitudes, Behaviors, Interventions, and Their Effects

    Science.gov (United States)

    Buffardi, Kevin John

    2014-01-01

    Effective software testing identifies potential bugs and helps correct them, producing more reliable and maintainable software. As software development processes have evolved, incremental testing techniques have grown in popularity, particularly with introduction of test-driven development (TDD). However, many programmers struggle to adopt TDD's…

  19. Re-Engineering Vocational and Technical Education (VTE) for Sustainable Development in North Central Geo-Political Zone, Nigeria

    Science.gov (United States)

    Sofoluwe, Abayomi Olumade

    2013-01-01

    The purpose of the study is to re-engineer vocational and technical education for sustainable development in the North Central Geo-Political Zone in Nigeria. The research design adopted was an inferential survey. Stratified random sampling was used to select 36 schools out of 98 schools, while 920 students out of 3680 students were sampled. The data…

  20. Systems, methods and apparatus for developing and maintaining evolving systems with software product lines

    Science.gov (United States)

    Hinchey, Michael G. (Inventor); Rash, James L. (Inventor); Pena, Joaquin (Inventor)

    2011-01-01

    Systems, methods and apparatus are provided through which an evolutionary system is managed and viewed as a software product line. In some embodiments, the core architecture is a relatively unchanging part of the system, and each version of the system is viewed as a product from the product line. Each software product is generated from the core architecture with some agent-based additions. The result may be a multi-agent system software product line.

  1. A Change Impact Analysis to Characterize Evolving Program Behaviors

    Science.gov (United States)

    Rungta, Neha Shyam; Person, Suzette; Branchaud, Joshua

    2012-01-01

    Change impact analysis techniques estimate the potential effects of changes made to software. Directed Incremental Symbolic Execution (DiSE) is an intraprocedural technique for characterizing the impact of software changes on program behaviors. DiSE first estimates the impact of the changes on the source code using program slicing techniques, and then uses the impact sets to guide symbolic execution to generate path conditions that characterize impacted program behaviors. DiSE, however, cannot reason about the flow of impact between methods and will fail to generate path conditions for certain impacted program behaviors. In this work, we present iDiSE, an extension to DiSE that performs an interprocedural analysis. iDiSE combines static and dynamic calling context information to efficiently generate impacted program behaviors across calling contexts. Information about impacted program behaviors is useful for testing, verification, and debugging of evolving programs. We present a case study of our implementation of the iDiSE algorithm to demonstrate its efficiency at computing impacted program behaviors. Traditional notions of coverage are insufficient for characterizing the testing effort used to validate evolving program behaviors because they do not take into account the impact of changes to the code. In this work we present novel definitions of impacted coverage metrics that are useful for evaluating the testing effort required to test evolving programs. We then describe how the notions of impacted coverage can be used to configure techniques such as DiSE and iDiSE in order to support regression-testing-related tasks. We also discuss how DiSE and iDiSE can be configured for debugging, i.e., finding the root cause of errors introduced by changes made to the code. In our empirical evaluation we demonstrate that the configurations of DiSE and iDiSE can be used to support various software maintenance tasks.
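
    The interprocedural flavor of iDiSE, propagating impact across calling contexts, can be approximated by a much simpler textbook computation: reverse reachability over a call graph. The sketch below is only that approximation (DiSE/iDiSE additionally use slicing and symbolic execution), and the graph and method names are invented:

        from collections import deque

        call_graph = {                 # caller -> callees (hypothetical)
            "main": ["parse", "compute"],
            "parse": ["tokenize"],
            "compute": ["tokenize", "emit"],
            "tokenize": [],
            "emit": [],
        }

        def impacted(changed, graph):
            """Changed methods plus every transitive caller of one."""
            callers = {}
            for caller, callees in graph.items():
                for callee in callees:
                    callers.setdefault(callee, set()).add(caller)
            work, seen = deque(changed), set(changed)
            while work:
                for caller in callers.get(work.popleft(), ()):
                    if caller not in seen:
                        seen.add(caller)
                        work.append(caller)
            return seen

        print(impacted({"tokenize"}, call_graph))
        # impacted set: tokenize, parse, compute, main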

  2. Continuous software quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The software for the ATLAS experiment on the Large Hadron Collider at CERN has evolved over many years to meet the demands of Monte Carlo simulation, particle detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by an active worldwide developer community. In order to run the experiment software efficiently at hundreds of computing centres it is essential to maintain a high level of software quality standards. We describe methods proposed to improve software quality practices by incorporating checks into the new ATLAS software build infrastructure.

  3. ORNL engineering design and construction reengineering report

    Energy Technology Data Exchange (ETDEWEB)

    McNeese, L.E.

    1998-01-01

    A team composed of individuals representing research and development (R&D) divisions, infrastructure support organizations, and Department of Energy (DOE) Oak Ridge Operations was chartered to reengineer the engineering, design, and construction (ED&C) process at Oak Ridge National Laboratory (ORNL). The team recognized that ED&C needs of both R&D customers and the ORNL infrastructure program have to be met to maintain a viable and competitive national laboratory. Their goal was to identify and recommend implementable best-in-class ED&C processes that will efficiently and cost-effectively support the ORNL R&D staff by being responsive to their programmatic and infrastructure needs. The team conducted process mapping of current and potential ED&C approaches, developed idealized versions of ED&C processes, and identified potential barriers to an efficient ED&C process. Eight subteams were assigned to gather information and to evaluate the significance of potential barriers through benchmarking, surveys, interviews, and reviews of key topical areas in order to determine whether the perceived barriers were real and important and whether they resulted from laws or regulations over which ORNL has no control.

  4. Preparing for the future: a case study of role changing and reengineering. Recognize and seize the new opportunities.

    Science.gov (United States)

    Holland, C A

    1995-01-01

    Today's laboratory managers are caught in the midst of a tumultuous environment as a result of managed care, mergers and acquisitions, and downsizing. We must prepare ourselves through continuous learning, recognize the marketable value of our skills outside of the laboratory, and seize opportunities to expand into new roles. At Arkansas Children's Hospital, the Chief Executive Officer selected the Administrative Director of Laboratories to reengineer the General Pediatric Center. Our goals were to improve quality of care, efficiency, teamwork, clinic visit times, and satisfaction of patients, staff, and physicians. We developed ideal objectives from surveys, brainstorming sessions, and interviews to serve as guidelines for reengineering teams. Teams met the goals and 12 of 15 ideal objectives. Patient flow redesign resulted in different processes for different patient populations and a 35% decrease in the average clinic visit time. Patient, staff, and physician satisfaction improved, as did the clinic's financial status. The project's success confirms that our leadership and analytical skills are transferable from the laboratory to carry us to new heights in other health-care arenas.

  5. Evolving Capabilities for Virtual Globes

    Science.gov (United States)

    Glennon, A.

    2006-12-01

    Though thin-client spatial visualization software like Google Earth and NASA World Wind enjoys widespread popularity, a common criticism is its general lack of analytical functionality. This concern, however, is rapidly being addressed: standard and advanced geographic information system (GIS) capabilities are being developed for virtual globes, though not centralized into a single implementation or software package. The innovation is mostly originating from the user community. Three such capabilities relevant to the earth science, education, and emergency management communities are modeling dynamic spatial phenomena, real-time data collection and visualization, and multi-input collaborative databases. Modeling dynamic spatial phenomena has been facilitated by joining virtual globe geometry definitions, such as KML, to relational databases. Real-time data collection uses short scripts to transform user-contributed data into a format usable by virtual globe software. Similarly, collaborative data collection for virtual globes has become possible by dynamically referencing online, multi-person spreadsheets. Examples of these functions include mapping flows within a karst watershed, real-time disaster assessment and visualization, and a collaborative geyser eruption spatial decision support system. Virtual globe applications will continue to evolve, adding further analytical capabilities, better temporal data handling, and support for scales from the nano to the intergalactic. This progression opens education and research avenues in all scientific disciplines.
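
    Joining geometry definitions such as KML to external data can be sketched in a few lines; the example below is ours, with invented values, and simply emits a time-stamped KML Placemark of the kind a virtual globe can render:

        from xml.sax.saxutils import escape

        def placemark(name, lon, lat, when):
            """Render one observation (e.g. a row from a collaborative
            spreadsheet or a database) as a KML Placemark."""
            return (
                "<Placemark>"
                f"<name>{escape(name)}</name>"
                f"<TimeStamp><when>{when}</when></TimeStamp>"
                # KML coordinates are longitude,latitude,altitude.
                f"<Point><coordinates>{lon},{lat},0</coordinates></Point>"
                "</Placemark>"
            )

        print(placemark("Geyser eruption", -110.828, 44.460, "2006-10-01T12:00:00Z"))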

  6. Next-generation business intelligence software with Silverlight 3

    CERN Document Server

    Czernicki, Bart

    2010-01-01

    Business Intelligence (BI) software is the code and tools that allow you to view different components of a business using a single visual platform, making comprehending mountains of data easier. Applications that include reports, analytics, statistics, and historical and predictive modeling are all examples of BI applications. Currently, we are in the second generation of BI software, called BI 2.0. This generation is focused on writing BI software that is predictive, adaptive, simple, and interactive. As computers and software have evolved, more data can be presented to end users with increas

  7. Addressing Software Security

    Science.gov (United States)

    Bailey, Brandon

    2015-01-01

    Historically, security within organizations was thought of as an IT function (web sites/servers, email, workstation patching, etc.). The threat landscape has evolved (script kiddies, hackers, advanced persistent threats (APT), nation states, etc.), and the attack surface has expanded as networks have become interconnected. Factors in an organization's security posture include the network layer (routers, firewalls, etc.), computer network defense (IPS/IDS, sensors, continuous monitoring, etc.), industrial control systems (ICS), and software security (COTS, FOSS, custom code, etc.).

  8. Six Sigma software development

    CERN Document Server

    Tayntor, Christine B

    2002-01-01

    Since Six Sigma has had marked success in improving quality in other settings, and since the quality of software remains poor, it seems a natural evolution to apply the concepts and tools of Six Sigma to system development and the IT department. Until now, however, there were no books available that applied these concepts to the system development process. Six Sigma Software Development fills this void and illustrates how Six Sigma concepts can be applied to all aspects of the evolving system development process. It includes the traditional waterfall model and in the support of legacy systems,

  9. Evolving a Simulation Model Product Line Software Architecture from Heterogeneous Model Representations

    National Research Council Canada - National Science Library

    Greaney, Kevin

    2003-01-01

    .... Many of these large-scale, software-intensive simulation systems were autonomously developed over time, and subject to varying degrees of funding, maintenance, and life-cycle management practices...

  10. Post-Modern Software Development

    Science.gov (United States)

    Filman, Robert E.

    2005-01-01

    The history of software development includes elements of art, science, engineering, and fashion (though very little manufacturing). In all domains, old ideas give way or evolve to new ones: in the fine arts, the baroque gave way to rococo, romanticism, modernism, postmodernism, and so forth. What is the postmodern programming equivalent? That is, what comes after object orientation?

  11. Designing Process Improvement of Finished Good On Time Release and Performance Indicator Tool in Milk Industry Using Business Process Reengineering Method

    Science.gov (United States)

    Dachyar, M.; Christy, E.

    2014-04-01

    To maintain its position as a major milk producer, the Indonesian milk industry should pursue business development with the purpose of increasing customer service levels. One strategy is to create on-time release conditions for finished goods which will be distributed to customers and distributors. To achieve this condition, the management information system for finished-goods on-time release needs to be improved. The focus of this research is to conduct business process improvement using Business Process Reengineering (BPR). The key deliverable of this study is a comprehensive business strategy that addresses the root problems. To achieve the goal, evaluation, reengineering, and improvement of the ERP system are conducted. To visualize the predicted implementation, a simulation model is built with Oracle BPM. The output of this simulation showed that the proposed solution could effectively reduce the process lead time and increase the number of quality releases.

  12. Organizational learning as a test-bed for business process reengineering

    DEFF Research Database (Denmark)

    Larsen, Michael Holm; Leinsdorff, Torben

    1998-01-01

    The fact that a company's learning ability may prevent strategic drift and the fact that many companies are undertaking BPR (business process reengineering) projects lead us to ask whether all these BPR activities promote organizational learning. Within this framework, we studied the extent to which BPR promotes organizational learning by focusing on the project group and the steering committee. This paper is based partly on a theoretical study of the significant characteristics of BPR and of organizational learning and partly on a field study carried out in cooperation with the business unit of Enzyme Business, Novo Nordisk A/S. The result of the analysis is that a correlation between BPR and organizational learning has been established, i.e. the BPR elements customer focus, process orientation, high level of ambition, clean sheet principle, performance measuring, the business system diamond...

  13. Reengineering NHS Hospitals in Greece: Redistribution Leads to Rational Mergers.

    Science.gov (United States)

    Nikolentzos, Athanasios; Kontodimopoulos, Nick; Polyzos, Nikolaos; Thireos, Eleftherios; Tountas, Yannis

    2015-03-18

    The purpose of this study was to record and evaluate the existing public hospital infrastructure of the National Health System (NHS), in terms of clinics and laboratories, as well as the healthcare workforce in each of these units and in every health region in Greece, in an attempt to optimize the allocation of these resources. An extensive analysis of raw data according to supply and performance indicators was performed to serve as a solid and objective scientific baseline for the proposed reengineering of the Greek public hospitals. Suggestions for "reshuffling" clinics and diagnostic laboratories, and their personnel, were made by using a best-versus-worst outcome indicator approach at a regional and national level. This study is expected to contribute to the academic debate about the gap between theory and evidence-based decision-making in health policy.

  14. Reconfigurable network systems and software-defined networking

    OpenAIRE

    Zilberman, N.; Watts, P. M.; Rotsos, C.; Moore, A. W.

    2015-01-01

    Modern high-speed networks have evolved from relatively static networks to highly adaptive networks facilitating dynamic reconfiguration. This evolution has influenced all levels of network design and management, introducing increased programmability and configuration flexibility. This influence has extended from the lowest level of physical hardware interfaces to the highest level of network management by software. A key representative of this evolution is the emergence of software-defined n...

  15. Proposing an Evidence-Based Strategy for Software Requirements Engineering.

    Science.gov (United States)

    Lindoerfer, Doris; Mansmann, Ulrich

    2016-01-01

    This paper discusses an evidence-based approach to software requirements engineering. The approach is called evidence-based since it uses publications on the specific problem as a surrogate for stakeholder interests and as a basis for formulating risks and testing experiences. This complements the idea, central to agile software development models, that requirements and solutions evolve through collaboration between self-organizing cross-functional teams. The strategy is exemplified and applied to the development of a Software Requirements list used to develop software systems for patient registries.

  16. Development of Flexible Software Process Lines with Variability Operations

    DEFF Research Database (Denmark)

    Schramm, Joachim; Dohrmann, Patrick; Kuhrmann, Marco

    2015-01-01

    families of processes and, as part of this, variability operations provide means to modify and reuse pre-defined process assets. Objective: Our goal is to evaluate the feasibility of variability operations to support the development of flexible software process lines. Method: We conducted a longitudinal......Context: Software processes evolve over time and several approaches were proposed to support the required flexibility. Yet, little is known whether these approaches sufficiently support the development of large software processes. A software process line helps to systematically develop and manage...

  17. Critical evaluation of reverse engineering tool Imagix 4D!

    Science.gov (United States)

    Yadav, Rashmi; Patel, Ravindra; Kothari, Abhay

    2016-01-01

    Legacy code is difficult to comprehend. Various commercial reengineering tools are available, each with its own working style, inherent capabilities and shortcomings. The focus of the available tools is on visualizing static behavior, not dynamic behavior. Comprehension is therefore difficult for people who work in software product maintenance, code understanding and reengineering/reverse engineering. Consequently, the need for a comprehensive reengineering/reverse engineering tool arises. We found Imagix 4D useful, as it generates the most extensive pictorial representations, in the form of flow charts, flow graphs, class diagrams, metrics and, to a partial extent, dynamic visualizations. We evaluated Imagix 4D with the help of a case study involving a few samples of source code. The behavior of the tool was analyzed on multiple small codes and on a large code, the gcc C parser. The large-code evaluation was performed to uncover dead code, unstructured code, and the effect of not including required files at the preprocessing level. The decision density and complexity metrics Imagix 4D prepares for a large code were found useful in determining how much reengineering is required. On the other hand, Imagix 4D showed limitations in dynamic visualization, flow chart separation (for large code) and parsing loops. The outcome of the evaluation will eventually help in upgrading Imagix 4D, and it points to the need for full-featured tools in the area of software reengineering/reverse engineering. It will also help the research community, especially those interested in building software reengineering/reverse engineering tools.

  18. The evolving marriage of hardware and software, as seen from the openlab perspective

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    This talk will give an overview of the activities of the openlab Platform Competence Center, which collaborates with Intel. The problem of making hardware and software talk to each other efficiently has been around since computers were first conceived, and current times are no different. We will report on the related R&D activities of the openlab PCC, touching on topics ranging from hardware platforms, through compilers, to next-generation physics software. We will also relate this to relevant practice in industry, which has made significant progress in the last decade.

  19. Tracking code patterns over multiple software versions with Herodotos

    DEFF Research Database (Denmark)

    Palix, Nicolas Jean-Michel; Lawall, Julia; Muller, Gilles

    2010-01-01

    An important element of understanding a software code base is to identify the repetitive patterns of code it contains and how these evolve over time. Some patterns are useful to the software, and may be modularized. Others are detrimental to the software, such as patterns that represent defects. The Herodotos tool tracks pattern occurrences over multiple versions of a software project, independent of other changes in the source files. Guided by a user-provided configuration file, Herodotos builds various graphs showing the evolution of the pattern occurrences and computes some statistics. We have evaluated this approach...
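
    The core idea, counting a pattern's occurrences in each version independently of unrelated changes, can be reduced to a toy sketch. This is not Herodotos itself (which correlates occurrences across versions and is driven by a configuration file); the versions, pattern, and snippets are invented:

        import re

        versions = {                                  # tag -> source snapshot
            "v1.0": "lock(a); unlock(a); lock(b);",   # suspect: lock(b) unreleased
            "v1.1": "lock(a); unlock(a);",
        }
        pattern = re.compile(r"lock\(\w+\)")

        # Count the pattern in each snapshot to see how it evolves.
        for tag in sorted(versions):
            hits = pattern.findall(versions[tag])
            print(tag, len(hits), "occurrence(s):", hits)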

  20. Storage system software solutions for high-end user needs

    Science.gov (United States)

    Hogan, Carole B.

    1992-01-01

    Today's high-end storage user is one who requires rapid access to a reliable terabyte-capacity storage system running in a distributed environment. This paper discusses conventional storage system software and concludes that this software, designed for other purposes, cannot meet high-end storage requirements. The paper also reviews the philosophy and design of evolving storage system software. It concludes that this new software, designed with high-end requirements in mind, has the potential to solve not only the storage needs of today but also those of the foreseeable future.

  1. Dynamic Capabilities and Project Management in Small Software Companies

    DEFF Research Database (Denmark)

    Nørbjerg, Jacob; Nielsen, Peter Axel; Persson, John Stouby

    2017-01-01

    A small software company depends on its capability to adapt to rapid technological and other changes in its environment, that is, its dynamic capabilities. In this paper, we argue that to evolve and maintain its dynamic capabilities a small software company must pay attention to the interaction between dynamic capabilities at different levels of the company, particularly between the project management and the company levels. We present a case study of a small software company and show how successful dynamic capabilities at the company level can affect project management in small software companies...

  2. The Battle for the Soul of Management Denmark-The Shaping of the Danish Versions of Business Process Reengineering

    DEFF Research Database (Denmark)

    Koch, Christian; Vogelius, Peter

    1997-01-01

    Managerial theory distilled into tidy concepts is continually and almost ritually launched at international and national management audiences. The paper discusses the contemporary exemplar of such management concepts: Business Process Reengineering (BPR). By taking the management fads seriously... to the concept. And on the other hand, the need for the consultants to differentiate their product (the concept) from others available on the market.

  3. Business Process Reengineering Of Funding On Indonesia’s Islamic Banks

    Directory of Open Access Journals (Sweden)

    Aslam Mei Nur Widigdo

    2016-02-01

    This research attempts to analyze the value chain of Islamic banking business processes and to develop a business process model for depositors' funds in order to improve the performance of Islamic banks. Four models of Islamic banking operating in Indonesia are used as the objects of the study. This research applies a qualitative study (exploratory approach) and utilizes primary data obtained from questionnaires and interviews. These data are then processed by value stream mapping and process activity mapping. This study shows that the waiting time for services is the sub-stage of the business process that has no value added and is categorized as pure waste based on VSM criteria. The reengineering of the business process of third-party fundraising may reduce collection time by up to 1490 minutes for corporate customers and 22 minutes for individual customers. DOI: 10.15408/aiq.v8i1.2506

  4. Lightweight Methods for Effective Verification of Software Product Lines with Off-the-Shelf Tools

    DEFF Research Database (Denmark)

    Iosif-Lazar, Alexandru Florin

    Certification is the process of assessing the quality of a product and whether it meets a set of requirements and adheres to functional and safety standards. It is often legally required to provide a guarantee for human safety and to make the product available on the market. The certification process relies on objective evidence of quality, which is produced by using qualified and state-of-the-art tools and verification and validation techniques. Software product line (SPL) engineering distributes costs among similar products that are developed simultaneously. However, SPL certification faces major... SPL reengineering projects that involve complex source code transformations. To facilitate product (re)certification, the transformation must preserve certain qualitative properties such as code structure and semantics, a difficult task due to the complexity of the transformation and because certain...

  5. Rural district hospitals - essential cogs in the district health system - and primary healthcare re-engineering.

    Science.gov (United States)

    le Roux, K W D P; Couper, I

    2015-06-01

    The re-engineering of primary healthcare (PHC) is regarded as an essential precursor to the implementation of National Health Insurance in South Africa, but improvements in the provision of PHC services have been patchy. The authors contend that the role of well-functioning rural district hospitals as a hub from which PHC services can be most efficiently managed has been underestimated, and that the management of district hospitals and PHC clinics needs to be co-located at the level of the rural district hospital, to allow for proper integration of care and effective healthcare provision.

  6. A Re-Engineered Software Interface and Workflow for the Open-Source SimVascular Cardiovascular Modeling Package.

    Science.gov (United States)

    Lan, Hongzhi; Updegrove, Adam; Wilson, Nathan M; Maher, Gabriel D; Shadden, Shawn C; Marsden, Alison L

    2018-02-01

    Patient-specific simulation plays an important role in cardiovascular disease research, diagnosis, surgical planning and medical device design, as well as education in cardiovascular biomechanics. SimVascular is an open-source software package encompassing an entire cardiovascular modeling and simulation pipeline from image segmentation, three-dimensional (3D) solid modeling, and mesh generation, to patient-specific simulation and analysis. SimVascular is widely used for cardiovascular basic science and clinical research as well as education, following increased adoption by users and development of a GATEWAY web portal to facilitate educational access. Initial efforts of the project focused on replacing commercial packages with open-source alternatives and adding increased functionality for multiscale modeling, fluid-structure interaction (FSI), and solid modeling operations. In this paper, we introduce a major SimVascular (SV) release that includes a new graphical user interface (GUI) designed to improve user experience. Additional improvements include enhanced data/project management, interactive tools to facilitate user interaction, new boundary condition (BC) functionality, a plug-in mechanism to increase modularity, a new 3D segmentation tool, and new computer-aided design (CAD)-based solid modeling capabilities. Here, we focus on major changes to the software platform and outline features added in this new release. We also briefly describe our recent experiences using SimVascular in the classroom for bioengineering education.

  7. Improving the Customer Configuration Update Process by Explicitly Managing Software Knowledge

    NARCIS (Netherlands)

    Slinger, S.R.L.

    2006-01-01

    The implementation and continuous support of a software product at a customer with evolving requirements is a complex task for a product software vendor. There are many customers for the vendor to serve, all of whom might require their own version or variant of the application. Furthermore, the

  8. Ragnarok: An Architecture Based Software Development Environment

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    of the development process. The main contributions presented in the thesis have evolved from work with two of the hypotheses: these address the problems of management of evolution, and of overview, comprehension and navigation, respectively. The first main contribution is the Architectural Software Configuration...... Management Model: a software configuration management model where the abstractions and hierarchy of the logical aspect of software architecture form the basis for version control and configuration management. The second main contribution is the Geographic Space Architecture Visualisation Model......: a visualisation model where entities in a software architecture are organised geographically in a two-dimensional plane, their visual appearance determined by processing a subset of the data in the entities, and interaction with the project's underlying data performed by direct manipulation of the landscape......

  9. Managing your practice's first impression: the process of front-desk reengineering.

    Science.gov (United States)

    Walsh, Alison L

    2004-01-01

    Patients must be regarded as consumers. As such, they are increasingly informed, questioning, cost-conscious, technologically savvy, and demanding. Just as health plans have developed defined contribution products that offer consumers more control over how and where their health-care dollars are spent, practice success is linked to reengineering office operations to offer consumers and patients greater choice, control, autonomy, and service. Patients and consumers want practices that deliver clinical and business services that meet the criteria of reliability, efficiency, service offerings, patient focus, enthusiasm, customization, and trust. Physician practices must also take care to avoid destructive and disruptive behaviors and conditions such as noise, interference, excessive repetition, long waits, appointment delays, and staff rudeness. A successful patient-focused practice emerges when physicians and office staff begin to look at the clinical and service experience through the patient's eyes.

  10. El CAD en la actividad de reingeniería e ingeniería en los mantenimientos a centrales eléctricas // CAD in the reengineering and engineering activity in maintenance of power plants

    Directory of Open Access Journals (Sweden)

    R. García Ramírez

    2000-07-01

    Full Text Available This paper presents some of the experience gained in engineering and reengineering activities carried out by our company during the maintenance of power plants using CAD (Computer Aided Design). It also describes the strategies followed to automate the reengineering activity on computers and to achieve economic improvements by reducing production costs. Key words: CAD, reengineering, maintenance, boiler.

  11. Implementation of 5S tools as a starting point in business process reengineering

    Directory of Open Access Journals (Sweden)

    Vorkapić Miloš

    2017-01-01

    Full Text Available The paper deals with the analysis of elements that represent a starting point in the implementation of business process reengineering. In our research we have used Lean tools through the analysis of the 5S model. Using the example of the finalization of a finished transmitter in IHMT-CMT production, 5S tools were implemented with a focus on quality elements, although theory holds that BPR and TQM are two opposing activities in an enterprise. We wanted to highlight the significance of employees' self-discipline, which helps the process of product finalization proceed on time and without waste and losses. In addition, the employees keep their workplace clean, tidy and functional.

  12. A Smart Mobile Lab-on-Chip-Based Medical Diagnostics System Architecture Designed For Evolvability

    DEFF Research Database (Denmark)

    Patou, François; Dimaki, Maria; Svendsen, Winnie Edith

    2015-01-01

    for this work. We introduce a smart-mobile and LoC-based system architecture designed for evolvability. By propagating LoC programmability, instrumentation, and control tools to the high-level abstraction smart-mobile software layer, our architecture facilitates the realisation of new use...

  13. A measurement system for large, complex software programs

    Science.gov (United States)

    Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.

    1994-01-01

    This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.
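    The relationship sketched above, between the share of project labor devoted to independent verification and validation (IV&V) and the delivered error rate, can be illustrated with a toy model. The functional form and all coefficients below are invented for illustration; the paper does not publish its model equations.

```python
# Illustrative (hypothetical) cost/quality model: the delivered error rate
# falls as a larger share of project labor goes to independent verification
# and validation (IV&V). All coefficients are invented for illustration.

def expected_error_rate(criticality, ivv_fraction):
    """Expected delivered defects per KSLOC.

    criticality  -- multiplier >= 1.0 (higher for safety-critical software)
    ivv_fraction -- share of project labor spent on IV&V (0.0 to 0.5)
    """
    base_rate = 5.0                        # defects/KSLOC with no IV&V (assumed)
    reduction = 1.0 - 1.6 * ivv_fraction   # assumed linear payoff from IV&V
    return max(base_rate * criticality * reduction, 0.1)

def expected_defects(ksloc, criticality, ivv_fraction):
    return ksloc * expected_error_rate(criticality, ivv_fraction)

if __name__ == "__main__":
    for ivv in (0.0, 0.1, 0.2, 0.3):
        print(f"IV&V share {ivv:.0%}: "
              f"{expected_defects(100, 1.2, ivv):.1f} expected defects")
```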

  14. Development of a software for the design of custom-made hip prostheses using an open-source rapid application development environment.

    Science.gov (United States)

    Viceconti, M; Testi, D; Gori, R; Zannoni, C

    2000-01-01

    The present work describes a technology transfer project called HIPCOM, devoted to the re-engineering of the process used by a medical devices manufacturer to design custom-made hip prostheses. Although it started with insufficient support from the end-user management, a very tight schedule and a moderate budget, the project developed into what is considered by all partners a success story. In particular, the development of the design software, called the HIPCOM Interactive Design Environment (HIDE), was completed in a time shorter than any optimistic expectation. The software was quite stable from its first beta version, and once introduced at the user site it fully replaced the original procedure in less than two months. One year after the early adoption, more than 80 custom-made prostheses had been designed with HIDE and the user had reported only two bugs, both cosmetic. The purpose of the present work is to report the development experience and to investigate the reasons for these positive results, with particular reference to the development procedure and the software architecture. The choice of Tcl/Tk as the development language and the adoption of a well-defined software architecture were found to be the key success factors. Other important determinants were the adoption of an incremental software engineering strategy, well suited to small and medium projects, and the presence of a technology transfer expert on the development staff.

  15. Intrinsic Motivation in Open Source Software Development

    DEFF Research Database (Denmark)

    Bitzer, J.; W., Schrettl,; Schröder, Philipp

    2004-01-01

    This paper sheds light on the puzzling evidence that even though open source software (OSS) is a public good, it is developed for free by highly qualified, young and motivated individuals, and evolves at a rapid pace. We show that once OSS development is understood as the private provision...

  16. Swarming Robot Design, Construction and Software Implementation

    Science.gov (United States)

    Stolleis, Karl A.

    2014-01-01

    This paper presents an overview of the hardware design, construction, software design and software implementation of a small, low-cost robot to be used for swarming robot development. In addition to the work done on the robot, a full simulation of the robotic system was developed using the Robot Operating System (ROS) and its associated simulation tools. The robots will eventually be used to explore evolving behaviors via genetic algorithms, building on the work done at the University of New Mexico Biological Computation Lab.
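    As a rough illustration of how behaviors might be evolved for such robots, the sketch below shows a minimal genetic-algorithm loop. The genome encoding, fitness function, and parameters are invented placeholders; in practice the fitness would be scored by running the behavior in the ROS simulation mentioned above.

```python
import random

# Minimal genetic-algorithm loop of the kind used to evolve robot behaviors.
# Genome encoding, fitness, and parameters are placeholders, not the paper's.

GENOME_LEN, POP_SIZE, GENERATIONS = 8, 20, 50

def fitness(genome):
    # Placeholder fitness: reward genomes whose parameters sum near 4.0.
    # A real run would score swarm behavior in simulation instead.
    return -abs(sum(genome) - 4.0)

def mutate(genome, rate=0.2):
    return [g + random.gauss(0, 0.2) if random.random() < rate else g
            for g in genome]

population = [[random.uniform(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    parents = sorted(population, key=fitness, reverse=True)[:POP_SIZE // 2]
    population = parents + [mutate(random.choice(parents))
                            for _ in range(POP_SIZE - len(parents))]

print("best fitness:", round(max(map(fitness, population)), 3))
```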

  17. Reengineering of Permanent Mould Casting with Lean Manufacturing Methods

    Directory of Open Access Journals (Sweden)

    R. Władysiak

    2007-07-01

    Full Text Available This work introduces the main areas of a production system design for castings produced in permanent moulds, constituting a reengineering of the conventional production system according to Lean Manufacturing (LM) methods. A new solution for cooling dies with water mist is presented for the casting of car wheels made from aluminium alloys in the low-pressure casting process. It was implemented as part of a goal-oriented project at R.H. Alurad Sp. z o.o. in Gorzyce. Its use intensifies the solidification and self-cooling of castings, shortening the casting cycle time by 30%. The paper describes the reorganization of casting stations into multi-machine production cells and the process of fast tool exchange using the SMED method. A system for controlling foundry production with a computer-aided lightweight Kanban system is described, and the production of castings is visualized using the value stream mapping method. The results proved that applying the new casting method and LM methods made it possible to eliminate downtime, reduce stock levels, and increase productivity and the flow of castings production.

  18. Construction of RNA nanocages by re-engineering the packaging RNA of Phi29 bacteriophage

    Science.gov (United States)

    Hao, Chenhui; Li, Xiang; Tian, Cheng; Jiang, Wen; Wang, Guansong; Mao, Chengde

    2014-05-01

    RNA nanotechnology promises rational design of RNA nanostructures with a wide array of structural diversities and functionalities. Such nanostructures could be used in applications such as small interfering RNA delivery and the organization of in vivo chemical reactions. Despite impressive development in recent years, RNA nanotechnology is still quite limited, and its programmability and complexity cannot yet rival those of its closely related cousin: DNA nanotechnology. Novel strategies are needed for programmed RNA self-assembly. Here, we have assembled RNA nanocages by re-engineering a natural, biological RNA motif: the packaging RNA of the phi29 bacteriophage. The resulting RNA nanostructures have been thoroughly characterized by gel electrophoresis, cryogenic electron microscopy imaging and dynamic light scattering.

  19. Reverse engineering for quality systems

    International Nuclear Information System (INIS)

    Nolan, A.J.

    1995-01-01

    When the age of software engineering began, many companies were faced with the problem of how to support older, pre-software-engineering programs. The techniques of reverse engineering and re-engineering were developed to bridge the gap between the past and the present. Although reverse engineering can be used to generate missing documentation, it can also be used as a means to demonstrate quality in these older programs. This paper presents, in the form of a case study, how Rolls-Royce and Associates Limited addressed the quality issues of reverse engineering and re-engineering. (author)

  20. Re-engineering of Bacterial Luciferase; For New Aspects of Bioluminescence.

    Science.gov (United States)

    Kim, Da-Som; Choi, Jeong-Ran; Ko, Jeong-Ae; Kim, Kangmin

    2018-01-01

    Bacterial luminescence is the end-product of biochemical reactions catalyzed by the luciferase enzyme. Nowadays, this fascinating phenomenon is widely used in reporters and/or sensors to detect a variety of biological and environmental processes. Enhancing or diversifying luciferase activities will increase the versatility of bacterial luminescence. Here, to establish a strategy for luciferase engineering, we summarize the identity and relevant roles of the key amino acid residues modulating luciferase in Vibrio harveyi, a model luminous bacterium. Current opinions on the crystal structures and the critical amino acid residues involved in the substrate binding sites and the unstructured loop are delineated. Based on these, potential target residues and/or parameters for enzyme engineering are also suggested on a limited scale. In conclusion, even though accurate knowledge of the bacterial luciferase is yet to be reported, structure-guided site-directed mutagenesis approaches targeting the regulatory amino acids will provide a useful platform for re-engineering bacterial luciferase in the future. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  1. Computational Intelligence in Software Cost Estimation: Evolving Conditional Sets of Effort Value Ranges

    OpenAIRE

    Papatheocharous, Efi; Andreou, Andreas S.

    2008-01-01

    In this approach we aimed at addressing the problem of large variances found in available historical data that are used in software cost estimation. Project data is expensive to collect, manage and maintain. Therefore, if we wish to lower the dependence of the estimation to

  2. Reengineering water treatment units for removal of Sr-90, I-129, Tc-99, and uranium from contaminated groundwater at the DOE's Savannah River Site

    International Nuclear Information System (INIS)

    Serkiz, S.M.

    2000-01-01

    The 33 years of active operation of the F- and H-Area Seepage Basins to dispose of liquid low-level radioactive waste at the Department of Energy's Savannah River Site has resulted in the contamination of the groundwater underlying these basins with a wide variety of radionuclides and stable metals. The current Resource Conservation and Recovery Act (RCRA) Part B permit requires the operation of a pump-and-treat system capable of (1) maintaining hydraulic control of a specified contaminated groundwater plume, (2) treatment of the extracted groundwater, and (3) reinjection of the treated water hydraulically upgradient of the basins. Two multimillion-dollar water treatment units (WTUs) were designed and built in 1997 and the basic design consists of (1) reverse osmosis concentration, (2) chemical addition, neutralization, precipitation, polymer addition, flocculation, and clarification of the reverse osmosis concentrate, and (3) final polishing of the clarified water by ion exchange (IX) and sorption. During startup of these units numerous process optimizations were identified and, therefore, the WTUs have been recently reengineered. A systematic approach of: (1) developing a technical baseline through laboratory studies, (2) scale-up and plant testing, (3) plant modification, and (4) system performance monitoring was the basis for reengineering the WTUs. Laboratory experiments were conducted in order to establish a technical baseline for further scale-up/plant testing and system modifications. These studies focused on the following three areas of the process: (1) contaminant removal during chemical addition, neutralization and precipitation, (2) solids separation by flocculation, coagulation, clarification, and filtration, and (3) contaminant polishing of the clarified liquid by IX/sorption. Using standard laboratory-scale jar tests, the influences of pH and Fe concentration on contaminant removal during precipitation/neutralization were evaluated. The results of

  3. Implementation plan for waste management reengineering at Oak Ridge National Laboratory

    International Nuclear Information System (INIS)

    Berry, J.B.

    1997-10-01

    An intensive reengineering evaluation of the Oak Ridge National Laboratory (ORNL) waste management program was conducted from February to July 1997, resulting in the following vision for ORNL waste management: ORNL Waste Management will become an integrated Waste Management/Generator function that: (1) Treats ORNL as a single generator for expert-based waste characterization and certification purposes; (2) Recognizes Generators, the Department of Energy (DOE), and the Management and Integration (M&I) contractor as equally important customers; (3) Focuses on pollution prevention followed by waste generation, collection, treatment, storage, and disposal operations that reflect more cost-effective commercial approaches; and (4) Incorporates new technology and outsourcing of services where appropriate to provide the lowest cost solutions. A cross-functional Core Team recommended 15 cost-effectiveness improvements that are expected to reduce the fiscal year (FY) 1996 ORNL waste management costs of $75M by $10-$15M annually. These efficiency improvements will be realized by both Research and Waste Management Organizations

  4. Beyond the computer-based patient record: re-engineering with a vision.

    Science.gov (United States)

    Genn, B; Geukers, L

    1995-01-01

    In order to achieve real benefit from the potential offered by a computer-based patient record, the capabilities of the technology must be applied along with true re-engineering of healthcare delivery processes. University Hospital recognizes this and is using systems implementation projects as the catalyst for transforming the way we care for our patients. Integration is fundamental to the success of these initiatives, and this must be explicitly planned against an organized systems architecture whose standards are market-driven. University Hospital also recognizes that Community Health Information Networks will offer improved quality of patient care at a reduced overall cost to the system. All of these implementation factors are considered up front as the hospital makes its initial decisions on how to computerize its patient records. This improves our chances for success and will provide a consistent vision to guide the hospital's development of new and better patient care.

  5. Continuous Software Quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The regular application of software quality tools in large collaborative projects is required to reduce code defects to an acceptable level. If left unchecked, the accumulation of defects invariably results in performance degradation at scale and problems with the long-term maintainability of the code. Although software quality tools are effective for identification, there remains a non-trivial sociological challenge to resolve defects in a timely manner. This is an ongoing concern for the ATLAS software, which has evolved over many years to meet the demands of Monte Carlo simulation, detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by a community of hundreds of developers worldwide. It is therefore preferable to address code defects before they are introduced into a widely used software release. Recent wholesale changes to the ATLAS software infrastructure have provided an ideal opportunity to apply software quali...

  6. Software refactoring at the package level using clustering techniques

    KAUST Repository

    Alkhalid, A.

    2011-01-01

    Enhancing, modifying or adapting software to new requirements increases its internal complexity. Software with a high level of internal complexity is difficult to maintain. Software refactoring reduces software complexity and hence decreases the maintenance effort. However, software refactoring becomes quite a challenging task as the software evolves. The authors use clustering as a pattern recognition technique to assist in software refactoring activities at the package level. The approach presents computer-aided support for identifying ill-structured packages and provides suggestions for the software designer to balance intra-package cohesion against inter-package coupling. A comparative study is conducted applying three different clustering techniques on different software systems. In addition, the application of refactoring at the package level using an adaptive k-nearest neighbour (A-KNN) algorithm is introduced. The authors compared the A-KNN technique with the other clustering techniques (viz. single linkage algorithm, complete linkage algorithm and weighted pair-group method using arithmetic averages). The new technique shows competitive performance with lower computational complexity. © 2011 The Institution of Engineering and Technology.
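    To make the idea concrete, here is a minimal sketch of dependency-based clustering at the class level: classes whose dependency sets are similar (by Jaccard similarity) are grouped into candidate packages, raising cohesion and lowering coupling. The class names and dependencies are toy data, and simple single-linkage merging stands in for the techniques compared in the paper; the A-KNN variant is not reproduced here.

```python
from itertools import combinations

# Toy data: class -> set of classes it depends on.
deps = {
    "OrderUI": {"Order", "Cart"},
    "CartUI":  {"Cart", "Order"},
    "Order":   {"Db"},
    "Cart":    {"Db"},
    "DbPool":  {"Db"},
}

def jaccard(a, b):
    """Similarity of two classes' dependency sets."""
    union = deps[a] | deps[b]
    return len(deps[a] & deps[b]) / len(union) if union else 0.0

def cluster(threshold=0.5):
    """Single-linkage merging: join clusters containing any pair of
    classes whose dependency similarity meets the threshold."""
    clusters = [{c} for c in deps]
    merged = True
    while merged:
        merged = False
        for x, y in combinations(clusters, 2):
            if max(jaccard(a, b) for a in x for b in y) >= threshold:
                clusters.remove(x)
                clusters.remove(y)
                clusters.append(x | y)
                merged = True
                break
    return clusters

print(cluster())   # suggested package groupings
```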

  7. Crossing the borders and the cultural gaps for educating PhDs in software engineering

    DEFF Research Database (Denmark)

    Knutas, Antti; Seffah, Ahmed; Sørensen, Lene Tolstrup

    2017-01-01

    Software systems have established themselves as the heart of business and everyday living, and as the pillar of the emerging global digital economy. This puts pressure on educational institutions to train people for the continuously evolving software industry, which puts additional demand for new...

  8. Summary of the International Conference on Software and System Processes

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; O'Connor, Rory V.; Perry, Dewayne E.

    2016-01-01

    The International Conference on Software and Systems Process (ICSSP), continuing the success of the Software Process Workshop (SPW), the Software Process Modeling and Simulation Workshop (ProSim) and the International Conference on Software Process (ICSP) conference series, has become the established...... premier event in the field of software and systems engineering processes. It provides a leading forum for the exchange of research outcomes and industrial best-practices in process development from the software and systems disciplines. ICSSP 2016 was held in Austin, Texas, from 14-15 May 2016, co-located with the 38th International Conference on Software Engineering (ICSE). The theme of ICSSP 2016 was studying "Process(es) in Action" by recognizing that the AS-Planned and AS-Practiced processes can be quite different in many ways, including their flows, their complexity and the evolving needs of stakeholders...

  9. Development of the re-engineered European decision support system for off-site nuclear and radiological emergencies - JRODOS. Application to air pollution transport modelling

    International Nuclear Information System (INIS)

    Ievdin, I.; Treebushny, D.; Raskob, W.; Zheleznyak, M.

    2008-01-01

    Full text: The European decision support system for nuclear and radiological emergencies, RODOS, includes a set of numerical models simulating the transport of radionuclides in the environment, estimating potential doses to the public, and simulating and evaluating the efficiency of countermeasures. The re-engineering of the RODOS system using Java technology has recently started; it will allow the new system, called JRODOS, to run on nearly any computational platform with a Java virtual machine. Modern software development approaches were used for the JRODOS system architecture and implementation: distributed system design (client, management server, computational server), geo-database utilization, a plug-in model structure and OpenMI-like compatibility to support seamless model inter-connection. Stable open source components such as an ORM solution (Hibernate), an OpenGIS component (Geotools) and a charting/reporting component (JFree, Pentaho) were utilized to optimize the development effort and allow fast completion of the project. The architecture of the system is presented and illustrated for the atmospheric dispersion module ALSMC (Atmospheric Local Scale Model Chain), which performs calculations of atmospheric pollution transport and the corresponding acute doses and dose rates. The example application is based on a synthetic scenario of a release from a nuclear power plant located in Europe. (author)
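    The "OpenMI-like compatibility" mentioned above amounts to models declaring which quantities they provide and require so a driver can wire them into a chain. The sketch below illustrates that general pattern; all class names, quantity names, and formulas are hypothetical and are not the actual JRODOS API.

```python
# Sketch of an OpenMI-style model chain: each component declares the
# quantities it requires/provides, and a driver passes outputs downstream.
# Names and formulas are invented, not the JRODOS interfaces.

class Component:
    provides, requires = (), ()
    def compute(self, inputs):
        raise NotImplementedError

class DispersionModel(Component):
    provides, requires = ("air_concentration",), ("release_rate", "wind")
    def compute(self, inputs):
        # Toy dilution formula standing in for a real dispersion model.
        return {"air_concentration":
                inputs["release_rate"] / max(inputs["wind"], 0.1)}

class DoseModel(Component):
    provides, requires = ("dose_rate",), ("air_concentration",)
    def compute(self, inputs):
        # Toy dose-conversion coefficient (invented).
        return {"dose_rate": 3.6e-4 * inputs["air_concentration"]}

def run_chain(components, initial):
    state = dict(initial)
    for comp in components:                 # assumes topological order
        state.update(comp.compute({k: state[k] for k in comp.requires}))
    return state

print(run_chain([DispersionModel(), DoseModel()],
                {"release_rate": 1.0e6, "wind": 5.0}))
```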

  10. Automation of procedure writing for the RLWTF

    International Nuclear Information System (INIS)

    Farnham, M.; MacDonald, A.

    1998-01-01

    In August of 1997, the Radioactive Liquid Waste Treatment Facility (RLWTF) at Los Alamos National Laboratory (LANL) recognized the need to re-engineer its document management business process. All nuclear facilities at LANL are required to ensure that both the latest approved revision of controlled documents and any changes to those documents are available to operating personnel at all times. The Nuclear Materials Technology (NMT) Division was also re-engineering its document management business processes and searching for a solution. Both groups contacted several internal and external organizations in search of potential software solutions in use that would meet their requirements. This report describes the objectives and features required of the software package, the choice of Procedure Design as the software package, and its implementation

  11. Reengineering of the business process in the Serbian post's department for express parcel service

    Directory of Open Access Journals (Sweden)

    Lazarević Dragan M.

    2015-01-01

    Full Text Available This paper describes a model that solves the problem of exceeded time limits in the express parcel shipping system of the Post of Serbia. The existing principle of organizing service areas is explained, as well as the problem of exceeded time limits that leads to delayed service to the user. Two approaches to solving the problem are suggested. Through these two approaches, the existing business processes are reengineered to some extent, and the result is presented in BPMN notation. The first approach is based on fuzzy set theory, i.e. fuzzy logic systems, while the other is based on a 'zoning-routing' algorithm.
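    As a flavor of what a fuzzy-set formulation of such a problem could look like, the sketch below grades how "overloaded" a delivery zone is from its parcel volume using a trapezoidal membership function, so borderline zones could be re-assigned before deadlines are exceeded. The membership breakpoints and the scenario are invented for illustration; the paper's actual fuzzy system is not reproduced.

```python
# Illustrative fuzzy membership function: grade zone overload from
# parcels per courier. Breakpoints are invented, not the paper's.

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: rises over a..b, flat at 1 over b..c,
    falls over c..d, and is 0 outside a..d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def overload_grade(parcels_per_courier):
    return trapezoid(parcels_per_courier, 40, 60, 120, 140)

for load in (35, 50, 90, 130):
    print(f"{load} parcels/courier -> overload {overload_grade(load):.2f}")
```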

  12. Evolvability Search: Directly Selecting for Evolvability in order to Study and Produce It

    DEFF Research Database (Denmark)

    Mengistu, Henok; Lehman, Joel Anthony; Clune, Jeff

    2016-01-01

    of evolvable digital phenotypes. Although some types of selection in evolutionary computation indirectly encourage evolvability, one unexplored possibility is to directly select for evolvability. To do so, we estimate an individual's future potential for diversity by calculating the behavioral diversity of its...... immediate offspring, and select organisms with increased offspring variation. While the technique is computationally expensive, we hypothesized that direct selection would better encourage evolvability than indirect methods. Experiments in two evolutionary robotics domains confirm this hypothesis: in both...... domains, such Evolvability Search produces solutions with higher evolvability than those produced with Novelty Search or traditional objective-based search algorithms. Further experiments demonstrate that the higher evolvability produced by Evolvability Search in a training environment also generalizes...
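    The selection scheme described above can be sketched in a few lines: an individual's score is the behavioral diversity of a sample of its immediate offspring, so selection directly favors lineages that can vary. The genome, mutation operator, and behavior descriptor below are toy stand-ins, not the paper's actual evolutionary-robotics setup.

```python
import random

# Sketch of Evolvability Search: score each individual by the behavioral
# diversity of its immediate offspring, then keep the most evolvable ones.

def mutate(genome):
    return [g + random.gauss(0, 0.3) for g in genome]

def behavior(genome):
    # Toy behavior descriptor: a 2-D point derived from the genome.
    return (sum(genome[::2]), sum(genome[1::2]))

def offspring_diversity(genome, n_offspring=10):
    pts = [behavior(mutate(genome)) for _ in range(n_offspring)]
    dists = [((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
             for i, a in enumerate(pts) for b in pts[i + 1:]]
    return sum(dists) / len(dists)   # mean pairwise behavioral distance

population = [[random.uniform(-1, 1) for _ in range(6)] for _ in range(30)]
for generation in range(20):
    ranked = sorted(population, key=offspring_diversity, reverse=True)
    parents = ranked[:10]                       # keep the most evolvable
    population = parents + [mutate(random.choice(parents))
                            for _ in range(20)]
```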

  13. Usability study of clinical exome analysis software: top lessons learned and recommendations.

    Science.gov (United States)

    Shyr, Casper; Kushniruk, Andre; Wasserman, Wyeth W

    2014-10-01

    dramatically accelerate clinician analysis and interpretation of patient genomic data. We present the first application of usability methods to evaluate software interfaces in the context of exome analysis. Our results highlight how the study of user responses can lead to the identification of usability issues and challenges and reveal software reengineering opportunities for improving clinical next-generation sequencing analysis. While the evaluation focused on two distinct software tools, the results are general and should inform active and future software development for genome analysis software. As large-scale genome analysis becomes increasingly common in healthcare, it is critical that efficient and effective software interfaces are provided to accelerate clinical adoption of the technology. Implications for improved design of such applications are discussed. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  14. Re-engineering the process of medical imaging physics and technology education and training.

    Science.gov (United States)

    Sprawls, Perry

    2005-09-01

    The extensive availability of digital technology provides an opportunity for enhancing both the effectiveness and efficiency of virtually all functions in the process of medical imaging physics and technology education and training. This includes degree granting academic programs within institutions and a wide spectrum of continuing education lifelong learning activities. Full achievement of the advantages of technology-enhanced education (e-learning, etc.) requires an analysis of specific educational activities with respect to desired outcomes and learning objectives. This is followed by the development of strategies and resources that are based on established educational principles. The impact of contemporary technology comes from its ability to place learners into enriched learning environments. The full advantage of a re-engineered and implemented educational process involves changing attitudes and functions of learning facilitators (teachers) and resource allocation and sharing both within and among institutions.

  15. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Argonne National Lab. (ANL), Argonne, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); LeCompte, Tom [Argonne National Lab. (ANL), Argonne, IL (United States); Marshall, Zach [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Borgland, Anders [SLAC National Accelerator Lab., Menlo Park, CA (United States); Viren, Brett [Brookhaven National Lab. (BNL), Upton, NY (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Asai, Makato [SLAC National Accelerator Lab., Menlo Park, CA (United States); Bauerdick, Lothar [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Finkel, Hal [Argonne National Lab. (ANL), Argonne, IL (United States); Gottlieb, Steve [Indiana Univ., Bloomington, IN (United States); Hoeche, Stefan [SLAC National Accelerator Lab., Menlo Park, CA (United States); Sheldon, Paul [Vanderbilt Univ., Nashville, TN (United States); Vay, Jean-Luc [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Elmer, Peter [Princeton Univ., NJ (United States); Kirby, Michael [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Patton, Simon [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Potekhin, Maxim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yanny, Brian [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Calafiura, Paolo [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dart, Eli [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gutsche, Oliver [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Izubuchi, Taku [Brookhaven National Lab. (BNL), Upton, NY (United States); Lyon, Adam [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Petravick, Don [Univ. of Illinois, Urbana-Champaign, IL (United States). National Center for Supercomputing Applications (NCSA)

    2015-10-29

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  16. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2015-10-28

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  17. Configuring the Orion Guidance, Navigation, and Control Flight Software for Automated Sequencing

    Science.gov (United States)

    Odegard, Ryan G.; Siliwinski, Tomasz K.; King, Ellis T.; Hart, Jeremy J.

    2010-01-01

    The Orion Crew Exploration Vehicle is being designed with greater automation capabilities than any other crewed spacecraft in NASA's history. The Guidance, Navigation, and Control (GN&C) flight software architecture is designed to provide a flexible and evolvable framework that accommodates increasing levels of automation over time. Within the GN&C flight software, a data-driven approach is used to configure software. This approach allows data reconfiguration and updates to automated sequences without requiring recompilation of the software. Because of the great dependency of the automation and the flight software on the configuration data, the data management is a vital component of the processes for software certification, mission design, and flight operations. To enable the automated sequencing and data configuration of the GN&C subsystem on Orion, a desktop database configuration tool has been developed. The database tool allows the specification of the GN&C activity sequences, the automated transitions in the software, and the corresponding parameter reconfigurations. These aspects of the GN&C automation on Orion are all coordinated via data management, and the database tool provides the ability to test the automation capabilities during the development of the GN&C software. In addition to providing the infrastructure to manage the GN&C automation, the database tool has been designed with capabilities to import and export artifacts for simulation analysis and documentation purposes. Furthermore, the database configuration tool, currently used to manage simulation data, is envisioned to evolve into a mission planning tool for generating and testing GN&C software sequences and configurations. A key enabler of the GN&C automation design, the database tool allows both the creation and maintenance of the data artifacts, as well as serving the critical role of helping to manage, visualize, and understand the data-driven parameters both during software development
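    The data-driven pattern described here can be illustrated with a small sketch: the activity sequence and its parameter reconfigurations live in a data table (which a database tool could regenerate), so the software's behavior changes without recompilation. The sequence contents and parameter names below are hypothetical, not Orion's actual configuration data.

```python
# Sketch of data-driven sequencing: modes and parameter overrides live in
# a data table the software loads at run time. All contents are invented.

SEQUENCE = [  # (activity, parameter overrides) -- hypothetical data file
    ("coast",     {"nav_filter": "loose"}),
    ("burn_prep", {"nav_filter": "tight", "rcs_enabled": True}),
    ("burn",      {"throttle": 1.0}),
]

class Gnc:
    def __init__(self):
        # Default parameters; overridden per activity from the data table.
        self.params = {"nav_filter": "loose",
                       "rcs_enabled": False,
                       "throttle": 0.0}

    def run(self, sequence):
        for activity, overrides in sequence:
            self.params.update(overrides)   # reconfigure from data only
            print(f"{activity}: {self.params}")

Gnc().run(SEQUENCE)
```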

  18. Evolvable synthetic neural system

    Science.gov (United States)

    Curtis, Steven A. (Inventor)

    2009-01-01

    An evolvable synthetic neural system includes an evolvable neural interface operably coupled to at least one neural basis function. Each neural basis function includes an evolvable neural interface operably coupled to a heuristic neural system to perform high-level functions and an autonomic neural system to perform low-level functions. In some embodiments, the evolvable synthetic neural system is operably coupled to one or more evolvable synthetic neural systems in a hierarchy.

  19. Status of REBUS fuel management software development for RERTR applications

    International Nuclear Information System (INIS)

    Olson, Arne P.

    2000-01-01

    The REBUS-5 burnup code has evolved substantially in order to meet the needs of the ANL RERTR Program. This paper presents a summary of the past changes and improvements in the capabilities of this software, and also identifies future plans. (author)

  20. Brooklyn Union strategy: Re-engineering from outside in

    International Nuclear Information System (INIS)

    Parker, W.P. Jr.

    1997-01-01

    Five years ago, the management at Brooklyn Union embarked on a long, hard look at the way the company conducted business. In effect, they stepped into their customers' shoes. Business Process Improvement (BPI) is designed to construct a lasting corporate culture that can help Brooklyn Union meet its stated goal of becoming the premier energy company in the Northeast. A major component of that culture involves a dedication to service and cost management that is as solid as their credit ratings. To date, the bottom line on BPI has been impressive: by 1995, the customer satisfaction rating, which had been hovering in the 80s, had shot up to 95%. The management commitment has come in the form of resources and a willingness to put its money where its mouth is (rewards for performance). The employee buy-in has shown up in those outstanding ratings from customers and in the financial results. Changing the culture of any long-established entity is never easy, whether it be on the micro-level (a family, for instance) or the macro-level (a country). It involves issues of trust, and a certain leap of faith that the new approach will bring results. Communication and education are two of the keys to gaining that participation. The company was able to impress upon employees the need for change--in particular the need for them to begin thinking like customers. The paper discusses the implementation of this re-engineering strategy

  1. The ALMA Common Software as a Basis for a Distributed Software Development

    Science.gov (United States)

    Raffi, Gianni; Chiozzi, Gianluca; Glendenning, Brian

    The Atacama Large Millimeter Array (ALMA) is a joint project involving astronomical organizations in Europe, North America and Japan. ALMA will consist of 64 12-m antennas operating in the millimetre and sub-millimetre wavelength range, with baselines of more than 10 km. It will be located at an altitude above 5000 m in the Chilean Atacama desert. The ALMA Computing group is a joint group with staff scattered across three continents, responsible for all the control and data flow software related to ALMA, including tools ranging from support of proposal preparation to archive access of automatically created images. Early in the project it was decided that an ALMA Common Software (ACS) would be developed as a way to provide all partners involved in the development with a common software platform. The original assumption was that some key middleware like communication via CORBA and the use of XML and Java would be part of the project. It was intended from the beginning to develop this software in an incremental way based on releases, so that it would then evolve into an essential embedded part of all ALMA software applications. In this way we would build a basic unity and coherence into a system developed in a distributed fashion. This paper evaluates our progress after 1.5 years of work, following a few tests and preliminary releases. It analyzes the advantages and difficulties of such an ambitious approach, which creates an interface across all the various control and data flow applications.

  2. The CMS software performance at the start of data taking

    CERN Document Server

    Benelli, Gabriele

    2009-01-01

    The CMS software framework (CMSSW) is a complex project evolving very rapidly as the first LHC colliding beams approach. The computing requirements constrain performance in terms of CPU time, memory footprint and event size on disk, to allow for planning and managing the computing infrastructure necessary to handle the needs of the experiment. A suite of performance tools has been developed to track all aspects of code performance through the software release cycles, allowing for regression detection and guiding code development for optimization. In this talk, we describe the CMSSW performance suite tools used and present some sample performance results from the release integration process for the CMS software.

  3. Intrinsic Motivation versus Signaling in Open Source Software Development

    DEFF Research Database (Denmark)

    Bitzer, J; Schrettl, W; Schröder, P

    This paper sheds light on the puzzling fact that even though open source software (OSS) is a public good, it is developed for free by highly qualified, young, motivated individuals, and evolves at a rapid pace. We show that when OSS development is understood as the private provision of a public...

  4. A Framework for the Management of Evolving Requirements in Software Systems Supporting Network-Centric Warfare

    National Research Council Canada - National Science Library

    Reynolds, Linda K

    2006-01-01

    .... There are many sources of requirements for these software systems supporting NCO, which may increase in number as the Services continue to develop the capabilities necessary for the transformation...

  5. Re-engineering the nuclear medicine residency curriculum in the new era of PET imaging: Perspectives on PET education and training in the Philippine context

    International Nuclear Information System (INIS)

    Pascual, T.N.; Santiago, J.F.; Leus, M.

    2007-01-01

    Full text: There is rapid development in PET Imaging and Molecular Nuclear Medicine. In the context of a residency training program, there is a need to incorporate these technologies into the existing Nuclear Medicine Residency Training Curriculum. This will ensure that trainees are constantly updated on the latest innovations in Nuclear Medicine, enabling them to apply this progress in their future practice and thus achieve the goals and objectives of the curriculum. In residency training programs where no PET facilities exist, this may be remedied by re-engineering the curriculum to include mandatory/elective rotations at other hospitals where the facilities are available. In order to ensure the integrity of the training program in this process of development, a proper sequence of this re-engineering process adhering to educational principles is suggested. These steps reflect the adoption of innovations and developments in the field of Nuclear Medicine essential for nuclear medicine resident learning. Curriculum re-engineering is a scientific and logical method reflecting the processes of addressing changes in the curriculum in order to deliver the desired goals and objectives of the program as dictated by time and innovation. The essential steps in this curriculum re-engineering process, which in this case aim to incorporate and/or update PET Imaging and Molecular Nuclear Imaging education and training, include (1) Curriculum Conceptualization and Legitimatisation, (2) Curriculum Diagnosis, (3) Curriculum Engineering, Designing and Organization, (4) Curriculum Implementation, (5) Curriculum Evaluation, (6) Curriculum Maintenance and (7) Curriculum Re-engineering. All of these sequences consider the participation of the different stakeholders of the training program. They help develop a curriculum that promotes student learning according to the dictates of the goals and objectives of the program and of technology development. Once the

  6. Network-based analysis of software change propagation.

    Science.gov (United States)

    Wang, Rongcun; Huang, Rubing; Qu, Binbin

    2014-01-01

    Object-oriented software systems frequently evolve to meet new change requirements. Understanding the characteristics of changes helps testers and system designers improve software quality. Identifying important modules becomes a key issue in the process of evolution. In this context, a novel network-based approach is proposed to comprehensively investigate change distributions and the correlation between centrality measures and the scope of change propagation. First, software dependency networks are constructed at the class level. Then, the number of times classes are co-changed is mined from software repositories. From the dependency relationships and the co-change counts among classes, the scope of change propagation is calculated. Spearman rank correlation is used to analyze the correlation between centrality measures and the scope of change propagation. Three case studies on the Java open source software projects Findbugs, Hibernate, and Spring are conducted to investigate the characteristics of change propagation. Experimental results show that (i) the change distribution is very uneven; (ii) PageRank, Degree, and CIRank are significantly correlated with the scope of change propagation. In particular, CIRank shows a higher correlation coefficient, which suggests it can be a more useful indicator for measuring the scope of change propagation of classes in an object-oriented software system.
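    The analysis described above can be sketched briefly: build a class-level dependency graph, compute centrality measures, and rank-correlate them with a per-class change-propagation scope. The toy graph, the propagation scores, and the use of networkx/scipy are illustrative assumptions; the paper's CIRank measure and mining procedure are not reproduced.

```python
# Sketch: correlate class centrality with change-propagation scope.
import networkx as nx
from scipy.stats import spearmanr

# Toy class-level dependency edges ("A" depends on "B", etc.).
g = nx.DiGraph([("A", "B"), ("A", "C"), ("B", "C"), ("D", "C"), ("C", "E")])

pagerank = nx.pagerank(g)
degree = dict(g.degree())

# Hypothetical change-propagation scope per class, e.g. mined from
# co-change history in the version control repository.
scope = {"A": 4, "B": 3, "C": 7, "D": 1, "E": 5}

classes = sorted(g.nodes())
for name, metric in (("PageRank", pagerank), ("Degree", degree)):
    rho, p = spearmanr([metric[c] for c in classes],
                       [scope[c] for c in classes])
    print(f"{name}: rho={rho:.2f}, p={p:.2f}")
```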

  7. Deriving a research agenda for a financial service industry's methodology for carrying out business process re-engineering

    Directory of Open Access Journals (Sweden)

    Kader, I. A.

    2016-05-01

    Full Text Available Why do projects fail? This is a question that has been researched across various project disciplines, including that of Business Process Re-engineering (BPR). This paper introduces a different angle on why BPR projects fail. An analysis of a case study conducted within a financial institution revealed new factors that could influence BPR project outcomes but that have not been identified in the literature. The Organisation Ring of Influence model was developed to indicate the impact that organisational behaviours and structures had on the outcome of an executed BPR project. This model also helps to highlight which factors were more influential than others.

  8. Software process improvement in CMS-are we different?

    International Nuclear Information System (INIS)

    Wellisch, J.P.

    2001-01-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise, in our context, means to evaluate and apply new technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards, while ensuring reproducibility and quality of results. The CMS process improvement effort is two-pronged. It aims at continuous improvement of the way we do object-oriented software, as well as continuous improvement in the efficiency of the working environment. In particular, the use and creation of de-facto software process standards within CMS has proven to be key to a successful software process improvement program. The authors describe the successful CMS implementation of a software process improvement strategy that has followed ISO 15504 for three years. The authors give the current status of the most important process families formally established in CMS, and provide the guidelines followed for both tool development and methodology establishment

  9. The Legacy of Space Shuttle Flight Software

    Science.gov (United States)

    Hickey, Christopher J.; Loveall, James B.; Orr, James K.; Klausman, Andrew L.

    2011-01-01

    The initial goals of the Space Shuttle Program required that the avionics and software systems blaze new trails in advancing avionics system technology. Many of the requirements placed on avionics and software were accomplished for the first time on this program. Examples include comprehensive digital fly-by-wire technology, use of a digital databus for flight critical functions, fail operational/fail safe requirements, complex automated redundancy management, and the use of a high-order software language for flight software development. In order to meet the operational and safety goals of the program, the Space Shuttle software had to be extremely high quality, reliable, robust, reconfigurable and maintainable. To achieve this, the software development team evolved a software process focused on continuous process improvement and defect elimination that consistently produced highly predictable and top quality results, providing software managers the confidence needed to sign each Certificate of Flight Readiness (COFR). This process, which has been appraised at Capability Maturity Model (CMM)/Capability Maturity Model Integration (CMMI) Level 5, has resulted in one of the lowest software defect rates in the industry. This paper will present an overview of the evolution of the Primary Avionics Software System (PASS) project and processes over thirty years, an argument for strong statistical control of software processes with examples, an overview of the success story for identifying and driving out errors before flight, a case study of the few significant software issues and how they were either identified before flight or slipped through the process onto a flight vehicle, and identification of the valuable lessons learned over the life of the project.

  10. Applied Research Study

    Science.gov (United States)

    Leach, Ronald J.

    1997-01-01

    The purpose of this project was to study the feasibility of reusing major components of a software system that had been used to control the operations of a spacecraft launched in the 1980s. The study was done in the context of a ground data processing system that was to be rehosted from a large mainframe to an inexpensive workstation. The study concluded that a systematic approach using inexpensive tools could aid in the reengineering process by identifying a set of certified reusable components. The study also developed procedures for determining duplicate versions of software, which were created because of inadequate naming conventions. Such procedures reduced reengineering costs by approximately 19.4 percent.
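    One generic way to implement the kind of duplicate-version detection mentioned above is to group source files by a digest of their normalized contents, so files whose names diverged but whose code did not are flagged together. This is a sketch of that general technique, not the study's actual procedure; the directory name and file pattern are hypothetical.

```python
# Sketch: flag duplicate versions of source files by content digest,
# after normalizing whitespace so trivially reformatted copies match.
import hashlib
from collections import defaultdict
from pathlib import Path

def normalized_digest(path):
    text = Path(path).read_text(errors="replace")
    canonical = "\n".join(line.strip() for line in text.splitlines())
    return hashlib.sha256(canonical.encode()).hexdigest()

def find_duplicates(root, pattern="*.f"):
    groups = defaultdict(list)
    for path in Path(root).rglob(pattern):
        groups[normalized_digest(path)].append(path)
    return [paths for paths in groups.values() if len(paths) > 1]

# "legacy_src" is a placeholder directory name.
for group in find_duplicates("legacy_src"):
    print("possible duplicates:", *group)
```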

  11. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    International Nuclear Information System (INIS)

    Smith, P.R.; Sarfaty, R.

    1993-01-01

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan are the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used worldwide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper describes and defines CM elements, and discusses how CM integrates a facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer aided software engineering (CASE) tool. However, the use of a CASE tool provides a methodology for consistency in approach, graphics, and database capability, combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than those stated above. Some examples are supporting a joint application development group (JAD) to prepare a software functional specification document and, if necessary, providing the capability to automatically generate software application code. This paper briefly discusses the characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables

  12. Analysis and design of software ecosystem architectures – Towards the 4S telemedicine ecosystem

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Hansen, Klaus Marius; Kyng, Morten

    2014-01-01

    performed a descriptive, revelatory case study of the Danish telemedicine ecosystem and for ii), we experimentally designed, implemented, and evaluated the architecture of 4S. Results We contribute in three areas. First, we define the software ecosystem architecture concept that captures organization......, and application stove-pipes that inhibit the adoption of telemedical solutions. To what extent can a software ecosystem approach to telemedicine alleviate this? Objective In this article, we define the concept of software ecosystem architecture as the structure(s) of a software ecosystem comprising elements...... experience in creating and evolving the 4S telemedicine ecosystem. Conclusion The concept of software ecosystem architecture can be used analytically and constructively in the analysis and design of software ecosystems, respectively.

  13. Two New Software Behavioral Design Patterns: Obligation Link and History Reminder

    Directory of Open Access Journals (Sweden)

    Stefan Andrei

    2016-06-01

    Full Text Available Finding proper design patterns has always been an important research topic in the software engineering community. One of the main responsibilities of software developers is to determine which design pattern fits best to solve a particular problem. Design patterns support the effort of exploring the use of artificial intelligence in better management of the software development and maintenance process by providing faster, less costly, smarter and on-time decisions (Pena-Mora & Vadhavkar, 1996). There has been a permanent interest in finding new design patterns, especially in the last two decades. Many new design patterns apply in various areas of computer science, such as software security, software parallelism, large-scale software evolution, artificial intelligence, and more. To the best of our knowledge, the "Obligation Link" and "History Reminder" design patterns are new and can be applied in software development in many areas of computer science, including artificial intelligence.

  14. Re-Engineering Biosafety Regulations In India: Towards a Critique of Policy, Law and Prescriptions

    Directory of Open Access Journals (Sweden)

    A. Damodaran

    2005-06-01

    Full Text Available This article surveys the structure and essence of India’s biosafety regulations from an evolutionary perspective. After detailing the processes associated with the country's biosafety law and guidelines, the article looks critically at recent efforts to re-engineer the regulations. It is argued that India’s biosafety regulations should move towards a more inclusive approach that facilitates transparent and informed decision-making based on stakeholder convergence. It is also suggested that the entire spectrum of laws and regulations that have a direct or indirect bearing on biosafety in India needs to be explored so that greater coherence can be secured in the management of biotechnology products that are sensitive to the environment. Drawing on the experience of the Bt cotton case, the article advocates a greater role for civil society and grassroots organizations.

  15. ALICES: an advanced object-oriented software workshop for simulators

    International Nuclear Information System (INIS)

    Sayet, R.L.; Rouault, G.; Pieroux, D.; Houte, U. Van

    1999-01-01

    Reducing simulator development costs while improving model quality, user-friendliness and teaching capabilities has been a major target of the simulation industry for many years. It has led to the development of specific software tools which have been improved progressively, following the new features and capabilities offered by the software industry. Unlike most of these software tools, ALICES (a French acronym for 'Interactive Software Workshop for the Design of Simulators') is not an upgrade of a previous generation of tools, like putting a graphical front-end on a classical code generator, but a genuinely new development. Its design specification is based on previous experience with different tools as well as on new capabilities of software technology, mainly in Object Oriented Design. This allowed us to make a real technological 'jump' in the simulation industry, beyond the constraints of some traditional approaches. The main objectives behind the development of ALICES were the following: (1) Minimizing simulator development time and costs: a simulator development consists mainly of developing software, and one way to reduce costs is to facilitate reuse of existing software by developing standard components and by defining interface standards. (2) Ensuring that the produced simulator can be maintained and updated at minimal cost: a simulator must evolve along with the simulated process, and it is therefore necessary to update the simulator periodically; the cost of adequate maintenance depends strongly on the quality of the software workshop. (3) Covering the whole simulator development process: from the data package to the acceptance tests, and for maintenance and upgrade activities; with the whole development team, even if it is dispatched at different working sites; respecting the Quality Assurance rules and procedures (CORYS T.E.S.S. and TRACTEBEL are ISO-9001 certified). The development of ALICES was also done to comply with the following two main

  16. Network Coded Software Defined Networking

    DEFF Research Database (Denmark)

    Hansen, Jonas; Roetter, Daniel Enrique Lucani; Krigslund, Jeppe

    2015-01-01

    Software defined networking has garnered considerable attention due to its potential to virtualize services in the Internet, introducing flexibility in the buffering, scheduling, processing, and routing of data in network routers. SDN breaks the deadlock that has kept Internet network protocols stagnant...... for decades, while applications and physical links have evolved. This article advocates the use of SDN to bring about 5G network services by incorporating network coding (NC) functionalities. The latter constitutes a major leap forward compared to the state-of-the-art store-and-forward Internet paradigm...

  17. A New Generation of Telecommunications for Mars: The Reconfigurable Software Radio

    Science.gov (United States)

    Adams, J.; Horne, W.

    2000-01-01

    Telecommunications is a critical component for any mission at Mars: it is an enabling function that provides connectivity back to Earth and a means for conducting science. New developments in telecommunications, specifically in software-configurable radios, expand the possible approaches for science missions at Mars. These radios provide a flexible and reconfigurable platform that can evolve with the mission and that provides an integrated approach to communications and science data processing. Deep space telecommunication faces challenges not normally faced by terrestrial and near-Earth communications: radiation, thermal conditions, highly constrained mass, volume, packaging and reliability are all significant issues. Additionally, once the spacecraft leaves Earth, there is no way to go out and upgrade or replace radio components. The reconfigurable software radio is an effort to provide not only a product that is immediately usable in the harsh space environment but also a radio that will stay current as the years pass and technologies evolve.

  18. Practical methods to improve the development of computational software

    International Nuclear Information System (INIS)

    Osborne, A. G.; Harding, D. W.; Deinert, M. R.

    2013-01-01

    The use of computation has become ubiquitous in science and engineering. As the complexity of computer codes has increased, so has the need for robust methods to minimize errors. Past work has shown that the number of functional errors is related to the number of commands that a code executes. Since the late 1960s, major participants in the field of computation have encouraged the development of best practices for programming to help reduce coder-induced error, and this has led to the emergence of 'software engineering' as a field of study. Best practices for coding and software production have now evolved and become common in the development of commercial software. These same techniques, however, are largely absent from the development of computational codes by research groups. Many of the best-practice techniques from the professional software community would be easy for research groups in nuclear science and engineering to adopt. This paper outlines the history of software engineering, as well as issues in modern scientific computation, and recommends practices that should be adopted by individual scientific programmers and university research groups. (authors)
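
    One such practice, automated regression testing, can be illustrated with a minimal sketch (not taken from the paper; the integration routine and the test values are hypothetical):

      # Minimal sketch of an automated regression test guarding a numerical
      # routine; trapezoid() is a hypothetical example function.
      import unittest

      def trapezoid(f, a, b, n=1000):
          """Composite trapezoidal rule for the integral of f over [a, b]."""
          h = (b - a) / n
          total = 0.5 * (f(a) + f(b))
          for i in range(1, n):
              total += f(a + i * h)
          return h * total

      class TestTrapezoid(unittest.TestCase):
          def test_linear_integrand_is_exact(self):
              # The rule is exact for linear integrands: integral of 2x over [0, 1] is 1.
              self.assertAlmostEqual(trapezoid(lambda x: 2 * x, 0.0, 1.0), 1.0, places=12)

      if __name__ == "__main__":
          unittest.main()

    A research group adopting even this single practice gains a safety net against coder-induced errors when the routine is later modified.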

  19. Software architecture for the ORNL large-coil test facility data system

    International Nuclear Information System (INIS)

    Blair, E.T.; Baylor, L.R.

    1986-01-01

    The VAX-based data-acquisition system for the International Fusion Superconducting Magnet Test Facility (IFSMTF) at Oak Ridge National Laboratory (ORNL) is a second-generation system that evolved from a PDP-11/60-based system used during the initial phase of facility testing. The VAX-based software represents a layered implementation that provides integrated access to all of the data sources within the system, decoupling end-user data retrieval from various front-end data sources through a combination of software architecture and instrumentation databases. Independent VAX processes manage the various front-end data sources, each being responsible for controlling, monitoring, acquiring, and disposing of data and control parameters for access from the data retrieval software. This paper describes the software architecture and the functionality incorporated into the various layers of the data system.
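
    The decoupling described above can be sketched in miniature (hypothetical names; this is not the ORNL code, and Python stands in for the independent VAX processes):

      # Each front-end data source is owned by an independent manager; a
      # retrieval layer routes end-user requests through a catalog, so the
      # caller never touches the sources directly.
      class SourceManager:
          """Controls, monitors, and acquires data for one front-end source."""
          def __init__(self, name):
              self.name = name
              self._latest = {}

          def acquire(self, channel, value):
              self._latest[channel] = value  # in reality: poll the hardware

          def read(self, channel):
              return self._latest[channel]

      class DataRetrieval:
          """End-user layer: routes requests via a channel-to-source catalog."""
          def __init__(self, catalog):
              self.catalog = catalog  # maps channel name -> SourceManager

          def fetch(self, channel):
              return self.catalog[channel].read(channel)

      camac = SourceManager("CAMAC")
      camac.acquire("coil_current", 10.2e3)
      retrieval = DataRetrieval({"coil_current": camac})
      print(retrieval.fetch("coil_current"))  # caller is decoupled from the source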

  20. Software architecture for the ORNL large coil test facility data system

    International Nuclear Information System (INIS)

    Blair, E.T.; Baylor, L.R.

    1986-01-01

    The VAX-based data acquisition system for the International Fusion Superconducting Magnet Test Facility (IFSMTF) at Oak Ridge National Laboratory (ORNL) is a second-generation system that evolved from a PDP-11/60-based system used during the initial phase of facility testing. The VAX-based software represents a layered implementation that provides integrated access to all of the data sources within the system, decoupling end-user data retrieval from various front-end data sources through a combination of software architecture and instrumentation databases. Independent VAX processes manage the various front-end data sources, each being responsible for controlling, monitoring, acquiring, and disposing of data and control parameters for access from the data retrieval software. This paper describes the software architecture and the functionality incorporated into the various layers of the data system.

  1. Object-Oriented Modular Architecture for Ground Combat Simulation

    National Research Council Canada - National Science Library

    Luqi; Berzins, V; Shing, M; Saluto, M; Williams, J

    2000-01-01

    .... It describes the effective use of computer-aided prototyping techniques for re-engineering the legacy software to develop an object-oriented modular architecture for the Janus combat simulation system. Janus...

  2. FRAMEWORK FOR CONVERTING DELPHI DESKTOP APPLICATIONS INTO NATIVE ANDROID APPLICATIONS

    Directory of Open Access Journals (Sweden)

    Rodrigo da Silva Riquena

    2014-08-01

    Full Text Available With the growing use of mobile devices by companies and organizations, there is an increasing demand for applications on the mobile platform. For certain companies, business success may depend on a mobile application that brings them closer to customers or improves the performance of internal processes. However, developing software for the mobile platform is an expensive process that takes time and resources. A framework to convert Delphi desktop applications into native Android applications automatically constitutes a useful tool for architects and software developers and can contribute to the implementation phase of an application. Therefore, this work is based on methods and processes for software reengineering, such as PRE/OO (Process of Reengineering Object Oriented), for the automatic conversion of an application developed in the Delphi environment into an application for the Android mobile platform. Finally, an experiment was performed on a real case to corroborate the goals.

  3. Lessons learned applying CASE methods/tools to Ada software development projects

    Science.gov (United States)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  4. Software architecture for the ORNL large coil test facility data system

    International Nuclear Information System (INIS)

    Blair, E.T.; Baylor, L.R.

    1986-01-01

    The VAX-based data acquisition system for the International Fusion Superconducting Magnet Test Facility (IFSMTF) at Oak Ridge National Laboratory (ORNL) is a second-generation system that evolved from a PDP-11/60-based system used during the initial phase of facility testing. The VAX-based software represents a layered implementation that provides integrated access to all of the data sources within the system, decoupling end-user data retrieval from various front-end data sources through a combination of software architecture and instrumentation databases. Independent VAX processes manage the various front-end data sources, each being responsible for controlling, monitoring, acquiring, and disposing of data and control parameters for access from the data retrieval software.

  5. Open Source Next Generation Visualization Software for Interplanetary Missions

    Science.gov (United States)

    Trimble, Jay; Rinker, George

    2016-01-01

    Mission control is evolving quickly, driven by the requirements of new missions and enabled by modern computing capabilities. Distributed operations, access to data anywhere, data visualization for spacecraft analysis that spans multiple data sources, flexible reconfiguration to support multiple missions, and operator use cases are driving the need for new capabilities. NASA's Advanced Multi-Mission Operations System (AMMOS), Ames Research Center (ARC) and the Jet Propulsion Laboratory (JPL) are collaborating to build a new generation of mission operations software for visualization, to enable mission control anywhere: on the desktop, tablet and phone. The software is built on an open source platform that is open for contributions (http://nasa.github.io/openmct).

  6. Software engineering for the EBR-II data acquisition system conversion

    International Nuclear Information System (INIS)

    Schorzman, W.

    1988-01-01

    The original data acquisition system (DAS) for the Experimental Breeder Reactor II (EBR-II) was placed into service with a state-of-the-art computer and peripherals in 1970. Software engineering principles for real-time data acquisition were in their infancy, and the original software design was dictated by limited hardware resources. The functional requirements evolved from creative ways to gather and display data, and this abstract concept developed into an invaluable tool for system analysis, for data reporting, and as a plant monitor for operations. In this paper the approach to the software conversion project is outlined, under the constraints of operational transparency and six weeks for final conversion and testing. The outline is then compared with the formal principles of software engineering to show how the gap between the theoretical and the real world can be bridged, by analyzing the work and listing the lessons learned.

  7. EFFORTS TO IMPROVE THE SUCCESS OF ERP IMPLEMENTATION TO BUILD COMPETITIVE ADVANTAGE IN SMEs IN CENTRAL JAVA

    Directory of Open Access Journals (Sweden)

    Mudiantono .

    2013-01-01

    Full Text Available Enterprise Resource Planning (ERP) is integrated software applied across an organization. This study aimed to determine the factors that influence the successful implementation of ERP in Small and Medium Enterprises (SMEs) in Central Java in order to build competitive advantage. The hypotheses were tested with SEM on 107 respondents. The results of the data analysis showed that the Business Process Reengineering variable had the greatest impact. It is recommended that small and medium entrepreneurs learn about and consolidate efforts to reengineer their businesses before applying ERP.

  8. MSiReader v1.0: Evolving Open-Source Mass Spectrometry Imaging Software for Targeted and Untargeted Analyses

    Science.gov (United States)

    Bokhart, Mark T.; Nazari, Milad; Garrard, Kenneth P.; Muddiman, David C.

    2018-01-01

    A major update to the mass spectrometry imaging (MSI) software MSiReader is presented, offering a multitude of newly added features critical to MSI analyses. MSiReader is a free, open-source, and vendor-neutral software written on the MATLAB platform and is capable of analyzing most common MSI data formats. A standalone version of the software, which does not require a MATLAB license, is also distributed. The newly incorporated data analysis features expand the utility of MSiReader beyond simple visualization of molecular distributions. The MSiQuantification tool allows researchers to calculate absolute concentrations from quantification MSI experiments exclusively through MSiReader software, significantly reducing data analysis time. An image overlay feature allows complementary imaging modalities to be displayed with the MSI data. A polarity filter has also been incorporated into the data loading step, allowing the facile analysis of polarity-switching experiments without the need for data parsing prior to loading the data file into MSiReader. A quality assurance feature to generate a mass measurement accuracy (MMA) heatmap for an analyte of interest has also been added to allow for the investigation of MMA across the imaging experiment. Most importantly, as new features have been added, performance has not degraded; in fact, it has improved dramatically. These new tools and the performance improvements in MSiReader v1.0 enable the MSI community to evaluate their data in greater depth and in less time.

  9. Mapping modern software process engineering techniques onto an HEP development environment

    CERN Document Server

    Wellisch, J P

    2003-01-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement, and continual improvement in the efficiency of the working environment, with the goal to facilitate doing great new physics. To achieve this, we followed a process improvement program based on ISO-15504 and the Rational Unified Process. This experiment in software process improvement in HEP has been progressing now for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of current culture to create de facto software process standards within th...

  10. Robustness to Faults Promotes Evolvability: Insights from Evolving Digital Circuits.

    Science.gov (United States)

    Milano, Nicola; Nolfi, Stefano

    2016-01-01

    We demonstrate how the need to cope with operational faults enables evolving circuits to find fitter solutions. The analysis of the results obtained in different experimental conditions indicates that, in the absence of faults, evolution tends to select circuits that are small and have low phenotypic variability and evolvability. The need to face operational faults, instead, drives evolution toward the selection of larger circuits that are truly robust with respect to genetic variations and that have a greater level of phenotypic variability and evolvability. Overall, our results indicate that the need to cope with operational faults leads to the selection of circuits that have a greater probability of generating better circuits as a result of genetic variation, with respect to a control condition in which circuits are not subjected to faults.

  11. Stakeholder Analysis as a Medium to Aid Change in Information System Reengineering Projects

    Directory of Open Access Journals (Sweden)

    Jean Davison

    2004-04-01

    Full Text Available The importance of involving stakeholders within a change process is well recognised, and successfully managed change is equally important. Information systems development and redesign is a form of change activity involving people and social issues, and therefore resistance to change may occur. A stakeholder identification and analysis (SIA) technique has been developed as an enhancement to PISO® (Process Improvement for Strategic Objectives), a method that engages the users of a system in the problem solving and reengineering of their own work-based problem areas. The SIA technique aids the identification and analysis of system stakeholders and helps view the projected outcome of system changes and their effect on relevant stakeholders, with attention given to change resistance to ensure smooth negotiation and achieve consensus. A case study is presented here describing the successful implementation of a direct appointment booking system for patients within the National Health Service in the UK, utilising the SIA technique, which resulted in a feeling of empowerment and ownership of the change among those involved.

  12. Preface: evolving rotifers, evolving science: Proceedings of the XIV International Rotifer Symposium

    Czech Academy of Sciences Publication Activity Database

    Devetter, Miloslav; Fontaneto, D.; Jersabek, Ch.D.; Welch, D.B.M.; May, L.; Walsh, E.J.

    2017-01-01

    Vol. 796, No. 1 (2017), pp. 1-6. ISSN 0018-8158. Institutional support: RVO:60077344. Keywords: evolving rotifers; 14th International Rotifer Symposium; evolving science. Subject RIV: EG - Zoology. OECD field: Zoology. Impact factor: 2.056 (2016).

  13. An Introduction to Flight Software Development: FSW Today, FSW 2010

    Science.gov (United States)

    Gouvela, John

    2004-01-01

    Experience and knowledge gained from ongoing maintenance of Space Shuttle Flight Software and from new development projects, including the Cockpit Avionics Upgrade, are applied to the projected needs of the National Space Exploration Vision through Spiral 2. Lessons learned from these current activities are applied to create a sustainable, reliable model for the development of critical software to support Project Constellation. This presentation introduces the technologies, methodologies, and infrastructure needed to produce and sustain high-quality software. It proposes what is needed to support a Vision for Space Exploration that places demands on the innovation and productivity needed to support future space exploration. The technologies in use today within FSW development include tools that provide requirements tracking, integrated change management, and modeling and simulation software. Specific challenges that have been met include the introduction and integration of a Commercial Off the Shelf (COTS) real-time operating system for critical functions. Though technology prediction has proved to be imprecise, Project Constellation requirements will need continued integration of new technology with evolving methodologies and changing project infrastructure. Targets for continued technology investment are integrated health monitoring and management, self-healing software, standard payload interfaces, autonomous operation, and improvements in training. Emulation of the target hardware will also allow significant streamlining of development and testing. The methodologies in use today for FSW development are object-oriented UML design and iterative development using independent components, as well as rapid prototyping. In addition, Lean Six Sigma and CMMI play a critical role in the quality and efficiency of the workforce processes. Over the next six years, we expect these methodologies to merge with other improvements into a consolidated office culture with all processes being guided by

  14. Radiobiology software for educational purpose

    International Nuclear Information System (INIS)

    Pandey, A.K.; Sharma, S.K.; Kumar, R.; Bal, C.S.; Nair, O.; Haresh, K.P.; Julka, P.K.

    2014-01-01

    To understand radionuclide therapy and the basis of radiation protection, it is essential to understand radiobiology. With limited time for classroom teaching and limited time and resources for radiobiology experiments, students do not acquire a firm grasp of the theoretical mathematical models or the experimental knowledge of target theory and linear-quadratic models that explain the nature of cell survival curves. We believe that this issue might be addressed with numerical simulation of cell survival curves using mathematical models. Existing classroom teaching can be reoriented to understand the subject using the concepts of modeling, simulation and virtual experiments. After completing the lecture, students can practice with the simulation tool at a convenient time. In this study we have developed software that can help students acquire a firm grasp of theoretical and experimental radiobiology. The software was developed using FreeMat ver 4.0, an open source package. Target theory, the linear-quadratic model, and cell killing based on the Poisson model have been included. The program displays a menu of choices to the user and then flows depending on the choice made; it is executed by typing 'Radiobiology' on the command line interface. Students can investigate the effect of radiation dose on cells interactively. They can practice drawing the cell survival curve based on the input and output data, and they can also compare their hand-drawn graphs with the graphs generated automatically by the program. This software is at an early stage of development and will evolve based on user feedback. We feel this simulation software will be quite useful for students entering the nuclear medicine, radiology and radiotherapy disciplines. (author)
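
    The linear-quadratic model at the heart of such a tool is compact enough to sketch (the parameter values below are illustrative, not taken from the paper):

      # Linear-quadratic (LQ) cell-survival model: S(D) = exp(-(alpha*D + beta*D^2)).
      # alpha and beta are tissue-dependent; the values here are only examples.
      import math

      def surviving_fraction(dose_gy, alpha=0.3, beta=0.03):
          return math.exp(-(alpha * dose_gy + beta * dose_gy ** 2))

      for d in range(0, 9, 2):
          print(f"{d} Gy -> S = {surviving_fraction(d):.3f}")

    Plotting such values on a semi-log scale reproduces the characteristic shouldered survival curve that students are asked to draw by hand.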

  15. Virtual immunology: software for teaching basic immunology.

    Science.gov (United States)

    Berçot, Filipe Faria; Fidalgo-Neto, Antônio Augusto; Lopes, Renato Matos; Faggioni, Thais; Alves, Luiz Anastácio

    2013-01-01

    As immunology continues to evolve, many educational methods have found difficulty in conveying the degree of complexity inherent in its basic principles. Today, the teaching-learning process in such areas has been improved with tools such as educational software. This article introduces "Virtual Immunology," a software program available free of charge in Portuguese and English, which can be used by teachers and students in physiology, immunology, and cellular biology classes. We discuss the development of the initial two modules, "Organs and Lymphoid Tissues" and "Inflammation," and the use of interactive activities to provide microscopic and macroscopic understanding in immunology. Students, both graduate and undergraduate, were questioned along with university-level professors about the quality of the software, intuitiveness of use, ease of navigation, and aesthetic organization, using a Likert scale. An overwhelmingly satisfactory result was obtained with both students and immunology teachers. Programs such as "Virtual Immunology" offer more interactive, multimedia approaches to complex scientific principles that increase student motivation, interest, and comprehension. © 2013 by The International Union of Biochemistry and Molecular Biology.

  16. Zoneminder as ‘Software as a Service’ and Load Balancing of Video Surveillance Requests

    DEFF Research Database (Denmark)

    Deshmukh, Aaradhana A.; Mihovska, Albena D.; Prasad, Ramjee

    2012-01-01

    Cloud computing is evolving as a key computing platform for sharing resources that include infrastructures, software, applications, and business processes. Virtualization is a core technology for enabling cloud resource sharing. Software as a Service (SaaS) on the cloud platform provides software...... application vendors with a Web-based delivery model to serve a large number of clients with a multi-tenancy-based infrastructure and application-sharing architecture, so as to benefit greatly from the economy of scale. The emergence of the Software-as-a-Service (SaaS) business model has attracted great attention...... from both researchers and practitioners. SaaS vendors deliver on-demand information processing services to users, and thus offer computing utility rather than the standalone software itself. This paper proposes a deployment of an open source video surveillance application named Zoneminder
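
    The load-balancing side of such a deployment can be sketched as follows (purely illustrative; the server names are hypothetical and this is not ZoneMinder code):

      # Round-robin dispatch of surveillance requests across application
      # instances: one simple strategy for balancing SaaS tenant load.
      import itertools

      class RoundRobinBalancer:
          def __init__(self, servers):
              self._cycle = itertools.cycle(servers)

          def route(self, request):
              return next(self._cycle), request

      balancer = RoundRobinBalancer(["zm-node-1", "zm-node-2", "zm-node-3"])
      for cam in ["cam7", "cam7", "cam9", "cam2"]:
          print(balancer.route(f"GET /stream/{cam}"))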

  17. Real-time Control Mediation in Agile Distributed Software Development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Aaen, Ivan; Mathiassen, Lars

    2008-01-01

    Agile distributed environments pose particular challenges related to control of quality and collaboration in software development. Moreover, while face-to-face interaction is fundamental in agile development, distributed environments must rely extensively on mediated interactions. On this backdrop...... control was mediated over distance by technology through real-time exchanges. Contrary to previous research, the analysis suggests that both formal and informal elements of real-time mediated control were used; that evolving goals and adjustment of expectations were two of the main issues in real......-time mediated control exchanges; and, that the actors, despite distances in space and culture, developed a clan-like pattern mediated by technology to help control quality and collaboration in software development....

  18. A software perspective of environmental data quality

    International Nuclear Information System (INIS)

    Banerjee, B.

    1995-01-01

    Because of the large amount of complex data in environmental projects, particularly large decontamination and decommissioning projects, the quality of the data has a profound impact on the success and cost of the mission. In every phase of the life cycle of the project, including regulatory intervention and legal proceedings, maintaining the quality of data and presenting data in a timely and meaningful manner are critical. In this paper, a systemic view of data quality management from a software engineering perspective is presented. A method of evaluation evolves from this view. This method complements the principles of the data quality objective. When graded adequately, the method of evaluation establishes a paradigm for ensuring data quality for new and renewed projects. This paper also demonstrates that incorporating good practices of software engineering into the data management process leads to continuous improvement of data quality

  19. Delineating slowly and rapidly evolving fractions of the Drosophila genome.

    Science.gov (United States)

    Keith, Jonathan M; Adams, Peter; Stephen, Stuart; Mattick, John S

    2008-05-01

    Evolutionary conservation is an important indicator of function and a major component of bioinformatic methods to identify non-protein-coding genes. We present a new Bayesian method for segmenting pairwise alignments of eukaryotic genomes while simultaneously classifying segments into slowly and rapidly evolving fractions. We also describe an information criterion similar to the Akaike Information Criterion (AIC) for determining the number of classes. Working with pairwise alignments enables detection of differences in conservation patterns among closely related species. We analyzed three whole-genome and three partial-genome pairwise alignments among eight Drosophila species. Three distinct classes of conservation level were detected. Sequences comprising the most slowly evolving component were consistent across a range of species pairs, and constituted approximately 62-66% of the D. melanogaster genome. Almost all (>90%) of the aligned protein-coding sequence is in this fraction, suggesting much of it (comprising the majority of the Drosophila genome, including approximately 56% of non-protein-coding sequences) is functional. The size and content of the most rapidly evolving component was species dependent, and varied from 1.6% to 4.8%. This fraction is also enriched for protein-coding sequence (while containing significant amounts of non-protein-coding sequence), suggesting it is under positive selection. We also classified segments according to conservation and GC content simultaneously. This analysis identified numerous sub-classes of those identified on the basis of conservation alone, but was nevertheless consistent with that classification. Software, data, and results are available at www.maths.qut.edu.au/~keithj/. Genomic segments comprising the conservation classes are available in BED format.
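
    The model-selection step can be illustrated with the standard AIC formula (the paper's criterion is only described as similar to AIC, and the log-likelihood values below are invented for the example):

      # AIC = 2k - 2*ln(L); the class count with the lowest score is preferred.
      def aic(log_likelihood, n_params):
          return 2 * n_params - 2 * log_likelihood

      # Hypothetical fits with 1..4 conservation classes:
      fits = {1: (-5210.0, 2), 2: (-4975.0, 5), 3: (-4890.0, 9), 4: (-4886.0, 14)}
      scores = {k: aic(ll, p) for k, (ll, p) in fits.items()}
      best = min(scores, key=scores.get)
      print(scores, "-> choose", best, "classes")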

  20. Evolving a lingua franca and associated software infrastructure for computational systems biology: the Systems Biology Markup Language (SBML) project.

    Science.gov (United States)

    Hucka, M; Finney, A; Bornstein, B J; Keating, S M; Shapiro, B E; Matthews, J; Kovitz, B L; Schilstra, M J; Funahashi, A; Doyle, J C; Kitano, H

    2004-06-01

    Biologists are increasingly recognising that computational modelling is crucial for making sense of the vast quantities of complex experimental data that are now being collected. The systems biology field needs agreed-upon information standards if models are to be shared, evaluated and developed cooperatively. Over the last four years, our team has been developing the Systems Biology Markup Language (SBML) in collaboration with an international community of modellers and software developers. SBML has become a de facto standard format for representing formal, quantitative and qualitative models at the level of biochemical reactions and regulatory networks. In this article, we summarise the current and upcoming versions of SBML and our efforts at developing software infrastructure for supporting and broadening its use. We also provide a brief overview of the many SBML-compatible software tools available today.
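
    A minimal model can be assembled with the python-libsbml bindings, roughly as follows (a sketch assuming the libsbml package is installed; the identifiers are invented):

      # Build a skeletal SBML Level 3 model and serialise it to XML.
      import libsbml

      doc = libsbml.SBMLDocument(3, 1)       # SBML Level 3, Version 1
      model = doc.createModel()
      model.setId("example_model")

      comp = model.createCompartment()
      comp.setId("cell")
      comp.setSize(1.0)
      comp.setConstant(True)

      sp = model.createSpecies()
      sp.setId("S1")
      sp.setCompartment("cell")
      sp.setInitialAmount(10.0)
      sp.setConstant(False)
      sp.setBoundaryCondition(False)
      sp.setHasOnlySubstanceUnits(False)

      print(libsbml.writeSBMLToString(doc))  # exchangeable, tool-neutral XML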

  1. A software product line approach to enhance a meta-scheduler middleware

    International Nuclear Information System (INIS)

    Scheidt, Rafael F; Schmidt, Katreen; Pessoa, Gabriel M; Viera, Matheus A; Dantas, Mario

    2012-01-01

    Software projects in general tend toward greater software reuse and componentization in order to reduce the time, cost and resources required for new products. The need for techniques and tools to organize projects of higher quality in less time is one of the greatest challenges of Software Engineering. The Software Product Line approach is proposed to organize and systematically assist the development of series of new products in the same domain. In this context, this paper proposes to apply the Software Product Line approach in Distributed Computing Environments. In projects that involve distributed environments, each version of the same product can repeatedly generate the same artifacts as the product's characteristics evolve; there is, however, a principal architecture with variations of components. The goal of the proposed approach is to analyze the current process and propose a new one in which new projects reuse the whole architecture, components and documents, starting from a solid base and creating new products focused on new functionalities. We expect the application of this approach to support the development of projects in Distributed Computing Environments.

  2. Mapping modern software process engineering techniques onto an HEP development environment

    International Nuclear Information System (INIS)

    Wellisch, J.P.

    2003-01-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement, and continual improvement in the efficiency of the working environment, with the goal to facilitate doing great new physics. To achieve this, we followed a process improvement program based on ISO-15504 and the Rational Unified Process. This experiment in software process improvement in HEP has been progressing now for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of current culture to create de facto software process standards within the CMS off-line community as the only viable route to a successful software process improvement program in HEP. We will present the CMS approach to software process improvement in this process R and D, describe lessons learned, and mistakes made. We will demonstrate the benefits gained, and the current status of the software processes established in CMS off-line software.

  3. Mapping modern software process engineering techniques onto an HEP development environment

    Science.gov (United States)

    Wellisch, J. P.

    2003-04-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement, and continual improvement in the efficiency of the working environment, with the goal to facilitate doing great new physics. To achieve this, we followed a process improvement program based on ISO-15504 and the Rational Unified Process. This experiment in software process improvement in HEP has been progressing now for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of current culture to create de facto software process standards within the CMS off-line community as the only viable route to a successful software process improvement program in HEP. We will present the CMS approach to software process improvement in this process R&D, describe lessons learned, and mistakes made. We will demonstrate the benefits gained, and the current status of the software processes established in CMS off-line software.

  4. Co-verification of hardware and software for ARM SoC design

    CERN Document Server

    Andrews, Jason

    2004-01-01

    Hardware/software co-verification is how to make sure that embedded system software works correctly with the hardware, and that the hardware has been properly designed to run the software successfully, before large sums are spent on prototypes or manufacturing. This is the first book to apply this verification technique to the rapidly growing field of embedded systems-on-a-chip (SoC). As traditional embedded system design evolves into single-chip design, embedded engineers must be armed with the necessary information to make educated decisions about which tools and methodology to deploy. SoC verification requires a mix of expertise from the disciplines of microprocessor and computer architecture, logic design and simulation, and C and assembly language embedded software. Until now, the relevant information on how it all fits together has not been available. Andrews, a recognized expert, provides in-depth information about how co-verification really works, how to be successful using it, and pitfalls to avoid. H...

  5. The NOvA software testing framework

    International Nuclear Information System (INIS)

    Tamsett, M; Group, C

    2015-01-01

    The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study νe appearance in a νμ beam. NOvA has already produced more than one million Monte Carlo and detector-generated files amounting to more than 1 PB in size. This data is divided among a number of parallel streams, such as far- and near-detector beam spills, cosmic ray backgrounds, a number of data-driven triggers, and over 20 different Monte Carlo configurations. Each of these data streams must be processed through the appropriate steps of the rapidly evolving, multi-tiered, interdependent NOvA software framework. In total there are more than 12 individual software tiers, each of which performs a different function and can be configured differently depending on the input stream. In order to regularly test and validate that all of these software stages are working correctly, NOvA has designed a powerful, modular testing framework that enables detailed validation and benchmarking to be performed in a fast, efficient and accessible way with minimal expert knowledge. The core of this system is a novel series of Python modules which wrap, monitor and handle the underlying C++ software framework and then report the results to a slick front-end web-based interface. This interface utilises modern, cross-platform visualisation libraries to render the test results in a meaningful way. They are fast and flexible, allowing for the easy addition of new tests and datasets. In total, upwards of 14 individual streams are regularly tested, amounting to over 70 individual software processes and producing over 25 GB of output files. The rigour enforced through this flexible testing framework enables NOvA to rapidly verify configurations, results and software, and thus ensure that data is available for physics analysis in a timely and robust manner. (paper)
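
    The wrap-and-report pattern described above can be sketched generically (not the NOvA code; the tier names and commands are placeholders, and a Unix-like system is assumed):

      # Run each software tier as a subprocess, collect pass/fail and timing;
      # a web front end would render these records instead of printing them.
      import subprocess
      import time

      def run_tier(name, command):
          start = time.time()
          proc = subprocess.run(command, capture_output=True, text=True)
          return {"tier": name,
                  "ok": proc.returncode == 0,
                  "seconds": round(time.time() - start, 2),
                  "log_bytes": len(proc.stdout) + len(proc.stderr)}

      # Placeholder commands; in practice each would launch a configured C++ job.
      results = [run_tier("calibration", ["true"]),
                 run_tier("reconstruction", ["false"])]
      for record in results:
          print(record)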

  6. The Astringency of the GP Algorithm for Forecasting Software Failure Data Series

    Directory of Open Access Journals (Sweden)

    Yong-qiang Zhang

    2007-05-01

    Full Text Available The forecasting of software failure data series by Genetic Programming (GP) can be realized without any prior modeling assumptions. This has transformed traditional statistical modeling methods and improved the consistency of model applicability. The differing characteristics of individuals during the evolution of generations, which change randomly, are treated as Markov random processes. This paper also proposes that a GP algorithm with an 'optimal individuals reserved' strategy is the best solution to this problem, so that well-adapted individuals are finally evolved. This allows practical applications in software reliability modeling, analysis, and forecasting of failure behaviors. Moreover, it verifies the feasibility and validity of the GP algorithm, which is applied to software failure data series forecasting, on a theoretical basis. The results show that the GP algorithm is the best solution for software failure behaviors in a variety of disciplines.
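
    The 'optimal individuals reserved' idea is plain elitism, which a toy evolutionary loop makes concrete (an illustrative sketch, not the paper's implementation; the failure series and the linear model are invented):

      # Elitist evolutionary search: the best individuals survive unchanged
      # into the next generation, so the best fitness never regresses.
      import random

      series = [5, 8, 12, 15, 17, 18]  # made-up cumulative failure counts

      def fitness(ind):
          a, b = ind                   # candidate model: f(t) = a*t + b
          return -sum((a * t + b - y) ** 2 for t, y in enumerate(series))

      def mutate(ind, sigma=0.3):
          return tuple(g + random.gauss(0, sigma) for g in ind)

      pop = [(random.uniform(0, 5), random.uniform(0, 5)) for _ in range(30)]
      for _ in range(200):
          pop.sort(key=fitness, reverse=True)
          elite = pop[:2]              # reserve the optimal individuals
          pop = elite + [mutate(random.choice(pop[:10])) for _ in range(28)]
      best = max(pop, key=fitness)
      print("best a, b:", [round(g, 2) for g in best])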

  7. An evaluation of software tools for the design and development of cockpit displays

    Science.gov (United States)

    Ellis, Thomas D., Jr.

    1993-01-01

    The use of all-glass cockpits at the NASA Langley Research Center (LaRC) simulation facility has changed the means of designing, developing, and maintaining instrument displays. The human-machine interface has evolved from a physical hardware device to a software-generated electronic display system. This has subsequently increased the workload at the facility. As computer processing power increases and the glass cockpit becomes predominant in facilities, software tools used in the design and development of cockpit displays are becoming both feasible and necessary for a more productive simulation environment. This paper defines LaRC requirements for a display software development tool and compares two available applications against these requirements. As part of the software engineering process, these tools reduce development time, provide a common platform for display development, and produce exceptional real-time results.

  8. Natural selection promotes antigenic evolvability.

    Science.gov (United States)

    Graves, Christopher J; Ros, Vera I D; Stevenson, Brian; Sniegowski, Paul D; Brisson, Dustin

    2013-01-01

    The hypothesis that evolvability - the capacity to evolve by natural selection - is itself the object of natural selection is highly intriguing but remains controversial due in large part to a paucity of direct experimental evidence. The antigenic variation mechanisms of microbial pathogens provide an experimentally tractable system to test whether natural selection has favored mechanisms that increase evolvability. Many antigenic variation systems consist of paralogous unexpressed 'cassettes' that recombine into an expression site to rapidly alter the expressed protein. Importantly, the magnitude of antigenic change is a function of the genetic diversity among the unexpressed cassettes. Thus, evidence that selection favors among-cassette diversity is direct evidence that natural selection promotes antigenic evolvability. We used the Lyme disease bacterium, Borrelia burgdorferi, as a model to test the prediction that natural selection favors amino acid diversity among unexpressed vls cassettes and thereby promotes evolvability in a primary surface antigen, VlsE. The hypothesis that diversity among vls cassettes is favored by natural selection was supported in each B. burgdorferi strain analyzed using both classical (dN/dS ratios) and Bayesian population genetic analyses of genetic sequence data. This hypothesis was also supported by the conservation of highly mutable tandem-repeat structures across B. burgdorferi strains despite a near complete absence of sequence conservation. Diversification among vls cassettes due to natural selection and mutable repeat structures promotes long-term antigenic evolvability of VlsE. These findings provide a direct demonstration that molecular mechanisms that enhance evolvability of surface antigens are an evolutionary adaptation. The molecular evolutionary processes identified here can serve as a model for the evolution of antigenic evolvability in many pathogens which utilize similar strategies to establish chronic infections.

  9. Natural selection promotes antigenic evolvability.

    Directory of Open Access Journals (Sweden)

    Christopher J Graves

    Full Text Available The hypothesis that evolvability - the capacity to evolve by natural selection - is itself the object of natural selection is highly intriguing but remains controversial due in large part to a paucity of direct experimental evidence. The antigenic variation mechanisms of microbial pathogens provide an experimentally tractable system to test whether natural selection has favored mechanisms that increase evolvability. Many antigenic variation systems consist of paralogous unexpressed 'cassettes' that recombine into an expression site to rapidly alter the expressed protein. Importantly, the magnitude of antigenic change is a function of the genetic diversity among the unexpressed cassettes. Thus, evidence that selection favors among-cassette diversity is direct evidence that natural selection promotes antigenic evolvability. We used the Lyme disease bacterium, Borrelia burgdorferi, as a model to test the prediction that natural selection favors amino acid diversity among unexpressed vls cassettes and thereby promotes evolvability in a primary surface antigen, VlsE. The hypothesis that diversity among vls cassettes is favored by natural selection was supported in each B. burgdorferi strain analyzed using both classical (dN/dS ratios) and Bayesian population genetic analyses of genetic sequence data. This hypothesis was also supported by the conservation of highly mutable tandem-repeat structures across B. burgdorferi strains despite a near complete absence of sequence conservation. Diversification among vls cassettes due to natural selection and mutable repeat structures promotes long-term antigenic evolvability of VlsE. These findings provide a direct demonstration that molecular mechanisms that enhance evolvability of surface antigens are an evolutionary adaptation. The molecular evolutionary processes identified here can serve as a model for the evolution of antigenic evolvability in many pathogens which utilize similar strategies to establish

  10. Challenges of reengineering into multi-tenant SaaS applications

    NARCIS (Netherlands)

    Bezemer, C.; Zaidman, A.

    2010-01-01

    Multi-tenancy is a relatively new software architecture principle in the realm of the Software as a Service (SaaS) business model. It allows full use to be made of the economy of scale, as multiple customers ("tenants") share the same application and database instance. All the while, the tenants enjoy a

  11. The Development of a Virtual Company to Support the Reengineering of the NASA/Goddard Hubble Space Telescope Control Center System

    Science.gov (United States)

    Lehtonen, Ken

    1999-01-01

    This is a report to the Third Annual International Virtual Company Conference on The Development of a Virtual Company to Support the Reengineering of the NASA/Goddard Hubble Space Telescope (HST) Control Center System. It begins with an HST Science "Commercial": Brief Tour of Our Universe, showing various pictures taken by the Hubble Space Telescope. The presentation then reviews the project background and goals. The evolution of the Control Center System ("CCS Inc.") is then reviewed. Topics of interest to "virtual companies" are reviewed: (1) "How To Choose A Team" (2) "Organizational Model" (3) "The Human Component" (4) "'Virtual Trust' Among Teaming Companies" (5) "Unique Challenges to Working Horizontally" (6) "The Cultural Impact" (7) "Lessons Learned".

  12. Enterprise 2.0. : accountability and the necessity for Digital Archiving

    NARCIS (Netherlands)

    van Bussel, Geert-Jan

    2012-01-01

    In the last decade, organizations have re-engineered their business processes and started using standard software solutions. Integration of structured data in relational databases has improved documentation of business transactions and increased data quality. But almost 90% of the information cannot

  13. Evaluating the Governance Model of Hardware-Dependent Software Ecosystems – A Case Study of the Axis Ecosystem

    OpenAIRE

    Wnuk, Krzysztof; Manikas, Konstantinos; Runeson, Per; Matilda, Lantz; Oskar, Weijden; Munir, Hussan

    2014-01-01

    Ecosystem governance becomes gradually more relevant for a set of companies or actors characterized by symbiotic relations evolved on top of a technological platform, i.e. a software ecosystem. In this study, we focus on the governance of a hardware-dependent software ecosystem. More specifically, we evaluate the governance model applied by Axis, a network video and surveillance camera producer, which is the platform owner and orchestrator of the Application Development Partner (ADP) softw...

  14. Feasibility of video codec algorithms for software-only playback

    Science.gov (United States)

    Rodriguez, Arturo A.; Morse, Ken

    1994-05-01

    Software-only video codecs can provide good playback performance in desktop computers with a 486 or 68040 CPU running at 33 MHz without special hardware assistance. Typically, playback of compressed video can be categorized into three tasks: the actual decoding of the video stream, color conversion, and the transfer of decoded video data from system RAM to video RAM. By current standards, good playback performance is the decoding and display of video streams of 320 by 240 (or larger) compressed frames at 15 (or greater) frames per second. Software-only video codecs have evolved by modifying and tailoring existing compression methodologies to suit video playback in desktop computers. In this paper we examine the characteristics used to evaluate software-only video codec algorithms, namely: image fidelity (i.e., image quality), bandwidth (i.e., compression), ease of decoding (i.e., playback performance), memory consumption, compression-to-decompression asymmetry, scalability, and delay. We discuss the tradeoffs among these variables and the compromises that can be made to achieve low numerical complexity for software-only playback. Frame-differencing approaches are described, since software-only video codecs typically employ them to enhance playback performance. To complement other papers that appear in this session of the Proceedings, we review methods derived from binary pattern image coding, since these methods are amenable to software-only playback. In particular, we introduce a novel approach called pixel distribution image coding.
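
    A toy frame-differencing step in that spirit (illustrative only; real codecs combine this with entropy coding and motion handling):

      # Encode only the pixels that changed beyond a threshold, then
      # reconstruct the current frame from the previous one plus the delta.
      import numpy as np

      def encode_delta(prev, curr, threshold=4):
          diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
          idx = np.flatnonzero(diff > threshold)
          return idx, curr.reshape(-1)[idx]

      def decode_delta(prev, idx, values):
          out = prev.copy().reshape(-1)
          out[idx] = values
          return out.reshape(prev.shape)

      prev = np.zeros((240, 320), dtype=np.uint8)
      curr = prev.copy()
      curr[100:110, 50:60] = 200  # a small moving block
      idx, vals = encode_delta(prev, curr)
      assert np.array_equal(decode_delta(prev, idx, vals), curr)
      print(f"sent {idx.size} of {curr.size} pixels")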

  15. Business process re-engineering in the logistics industry: a study of implementation, success factors, and performance

    Science.gov (United States)

    Shen, Chien-wen; Chou, Ching-Chih

    2010-02-01

    As business process re-engineering (BPR) is an important foundation to ensure the success of enterprise systems, this study would like to investigate the relationships among BPR implementation, BPR success factors, and business performance for logistics companies. Our empirical findings show that BPR companies outperformed non-BPR companies, not only on information processing, technology applications, organisational structure, and co-ordination, but also on all of the major logistics operations. Comparing the different perceptions of the success factors for BPR, non-BPR companies place greater emphasis on the importance of employee involvement while BPR companies are more concerned about the influence of risk management. Our findings also suggest that management attitude towards BPR success factors could affect performance with regard to technology applications and logistics operations. Logistics companies which have not yet implemented the BPR approach could refer to our findings to evaluate the advantages of such an undertaking and to take care of those BPR success factors affecting performance before conducting BPR projects.

  16. Software Configuration Management For Multiple Releases: Influence On Development Effort

    Directory of Open Access Journals (Sweden)

    Sławomir P. Maludziński

    2007-01-01

    Full Text Available Software Configuration Management (SCM) evolves together with the discipline of software engineering. Teams working on software products are becoming larger and geographically distributed across multiple sites. Collaboration between such groups requires well-evaluated SCM plans and strategies to ease cooperation and to decrease software development cost by reducing the time spent on SCM activities such as branching and merging, that is, the effort spent on creating revisions ('serial' versions) and variants ('parallel' versions). This paper suggests that SCM practices should be combined with modular design and code refactoring to reduce the cost of maintaining the same code line. Teams which produce several variants of the same code line at the same time should prefer approaches like components, modularization, or plug-ins over code alternations maintained on version branches. The findings described in this paper were adopted by teams in charge of the development of radio communication systems in Motorola GEMS divisions. Each team collaborating on similar projects used different SCM strategies to develop parts of this system.
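
    The recommended alternative, variants as plug-ins on a single code line rather than parallel branches, can be sketched as follows (the names are illustrative, not Motorola code):

      # A plug-in registry: each product variant registers itself against
      # one shared code line, so no per-variant branch must be maintained.
      PLUGINS = {}

      def variant(name):
          def register(cls):
              PLUGINS[name] = cls
              return cls
          return register

      @variant("base_radio")
      class BaseRadio:
          def transmit(self, msg):
              return f"TX: {msg}"

      @variant("encrypted_radio")
      class EncryptedRadio(BaseRadio):
          def transmit(self, msg):
              return super().transmit(msg[::-1])  # stand-in for real encryption

      radio = PLUGINS["encrypted_radio"]()  # variant selected at build/run time
      print(radio.transmit("status ok"))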

  17. Linux software for large topology optimization problems

    DEFF Research Database (Denmark)

    evolving product which, while allowing a parallel solution of the PDE, lacks the important feature that the matrix-generation part of the computations is localized to each processor. This is well known to be critical for obtaining a useful speedup on a Linux cluster, and it motivates the search for a COMSOL......-like package for large topology optimization problems. One candidate for such software, developed for Linux by Sandia Nat'l Lab in the USA, is the Sundance system. Sundance also uses a symbolic representation of the PDE, and a scalable numerical solution is achieved by employing the underlying Trilinos

  18. Moderation instead of modelling: some arguments against formal engineering methods

    NARCIS (Netherlands)

    Rauterberg, G.W.M.; Sikorski, M.; Rauterberg, G.W.M.

    1998-01-01

    The more formal the engineering techniques used, the fewer non-technical facts can be captured. Several business process reengineering and software development projects fail because project management concentrates too much on formal methods and modelling approaches. A successful change of

  19. OPEN SOURCE SOFTWARE, FREE SOFTWARE?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available The enactment of the Intellectual Property Rights Act (HAKI) has given rise to a new alternative: the use of open source software. The use of open source software is spreading in line with current global issues in Information and Communication Technology (ICT). Several organizations and companies have begun to take open source software into consideration. There are many conceptions of open source software, ranging from software that is free of charge to software that is unlicensed. These notions are not entirely correct; it is therefore necessary to introduce the concept of open source software, covering its history, licences and how to choose a licence, as well as considerations in selecting from the open source software that is available. Keywords: licensing, open source, HAKI

  20. Evolvability Is an Evolved Ability: The Coding Concept as the Arch-Unit of Natural Selection.

    Science.gov (United States)

    Janković, Srdja; Ćirković, Milan M

    2016-03-01

    Physical processes that characterize living matter are qualitatively distinct in that they involve encoding and transfer of specific types of information. Such information plays an active part in the control of events that are ultimately linked to the capacity of the system to persist and multiply. This algorithmicity of life is a key prerequisite for its Darwinian evolution, driven by natural selection acting upon stochastically arising variations of the encoded information. The concept of evolvability attempts to define the total capacity of a system to evolve new encoded traits under appropriate conditions, i.e., the accessible section of total morphological space. Since this is dependent on previously evolved regulatory networks that govern information flow in the system, evolvability itself may be regarded as an evolved ability. The way information is physically written, read and modified in living cells (the "coding concept") has not changed substantially during the whole history of the Earth's biosphere. This biosphere, be it alone or one of many, is, accordingly, itself a product of natural selection, since the overall evolvability conferred by its coding concept (nucleic acids as information carriers with the "rulebook of meanings" provided by codons, as well as all the subsystems that regulate various conditional information-reading modes) certainly played a key role in enabling this biosphere to survive up to the present, through alterations of planetary conditions, including at least five catastrophic events linked to major mass extinctions. We submit that, whatever the actual prebiotic physical and chemical processes may have been on our home planet, or may, in principle, occur at some time and place in the Universe, a particular coding concept, with its respective potential to give rise to a biosphere, or class of biospheres, of a certain evolvability, may itself be regarded as a unit (indeed the arch-unit) of natural selection.

  1. Natural selection promotes antigenic evolvability

    NARCIS (Netherlands)

    Graves, C.J.; Ros, V.I.D.; Stevenson, B.; Sniegowski, P.D.; Brisson, D.

    2013-01-01

    The hypothesis that evolvability - the capacity to evolve by natural selection - is itself the object of natural selection is highly intriguing but remains controversial due in large part to a paucity of direct experimental evidence. The antigenic variation mechanisms of microbial pathogens provide

  2. Managing Scientific Software Complexity with Bocca and CCA

    Directory of Open Access Journals (Sweden)

    Benjamin A. Allan

    2008-01-01

    In high-performance scientific software development, the emphasis is often on short time to first solution. Even when the development of new components mostly reuses existing components or libraries and only small amounts of new code must be created, dealing with the component glue code and software build processes to obtain complete applications is still tedious and error-prone. Component-based software meant to reduce complexity at the application level increases complexity to the extent that the user must learn and remember the interfaces and conventions of the component model itself. To address these needs, we introduce Bocca, the first tool to enable application developers to perform rapid component prototyping while maintaining robust software-engineering practices suitable to HPC environments. Bocca provides project management and a comprehensive build environment for creating and managing applications composed of Common Component Architecture components. Of critical importance for high-performance computing (HPC) applications, Bocca is designed to operate in a language-agnostic way, simultaneously handling components written in any of the languages commonly used in scientific applications: C, C++, Fortran, Python and Java. Bocca automates the tasks related to the component glue code, freeing the user to focus on the scientific aspects of the application. Bocca embraces the philosophy pioneered by Ruby on Rails for web applications: start with something that works, and evolve it to the user's purpose.

  3. Disgust: Evolved function and structure

    NARCIS (Netherlands)

    Tybur, J.M.; Lieberman, D.; Kurzban, R.; DeScioli, P.

    2013-01-01

    Interest in and research on disgust has surged over the past few decades. The field, however, still lacks a coherent theoretical framework for understanding the evolved function or functions of disgust. Here we present such a framework, emphasizing 2 levels of analysis: that of evolved function and

  4. SAPHIRE 8 Software Quality Assurance Plan

    Energy Technology Data Exchange (ETDEWEB)

    Curtis Smith

    2010-02-01

    This Quality Assurance (QA) Plan documents the QA activities that will be managed by the INL related to JCN N6423. The NRC developed the SAPHIRE computer code for performing probabilistic risk assessments (PRAs) using a personal computer (PC) at the Idaho National Laboratory (INL) under Job Code Number (JCN) L1429. SAPHIRE started out as a feasibility study for a PRA code to be run on a desktop PC and evolved through several phases into a state-of-the-art PRA code. The developmental activity of SAPHIRE was the result of two concurrent important events: the tremendous expansion of PC software and hardware capability of the 90s and the onset of a risk-informed regulation era.

  5. Re-engineering quality related processes and activities

    International Nuclear Information System (INIS)

    Preisser, T.E.

    1995-01-01

    Given both desire and opportunity, improvements to program quality hinge upon a thorough understanding of what processes are currently performed, which are necessary to support the product or service, and what ideal processes should look like. Thorough understanding derives from process analysis, process mapping, and the use of other quality tools. Despite the level of knowledge any process team claims, there is likely to be at least one area that was hidden before the process was deeply analyzed. Finding that hidden element may mean the difference between evolving an improvement versus a breakthrough

  6. Improving Software Sustainability: Lessons Learned from Profiles in Science.

    Science.gov (United States)

    Gallagher, Marie E

    2013-01-01

    The Profiles in Science® digital library features digitized surrogates of historical items selected from the archival collections of the U.S. National Library of Medicine as well as collaborating institutions. In addition, it contains a database of descriptive, technical and administrative metadata. It also contains various software components that allow creation of the metadata, management of the digital items, and access to the items and metadata through the Profiles in Science Web site [1]. The choices made building the digital library were designed to maximize the sustainability and long-term survival of all of the components of the digital library [2]. For example, selecting standard and open digital file formats rather than proprietary formats increases the sustainability of the digital files [3]. Correspondingly, using non-proprietary software may improve the sustainability of the software--either through in-house expertise or through the open source community. Limiting our digital library software exclusively to open source software or to software developed in-house has not been feasible. For example, we have used proprietary operating systems, scanning software, a search engine, and office productivity software. We did this when either lack of essential capabilities or the cost-benefit trade-off favored using proprietary software. We also did so knowing that in the future we would need to replace or upgrade some of our proprietary software, analogous to migrating from an obsolete digital file format to a new format as the technological landscape changes. Since our digital library's start in 1998, all of its software has been upgraded or replaced, but the digitized items have not yet required migration to other formats. Technological changes that compelled us to replace proprietary software included the cost of product licensing, product support, incompatibility with other software, prohibited use due to evolving security policies, and product abandonment

  7. Frequency Estimator Performance for a Software-Based Beacon Receiver

    Science.gov (United States)

    Zemba, Michael J.; Morse, Jacquelynne Rose; Nessel, James A.; Miranda, Felix

    2014-01-01

    As propagation terminals have evolved, their design has trended more toward a software-based approach that facilitates convenient adjustment and customization of the receiver algorithms. One potential improvement is the implementation of a frequency estimation algorithm, through which the primary frequency component of the received signal can be estimated with a much greater resolution than with a simple peak search of the FFT spectrum. To select an estimator for usage in a QV-band beacon receiver, analysis of six frequency estimators was conducted to characterize their effectiveness as they relate to beacon receiver design.
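
    A common way to obtain the sub-bin resolution described above is parabolic interpolation around the FFT peak; the sketch below is a generic textbook estimator in Python, shown only to illustrate the idea, not the receiver's actual algorithm.

    ```python
    import numpy as np

    def estimate_frequency(x, fs):
        """Estimate the dominant frequency of x (sampled at fs Hz) by refining
        an FFT peak search with parabolic interpolation of the log-magnitude,
        giving resolution well below the fs/N bin spacing."""
        n = len(x)
        spectrum = np.abs(np.fft.rfft(x * np.hanning(n)))
        k = int(np.argmax(spectrum))
        if 0 < k < len(spectrum) - 1:
            a, b, c = np.log(spectrum[k - 1 : k + 2] + 1e-12)
            delta = 0.5 * (a - c) / (a - 2 * b + c)   # peak offset in bins
        else:
            delta = 0.0
        return (k + delta) * fs / n

    fs = 1000.0
    t = np.arange(2048) / fs
    print(estimate_frequency(np.sin(2 * np.pi * 123.4 * t), fs))  # close to 123.4
    ```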

  8. Maintenance Research in SOA: Towards a Standard Case Study

    NARCIS (Netherlands)

    Espinha, T.; Chen, C.; Zaidman, A.E.; Gross, H.G.

    2012-01-01

    Preprint of paper published in: 16th European Conference on Software Maintenance and Reengineering (CSMR), 27-30 March 2012; doi:10.1109/CSMR.2012.49. Maintenance research in the context of Service Oriented Architecture (SOA) is currently lacking a suitable standard case study that can be used by

  9. A neighbourhood evolving network model

    International Nuclear Information System (INIS)

    Cao, Y.J.; Wang, G.Z.; Jiang, Q.Y.; Han, Z.X.

    2006-01-01

    Many social, technological, biological and economical systems are best described by evolving network models. In this short Letter, we propose and study a new evolving network model. The model is based on the new concept of neighbourhood connectivity, which exists in many physical complex networks. The statistical properties and dynamics of the proposed model are analytically studied and compared with those of the Barabasi-Albert scale-free model. Numerical simulations indicate that this network model yields a transition between power-law and exponential scaling, while the Barabasi-Albert scale-free model is only one of its special (limiting) cases. In particular, this model can be used to describe the evolving mechanisms of complex networks in the real world, such as the development of some social networks.
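
    The Letter's exact growth rules are not reproduced here, but a toy model conveys the flavour of neighbourhood-connectivity attachment: each new node links to a random anchor and then, with probability p, to each of the anchor's neighbours. The function below (using the networkx package; all parameters invented) is purely illustrative.

    ```python
    import random
    import networkx as nx

    def grow_neighbourhood_network(n, p=0.5, seed=None):
        """Illustrative growth model, not the Letter's exact rules: attach each
        new node to a uniformly chosen anchor, then with probability p to each
        of the anchor's neighbours. Varying p changes the shape of the degree
        distribution, echoing the power-law/exponential transition above."""
        rng = random.Random(seed)
        g = nx.complete_graph(3)
        for new in range(3, n):
            anchor = rng.randrange(new)
            g.add_edge(new, anchor)
            for nb in list(g.neighbors(anchor)):
                if nb != new and rng.random() < p:
                    g.add_edge(new, nb)
        return g

    g = grow_neighbourhood_network(1000, p=0.7, seed=1)
    print(max(d for _, d in g.degree()))
    ```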

  10. Software Engineering for Scientific Computer Simulations

    Science.gov (United States)

    Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.

    2004-11-01

    Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.

  11. Integrated software environment dedicated for implementation of control systems based on PLC controllers

    Directory of Open Access Journals (Sweden)

    Szymon SURMA

    2007-01-01

    Industrial processes' control systems based on PLC controllers play today a very important role in all fields of transport, including sea transport. Construction of control systems is a field of engineering which has been continuously evolving towards maximum simplification of the system design path. Up to now, the time needed for system construction, from design to commissioning, had to be divided into a few stages; a mistake made in an earlier stage meant that in most cases the later stages had to be restarted. Available debugging systems allow defect detection at an early stage of project implementation. The paper presents a general characterisation of integrated software for the implementation of complex control systems. The issues related to using the software for programming the visualisation environment and the control computer, selecting the transmission medium and transmission protocol, as well as PLC controller configuration, software and control, are analysed.

  12. On the Critical Role of Divergent Selection in Evolvability

    Directory of Open Access Journals (Sweden)

    Joel Lehman

    2016-08-01

    An ambitious goal in evolutionary robotics is to evolve increasingly complex robotic behaviors with minimal human design effort. Reaching this goal requires evolutionary algorithms that can unlock from genetic encodings their latent potential for evolvability. One issue clouding this goal is conceptual confusion about evolvability, which often obscures the aspects of evolvability that are important or desirable. The danger from such confusion is that it may establish unrealistic goals for evolvability that prove unproductive in practice. An important issue separate from conceptual confusion is the common misalignment between selection and evolvability in evolutionary robotics. While more expressive encodings can represent higher-level adaptations (e.g. sexual reproduction or developmental systems) that increase long-term evolutionary potential (i.e. evolvability), realizing such potential requires gradients of fitness and evolvability to align. In other words, selection is often a critical factor limiting increases in evolvability. Thus, drawing from a series of recent papers, this article seeks both to (1) clarify and focus the ways in which the term evolvability is used within artificial evolution, and (2) argue for the importance of one type of selection, i.e. divergent selection, for enabling evolvability. The main argument is that there is a fundamental connection between divergent selection and evolvability (on both the individual and population level) that does not hold for typical goal-oriented selection. The conclusion is that selection pressure plays a critical role in realizing the potential for evolvability, and that divergent selection in particular provides a principled mechanism for encouraging evolvability in artificial evolution.
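
    Novelty search is the canonical example of the divergent selection argued for above: individuals are selected for how different their behavior is from an archive of past behaviors, not for objective fitness. The loop below is a minimal illustrative sketch under that reading, not the authors' code.

    ```python
    import random
    import numpy as np

    def novelty(behavior, archive, k=5):
        """Mean distance to the k nearest archived behaviors: the divergent
        selection signal (higher = more novel)."""
        if not archive:
            return float("inf")
        dists = sorted(np.linalg.norm(np.array(behavior) - np.array(b)) for b in archive)
        return float(np.mean(dists[:k]))

    def evolve(evaluate_behavior, mutate, init, generations=50, pop_size=40):
        population = [init() for _ in range(pop_size)]
        archive = []
        for _ in range(generations):
            scored = sorted(population, reverse=True,
                            key=lambda g: novelty(evaluate_behavior(g), archive))
            archive.extend(evaluate_behavior(g) for g in scored[:2])  # keep most novel
            parents = scored[: pop_size // 2]
            population = [mutate(random.choice(parents)) for _ in range(pop_size)]
        return population, archive

    # Toy usage: genomes are 2-D points and the "behavior" is the point itself.
    pop, arch = evolve(lambda g: g,
                       lambda g: [x + random.gauss(0, 0.1) for x in g],
                       lambda: [0.0, 0.0])
    print(len(arch))
    ```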

  13. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  14. Evolved H II regions

    International Nuclear Information System (INIS)

    Churchwell, E.

    1975-01-01

    A probable evolutionary sequence of H II regions based on six distinct types of observed objects is suggested. Two examples which may deviate from this idealized sequence, are discussed. Even though a size-mean density relation of H II regions can be used as a rough indication of whether a nebula is very young or evolved, it is argued that such a relation is not likely to be useful for the quantitative assignment of ages to H II regions. Evolved H II regions appear to fit into one of four structural types: rings, core-halos, smooth structures, and irregular or filamentary structures. Examples of each type are given with their derived physical parameters. The energy balance in these nebulae is considered. The mass of ionized gas in evolved H II regions is in general too large to trace the nebula back to single compact H II regions. Finally, the morphological type of the Galaxy is considered from its H II region content. 2 tables, 2 figs., 29 refs

  15. Software engineering and Ada (Trademark) training: An implementation model for NASA

    Science.gov (United States)

    Legrand, Sue; Freedman, Glenn

    1988-01-01

    The choice of Ada for software engineering for projects such as the Space Station has resulted in government and industrial groups considering training programs that help workers become familiar with both a software culture and the intricacies of a new computer language. The questions of how much time it takes to learn software engineering with Ada, how much an organization should invest in such training, and how the training should be structured are considered. Software engineering is an emerging, dynamic discipline. It is defined by the author as the establishment and application of sound engineering environments, tools, methods, models, principles, and concepts combined with appropriate standards, guidelines, and practices to support computing which is correct, modifiable, reliable and safe, efficient, and understandable throughout the life cycle of the application. Neither the training programs needed, nor the content of such programs, have been well established. This study addresses the requirements for training for NASA personnel and recommends an implementation plan. A curriculum and a means of delivery are recommended. It is further suggested that a knowledgeable programmer may be able to learn Ada in 5 days, but that it takes 6 to 9 months to evolve into a software engineer who uses the language correctly and effectively. The curriculum and implementation plan can be adapted for each NASA Center according to the needs dictated by each project.

  16. IHE cross-enterprise document sharing for imaging: interoperability testing software

    Directory of Open Access Journals (Sweden)

    Renaud Bérubé

    2010-09-01

    Background: With the deployment of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems, so interoperability between peers is essential. Achieving interoperability requires various types of testing: implementations need to be tested using software that simulates communication partners and that provides test data and test plans. Results: In this paper we describe software that is used to test systems involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross-Enterprise Document Sharing for Imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the elected design solutions. Conclusions: EHR is being deployed in several countries, and the EHR infrastructure will continuously evolve to embrace advances in the information technology domain. Our software is built on a web framework to allow easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations, by site integrators to verify and test the interoperability of systems, and by developers to understand specification ambiguities or to resolve implementation difficulties.

  17. Key attributes of the SAPHIRE risk and reliability analysis software for risk-informed probabilistic applications

    International Nuclear Information System (INIS)

    Smith, Curtis; Knudsen, James; Kvarfordt, Kellie; Wood, Ted

    2008-01-01

    The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, dating back over 35 years. Evolving from mainframe-based software, the current state-of-the-practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. In order to better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis methods and solution methods build upon pioneering work done 30-40 years ago. We contrast this work with the current capabilities in the SAPHIRE analysis package. As part of this discussion, we provide information on both the typical features and the special analysis capabilities which are available. We also present the application and results typically found with state-of-the-practice PRRA models. By providing both a high-level and a detailed look at the SAPHIRE software, we give a snapshot in time of the current use of software tools in a risk-informed decision arena.

  18. Evolving attractive faces using morphing technology and a genetic algorithm: a new approach to determining ideal facial aesthetics.

    Science.gov (United States)

    Wong, Brian J F; Karimi, Koohyar; Devcic, Zlatko; McLaren, Christine E; Chen, Wen-Pin

    2008-06-01

    The objectives of this study were to: 1) determine if a genetic algorithm in combination with morphing software can be used to evolve more attractive faces; and 2) evaluate whether this approach can be used as a tool to define or identify the attributes of the ideal attractive face. Basic research study incorporating focus group evaluations. Digital images were acquired of 250 female volunteers (18-25 y). Randomly selected images were used to produce a parent generation (P) of 30 synthetic faces using morphing software. Then, a focus group of 17 trained volunteers (18-25 y) scored each face on an attractiveness scale ranging from 1 (unattractive) to 10 (attractive). A genetic algorithm was used to select 30 new pairs from the parent generation, and these were morphed using software to produce a new first generation (F1) of faces. The F1 faces were scored by the focus group, and the process was repeated for a total of four iterations of the algorithm. The algorithm mimics natural selection by using the attractiveness score as the selection pressure; the more attractive faces are more likely to morph. All five generations (P-F4) were then scored by three focus groups: a) surgeons (n = 12), b) cosmetology students (n = 44), and c) undergraduate students (n = 44). Morphometric measurements were made of 33 specific features on each of the 150 synthetic faces, and correlated with attractiveness scores using univariate and multivariate analysis. The average facial attractiveness scores increased with each generation and were 3.66 (±0.60), 4.59 (±0.73), 5.50 (±0.62), 6.23 (±0.31), and 6.39 (±0.24) for the P and F1-F4 generations, respectively. Histograms of attractiveness score distributions show a significant shift in the skew of each curve toward more attractive faces with each generation. Univariate analysis identified nasal width, eyebrow arch height, and lip thickness as being significantly correlated with attractiveness scores. Multivariate analysis identified a
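
    The selection step described above, pairs chosen with probability proportional to attractiveness, is fitness-proportionate (roulette-wheel) selection; a minimal sketch with made-up scores might look like this.

    ```python
    import random

    def select_pairs(scores, n_pairs, rng=random.Random(0)):
        """Fitness-proportionate (roulette-wheel) selection of parent pairs,
        mirroring the idea that more attractive faces are more likely to be
        morphed together. `scores` maps face id -> mean attractiveness."""
        faces, weights = zip(*scores.items())
        pairs = []
        while len(pairs) < n_pairs:
            a, b = rng.choices(faces, weights=weights, k=2)
            if a != b:
                pairs.append((a, b))
        return pairs

    # Invented parent generation of 30 faces with random attractiveness scores.
    generation_p = {f"face{i:02d}": random.uniform(1, 10) for i in range(30)}
    for mother, father in select_pairs(generation_p, n_pairs=30):
        pass  # each pair would be handed to the morphing software here
    ```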

  19. Riding the Crest of the E-Commerce Wave: Transforming MIT's Campus Computer Resale Operation.

    Science.gov (United States)

    Hallisey, Joanne

    1998-01-01

    Reengineering efforts, vendor consolidation, and rising costs prompted the Massachusetts Institute of Technology to convert its computer resale store to an online catalog that allows students, faculty, and staff to purchase equipment and software through a World Wide Web interface. The transition has been greeted with a mixed reaction. The next…

  20. Software engineering: architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  1. Software FMEA analysis for safety-related application software

    International Nuclear Information System (INIS)

    Park, Gee-Yong; Kim, Dong Hoon; Lee, Dong Young

    2014-01-01

    Highlights: • We develop a modified FMEA analysis suited for applying to software architecture. • A template for failure modes on a specific software language is established. • A detailed-level software FMEA analysis on nuclear safety software is presented. - Abstract: A method of a software safety analysis is described in this paper for safety-related application software. The target software system is a software code installed at an Automatic Test and Interface Processor (ATIP) in a digital reactor protection system (DRPS). For the ATIP software safety analysis, at first, an overall safety or hazard analysis is performed over the software architecture and modules, and then a detailed safety analysis based on the software FMEA (Failure Modes and Effect Analysis) method is applied to the ATIP program. For an efficient analysis, the software FMEA analysis is carried out based on the so-called failure-mode template extracted from the function blocks used in the function block diagram (FBD) for the ATIP software. The software safety analysis by the software FMEA analysis, being applied to the ATIP software code, which has been integrated and passed through a very rigorous system test procedure, is proven to be able to provide very valuable results (i.e., software defects) that could not be identified during various system tests
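
    The failure-mode template extracted from FBD function blocks can be pictured as a lookup table keyed by block type; the entries below are invented generic IEC 61131-3 blocks, shown only to illustrate the idea, not the paper's actual template.

    ```python
    # Invented failure-mode template keyed by FBD function-block type.
    FAILURE_MODE_TEMPLATE = {
        "TON":  ["timer never expires", "timer expires early", "preset corrupted"],
        "AND":  ["output stuck TRUE", "output stuck FALSE"],
        "MOVE": ["stale value propagated", "wrong source operand"],
    }

    def fmea_rows(fbd_blocks):
        """Expand each block instance into (instance, type, failure mode) rows
        that an analyst would then score for effect and criticality."""
        for name, block_type in fbd_blocks:
            for mode in FAILURE_MODE_TEMPLATE.get(block_type, ["unanalyzed block type"]):
                yield name, block_type, mode

    for row in fmea_rows([("trip_delay", "TON"), ("trip_vote", "AND")]):
        print(row)
    ```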

  2. A new practice-driven approach to develop software in a cyber-physical system environment

    Science.gov (United States)

    Jiang, Yiping; Chen, C. L. Philip; Duan, Junwei

    2016-02-01

    Cyber-physical system (CPS) is an emerging area, which cannot work efficiently without proper software handling of the data and business logic. Software and middleware are the soul of the CPS. The software development of CPS is a critical issue because of its complexity in a large-scale realistic system. Furthermore, the object-oriented approach (OOA) is often used to develop CPS software, and it needs some improvements according to the characteristics of CPS. To develop software in a CPS environment, a new systematic approach is proposed in this paper. It comes from practice and has evolved within software companies. It consists of (A) requirement analysis in an event-oriented way, (B) architecture design in a data-oriented way, (C) detailed design and coding in an object-oriented way and (D) testing in an event-oriented way. It is a new approach based on OOA; the difference when compared with OOA is that the proposed approach has different emphases and measures in every stage. It is more in accord with the characteristics of event-driven CPS. In CPS software development, one should focus on the events more than on the functions or objects. A case study of a smart home system is designed to reveal the effectiveness of the approach. It shows that the approach is also easy to apply in practice owing to some simplifications. The running result illustrates the validity of this approach.

  3. PRINCIPLES OF RE-ENGINEERING METHODOLOGY FOR TECHNOLOGICAL PROCESS IN PROCESSING OF RAW MATERIAL COMPONENTS WHILE PRODUCING CEMENT AND SILICATE PRODUCTS

    Directory of Open Access Journals (Sweden)

    I. A. Busel

    2014-01-01

    Grinding is characterized by high energy consumption and low productivity. The efficiency of the ball mills currently applied for grinding is rather low: only 3-6 % of the supplied power is used for material grinding, while the rest of the energy is dissipated as heat, vibration and noise. Reducing energy consumption is therefore of great importance. Improving the efficiency and quality of the technological process of grinding raw material components in the production of construction materials is considered one of the priority targets of power and resource saving in the construction industry. Grinding efficiency at operating enterprises is best improved by modernizing the equipment and the existing technological, management and other processes related to the grinding of mineral raw materials. In order to reduce grinding power consumption it is necessary to carry out a complex re-engineering of the technological grinding process for various materials, based on the use of new modifications of grinding bodies, physical and chemical grinding aids, modern information technologies and industrial automation equipment. The application of modern information technologies and industrial automation equipment makes it possible to execute the grinding process with the maximum achievable productivity for the existing capacity, through automatic control and consideration of continuous changes in technological parameters. In addition, such an approach gives an opportunity to control processes in real time by immediate adjustment of the operational modes of the technological equipment. The paper considers an approach to the development of a re-engineering methodology for the technological process of grinding raw material components in the production of construction materials. The present state of the technological grinding process is presented. The paper points out the

  4. Evolving Technologies: A View to Tomorrow

    Science.gov (United States)

    Tamarkin, Molly; Rodrigo, Shelley

    2011-01-01

    Technology leaders must participate in strategy creation as well as operational delivery within higher education institutions. The future of higher education--the view to tomorrow--is irrevocably integrated and intertwined with evolving technologies. This article focuses on two specific evolving technologies: (1) alternative IT sourcing; and (2)…

  5. Profile of NASA software engineering: Lessons learned from building the baseline

    Science.gov (United States)

    Hall, Dana; Mcgarry, Frank

    1993-01-01

    It is critically important in any improvement activity to first understand the organization's current status, strengths, and weaknesses and, only after that understanding is achieved, examine and implement promising improvements. This fundamental rule is certainly true for an organization seeking to further its software viability and effectiveness. This paper addresses the role of the organizational process baseline in a software improvement effort and the lessons we learned assembling such an understanding for NASA overall and for the NASA Goddard Space Flight Center in particular. We discuss important, core data that must be captured and contrast that with our experience in actually finding such information. Our baselining efforts have evolved into a set of data gathering, analysis, and crosschecking techniques and information presentation formats that may prove useful to others seeking to establish similar baselines for their organization.

  6. Portable image-manipulation software: what is the extra development cost?

    Science.gov (United States)

    Ligier, Y; Ratib, O; Funk, M; Perrier, R; Girard, C; Logean, M

    1992-08-01

    A hospital-wide picture archiving and communication system (PACS) project is currently under development at the University Hospital of Geneva. The visualization and manipulation of images provided by different imaging modalities constitutes one of the most challenging components of a PACS. It was necessary to provide this visualization software on a number of types of workstations because of the varying requirements imposed by the range of clinical uses it must serve. The user interface must be the same, independent of the underlying workstation. In addition to a standard set of image-manipulation and processing tools, there is a need for more specific clinical tools that can be easily adapted to specific medical requirements. To achieve this goal, it was elected to develop modular and portable software called OSIRIS. This software is available on two different operating systems (the UNIX standard X-11/OSF-Motif based workstations and the Macintosh family) and can be easily ported to other systems. The extra effort required to design such software in a modular and portable way was worthwhile because it resulted in a platform that can be easily expanded and adapted to a variety of specific clinical applications. Its portability allows users to benefit from the rapidly evolving workstation technology and to adapt performance to suit their needs.

  7. Integrated State Estimation and Contingency Analysis Software Implementation using High Performance Computing Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt R.; Rice, Mark J.; Huang, Zhenyu

    2015-12-31

    Power system simulation tools are traditionally developed in sequential mode and codes are optimized for single core computing only. However, the increasing complexity in the power grid models requires more intensive computation. The traditional simulation tools will soon not be able to meet the grid operation requirements. Therefore, power system simulation tools need to evolve accordingly to provide faster and better results for grid operations. This paper presents an integrated state estimation and contingency analysis software implementation using high performance computing techniques. The software is able to solve large size state estimation problems within one second and achieve a near-linear speedup of 9,800 with 10,000 cores for contingency analysis application. The performance evaluation is presented to show its effectiveness.
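
    Contingency analysis parallelizes naturally because each outage case is solved independently, which is what makes near-linear speedups plausible; the schematic below (invented, and far simpler than the paper's HPC implementation) shows the shape of such a computation.

    ```python
    from concurrent.futures import ProcessPoolExecutor

    def solve_contingency(case):
        """Placeholder for one post-outage power-flow solution; a real HPC
        code would invoke a numerical solver here."""
        case_id, outage = case
        return case_id, f"power flow re-solved with {outage} removed"

    def contingency_analysis(outages, workers=8):
        cases = list(enumerate(outages))
        # Each case is independent, so the pool scales with available cores.
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return dict(pool.map(solve_contingency, cases))

    if __name__ == "__main__":
        results = contingency_analysis(["line 12-13", "gen 7", "xfmr 4"])
        print(results)
    ```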

  8. Spacetimes containing slowly evolving horizons

    International Nuclear Information System (INIS)

    Kavanagh, William; Booth, Ivan

    2006-01-01

    Slowly evolving horizons are trapping horizons that are ''almost'' isolated horizons. This paper reviews their definition and discusses several spacetimes containing such structures. These include certain Vaidya and Tolman-Bondi solutions as well as (perturbatively) tidally distorted black holes. Taking into account the mass scales and orders of magnitude that arise in these calculations, we conjecture that slowly evolving horizons are the norm rather than the exception in astrophysical processes that involve stellar-scale black holes

  9. Source Code Vulnerabilities in IoT Software Systems

    Directory of Open Access Journals (Sweden)

    Saleh Mohamed Alnaeli

    2017-08-01

    An empirical study that examines the usage of known vulnerable statements in software systems developed in C/C++ and used for IoT is presented. The study is conducted on 18 open source systems comprised of millions of lines of code and containing thousands of files. Static analysis methods are applied to each system to determine the number of unsafe commands (e.g., strcpy, strcmp, and strlen) that are well known among research communities to cause potential risks and security concerns, thereby decreasing a system's robustness and quality. These unsafe statements are banned by many companies (e.g., Microsoft). The use of these commands should be avoided from the start when writing code and should be removed from legacy code over time, as recommended by new C/C++ language standards. Each system is analyzed and the distribution of the known unsafe commands is presented. Historical trends in the usage of the unsafe commands in 7 of the systems are presented to show how the studied systems evolved over time with respect to the vulnerable code. The results show that the most prevalent unsafe command used in most systems is memcpy, followed by strlen. These results can be used to help train software developers in secure coding practices so that they can write higher quality software systems.
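
    The flavour of static count the study performs can be approximated in a few lines; the sketch below (not the authors' tooling) does a rough lexical tally of a handful of the flagged calls across a C/C++ source tree.

    ```python
    import re
    from collections import Counter
    from pathlib import Path

    # Subset of the unsafe calls named in the abstract above.
    UNSAFE = ("strcpy", "strcmp", "strlen", "memcpy")
    CALL_RE = re.compile(r"\b(" + "|".join(UNSAFE) + r")\s*\(")

    def count_unsafe_calls(root):
        """Count occurrences of the flagged calls in every C/C++ file under
        root. A rough lexical scan: comments and string literals are not
        excluded, which a real static analyzer would handle."""
        totals = Counter()
        for path in Path(root).rglob("*"):
            if path.is_file() and path.suffix in {".c", ".cpp", ".cc", ".h", ".hpp"}:
                text = path.read_text(errors="ignore")
                totals.update(m.group(1) for m in CALL_RE.finditer(text))
        return totals

    print(count_unsafe_calls(".").most_common())
    ```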

  10. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  11. Re-engineering therapeutic antibodies for Alzheimer's disease as blood-brain barrier penetrating bi-specific antibodies.

    Science.gov (United States)

    Pardridge, William M

    2016-12-01

    Therapeutic antibodies are large molecule drugs that do not cross the blood-brain barrier (BBB). Therefore, drug development of therapeutic antibodies for Alzheimer's disease (AD) requires that these molecules be re-engineered to enable BBB delivery. This is possible by joining the therapeutic antibody with a transporter antibody, resulting in the engineering of a BBB-penetrating bispecific antibody (BSA). Areas covered: The manuscript covers transporter antibodies that cross the BBB via receptor-mediated transport systems on the BBB, such as the insulin receptor or transferrin receptor. Furthermore, it highlights therapeutic antibodies for AD that target the Abeta amyloid peptide, beta secretase-1, or the metabotropic glutamate receptor-1. BSAs are comprised of both the transporter antibody and the therapeutic antibody, as well as IgG constant region, which can induce immune tolerance or trigger transport via Fc receptors. Expert opinion: Multiple types of BSA molecular designs have been used to engineer BBB-penetrating BSAs, which differ in valency and spatial orientation of the transporter and therapeutic domains of the BSA. The plasma pharmacokinetics and dosing regimens of BSAs differ from that of conventional therapeutic antibodies. BBB-penetrating BSAs may be engineered in the future as new treatments of AD, as well as other neural disorders.

  12. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage the large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analysis of core photos and images; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software, which features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  13. Use of FPGA to face electronic component obsolescence in software based safety I and C in NPPS

    International Nuclear Information System (INIS)

    Hadj, Abdellah; Bach, Julien; Esmenjaud, Claude; Daumas, Frederic; Salauen, Patrick

    2010-01-01

    In order to extend the lifetime of their Nuclear Power Plants (NPPs), most utilities are looking for ways to implement the renovation of their existing Instrumentation and Control (I and C) systems. When the I and C to modernize is software based, three paths can be considered: - to keep the legacy microprocessor and limit refurbishment to the associated hardware (i.e. the I/O boards, memories and the CPU board itself), - to move to another I and C platform based on another microprocessor, - to move to a non-microprocessor-based I and C platform. Software based I and C provides strong advantages such as flexibility and the ability to implement advanced functions; however, the complexity and decreasing lifetime of today's microprocessors, mainly developed for the needs of the personal computer market, make their use and licensing for safety digital I and C systems difficult. Solutions based on re-engineering of legacy microprocessors, or the use of microprocessors dedicated to critical applications, need to be considered. In order to share a prospective vision of the future of I and C systems in NPPs, the Electricite de France (EDF) Research and Development division and Rolls-Royce have launched a three-year cooperation program on the use of ASIC/FPGA technology in safety I and C systems. The first step of this program addresses the ability of ASIC/FPGA technology to provide replacement solutions for former microprocessors, taking as an example the replacement of the Motorola MC6800 microprocessor. This paper presents the development of an IP core cloning the Motorola MC6800 microprocessor, suitable for use in the refurbishment of safety I and C equipment based on this microprocessor. (authors)

  14. NASA's Advanced Multimission Operations System: A Case Study in Formalizing Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2011-01-01

    All software systems of significant size and longevity eventually undergo changes to their basic architectural structure. Such changes may be prompted by evolving requirements, changing technology, or other reasons. Whatever the cause, software architecture evolution is commonplace in real world software projects. Recently, software architecture researchers have begun to study this phenomenon in depth. However, this work has suffered from problems of validation; research in this area has tended to make heavy use of toy examples and hypothetical scenarios and has not been well supported by real world examples. To help address this problem, I describe an ongoing effort at the Jet Propulsion Laboratory to re-architect the Advanced Multimission Operations System (AMMOS), which is used to operate NASA's deep-space and astrophysics missions. Based on examination of project documents and interviews with project personnel, I describe the goals and approach of this evolution effort and then present models that capture some of the key architectural changes. Finally, I demonstrate how approaches and formal methods from my previous research in architecture evolution may be applied to this evolution, while using languages and tools already in place at the Jet Propulsion Laboratory.

  15. Evolving phenotypic networks in silico.

    Science.gov (United States)

    François, Paul

    2014-11-01

    Evolved gene networks are constrained by natural selection. Their structures and functions are consequently far from being random, as exemplified by the multiple instances of parallel/convergent evolution. One can thus ask if features of actual gene networks can be recovered from evolutionary first principles. I review a method for in silico evolution of small models of gene networks aiming at performing predefined biological functions. I summarize the current implementation of the algorithm, insisting on the construction of a proper "fitness" function. I illustrate the approach on three examples: biochemical adaptation, ligand discrimination and vertebrate segmentation (somitogenesis). While the structure of the evolved networks is variable, dynamics of our evolved networks are usually constrained and present many similar features to actual gene networks, including properties that were not explicitly selected for. In silico evolution can thus be used to predict biological behaviours without a detailed knowledge of the mapping between genotype and phenotype. Copyright © 2014 The Author. Published by Elsevier Ltd. All rights reserved.
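
    The reviewed method is, at its core, a mutate-score-select loop over small network models driven by a hand-built fitness function; the toy loop below (with an invented fitness standing in for a real network simulation) shows that skeleton.

    ```python
    import random

    rng = random.Random(0)

    def mutate(params):
        """Perturb one interaction strength; real versions of such algorithms
        also add and remove genes or interactions."""
        p = params[:]
        p[rng.randrange(len(p))] += rng.gauss(0, 0.2)
        return p

    def fitness(params):
        """Invented stand-in for a proper fitness function; a real one would
        simulate the network dynamics and score, e.g., how well the output
        adapts back to baseline after a step input."""
        target = [1.0, -0.5, 0.25]
        return -sum((a - b) ** 2 for a, b in zip(params, target))

    population = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(40)]
    for generation in range(200):
        population.sort(key=fitness, reverse=True)
        survivors = population[:10]                       # selection
        population = survivors + [mutate(rng.choice(survivors)) for _ in range(30)]
    print(max(map(fitness, population)))                  # approaches 0
    ```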

  16. Designing the modern pump: engineering aspects of continuous subcutaneous insulin infusion software.

    Science.gov (United States)

    Welsh, John B; Vargas, Steven; Williams, Gary; Moberg, Sheldon

    2010-06-01

    Insulin delivery systems attracted the efforts of biological, mechanical, electrical, and software engineers well before they were commercially viable. The introduction of the first commercial insulin pump in 1983 represents an enduring milestone in the history of diabetes management. Since then, pumps have become much more than motorized syringes and have assumed a central role in diabetes management by housing data on insulin delivery and glucose readings, assisting in bolus estimation, and interfacing smoothly with humans and compatible devices. Ensuring the integrity of the embedded software that controls these devices is critical to patient safety and regulatory compliance. As pumps and related devices evolve, software engineers will face challenges and opportunities in designing pumps that are safe, reliable, and feature-rich. The pumps and related systems must also satisfy end users, healthcare providers, and regulatory authorities. In particular, pumps that are combined with glucose sensors and appropriate algorithms will provide the basis for increasingly safe and precise automated insulin delivery, essential steps to developing a fully closed-loop system.

  17. Implementing Kanban for agile process management within the ALMA Software Operations Group

    Science.gov (United States)

    Reveco, Johnny; Mora, Matias; Shen, Tzu-Chiang; Soto, Ruben; Sepulveda, Jorge; Ibsen, Jorge

    2014-07-01

    After the inauguration of the Atacama Large Millimeter/submillimeter Array (ALMA), the Software Operations Group in Chile has refocused its objectives to: (1) providing software support to tasks related to System Integration, Scientific Commissioning and Verification, as well as Early Science observations; (2) testing the remaining software features, still under development by the Integrated Computing Team across the world; and (3) designing and developing processes to optimize and increase the level of automation of operational tasks. Due to their different stakeholders, each of these tasks presents a wide diversity of importances, lifespans and complexities. Aiming to provide the proper priority and traceability for every task without stressing our engineers, we introduced the Kanban methodology in our processes in order to balance the demand on the team against the throughput of the delivered work. The aim of this paper is to share experiences gained during the implementation of Kanban in our processes, describing the difficulties we have found, solutions and adaptations that led us to our current but still evolving implementation, which has greatly improved our throughput, prioritization and problem traceability.
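
    Kanban's balancing mechanism is the work-in-progress limit: a column that is full refuses new work, which throttles demand to match throughput. The toy board below (column names and limits invented) captures just that rule.

    ```python
    class KanbanBoard:
        """Toy WIP-limited board: pulling a task into a full column is refused,
        which is how Kanban balances demand against delivered throughput."""

        def __init__(self, limits):
            self.limits = limits
            self.columns = {name: [] for name in limits}

        def pull(self, task, column):
            if len(self.columns[column]) >= self.limits[column]:
                return False                      # respect the WIP limit
            for tasks in self.columns.values():   # remove from previous column
                if task in tasks:
                    tasks.remove(task)
            self.columns[column].append(task)
            return True

    board = KanbanBoard({"todo": 10, "doing": 3, "done": 100})
    board.pull("fix correlator config", "todo")
    print(board.pull("fix correlator config", "doing"))  # True until 'doing' fills
    ```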

  18. canEvolve: a web portal for integrative oncogenomics.

    Directory of Open Access Journals (Sweden)

    Mehmet Kemal Samur

    BACKGROUND & OBJECTIVE: Genome-wide profiles of tumors obtained using functional genomics platforms are being deposited to the public repositories at an astronomical scale, as a result of focused efforts by individual laboratories and large projects such as the Cancer Genome Atlas (TCGA) and the International Cancer Genome Consortium. Consequently, there is an urgent need for reliable tools that integrate and interpret these data in light of current knowledge and disseminate results to biomedical researchers in a user-friendly manner. We have built the canEvolve web portal to meet this need. RESULTS: canEvolve query functionalities are designed to fulfill the most frequent analysis needs of cancer researchers with a view to generating novel hypotheses. canEvolve stores gene, microRNA (miRNA) and protein expression profiles, copy number alterations for multiple cancer types, and protein-protein interaction information. canEvolve allows querying of results of primary analysis, integrative analysis and network analysis of oncogenomics data. The querying for primary analysis includes differential gene and miRNA expression as well as changes in gene copy number measured with SNP microarrays. canEvolve provides results of integrative analysis of gene expression profiles with copy number alterations and with miRNA profiles, as well as generalized integrative analysis using gene set enrichment analysis. The network analysis capability includes storage and visualization of gene co-expression, inferred gene regulatory networks and protein-protein interaction information. Finally, canEvolve provides correlations between gene expression and clinical outcomes in terms of univariate survival analysis. CONCLUSION: At present canEvolve provides different types of information extracted from 90 cancer genomics studies comprising more than 10,000 patients. The presence of multiple data types, novel integrative analysis for identifying regulators of oncogenesis, network
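
    The univariate survival correlations canEvolve reports are typically computed with a Cox proportional-hazards model; a generic version of such an analysis, using the open-source lifelines package on simulated data (not canEvolve's own code), looks like this.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    # Simulated cohort: expression of one gene, follow-up time, event indicator.
    rng = np.random.default_rng(0)
    n = 200
    expr = rng.normal(size=n)
    time = rng.exponential(scale=np.exp(-0.5 * expr))  # higher expression, earlier events
    event = (rng.random(n) < 0.8).astype(int)          # 20% censored

    df = pd.DataFrame({"expression": expr, "time": time, "event": event})
    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event")
    cph.print_summary()  # the hazard ratio links expression level to outcome
    ```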

  19. Examining software complexity and quality for scientific software

    International Nuclear Information System (INIS)

    Kelly, D.; Shepard, T.

    2005-01-01

    Research has not found a simple relationship between software complexity and software quality, and in particular no relationship between commonly used software complexity metrics and the occurrence of software faults. A study with an example of scientific software from the nuclear power industry illustrates the importance of addressing cognitive complexity, the complexity related to understanding the intellectual content of the software. Simple practices such as aptly named variables contribute more to high quality software than limiting code sizes. This paper examines the research into complexity and quality and reports on a longitudinal study using the example of nuclear software. (author)

  20. Product-oriented Software Certification Process for Software Synthesis

    Science.gov (United States)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  1. Business Process Reengineering - Can a Management Strategy Improve the Working Environment?

    DEFF Research Database (Denmark)

    Koch, Christian

    1997-01-01

    Ergonomists need to adopt a more proactive approach to management concepts; it is insufficient to wait until workplace examples have evolved. In this contribution BPR is used as an exemplar of a typical contemporary management concept.

  2. A concept of software testing for SMART MMIS software

    International Nuclear Information System (INIS)

    Seo, Yong Seok; Seong, Seung Hwan; Park, Keun Ok; Hur, Sub; Kim, Dong Hoon

    2001-01-01

    In order to achieve high quality SMART MMIS software, a well-constructed software testing concept is required. This paper establishes a software testing concept to be applied to the SMART MMIS software, in terms of software testing organization, documentation, procedure, and methods. The software testing methods are classified into source code static analysis and dynamic testing. The software dynamic testing methods are discussed from two aspects: white-box and black-box testing. As the software testing concept introduced in this paper is applied to the SMART MMIS software, high quality software will be produced. In the future, software failure data will be collected through the construction of the SMART MMIS prototyping facility to which the software testing concept of this paper is applied

  3. Conversion of Sox17 into a pluripotency reprogramming factor by reengineering its association with Oct4 on DNA.

    Science.gov (United States)

    Jauch, Ralf; Aksoy, Irene; Hutchins, Andrew Paul; Ng, Calista Keow Leng; Tian, Xian Feng; Chen, Jiaxuan; Palasingam, Paaventhan; Robson, Paul; Stanton, Lawrence W; Kolatkar, Prasanna R

    2011-06-01

    Very few proteins are capable of inducing pluripotent stem (iPS) cells, and their biochemical uniqueness remains unexplained. For example, Sox2 cooperates with other transcription factors to generate iPS cells, but Sox17, despite binding to similar DNA sequences, cannot. Here, we show that Sox2 and Sox17 exhibit inverse heterodimerization preferences with Oct4 on the canonical versus a newly identified compressed sox/oct motif. We can swap the cooperativity profiles of Sox2 and Sox17 by exchanging single amino acids at the Oct4 interaction interface, resulting in Sox2KE and Sox17EK proteins. The reengineered Sox17EK now promotes reprogramming of somatic cells to iPS, whereas Sox2KE has lost this potential. Consistently, when Sox2KE is overexpressed in embryonic stem cells it forces endoderm differentiation similar to wild-type Sox17. Together, we demonstrate that strategic point mutations that facilitate Sox/Oct4 dimer formation on variant DNA motifs lead to a dramatic swap of the bioactivities of Sox2 and Sox17. Copyright © 2011 AlphaMed Press.

  4. Architecture and method for optimization of cloud resources used in software testing

    Directory of Open Access Journals (Sweden)

    Joana Coelho Vigário

    2016-03-01

    Nowadays systems can evolve quickly, and this growth is accompanied by, for example, the production of new features, or even changes of system perspective required by the stakeholders. These conditions require software testing in order to validate the systems. Running a large battery of tests sequentially can take hours. However, tests can run faster in a distributed environment with rapid availability of pre-configured systems, such as cloud computing. There is increasing demand for automation of the entire process, including integration, build, running tests and management of cloud resources. This paper aims to demonstrate the applicability of the practice of continuous integration (CI) in Information Systems, for automating the build and software testing performed in a distributed cloud computing environment, in order to achieve optimization and elasticity of the resources provided by the cloud.

  5. On the Benefits of Divergent Search for Evolved Representations

    DEFF Research Database (Denmark)

    Lehman, Joel; Risi, Sebastian; Stanley, Kenneth O

    2012-01-01

    Evolved representations in evolutionary computation are often fragile, which can impede representation-dependent mechanisms such as self-adaptation. In contrast, evolved representations in nature are robust, evolvable, and creatively exploit available representational features. This paper provide...

  6. Software Maintenance and Evolution: The Implication for Software ...

    African Journals Online (AJOL)

    Software Maintenance and Evolution: The Implication for Software Development. ... Software maintenance is the process of modifying existing operational software by correcting errors, ...

  7. The genotype-phenotype map of an evolving digital organism

    OpenAIRE

    Fortuna, Miguel A.; Zaman, Luis; Ofria, Charles; Wagner, Andreas

    2017-01-01

    To understand how evolving systems bring forth novel and useful phenotypes, it is essential to understand the relationship between genotypic and phenotypic change. Artificial evolving systems can help us understand whether the genotype-phenotype maps of natural evolving systems are highly unusual, and it may help create evolvable artificial systems. Here we characterize the genotype-phenotype map of digital organisms in Avida, a platform for digital evolution. We consider digital organisms fr...

  8. EVOLVE

    CERN Document Server

    Deutz, André; Schütze, Oliver; Legrand, Pierrick; Tantar, Emilia; Tantar, Alexandru-Adrian

    2017-01-01

    This book comprises nine selected works on numerical and computational methods for solving multiobjective optimization, game theory, and machine learning problems. It provides extended versions of selected papers from various fields of science such as computer science, mathematics and engineering that were presented at EVOLVE 2013 held in July 2013 at Leiden University in the Netherlands. The internationally peer-reviewed papers include original work on important topics in both theory and applications, such as the role of diversity in optimization, statistical approaches to combinatorial optimization, computational game theory, and cell mapping techniques for numerical landscape exploration. Applications focus on aspects including robustness, handling multiple objectives, and complex search spaces in engineering design and computational biology.

  9. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  10. Revisiting Robustness and Evolvability: Evolution in Weighted Genotype Spaces

    Science.gov (United States)

    Partha, Raghavendran; Raman, Karthik

    2014-01-01

    Robustness and evolvability are highly intertwined properties of biological systems. The relationship between these properties determines how biological systems are able to withstand mutations and show variation in response to them. Computational studies have explored the relationship between these two properties using neutral networks of RNA sequences (genotype) and their secondary structures (phenotype) as a model system. However, these studies have assumed every mutation to a sequence to be equally likely; the differences in the likelihood of the occurrence of various mutations, and the consequences of the probabilistic nature of the mutations in such a system, have previously been ignored. Associating probabilities to mutations essentially results in the weighting of genotype space. Here we perform a comparative analysis of weighted and unweighted neutral networks of RNA sequences, and subsequently explore the relationship between robustness and evolvability. We show that assuming an equal likelihood for all mutations (as in an unweighted network) underestimates robustness and overestimates evolvability of a system. In spite of discarding this assumption, we observe that a negative correlation between sequence (genotype) robustness and sequence evolvability persists, and also that structure (phenotype) robustness promotes structure evolvability, as observed in earlier studies using unweighted networks. We also study the effects of base composition bias on robustness and evolvability. Particularly, we explore the association between robustness and evolvability in a sequence space that is AU-rich – sequences with an AU content of 80% or higher – compared to a normal (unbiased) sequence space. We find that evolvability of both sequences and structures in an AU-rich space is lower than in the normal space, and robustness is higher. We also observe that AU-rich populations evolving on neutral networks of phenotypes can access less phenotypic variation compared to
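
    The weighting idea can be stated compactly: in an unweighted network, the robustness of a genotype is the fraction of its one-point mutants that preserve the phenotype; weighting replaces that fraction with a probability-weighted sum over mutations. The toy Python sketch below illustrates this with a stand-in phenotype function (hypothetical mutation probabilities; real analyses fold RNA secondary structure):

        BASES = "AUGC"
        # Hypothetical per-substitution probabilities (transitions more likely
        # than transversions); all remaining substitutions get a default weight.
        SUB_PROB = {("A", "G"): 0.3, ("G", "A"): 0.3, ("C", "U"): 0.3, ("U", "C"): 0.3}
        DEFAULT_PROB = 0.1

        def phenotype(seq):
            """Toy phenotype: GC content, a stand-in for RNA secondary structure."""
            return round((seq.count("G") + seq.count("C")) / len(seq), 1)

        def weighted_robustness(seq):
            """Probability-weighted fraction of one-point mutations that are neutral."""
            total = neutral = 0.0
            for i, old in enumerate(seq):
                for new in BASES:
                    if new == old:
                        continue
                    p = SUB_PROB.get((old, new), DEFAULT_PROB)
                    total += p
                    mutant = seq[:i] + new + seq[i + 1:]
                    if phenotype(mutant) == phenotype(seq):
                        neutral += p
            return neutral / total

        # Setting all weights equal recovers the unweighted definition.
        print(weighted_robustness("AUGCGAUC"))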

  11. Software quality assurance plans for safety-critical software

    International Nuclear Information System (INIS)

    Liddle, P.

    2006-01-01

    Application software is defined as safety-critical if a fault in the software could prevent the system components from performing their nuclear-safety functions. Therefore, for nuclear-safety systems, the AREVA TELEPERM XS (TXS) system is classified 1E, as defined in the Inst. of Electrical and Electronics Engineers (IEEE) Std 603-1998. The application software is classified as Software Integrity Level (SIL)-4, as defined in IEEE Std 7-4.3.2-2003. The AREVA NP Inc. Software Program Manual (SPM) describes the measures taken to ensure that the TELEPERM XS application software attains a level of quality commensurate with its importance to safety. The manual also describes how TELEPERM XS correctly performs the required safety functions and conforms to established technical and documentation requirements, conventions, rules, and standards. The program manual covers the requirements definition, detailed design, integration, and test phases for the TELEPERM XS application software, and supporting software created by AREVA NP Inc. The SPM is required for all safety-related TELEPERM XS system applications. The program comprises several basic plans and practices: 1. A Software Quality-Assurance Plan (SQAP) that describes the processes necessary to ensure that the software attains a level of quality commensurate with its importance to safety function. 2. A Software Safety Plan (SSP) that identifies the process to reasonably ensure that safety-critical software performs as intended during all abnormal conditions and events, and does not introduce any new hazards that could jeopardize the health and safety of the public. 3. A Software Verification and Validation (V and V) Plan that describes the method of ensuring the software is in accordance with the requirements. 4. A Software Configuration Management Plan (SCMP) that describes the method of maintaining the software in an identifiable state at all times. 5. A Software Operations and Maintenance Plan (SO and MP) that

  12. IDC Reengineering Phase 2 & 3 Rough Order of Magnitude (ROM) Cost Estimate Summary (Leveraged NDC Case).

    Energy Technology Data Exchange (ETDEWEB)

    Harris, James M.; Prescott, Ryan; Dawson, Jericah M.; Huelskamp, Robert M.

    2014-11-01

    Sandia National Laboratories has prepared a ROM cost estimate for budgetary planning for the IDC Reengineering Phase 2 & 3 effort, based on leveraging a fully funded, Sandia executed NDC Modernization project. This report provides the ROM cost estimate and describes the methodology, assumptions, and cost model details used to create the ROM cost estimate. ROM Cost Estimate Disclaimer Contained herein is a Rough Order of Magnitude (ROM) cost estimate that has been provided to enable initial planning for this proposed project. This ROM cost estimate is submitted to facilitate informal discussions in relation to this project and is NOT intended to commit Sandia National Laboratories (Sandia) or its resources. Furthermore, as a Federally Funded Research and Development Center (FFRDC), Sandia must be compliant with the Anti-Deficiency Act and operate on a full-cost recovery basis. Therefore, while Sandia, in conjunction with the Sponsor, will use best judgment to execute work and to address the highest risks and most important issues in order to effectively manage within cost constraints, this ROM estimate and any subsequent approved cost estimates are on a 'full-cost recovery' basis. Thus, work can neither commence nor continue unless adequate funding has been accepted and certified by DOE.

  13. Intellectual Property Protection of Software – At the Crossroads of Software Patents and Open Source Software

    OpenAIRE

    Tantarimäki, Maria

    2018-01-01

    The thesis considers the intellectual property protection of software in Europe and in the US, which is an increasingly important subject as the world is globalizing and digitalizing. The special nature of software challenges intellectual property rights. The current protection of software is based on copyright protection, but in this thesis two other options are considered: software patents and open source software. Software patents provide strong protection for software whereas the pur...

  14. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

    Software quality is a specific property which tells what kind of standards software should have. In a software project, quality is the key factor in the success or decline of a software-related organization. Much research has been done regarding software quality. Software-related organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good-quality software. Quality is divided into three main layers which are Software Quality Assurance (SQA), Software Qu...

  15. Event visualisation in ATLAS: current software technologies, future prospects and trends

    CERN Document Server

    Bianchi, Riccardo-Maria; The ATLAS collaboration; Moyse, Edward

    2016-01-01

    At the beginning, HEP experiments made use of photographic images both to record and store experimental data and to illustrate their findings. Then the experiments evolved and needed to find ways to visualize their data. With the availability of computer graphics, software packages to display event data and the detector geometry started to be developed. Here a brief history of event displays is presented, with an overview of the different event display tools used today in HEP experiments in general, and in the LHC experiments in particular. Then the case of the ATLAS experiment is considered in more detail and two widely used event display packages are presented, Atlantis and VP1, focusing on the software technologies they employ, as well as their strengths, differences and their usage in the experiment: from physics analysis to detector development, and from online monitoring to outreach and communication. Future development plans and improvements in the ATLAS event display packages will also be discussed,...

  16. A software engineering process for safety-critical software application

    International Nuclear Information System (INIS)

    Kang, Byung Heon; Kim, Hang Bae; Chang, Hoon Seon; Jeon, Jong Sun

    1995-01-01

    Application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature of a viable high-quality safety-critical software development. It is based on the ideas of a rational design process and on the experience of the adaptation of such a process in the production of the safety-critical software for the shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process is significantly different from a conventional process in terms of rigorous software development phases and software design techniques. The process covers documentation, design, verification and testing using mathematically precise notations and a highly reviewable tabular format to specify software requirements, and to verify the software requirements and code against the software design using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design using a modular design technique so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also facilitates a high degree of confidence in the 'correctness' of the software production, and provides a relatively simple and straightforward code implementation effort. 1 figs., 10 refs. (Author)

  17. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  18. Customer Interaction in Software Development: A Comparison of Software Methodologies Deployed in Namibian Software Firms

    CSIR Research Space (South Africa)

    Iyawa, GE

    2016-01-01

    Full Text Available within the Namibian context. An implication for software project managers and software developers is that customer interaction should be properly managed to ensure that the software methodologies for improving software development processes...

  19. A SOFTWARE RELIABILITY ESTIMATION METHOD TO NUCLEAR SAFETY SOFTWARE

    Directory of Open Access Journals (Sweden)

    GEE-YONG PARK

    2014-02-01

    Full Text Available A method for estimating software reliability for nuclear safety software is proposed in this paper. This method is based on the software reliability growth model (SRGM), where the behavior of software failure is assumed to follow a non-homogeneous Poisson process. Two types of modeling schemes based on a particular underlying method are proposed in order to more precisely estimate and predict the number of software defects based on very rare software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating software test cases as a covariate into the model. It was identified that these models are capable of reasonably estimating the remaining number of software defects, which directly affect the reactor trip functions. The software reliability might be estimated from these modeling equations, and one approach for obtaining a software reliability value is proposed in this paper.
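
    The record does not give the model equations, but the classic NHPP formulation that SRGMs build on is the Goel-Okumoto model, whose mean-value function m(t) = a(1 - e^(-bt)) is the expected cumulative number of failures by test time t. The paper estimates parameters by Bayesian inference with test cases as a covariate; the sketch below instead fits the same mean-value function to hypothetical failure counts by least squares, purely to illustrate the mechanics:

        import numpy as np
        from scipy.optimize import curve_fit

        def mean_value(t, a, b):
            """Goel-Okumoto NHPP mean-value function: expected failures by time t."""
            return a * (1.0 - np.exp(-b * t))

        # Hypothetical cumulative failure counts observed at successive test times.
        t = np.array([10, 20, 30, 40, 50, 60], dtype=float)
        failures = np.array([3, 5, 7, 8, 8, 9], dtype=float)

        (a, b), _ = curve_fit(mean_value, t, failures, p0=[10.0, 0.05])
        print(f"estimated total defects a = {a:.1f}, detection rate b = {b:.3f}")
        print(f"estimated remaining defects = {a - mean_value(t[-1], a, b):.1f}")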

  20. Maintaining evolvability.

    Science.gov (United States)

    Crow, James F

    2008-12-01

    Although molecular methods, such as QTL mapping, have revealed a number of loci with large effects, it is still likely that the bulk of quantitative variability is due to multiple factors, each with small effect. Typically, these have a large additive component. Conventional wisdom argues that selection, natural or artificial, uses up additive variance and thus depletes its supply. Over time, the variance should be reduced, and at equilibrium be near zero. This is especially expected for fitness and traits highly correlated with it. Yet, populations typically have a great deal of additive variance, and do not seem to run out of genetic variability even after many generations of directional selection. Long-term selection experiments show that populations continue to retain seemingly undiminished additive variance despite large changes in the mean value. I propose that there are several reasons for this. (i) The environment is continually changing so that what was formerly most fit no longer is. (ii) There is an input of genetic variance from mutation, and sometimes from migration. (iii) As intermediate-frequency alleles increase in frequency towards one, producing less variance (as p → 1, p(1 − p) → 0), others that were originally near zero become more common and increase the variance. Thus, a roughly constant variance is maintained. (iv) There is always selection for fitness and for characters closely related to it. To the extent that the trait is heritable, later generations inherit a disproportionate number of genes acting additively on the trait, thus increasing genetic variance. For these reasons a selected population retains its ability to evolve. Of course, genes with large effect are also important. Conspicuous examples are the small number of loci that changed teosinte to maize, and major phylogenetic changes in the animal kingdom. The relative importance of these along with duplications, chromosome rearrangements, horizontal transmission and polyploidy

  1. Evolving from Planning and Scheduling to Real-Time Operations Support: Design Challenges

    Science.gov (United States)

    Marquez, Jessica J.; Ludowise, Melissa; McCurdy, Michael; Li, Jack

    2010-01-01

    Versions of the Scheduling and Planning Interface for Exploration (SPIFe) have supported a variety of mission operations across NASA. This software tool has evolved and matured over several years, assisting planners who develop intricate schedules. While initially conceived for surface Mars missions, SPIFe has been deployed in other domains, where people, rather than robotic explorers, execute plans. As a result, a diverse set of end-users has compelled growth in a new direction: supporting real-time operations. This paper describes the new needs and challenges that accompany this development. Among the key features that have been built for SPIFe are current-time indicators integrated into the interface and timeline, as well as other plan attributes that enable execution of scheduled activities. Field tests include mission support for the Lunar CRater Observation and Sensing Satellite (LCROSS), NASA Extreme Environment Mission Operations (NEEMO) and Desert Research and Technology Studies (DRATS) campaigns.

  2. The genotype-phenotype map of an evolving digital organism.

    Directory of Open Access Journals (Sweden)

    Miguel A Fortuna

    2017-02-01

    Full Text Available To understand how evolving systems bring forth novel and useful phenotypes, it is essential to understand the relationship between genotypic and phenotypic change. Artificial evolving systems can help us understand whether the genotype-phenotype maps of natural evolving systems are highly unusual, and it may help create evolvable artificial systems. Here we characterize the genotype-phenotype map of digital organisms in Avida, a platform for digital evolution. We consider digital organisms from a vast space of 10^141 genotypes (instruction sequences), which can form 512 different phenotypes. These phenotypes are distinguished by different Boolean logic functions they can compute, as well as by the complexity of these functions. We observe several properties with parallels in natural systems, such as connected genotype networks and asymmetric phenotypic transitions. The likely common cause is robustness to genotypic change. We describe an intriguing tension between phenotypic complexity and evolvability that may have implications for biological evolution. On the one hand, genotypic change is more likely to yield novel phenotypes in more complex organisms. On the other hand, the total number of novel phenotypes reachable through genotypic change is highest for organisms with simple phenotypes. Artificial evolving systems can help us study aspects of biological evolvability that are not accessible in vastly more complex natural systems. They can also help identify properties, such as robustness, that are required for both human-designed artificial systems and synthetic biological systems to be evolvable.

  3. The genotype-phenotype map of an evolving digital organism.

    Science.gov (United States)

    Fortuna, Miguel A; Zaman, Luis; Ofria, Charles; Wagner, Andreas

    2017-02-01

    To understand how evolving systems bring forth novel and useful phenotypes, it is essential to understand the relationship between genotypic and phenotypic change. Artificial evolving systems can help us understand whether the genotype-phenotype maps of natural evolving systems are highly unusual, and it may help create evolvable artificial systems. Here we characterize the genotype-phenotype map of digital organisms in Avida, a platform for digital evolution. We consider digital organisms from a vast space of 10^141 genotypes (instruction sequences), which can form 512 different phenotypes. These phenotypes are distinguished by different Boolean logic functions they can compute, as well as by the complexity of these functions. We observe several properties with parallels in natural systems, such as connected genotype networks and asymmetric phenotypic transitions. The likely common cause is robustness to genotypic change. We describe an intriguing tension between phenotypic complexity and evolvability that may have implications for biological evolution. On the one hand, genotypic change is more likely to yield novel phenotypes in more complex organisms. On the other hand, the total number of novel phenotypes reachable through genotypic change is highest for organisms with simple phenotypes. Artificial evolving systems can help us study aspects of biological evolvability that are not accessible in vastly more complex natural systems. They can also help identify properties, such as robustness, that are required for both human-designed artificial systems and synthetic biological systems to be evolvable.

  4. Heterogeneous scalable framework for multiphase flows

    Energy Technology Data Exchange (ETDEWEB)

    Morris, Karla Vanessa [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-09-01

    Two categories of challenges confront the developer of computational spray models: those related to the computation and those related to the physics. Regarding the computation, the trend towards heterogeneous, multi- and many-core platforms will require considerable re-engineering of codes written for the current supercomputing platforms. Regarding the physics, accurate methods for transferring mass, momentum and energy from the dispersed phase onto the carrier fluid grid have so far eluded modelers. Significant challenges also lie at the intersection between these two categories. To be competitive, any physics model must be expressible in a parallel algorithm that performs well on evolving computer platforms. This work created an application based on a software architecture where the physics and software concerns are separated in a way that adds flexibility to both. The developed spray-tracking package includes an application programming interface (API) that abstracts away the platform-dependent parallelization concerns, enabling the scientific programmer to write serial code that the API resolves into parallel processes and threads of execution. The project also developed the infrastructure required to provide similar APIs to other applications. The API allows object-oriented Fortran applications direct interaction with Trilinos to support memory management of distributed objects on central processing unit (CPU) and graphics processing unit (GPU) nodes for applications using C++.
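
    The separation described, serial-looking physics code resolved into parallel execution by the API, can be sketched in miniature. The toy Python example below uses a hypothetical ParallelMap facade (the actual package is object-oriented Fortran over Trilinos): the scientific programmer writes a per-droplet kernel with no parallel constructs, and the facade decides how to spread it over worker processes:

        from multiprocessing import Pool

        def droplet_update(state):
            """Serial physics kernel written by the scientist: advance one droplet."""
            position, velocity = state
            dt = 1e-3
            return (position + velocity * dt, velocity)  # trivial advection step

        class ParallelMap:
            """Hypothetical facade hiding process management behind a serial-looking map."""
            def __init__(self, nworkers=4):
                self.nworkers = nworkers

            def map(self, kernel, states):
                with Pool(self.nworkers) as pool:
                    return pool.map(kernel, states)

        if __name__ == "__main__":
            droplets = [(float(i), 1.0) for i in range(1000)]
            api = ParallelMap()
            droplets = api.map(droplet_update, droplets)  # looks serial, runs in parallel
            print(droplets[:3])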

  5. A requirements specification for a software design support system

    Science.gov (United States)

    Noonan, Robert E.

    1988-01-01

    Most existing software design systems (SDSS) support the use of only a single design methodology. A good SDSS should support a wide variety of design methods and languages including structured design, object-oriented design, and finite state machines. It might seem that a multiparadigm SDSS would be expensive in both time and money to construct. However, it is proposed instead that an extensible SDSS that directly implements only minimal database and graphical facilities be constructed. In particular, it should not directly implement tools to facilitate language definition and analysis. It is believed that such a system could be rapidly developed and put into limited production use, with the experience gained used to refine and evolve the system over time.

  6. Artificial Intelligence Software Engineering (AISE) model

    Science.gov (United States)

    Kiss, Peter A.

    1990-01-01

    The American Institute of Aeronautics and Astronautics has initiated a committee on standards for Artificial Intelligence. Presented are the initial efforts of one of the working groups of that committee. A candidate model is presented for the development life cycle of knowledge based systems (KBSs). The intent is for the model to be used by the aerospace community and eventually be evolved into a standard. The model is rooted in the evolutionary model, borrows from the spiral model, and is embedded in the standard Waterfall model for software development. Its intent is to satisfy the development of both stand-alone and embedded KBSs. The phases of the life cycle are shown and detailed as are the review points that constitute the key milestones throughout the development process. The applicability and strengths of the model are discussed along with areas needing further development and refinement by the aerospace community.

  7. Maintaining and improving of the training program on the analysis software in CMS

    International Nuclear Information System (INIS)

    Malik, S; Hoehle, F; Lassila-Perini, K; Hinzmann, A; Wolf, R; Shipsey, I

    2012-01-01

    Since 2009, the CMS experiment at LHC has provided intensive training on the use of Physics Analysis Tools (PAT), a collection of common analysis tools designed to share expertise and maximize productivity in the physics analysis. More than ten one-week courses preceded by prerequisite studies have been organized and the feedback from the participants has been carefully analyzed. This note describes how the training team designs, maintains and improves the course contents based on the feedback, the evolving analysis practices and the software development.

  8. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  9. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.
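
    The SAGE requirements language itself is not reproduced in the record; the sketch below illustrates the underlying mechanism generically, with hypothetical constraints expressed as Python predicates that are checked against each observed state of the executing program:

        # Hypothetical requirement constraints, stand-ins for a SAGE specification.
        CONSTRAINTS = {
            "tank_level_bounded": lambda s: 0.0 <= s["tank_level"] <= 100.0,
            "pump_off_when_empty": lambda s: not (s["tank_level"] == 0.0 and s["pump_on"]),
        }

        def check(state):
            """Return the names of all constraints violated by one observed state."""
            return [name for name, pred in CONSTRAINTS.items() if not pred(state)]

        # Simulated execution trace of the monitored program.
        trace = [
            {"tank_level": 50.0, "pump_on": True},
            {"tank_level": 0.0, "pump_on": True},    # violates pump_off_when_empty
            {"tank_level": -3.0, "pump_on": False},  # violates tank_level_bounded
        ]

        for step, state in enumerate(trace):
            for violation in check(state):
                print(f"step {step}: constraint violated: {violation}")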

  10. A new evolutionary system for evolving artificial neural networks.

    Science.gov (United States)

    Yao, X; Liu, Y

    1997-01-01

    This paper presents a new evolutionary system, i.e., EPNet, for evolving artificial neural networks (ANNs). The evolutionary algorithm used in EPNet is based on Fogel's evolutionary programming (EP). Unlike most previous studies on evolving ANNs, this paper puts its emphasis on evolving ANN behaviors. Five mutation operators proposed in EPNet reflect such an emphasis on evolving behaviors. Close behavioral links between parents and their offspring are maintained by various mutations, such as partial training and node splitting. EPNet evolves ANN architectures and connection weights (including biases) simultaneously in order to reduce the noise in fitness evaluation. The parsimony of evolved ANNs is encouraged by preferring node/connection deletion to addition. EPNet has been tested on a number of benchmark problems in machine learning and ANNs, such as the parity problem, the medical diagnosis problems, the Australian credit card assessment problem, and the Mackey-Glass time series prediction problem. The experimental results show that EPNet can produce very compact ANNs with good generalization ability in comparison with other algorithms.
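
    A heavily simplified sketch of this style of algorithm: an evolutionary-programming loop that mutates both connection weights and architecture, preferring node deletion to addition for parsimony. The toy below evolves single-hidden-layer networks on XOR; EPNet's five operators, partial training and node splitting are not reproduced:

        import numpy as np

        rng = np.random.default_rng(0)
        X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
        y = np.array([0.0, 1.0, 1.0, 0.0])

        def make_net(hidden):
            return {"W1": rng.normal(0, 1, (2, hidden)), "b1": np.zeros(hidden),
                    "W2": rng.normal(0, 1, hidden), "b2": 0.0}

        def forward(net, X):
            h = np.tanh(X @ net["W1"] + net["b1"])
            return 1.0 / (1.0 + np.exp(-(h @ net["W2"] + net["b2"])))

        def fitness(net):
            return -np.mean((forward(net, X) - y) ** 2)  # higher is better

        def mutate(net):
            child = {k: np.copy(v) for k, v in net.items()}
            hidden = child["b1"].size
            op = rng.random()
            if op < 0.6 or hidden <= 1:   # weight perturbation (most common)
                child["W1"] += rng.normal(0, 0.3, child["W1"].shape)
                child["W2"] += rng.normal(0, 0.3, child["W2"].shape)
            elif op < 0.9:                # node deletion, preferred for parsimony
                i = rng.integers(hidden)
                child["W1"] = np.delete(child["W1"], i, axis=1)
                child["b1"] = np.delete(child["b1"], i)
                child["W2"] = np.delete(child["W2"], i)
            else:                         # node addition, used sparingly
                child["W1"] = np.hstack([child["W1"], rng.normal(0, 1, (2, 1))])
                child["b1"] = np.append(child["b1"], 0.0)
                child["W2"] = np.append(child["W2"], rng.normal())
            return child

        pop = [make_net(int(rng.integers(2, 6))) for _ in range(20)]
        for gen in range(200):
            children = [mutate(p) for p in pop]
            pop = sorted(pop + children, key=fitness, reverse=True)[:20]  # (mu+lambda)
        print("best MSE:", -fitness(pop[0]), "hidden nodes:", pop[0]["b1"].size)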

  11. Migrating C/C++ Software to Mobile Platforms in the ADM Context

    Directory of Open Access Journals (Sweden)

    Liliana Martinez

    2017-03-01

    Full Text Available Software technology is constantly evolving and therefore the development of applications requires adapting software components and applications in order to be aligned to new paradigms such as Pervasive Computing, Cloud Computing and the Internet of Things. In particular, many desktop software components need to be migrated to mobile technologies. This migration faces many challenges due to the proliferation of different mobile platforms. Developers usually make applications tailored for each type of device, expending time and effort. As a result, new programming languages are emerging to integrate the native behaviors of the different platforms targeted in development projects. In this direction, the Haxe language allows writing mobile applications that target all major mobile platforms. Novel technical frameworks for information integration and tool interoperability, such as Architecture-Driven Modernization (ADM) proposed by the Object Management Group (OMG), can help to manage a huge diversity of mobile technologies. The Architecture-Driven Modernization Task Force (ADMTF) was formed to create specifications and promote industry consensus on the modernization of existing applications. In this work, we propose a migration process from C/C++ software to different mobile platforms that integrates ADM standards with Haxe. We exemplify the different steps of the process with a simple case study, the migration of a C++ Mandelbrot-set application. The proposal was validated in the Eclipse Modeling Framework, considering that some of its tools and run-time environments are aligned with ADM standards.

  12. Software cost/resource modeling: Software quality tradeoff measurement

    Science.gov (United States)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  13. Model based development of cruise control for Mercedes-Benz trucks; Modellbasierte Entwicklung eines Tempomat fuer Mercedes-Benz Trucks

    Energy Technology Data Exchange (ETDEWEB)

    Wuensche, M. [VDI, Berlin (Germany); Elser, J.; Dorner, J. [DaimlerChrysler AG, Stuttgart (Germany); Wahner, U.; Kanamueller, B. [MathWorks GmbH, Muenchen (Germany)

    2005-07-01

    It was necessary to reengineer the cruise control of Mercedes-Benz Trucks for its worldwide use in commercial vehicles of the DaimlerChrysler AG. For this extensive task a new software development process of model-based function development and automatic serial code generation was installed and used in exemplary fashion. Key aspects of this process are the use of software-in-the-loop and hardware-in-the-loop simulation technologies to ensure high software quality throughout the whole cycle. The simulation and modeling tool chain consists of Matlab, Simulink and Embedded Coder; the project was therefore realized with the assistance of the consulting department of The MathWorks Inc. (orig.)

  14. EVOLVE 2014 International Conference

    CERN Document Server

    Tantar, Emilia; Sun, Jian-Qiao; Zhang, Wei; Ding, Qian; Schütze, Oliver; Emmerich, Michael; Legrand, Pierrick; Moral, Pierre; Coello, Carlos

    2014-01-01

    This volume contains research articles that were presented at the EVOLVE 2014 International Conference in Beijing, China, July 1–4, 2014. The book gathers contributions that emerged from the conference tracks, ranging from probability to set-oriented numerics and evolutionary computation, all complemented by the bridging purpose of the conference, e.g. Complex Networks and Landscape Analysis, or by the more application-oriented perspective. The novelty of the volume, when considering the EVOLVE series, comes from also targeting the practitioner's view. This is supported by the Machine Learning Applied to Networks and Practical Aspects of Evolutionary Algorithms tracks, providing surveys of new application areas, as in the networking area, and useful insights into the development of evolutionary techniques from a practitioner's perspective. Complementary to these directions, the conference tracks supporting the volume follow on the individual advancements of the subareas constituting the scope of the confe...

  15. Calculation Software versus Illustration Software for Teaching Statistics

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl; Boyle, Robin G.

    1999-01-01

    As personal computers have become more and more powerful, so have the software packages available to us for teaching statistics. This paper investigates what software packages are currently being used by progressive statistics instructors at university level, examines some of the deficiencies of such software, and indicates features that statistics instructors wish to have incorporated in software in the future. The basis of the paper is a survey of participants at ICOTS-5 (the Fifth International Conference on Teaching Statistics). These survey results, combined with the software-based papers...

  16. Software system safety

    Science.gov (United States)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  17. Repository-based software engineering program: Concept document

    Science.gov (United States)

    1992-01-01

    This document provides the context for Repository-Based Software Engineering's (RBSE's) evolving functional and operational product requirements, and it is the parent document for development of detailed technical and management plans. When furnished, requirements documents will serve as the governing RBSE product specification. The RBSE Program Management Plan will define resources, schedules, and technical and organizational approaches to fulfilling the goals and objectives of this concept. The purpose of this document is to provide a concise overview of RBSE, describe the rationale for the RBSE Program, and define a clear, common vision for RBSE team members and customers. The document also provides the foundation for developing RBSE user and system requirements and a corresponding Program Management Plan. The concept is used to express the program mission to RBSE users and managers and to provide an exhibit for community review.

  18. The Orion GN and C Data-Driven Flight Software Architecture for Automated Sequencing and Fault Recovery

    Science.gov (United States)

    King, Ellis; Hart, Jeremy; Odegard, Ryan

    2010-01-01

    The Orion Crew Exploration Vehicle (CEV) is being designed to include significantly more automation capability than either the Space Shuttle or the International Space Station (ISS). In particular, the vehicle flight software has requirements to accommodate increasingly automated missions throughout all phases of flight. A data-driven flight software architecture will provide an evolvable automation capability to sequence through Guidance, Navigation & Control (GN&C) flight software modes and configurations while maintaining the required flexibility and human control over the automation. This flexibility is a key aspect needed to address the maturation of operational concepts, to permit ground and crew operators to gain trust in the system and to mitigate unpredictability in human spaceflight. To allow for mission flexibility and reconfigurability, a data-driven approach is being taken to load the mission event plan as well as the flight software artifacts associated with the GN&C subsystem. A database of GN&C-level sequencing data is presented which manages and tracks the mission-specific and algorithm parameters to provide a capability to schedule GN&C events within mission segments. The flight software data schema for performing automated mission sequencing is presented with a concept of operations for interactions with ground and onboard crew members. A prototype architecture for fault identification, isolation and recovery interactions with the automation software is presented and discussed as a forward work item.
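
    A toy sketch of the data-driven idea, with hypothetical mode names and table format rather than the actual Orion schema: the sequence of GN&C modes is loaded as data, so a mission-specific change touches only the table, not the compiled flight code:

        # Hypothetical mission-sequencing table, loaded as data rather than compiled in.
        # Each entry: current mode, the trigger event that advances it, and the next mode.
        SEQUENCE_TABLE = [
            {"mode": "PRELAUNCH",   "advance_on": "liftoff",      "next": "ASCENT"},
            {"mode": "ASCENT",      "advance_on": "meco",         "next": "ORBIT_COAST"},
            {"mode": "ORBIT_COAST", "advance_on": "deorbit_burn", "next": "ENTRY"},
            {"mode": "ENTRY",       "advance_on": "chute_deploy", "next": "DESCENT"},
        ]

        def run_sequencer(events):
            """Step through GN&C modes as trigger events arrive; other events are ignored."""
            index = 0
            print("mode:", SEQUENCE_TABLE[index]["mode"])
            for event in events:
                if index < len(SEQUENCE_TABLE) and event == SEQUENCE_TABLE[index]["advance_on"]:
                    print("mode:", SEQUENCE_TABLE[index]["next"])
                    index += 1

        run_sequencer(["liftoff", "telemetry_tick", "meco", "deorbit_burn", "chute_deploy"])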

  19. Mentoring: An Evolving Relationship.

    Science.gov (United States)

    Block, Michelle; Florczak, Kristine L

    2017-04-01

    The column concerns itself with mentoring as an evolving relationship between mentor and mentee. The collegiate mentoring model, the transformational transcendence model, and the humanbecoming mentoring model are considered in light of a dialogue with mentors at a Midwest university and conclusions are drawn.

  20. Plagiarism in the Context of Education and Evolving Detection Strategies.

    Science.gov (United States)

    Gasparyan, Armen Yuri; Nurmashev, Bekaidar; Seksenbayev, Bakhytzhan; Trukhachev, Vladimir I; Kostyukova, Elena I; Kitas, George D

    2017-08-01

    Plagiarism may take place in any scientific journals despite currently employed anti-plagiarism tools. The absence of widely acceptable definitions of research misconduct and reliance solely on similarity checks do not allow journal editors to prevent most complex cases of recycling of scientific information and wasteful, or 'predatory,' publishing. This article analyses Scopus-based publication activity and evidence on poor writing, lack of related training, emerging anti-plagiarism strategies, and new forms of massive wasting of resources by publishing largely recycled items, which evade the 'red flags' of similarity checks. In some non-Anglophone countries 'copy-and-paste' writing still plagues pre- and postgraduate education. Poor research management, absence of courses on publication ethics, and limited access to quality sources confound plagiarism as a cross-cultural and multidisciplinary phenomenon. Over the past decade, the advent of anti-plagiarism software checks has helped uncover elementary forms of textual recycling across journals. But such a tool alone proves inefficient for preventing complex forms of plagiarism. Recent mass retractions of plagiarized articles by reputable open-access journals point to critical deficiencies of current anti-plagiarism software that do not recognize manipulative paraphrasing and editing. Manipulative editing also finds its way to predatory journals, ignoring the adherence to publication ethics and accommodating nonsense plagiarized items. The evolving preventive strategies are increasingly relying on intelligent (semantic) digital technologies, comprehensively evaluating texts, keywords, graphics, and reference lists. It is the right time to enforce adherence to global editorial guidance and implement a comprehensive anti-plagiarism strategy by helping all stakeholders of scholarly communication. © 2017 The Korean Academy of Medical Sciences.

  1. Nonsynonymous substitution rate (Ka) is a relatively consistent parameter for defining fast-evolving and slow-evolving protein-coding genes

    Directory of Open Access Journals (Sweden)

    Wang Lei

    2011-02-01

    Full Text Available Abstract Background Mammalian genome sequence data are being acquired in large quantities and at enormous speeds. We now have a tremendous opportunity to better understand which genes are the most variable or conserved, and what their particular functions and evolutionary dynamics are, through comparative genomics. Results We chose human and eleven other high-coverage mammalian genome data–as well as an avian genome as an outgroup–to analyze orthologous protein-coding genes using nonsynonymous (Ka) and synonymous (Ks) substitution rates. After evaluating eight commonly-used methods of Ka and Ks calculation, we observed that these methods yielded a nearly uniform result when estimating Ka, but not Ks (or Ka/Ks). When sorting genes based on Ka, we noticed that fast-evolving and slow-evolving genes often belonged to different functional classes, with respect to species-specificity and lineage-specificity. In particular, we identified two functional classes of genes in the acquired immune system. Fast-evolving genes coded for signal-transducing proteins, such as receptors, ligands, cytokines, and CDs (cluster of differentiation), mostly surface proteins, whereas the slow-evolving genes were for function-modulating proteins, such as kinases and adaptor proteins. In addition, among slow-evolving genes that had functions related to the central nervous system, neurodegenerative disease-related pathways were enriched significantly in most mammalian species. We also confirmed that gene expression was negatively correlated with evolution rate, i.e. slow-evolving genes were expressed at higher levels than fast-evolving genes. Our results indicated that the functional specializations of the three major mammalian clades were: sensory perception and oncogenesis in primates, reproduction and hormone regulation in large mammals, and immunity and angiotensin in rodents. Conclusion Our study suggests that Ka calculation, which is less biased compared to Ks and Ka
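
    For reference, the two rates compared throughout the study are, in the simplest (Nei-Gojobori-style) counting scheme:

        K_a = \frac{N_d}{N}, \qquad K_s = \frac{S_d}{S}

    where N_d and S_d are the observed numbers of nonsynonymous and synonymous substitutions, and N and S are the numbers of nonsynonymous and synonymous sites in the aligned coding sequences; a Ka/Ks ratio above 1 suggests positive selection, and below 1 purifying selection. Practical estimators add corrections for multiple substitutions, which is one place where the eight compared methods differ.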

  2. Software framework for automatic learning of telescope operation

    Science.gov (United States)

    Rodríguez, Jose A.; Molgó, Jordi; Guerra, Dailos

    2016-07-01

    The "Gran Telescopio de Canarias" (GTC) is an optical-infrared 10-meter segmented mirror telescope at the ORM observatory in Canary Islands (Spain). The GTC Control System (GCS) is a distributed object and component oriented system based on RT-CORBA and it is responsible for the operation of the telescope, including its instrumentation. The current development state of GCS is mature and fully operational. On the one hand telescope users as PI's implement the sequences of observing modes of future scientific instruments that will be installed in the telescope and operators, in turn, design their own sequences for maintenance. On the other hand engineers develop new components that provide new functionality required by the system. This great work effort is possible to minimize so that costs are reduced, especially if one considers that software maintenance is the most expensive phase of the software life cycle. Could we design a system that allows the progressive assimilation of sequences of operation and maintenance of the telescope, through an automatic self-programming system, so that it can evolve from one Component oriented organization to a Service oriented organization? One possible way to achieve this is to use mechanisms of learning and knowledge consolidation to reduce to the minimum expression the effort to transform the specifications of the different telescope users to the operational deployments. This article proposes a framework for solving this problem based on the combination of the following tools: data mining, self-Adaptive software, code generation, refactoring based on metrics, Hierarchical Agglomerative Clustering and Service Oriented Architectures.

  3. Interactively Evolving Compositional Sound Synthesis Networks

    DEFF Research Database (Denmark)

    Jónsson, Björn Þór; Hoover, Amy K.; Risi, Sebastian

    2015-01-01

    While the success of electronic music often relies on the uniqueness and quality of selected timbres, many musicians struggle with complicated and expensive equipment and techniques to create their desired sounds. Instead, this paper presents a technique for producing novel timbres that are evolved... the space of potential sounds that can be generated through such compositional sound synthesis networks (CSSNs). To study the effect of evolution on subjective appreciation, participants in a listener study ranked evolved timbres by personal preference, resulting in preferences skewed toward the first...

  4. Avoidable Software Procurements

    Science.gov (United States)

    2012-09-01

    software license, software usage, ELA, Software as a Service (SaaS), Software Asset Management (SAM)... With the delivery of full Cloud Services, we will see the transition of the Cloud Computing service model from IaaS to SaaS, or Software as a Service. Software

  5. Evolving effective incremental SAT solvers with GP

    OpenAIRE

    Bader, Mohamed; Poli, R.

    2008-01-01

    Hyper-heuristics can simply be defined as heuristics for choosing other heuristics; they are a way of combining existing heuristics to generate new ones. Here, a hyper-heuristic framework is used for evolving effective incremental (Inc*) solvers for SAT. We test the evolved heuristics (IncHH) against other known local search heuristics on a variety of benchmark SAT problems.

  6. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  7. Software R&D for Next Generation of HEP Experiments, Inspired by Theano

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    In the next decade, the frontiers of High Energy Physics (HEP) will be explored by three machines: the High Luminosity Large Hadron Collider (HL-LHC) in Europe, the Long Base Neutrino Facility (LBNF) in the US, and the International Linear Collider (ILC) in Japan. These next generation experiments must address two fundamental problems in the current generation of HEP experimental software: the inability to take advantage of and adapt to the rapidly evolving processor landscape, and the difficulty in developing and maintaining increasingly complex software systems by physicists. I will propose a strategy, inspired by the automatic optimization and code generation in Theano, to simultaneously address both problems. I will describe three R&D projects with short-term physics deliverables aimed at developing this strategy. The first project is to develop a maximally sensitive General Search for New Physics at the LHC by applying the Matrix Element Method running on GPUs of HPCs. The second is to classify and reconstru...

  8. Radiation and environmental data analysis computer (REDAC) hardware, software and analysis procedures

    International Nuclear Information System (INIS)

    Hendricks, T.J.

    1985-01-01

    The REDAC was conceived originally as a tape verifier for the Radiation and Environmental Data Acquisition Recorder (REDAR). From that simple beginning in 1971, the REDAC has evolved into a family of systems used for complete analysis of data obtained by the REDAR and other acquisition systems. Portable or mobile REDACs are deployed to support checkout and analysis tasks in the field. Laboratory systems are additionally used for software development, physics investigations, data base management and graphics. System configurations range from man-portable systems to a large laboratory-based system which supports time-shared analysis and development tasks. Custom operating software allows the analyst to process data either interactively or by batch procedures. Analysis packages are provided for numerous necessary functions. All these analysis procedures can be performed even on the smallest man-portable REDAC. Examples of the multi-isotope stripping and radiation isopleth mapping are presented. Techniques utilized for these operations are also presented

  9. Why Replacing Legacy Systems Is So Hard in Global Software Development: An Information Infrastructure Perspective

    DEFF Research Database (Denmark)

    Matthiesen, Stina; Bjørn, Pernille

    2015-01-01

    We report on an ethnographic study of an outsourcing global software development (GSD) setup between a Danish IT company and an Indian IT vendor developing a system to replace a legacy system for social services administration in Denmark. Physical distance and GSD collaboration issues tend to be obvious explanations for why GSD tasks fail to reach completion; however, we account for the difficulties within the technical nature of the software system task. We use the framework of information infrastructure to show how replacing a legacy system in governmental information infrastructures includes the work of tracing back to knowledge concerning law, technical specifications, as well as how information infrastructures have dynamically evolved over time. Not easily carried out in a GSD setup is the work around technical tasks that requires careful examination of mundane technical aspects, standards...

  10. Free software, Open source software, licenses. A short presentation including a procedure for research software and data dissemination

    OpenAIRE

    Gomez-Diaz , Teresa

    2014-01-01

    4 pages. Spanish version: "Software libre, software de código abierto, licencias. Donde se propone un procedimiento de distribución de software y datos de investigación" (Free software, open source software, licenses: in which a procedure for distributing research software and data is proposed). The main goal of this document is to help the research community understand the basic concepts of software distribution: free software, open source software, licenses. This document also includes a procedure for research software and data dissemination.

  11. Towards a general object-oriented software development methodology

    Science.gov (United States)

    Seidewitz, ED; Stark, Mike

    1986-01-01

    Object diagrams were used to design a 5000-statement team training exercise and to design the entire dynamics simulator. The object diagrams are also being used to design another 50,000-statement Ada system and a personal computer based system that will be written in Modula II. The design methodology evolves out of these experiences as well as the limitations of other methods that were studied. Object diagrams, abstraction analysis, and associated principles provide a unified framework which encompasses concepts from Yourdon, Booch, and Cherry. This general object-oriented approach handles high level system design, possibly with concurrency, through object-oriented decomposition down to a completely functional level. How object-oriented concepts can be used in other phases of the software life cycle, such as specification and testing, is being studied concurrently.

  12. Evolving Intelligent Systems Methodology and Applications

    CERN Document Server

    Angelov, Plamen; Kasabov, Nik

    2010-01-01

    From theory to techniques, the first all-in-one resource for EIS. There is a clear demand in advanced process industries, defense, and Internet and communication (VoIP) applications for intelligent yet adaptive/evolving systems. Evolving Intelligent Systems is the first self- contained volume that covers this newly established concept in its entirety, from a systematic methodology to case studies to industrial applications. Featuring chapters written by leading world experts, it addresses the progress, trends, and major achievements in this emerging research field, with a strong emphasis on th

  13. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility.

    Science.gov (United States)

    Zaballos, Agustín; Navarro, Joan; Martín De Pozuelo, Ramon

    2018-02-28

    Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid's data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. The experiments conducted endorse the feasibility of this solution and encourage practitioners to direct their efforts in this direction.

  14. Methods Evolved by Observation

    Science.gov (United States)

    Montessori, Maria

    2016-01-01

    Montessori's idea of the child's nature and the teacher's perceptiveness begins with amazing simplicity, and when she speaks of "methods evolved," she is unveiling a methodological system for observation. She begins with the early childhood explosion into writing, which is a familiar child phenomenon that Montessori has written about…

  15. Impact of Agile Software Development Model on Software Maintainability

    Science.gov (United States)

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burdens tightly budgeted information technology (IT) organizations. Agile software development approach delivers business value early, but implications on software maintainability are still unknown. The purpose of this quantitative study…

  16. Open Source-Based Software Engineering to Assist the Implementation of Information Systems Audits

    Directory of Open Access Journals (Sweden)

    Hari Setiabudi Husni

    2010-05-01

    Full Text Available This research was conducted in one budget period in 2009, funded by a DIKTI young lecturer research project grant. The main research location was Bina Nusantara University. Due to a tight research schedule, it was necessary to take some strategic steps to fulfill the research goals. One of these steps was to invite several experts from the software industry to give advice regarding open source software engineering issues. The first achievement was the identification of some open source software that could assist in auditing information systems. Afterwards, a comparison of technical and functional aspects identified the best software to test for implementation and usage, namely ZenossCore. The final output of this research is the successful reengineering of the source code for virtual file testing. Keywords: open source software, audit implementation, information system

  17. Managing mapping data using commercial data base management software.

    Science.gov (United States)

    Elassal, A.A.

    1985-01-01

    Electronic computers are involved in almost every aspect of the map making process. This involvement has become so thorough that it is practically impossible to find a recently developed process or device in the mapping field which does not employ digital processing in some form or another. This trend, which has been evolving over two decades, is accelerated by the significant improvements in capability, reliability, and cost-effectiveness of electronic devices. Computerized mapping processes and devices share a common need for machine readable data. Integrating groups of these components into automated mapping systems requires careful planning for data flow amongst them. Exploring the utility of commercial data base management software to assist in this task is the subject of this paper. -Author

  18. Software design practice using two SCADA software packages

    DEFF Research Database (Denmark)

    Basse, K.P.; Christensen, Georg Kronborg; Frederiksen, P. K.

    1996-01-01

    Typical software development for manufacturing control is done either by specialists with considerable real-time programming experience or by the adaptation of standard software packages for manufacturing control. After investigation and test of two commercial software packages, "InTouch" and "Fix", it is argued that a more efficient software solution can be achieved by utilising an integrated specification for SCADA and PLC-programming. Experience gained from process control is planned to be investigated for discrete parts manufacturing…

  19. Evolving artificial metalloenzymes via random mutagenesis

    Science.gov (United States)

    Yang, Hao; Swartz, Alan M.; Park, Hyun June; Srivastava, Poonam; Ellis-Guardiola, Ken; Upp, David M.; Lee, Gihoon; Belsare, Ketaki; Gu, Yifan; Zhang, Chen; Moellering, Raymond E.; Lewis, Jared C.

    2018-03-01

    Random mutagenesis has the potential to optimize the efficiency and selectivity of protein catalysts without requiring detailed knowledge of protein structure; however, introducing synthetic metal cofactors complicates the expression and screening of enzyme libraries, and activity arising from free cofactor must be eliminated. Here we report an efficient platform to create and screen libraries of artificial metalloenzymes (ArMs) via random mutagenesis, which we use to evolve highly selective dirhodium cyclopropanases. Error-prone PCR and combinatorial codon mutagenesis enabled multiplexed analysis of random mutations, including at sites distal to the putative ArM active site that are difficult to identify using targeted mutagenesis approaches. Variants that exhibited significantly improved selectivity for each of the cyclopropane product enantiomers were identified, and higher activity than previously reported ArM cyclopropanases obtained via targeted mutagenesis was also observed. This improved selectivity carried over to other dirhodium-catalysed transformations, including N-H, S-H and Si-H insertion, demonstrating that ArMs evolved for one reaction can serve as starting points to evolve catalysts for others.

  20. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo…

  1. Real-time development of data acquisition and analysis software for hands-on physiology education in neuroscience: G-PRIME.

    Science.gov (United States)

    Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R

    2009-01-01

    We report on the real-time creation of an application for hands-on neurophysiology in an advanced undergraduate teaching laboratory. Enabled by the rapid software development tools included in the Matlab technical computing environment (The Mathworks, Natick, MA), a team, consisting of a neurophysiology educator and a biophysicist trained as an electrical engineer, interfaced to a course of approximately 15 students from engineering and biology backgrounds. The result is the powerful freeware data acquisition and analysis environment, "g-PRIME." The software was developed from week to week in response to curriculum demands, and student feedback. The program evolved from a simple software oscilloscope, enabling RC circuit analysis, to a suite of tools supporting analysis of neuronal excitability and synaptic transmission analysis in invertebrate model systems. The program has subsequently expanded in application to university courses, research, and high school projects in the US and abroad as free courseware.

  2. Software Epistemology

    Science.gov (United States)

    2016-03-01

    …in-vitro decision to incubate a startup, Lexumo [7], which is developing a commercial Software as a Service (SaaS) vulnerability assessment… Acronyms used in the report: LTS, Label Transition System; MUSE, Mining and Understanding Software Enclaves; RTEMS, Real-Time Executive for Multi-processor Systems; SaaS, Software as a Service; SSA, Static Single Assignment; SWE, Software Epistemology; UD/DU, Def-Use/Use-Def Chains (dataflow graph).

  3. The evolution of resource adaptation: how generalist and specialist consumers evolve.

    Science.gov (United States)

    Ma, Junling; Levin, Simon A

    2006-07-01

    Why and how specialist and generalist strategies evolve are important questions in evolutionary ecology. In this paper, with the method of adaptive dynamics and evolutionary branching, we identify conditions that select for specialist and generalist strategies. Generally, generalist strategies evolve if there is a switching benefit; specialists evolve if there is a switching cost. If the switching cost is large, specialists always evolve. If the switching cost is small, even though the consumer will first evolve toward a generalist strategy, it will eventually branch into two specialists.
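
    To make the switching-cost intuition concrete, the following Python sketch evolves a population of consumer strategies under disruptive or stabilizing selection; the strategy encoding and the fitness function are invented for illustration and are not the authors' adaptive-dynamics model.

```python
import random

# Toy illustration of the switching-cost intuition (not the authors' model):
# a consumer strategy x in [0, 1] is the fraction of effort spent on
# resource A; cost > 0 penalizes mixed strategies (switching cost),
# cost < 0 rewards them (switching benefit).

def fitness(x, cost):
    # Hypothetical fitness: equal resource payoffs plus a penalty/bonus
    # for mixing, which is largest at x = 0.5.
    return 1.0 - cost * 4.0 * x * (1.0 - x)

def evolve(cost, generations=2000, pop_size=200, mut_sd=0.02):
    pop = [0.5] * pop_size  # start as pure generalists
    for _ in range(generations):
        # Mutate, then select proportionally to fitness.
        offspring = [min(1.0, max(0.0, random.gauss(x, mut_sd))) for x in pop]
        weights = [fitness(x, cost) for x in offspring]
        pop = random.choices(offspring, weights=weights, k=pop_size)
    return pop

specialists = evolve(cost=+0.5)  # switching cost: strategies pushed toward 0 and 1
generalists = evolve(cost=-0.5)  # switching benefit: strategies stay near 0.5
```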

  4. Ranking in evolving complex networks

    Science.gov (United States)

    Liao, Hao; Mariani, Manuel Sebastian; Medo, Matúš; Zhang, Yi-Cheng; Zhou, Ming-Yang

    2017-05-01

    Complex networks have emerged as a simple yet powerful framework to represent and analyze a wide range of complex systems. The problem of ranking the nodes and the edges in complex networks is critical for a broad range of real-world problems because it affects how we access online information and products, how success and talent are evaluated in human activities, and how scarce resources are allocated by companies and policymakers, among others. This calls for a deep understanding of how existing ranking algorithms perform, and which are their possible biases that may impair their effectiveness. Many popular ranking algorithms (such as Google's PageRank) are static in nature and, as a consequence, they exhibit important shortcomings when applied to real networks that rapidly evolve in time. At the same time, recent advances in the understanding and modeling of evolving networks have enabled the development of a wide and diverse range of ranking algorithms that take the temporal dimension into account. The aim of this review is to survey the existing ranking algorithms, both static and time-aware, and their applications to evolving networks. We emphasize both the impact of network evolution on well-established static algorithms and the benefits from including the temporal dimension for tasks such as prediction of network traffic, prediction of future links, and identification of significant nodes.
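
    As a rough illustration of how the temporal dimension can enter a PageRank-style algorithm, the sketch below down-weights links by age before running power iteration; the exponential decay and all parameter values are illustrative assumptions, not a specific algorithm from the review.

```python
import math
from collections import defaultdict

# Minimal sketch of one time-aware ranking idea: discount old links
# before PageRank-style power iteration. Edges are (source, target, time).

def time_weighted_pagerank(edges, now, decay=0.1, d=0.85, iters=50):
    weights = defaultdict(dict)
    nodes = set()
    for src, dst, t in edges:
        w = math.exp(-decay * (now - t))  # older links count less
        weights[src][dst] = weights[src].get(dst, 0.0) + w
        nodes |= {src, dst}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1 - d) / len(nodes) for n in nodes}
        for src, outs in weights.items():
            total = sum(outs.values())
            for dst, w in outs.items():
                new[dst] += d * rank[src] * w / total
        # Dangling nodes: redistribute their rank mass uniformly.
        dangling = sum(rank[n] for n in nodes if n not in weights)
        for n in nodes:
            new[n] += d * dangling / len(nodes)
        rank = new
    return rank

print(time_weighted_pagerank([("a", "b", 9.0), ("b", "c", 1.0)], now=10.0))
```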

  5. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  6. Automated Creation of Datamarts from a Clinical Data Warehouse, Driven by an Active Metadata Repository

    Science.gov (United States)

    Rogerson, Charles L.; Kohlmiller, Paul H.; Stutman, Harris

    1998-01-01

    A methodology and toolkit are described which enable the automated metadata-driven creation of datamarts from clinical data warehouses. The software uses schema-to-schema transformation driven by an active metadata repository. Tools for assessing datamart data quality are described, as well as methods for assessing the feasibility of implementing specific datamarts. A methodology for data remediation and the re-engineering of operational data capture is described.
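
    The following Python fragment sketches what metadata-driven, schema-to-schema transformation can look like in miniature; the mapping structure, column names and filter are invented for illustration and are not the toolkit described in the paper.

```python
# Toy sketch of a metadata-driven datamart build: the metadata record
# names the source table, the column mapping, and a row filter.

metadata = {
    "datamart": "cardiology",
    "source_table": "encounters",
    "columns": {  # target column -> source column
        "patient_id": "pat_id",
        "visit_date": "enc_date",
        "diagnosis": "icd_code",
    },
    "filter": lambda row: row["dept"] == "CARD",  # rows for this datamart
}

def build_datamart(warehouse_rows, meta):
    out = []
    for row in warehouse_rows:
        if meta["filter"](row):
            out.append({tgt: row[src] for tgt, src in meta["columns"].items()})
    return out

rows = [
    {"pat_id": 1, "enc_date": "1998-01-05", "icd_code": "I21", "dept": "CARD"},
    {"pat_id": 2, "enc_date": "1998-01-06", "icd_code": "J45", "dept": "PULM"},
]
print(build_datamart(rows, metadata))
```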

  7. Adaptation of Escherichia coli to glucose promotes evolvability in lactose.

    Science.gov (United States)

    Phillips, Kelly N; Castillo, Gerardo; Wünsche, Andrea; Cooper, Tim F

    2016-02-01

    The selective history of a population can influence its subsequent evolution, an effect known as historical contingency. We previously observed that five of six replicate populations that were evolved in a glucose-limited environment for 2000 generations, then switched to lactose for 1000 generations, had higher fitness increases in lactose than populations started directly from the ancestor. To test if selection in glucose systematically increased lactose evolvability, we started 12 replay populations--six from a population subsample and six from a single randomly selected clone--from each of the six glucose-evolved founder populations. These replay populations and 18 ancestral populations were evolved for 1000 generations in a lactose-limited environment. We found that replay populations were initially slightly less fit in lactose than the ancestor, but were more evolvable, in that they increased in fitness at a faster rate and to higher levels. This result indicates that evolution in the glucose environment resulted in genetic changes that increased the potential of genotypes to adapt to lactose. Genome sequencing identified four genes--iclR, nadR, spoT, and rbs--that were mutated in most glucose-evolved clones and are candidates for mediating increased evolvability. Our results demonstrate that short-term selective costs during selection in one environment can lead to changes in evolvability that confer longer term benefits. © 2016 The Author(s). Evolution © 2016 The Society for the Study of Evolution.

  8. Evolving fuzzy rules for relaxed-criteria negotiation.

    Science.gov (United States)

    Sim, Kwang Mong

    2008-12-01

    In the literature on automated negotiation, very few negotiation agents are designed with the flexibility to slightly relax their negotiation criteria to reach a consensus more rapidly and with more certainty. Furthermore, these relaxed-criteria negotiation agents were not equipped with the ability to enhance their performance by learning and evolving their relaxed-criteria negotiation rules. The impetus of this work is designing market-driven negotiation agents (MDAs) that not only have the flexibility of relaxing bargaining criteria using fuzzy rules, but can also evolve their structures by learning new relaxed-criteria fuzzy rules to improve their negotiation outcomes as they participate in negotiations in more e-markets. To this end, an evolutionary algorithm for adapting and evolving relaxed-criteria fuzzy rules was developed. Implementing the idea in a testbed, two kinds of experiments for evaluating and comparing EvEMDAs (MDAs with relaxed-criteria rules that are evolved using the evolutionary algorithm) and EMDAs (MDAs with relaxed-criteria rules that are manually constructed) were carried out through stochastic simulations. Empirical results show that: 1) EvEMDAs generally outperformed EMDAs in different types of e-markets and 2) the negotiation outcomes of EvEMDAs generally improved as they negotiated in more e-markets.
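
    A drastically simplified Python sketch of the underlying idea, evolving the parameters of one criteria-relaxation rule against a simulated market payoff; the negotiation model, the (1+1)-style search, and the reduction of the fuzzy membership machinery to a crisp threshold are all illustrative assumptions, not the paper's EvEMDA algorithm or testbed.

```python
import random

# Toy model: a rule relaxes the acceptance threshold late in a session;
# its two parameters are evolved against average simulated deal utility.

def simulate_negotiation(relax_at, relax_amount, rng):
    deadline = 10
    reserve = 0.6  # minimum acceptable utility
    for step in range(deadline):
        offer = rng.uniform(0.3, 1.0)  # opponent's offer utility
        threshold = reserve
        if step >= relax_at:  # relaxation rule fires late in session
            threshold = reserve - relax_amount
        if offer >= threshold:
            return offer  # deal utility
    return 0.0  # no deal before the deadline

def evolve_rule(generations=200, trials=300, seed=1):
    rng = random.Random(seed)
    def score(rule):
        return sum(simulate_negotiation(*rule, rng) for _ in range(trials)) / trials
    best = (5, 0.1)  # (relax_at, relax_amount)
    best_score = score(best)
    for _ in range(generations):
        cand = (max(0, min(9, best[0] + rng.choice((-1, 0, 1)))),
                max(0.0, min(0.4, best[1] + rng.gauss(0, 0.02))))
        s = score(cand)
        if s > best_score:
            best, best_score = cand, s
    return best, best_score

print(evolve_rule())
```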

  9. Transforming Our SMEX Organization by Way of Innovation, Standardization, and Automation

    Science.gov (United States)

    Madden, Maureen; Crouse, Pat; Carry, Everett; Esposito, Timothy; Parker, Jeffrey; Bradley, David

    2006-01-01

    NASA's Small Explorer (SMEX) Flight Operations Team (FOT) is currently tackling the challenge of supporting ground operations for several satellites that have surpassed their designed lifetime and have a dwindling budget. At Goddard Space Flight Center (GSFC), these missions are presently being reengineered into a fleet-oriented ground system. When complete, this ground system will provide command and control of four SMEX missions, and will demonstrate fleet automation and control concepts as a pathfinder for additional mission integrations. A goal of this reengineering effort is to demonstrate new ground-system technologies that show promise of supporting longer mission lifecycles and simplifying component integration. In pursuit of this goal, the SMEX organization has had to examine standardization, innovation, and automation. A core technology being demonstrated in this effort is the GSFC Mission Services Evolution Center (GMSEC) architecture. The GMSEC architecture focuses on providing standard interfaces for ground system applications to promote application interoperability. Building around commercial Message Oriented Middleware and providing a common messaging standard allows GMSEC to provide the capabilities necessary to support integration of new software components into existing missions and increase the level of interaction within the system. For SMEX, GMSEC has become the technology platform to transform flight operations with the innovation and automation necessary to reduce operational costs. The automation technologies supported in SMEX are built upon capabilities provided by the GMSEC architecture that allow the FOT to further reduce the involvement of the console operator. Initially, SMEX is automating only routine operations, such as safety and health monitoring, basic commanding, and system recovery. The operational concepts being developed here will reduce the need for staffed passes and are a necessity for future fleet management. As this…
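
    The following toy publish/subscribe bus suggests, in Python, the kind of decoupling a standard message layer buys; GMSEC itself standardizes message schemas over commercial middleware, and the topic names and payload below are invented.

```python
from collections import defaultdict

# Toy message bus: components interact only through topics, never directly,
# so new components can be integrated without changing existing ones.

class MessageBus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for cb in self.subscribers[topic]:
            cb(message)

bus = MessageBus()
# A health-monitoring component reacts to telemetry without knowing the sender:
bus.subscribe("sat1.telemetry.health", lambda m: print("limit check:", m))
# A ground-system component publishes to the agreed topic:
bus.publish("sat1.telemetry.health", {"battery_v": 27.9, "mode": "SAFE"})
```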

  10. DrawCompileEvolve: Sparking interactive evolutionary art with human creations

    DEFF Research Database (Denmark)

    Zhang, Jinhong; Taarnby, Rasmus; Liapis, Antonios

    2015-01-01

    This paper presents DrawCompileEvolve, a web-based drawing tool which allows users to draw simple primitive shapes, group them together or define patterns in their groupings (e.g. symmetry, repetition). The user’s vector drawing is then compiled into an indirectly encoded genetic representation, which can be evolved interactively, allowing the user to change the image’s colors, patterns and ultimately transform it. The human artist has direct control while drawing the initial seed of an evolutionary run and indirect control while interactively evolving it, thus making DrawCompileEvolve a mixed…

  11. Software engineering laboratory series: Annotated bibliography of software engineering laboratory literature

    Science.gov (United States)

    Morusiewicz, Linda; Valett, Jon

    1992-01-01

    This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: (1) the Software Engineering Laboratory; (2) the Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. This document contains an index of these publications classified by individual author.

  12. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  13. Unidata's Vision for Transforming Geoscience by Moving Data Services and Software to the Cloud

    Science.gov (United States)

    Ramamurthy, Mohan; Fisher, Ward; Yoksas, Tom

    2015-04-01

    Universities are facing many challenges: shrinking budgets, rapidly evolving information technologies, exploding data volumes, multidisciplinary science requirements, and high expectations from students who have grown up with smartphones and tablets. These changes are upending traditional approaches to accessing and using data and software. Unidata recognizes that its products and services must evolve to support new approaches to research and education. After years of hype and ambiguity, cloud computing is maturing in usability in many areas of science and education, bringing the benefits of virtualized and elastic remote services to infrastructure, software, computation, and data. Cloud environments reduce the amount of time and money spent to procure, install, and maintain new hardware and software, and reduce costs through resource pooling and shared infrastructure. Cloud services aimed at providing any resource, at any time, from any place, using any device are increasingly being embraced by all types of organizations. Given this trend and the enormous potential of cloud-based services, Unidata is moving to augment its products, services, data delivery mechanisms and applications to align with the cloud-computing paradigm. Specifically, Unidata is working toward establishing a community-based development environment that supports the creation and use of software services to build end-to-end data workflows. The design encourages the creation of services that can be broken into small, independent chunks that provide simple capabilities. Chunks could be used individually to perform a task, or chained into simple or elaborate workflows. The services will also be portable in the form of downloadable Unidata-in-a-box virtual images, allowing their use in researchers' own cloud-based computing environments. In this talk, we present a vision for Unidata's future in cloud-enabled data services and discuss our ongoing efforts to deploy a suite of Unidata data…

  14. From Software Development to Software Assembly

    NARCIS (Netherlands)

    Sneed, Harry M.; Verhoef, Chris

    2016-01-01

    The lack of skilled programming personnel and the growing burden of maintaining customized software are forcing organizations to quit producing their own software. It's high time they turned to ready-made, standard components to fulfill their business requirements. Cloud services might be one way to…

  15. A Software Reuse Approach and Its Effect On Software Quality, An Empirical Study for The Software Industry

    OpenAIRE

    Mateen, Ahmed; Kausar, Samina; Sattar, Ahsan Raza

    2017-01-01

    Software reusability has attracted much interest because it increases quality and reduces cost. A good software reuse process enhances reliability, productivity and quality and reduces time and cost. Current reuse techniques focus on the reuse of software artifacts grounded in anticipated functionality, whereas the non-functional (quality) aspects are also important. Software reusability is therefore used here to expand the quality and productivity of software. It improves overall…

  16. Software quality assurance: in large scale and complex software-intensive systems

    NARCIS (Netherlands)

    Mistrik, I.; Soley, R.; Ali, N.; Grundy, J.; Tekinerdogan, B.

    2015-01-01

    Software Quality Assurance in Large Scale and Complex Software-intensive Systems presents novel and high-quality research approaches that relate the quality of software architecture to system requirements, system architecture and enterprise architecture, or software testing. Modern software…

  17. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  18. Computer-Aided Software Engineering - An approach to real-time software development

    Science.gov (United States)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.

  19. Model-Based Software Testing for Object-Oriented Software

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Model-based testing is one of the best solutions for testing object-oriented software. It has a better test coverage than other testing styles. Model-based testing takes into consideration behavioural aspects of a class, which are usually unchecked in other testing methods. An increase in the complexity of software has forced the software industry…

  20. Software Atom: An approach towards software components structuring to improve reusability

    Directory of Open Access Journals (Sweden)

    Muhammad Hussain Mughal

    2017-12-01

    Full Text Available The diversity of application domains compels the design of a sustainable classification scheme for the rapidly amassing software repository. Atomic reusable software components are articulated to improve software component reusability in a volatile industry. Numerous approaches to software classification have been proposed over past decades, each with limitations related to coupling and cohesion. In this paper, we propose a novel approach that constitutes software from radical functionalities to improve software reusability. We analyze the semantics of elements in the periodic table used in chemistry to design our classification approach, present it as a tree-based classification to curtail the search-space complexity of the software repository, and refine it further with semantic search techniques. We developed a Globally Unique Identifier (GUID) for indexing functions and related components, exploiting the correlation between chemical elements and software elements to simulate a one-to-one mapping between them. Our approach is inspired by the sustainability of the chemical periodic table. We propose a software periodic table (SPT) representing atomic software components extracted from real application software. Parsing and extraction over the SPT-classified repository tree enable users to program their software by customizing the ingredients of their software requirements. The classified repository of software ingredients assists users in conveying their requirements to software engineers and enables requirements engineers to develop rapid large-scale prototypes. Furthermore, we predict the usability of the categorized repository based on user feedback. The proposed repository will be continuously fine-tuned based on utilization, and the SPT will be gradually optimized with ant colony optimization techniques, ultimately helping to automate the software development process.
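
    As a toy Python rendering of the indexing idea, the sketch below files GUID-tagged components under functionality paths in a classification tree and retrieves them by path; the API and the category names are invented for illustration and are not the paper's SPT design.

```python
import uuid

# Toy component repository: each reusable component gets a globally
# unique identifier and is filed under a functionality path in a tree.

class Node:
    def __init__(self):
        self.children = {}
        self.components = []  # (guid, name) pairs filed at this category

class Repository:
    def __init__(self):
        self.root = Node()

    def register(self, path, name):
        node = self.root
        for part in path:  # e.g. ("io", "parsing", "csv")
            node = node.children.setdefault(part, Node())
        guid = str(uuid.uuid4())  # globally unique identifier
        node.components.append((guid, name))
        return guid

    def search(self, path):
        node = self.root
        for part in path:
            if part not in node.children:
                return []
            node = node.children[part]
        return node.components

repo = Repository()
repo.register(("io", "parsing", "csv"), "fast_csv_reader")
print(repo.search(("io", "parsing", "csv")))
```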

  1. Proceedings of the Fifth Triennial Software Quality Forum 2000, Software for the Next Millennium, Software Quality Forum

    Energy Technology Data Exchange (ETDEWEB)

    Scientific Software Engineering Group, CIC-12

    2000-04-01

    The Software Quality Forum is a triennial conference held by the Software Quality Assurance Subcommittee for the Department of Energy's Quality Managers. The forum centers on key issues, information, and technology important in software development for the Nuclear Weapons Complex. This year it will be opened up to include local information technology companies and software vendors presenting their solutions, ideas, and lessons learned. The Software Quality Forum 2000 will take on a more hands-on, instructional tone than those previously held. There will be an emphasis on providing information, tools, and resources to assist developers in their goal of producing next generation software.

  2. Evolving Procurement Organizations

    DEFF Research Database (Denmark)

    Bals, Lydia; Laine, Jari; Mugurusi, Godfrey

    Procurement has to find further levers and advance its contribution to corporate goals continuously. This places pressure on its organization in order to facilitate its performance. Therefore, procurement organizations constantly have to evolve in order to match these demands. A conceptual model … and external contingency factors and having a more detailed look at the structural dimensions chosen, beyond the well-known characteristics of centralization, formalization, participation, specialization, standardization and size. From a theoretical perspective, it opens up insights that can be leveraged…

  3. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  4. Laplacian Estrada and normalized Laplacian Estrada indices of evolving graphs.

    Science.gov (United States)

    Shang, Yilun

    2015-01-01

    Large-scale time-evolving networks have been generated by many natural and technological applications, posing challenges for computation and modeling. Thus, it is of theoretical and practical significance to probe mathematical tools tailored for evolving networks. In this paper, on top of the dynamic Estrada index, we study the dynamic Laplacian Estrada index and the dynamic normalized Laplacian Estrada index of evolving graphs. Using linear algebra techniques, we established general upper and lower bounds for these graph-spectrum-based invariants through a couple of intuitive graph-theoretic measures, including the number of vertices or edges. Synthetic random evolving small-world networks are employed to show the relevance of the proposed dynamic Estrada indices. It is found that neither the static snapshot graphs nor the aggregated graph can approximate the evolving graph itself, indicating the fundamental difference between the static and dynamic Estrada indices.
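
    The static quantities these dynamic indices build on are straightforward to compute; the Python sketch below evaluates the Estrada index, the sum of exp(lambda_i) over adjacency eigenvalues, and the Laplacian Estrada index, the same sum over the eigenvalues of L = D - A, for a small example graph. This is a sketch of the static building blocks only; the dynamic indices in the paper are defined over the evolving graph itself rather than over its snapshots.

```python
import numpy as np

def estrada_index(adj):
    # Sum of exp(lambda_i) over the eigenvalues of the adjacency matrix.
    return float(np.sum(np.exp(np.linalg.eigvalsh(adj))))

def laplacian_estrada_index(adj):
    # Same sum over the eigenvalues of the Laplacian L = D - A.
    lap = np.diag(adj.sum(axis=1)) - adj
    return float(np.sum(np.exp(np.linalg.eigvalsh(lap))))

# A path graph on three vertices:
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
print(estrada_index(A), laplacian_estrada_index(A))
```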

  5. CORBA technology in reengineering the FTU data acquisition system

    International Nuclear Information System (INIS)

    Bertocchi, A.; Buceti, G.; Centioli, C.; Di Muzio, D.; Iannone, F.; Panella, M.; Vitale, V.

    2002-01-01

    In its early stages, Frascati tokamak upgrade DAS was essentially devoted to acquiring data from experiments in CAMAC standard, using a software system (code and database) entirely written by domestic professionals. In 15 years of life DAS has been growing in size and complexity, still preserving its original structure; at the same time new standards were introduced (VME, PCI) to take into account users' ever increasing demands for amount of data and acquisition frequency with which the existing code couldn't cope. Moreover, machines were getting old and the maintenance became troublesome. Finally, the data archive porting to Unix has definitely shown that the DAS system was ageing and a thorough redesign was needed. The system we are planning to introduce is founded on a standard CORBA bus: (i) to integrate heterogeneous platforms and define a standard layer for interactions between the different acquisition units; (ii) to grant, with open source tools (MySql) and interfaces (Html and Java), unified access to hardware and software configuration data. So, a dedicated PC server, connected via a suitable PCI serial highway driver card, will perform the CAMAC access for all the clients interacting through the CORBA layer. Up to now we have successfully tested CAMAC access, and we designed an acquisition unit, which will be the building block of the new system. The next step will be migrating to Alpha/VMS the software related to CAMAC data acquisition, which has been so far the cornerstone of the whole DAS; it will be completely redesigned to fit the 'acquisition unit' paradigm we have defined. Finally we will have a fully distributed data acquisition system with VME (at present six such units have been operating since 1999) and PCI stations, an Alpha/VMS client of the CAMAC/PC server and any possible platform interacting through a CORBA bus for getting data configuration, synchronisation and data archiving

  6. CORBA technology in reengineering the FTU data acquisition system

    Energy Technology Data Exchange (ETDEWEB)

    Bertocchi, A; Buceti, G; Centioli, C; Di Muzio, D; Iannone, F.; Panella, M; Vitale, V

    2002-06-01

    In its early stages, Frascati tokamak upgrade DAS was essentially devoted to acquiring data from experiments in CAMAC standard, using a software system (code and database) entirely written by domestic professionals. In 15 years of life DAS has been growing in size and complexity, still preserving its original structure; at the same time new standards were introduced (VME, PCI) to take into account users' ever increasing demands for amount of data and acquisition frequency with which the existing code couldn't cope. Moreover, machines were getting old and the maintenance became troublesome. Finally, the data archive porting to Unix has definitely shown that the DAS system was ageing and a thorough redesign was needed. The system we are planning to introduce is founded on a standard CORBA bus: (i) to integrate heterogeneous platforms and define a standard layer for interactions between the different acquisition units; (ii) to grant, with open source tools (MySql) and interfaces (Html and Java), unified access to hardware and software configuration data. So, a dedicated PC server, connected via a suitable PCI serial highway driver card, will perform the CAMAC access for all the clients interacting through the CORBA layer. Up to now we have successfully tested CAMAC access, and we designed an acquisition unit, which will be the building block of the new system. The next step will be migrating to Alpha/VMS the software related to CAMAC data acquisition, which has been so far the cornerstone of the whole DAS; it will be completely redesigned to fit the 'acquisition unit' paradigm we have defined. Finally we will have a fully distributed data acquisition system with VME (at present six such units have been operating since 1999) and PCI stations, an Alpha/VMS client of the CAMAC/PC server and any possible platform interacting through a CORBA bus for getting data configuration, synchronisation and data archiving.

  7. Science and Software

    Science.gov (United States)

    Zelt, C. A.

    2017-12-01

    Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel and allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name ID when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say "software is a tool, it's not science." Creating software that implements a new or better way of earth modeling or geophysical processing, inverting or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals, the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releasing, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, an open-source release that is well documented should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software, and minimize the need for others to duplicate the effort. All funded software should be available through a single web site…

  8. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation, learning, and flexibility in ongoing software projects, but how can this change be used to facilitate software innovation? How can a team systematically identify and pursue opportunities to create added value in ongoing projects? In this paper, we describe Deweyan pragmatism as the philosophical foundation…

  9. Application of Domain Knowledge to Software Quality Assurance

    Science.gov (United States)

    Wild, Christian W.

    1997-01-01

    This work focused on capturing, using, and evolving a qualitative decision support structure across the life cycle of a project. The particular application of this study was towards business process reengineering and the representation of the business process in a set of Business Rules (BR). In this work, we defined a decision model which captured the qualitative decision deliberation process. It represented arguments both for and against proposed alternatives to a problem. It was felt that the subjective nature of many critical business policy decisions required a qualitative modeling approach similar to that of Lee and Mylopoulos. While previous work was limited almost exclusively to the decision capture phase, which occurs early in the project life cycle, we investigated the use of such a model during the later stages as well. One of our significant developments was the use of the decision model during the operational phase of a project. By operational phase, we mean the phase in which the system or set of policies which were earlier decided are deployed and put into practice. By making the decision model available to operational decision makers, they would have access to the arguments pro and con for a variety of actions and can thus make a more informed decision which balances the often conflicting criteria by which the value of action is measured. We also developed the concept of a 'monitored decision' in which metrics of performance were identified during the decision making process and used to evaluate the quality of that decision. It is important to monitor those decisions which seem at highest risk of not meeting their stated objectives. Operational decisions are also potentially high risk decisions. Finally, we investigated the use of performance metrics for monitored decisions and audit logs of operational decisions in order to feed an evolutionary phase of the life cycle. During evolution, decisions are revisited, assumptions verified or refuted…

  10. Laplacian Estrada and normalized Laplacian Estrada indices of evolving graphs.

    Directory of Open Access Journals (Sweden)

    Yilun Shang

    Full Text Available Large-scale time-evolving networks have been generated by many natural and technological applications, posing challenges for computation and modeling. Thus, it is of theoretical and practical significance to probe mathematical tools tailored for evolving networks. In this paper, on top of the dynamic Estrada index, we study the dynamic Laplacian Estrada index and the dynamic normalized Laplacian Estrada index of evolving graphs. Using linear algebra techniques, we established general upper and lower bounds for these graph-spectrum-based invariants through a couple of intuitive graph-theoretic measures, including the number of vertices or edges. Synthetic random evolving small-world networks are employed to show the relevance of the proposed dynamic Estrada indices. It is found that neither the static snapshot graphs nor the aggregated graph can approximate the evolving graph itself, indicating the fundamental difference between the static and dynamic Estrada indices.

  11. Software testability and its application to avionic software

    Science.gov (United States)

    Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffery E.

    1993-01-01

    Randomly generated black-box testing is an established yet controversial method of estimating software reliability. Unfortunately, as software applications have required higher reliabilities, practical difficulties with black-box testing have become increasingly problematic. These practical problems are particularly acute in life-critical avionics software, where requirements of 10 exp -7 failures per hour of system reliability can translate into a probability of failure (POF) of perhaps 10 exp -9 or less for each individual execution of the software. This paper describes the application of one type of testability analysis called 'sensitivity analysis' to B-737 avionics software; one application of sensitivity analysis is to quantify whether software testing is capable of detecting faults in a particular program and thus whether we can be confident that a tested program is not hiding faults. We do so by finding the testabilities of the individual statements of the program, and then use those statement testabilities to find the testabilities of the functions and modules. For the B-737 system we analyzed, we were able to isolate those functions that are more prone to hide errors during system/reliability testing.
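
    A toy Python illustration of the sensitivity idea: plant a fault at a statement and estimate, under random testing, how often it propagates to a visible failure. The program, the mutation, and the input profile are invented; the paper's analysis of the B-737 code is far more involved.

```python
import random

# Estimate how likely a planted fault at one statement is to be revealed
# by random testing; low values flag fault-hiding code.

def program(x):
    y = x * 2                      # statement under analysis
    return y if y > 10 else 0      # later logic can mask the fault

def mutated_program(x):
    y = x * 2 + 1                  # planted fault at the same statement
    return y if y > 10 else 0

def estimated_testability(trials=100_000, seed=0):
    rng = random.Random(seed)
    revealed = 0
    for _ in range(trials):
        x = rng.randint(0, 10)     # hypothetical operational input profile
        if program(x) != mutated_program(x):
            revealed += 1
    return revealed / trials

print(estimated_testability())     # the y > 10 branch masks many faults
```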

  12. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book is comprised of 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software develop…

  13. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 14

    Science.gov (United States)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  14. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 15

    Science.gov (United States)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  15. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 13

    Science.gov (United States)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  16. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility

    Science.gov (United States)

    2018-01-01

    Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid’s data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. Conducted experimentations endorse the feasibility of this solution and encourage practitioners to direct their efforts in this direction. PMID:29495599

  17. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility

    Directory of Open Access Journals (Sweden)

    Agustín Zaballos

    2018-02-01

    Full Text Available Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid’s data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. Conducted experimentations endorse the feasibility of this solution and encourage practitioners to direct their efforts in this direction.

  18. Software quality: the road toward a true software industry

    Directory of Open Access Journals (Sweden)

    Saulo Ernesto Rojas Salamanca

    1999-07-01

    Full Text Available Software is perhaps one of the engineering products that has evolved the most in a very short time, moving from empirical or artisanal software to software developed under the principles and tools of software engineering. Nevertheless, throughout these changes the people in charge of building software have faced very common problems: some due to the ever-growing demands on the software's capabilities, driven by the permanent change of conditions, which increases its complexity and obsolescence; and others due to the lack of adequate tools and organizational standards aimed at improving the processes of software development. This article is oriented toward the search for mechanisms to solve the latter problems...

  19. Using software metrics and software reliability models to attain acceptable quality software for flight and ground support software for avionic systems

    Science.gov (United States)

    Lawrence, Stella

    1992-01-01

    This paper is concerned with methods of measuring and developing quality software. Reliable flight and ground support software is a highly important factor in the successful operation of the space shuttle program. Reliability is probably the most important of the characteristics inherent in the concept of 'software quality'. It is the probability of failure-free operation of a computer program for a specified time and environment.
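
    Under the common constant-failure-rate assumption (an assumption of convenience, not something the paper states), this definition takes the familiar closed form:

```latex
R(t) = \Pr[\text{no failure in } (0, t]] = e^{-\lambda t},
\qquad
\mathrm{MTTF} = \int_0^\infty R(t)\, dt = \frac{1}{\lambda}
```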

  20. CRISP90 - SOFTWARE DESIGN ANALYZER SYSTEM

    Science.gov (United States)

    Tausworthe, R. C.

    1994-01-01

    The CRISP90 Software Design Analyzer System, an update of CRISP-80, is a set of programs forming a software design and documentation tool which supports top-down, hierarchic, modular, structured design and programming methodologies. The quality of a computer program can often be significantly influenced by the design medium in which the program is developed. The medium must foster the expression of the programmer's ideas easily and quickly, and it must permit flexible and facile alterations, additions, and deletions to these ideas as the design evolves. The CRISP90 software design analyzer system was developed to provide the PDL (Programmer Design Language) programmer with such a design medium. A program design using CRISP90 consists of short, English-like textual descriptions of data, interfaces, and procedures that are embedded in a simple, structured, modular syntax. The display is formatted into two-dimensional, flowchart-like segments for a graphic presentation of the design. Together with a good interactive full-screen editor or word processor, the CRISP90 design analyzer becomes a powerful tool for the programmer. In addition to being a text formatter, the CRISP90 system prepares material that would be tedious and error prone to extract manually, such as a table of contents, module directory, structure (tier) chart, cross-references, and a statistics report on the characteristics of the design. Referenced modules are marked by schematic logic symbols to show conditional, iterative, and/or concurrent invocation in the program. A keyword usage profile can be generated automatically and glossary definitions inserted into the output documentation. Another feature is the capability to detect changes that were made between versions. Thus, "change-bars" can be placed in the output document along with a list of changed pages and a version history report. Also, items may be marked as "to be determined" and each will appear on a special table until the item is…

  1. Software Radar Technology

    Directory of Open Access Journals (Sweden)

    Tang Jun

    2015-08-01

    Full Text Available In this paper, the definition and the key features of Software Radar, which is a new concept, are proposed and discussed. We consider the development of modern radar system technology to be divided into three stages: Digital Radar, Software Radar and Intelligent Radar, and the second stage is just commencing now. A Software Radar system should be a combination of various modern digital modular components conforming to certain software and hardware standards. Moreover, a Software Radar system with an open system architecture that decouples application software from low-level hardware can readily adopt a "user requirements-oriented" development methodology instead of the traditional "specific function-oriented" one. Compared with traditional Digital Radar, a Software Radar system can be easily reconfigured and scaled up or down to adapt to changes in requirements and technologies. A demonstration Software Radar signal processing system, RadarLab 2.0, which has been developed by Tsinghua University, is introduced in this paper, and suggestions for the future development of Software Radar in China are also given in the conclusion.

  2. Application of Metric-based Software Reliability Analysis to Example Software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Smidts, Carol

    2008-07-01

    The software reliability of the TELLERFAST ATM software is analyzed using two metric-based software reliability analysis methods: a state transition diagram-based method and a test coverage-based method. The procedures for the software reliability analysis using the two methods, together with the analysis results, are provided in this report. The two methods are found to complement each other, and further research on combining them to exploit this complementary effect in software reliability analysis is therefore recommended

  3. Evolvable mathematical models: A new artificial Intelligence paradigm

    Science.gov (United States)

    Grouchy, Paul

    We develop a novel Artificial Intelligence paradigm to generate autonomously artificial agents as mathematical models of behaviour. Agent/environment inputs are mapped to agent outputs via equation trees which are evolved in a manner similar to Symbolic Regression in Genetic Programming. Equations are comprised of only the four basic mathematical operators, addition, subtraction, multiplication and division, as well as input and output variables and constants. From these operations, equations can be constructed that approximate any analytic function. These Evolvable Mathematical Models (EMMs) are tested and compared to their Artificial Neural Network (ANN) counterparts on two benchmarking tasks: the double-pole balancing without velocity information benchmark and the challenging discrete Double-T Maze experiments with homing. The results from these experiments show that EMMs are capable of solving tasks typically solved by ANNs, and that they have the ability to produce agents that demonstrate learning behaviours. To further explore the capabilities of EMMs, as well as to investigate the evolutionary origins of communication, we develop NoiseWorld, an Artificial Life simulation in which interagent communication emerges and evolves from initially noncommunicating EMM-based agents. Agents develop the capability to transmit their x and y position information over a one-dimensional channel via a complex, dialogue-based communication scheme. These evolved communication schemes are analyzed and their evolutionary trajectories examined, yielding significant insight into the emergence and subsequent evolution of cooperative communication. Evolved agents from NoiseWorld are successfully transferred onto physical robots, demonstrating the transferability of EMM-based AIs from simulation into physical reality.
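
    A minimal Python sketch of the EMM ingredients described above: equation trees over the four basic operators, inputs and constants, evaluated and mutated in a (1+1)-style loop. The tree encoding, the mutation scheme and the target behaviour x^2 + 1 are illustrative assumptions, not the thesis implementation.

```python
import random

# The four basic operators, with protected division to avoid dividing by ~0.
OPS = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
       "*": lambda a, b: a * b,
       "/": lambda a, b: a / b if abs(b) > 1e-9 else 1.0}

def random_tree(depth, rng):
    # Leaves are the input variable or constants; internal nodes are operators.
    if depth == 0 or rng.random() < 0.3:
        return ("x",) if rng.random() < 0.5 else ("const", rng.uniform(-1, 1))
    op = rng.choice(list(OPS))
    return (op, random_tree(depth - 1, rng), random_tree(depth - 1, rng))

def evaluate(tree, x):
    if tree[0] == "x":
        return x
    if tree[0] == "const":
        return tree[1]
    return OPS[tree[0]](evaluate(tree[1], x), evaluate(tree[2], x))

def mutate(tree, rng):
    if rng.random() < 0.2:
        return random_tree(2, rng)  # replace this subtree entirely
    if tree[0] in OPS:
        return (tree[0], mutate(tree[1], rng), mutate(tree[2], rng))
    return tree

# (1+1)-style evolution toward a hypothetical target behaviour x^2 + 1:
rng = random.Random(0)
def error(t):
    return sum((evaluate(t, x) - (x * x + 1)) ** 2 for x in range(-5, 6))
best = random_tree(3, rng)
for _ in range(5000):
    cand = mutate(best, rng)
    if error(cand) <= error(best):
        best = cand
print(best, error(best))
```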

  4. Possibilities and limitations of applying software reliability growth models to safety-critical software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Jang, Seung Cheol; Ha, Jae Joo

    2007-01-01

    It is generally known that software reliability growth models such as the Jelinski-Moranda model and Goel-Okumoto's Non-Homogeneous Poisson Process (NHPP) model cannot be applied to safety-critical software due to a lack of software failure data. In this paper, by applying two of the most widely known software reliability growth models to sample software failure data, we demonstrate the possibility of using software reliability growth models to prove the high reliability of safety-critical software. The high sensitivity of the software's estimated reliability to the failure data, together with the lack of sufficient failure data, is also identified as a possible limitation when applying software reliability growth models to safety-critical software.
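
    For reference, the two named models have simple closed forms. The sketch below (with made-up parameter values, not the paper's data) computes the Goel-Okumoto mean value function m(t) = a(1 - e^(-bt)), the conditional reliability over a future interval, and the Jelinski-Moranda per-interval failure rate phi(N - i + 1):

        # Closed forms of the two growth models named above; parameters are invented.
        import math

        def goel_okumoto_mean(t, a, b):
            # Expected cumulative number of failures by time t: m(t) = a * (1 - e^(-b t))
            return a * (1.0 - math.exp(-b * t))

        def goel_okumoto_reliability(x, t, a, b):
            # Probability of no failure in (t, t + x]: R(x | t) = exp(-(m(t+x) - m(t)))
            return math.exp(-(goel_okumoto_mean(t + x, a, b) - goel_okumoto_mean(t, a, b)))

        def jelinski_moranda_rate(i, n_total, phi):
            # Failure rate between the (i-1)-th and i-th failures: phi * (N - i + 1)
            return phi * (n_total - i + 1)

        # Example: after t = 1000 h of testing, chance of surviving the next 100 h.
        print(goel_okumoto_reliability(x=100, t=1000, a=30, b=0.002))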

  5. Interface-based software testing

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-10-01

    Full Text Available Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of software architecture is to design quality into software through modeling and visualization. There are many methods and standards that define how to control and manage quality. However, many IT software development projects still fail due to the difficulties involved in measuring, controlling, and managing software quality. Software quality failure factors are numerous. Examples include beginning to test software too late in the development process, or failing to properly understand or design the software architecture and the software component structure. The goal of this article is to provide an interface-based software testing technique that better measures software quality, automates software quality testing, encourages early testing, and increases the software's overall testability.
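
    A minimal sketch of what interface-based testing can look like in practice, assuming a hypothetical AccountService interface (the article does not prescribe a specific API): the contract tests are written once against the interface and reused for every implementation, which encourages early, automated testing below the GUI.

        # Contract tests against an interface, reusable for any implementation.
        import unittest
        from abc import ABC, abstractmethod

        class AccountService(ABC):                     # the interface under test (hypothetical)
            @abstractmethod
            def deposit(self, amount: float) -> float: ...
            @abstractmethod
            def balance(self) -> float: ...

        class InMemoryAccountService(AccountService):  # one concrete implementation
            def __init__(self):
                self._balance = 0.0
            def deposit(self, amount):
                if amount <= 0:
                    raise ValueError("amount must be positive")
                self._balance += amount
                return self._balance
            def balance(self):
                return self._balance

        class AccountServiceContract:
            """Interface-level tests; reused verbatim for every implementation."""
            def make(self) -> AccountService:
                raise NotImplementedError
            def test_deposit_increases_balance(self):
                svc = self.make()
                svc.deposit(10.0)
                self.assertEqual(svc.balance(), 10.0)
            def test_rejects_nonpositive_deposit(self):
                with self.assertRaises(ValueError):
                    self.make().deposit(0.0)

        class TestInMemory(AccountServiceContract, unittest.TestCase):
            def make(self):
                return InMemoryAccountService()

        if __name__ == "__main__":
            unittest.main()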

  6. Agile software assessment

    OpenAIRE

    Nierstrasz Oscar; Lungu Mircea

    2012-01-01

    Informed decision making is a critical activity in software development but it is poorly supported by common development environments which focus mainly on low level programming tasks. We posit the need for agile software assessment which aims to support decision making by enabling rapid and effective construction of software models and custom analyses. Agile software assessment entails gathering and exploiting the broader context of software information related to the system at hand as well ...

  7. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at the Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process, which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements was developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive, with a number of suggestions for improvement that are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures and has significantly improved the efficiency and standardization of the hazard analysis process.

  8. Framework for Small-Scale Experiments in Software Engineering: Guidance and Control Software Project: Software Engineering Case Study

    Science.gov (United States)

    Hayhurst, Kelly J.

    1998-01-01

    Software is becoming increasingly significant in today's critical avionics systems. To achieve safe, reliable software, government regulatory agencies such as the Federal Aviation Administration (FAA) and the Department of Defense mandate the use of certain software development methods. However, little scientific evidence exists to show a correlation between software development methods and product quality. Given this lack of evidence, a series of experiments has been conducted to understand why and how software fails. The Guidance and Control Software (GCS) project is the latest in this series. The GCS project is a case study of the Requirements and Technical Concepts for Aviation RTCA/DO-178B guidelines, Software Considerations in Airborne Systems and Equipment Certification. All civil transport airframe and equipment vendors are expected to comply with these guidelines in building systems to be certified by the FAA for use in commercial aircraft. For the case study, two implementations of a guidance and control application were developed to comply with the DO-178B guidelines for Level A (critical) software. The development included the requirements, design, coding, verification, configuration management, and quality assurance processes. This paper discusses the details of the GCS project and presents the results of the case study.

  9. Statistical Software Engineering

    Science.gov (United States)

    1998-04-13

    No abstract is available for this record; the indexed text consists of bibliography fragments on multiversion software, e.g. "... multiversion software subject to coincident errors" (IEEE Trans. Software Eng. SE-11:1511-1517), a further reference by Eckhardt, D.E., A.K. Caglayan, J.C. Knight, L.D. Lee, D.F..., and Knight, J.C. and N.G. Leveson (1986), an experimental evaluation of the assumption of independence in multiversion software (IEEE Trans. Software Eng.).

  10. Addressing Software Engineering Issues in Real-Time Software ...

    African Journals Online (AJOL)

    Addressing Software Engineering Issues in Real-Time Software ... systems, manufacturing process, process control, military, space exploration, and ... but also physical properties such as timeliness, Quality of Service and reliability.

  11. GPU Based Software Correlators - Perspectives for VLBI2010

    Science.gov (United States)

    Hobiger, Thomas; Kimura, Moritaka; Takefuji, Kazuhiro; Oyama, Tomoaki; Koyama, Yasuhiro; Kondo, Tetsuro; Gotoh, Tadahiro; Amagai, Jun

    2010-01-01

    Caused by historical separation and driven by the requirements of the PC gaming industry, Graphics Processing Units (GPUs) have evolved into massively parallel processing systems which have entered the area of non-graphics applications. Although a single processing core on the GPU is much slower and provides less functionality than its counterpart on the CPU, the huge number of these small processing entities outperforms the classical processors when the application can be parallelized. Thus, in recent years various radio astronomical projects have started to make use of this technology, either to realize the correlator on this platform or to establish the post-processing pipeline with GPUs. The feasibility of GPUs as a choice for a VLBI correlator is therefore investigated, including the pros and cons of this technology. Additionally, a GPU-based software correlator is reviewed with respect to energy consumption and cost per GFlop/s.
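
    The core operation such software correlators parallelize is FX-style correlation: FFT each station's sample stream, multiply one spectrum by the conjugate of the other, and accumulate. The sketch below shows the arithmetic with NumPy on the CPU; a GPU version would perform the same steps through a CUDA array library. This is illustrative only, not the correlator discussed in the paper.

        # FX-style cross-correlation of two "station" streams (illustrative).
        import numpy as np

        def fx_cross_spectrum(x, y, fft_len=1024):
            n_blocks = min(len(x), len(y)) // fft_len
            acc = np.zeros(fft_len, dtype=complex)
            for i in range(n_blocks):
                seg = slice(i * fft_len, (i + 1) * fft_len)
                X, Y = np.fft.fft(x[seg]), np.fft.fft(y[seg])
                acc += X * np.conj(Y)          # accumulate the cross-spectrum
            return acc / n_blocks

        rng = np.random.default_rng(1)
        common = rng.normal(size=1 << 16)                   # shared signal
        a = common + 0.1 * rng.normal(size=common.size)     # station A samples
        b = common + 0.1 * rng.normal(size=common.size)     # station B samples
        lag = np.argmax(np.abs(np.fft.ifft(fx_cross_spectrum(a, b))))
        print("correlation peak at lag", lag)               # expect 0 for aligned streams

    Each FFT block is independent, which is exactly the kind of data parallelism that maps well onto the many small GPU cores the abstract describes.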

  12. PCBA demand forecasting using an evolving Takagi-Sugeno system

    NARCIS (Netherlands)

    van Rooijen, M.; Almeida, R.J.; Kaymak, U.

    2016-01-01

    This paper investigates the use of an evolving fuzzy system for printed circuit board assembly (PCBA) demand forecasting. The algorithm is based on the evolving Takagi-Sugeno (eTS) fuzzy system, which has the ability to incorporate new patterns by changing its internal structure in an on-line fashion.

  13. Commercial Literacy Software.

    Science.gov (United States)

    Balajthy, Ernest

    1997-01-01

    Presents the first year's results of a continuing project to monitor the availability of software of relevance for literacy education purposes. Concludes there is an enormous amount of software available for use by teachers of reading and literacy--whereas drill-and-practice software is the largest category of software available, large numbers of…

  14. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and achieving a degree of excellence and refinement in a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the software life cycle. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development process can be measured. If software metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects that have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and to report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  15. CMIP6 Data Citation of Evolving Data

    Directory of Open Access Journals (Sweden)

    Martina Stockhause

    2017-06-01

    Full Text Available Data citations have become widely accepted. Technical infrastructures as well as principles and recommendations for data citation are in place, but best practices or guidelines for their implementation are not yet available. On the other hand, the scientific climate community requests early citations of evolving data for credit, e.g. for CMIP6 (Coupled Model Intercomparison Project Phase 6). The data citation concept for CMIP6 is presented. The main challenges lie in limited resources, a strict project timeline and the dependency on changes to the data dissemination infrastructure ESGF (Earth System Grid Federation) to meet the data citation requirements. Therefore a pragmatic, flexible and extensible approach for the CMIP6 data citation service was developed, consisting of a citation for the full evolving data superset and a data cart approach for citing the concrete data subset used. This two-citation approach can be implemented according to the RDA recommendations for evolving data. Because of resource constraints and missing project policies, the implementation of the second part of the citation concept is postponed to CMIP7.

  16. Software Quality Perceptions of Stakeholders Involved in the Software Development Process

    Science.gov (United States)

    Padmanabhan, Priya

    2013-01-01

    Software quality is one of the primary determinants of project management success. Stakeholders involved in software development widely agree that quality is important (Barney and Wohlin 2009). However, they may differ on what constitutes software quality, and which of its attributes are more important than others. Although, software quality…

  17. Marshal: Maintaining Evolving Models, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — SIFT proposes to design and develop the Marshal system, a mixed-initiative tool for maintaining task models over the course of evolving missions. Marshal-enabled...

  18. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  19. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  20. Establishing software quality assurance

    International Nuclear Information System (INIS)

    Malsbury, J.

    1983-01-01

    This paper is concerned with four questions about establishing software QA: What is software QA? Why have software QA? What is the role of software QA? What is necessary to ensure the success of software QA?

  1. Applied software risk management a guide for software project managers

    CERN Document Server

    Pandian, C Ravindranath

    2006-01-01

    Few software projects are completed on time, on budget, and to their original specifications. Focusing on what practitioners need to know about risk in the pursuit of delivering software projects, Applied Software Risk Management: A Guide for Software Project Managers covers key components of the risk management process and the software development process, as well as best practices for software risk identification, risk planning, and risk analysis. Written in a clear and concise manner, this resource presents concepts and practical insight into managing risk. It first covers risk-driven project management, risk management processes, risk attributes, risk identification, and risk analysis. The book continues by examining responses to risk, the tracking and modeling of risks, intelligence gathering, and integrated risk management. It concludes with details on drafting and implementing procedures. A diary of a risk manager provides insight in implementing risk management processes.Bringing together concepts ...

  2. Software maintenance and evolution and automated software engineering

    NARCIS (Netherlands)

    Carver, Jeffrey C.; Serebrenik, Alexander

    2018-01-01

    This issue's column reports on the 33rd International Conference on Software Maintenance and Evolution and 32nd International Conference on Automated Software Engineering. Topics include flaky tests, technical debt, QA bots, and regular expressions.

  3. Orthogonally Evolved AI to Improve Difficulty Adjustment in Video Games

    DEFF Research Database (Denmark)

    Hintze, Arend; Olson, Randal; Lehman, Joel Anthony

    2016-01-01

    Computer games are most engaging when their difficulty is well matched to the player's ability, thereby providing an experience in which the player is neither overwhelmed nor bored. In games where the player interacts with computer-controlled opponents, the difficulty of the game can be adjusted not only by changing the distribution of opponents or game resources, but also through modifying the skill of the opponents. Applying evolutionary algorithms to evolve the artificial intelligence that controls opponent agents is one established method for adjusting opponent difficulty. Less-evolved agents (i.e. agents subject to fewer generations of evolution) make for easier opponents, while highly-evolved agents are more challenging to overcome. In this publication we test a new approach for difficulty adjustment in games: orthogonally evolved AI, where the player receives support from collaborating...

  4. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; increasing reliability by means of software redundancy methods; maintenance of software for long-term operating behaviour. (HP) [de]

  5. Impact of Internet of Things on Software Business Model and Software Industry

    OpenAIRE

    Murari, Bhanu Teja

    2016-01-01

    Context: Internet of Things (IoT) technology is growing rapidly and is changing the business environment for software organizations. There is a need to understand which business model factors a software company should focus on to obtain benefits from the potential that IoT offers. This thesis also focuses on finding the impact of IoT on the software business model and the software industry, especially on software development. Objectives: In this thesis, we do research on IoT software b...

  6. EPRI's POWERCOACH trademark software development project

    International Nuclear Information System (INIS)

    Rost, S.; Leu, Kehshiou

    1993-01-01

    Today's complex bulk power market accounts for an estimated $35 billion in transactions a year, significantly more than a decade ago. With the increased levels of non-utility generation and changing strategies in the utility industry, it is anticipated that the trend toward rapid growth in the bulk power market will continue. This market has evolved from an ad hoc residual market to one that in some respects stands at par with the retail market in the plans of many utilities. The bulk power market is not based on the obligation to serve to the same extent as retail markets. Utility participation in this market is therefore purely voluntary. This freedom of action or inaction in the bulk power market actually renders corporate decision-making, investment-related or operational, more complicated in many respects than in retail markets. Examples of the burgeoning uncertainties affecting the bulk power market include the rapid expansion of transactions undertaken through power pools, and the impact on utility planning and operations brought about by the abundance and price attractiveness of power available for flexible periods. These uncertainties present an ideal opportunity to employ state-of-the-art analytical models to facilitate the effective use of utility assets to foster the efficient functioning of the entire bulk power market. This paper will focus on the POWERCOACH methodology for short-term bulk power transaction analysis under conditions of uncertainty. In August 1992, UPMP began a seventeen-month project to convert POWERCOACH from a methodology to a fully functional, commercial software package. UPMP is developing the POWERCOACH software with the extensive, direct involvement of thirty EPRI member utilities. A synopsis of POWERCOACH is presented.

  7. The Software Invention Cube: A classification scheme for software inventions

    NARCIS (Netherlands)

    Bergstra, J.A.; Klint, P.

    2008-01-01

    The patent system protects inventions. The requirement that a software invention should make 'a technical contribution' turns out to be untenable in practice, and this raises the question of what constitutes an invention in the realm of software. The authors developed the Software Invention Cube

  8. Computational Genetic Regulatory Networks Evolvable, Self-organizing Systems

    CERN Document Server

    Knabe, Johannes F

    2013-01-01

    Genetic Regulatory Networks (GRNs) in biological organisms are primary engines for cells to enact their engagements with environments, via incessant, continually active coupling. In differentiated multicellular organisms, tremendous complexity has arisen in the course of evolution of life on earth. Engineering and science have so far achieved no working system that can compare with this complexity, depth and scope of organization. Abstracting the dynamics of genetic regulatory control to a computational framework in which artificial GRNs in artificial simulated cells differentiate while connected in a changing topology, it is possible to apply Darwinian evolution in silico to study the capacity of such developmental/differentiated GRNs to evolve. In this volume an evolutionary GRN paradigm is investigated for its evolvability and robustness in models of biological clocks, in simple differentiated multicellularity, and in evolving artificial developing 'organisms' which grow and express an ontogeny starting fr...

  9. Model-integrating software components engineering flexible software systems

    CERN Document Server

    Derakhshanmanesh, Mahdi

    2015-01-01

    In his study, Mahdi Derakhshanmanesh builds on the state of the art in modeling by proposing to integrate models into running software at the component level without translating them to code. Such so-called model-integrating software exploits all the advantages of models: models implicitly support a good separation of concerns, they are self-documenting and thus improve understandability and maintainability, and, in contrast to model-driven approaches, there is no longer a synchronization problem between the models and the code generated from them. Using model-integrating components, software will be

  10. Adaptive inferential sensors based on evolving fuzzy models.

    Science.gov (United States)

    Angelov, Plamen; Kordon, Arthur

    2010-04-01

    A new technique for the design and use of inferential sensors in the process industry is proposed in this paper, based on the recently introduced concept of evolving fuzzy models (EFMs). They address the challenge that the modern process industry faces today, namely, to develop adaptive and self-calibrating online inferential sensors that reduce maintenance costs while keeping high precision and interpretability/transparency. The proposed methodology makes it possible for inferential sensors to recalibrate automatically, which significantly reduces the life-cycle effort for their maintenance. This is achieved by the adaptive and flexible open-structure EFM used. The novelty of this paper lies in the following: (1) the overall concept of inferential sensors with an evolving and self-developing structure learned from data streams; (2) the new methodology for online automatic selection of the input variables that are most relevant for the prediction; (3) the technique to automatically detect a shift in the data pattern using the age of the clusters (and fuzzy rules); (4) the online standardization technique used by the learning procedure of the evolving model; and (5) the application of this innovative approach to several real-life industrial processes from the chemical industry (evolving inferential sensors, namely eSensors, were used for predicting the chemical properties of different products at The Dow Chemical Company, Freeport, TX). It should be noted, however, that the methodology and conclusions of this paper are valid for the broader area of chemical and process industries in general. The results demonstrate that well-interpretable inferential sensors with a simple structure can automatically be designed from the data stream in real time to predict various process variables of interest. The proposed approach can be used as a basis for the development of a new generation of adaptive and evolving inferential sensors that can address the
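
    One ingredient mentioned above, online standardization of a data stream, can be illustrated with running estimates of mean and variance (Welford's algorithm). The paper's exact procedure may differ, so treat this as a sketch of the idea only:

        # Online standardization via running mean/variance (Welford's algorithm).
        class OnlineStandardizer:
            def __init__(self):
                self.n, self.mean, self.m2 = 0, 0.0, 0.0
            def update(self, x):
                self.n += 1
                delta = x - self.mean
                self.mean += delta / self.n
                self.m2 += delta * (x - self.mean)
            def standardize(self, x):
                var = self.m2 / (self.n - 1) if self.n > 1 else 1.0
                return (x - self.mean) / (var ** 0.5 or 1.0)   # guard against zero spread

        s = OnlineStandardizer()
        for sample in [4.1, 3.9, 4.4, 5.0, 3.7]:   # e.g., values arriving from a sensor
            s.update(sample)
            print(s.standardize(sample))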

  11. An Evolving Asymmetric Game for Modeling Interdictor-Smuggler Problems

    Science.gov (United States)

    2016-06-01

    No abstract is indexed for this record; the text consists of report documentation page fragments. Master's thesis by Richard J. Allain, Naval Postgraduate School, Monterey, California, June 2016. Thesis Advisor: David L. Alderson; Second Reader: W... Approved for public release; distribution is unlimited.

  12. Experience with highly-parallel software for the storage system of the ATLAS Experiment at CERN

    CERN Document Server

    Colombo, T; The ATLAS collaboration

    2012-01-01

    The ATLAS experiment is observing proton-proton collisions delivered by the LHC accelerator. The ATLAS Trigger and Data Acquisition (TDAQ) system selects interesting events on-line in a three-level trigger system in order to store them at a budgeted rate of several hundred Hz. This paper focuses on the TDAQ data-logging system and in particular on the implementation and performance of a novel parallel software design. In this respect, the main challenge presented by the data-logging workload is the conflict between the largely parallel nature of the event processing, especially the recently introduced event compression, and the constraint of sequential file writing and checksum evaluation. This is further complicated by the necessity of operating in a fully data-driven mode, to cope with continuously evolving trigger and detector configurations. In this paper we report on the design of the new ATLAS on-line storage software. In particular we will discuss our development experience using recent concurrency-ori...
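
    The pattern described, parallel event compression feeding strictly sequential file writing and checksum evaluation, can be sketched as follows (illustrative Python, not the ATLAS code): worker processes compress blocks concurrently while a single writer consumes the results in submission order.

        # Parallel compression, sequential ordered writing and checksumming (sketch).
        import zlib
        from concurrent.futures import ProcessPoolExecutor

        def compress(event: bytes) -> bytes:
            return zlib.compress(event)

        def write_events(events, path):
            checksum = zlib.adler32(b"")
            with ProcessPoolExecutor() as pool, open(path, "wb") as out:
                # map() yields results in submission order, so compression runs in
                # parallel while the writer consumes blocks strictly sequentially.
                for block in pool.map(compress, events, chunksize=8):
                    out.write(block)
                    checksum = zlib.adler32(block, checksum)
            return checksum

        if __name__ == "__main__":
            fake_events = [bytes([i % 256]) * 4096 for i in range(1000)]
            print(hex(write_events(fake_events, "events.dat")))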

  13. Software engineering the current practice

    CERN Document Server

    Rajlich, Vaclav

    2011-01-01

    INTRODUCTION: History of Software Engineering; Software Properties; Origins of Software; Birth of Software Engineering; Third Paradigm: Iterative Approach; Software Life Span Models; Staged Model; Variants of Staged Model. SOFTWARE TECHNOLOGIES: Programming Languages and Compilers; Object-Oriented Technology; Version Control System; Software Models; Class Diagrams; UML Activity Diagrams; Class Dependency Graphs and Contracts. SOFTWARE CHANGE: Introduction to Software Change; Characteristics of Software Change; Phases of Software Change; Requirements and Their Elicitation; Requirements Analysis and Change Initiation; Concepts and Concept

  14. A Quantitative Study of Global Software Development Teams, Requirements, and Software Projects

    Science.gov (United States)

    Parker, Linda L.

    2016-01-01

    The study explored the relationship between global software development teams, effective software requirements, and stakeholders' perception of successful software development projects within the field of information technology management. It examined the critical relationship between Global Software Development (GSD) teams creating effective…

  15. Possibilities and Limitations of Applying Software Reliability Growth Models to Safety- Critical Software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Jang, Seung Cheol; Ha, Jae Joo

    2006-01-01

    As digital systems are gradually introduced to nuclear power plants (NPPs), the need to quantitatively analyze the reliability of these digital systems is also increasing. Kang and Sung identified (1) software reliability, (2) common-cause failures (CCFs), and (3) fault coverage as the three most critical factors in the reliability analysis of digital systems. For reliability estimation of safety-critical software (the software that is used in safety-critical digital systems), the use of Bayesian Belief Networks (BBNs) seems to be most widespread. The use of BBNs in reliability estimation of safety-critical software is basically a process of indirectly assigning a reliability based on various observed information and experts' opinions. When software testing results or software failure histories are available, we can use a process of directly estimating the reliability of the software using various software reliability growth models such as the Jelinski-Moranda model and Goel-Okumoto's nonhomogeneous Poisson process (NHPP) model. Even though it is generally known that software reliability growth models cannot be applied to safety-critical software due to the small number of failure data points expected from the testing of safety-critical software, we try to find possibilities and corresponding limitations of applying software reliability growth models to safety-critical software.

  16. Software Assurance Competency Model

    Science.gov (United States)

    2013-03-01

    ...(COTS) software, and software as a service (SaaS). L2: Define and analyze risks in the acquisition of contracted software, COTS software, and SaaS ... [2010a]: Application of technologies and processes to achieve a required level of confidence that software systems and services function in the...

  17. ESTSC - Software Best Practices

    Science.gov (United States)

    DOE Scientific and Technical Software Best Practices, December 2010. Table of Contents: 1.0 Introduction; 2.0 Responsibilities; 2.1 OSTI/ESTSC; 2.2 SIACs; 2.3 Software Submitting Sites/Creators; 2.4 Software Sensitivity Review; 3.0 Software Announcement and Submission; 3.1 STI Software Appropriate for Announcement; 3.2

  18. Software features and applications in process design, integration and operation

    Energy Technology Data Exchange (ETDEWEB)

    Dhole, V. [Aspen Tech Limited, Warrington (United Kingdom)

    1999-02-01

    Process engineering technologies and tools have evolved rapidly over the last twenty years. Process simulation/modeling, advanced process control, on-line optimisation, production planning and supply chain management are some examples of technologies that have rapidly matured from early commercial prototypes and concepts to established tools with a significant impact on the profitability of the process industry today. Process Synthesis or Process Integration (PI), in comparison, is yet to create its impact and still remains largely the domain of a few expert users. One of the key reasons why PI has not taken off is that PI tools have not become integral components of the standard process engineering environments. Over the last 15 years AspenTech has grown from a small process simulation tool provider to a large multinational company providing a complete suite of process engineering technologies and services covering process design, operation, planning and supply chain management. Throughout this period, AspenTech has acquired experience in taking rapidly evolving technologies from their early prototype stage to mature products and services. The paper outlines AspenTech's strategy of integrating PI with other, more established process design and operational improvement technologies. The paper illustrates the key elements of AspenTech's strategy via examples of software development initiatives and services projects. The paper also outlines AspenTech's future vision of the role of PI in process engineering. (au)

  19. Factors that motivate software developers in Nigerian's software ...

    African Journals Online (AJOL)

    It was also observed that courtesy, good reward systems, regular training, recognition, tolerance of mistakes and good leadership were strong motivators of software developers. Keywords: Software developers, information technology, project managers, Nigeria International Journal of Natural and Applied Sciences, 6(4): ...

  20. Developing a Cyberinfrastructure for integrated assessments of environmental contaminants.

    Science.gov (United States)

    Kaur, Taranjit; Singh, Jatinder; Goodale, Wing M; Kramar, David; Nelson, Peter

    2005-03-01

    The objective of this study was to design and implement prototype software for capturing field data and automating the process of reporting and analyzing the distribution of mercury. The four-phase process used to design, develop, deploy and evaluate the prototype software is described. Two different development strategies were used: (1) design of a mobile data collection application intended to capture field data in a meaningful format and automate its transfer into user databases, followed by (2) re-engineering of the original software to develop an integrated database environment with improved methods for aggregating and sharing data. The results demonstrated that innovative use of commercially available hardware and software components can lead to the development of an end-to-end digital cyberinfrastructure that captures, records, stores, transmits, compiles and integrates multi-source data as it relates to mercury.

  1. Software for Optimizing Quality Assurance of Other Software

    Science.gov (United States)

    Feather, Martin; Cornford, Steven; Menzies, Tim

    2004-01-01

    Software assurance is the planned and systematic set of activities that ensures that software processes and products conform to requirements, standards, and procedures. Examples of such activities are the following: code inspections, unit tests, design reviews, performance analyses, construction of traceability matrices, etc. In practice, software development projects have only limited resources (e.g., schedule, budget, and availability of personnel) to cover the entire development effort, of which assurance is but a part. Projects must therefore select judiciously from among the possible assurance activities. At its heart, this can be viewed as an optimization problem; namely, to determine the allocation of limited resources (time, money, and personnel) to minimize risk or, alternatively, to minimize the resources needed to reduce risk to an acceptable level. The end result of the work reported here is a means to optimize quality-assurance processes used in developing software.
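
    Viewed as an optimization problem, a toy version can be sketched as follows (invented activity costs and risk-reduction values; the actual tool's method is not shown): choose assurance activities that maximize risk reduction within a fixed budget, here with a greedy cost-effectiveness heuristic.

        # Toy budget-constrained selection of assurance activities (illustrative numbers).
        activities = [
            # (name, cost in staff-days, estimated risk reduction) -- invented values
            ("code inspections",      20, 0.30),
            ("unit tests",            15, 0.25),
            ("design reviews",        10, 0.15),
            ("performance analyses",  12, 0.10),
            ("traceability matrices",  8, 0.08),
        ]

        def plan(budget):
            chosen, spent, risk_reduced = [], 0, 0.0
            # Greedy: best risk reduction per unit cost first.
            for name, cost, gain in sorted(activities, key=lambda a: a[2] / a[1], reverse=True):
                if spent + cost <= budget:
                    chosen.append(name)
                    spent += cost
                    risk_reduced += gain
            return chosen, spent, risk_reduced

        print(plan(budget=40))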

  2. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information

  3. Symbiotic Composition and Evolvability

    OpenAIRE

    Watson, Richard A.; Pollack, Jordan B.

    2001-01-01

    Several of the Major Transitions in natural evolution, such as the symbiogenic origin of eukaryotes from prokaryotes, share the feature that existing entities became the components of composite entities at a higher level of organisation. This composition of pre-adapted extant entities into a new whole is a fundamentally different source of variation from the gradual accumulation of small random variations, and it has some interesting consequences for issues of evolvability. In this paper we p...

  4. Neutron Scattering Software

    Science.gov (United States)

    Neutron Scattering Software: a new portal for neutron scattering has just been established. The software listed includes KUPLOT, data plotting and fitting software, and ILL/TAS, Matlab programs for analyzing triple-axis data.

  5. Software To Go: A Catalog of Software Available for Loan.

    Science.gov (United States)

    Kurlychek, Ken, Comp.

    This catalog lists the holdings of the Software To Go software lending library and clearinghouse for programs and agencies serving students or clients who are deaf or hard of hearing. An introduction describes the clearinghouse and its collection of software, much of it commercial and copyrighted material, for Apple, Macintosh, and IBM (MS-DOS)…

  6. Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator

    Science.gov (United States)

    Bolen, Kenny; Greenlaw, Ronald

    2010-01-01

    A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA's Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.
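
    The real tool is a K-shell script; as a rough illustration of the generator pattern it describes, the following Python sketch (invented paths and placeholder copy commands, nothing here reflects the actual tool) scans each CSCI subdirectory on a mounted CD image and emits one transfer script per CSCI.

        # Illustrative analog of a per-CSCI transfer-script generator (paths invented).
        from pathlib import Path

        CD_ROOT = Path("/mnt/fsw_cd")      # mounted flight-software CD (assumed path)
        OUT_DIR = Path("transfer_scripts")

        def generate_scripts():
            OUT_DIR.mkdir(exist_ok=True)
            for csci_dir in sorted(p for p in CD_ROOT.iterdir() if p.is_dir()):
                lines = ["#!/bin/sh",
                         f"# auto-generated: copy CSCI '{csci_dir.name}' to the PCS scratch directory"]
                for f in sorted(csci_dir.rglob("*")):
                    if f.is_file():
                        # Flatten into the scratch directory; a real tool would keep structure.
                        lines.append(f"cp '{f}' \"$PCS_SCRATCH/{f.name}\"")
                script = OUT_DIR / f"transfer_{csci_dir.name}.sh"
                script.write_text("\n".join(lines) + "\n")
                script.chmod(0o755)

        if CD_ROOT.exists():               # only runs where the assumed CD path is mounted
            generate_scripts()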

  7. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  8. Software quality assurance and software safety in the Biomed Control System

    International Nuclear Information System (INIS)

    Singh, R.P.; Chu, W.T.; Ludewigt, B.A.; Marks, K.M.; Nyman, M.A.; Renner, T.R.; Stradtner, R.

    1989-01-01

    The Biomed Control System is a hardware/software system used for the delivery, measurement and monitoring of heavy-ion beams in the patient treatment and biology experiment rooms in the Bevalac at the Lawrence Berkeley Laboratory (LBL). This paper describes some aspects of this system, including historical background and philosophy, configuration management, hardware features that facilitate software testing, software testing procedures, the release of new software, quality assurance, safety, and operator monitoring. 3 refs.

  9. Software development and maintenance: An approach for a large accelerator control system

    International Nuclear Information System (INIS)

    Casalegno, L.; Orsini, L.; Sicard, C.H.

    1990-01-01

    Maintenance costs presently form a large part of the total life-cycle cost of a software system. In the case of large systems, while the costs of eliminating bugs, fixing analysis and design errors, and introducing updates must be taken into account, the coherence of the system as a whole must be maintained while its parts evolve independently. The need to devise and supply tools to aid programmers in housekeeping and updating has been strongly felt in the case of the LEP preinjector control system. A set of utilities has been implemented to create a safe interface between the programmers and the files containing the control software. Through this interface, consistent naming schemes and common compiling and object-building procedures can be enforced, so that development and maintenance staff need not be concerned with the details of executable code generation. Procedures have been built to verify consistency, generate maintenance diagnostics and automatically update object and executable files, taking into account multiple releases and versions. The tools and the techniques reported in this paper are of general use in the UNIX environment and have already been adopted for other projects. (orig.)

  10. The dynamic of modern software development project management and the software crisis of quality. An integrated system dynamics approach towards software quality improvement

    OpenAIRE

    Nasirikaljahi, Armindokht

    2012-01-01

    The software industry is plagued by cost overruns, delays, poor customer satisfaction and quality issues that are costing clients and customers world-wide billions of dollars each year. The phenomenon is coined "The Software Crisis", and poses a huge challenge for software project management. This thesis addresses one of the core issues of the software crisis, namely software quality. The challenges of software quality are central for understanding the other symptoms of the software crisis. Th...

  11. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabanne

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templa

  12. Software architecture 1

    CERN Document Server

    Oussalah , Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural template

  13. NASA's Software Safety Standard

    Science.gov (United States)

    Ramsay, Christopher M.

    2007-01-01

    NASA relies more and more on software to control, monitor, and verify its safety-critical systems, facilities and operations. Since the 1960s there has hardly been a spacecraft launched that does not have a computer on board to provide command and control services. There have been recent incidents where software played a role in high-profile mission failures and hazardous incidents. For example, the Mars Orbiter, Mars Polar Lander, DART (Demonstration of Autonomous Rendezvous Technology), and MER (Mars Exploration Rover) Spirit anomalies were all caused by, or contributed to by, software. The Mission Control Centers for the Shuttle, ISS, and unmanned programs are highly dependent on software for data displays, analysis, and mission planning. Despite this growing dependence on software control and monitoring, there has been little to no consistent application of software safety practices and methodology to NASA's projects with safety-critical software. Meanwhile, academia and private industry have been stepping forward with procedures and standards for safety-critical systems and software, for example Dr. Nancy Leveson's book Safeware: System Safety and Computers. The NASA Software Safety Standard, originally published in 1997, was widely ignored due to its complexity and poor organization. It also focused on concepts rather than definite procedural requirements organized around a software project lifecycle. Led by the NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard has recently undergone a significant update. This new standard provides the procedures and guidelines for evaluating a project for safety criticality and then lays out the minimum project lifecycle requirements to assure the software is created, operated, and maintained in the safest possible manner. This update of the standard clearly delineates the minimum set of software safety requirements for a project without detailing the implementation for those

  14. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements (and an effective system for managing them), the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text, now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cy

  15. A Software Reference Architecture for Service-Oriented 3D Geovisualization Systems

    Directory of Open Access Journals (Sweden)

    Dieter Hildebrandt

    2014-12-01

    Full Text Available Modern 3D geovisualization systems (3DGeoVSs are complex and evolving systems that are required to be adaptable and leverage distributed resources, including massive geodata. This article focuses on 3DGeoVSs built based on the principles of service-oriented architectures, standards and image-based representations (SSI to address practically relevant challenges and potentials. Such systems facilitate resource sharing and agile and efficient system construction and change in an interoperable manner, while exploiting images as efficient, decoupled and interoperable representations. The software architecture of a 3DGeoVS and its underlying visualization model have strong effects on the system’s quality attributes and support various system life cycle activities. This article contributes a software reference architecture (SRA for 3DGeoVSs based on SSI that can be used to design, describe and analyze concrete software architectures with the intended primary benefit of an increase in effectiveness and efficiency in such activities. The SRA integrates existing, proven technology and novel contributions in a unique manner. As the foundation for the SRA, we propose the generalized visualization pipeline model that generalizes and overcomes expressiveness limitations of the prevalent visualization pipeline model. To facilitate exploiting image-based representations (IReps, the SRA integrates approaches for the representation, provisioning and styling of and interaction with IReps. Five applications of the SRA provide proofs of concept for the general applicability and utility of the SRA. A qualitative evaluation indicates the overall suitability of the SRA, its applications and the general approach of building 3DGeoVSs based on SSI.

  16. Software measurement standards for areal surface texture parameters: part 2—comparison of software

    International Nuclear Information System (INIS)

    Harris, P M; Smith, I M; Giusca, C; Leach, R K; Wang, C

    2012-01-01

    A companion paper in this issue describes reference software for the evaluation of areal surface texture parameters, focusing on the definitions of the parameters and giving details of the numerical algorithms employed in the software to implement those definitions. The reference software is used as a benchmark against which software in a measuring instrument can be compared. A data set is used as input to both the software under test and the reference software, and the results delivered by the software under test are compared with those provided by the reference software. This paper presents a comparison of the results returned by the reference software with those reported by proprietary software for surface texture measurement. Differences between the results can be used to identify where algorithms and software for evaluating the parameters differ. They might also be helpful in identifying where parameters are not sufficiently well-defined in standards. (paper)
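
    The comparison workflow can be sketched as follows, using Sa (the arithmetical mean height of ISO 25178, i.e. the mean absolute deviation of heights from the mean plane) as a simple example parameter. The second function stands in for the instrument software under test, and the synthetic surface and tolerance are illustrative.

        # Benchmarking a candidate implementation against reference software (sketch).
        import numpy as np

        def sa_reference(z):
            # Sa: mean absolute deviation of heights from the mean plane.
            return np.mean(np.abs(z - np.mean(z)))

        def sa_under_test(z):
            # Stand-in for the measuring instrument's software being benchmarked.
            return float(np.abs(z - z.mean()).mean())

        rng = np.random.default_rng(0)
        surface = rng.normal(0.0, 0.05, size=(512, 512))   # synthetic height map, in um

        ref, test = sa_reference(surface), sa_under_test(surface)
        assert abs(ref - test) <= 1e-9 * max(1.0, abs(ref)), "implementations disagree"
        print(f"Sa(reference) = {ref:.6f} um, Sa(under test) = {test:.6f} um")

    A disagreement outside the tolerance points either at a differing algorithm in the software under test or, as the paper notes, at a parameter definition that the standards leave underspecified.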

  17. Software: our quest for excellence. Honoring 50 years of software history, progress, and process

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    The Software Quality Forum was established by the Software Quality Assurance (SQA) Subcommittee, which serves as a technical advisory group on software engineering and quality initiatives and issues for DOE's quality managers. The forum serves as an opportunity for all those involved in implementing SQA programs to meet and share ideas and concerns. Participation from managers, quality engineers, and software professionals provides an ideal environment for identifying and discussing issues and concerns. The interaction provided by the forum contributes to the realization of a shared goal: a high-quality software product. Topics include: testing, software measurement, software surety, software reliability, SQA practices, assessments, software process improvement, certification and licensing of software professionals, CASE tools, software project management, inspections, and management's role in ensuring SQA. The bulk of this document consists of vugraphs. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.

  18. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  19. Software Engineering Laboratory Series: Proceedings of the Twentieth Annual Software Engineering Workshop

    Science.gov (United States)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  20. Strengthening Software Authentication with the ROSE Software Suite

    International Nuclear Information System (INIS)

    White, G

    2006-01-01

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects.
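
    ROSE itself operates on C and C++ sources, but the underlying idea, walking a program's abstract syntax tree and flagging constructs a human auditor should inspect, is language-neutral. The following sketch is an analogy built on Python's own ast module, not ROSE's API, with an invented rule set:

        # Language-neutral illustration of AST-based flagging of suspicious constructs.
        import ast

        SUSPICIOUS_CALLS = {"eval", "exec", "system"}   # illustrative rule set

        def audit(source: str):
            findings = []
            for node in ast.walk(ast.parse(source)):
                if isinstance(node, ast.Call):
                    name = getattr(node.func, "id", getattr(node.func, "attr", ""))
                    if name in SUSPICIOUS_CALLS:
                        findings.append((node.lineno, f"call to {name!r}"))
                if isinstance(node, ast.Compare) and isinstance(node.left, ast.Constant):
                    findings.append((node.lineno, "comparison with hard-coded constant"))
            return findings

        code = 'if "letmein" == password:\n    eval(user_input)\n'
        for line, msg in audit(code):
            print(f"line {line}: {msg}")

    A real authentication tool would, as the abstract argues, combine such common rules with project-specific ones and report the findings for human review rather than attempting fully automatic judgment.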