WorldWideScience

Sample records for model-driven software evolution

  1. Model-driven software migration a methodology

    CERN Document Server

    Wagner, Christian

    2014-01-01

    Today, reliable software systems are the basis of any business or company. The continuous further development of those systems is the central component in software evolution. It requires a huge amount of time, manpower, and financial resources. The challenges are the size, seniority, and heterogeneity of those software systems. Christian Wagner addresses software evolution: the inherent problems and uncertainties in the process. He presents a model-driven method which leads to a synchronization between source code and design. As a result, the model layer will be the central part in further evolution…

  2. Experiences in Teaching a Graduate Course on Model-Driven Software Development

    Science.gov (United States)

    Tekinerdogan, Bedir

    2011-01-01

    Model-driven software development (MDSD) aims to support the development and evolution of software-intensive systems using the basic concepts of model, metamodel, and model transformation. In parallel with ongoing academic research, MDSD is increasingly applied in industrial practice. After being accepted both by a broad community of…

  3. Model-driven software engineering

    NARCIS (Netherlands)

    Amstel, van M.F.; Brand, van den M.G.J.; Protic, Z.; Verhoeff, T.; Hamberg, R.; Verriet, J.

    2014-01-01

    Software plays an important role in designing and operating warehouses. However, traditional software engineering methods for designing warehouse software are not able to cope with the complexity, size, and increasing automation of modern warehouses. This chapter describes Model-Driven Software…

  4. Aspect-Oriented Model-Driven Software Product Line Engineering

    Science.gov (United States)

    Groher, Iris; Voelter, Markus

    Software product line engineering aims to reduce development time, effort, cost, and complexity by taking advantage of the commonality within a portfolio of similar products. The effectiveness of a software product line approach directly depends on how well feature variability within the portfolio is implemented and managed throughout the development lifecycle, from early analysis through maintenance and evolution. This article presents an approach that facilitates variability implementation, management, and tracing by integrating model-driven and aspect-oriented software development. Features are separated in models and composed using aspect-oriented composition techniques at the model level. Model transformations support the transition from problem-space to solution-space models. Aspect-oriented techniques enable the explicit expression and modularization of variability at the model, template, and code levels. The presented concepts are illustrated with a case study of a home automation system.
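
    The model-level composition described above can be sketched in miniature. The following toy Python snippet (class and operation names are invented for illustration, not taken from the article) merges a base home-automation model with a separately kept feature model by name matching:

```python
# Toy model-level composition: the base model and a feature "aspect" live in
# separate models; composition merges them by class name.
base = {"Light": {"on", "off"}, "Door": {"open", "close"}}
security_feature = {"Door": {"lock", "unlock"}}  # variability kept separate

def compose(base_model, feature_model):
    """Merge a feature model into a copy of the base model by name matching."""
    merged = {cls: set(ops) for cls, ops in base_model.items()}
    for cls, ops in feature_model.items():
        merged.setdefault(cls, set()).update(ops)
    return merged

product = compose(base, security_feature)
# the composed product model now contains the feature's operations as well
```

    Keeping the feature in its own model is what makes the variability traceable: dropping the feature from a product simply means omitting it from the compose step.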

  5. Envisioning the future of collaborative model-driven software engineering

    NARCIS (Netherlands)

    Di Ruscio, Davide; Franzago, Mirco; Malavolta, Ivano; Muccini, Henry

    2017-01-01

    The adoption of Model-driven Software Engineering (MDSE) to develop complex software systems in application domains like automotive and aerospace is being supported by the maturation of model-driven platforms and tools. However, empirical studies show that a wider adoption of MDSE technologies is…

  6. Model-driven dependability assessment of software systems

    CERN Document Server

    Bernardi, Simona; Petriu, Dorina C

    2013-01-01

    In this book, the authors present cutting-edge model-driven techniques for modeling and analysis of software dependability. Most of them are based on the use of UML as the software specification language. From the software system specification point of view, such techniques exploit the standard extension mechanisms of UML (i.e., UML profiling). UML profiles enable software engineers to add non-functional properties to the software model, in addition to the functional ones. The authors detail the state of the art on UML profile proposals for dependability specification and rigorously describe the techniques…

  7. Consistent Evolution of Software Artifacts and Non-Functional Models

    Science.gov (United States)

    2014-11-14

    Subject terms: Model-Driven Engineering (MDE), Software Performance Engineering (SPE), Change Propagation, Performance Antipatterns. Contact: Università degli Studi dell'Aquila, Via Vetoio, 67100 L'Aquila, Italy; Email: vittorio.cortellessa@univaq.it; Web: http://www.di.univaq.it/cortelle/

  8. Model Driven Software Development for Agricultural Robotics

    DEFF Research Database (Denmark)

    Larsen, Morten

    The design and development of agricultural robots consists of mechanical, electrical and software components. All these components must be designed and combined such that the overall goal of the robot is fulfilled. The design and development of these systems require collaboration between...... processing, control engineering, etc. This thesis proposes a Model-Driven Software Development based approach to model, analyse and partially generate the software implementation of an agricultural robot. Furthermore, guidelines for modelling the architecture of an agricultural robot are provided......, assisting with bridging the different engineering disciplines. Timing plays an important role in agricultural robotic applications; synchronisation of robot movement and implement actions is important in order to achieve precision spraying, mechanical weeding, individual feeding, etc. Discovering...

  9. Way of Working for Embedded Control Software using Model-Driven Development Techniques

    NARCIS (Netherlands)

    Bezemer, M.M.; Groothuis, M.A.; Brugali, D.; Schlegel, C.; Broenink, Johannes F.

    2011-01-01

    Embedded targets normally do not have many resources to aid developing and debugging the software, so model-driven development (MDD) is used for designing embedded software with a `first time right' approach. For such an approach, a good way of working (WoW) is required for embedded software…

  10. Model-driven design of simulation support for the TERRA robot software tool suite

    NARCIS (Netherlands)

    Lu, Zhou; Bezemer, M.M.; Broenink, Johannes F.

    2015-01-01

    Model-Driven Development (MDD) – based on the concepts of model, meta-model and model transformation – is an approach to develop predictable and reliable software for Cyber-Physical Systems (CPS). The work presented here concerns a methodology to design simulation software based on MDD techniques…

  11. Model Driven Engineering

    Science.gov (United States)

    Gaševic, Dragan; Djuric, Dragan; Devedžic, Vladan

    A relevant initiative from the software engineering community called Model Driven Engineering (MDE) is being developed in parallel with the Semantic Web (Mellor et al. 2003a). The MDE approach to software development suggests that one should first develop a model of the system under study, which is then transformed into the real thing (i.e., an executable software entity). The most important research initiative in this area is the Model Driven Architecture (MDA), which is being developed under the umbrella of the Object Management Group (OMG). This chapter describes the basic concepts of this software engineering effort.
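
    The core MDE idea of transforming an abstract model into "the real thing" can be illustrated with a minimal model-to-text transformation. This Python sketch uses an invented toy metamodel, not OMG MDA machinery; the names are hypothetical:

```python
from dataclasses import dataclass

# A minimal metamodel: a class with typed attributes.
@dataclass
class Attribute:
    name: str
    type: str

@dataclass
class ClassModel:
    name: str
    attributes: list

def to_code(model: ClassModel) -> str:
    """Model-to-text transformation: emit Python source from the model."""
    params = ", ".join(f"{a.name}: {a.type}" for a in model.attributes)
    body = "\n".join(f"        self.{a.name} = {a.name}" for a in model.attributes)
    return f"class {model.name}:\n    def __init__(self, {params}):\n{body}"

order = ClassModel("Order", [Attribute("order_id", "int"), Attribute("total", "float")])
source = to_code(order)
ns = {}
exec(source, ns)                     # the "real thing": an executable class
item = ns["Order"](order_id=1, total=9.5)
```

    Real MDA tooling layers metamodels, profiles, and standardized transformation languages on top of this basic model-in, code-out pattern.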

  12. A Pattern Language for the Evolution of Component-based Software Architectures

    DEFF Research Database (Denmark)

    Ahmad, Aakash; Jamshidi, Pooyan; Pahl, Claus

    2013-01-01

    Modern software systems are prone to continuous evolution under frequently varying requirements. Architecture-centric software evolution enables change in a system’s structure and behavior while maintaining a global view of the software to address evolution-centric tradeoffs. Lehman’s law...... evolution problems. We propose that the architectural evolution process requires explicit evolution-centric knowledge – that can be discovered, shared, and reused – to anticipate and guide change management. Therefore, we present a pattern language as a collection of interconnected change patterns......) as a complementary and integrated phase to facilitate reuse-driven architecture change execution (pattern language application). Reuse-knowledge in the proposed pattern language is expressed as a formalised collection of interconnected patterns. Individual patterns in the language build on each other to facilitate...

  13. Architecture-driven Migration of Legacy Systems to Cloud-enabled Software

    DEFF Research Database (Denmark)

    Ahmad, Aakash; Babar, Muhammad Ali

    2014-01-01

    of legacy systems to cloud computing. The framework leverages software reengineering concepts that aim to recover the architecture from legacy source code. Then the framework exploits software evolution concepts to support architecture-driven migration of legacy systems to cloud-based architectures....... The Legacy-to-Cloud Migration Horseshoe comprises four processes: (i) architecture migration planning, (ii) architecture recovery and consistency, (iii) architecture transformation and (iv) architecture-based development of cloud-enabled software. We aim to discover, document and apply the migration...

  14. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skills…

  15. Software evolution and maintenance

    CERN Document Server

    Tripathy, Priyadarshi

    2014-01-01

    Software Evolution and Maintenance: A Practitioner's Approach is an accessible textbook for students and professionals, which collates the advances in software development and provides the most current models and techniques in maintenance. It explains two maintenance standards (IEEE/EIA 1219 and ISO/IEC 14764), discusses several commercial reverse- and domain-engineering toolkits, and bases its information on the IEEE SWEBOK (Software Engineering Body of Knowledge). Slides for instructors are available online.

  16. Economics-driven software architecture

    CERN Document Server

    Mistrik, Ivan; Kazman, Rick; Zhang, Yuanyuan

    2014-01-01

    Economics-driven Software Architecture presents a guide for engineers and architects who need to understand the economic impact of architecture design decisions: the long-term and strategic viability, cost-effectiveness, and sustainability of applications and systems. Economics-driven software development can increase quality, productivity, and profitability, but comprehensive knowledge is needed to understand the architectural challenges involved in dealing with the development of large, architecturally challenging systems in an economic way. This book covers how to apply economic considerations…

  17. Software architecture evolution

    DEFF Research Database (Denmark)

    Barais, Olivier; Le Meur, Anne-Francoise; Duchien, Laurence

    2008-01-01

    Software architectures must frequently evolve to cope with changing requirements, and this evolution often implies integrating new concerns. Unfortunately, when the new concerns are crosscutting, existing architecture description languages provide little or no support for this kind of evolution....... The software architect must modify multiple elements of the architecture manually, which risks introducing inconsistencies. This chapter provides an overview, comparison and detailed treatment of the various state-of-the-art approaches to describing and evolving software architectures. Furthermore, we discuss...... one particular framework named TranSAT, which addresses the above problems of software architecture evolution. TranSAT provides a new element in the software architecture description language, called an architectural aspect, for describing new concerns and their integration into an existing...

  18. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  19. Modeling Software Evolution using Algebraic Graph Rewriting

    NARCIS (Netherlands)

    Ciraci, Selim; van den Broek, Pim

    We show how evolution requests can be formalized using algebraic graph rewriting. In particular, we present a way to convert UML class diagrams to colored graphs. Since changes in software may affect the relations between the methods of classes, our colored graph representation also employs the…
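
    A rough sketch of the idea, under a deliberately minimal encoding (the node colors and the "move method" rule below are invented for illustration, not the authors' formalism): classes and methods become colored nodes, and an evolution request becomes a rewrite rule that matches a left-hand side and rewrites the edges:

```python
# Colored graph encoding of a class diagram: node -> color, plus labeled edges.
nodes = {"Order": "class", "Invoice": "class", "total": "method"}
edges = {("Order", "total", "owns")}

def move_method(nodes, edges, method, src, dst):
    """Rewrite rule for a 'move method' evolution request.
    LHS match: dst is colored 'class' and the edge (src, method, 'owns') exists."""
    if nodes.get(dst) != "class" or (src, method, "owns") not in edges:
        raise ValueError("left-hand side does not match")
    edges.discard((src, method, "owns"))   # delete the matched edge
    edges.add((dst, method, "owns"))       # add the rewritten edge

move_method(nodes, edges, "total", "Order", "Invoice")
```

    The benefit of the algebraic framing is that applicability (does the LHS match?) and the effect of a change are both checked mechanically rather than by hand-editing the diagram.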

  20. Software Maintenance and Evolution: The Implication for Software ...

    African Journals Online (AJOL)

    Software Maintenance and Evolution: The Implication for Software Development. ... Software maintenance is the process of modifying existing operational software by correcting errors, ...

  1. Model-integrating software components engineering flexible software systems

    CERN Document Server

    Derakhshanmanesh, Mahdi

    2015-01-01

    In his study, Mahdi Derakhshanmanesh builds on the state of the art in modeling by proposing to integrate models into running software at the component level without translating them to code. Such so-called model-integrating software exploits all advantages of models: models implicitly support a good separation of concerns, they are self-documenting and thus improve understandability and maintainability, and, in contrast to model-driven approaches, there is no longer a synchronization problem between the models and the code generated from them. Using model-integrating components, software will be…

  2. Constraint driven software design: an escape from the waterfall model

    NARCIS (Netherlands)

    de Hoog, Robert; de Jong, Anthonius J.M.; de Vries, Frits

    1994-01-01

    This paper presents the principles of a development methodology for software design. The methodology is based on a nonlinear, product-driven approach that integrates quality aspects. The principles are made more concrete in two examples: one for developing educational simulations and one for…

  3. Software evolution with XVCL

    DEFF Research Database (Denmark)

    Zhang, Weishan; Jarzabek, Stan; Zhang, Hongyu

    2004-01-01

    This chapter introduces software evolution with XVCL (XML-based Variant Configuration Language), which is an XML-based metaprogramming technique. As the software evolves, a large number of variants may arise, especially when such kinds of evolution are related to multiple platforms as shown in our...... case study. Handling variants and tracing the impact of variants across the development lifecycle is a challenge. This chapter shows how we can maintain different versions of software in a reuse-based way....
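
    The variant-handling idea can be sketched with a tiny XML-based composer. Note that the XML vocabulary below is hypothetical and invented for illustration; it is not real XVCL syntax:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML vocabulary (NOT real XVCL): a code template with a
# variation point whose option is selected at composition time.
template = """<frame>
  <text>connect(</text>
  <variant name="platform">
    <option value="win">WinSocket()</option>
    <option value="linux">PosixSocket()</option>
  </variant>
  <text>);</text>
</frame>"""

def compose(xml_text, bindings):
    """Emit one variant of the templated code for the given bindings."""
    out = []
    for node in ET.fromstring(xml_text):
        if node.tag == "text":
            out.append(node.text)
        elif node.tag == "variant":
            chosen = bindings[node.get("name")]
            out.extend(opt.text for opt in node if opt.get("value") == chosen)
    return "".join(out)

linux_code = compose(template, {"platform": "linux"})
```

    One template plus a set of bindings yields each platform variant, so a platform-specific change is traced to either the shared template or a single option.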

  4. A Constructive Approach To Software Evolution

    NARCIS (Netherlands)

    Ciraci, S.; van den Broek, P.M.; Aksit, Mehmet

    2007-01-01

    In many software design and evaluation techniques, either the software evolution problem is not systematically elaborated, or only the impact of evolution is considered. Thus, most of the time software is changed by editing the components of the software system, i.e. breaking down the software…

  5. Constraint-Driven Software Design: An Escape from the Waterfall Model.

    Science.gov (United States)

    de Hoog, Robert; And Others

    1994-01-01

    Presents the principles of a development methodology for software design based on a nonlinear, product-driven approach that integrates quality aspects. Two examples are given to show that the flexibility needed for building high quality systems leads to integrated development environments in which methodology, product, and tools are closely…

  6. Managing Software Process Evolution

    DEFF Research Database (Denmark)

    This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines...... the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides...... essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation...

  7. Software maintenance and evolution and automated software engineering

    NARCIS (Netherlands)

    Carver, Jeffrey C.; Serebrenik, Alexander

    2018-01-01

    This issue's column reports on the 33rd International Conference on Software Maintenance and Evolution and 32nd International Conference on Automated Software Engineering. Topics include flaky tests, technical debt, QA bots, and regular expressions.

  8. Semantic Web and Model-Driven Engineering

    CERN Document Server

    Parreiras, Fernando S

    2012-01-01

    The next enterprise computing era will rely on the synergy between both technologies: the semantic web and model-driven software development (MDSD). The semantic web organizes system knowledge in conceptual domains according to its meaning. It addresses various enterprise computing needs by identifying, abstracting and rationalizing commonalities, and checking for inconsistencies across system specifications. On the other hand, model-driven software development is closing the gap among business requirements, designs and executables by using domain-specific languages with custom-built syntax and semantics…

  9. Constraint driven software design: an escape from the waterfall model

    OpenAIRE

    de Hoog, Robert; de Jong, Anthonius J.M.; de Vries, Frits

    1994-01-01

    This paper presents the principles of a development methodology for software design. The methodology is based on a nonlinear, product-driven approach that integrates quality aspects. The principles are made more concrete in two examples: one for developing educational simulations and one for developing expert systems. It is shown that the flexibility needed for building high quality systems leads to integrated development environments in which methodology, product and tools are closely attuned…

  10. Aligning Requirements-Driven Software Processes with IT Governance

    OpenAIRE

    Nguyen Huynh Anh, Vu; Kolp, Manuel; Heng, Samedi; Wautelet, Yves

    2017-01-01

    Requirements Engineering is closely intertwined with Information Technology (IT) Governance. Aligning IT Governance principles with Requirements-Driven Software Processes allows them to propose governance and management rules for software development to cope with stakeholders’ requirements and expectations. Typically, the goal of IT Governance in software engineering is to ensure that the results of a software organization’s business processes meet the strategic requirements of the organization...

  11. Architecture design in global and model-centric software development

    NARCIS (Netherlands)

    Heijstek, Werner

    2012-01-01

    This doctoral dissertation describes a series of empirical investigations into representation, dissemination and coordination of software architecture design in the context of global software development. A particular focus is placed on model-centric and model-driven software development.

  12. Evolution of Secondary Software Businesses: Understanding Industry Dynamics

    Science.gov (United States)

    Tyrväinen, Pasi; Warsta, Juhani; Seppänen, Veikko

    The primary software industry originates from IBM's decision to unbundle software-related computer system development activities to external partners. This kind of outsourcing of an enterprise's internal software development activity is a common means to start a new software business serving a vertical software market. It combines knowledge of the vertical market process with competence in software development. In this research, we present and analyze the key figures of the Finnish secondary software industry, in order to quantify its interaction with the primary software industry during the period 2000-2003. On the basis of the empirical data, we present a model for the evolution of a secondary software business, which makes the industry dynamics explicit. It represents the shift from internal software developed for competitive advantage to development of products supporting standard business processes on top of standardized technologies. We also discuss the implications for software business strategies in each phase.

  13. Test-driven modeling of embedded systems

    DEFF Research Database (Denmark)

    Munck, Allan; Madsen, Jan

    2015-01-01

    To benefit maximally from model-based systems engineering (MBSE) trustworthy high quality models are required. From the software disciplines it is known that test-driven development (TDD) can significantly increase the quality of the products. Using a test-driven approach with MBSE may have...... a similar positive effect on the quality of the system models and the resulting products and may therefore be desirable. To define a test-driven model-based systems engineering (TD-MBSE) approach, we must define this approach for numerous sub disciplines such as modeling of requirements, use cases...... suggest that our method provides a sound foundation for rapid development of high quality system models....

  14. Static Checking of Interrupt-driven Software

    DEFF Research Database (Denmark)

    Brylow, Dennis; Damgaard, Niels; Palsberg, Jens

    2001-01-01

    at the assembly level. In this paper we present the design and implementation of a static checker for interrupt-driven Z86-based software with hard real-time requirements. For six commercial microcontrollers, our checker has produced upper bounds on interrupt latencies and stack sizes, as well as verified...
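
    The flavor of such a static check can be sketched as follows. This toy Python analysis (handler names, stack costs, and the preemption model are all invented, and far simpler than an assembly-level Z86 checker) bounds worst-case stack depth under interrupt preemption:

```python
# Toy static stack-bound analysis for interrupt-driven code: each handler has
# a local stack need (in words), and a preemption relation records which
# interrupts may fire while a handler runs.
local_need = {"main": 4, "irq_timer": 3, "irq_uart": 2}
may_preempt = {
    "main": ["irq_timer", "irq_uart"],  # interrupts enabled in main
    "irq_timer": [],                    # interrupts disabled inside handlers
    "irq_uart": [],
}

def worst_stack(handler):
    """Upper bound on stack depth: local frames plus the deepest preemption
    chain; each preemption is charged one extra word of saved context."""
    deepest = max((1 + worst_stack(p) for p in may_preempt[handler]), default=0)
    return local_need[handler] + deepest

bound = worst_stack("main")  # 4 + (1 + max(3, 2)) = 8 words
```

    A real checker derives both the stack costs and the preemption relation from the binary and must also handle re-enabled interrupts and recursion, but the bound-by-worst-chain structure is the same.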

  15. The evolution of CACSD tools-a software engineering perspective

    DEFF Research Database (Denmark)

    Ravn, Ole; Szymkat, Maciej

    1992-01-01

    The earlier evolution of computer-aided control system design (CACSD) tools is discussed from a software engineering perspective. A model of the design process is presented as the basis for principles and requirements of future CACSD tools. Combinability, interfacing in memory, and an open...... workspace are seen as important concepts in CACSD. Some points are made about the problem of buy or make when new software is required, and the idea of buy and make is put forward. Emphasis is put on the time perspective and the life cycle of the software...

  16. Consistent model driven architecture

    Science.gov (United States)

    Niepostyn, Stanisław J.

    2015-09-01

    The goal of the MDA is to produce software systems from abstract models in a way where human interaction is restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language. Subsequently, verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.
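
    One kind of inter-diagram consistency rule can be sketched as follows (the diagram contents are invented; the authors' actual rules operate on UML abstract syntax): every message in a sequence diagram must name an operation declared by the receiver's class in the class diagram:

```python
# Two views of the same system: a class diagram (class -> declared operations)
# and a sequence diagram as (receiver, message) pairs.
class_diagram = {"Cart": {"add_item", "checkout"}, "Payment": {"charge"}}
sequence_diagram = [("Cart", "add_item"), ("Cart", "checkout"), ("Payment", "refund")]

def inconsistencies(classes, messages):
    """Return every message that names an operation its receiver lacks."""
    return [(recv, op) for recv, op in messages
            if op not in classes.get(recv, set())]

violations = inconsistencies(class_diagram, sequence_diagram)
# 'refund' is sent to Payment but never declared: a requirements error caught
# before any code is generated
```

    Catching such mismatches at the model stage is what makes downstream automatic generation of complete applications feasible.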

  17. Model Driven Architecture: Foundations and Applications

    NARCIS (Netherlands)

    Rensink, Arend

    The OMG's Model Driven Architecture (MDA) initiative has been the focus of much attention in both academia and industry, due to its promise of more rapid and consistent software development through the increased use of models. In order for MDA to reach its full potential, the ability to manipulate…

  18. A Comparison and Evaluation of Real-Time Software Systems Modeling Languages

    Science.gov (United States)

    Evensen, Kenneth D.; Weiss, Kathryn Anne

    2010-01-01

    A model-driven approach to real-time software systems development enables the conceptualization of software, fostering a more thorough understanding of its often complex architecture and behavior while promoting the documentation and analysis of concerns common to real-time embedded systems such as scheduling, resource allocation, and performance. Several modeling languages have been developed to assist in the model-driven software engineering effort for real-time systems, and these languages are beginning to gain traction with practitioners throughout the aerospace industry. This paper presents a survey of several real-time software system modeling languages, namely the Architectural Analysis and Design Language (AADL), the Unified Modeling Language (UML), Systems Modeling Language (SysML), the Modeling and Analysis of Real-Time Embedded Systems (MARTE) UML profile, and the AADL for UML profile. Each language has its advantages and disadvantages, and in order to adequately describe a real-time software system's architecture, a complementary use of multiple languages is almost certainly necessary. This paper aims to explore these languages in the context of understanding the value each brings to the model-driven software engineering effort and to determine if it is feasible and practical to combine aspects of the various modeling languages to achieve more complete coverage in architectural descriptions. To this end, each language is evaluated with respect to a set of criteria such as scope, formalisms, and architectural coverage. An example is used to help illustrate the capabilities of the various languages.

  19. A menu-driven software package of Bayesian nonparametric (and parametric) mixed models for regression analysis and density estimation.

    Science.gov (United States)

    Karabatsos, George

    2017-02-01

    Most applied statistics involves regression analysis of data. In practice, it is important to specify a regression model that has minimal assumptions which are not violated by the data, to ensure that statistical inferences from the model are informative and not misleading. This paper presents a stand-alone and menu-driven software package, Bayesian Regression: Nonparametric and Parametric Models, constructed with MATLAB Compiler. Currently, this package gives the user a choice from 83 Bayesian models for data analysis. They include 47 Bayesian nonparametric (BNP) infinite-mixture regression models; 5 BNP infinite-mixture models for density estimation; and 31 normal random effects models (HLMs), including normal linear models. Each of the 78 regression models handles either a continuous, binary, or ordinal dependent variable, and can handle multi-level (grouped) data. All 83 Bayesian models can handle the analysis of weighted observations (e.g., for meta-analysis), and the analysis of left-censored, right-censored, and/or interval-censored data. Each BNP infinite-mixture model has a mixture distribution assigned one of various BNP prior distributions, including priors defined by either the Dirichlet process, Pitman-Yor process (including the normalized stable process), beta (two-parameter) process, normalized inverse-Gaussian process, geometric weights prior, dependent Dirichlet process, or the dependent infinite-probits prior. The software user can mouse-click to select a Bayesian model and perform data analysis via Markov chain Monte Carlo (MCMC) sampling. After the sampling completes, the software automatically opens text output that reports MCMC-based estimates of the model's posterior distribution and model predictive fit to the data. Additional text and/or graphical output can be generated by mouse-clicking other menu options. This includes output of MCMC convergence analyses, and estimates of the model's posterior predictive distribution, for selected…
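
    The MCMC sampling that the package automates can be illustrated in miniature. This Python sketch runs a random-walk Metropolis sampler for the mean of a normal model with known variance, a far simpler model than any of the package's 83; the data and tuning values are invented:

```python
import math
import random

def metropolis(log_post, x0, steps=5000, scale=0.5, seed=0):
    """Random-walk Metropolis: propose x' ~ N(x, scale); accept with
    probability min(1, post(x') / post(x))."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(steps):
        prop = x + rng.gauss(0.0, scale)
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

data = [1.8, 2.1, 2.4, 1.9, 2.2]
# log posterior for the mean of a Normal(mu, 1) likelihood under a flat prior
log_post = lambda mu: -0.5 * sum((y - mu) ** 2 for y in data)
draws = metropolis(log_post, 0.0)
posterior_mean = sum(draws[1000:]) / len(draws[1000:])  # near the data mean
```

    Discarding the first 1000 draws as burn-in mirrors the convergence checks the package reports automatically.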

  20. A flexible modelling software for data acquisition

    International Nuclear Information System (INIS)

    Shu Yantai; Chen Yanhui; Yang Songqi; Liu Genchen

    1992-03-01

    A flexible modelling software for data acquisition is based on an event-driven simulator. It can be used to simulate a wide variety of systems which can be modelled as open queuing networks. The main feature of the software is its flexibility in evaluating the performance of various data acquisition systems, whether pulsed or not. The flexible features of this software are as follows: the user can choose the number of processors in the model and the route which every job takes through the model; the service rate of a processor is automatically adapted; the simulator has a pipeline mechanism; a job can be divided into several segments; and a processor may be used as a compression component, etc. Some modelling techniques and applications of this software in plasma physics laboratories are also presented.
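
    The event-driven queuing idea can be sketched in a few lines. This Python toy (rates and structure invented for illustration, a single server rather than a configurable network) simulates an open queue and reports the mean waiting time:

```python
import heapq
import random

def simulate(arrival_rate, service_rate, n_jobs, seed=0):
    """Event-driven simulation of a single-server open queue (FIFO)."""
    rng = random.Random(seed)
    events, t = [], 0.0
    for job in range(n_jobs):                  # schedule Poisson arrivals
        t += rng.expovariate(arrival_rate)
        heapq.heappush(events, (t, job))
    server_free_at, waits = 0.0, []
    while events:
        now, job = heapq.heappop(events)       # next event in time order
        start = max(now, server_free_at)       # queue if the server is busy
        waits.append(start - now)
        server_free_at = start + rng.expovariate(service_rate)
    return sum(waits) / len(waits)

mean_wait = simulate(arrival_rate=0.8, service_rate=1.0, n_jobs=10_000)
# for an M/M/1 queue the theoretical mean wait is lambda / (mu * (mu - lambda))
```

    The flexibility described above amounts to parameterizing this loop: more servers, user-chosen routes between them, and jobs split into pipelined segments.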

  1. Quality management using model-driven engineering: an overview

    OpenAIRE

    Ruiz-Rube, Iván; Escalona, María José

    2014-01-01

    Quality Management (QM) is one of the critical points of any software development process. In recent years, several proposals have emerged on this issue, mainly with regard to maturity models, quality standards and best-practice collections. In addition, Model Driven Engineering (MDE) aims to build software systems through the construction and transformation of models. However, MDE may also be used for other tasks. In this poster, we summarize the main contributions about…

  2. Evaluation of Model Driven Development of Safety Critical Software in the Nuclear Power Plant I and C system

    International Nuclear Information System (INIS)

    Jung, Jae Cheon; Chang, Hoon Seon; Chang, Young Woo; Kim, Jae Hack; Sohn, Se Do

    2005-01-01

    The major issues for safety-critical software are formalism and V and V. Implementing these two characteristics in safety-critical software will greatly enhance the quality of the software product. Structure-based development requires many output documents, from the requirements phase to the testing phase, yet the requirements analysis phase is often omitted. According to the Standish Group report in 2001, 49% of software projects are cancelled before completion or never implemented. In addition, 23% are completed and become operational, but over budget, over the time estimate, and with fewer features and functions than initially specified. They identified ten success factors. Among them, firm basic requirements and formal methods are technically achievable factors, while the remaining eight are management related. Misunderstanding of requirements due to a lack of communication between the design engineer and the verification engineer causes unexpected results such as functional errors in the system. Safety-critical software shall comply with such characteristics as modularity, simplicity, minimized use of subroutines, and exclusion of interrupt routines. In addition, crosslink faults and erroneous functions shall be eliminated, and ease of repair after installation shall be achieved as well. In consideration of the above issues, we evaluate model-driven development (MDD) methods for nuclear I and C systems software. For qualitative analysis, the unified modeling language (UML), functional block language (FBL) and the safety critical application environment (SCADE) are tested for the above characteristics.

  3. Distributed simulation a model driven engineering approach

    CERN Document Server

    Topçu, Okan; Oğuztüzün, Halit; Yilmaz, Levent

    2016-01-01

    Backed by substantive case studies, the novel approach to software engineering for distributed simulation outlined in this text demonstrates the potent synergies between model-driven techniques, simulation, intelligent agents, and computer systems development.

  4. A File Based Visualization of Software Evolution

    NARCIS (Netherlands)

    Voinea, S.L.; Telea, A.

    2006-01-01

    Software Configuration Management systems are important instruments for supporting development of large software projects. They accumulate large amounts of evolution data that can be used for process accounting and auditing. We study how visualization can help developers and managers to get insight

  5. NASA's Advanced Multimission Operations System: A Case Study in Formalizing Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2011-01-01

    All software systems of significant size and longevity eventually undergo changes to their basic architectural structure. Such changes may be prompted by evolving requirements, changing technology, or other reasons. Whatever the cause, software architecture evolution is commonplace in real world software projects. Recently, software architecture researchers have begun to study this phenomenon in depth. However, this work has suffered from problems of validation; research in this area has tended to make heavy use of toy examples and hypothetical scenarios and has not been well supported by real world examples. To help address this problem, I describe an ongoing effort at the Jet Propulsion Laboratory to re-architect the Advanced Multimission Operations System (AMMOS), which is used to operate NASA's deep-space and astrophysics missions. Based on examination of project documents and interviews with project personnel, I describe the goals and approach of this evolution effort and then present models that capture some of the key architectural changes. Finally, I demonstrate how approaches and formal methods from my previous research in architecture evolution may be applied to this evolution, while using languages and tools already in place at the Jet Propulsion Laboratory.

  6. The Ragnarok Architectural Software Configuration Management Model

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    1999-01-01

    The architecture is the fundamental framework for designing and implementing large scale software, and the ability to trace and control its evolution is essential. However, many traditional software configuration management tools view 'software' merely as a set of files, not as an architecture.... This introduces an unfortunate impedance mismatch between the design domain (architecture level) and the configuration management domain (file level). This paper presents a software configuration management model that allows tight version control and configuration management of the architecture of a software system...

  7. SOA-Driven Business-Software Alignment

    NARCIS (Netherlands)

    Shishkov, Boris; van Sinderen, Marten J.; Quartel, Dick; Tsai, W.; Chung, J.; Younas, M.

    2006-01-01

    The alignment of business processes and their supporting application software is a major concern during the initial software design phases. This paper proposes a design approach addressing this problem of business-software alignment. The approach takes an initial business model as a basis in

  8. Test-driven verification/validation of model transformations

    Institute of Scientific and Technical Information of China (English)

    László LENGYEL; Hassan CHARAF

    2015-01-01

    Why is it important to verify/validate model transformations? The motivation is to improve the quality of the transformations, and therefore the quality of the generated software artifacts. Verified/validated model transformations make it possible to ensure certain properties of the generated software artifacts. In this way, verification/validation methods can guarantee different requirements stated by the actual domain against the generated/modified/optimized software products. For example, a verified/validated model transformation can ensure the preservation of certain properties during the model-to-model transformation. This paper emphasizes the necessity of methods that make model transformations verified/validated, discusses the different scenarios of model transformation verification and validation, and introduces the principles of a novel test-driven method for verifying/validating model transformations. We provide a solution that makes it possible to automatically generate test input models for model transformations. Furthermore, we collect and discuss the current open issues in the field of verification/validation of model transformations.
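    The automatic generation of test input models described above can be illustrated with a toy sketch (not the authors' tool): a random generator produces input models, a stand-in transformation is applied, and an oracle checks a property the transformation must preserve. All names and the transformation itself are hypothetical.

    ```python
    import random

    def transform(model):
        """Stand-in model-to-model transformation under test:
        prefixes every class name and copies relations unchanged."""
        return {"classes": ["Gen_" + c for c in model["classes"]],
                "relations": list(model["relations"])}

    def generate_input_model(rng, max_classes=6):
        """Automatically generate a random test input model."""
        n = rng.randint(1, max_classes)
        classes = [f"C{i}" for i in range(n)]
        relations = [(rng.choice(classes), rng.choice(classes))
                     for _ in range(rng.randint(0, n))]
        return {"classes": classes, "relations": relations}

    def check_preserves_structure(model, result):
        """Validation oracle: element counts must be preserved."""
        return (len(result["classes"]) == len(model["classes"])
                and len(result["relations"]) == len(model["relations"]))

    rng = random.Random(42)
    for _ in range(100):
        m = generate_input_model(rng)
        assert check_preserves_structure(m, transform(m))
    ```

    A real tool would generate instances of the transformation's source metamodel rather than plain dictionaries, but the generate/transform/check loop is the same.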

  9. Dependability modeling and assessment in UML-based software development.

    Science.gov (United States)

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results.
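    The modeling→analysis→assessment loop above ends with solving a formal model and feeding numbers back to designers. As a minimal stand-in for that analysis step, the sketch below computes the steady-state availability of a single repairable component from a two-state continuous-time Markov chain; the failure and repair rates are invented for illustration, and the actual approach uses a full DSPN solver rather than this closed form.

    ```python
    # Two-state CTMC: up -> down at failure rate lam, down -> up at
    # repair rate mu. Steady-state availability follows from the
    # balance equation pi_up * lam = pi_down * mu with pi_up + pi_down = 1.
    def availability(lam, mu):
        return mu / (lam + mu)

    lam = 1e-4   # failures per hour (assumed, illustrative)
    mu = 1e-1    # repairs per hour (assumed, illustrative)
    print(f"steady-state availability: {availability(lam, mu):.6f}")
    ```

    For a component that never fails (`lam = 0`), the formula correctly yields availability 1.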

  10. Model-Driven Development for PDS4 Software and Services

    Science.gov (United States)

    Hughes, J. S.; Crichton, D. J.; Algermissen, S. S.; Cayanan, M. D.; Joyner, R. S.; Hardman, S. H.; Padams, J. H.

    2018-04-01

    PDS4 data product labels provide the information necessary for processing the referenced digital object. However, significantly more information is available in the PDS4 Information Model. This additional information is made available to both software and services to aid configuration, promote resiliency, and improve interoperability.

  11. Editorial - Special Issue on Model-driven Service-oriented architectures

    NARCIS (Netherlands)

    Andrade Almeida, João; Ferreira Pires, Luis; van Sinderen, Marten J.; Steen, M.W.A.

    2009-01-01

    Model-driven approaches to software development have proliferated in recent years owing to the availability of techniques based on metamodelling and model transformations, such as the meta-object facility (MOF) and the query view transformation (QVT) standards. During the same period,

  12. The Evolution of Software in High Energy Physics

    International Nuclear Information System (INIS)

    Brun, René

    2012-01-01

    The paper reviews the evolution of the software in High Energy Physics from the time of expensive mainframes to grids and clouds systems using thousands of multi-core processors. It focuses on the key parameters or events that have shaped the current software infrastructure.

  13. Evolution of the ATLAS Software Framework towards Concurrency

    CERN Document Server

    Jones, Roger; The ATLAS collaboration; Leggett, Charles; Wynne, Benjamin

    2015-01-01

    The ATLAS experiment has successfully used its Gaudi/Athena software framework for data taking and analysis during the first LHC run, with billions of events successfully processed. However, the design of Gaudi/Athena dates from the early 2000s, and the software and the physics code have been written using a single-threaded, serial design. This programming model has increasing difficulty in exploiting the potential of current CPUs, which offer their best performance only through taking full advantage of multiple cores and wide vector registers. Future CPU evolution will intensify this trend, with core counts increasing and memory per core falling. Maximising performance per watt will be a key metric, so all of these cores must be used as efficiently as possible. In order to address the deficiencies of the current framework, ATLAS has embarked upon two projects: first, a practical demonstration of the use of multi-threading in our reconstruction software, using the GaudiHive framework; second, an exercise to gather r...

  14. Stochastic modeling indicates that aging and somatic evolution in the hematopoietic system are driven by non-cell-autonomous processes.

    Science.gov (United States)

    Rozhok, Andrii I; Salstrom, Jennifer L; DeGregori, James

    2014-12-01

    Age-dependent tissue decline and increased cancer incidence are widely accepted to be rate-limited by the accumulation of somatic mutations over time. Current models of carcinogenesis are dominated by the assumption that oncogenic mutations have defined advantageous fitness effects on recipient stem and progenitor cells, promoting and rate-limiting somatic evolution. However, this assumption is markedly discrepant with evolutionary theory, whereby fitness is a dynamic property of a phenotype imposed upon and widely modulated by environment. We computationally modeled dynamic microenvironment-dependent fitness alterations in hematopoietic stem cells (HSC) within the Sprengel-Liebig system known to govern evolution at the population level. Our model for the first time integrates real data on age-dependent dynamics of HSC division rates, pool size, and accumulation of genetic changes and demonstrates that somatic evolution is not rate-limited by the occurrence of mutations, but instead results from aged microenvironment-driven alterations in the selective/fitness value of previously accumulated genetic changes. Our results are also consistent with evolutionary models of aging and thus oppose both somatic mutation-centric paradigms of carcinogenesis and tissue functional decline. In total, we demonstrate that aging directly promotes HSC fitness decline and somatic evolution via non-cell-autonomous mechanisms.

  15. The Evolution of Software and Its Impact on Complex System Design in Robotic Spacecraft Embedded Systems

    Science.gov (United States)

    Butler, Roy

    2013-01-01

    The growth in computer hardware performance, coupled with reduced energy requirements, has led to a rapid expansion of the resources available to software systems, driving them towards greater logical abstraction, flexibility, and complexity. This shift in focus from compacting functionality into a limited field towards developing layered, multi-state architectures in a grand field has both driven and been driven by the history of embedded processor design in the robotic spacecraft industry. The combinatorial growth of interprocess conditions is accompanied by benefits (concurrent development, situational autonomy, and evolution of goals) and drawbacks (late integration, non-deterministic interactions, and multifaceted anomalies) in achieving mission success, as illustrated by the case of the Mars Reconnaissance Orbiter. Approaches to optimizing the benefits while mitigating the drawbacks have taken the form of the formalization of requirements, modular design practices, extensive system simulation, and spacecraft data trend analysis. The growth of hardware capability and software complexity can be expected to continue, with future directions including stackable commodity subsystems, computer-generated algorithms, runtime reconfigurable processors, and greater autonomy.

  16. How Evolution May Work Through Curiosity-Driven Developmental Process.

    Science.gov (United States)

    Oudeyer, Pierre-Yves; Smith, Linda B

    2016-04-01

    Infants' own activities create and actively select their learning experiences. Here we review recent models of embodied information seeking and curiosity-driven learning and show that these mechanisms have deep implications for development and evolution. We discuss how these mechanisms yield self-organized epigenesis with emergent ordered behavioral and cognitive developmental stages. We describe a robotic experiment that explored the hypothesis that progress in learning, in and for itself, generates intrinsic rewards: The robot learners probabilistically selected experiences according to their potential for reducing uncertainty. In these experiments, curiosity-driven learning led the robot learner to successively discover object affordances and vocal interaction with its peers. We explain how a learning curriculum adapted to the current constraints of the learning system automatically formed, constraining learning and shaping the developmental trajectory. The observed trajectories in the robot experiment share many properties with those in infant development, including a mixture of regularities and diversities in the developmental patterns. Finally, we argue that such emergent developmental structures can guide and constrain evolution, in particular with regard to the origins of language. Copyright © 2016 Cognitive Science Society, Inc.

  17. HIV-specific probabilistic models of protein evolution.

    Directory of Open Access Journals (Sweden)

    David C Nickle

    2007-06-01

    Comparative sequence analyses, including such fundamental bioinformatics techniques as similarity searching, sequence alignment, and phylogenetic inference, have become a mainstay for researchers studying type 1 Human Immunodeficiency Virus (HIV-1) genome structure and evolution. Implicit in comparative analyses is an underlying model of evolution, and the chosen model can significantly affect the results. In general, evolutionary models describe the probabilities of replacing one amino acid character with another over a period of time. Most widely used evolutionary models for protein sequences have been derived from curated alignments of hundreds of proteins, usually based on mammalian genomes. It is unclear to what extent these empirical models are generalizable to a very different organism, such as HIV-1, the most extensively sequenced organism in existence. We developed a maximum likelihood model fitting procedure for a collection of HIV-1 alignments sampled from different viral genes, and inferred two empirical substitution models, suitable for describing between- and within-host evolution. Our procedure pools the information from multiple sequence alignments, and the provided software implementation can be run efficiently in parallel on a computer cluster. We describe how the inferred substitution models can be used to generate scoring matrices suitable for alignment and similarity searches. Our models had a consistently superior fit relative to the best existing models and to parameter-rich data-driven models when benchmarked on independent HIV-1 alignments, demonstrating evolutionary biases in amino-acid substitution that are unique to HIV and that are not captured by the existing models. The scoring matrices derived from the models showed a marked difference from common amino-acid scoring matrices. The use of an appropriate evolutionary model recovered a known viral transmission history, whereas a poorly chosen model introduced phylogenetic
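    The step of deriving a scoring matrix from an inferred substitution model can be sketched as a log-odds computation: a substitution scores positively when it occurs more often than expected by chance under the model. The three-letter alphabet, probabilities, and scale below are invented for illustration; the paper's actual models cover all twenty amino acids.

    ```python
    import math

    # Hypothetical background frequencies and substitution probabilities.
    alphabet = ["A", "R", "N"]
    background = {"A": 0.5, "R": 0.3, "N": 0.2}
    # substitution[a][b]: probability of observing b aligned to a
    substitution = {
        "A": {"A": 0.8, "R": 0.1, "N": 0.1},
        "R": {"A": 0.2, "R": 0.7, "N": 0.1},
        "N": {"A": 0.2, "R": 0.1, "N": 0.7},
    }

    def score(a, b, scale=2.0):
        """Log-odds score, rounded to an integer as in common matrices."""
        return round(scale * math.log2(substitution[a][b] / background[b]))

    matrix = {(a, b): score(a, b) for a in alphabet for b in alphabet}
    ```

    With these numbers, identities score positively and mismatches negatively, the qualitative shape shared by standard amino-acid scoring matrices.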

  18. Ontology Driven Meta-Modeling of Service Oriented Architecture

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 — (Abstract unavailable in this record; the extracted fragments show only author affiliations and reference entries, including “Model-driven service engineering with SoaML”.)

  19. Construction of UML class diagram with Model-Driven Development

    Directory of Open Access Journals (Sweden)

    Tomasz Górski

    2016-03-01

    Model transformations play a key role in software development projects based on Model-Driven Development (MDD) principles. Transformations allow for the automation of repetitive and well-defined steps, thus shortening design time and reducing the number of errors. In the object-oriented approach, the key elements are use cases. They are described, modelled, and later designed until executable application code is obtained. The aim of the paper is to present a model-to-model transformation, Communication-2-Class, which automates the construction of a Unified Modelling Language (UML) class diagram in the context of the analysis/design model. A UML class diagram is created based on a UML communication diagram within a use case realization. As a result, the class diagram shows all of the classes involved in the use case realization and the relationships among them. The plug-in that implements the Communication-2-Class transformation was developed in IBM Rational Software Architect. The article presents the test results of the developed plug-in, showing its potential to shorten the design time of use case realizations. Keywords: Model-Driven Development, transformations, Unified Modelling Language, analysis/design model, UML class diagram, UML communication diagram
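    The Communication-2-Class idea can be sketched outside any modeling tool: lifelines in the communication diagram become classes, sender/receiver pairs become associations, and message names become candidate operations on the receiver. The example diagram below is hypothetical, not taken from the article.

    ```python
    # A hypothetical communication diagram as (sender, receiver, message).
    messages = [
        ("OrderController", "OrderService", "placeOrder"),
        ("OrderService", "OrderRepository", "save"),
        ("OrderService", "PaymentGateway", "charge"),
    ]

    # Classes: every participant in any message.
    classes = sorted({c for s, r, _ in messages for c in (s, r)})

    # Associations: one per distinct sender/receiver pair.
    associations = sorted({(s, r) for s, r, _ in messages})

    # Operations: each message becomes an operation on its receiver.
    operations = {}
    for _, receiver, op in messages:
        operations.setdefault(receiver, []).append(op)
    ```

    A real model-to-model transformation would emit UML model elements rather than Python collections, but the derivation rules are the same.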

  20. The MDE Diploma: First International Postgraduate Specialization in Model-Driven Engineering

    Science.gov (United States)

    Cabot, Jordi; Tisi, Massimo

    2011-01-01

    Model-Driven Engineering (MDE) is changing the way we build, operate, and maintain our software-intensive systems. Several projects using MDE practices are reporting significant improvements in quality and performance but, to be able to handle these projects, software engineers need a set of technical and interpersonal skills that are currently…

  1. Composable Framework Support for Software-FMEA Through Model Execution

    Science.gov (United States)

    Kocsis, Imre; Patricia, Andras; Brancati, Francesco; Rossi, Francesco

    2016-08-01

    Performing Failure Modes and Effect Analysis (FMEA) during software architecture design is becoming a basic requirement in an increasing number of domains; however, due to the lack of standardized early design phase model execution, classic SW-FMEA approaches carry significant risks and are human effort-intensive even in processes that use Model-Driven Engineering. Recently, modelling languages with standardized executable semantics have emerged. Building on earlier results, this paper describes framework support for generating executable error propagation models from such models during software architecture design. The approach carries the promise of increased precision, decreased risk and more automated execution for SW-FMEA during dependability-critical system development.
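    An executable error-propagation model of the kind the framework generates can be sketched as reachability over a component dependency graph: a failure mode at one component potentially affects every component downstream of it. The components and flows below are invented for illustration.

    ```python
    from collections import deque

    # Edges point from a component to the components consuming its output.
    flows = {
        "sensor": ["filter"],
        "filter": ["controller"],
        "controller": ["actuator", "logger"],
        "actuator": [],
        "logger": [],
    }

    def affected_by(failure_origin):
        """All components an error can propagate to from the origin
        (breadth-first reachability, including the origin itself)."""
        seen, queue = {failure_origin}, deque([failure_origin])
        while queue:
            for downstream in flows[queue.popleft()]:
                if downstream not in seen:
                    seen.add(downstream)
                    queue.append(downstream)
        return seen
    ```

    Executing the model for each failure mode yields the "effects" column of an FMEA worksheet automatically, which is the automation the paper argues for.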

  2. Evaluating predictive models of software quality

    International Nuclear Information System (INIS)

    Ciaschini, V; Canaparo, M; Ronchieri, E; Salomoni, D

    2014-01-01

    Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to only deliver software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent to discover the risk factor of each product and compare it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and we concluded by suggesting directions for further studies.
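    The release gate described above, shipping only software whose predicted risk is below an agreed threshold, can be sketched with a toy weighted-metric model. The metrics, weights, and threshold are invented for illustration and are not those of the evaluated predictive models.

    ```python
    # Hypothetical risk model: weighted sum of normalized (0..1) metrics.
    WEIGHTS = {"churn": 0.5, "complexity": 0.3, "defect_history": 0.2}
    THRESHOLD = 0.6  # agreed maximum acceptable risk (assumed)

    def risk(metrics):
        return sum(WEIGHTS[k] * metrics[k] for k in WEIGHTS)

    def release_decision(metrics):
        """Gate a package on its predicted risk."""
        return "release" if risk(metrics) < THRESHOLD else "hold"

    stable = {"churn": 0.1, "complexity": 0.4, "defect_history": 0.2}
    volatile = {"churn": 0.9, "complexity": 0.8, "defect_history": 0.7}
    ```

    Under these assumed weights, the stable package scores 0.21 and is released, while the volatile one scores 0.83 and is held.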

  3. Evaluating Predictive Models of Software Quality

    Science.gov (United States)

    Ciaschini, V.; Canaparo, M.; Ronchieri, E.; Salomoni, D.

    2014-06-01

    Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to only deliver software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent to discover the risk factor of each product and compare it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and we concluded by suggesting directions for further studies.

  4. The evolution of CMS software performance studies

    CERN Document Server

    Kortelainen, Matti J

    2010-01-01

    CMS has had an ongoing and dedicated effort to optimize software performance for several years. Initially this effort focused primarily on the cleanup of many issues coming from basic C++ errors, namely reducing dynamic memory churn, unnecessary copies/temporaries and tools to routinely monitor these things. Over the past 1.5 years, however, the transition to 64bit, newer versions of the gcc compiler, newer tools and the enabling of techniques like vectorization have made possible more sophisticated improvements to the software performance. This presentation will cover this evolution and describe the current avenues being pursued for software performance, as well as the corresponding gains.

  5. The evolution of CMS software performance studies

    Science.gov (United States)

    Kortelainen, M. J.; Elmer, P.; Eulisse, G.; Innocente, V.; Jones, C. D.; Tuura, L.

    2011-12-01

    CMS has had an ongoing and dedicated effort to optimize software performance for several years. Initially this effort focused primarily on the cleanup of many issues coming from basic C++ errors, namely reducing dynamic memory churn, unnecessary copies/temporaries and tools to routinely monitor these things. Over the past 1.5 years, however, the transition to 64bit, newer versions of the gcc compiler, newer tools and the enabling of techniques like vectorization have made possible more sophisticated improvements to the software performance. This presentation will cover this evolution and describe the current avenues being pursued for software performance, as well as the corresponding gains.

  6. Quantitative interface models for simulating microstructure evolution

    International Nuclear Information System (INIS)

    Zhu, J.Z.; Wang, T.; Zhou, S.H.; Liu, Z.K.; Chen, L.Q.

    2004-01-01

    To quantitatively simulate microstructural evolution in real systems, we investigated three different interface models: a sharp-interface model implemented by the software DICTRA, and two diffuse-interface models which use either physical order parameters or artificial order parameters. A particular example is considered: the diffusion-controlled growth of a γ′ precipitate in a supersaturated γ matrix in Ni-Al binary alloys. All three models use the thermodynamic and kinetic parameters from the same databases. The temporal evolution profiles of composition from the different models are shown to agree with each other. The focus is on examining the advantages and disadvantages of each model as applied to microstructure evolution in alloys.
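    The bulk transport underlying all three interface models is Fickian diffusion; the models differ in how they treat the moving interface, not in the bulk equation. A minimal explicit finite-difference sketch (with illustrative parameters, not the Ni-Al data) shows the kind of composition-profile evolution being compared.

    ```python
    def diffuse(c, D, dx, dt, steps):
        """Explicit FTCS update of a composition profile c with
        no-flux (reflecting) boundaries; mass is conserved."""
        r = D * dt / dx**2
        assert r <= 0.5, "explicit scheme stability limit violated"
        for _ in range(steps):
            c = [c[0] + r * (c[1] - c[0])] + [
                c[i] + r * (c[i - 1] - 2 * c[i] + c[i + 1])
                for i in range(1, len(c) - 1)
            ] + [c[-1] + r * (c[-2] - c[-1])]
        return c

    profile = [1.0] * 5 + [0.0] * 5   # supersaturated step profile
    out = diffuse(profile, D=1e-9, dx=1e-5, dt=0.04, steps=200)
    ```

    The step profile smooths out over time while total solute is conserved; a sharp-interface model would track the interface position explicitly on top of this, whereas a diffuse-interface model would add an order-parameter equation.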

  7. Towards an Experimental Framework for Measuring Usability of Model-Driven Tools

    NARCIS (Netherlands)

    Panach, Jose Ignacio; Condori-Fernandez, Nelly; Baar, Arthur; Vos, Tanja; Romeu, Ignacio; Pastor, Oscar; Campos, Pedro; Graham, Nicholas; Jorge, Joaquim; Nunes, Nuno; Palanque, Philippe; Winckler, Marco

    2011-01-01

    According to the Model-Driven Development (MDD) paradigm, analysts can substantially improve the software development process concentrating their efforts on a conceptual model, which can be transformed into code by means of transformation rules applied by a model compiler. However, MDD tools are not

  8. Supporting the evolution of research in software ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    2016-01-01

    with significant lack of software ecosystem specific theories that are solid, mature, generic, and detailed enough to be measurable and transferable. In this study, we intend to come closer to an evolution of the field by supporting the “localization” of research, i.e. the focus on specific types of software...... software ecosystem studies lack deeper investigation of technical and collaborative aspects. Moreover, we identify an increased focus on organizational aspects and a rather limited focus on business. Furthermore, we identify common technology as the component investigated most in the ecosystems, both from...

  9. Executable Use Cases: a Supplement to Model-Driven Development?

    DEFF Research Database (Denmark)

    Jørgensen, Jens Bæk

    2007-01-01

    Executable Use Cases (EUCs) is a model-based approach to requirements engineering. In the introduction to this paper, we briefly discuss how EUCs may be used as a supplement to Model-Driven Development (MDD). Then we present the EUC approach in more detail. An EUC can describe and link user-level requirements and more technical software specifications. In MDD, user-level requirements are not always explicitly described; it is sufficient for MDD that a specification, or platform-independent model, of the software that we are going to develop is provided. Therefore, a combination of EUCs and MDD may have potential to cover the full software engineering path from user-level requirements via specifications to implementations of running computer systems.

  10. Formal Model-Driven Engineering: Generating Data and Behavioural Components

    Directory of Open Access Journals (Sweden)

    Chen-Wei Wang

    2012-12-01

    Model-driven engineering is the automatic production of software artefacts from abstract models of structure and functionality. By targeting a specific class of system, it is possible to automate aspects of the development process, using model transformations and code generators that encode domain knowledge and implementation strategies. Using this approach, questions of correctness for a complex software system may be answered through analysis of abstract models of lower complexity, under the assumption that the transformations and generators employed are themselves correct. This paper shows how formal techniques can be used to establish the correctness of model transformations used in the generation of software components from precise object models. The source language is based upon existing, formal techniques; the target language is the widely-used SQL notation for database programming. Correctness is established by giving comparable, relational semantics to both languages, and checking that the transformations are semantics-preserving.
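    The object-model-to-SQL direction of the paper can be sketched with a toy generator mapping classes to tables and attributes to columns. The mapping rules (class → table, attribute → column, a surrogate `id` key per table) are an assumed simplification, not the article's formally verified transformation.

    ```python
    # Hypothetical type mapping from model types to SQL column types.
    TYPE_MAP = {"String": "VARCHAR(255)", "Integer": "INTEGER", "Boolean": "BOOLEAN"}

    def generate_ddl(model):
        """Emit one CREATE TABLE statement per class in the object model.
        model: {class_name: [(attribute_name, model_type), ...]}"""
        statements = []
        for cls, attrs in model.items():
            cols = ["id INTEGER PRIMARY KEY"] + [
                f"{name} {TYPE_MAP[typ]}" for name, typ in attrs
            ]
            statements.append(f"CREATE TABLE {cls} ({', '.join(cols)});")
        return statements

    model = {"Customer": [("name", "String"), ("active", "Boolean")]}
    ddl = generate_ddl(model)
    print(ddl[0])
    ```

    Proving such a generator semantics-preserving, as the paper does, requires giving both the object model and the SQL a common relational semantics; this sketch shows only the syntactic half of the transformation.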

  11. Strong Stellar-driven Outflows Shape the Evolution of Galaxies at Cosmic Dawn

    Energy Technology Data Exchange (ETDEWEB)

    Fontanot, Fabio; De Lucia, Gabriella [INAF—Astronomical Observatory of Trieste, via G.B. Tiepolo 11, I-34143 Trieste (Italy); Hirschmann, Michaela [Sorbonne Universités, UPMC-CNRS, UMR7095, Institut d’Astrophysique de Paris, F-75014 Paris (France)

    2017-06-20

    We study galaxy mass assembly and cosmic star formation rate (SFR) at high redshift (z ≳ 4), by comparing data from multiwavelength surveys with predictions from the GAlaxy Evolution and Assembly (gaea) model. gaea implements a stellar feedback scheme partially based on cosmological hydrodynamical simulations, which features strong stellar-driven outflows and mass-dependent timescales for the re-accretion of ejected gas. In previous work, we have shown that this scheme is able to correctly reproduce the evolution of the galaxy stellar mass function (GSMF) up to z ∼ 3. We contrast model predictions with both rest-frame ultraviolet (UV) and optical luminosity functions (LFs), which are mostly sensitive to the SFR and stellar mass, respectively. We show that gaea is able to reproduce the shape and redshift evolution of both sets of LFs. We study the impact of dust on the predicted LFs, and we find that the required level of dust attenuation is in qualitative agreement with recent estimates based on the UV continuum slope. The consistency between data and model predictions holds for the redshift evolution of the physical quantities well beyond the redshift range considered for the calibration of the original model. In particular, we show that gaea is able to recover the evolution of the GSMF up to z ∼ 7 and the cosmic SFR density up to z ∼ 10.

  12. Strong Stellar-driven Outflows Shape the Evolution of Galaxies at Cosmic Dawn

    International Nuclear Information System (INIS)

    Fontanot, Fabio; De Lucia, Gabriella; Hirschmann, Michaela

    2017-01-01

    We study galaxy mass assembly and cosmic star formation rate (SFR) at high redshift (z ≳ 4), by comparing data from multiwavelength surveys with predictions from the GAlaxy Evolution and Assembly (gaea) model. gaea implements a stellar feedback scheme partially based on cosmological hydrodynamical simulations, which features strong stellar-driven outflows and mass-dependent timescales for the re-accretion of ejected gas. In previous work, we have shown that this scheme is able to correctly reproduce the evolution of the galaxy stellar mass function (GSMF) up to z ∼ 3. We contrast model predictions with both rest-frame ultraviolet (UV) and optical luminosity functions (LFs), which are mostly sensitive to the SFR and stellar mass, respectively. We show that gaea is able to reproduce the shape and redshift evolution of both sets of LFs. We study the impact of dust on the predicted LFs, and we find that the required level of dust attenuation is in qualitative agreement with recent estimates based on the UV continuum slope. The consistency between data and model predictions holds for the redshift evolution of the physical quantities well beyond the redshift range considered for the calibration of the original model. In particular, we show that gaea is able to recover the evolution of the GSMF up to z ∼ 7 and the cosmic SFR density up to z ∼ 10.

  13. Model Driven Development of Web Application with SPACE Method and Tool-suit

    OpenAIRE

    Rehana, Jinat

    2010-01-01

    Enterprise-level software development using traditional software engineering approaches with third-generation programming languages is becoming a more challenging and cumbersome task with the increased complexity of products, shortened development cycles and heightened expectations of quality. MDD (Model Driven Development) has been regarded as an exciting and promising development approach in the software industry for several years. The idea behind MDD is the separation of the business logic of a system ...

  14. Chaste: A test-driven approach to software development for biological modelling

    KAUST Repository

    Pitt-Francis, Joe; Pathmanathan, Pras; Bernabeu, Miguel O.; Bordas, Rafel; Cooper, Jonathan; Fletcher, Alexander G.; Mirams, Gary R.; Murray, Philip; Osborne, James M.; Walter, Alex; Chapman, S. Jon; Garny, Alan; van Leeuwen, Ingeborg M.M.; Maini, Philip K.; Rodrí guez, Blanca; Waters, Sarah L.; Whiteley, Jonathan P.; Byrne, Helen M.; Gavaghan, David J.

    2009-01-01

    Chaste ('Cancer, heart and soft-tissue environment') is a software library and a set of test suites for computational simulations in the domain of biology. Current functionality has arisen from modelling in the fields of cancer, cardiac physiology

  15. Modeling density-driven flow in porous media principles, numerics, software

    CERN Document Server

    Holzbecher, Ekkehard O

    1998-01-01

    Modeling of flow and transport in groundwater has become an important focus of scientific research in recent years. Most contributions to this subject deal with flow situations where density and viscosity changes in the fluid are neglected. This restriction may not always be justified. The models presented in the book demonstrate impressively that the flow pattern may be completely different when density changes are taken into account. The main applications of the models are thermal and saline convection, geothermal flow, saltwater intrusion, flow through salt formations, etc. This book not only presents the basic theory; readers can also test their knowledge by applying the included software and can set up their own models.

  16. Model-driven design, simulation and implementation of service compositions in COSMO

    NARCIS (Netherlands)

    Quartel, Dick; Dirgahayu, T.; van Sinderen, Marten J.

    2009-01-01

    The success of software development projects to a large extent depends on the quality of the models that are produced in the development process, which in turn depends on the conceptual and practical support that is available for modelling, design and analysis. This paper focuses on model-driven

  17. 25 Years of Model-Driven Web Engineering: What we achieved, What is missing

    Directory of Open Access Journals (Sweden)

    Gustavo Rossi

    2016-12-01

    Full Text Available Model-Driven Web Engineering (MDWE) approaches aim to improve the Web application development process by focusing on modeling instead of coding, deriving the running application through transformations from conceptual models to code. The emergence of the Interaction Flow Modeling Language (IFML) has been an important milestone in the evolution of Web modeling languages, indicating not only the maturity of the field but also a final convergence of languages. In this paper we explain the evolution of modeling and design approaches since the early years (in the 1990s), detailing the forces which drove that evolution and discussing the strengths and weaknesses of some of those approaches. A brief presentation of IFML is accompanied by a thorough analysis of the most important achievements of the MDWE community as well as the problems and obstacles that hinder the dissemination of model-driven techniques in the Web engineering field.

  18. A Model-driven and Service-oriented framework for the business process improvement

    Directory of Open Access Journals (Sweden)

    Andrea Delgado

    2010-07-01

    The importance and benefits of Business Process Management (BPM), which allows organizations to focus on their business processes, are nowadays broadly recognized, as business and technology areas embrace and adopt the paradigm. The Service Oriented Computing (SOC) paradigm bases software development on services in order to realize business processes. Implementing business processes as services helps reduce the gap between these two areas, easing communication and the understanding of business needs. The Model Driven Development (MDD) paradigm bases software development on models, metamodels and languages that allow transformations between them. The automatic generation of service models from business process models is a key issue in supporting the separation of a process definition from its technical implementation. In this article, we present the MINERVA framework, which applies the MDD and SOC paradigms to business processes for continuous business process improvement in organizations, supporting the stages defined in the business process lifecycle from modeling to evaluation of its execution.
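
    The separation of a process definition from its technical implementation that MINERVA targets can be illustrated with a toy model-to-code transformation. The process model, its fields, and the generated stub below are invented for illustration; they are not MINERVA's actual metamodel or output.

```python
# Minimal sketch of an MDD-style transformation: derive a service-class
# stub from a business-process model. All names here are hypothetical.
PROCESS_MODEL = {
    "name": "PurchaseOrder",
    "tasks": [
        {"name": "validate_order", "inputs": ["order_id"]},
        {"name": "reserve_stock", "inputs": ["order_id", "quantity"]},
    ],
}

def to_service_stub(model):
    """Transform a process model into Python service-class source code."""
    lines = [f"class {model['name']}Service:"]
    for task in model["tasks"]:
        params = ", ".join(["self"] + task["inputs"])
        lines.append(f"    def {task['name']}({params}):")
        lines.append(f"        raise NotImplementedError('{task['name']}')")
    return "\n".join(lines)

stub_source = to_service_stub(PROCESS_MODEL)
```

    A real MDD toolchain would target a service metamodel (e.g. SoaML) rather than source text, but the principle is the same: the business model is the single source of truth, and the technical artifact is derived from it.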

  19. Towards a comprehensive framework for reuse: A reuse-enabling software evolution environment

    Science.gov (United States)

    Basili, V. R.; Rombach, H. D.

    1988-01-01

    Reuse of products, processes and knowledge will be the key to enable the software industry to achieve the dramatic improvement in productivity and quality required to satisfy the anticipated growing demand. Although experience shows that certain kinds of reuse can be successful, general success has been elusive. A software life-cycle technology which allows broad and extensive reuse could provide the means to achieving the desired order-of-magnitude improvements. The scope of a comprehensive framework for understanding, planning, evaluating and motivating reuse practices and the necessary research activities is outlined. As a first step towards such a framework, a reuse-enabling software evolution environment model is introduced which provides a basis for the effective recording of experience, the generalization and tailoring of experience, the formalization of experience, and the (re-)use of experience.

  20. The iCub Software Architecture: evolution and lessons learned

    Directory of Open Access Journals (Sweden)

    Lorenzo eNatale

    2016-04-01

    The complexity of humanoid robots is increasing with the availability of new sensors, embedded CPUs and actuators. This wealth of technologies allows researchers to investigate new problems like whole-body force control, multi-modal human-robot interaction and sensory fusion. Under the hood of these robots, the software architecture has an important role: it allows researchers to access the robot's functionalities while focusing primarily on their research problems, and it supports code reuse to minimize development and debugging, especially when new hardware becomes available. More importantly, it allows increasing the complexity of the experiments that can be implemented before system integration becomes unmanageable and debugging draws more resources than the research itself. In this paper we illustrate the software architecture of the iCub humanoid robot and the software engineering best practices that have emerged, driven by the needs of our research community. We describe the latest developments at the level of the middleware supporting interface definition and automatic code generation, logging, ROS compatibility and channel prioritization. We show the robot abstraction layer and how it has been modified to better address the requirements of the users and to support new hardware as it became available. We also describe the testing framework we have recently adopted for developing code using a test-driven methodology. We conclude the paper by discussing the lessons we have learned during the past eleven years of software development on the iCub humanoid robot.

  1. The Evolution of Software Pricing: From Box Licenses to Application Service Provider Models.

    Science.gov (United States)

    Bontis, Nick; Chung, Honsan

    2000-01-01

    Describes three different pricing models for software. Findings of this case study support the proposition that software pricing is a complex and subjective process. The key determinant of alignment between vendor and user is the nature of value in the software to the buyer. This value proposition may range from increased cost reduction to…

  2. Built To Last: Using Iterative Development Models for Sustainable Scientific Software Development

    Science.gov (United States)

    Jasiak, M. E.; Truslove, I.; Savoie, M.

    2013-12-01

    In scientific research, software exists fundamentally for the results it creates; the core research must remain the focus. It seems natural to researchers, driven by grant deadlines, that every dollar invested in software development should be used to push the boundaries of problem solving. This system of values is frequently misaligned with creating software in a sustainable fashion; short-term optimizations create longer-term sustainability issues. The National Snow and Ice Data Center (NSIDC) has taken bold cultural steps in using agile and lean development and management methodologies to help its researchers meet critical deadlines, while building in the support structure necessary for the code to live far beyond its original milestones. Agile and lean software development methodologies, including Scrum, Kanban, Continuous Delivery and Test-Driven Development, have seen widespread adoption within NSIDC. This focus on development methods is combined with an emphasis on explaining to researchers why these methods produce more desirable results for everyone, as well as on promoting interaction between developers and researchers. This presentation will describe NSIDC's current scientific software development model, how this model addresses the short-term versus sustainability dichotomy, the lessons learned and successes realized by transitioning to this agile- and lean-influenced model, and the current challenges faced by the organization.

  3. A Software Development Platform for Mechatronic Systems

    DEFF Research Database (Denmark)

    Guan, Wei

    Software has become increasingly determinative for the development of mechatronic systems, which underscores the importance of demands for shortened time-to-market, increased productivity, higher quality, and improved dependability. As the complexity of systems increases dramatically, these demands present a challenge to practitioners who adopt conventional software development approaches. An effective approach towards industrial production of software for mechatronic systems is needed. This approach requires a disciplined engineering process that encompasses model-driven engineering and component-based software engineering, whereby incremental software development using component models addresses the essential design issues of real-time embedded systems. To this end, this dissertation presents a software development platform that provides an incremental model-driven development process based ...

  4. BioContainers: an open-source and community-driven framework for software standardization

    Science.gov (United States)

    da Veiga Leprevost, Felipe; Grüning, Björn A.; Alves Aflitos, Saulo; Röst, Hannes L.; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C.; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I.; Perez-Riverol, Yasset

    2017-01-01

    Abstract. Motivation: BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source projects Docker and rkt, frameworks that allow software to be installed and executed in an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers, with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). Availability and Implementation: The software is freely available at github.com/BioContainers/. Contact: yperez@ebi.ac.uk PMID:28379341

  5. BioContainers: an open-source and community-driven framework for software standardization.

    Science.gov (United States)

    da Veiga Leprevost, Felipe; Grüning, Björn A; Alves Aflitos, Saulo; Röst, Hannes L; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I; Perez-Riverol, Yasset

    2017-08-15

    BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source projects Docker and rkt, frameworks that allow software to be installed and executed in an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers, with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). The software is freely available at github.com/BioContainers/. yperez@ebi.ac.uk. © The Author(s) 2017. Published by Oxford University Press.

  6. SaLEM (v1.0) - the Soil and Landscape Evolution Model (SaLEM) for simulation of regolith depth in periglacial environments

    Science.gov (United States)

    Bock, Michael; Conrad, Olaf; Günther, Andreas; Gehrt, Ernst; Baritz, Rainer; Böhner, Jürgen

    2018-04-01

    We propose the implementation of the Soil and Landscape Evolution Model (SaLEM) for the spatiotemporal investigation of soil parent material evolution following a lithologically differentiated approach. Relevant parts of the established Geomorphic/Orogenic Landscape Evolution Model (GOLEM) have been adapted for an operational Geographical Information System (GIS) tool within the open-source software framework System for Automated Geoscientific Analyses (SAGA), thus taking advantage of SAGA's capabilities for geomorphometric analyses. The model is driven by palaeoclimatic data (temperature, precipitation) representative of periglacial areas in northern Germany over the last 50 000 years. The initial conditions have been determined for a test site by a digital terrain model and a geological model. Weathering, erosion and transport functions are calibrated using extrinsic (climatic) and intrinsic (lithologic) parameter data. First results indicate that our differentiated SaLEM approach shows some evidence for the spatiotemporal prediction of important soil parental material properties (particularly its depth). Future research will focus on the validation of the results against field data, and the influence of discrete events (mass movements, floods) on soil parent material formation has to be evaluated.

  7. PROJECT-DRIVEN SOFTWARE BUSINESS IN TRANSILVANIA - A CASE STUDY

    Directory of Open Access Journals (Sweden)

    Radu Marius

    2015-07-01

    The fairly low salaries of IT workers compared to Western countries, together with skills and location, have helped outsourcing become one of the most competitive Romanian sectors. The IT sector in Romania maintains steady growth favoured by outsourcing companies. Moreover, Romania is highly competitive when the level of technical proficiency and soft skills in the country is taken into account. The Romanian labour force can drive relevant projects even in small teams. This case study explores the reality of Romanian IT company profiles. It compares two companies along organizational and strategic dimensions: project-approach orientation, leadership, project value drivers, and social responsibility. The corporate goal of the first company presented in the case study, Fortech, is to achieve the most adaptive organizational structure that can sustain its competitive advantage. This advantage results from the combination of three main ingredients: scaled-up human resource capital, versatile knowledge management and adaptability to customer needs. Fortech manages, administers and executes its business activities using project management methodologies and practices in order to achieve its strategic goals. Dolphin Kiss, on the other hand, is a "Python boutique agency" created around a single contract and organized around a single project. The project was contracted with a top company from the telecommunications industry. The company is a small team of creative software engineers focused on developing a very innovative software business solution. This case study is an empirical qualitative study intended to depict the main differences between two relevant company profiles present in the current economic context: a small, results-oriented, highly skilled team versus a large, matrix-organized, customer-oriented structure of outsourcing teams. The case study opens a space for debate regarding the potential evolution of the

  8. Testing the Accuracy of Data-driven MHD Simulations of Active Region Evolution

    Energy Technology Data Exchange (ETDEWEB)

    Leake, James E.; Linton, Mark G. [U.S. Naval Research Laboratory, 4555 Overlook Avenue, SW, Washington, DC 20375 (United States); Schuck, Peter W., E-mail: james.e.leake@nasa.gov [NASA Goddard Space Flight Center, 8800 Greenbelt Road, Greenbelt, MD 20771 (United States)

    2017-04-01

    Models for the evolution of the solar coronal magnetic field are vital for understanding solar activity, yet the best measurements of the magnetic field lie at the photosphere, necessitating the development of coronal models which are “data-driven” at the photosphere. We present an investigation to determine the feasibility and accuracy of such methods. Our validation framework uses a simulation of active region (AR) formation, modeling the emergence of magnetic flux from the convection zone to the corona, as a ground-truth data set, to supply both the photospheric information and to perform the validation of the data-driven method. We focus our investigation on how the accuracy of the data-driven model depends on the temporal frequency of the driving data. The Helioseismic and Magnetic Imager on NASA’s Solar Dynamics Observatory produces full-disk vector magnetic field measurements at a 12-minute cadence. Using our framework we show that ARs that emerge over 25 hr can be modeled by the data-driving method with only ∼1% error in the free magnetic energy, assuming the photospheric information is specified every 12 minutes. However, for rapidly evolving features, under-sampling of the dynamics at this cadence leads to a strobe effect, generating large electric currents and incorrect coronal morphology and energies. We derive a sampling condition for the driving cadence based on the evolution of these small-scale features, and show that higher-cadence driving can lead to acceptable errors. Future work will investigate the source of errors associated with deriving plasma variables from the photospheric magnetograms as well as other sources of errors, such as reduced resolution, instrument bias, and noise.
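
    The under-sampling ("strobe") effect described in the abstract is ordinary temporal aliasing, which a few lines of Python can illustrate. The 10-minute feature period below is a hypothetical example chosen so that the aliasing at a 720 s cadence works out exactly; it is not a value from the paper.

```python
import math

def sample(freq_hz, cadence_s, n):
    """Sample sin(2*pi*f*t) every cadence_s seconds, n times."""
    return [math.sin(2 * math.pi * freq_hz * k * cadence_s) for k in range(n)]

# A feature with a 10-minute period (1/600 Hz) sampled at a 12-minute
# cadence (720 s) is under-sampled: its samples coincide exactly with
# those of a much slower, 1-hour-period signal, so a driver fed these
# samples "sees" the wrong dynamics.
fast = sample(1 / 600, 720, 6)
aliased = sample(1 / 3600, 720, 6)
```

    Driving a coronal model with the under-sampled sequence would inject the slow alias's apparent evolution instead of the true fast evolution, consistent with the strobe-induced spurious currents the authors report.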

  9. Test Driven Development of Scientific Models

    Science.gov (United States)

    Clune, Thomas L.

    2012-01-01

    Test-Driven Development (TDD) is a software development process that promises many advantages for developer productivity and has become widely accepted among professional software engineers. As the name suggests, TDD practitioners alternate between writing short automated tests and producing code that passes those tests. Although this overly simplified description will undoubtedly sound prohibitively burdensome to many uninitiated developers, the advent of powerful unit-testing frameworks greatly reduces the effort required to produce and routinely execute suites of tests. By testimony, many developers find TDD to be addicting after only a few days of exposure, and find it unthinkable to return to previous practices. Of course, scientific/technical software differs from other software categories in a number of important respects, but I nonetheless believe that TDD is quite applicable to the development of such software and has the potential to significantly improve programmer productivity and code quality within the scientific community. After a detailed introduction to TDD, I will present the experience within the Software Systems Support Office (SSSO) in applying the technique to various scientific applications. This discussion will emphasize the various direct and indirect benefits as well as some of the difficulties and limitations of the methodology. I will conclude with a brief description of pFUnit, a unit testing framework I co-developed to support test-driven development of parallel Fortran applications.
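
    The test-first rhythm described above can be sketched with Python's built-in unittest framework; the `moving_average` helper is a made-up example, not code from the SSSO work or pFUnit.

```python
import unittest

# In TDD the test class below is written first (red); the function is
# then implemented just far enough to make the tests pass (green).
def moving_average(values, window):
    """Simple trailing moving average, the minimal code that passes."""
    if window <= 0 or window > len(values):
        raise ValueError("window must be in 1..len(values)")
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

class TestMovingAverage(unittest.TestCase):
    def test_window_of_two(self):
        self.assertEqual(moving_average([1, 2, 3], 2), [1.5, 2.5])

    def test_invalid_window_rejected(self):
        with self.assertRaises(ValueError):
            moving_average([1, 2], 0)

# Run the suite programmatically, as an automated build would.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestMovingAverage)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

    The same cycle applies to numerical code: pFUnit plays the role of unittest for parallel Fortran, with assertions on arrays and tolerances instead of exact values.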

  10. Elsevier special issue on foundations and applications of model driven architecture

    NARCIS (Netherlands)

    Aksit, Mehmet; Ivanov, Ivan

    2008-01-01

    Model Driven Architecture (MDA) is an approach for software development proposed by Object Management Group (OMG). The basic principle of MDA is the separation of the specification of system functionality from the specification of the implementation of that functionality on a specific platform. The

  11. A software for parameter optimization with Differential Evolution Entirely Parallel method

    Directory of Open Access Journals (Sweden)

    Konstantin Kozlov

    2016-08-01

    Summary. The Differential Evolution Entirely Parallel (DEEP) package is software for finding unknown real and integer parameters in dynamical models of biological processes by minimizing one or even several objective functions that measure the deviation of the model solution from data. Numerical solutions provided by the most efficient global optimization methods are often problem-specific and cannot easily be adapted to other tasks. In contrast, DEEP allows a user to describe both the mathematical model and the objective function in any programming language, such as R, Octave or Python. Being implemented in C, DEEP demonstrates performance as good as the top three methods from the CEC-2014 (Competition on Evolutionary Computation) benchmark and has been successfully applied to several biological problems. Availability. The DEEP method is open-source and free software distributed under the terms of the GPL licence, version 3. The sources are available at http://deepmethod.sourceforge.net/ and binary packages for Fedora GNU/Linux are provided for the RPM package manager at https://build.opensuse.org/project/repositories/home:mackoel:compbio.
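
    DEEP itself is a parallel C package, but the underlying rand/1/bin differential evolution loop can be sketched in a few dozen lines of Python; the toy model-fitting objective below is purely illustrative and is not taken from DEEP.

```python
import random

def differential_evolution(objective, bounds, pop_size=20, f=0.8, cr=0.9,
                           generations=200, seed=1):
    """Minimal rand/1/bin differential evolution (an illustration only)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [objective(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Mutate: combine three distinct population members.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)
            trial = []
            for j in range(dim):
                if rng.random() < cr or j == j_rand:
                    v = pop[a][j] + f * (pop[b][j] - pop[c][j])
                else:
                    v = pop[i][j]
                lo, hi = bounds[j]
                trial.append(min(max(v, lo), hi))   # clip to bounds
            s = objective(trial)
            if s <= scores[i]:                      # greedy selection
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=scores.__getitem__)
    return pop[best], scores[best]

# Toy use: recover slope and intercept of y = 3x + 1 from synthetic data
# by minimizing the sum of squared deviations of model from data.
data = [(x, 3.0 * x + 1.0) for x in range(10)]

def sse(params):
    slope, intercept = params
    return sum((slope * x + intercept - y) ** 2 for x, y in data)

best, err = differential_evolution(sse, [(-10.0, 10.0), (-10.0, 10.0)])
```

    DEEP's contribution is orthogonal to this core loop: it evaluates the (typically expensive) objective entirely in parallel and lets the model and objective live in an external language.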

  12. Model-driven development of smart grid services using SoaML

    DEFF Research Database (Denmark)

    Kosek, Anna Magdalena; Gehrke, Oliver

    2014-01-01

    This paper presents a model-driven software development process which can be applied to the design of smart grid services. The Service Oriented Architecture Modelling Language (SoaML) is used to describe the architecture as well as the roles and interactions between service participants. The individual modelling steps and an example design of a SoaML model for a voltage control service are presented and explained. Finally, the paper discusses a proof-of-concept implementation of the modelled service in a smart grid testing laboratory.

  13. Reflection of a Year Long Model-Driven Business and UI Modeling Development Project

    Science.gov (United States)

    Sukaviriya, Noi; Mani, Senthil; Sinha, Vibha

    Model-driven software development enables users to specify an application at a high level, a level that better matches the problem domain. It also promises users better analysis and automation. Our work brings together two collaborating domains, business processes and human interactions, to build an application. Business modeling expresses business operations and flows, then creates the business flow implementation. Human interaction modeling expresses a UI design and its relationship with business data, logic, and flow, and can generate a working UI. This double modeling approach automates the production of a working system with UI and business logic connected. This paper discusses the human aspects of this modeling approach after a year spent building a procurement outsourcing contract application with it; the result was deployed in December 2008. The paper discusses, across multiple areas, the happy endings and some heartaches. We end with insights on how a model-driven approach could serve the humans in the process better.

  14. Observation-Driven Configuration of Complex Software Systems

    Science.gov (United States)

    Sage, Aled

    2010-06-01

    The ever-increasing complexity of software systems makes them hard to comprehend, predict and tune due to emergent properties and non-deterministic behaviour. Complexity arises from the size of software systems and the wide variety of possible operating environments: the increasing choice of platforms and communication policies leads to ever more complex performance characteristics. In addition, software systems exhibit different behaviour under different workloads. Many software systems are designed to be configurable so that policies can be chosen to meet the needs of various stakeholders. For complex software systems it can be difficult to accurately predict the effects of a change and to know which configuration is most appropriate. This thesis demonstrates that it is useful to run automated experiments that measure a selection of system configurations. Experiments can find configurations that meet the stakeholders' needs, find interesting behavioural characteristics, and help produce predictive models of the system's behaviour. The design and use of ACT (Automated Configuration Tool) for running such experiments is described, in combination with a number of search strategies for deciding on the configurations to measure. Design Of Experiments (DOE) is discussed, with emphasis on Taguchi Methods. These statistical methods have been used extensively in manufacturing, but have not previously been used for configuring software systems. The novel contribution here is an industrial case study, applying the combination of ACT and Taguchi Methods to DC-Directory, a product from Data Connection Ltd (DCL). The case study investigated the applicability of Taguchi Methods for configuring complex software systems. Taguchi Methods were found to be useful for modelling and configuring DC-Directory, making them a valuable addition to the techniques available to system administrators and developers.
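
    A Taguchi-style screening run of the kind such a tool automates can be sketched as follows. The L4 orthogonal array is a standard design; the configuration knobs, their levels, and the synthetic `measure` function are hypothetical stand-ins for launching and timing a real configurable system.

```python
# L4 orthogonal array: three two-level factors covered in 4 balanced runs
# instead of the 8 a full factorial would need.
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
FACTORS = ["cache_mb", "threads", "batch"]          # hypothetical knobs
LEVELS = {"cache_mb": [64, 256], "threads": [1, 4], "batch": [10, 100]}

def measure(config):
    # Stand-in for running and timing the real system; a synthetic
    # runtime in which 'threads' dominates and 'batch' has no effect.
    return 100.0 - 15.0 * config["threads"] - 0.01 * config["cache_mb"]

def main_effects(array):
    """Measure each run, then estimate each factor's main effect as the
    difference between its high-level and low-level mean responses."""
    results = [(row, measure({f: LEVELS[f][lvl]
                              for f, lvl in zip(FACTORS, row)}))
               for row in array]
    effects = {}
    for k, factor in enumerate(FACTORS):
        hi = [r for row, r in results if row[k] == 1]
        lo = [r for row, r in results if row[k] == 0]
        effects[factor] = sum(hi) / len(hi) - sum(lo) / len(lo)
    return effects

effects = main_effects(L4)
```

    Because the array is balanced, the other factors cancel out of each main-effect estimate, which is what lets four runs rank three knobs; the dominant factor can then be tuned with a finer follow-up experiment.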

  15. Evolution and Adaptation in Pseudomonas aeruginosa Biofilms Driven by Mismatch Repair System-Deficient Mutators

    DEFF Research Database (Denmark)

    Luján, Adela M.; Maciá, María D.; Yang, Liang

    2011-01-01

    … which are rarely eradicated despite intensive antibiotic therapy. Current knowledge indicates that three major adaptive strategies, biofilm development, phenotypic diversification, and mutator phenotypes [driven by a defective mismatch repair system (MRS)], play important roles in P. aeruginosa chronic infections, but the relationship between these strategies is still poorly understood. We have used the flow-cell biofilm model system to investigate the impact of the mutS-associated mutator phenotype on the development, dynamics, diversification and adaptation of P. aeruginosa biofilms. Through competition … diversification, evidenced by biofilm architecture features and by a wider range and proportion of morphotypic colony variants, respectively. Additionally, morphotypic variants generated in mutator biofilms showed increased competitiveness, providing further evidence for mutator-driven adaptive evolution …

  16. Agile Acceptance Test-Driven Development of Clinical Decision Support Advisories: Feasibility of Using Open Source Software.

    Science.gov (United States)

    Basit, Mujeeb A; Baldwin, Krystal L; Kannan, Vaishnavi; Flahaven, Emily L; Parks, Cassandra J; Ott, Jason M; Willett, Duwayne L

    2018-04-13

    Moving to electronic health records (EHRs) confers substantial benefits but risks unintended consequences. Modern EHRs consist of complex software code with extensive local configurability options, which can introduce defects. Defects in clinical decision support (CDS) tools are surprisingly common. Feasible approaches to prevent and detect defects in EHR configuration, including CDS tools, are needed. In complex software systems, use of test-driven development and automated regression testing promotes reliability. Test-driven development encourages modular, testable design and expanding regression test coverage. Automated regression test suites improve software quality, providing a "safety net" for future software modifications. Each automated acceptance test serves multiple purposes, as requirements (prior to build), acceptance testing (on completion of build), regression testing (once live), and "living" design documentation. Rapid-cycle development or "agile" methods are being successfully applied to CDS development. The agile practice of automated test-driven development is not widely adopted, perhaps because most EHR software code is vendor-developed. However, key CDS advisory configuration design decisions and rules stored in the EHR may prove amenable to automated testing as "executable requirements." We aimed to establish feasibility of acceptance test-driven development of clinical decision support advisories in a commonly used EHR, using an open source automated acceptance testing framework (FitNesse). Acceptance tests were initially constructed as spreadsheet tables to facilitate clinical review. Each table specified one aspect of the CDS advisory's expected behavior. Table contents were then imported into a test suite in FitNesse, which queried the EHR database to automate testing. 
Tests and corresponding CDS configuration were migrated together from the development environment to production, with tests becoming part of the production regression test
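
    The spreadsheet-table style of acceptance test described above can be approximated in plain Python. The advisory rule, the table columns, and every row below are invented examples for illustration; they are not the study's actual CDS logic or FitNesse tables.

```python
# FitNesse-style decision table: each row gives inputs and the expected
# advisory outcome. Clinicians can review this table before any build.
ACCEPTANCE_TABLE = [
    # (age, on_anticoagulant, inr_value, expected_alert)
    (70, True,  4.2, "HIGH_INR_ALERT"),
    (70, True,  2.0, None),
    (70, False, 4.2, None),   # no anticoagulant -> advisory suppressed
]

def cds_advisory(age, on_anticoagulant, inr_value):
    """Toy rule standing in for the configured EHR advisory under test."""
    if on_anticoagulant and inr_value > 3.5:
        return "HIGH_INR_ALERT"
    return None

def run_acceptance_tests(table):
    """Drive the rule with each row; collect any mismatches as failures."""
    failures = []
    for age, on_ac, inr, expected in table:
        actual = cds_advisory(age, on_ac, inr)
        if actual != expected:
            failures.append((age, on_ac, inr, expected, actual))
    return failures

failures = run_acceptance_tests(ACCEPTANCE_TABLE)
```

    In the study's setup the function under test is not local code but a query against the EHR's configured rules, so the same table serves as requirement, acceptance test, and regression test.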

  17. Automated analyses of model-driven artifacts : obtaining insights into industrial application of MDE

    NARCIS (Netherlands)

    Mengerink, J.G.M.; Serebrenik, A.; Schiffelers, R.R.H.; van den Brand, M.G.J.

    2017-01-01

    Over the past years, there has been an increase in the application of model-driven engineering in industry. As in traditional software engineering, understanding how technologies are actually used in practice is essential for developing good tooling and decision-making processes.

  18. Modeling the geographical studies with GeoGebra-software

    Directory of Open Access Journals (Sweden)

    Ionica Soare

    2010-01-01

    The problem of mathematical modeling in geography is one of the most important strategies for establishing the evolution and prediction of geographical phenomena. Models must have a simplified structure that reflects essential components; they must be selective, structured and suggestive, and must approximate reality. Models may be static or dynamic, and may be developed in a theoretical, symbolic, conceptual or mental way and modeled mathematically. The present paper focuses on a virtual model which uses the GeoGebra software, free and available at www.geogebra.org, in order to establish new methods of geographical analysis in a dynamic, didactic way.

  19. Modeling and evaluation of HE driven shock effects in copper with the MTS model

    International Nuclear Information System (INIS)

    Murphy, M.J.; Lassila, D.F.

    1997-01-01

    Many experimental studies have investigated the effect of shock pressure on the post-shock mechanical properties of OFHC copper. These studies have shown that significant hardening occurs during shock loading due to dislocation processes and twinning. It has been demonstrated that when an appropriate initial value of the Mechanical Threshold Stress (MTS) is specified, the post-shock flow stress of OFE copper is well described by relationships derived independently for unshocked materials. In this study we consider the evolution of the MTS during HE driven shock loading processes and its effect on the subsequent flow stress of the copper. An increased post-shock flow stress results in a higher material temperature due to an increase in the plastic work. An increase in temperature leads to thermal softening, which reduces the flow stress. These coupled effects will determine whether there is melting in a shaped charge jet or a necking instability in an EFP. The critical factor is the evolution path followed, combined with the 'current' temperature, plastic strain, and strain rate. Preliminary studies indicate that in simulations of HE driven shock with very high resolution zoning, the MTS saturates because of the rate dependence in the evolution law. Ongoing studies are addressing this and other issues with the goal of developing a version of the MTS model that treats HE driven shock loading, temperature, strain, and rate effects a priori
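The saturation behaviour attributed to the rate dependence of the evolution law can be illustrated with a toy Voce-type hardening law whose saturation stress grows with strain rate. This is a sketch under invented constants, not a calibrated MTS model for copper:

```python
# Illustrative Voce-type MTS evolution law:
#   d(sigma_hat)/d(eps) = theta0 * (1 - sigma_hat / sigma_s),
# where the saturation stress sigma_s grows with strain rate, so the
# hardening saturates at a rate-dependent level. All constants below
# are invented for illustration, not calibrated OFHC copper values.

def saturation_stress(strain_rate, T, sigma_s0=900.0, eps0=1e7, A=0.3):
    """Rate- and temperature-dependent saturation stress (MPa), Kocks-type form."""
    return sigma_s0 * (strain_rate / eps0) ** (A * T / 1000.0)

def evolve_mts(sigma_hat, strain_rate, T,
               d_eps=1e-4, steps=20000, theta0=2000.0):
    """Explicit Euler integration of the hardening law over plastic strain."""
    for _ in range(steps):
        sigma_s = saturation_stress(strain_rate, T)
        sigma_hat += theta0 * (1.0 - sigma_hat / sigma_s) * d_eps
    return sigma_hat
```

Integrating to large plastic strain drives `sigma_hat` toward the rate-dependent saturation value, which is the qualitative behaviour the abstract describes for high-resolution HE-driven shock simulations.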

  20. Fast accelerator driven subcritical system for energy production: nuclear fuel evolution

    International Nuclear Information System (INIS)

    Barros, Graiciany de P.; Pereira, Claubia; Veloso, Maria A.F.; Costa, Antonella L.

    2011-01-01

    Accelerator Driven Systems (ADS) are an innovative type of nuclear system, useful for long-lived fission product transmutation and fuel regeneration. An ADS consists of a sub-critical nuclear core coupled to a proton beam produced by a particle accelerator. The protons are injected into a target, where neutrons are produced by spallation reactions; these neutrons then maintain the fission chain in the sub-critical core. The aim of this study is to investigate the nuclear fuel evolution of a lead-cooled accelerator driven system used for energy production. The fuel studied is a mixture based upon ²³²Th and ²³³U. Since thorium is an abundant fertile material, thorium-cycle fuels are promising for accelerator driven sub-critical systems. The target is a lead spallation target and the core is filled with a hexagonal lattice. High energy neutrons are used to reduce the negative reactivity caused by the presence of protactinium, since this effect is most pronounced in the thermal range of the neutron spectrum. For that reason, no moderator material is added to the system. This work uses the Monte Carlo code MCNPX 2.6.0, which provides depletion/burnup capability. The k_eff evolution, the neutron energy spectrum in the core, and the nuclear fuel evolution using the ADS source (SDEF) and kcode-mode are evaluated during the burnup. (author)
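The reason the k_eff evolution matters so much in a source-driven core can be seen from the standard source-multiplication relation; a minimal sketch with illustrative k_eff values (not taken from the cited MCNPX study):

```python
# Source multiplication in a subcritical core: each source neutron
# sustains M = 1 / (1 - k_eff) fission-chain neutrons on average, so
# core power at fixed accelerator beam current tracks k_eff closely.
# The burnup trajectory below is invented for illustration.

def source_multiplication(k_eff):
    if not 0.0 <= k_eff < 1.0:
        raise ValueError("subcritical operation requires 0 <= k_eff < 1")
    return 1.0 / (1.0 - k_eff)

# k_eff drifting downward during burnup (illustrative values)
burnup_keff = [0.97, 0.96, 0.95]
multiplications = [source_multiplication(k) for k in burnup_keff]
```

A drop of k_eff from 0.97 to 0.95 cuts the multiplication from about 33 to 20, which is why tracking k_eff through the burnup is central to the study.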

  1. State-of-the-Art: Evolution of Software Life Cycle Process for NPPs

    International Nuclear Information System (INIS)

    Suh, Yong Suk; Park, Heui Youn; Son, Ki Sung; Lee, Ki Hyun; Kim, Hyeon Soo

    2007-01-01

    This paper investigates the evolution of the software life cycle process (SLCP) for nuclear power plants (NPPs) based on IEEE Std 7-4.3.2, which has been updated twice (in 1993 and 2003) since it was published in 1982, and on relevant software certifications. IEEE Std 7-4.3.2 specifies additional computer-specific requirements to supplement the criteria and requirements of IEEE Std 603. It also specifies the software quality requirements as follows: computer software shall be developed, modified, or accepted in accordance with an approved software quality assurance (QA) plan. IEEE Std 7-4.3.2-1982 specifies a minimum software development process as follows: plan, design and implementation. ANSI/ASME NQA-1-1979 is not directly related to the software development process but to overall quality assurance criteria. IEEE Std 7-4.3.2-1993 addresses ASME NQA-2a-1990 Part 2.7 for software development requirements. ASME NQA-2a-1990 Part 2.7, which was interpreted into KEPIC QAP-2 II.7, specifies the software development process in more detail as follows: requirements, design, implementation, test, installation and checkout, operation and maintenance, and retirement. Along with this, the software QA plan is emphasized in IEEE Std 730-1989. In IEEE Std 7-4.3.2-2003, IEEE/EIA Std 12207.0-1996 replaces the ASME NQA as a requirement for software development. The evolution of SLCP from ASME NQA to IEEE/EIA Std 12207.0 is discussed in Section 2 of this paper. The publication of IEEE/EIA Std 12207.0 was motivated by industrial experiences and practices to promote the quality of software. In Section 3, three international software certifications relating to IEEE/EIA Std 12207.0 are introduced

  2. From napkin sketches to reliable software

    NARCIS (Netherlands)

    Engelen, L.J.P.

    2012-01-01

    In the past few years, model-driven software engineering (MDSE) and domain-specific modeling languages (DSMLs) have received a lot of attention from both research and industry. The main goal of MDSE is generating software from models that describe systems on a high level of abstraction. DSMLs are

  3. SaLEM (v1.0) – the Soil and Landscape Evolution Model (SaLEM) for simulation of regolith depth in periglacial environments

    Directory of Open Access Journals (Sweden)

    M. Bock

    2018-04-01

    Full Text Available We propose the implementation of the Soil and Landscape Evolution Model (SaLEM) for the spatiotemporal investigation of soil parent material evolution following a lithologically differentiated approach. Relevant parts of the established Geomorphic/Orogenic Landscape Evolution Model (GOLEM) have been adapted for an operational Geographical Information System (GIS) tool within the open-source software framework System for Automated Geoscientific Analyses (SAGA), thus taking advantage of SAGA's capabilities for geomorphometric analyses. The model is driven by palaeoclimatic data (temperature, precipitation) representative of periglacial areas in northern Germany over the last 50 000 years. The initial conditions have been determined for a test site by a digital terrain model and a geological model. Weathering, erosion and transport functions are calibrated using extrinsic (climatic) and intrinsic (lithologic) parameter data. First results indicate that our differentiated SaLEM approach shows some evidence for the spatiotemporal prediction of important soil parent material properties (particularly its depth). Future research will focus on the validation of the results against field data, and the influence of discrete events (mass movements, floods) on soil parent material formation has to be evaluated.
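A weathering function of the kind such soil-evolution models employ can be sketched with an exponential soil-production law; the parameter values below are invented for illustration, not SaLEM's calibrated values:

```python
import math

# Illustrative regolith-depth evolution under an exponential
# soil-production law:
#   dh/dt = P0 * exp(-h / h_star) - E,
# where P0 is the bare-bedrock weathering rate, h_star a decay depth,
# and E a (here constant) erosion rate. Units: metres and years.
# All parameter values are invented for illustration.

def evolve_regolith(h0=0.0, P0=5e-5, h_star=0.5, E=1e-5,
                    dt=10.0, years=50000):
    h = h0
    for _ in range(int(years / dt)):
        h += (P0 * math.exp(-h / h_star) - E) * dt
        h = max(h, 0.0)
    return h
```

With these constants the depth relaxes toward the equilibrium where production balances erosion, h = h_star * ln(P0 / E) ≈ 0.80 m, on a timescale of order h_star / E.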

  4. Model-based Software Engineering

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea works already today. But, there are still difficulties when it comes to behaviour. Actually, there is no lack in models...

  5. The Orion GN and C Data-Driven Flight Software Architecture for Automated Sequencing and Fault Recovery

    Science.gov (United States)

    King, Ellis; Hart, Jeremy; Odegard, Ryan

    2010-01-01

    The Orion Crew Exploration Vehicle (CEV) is being designed to include significantly more automation capability than either the Space Shuttle or the International Space Station (ISS). In particular, the vehicle flight software has requirements to accommodate increasingly automated missions throughout all phases of flight. A data-driven flight software architecture will provide an evolvable automation capability to sequence through Guidance, Navigation & Control (GN&C) flight software modes and configurations while maintaining the required flexibility and human control over the automation. This flexibility is a key aspect needed to address the maturation of operational concepts, to permit ground and crew operators to gain trust in the system and to mitigate unpredictability in human spaceflight. To allow for mission flexibility and reconfigurability, a data-driven approach is being taken to load the mission event plan as well as the flight software artifacts associated with the GN&C subsystem. A database of GN&C level sequencing data is presented which manages and tracks the mission-specific and algorithm parameters to provide a capability to schedule GN&C events within mission segments. The flight software data schema for performing automated mission sequencing is presented with a concept of operations for interactions with ground and onboard crew members. A prototype architecture for fault identification, isolation and recovery interactions with the automation software is presented and discussed as a forward work item.
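The idea of loading the mission event plan as data rather than code can be sketched with a small transition table; all mode and event names below are invented for illustration and are not Orion flight software identifiers:

```python
# Hypothetical data-driven sequencing table: GN&C mode transitions are
# loaded as data, so the mission event plan can change without a code
# change. Mode and event names are invented for illustration.

SEQUENCE_TABLE = {
    # (current_mode, event)            -> next_mode
    ("coast",           "deorbit_burn_cmd"): "powered_descent",
    ("powered_descent", "burn_complete"):    "entry",
    ("entry",           "drogue_deploy"):    "descent_chutes",
    ("descent_chutes",  "touchdown"):        "landed",
}

def step_mode(mode, event):
    """Advance the GN&C mode; events not in the table leave the mode unchanged."""
    return SEQUENCE_TABLE.get((mode, event), mode)

def run_sequence(initial_mode, events):
    mode = initial_mode
    for ev in events:
        mode = step_mode(mode, ev)
    return mode
```

Because the table is data, ground or crew operators could in principle review, override, or reload it, which is the flexibility the architecture above argues for.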

  6. Comparing Transformation Possibilities of Topological Functioning Model and BPMN in the Context of Model Driven Architecture

    Directory of Open Access Journals (Sweden)

    Solomencevs Artūrs

    2016-05-01

    Full Text Available The approach called “Topological Functioning Model for Software Engineering” (TFM4SE) applies the Topological Functioning Model (TFM) for modelling the business system in the context of Model Driven Architecture. TFM is a mathematically formal computation independent model (CIM). TFM4SE is compared to an approach that uses BPMN as a CIM. The comparison focuses on CIM modelling and on transformation to a UML Sequence diagram on the platform independent model (PIM) level. The results show the advantages and drawbacks the formalism of TFM brings into the development.

  7. CoMET: A Mesquite package for comparing models of continuous character evolution on phylogenies

    Directory of Open Access Journals (Sweden)

    Chunghau Lee

    2006-01-01

    Full Text Available Continuously varying traits such as body size or gene expression level evolve during the history of species or gene lineages. To test hypotheses about the evolution of such traits, the maximum likelihood (ML) method is often used. Here we introduce CoMET (Continuous-character Model Evaluation and Testing), a module for Mesquite that automates likelihood computations for nine different models of trait evolution. Due to its few restrictions on input data, CoMET is applicable to testing a wide range of character evolution hypotheses. The CoMET homepage, which links to freely available software and more detailed usage instructions, is located at http://www.lifesci.ucsb.edu/eemb/labs/oakley/software/comet.htm.
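Model comparison of the kind CoMET automates is commonly done by comparing information criteria computed from each model's maximum likelihood; a minimal sketch with invented log-likelihood values:

```python
# Sketch of model comparison by the Akaike Information Criterion:
#   AIC = 2k - 2*lnL,
# where k is the number of free parameters and lnL the maximum
# log-likelihood; lower AIC indicates better support. The model names
# are standard, but the log-likelihood values are invented for
# illustration, not output of CoMET.

def aic(log_likelihood, n_params):
    return 2 * n_params - 2 * log_likelihood

candidate_models = {
    # model name: (max log-likelihood, number of free parameters)
    "brownian_motion":    (-52.3, 2),
    "ornstein_uhlenbeck": (-49.1, 3),
    "directional_trend":  (-51.8, 3),
}

def best_model(models):
    return min(models, key=lambda name: aic(*models[name]))
```

Here the Ornstein-Uhlenbeck model wins despite its extra parameter because its likelihood gain outweighs the 2k penalty.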

  8. Software Engineering Environment for Component-based Design of Embedded Software

    DEFF Research Database (Denmark)

    Guo, Yu

    2010-01-01

    as well as application models in a computer-aided software engineering environment. Furthermore, component models have been realized following carefully developed design patterns, which provide for an efficient and reusable implementation. The components have been ultimately implemented as prefabricated...... executable objects that can be linked together into an executable application. The development of embedded software using the COMDES framework is supported by the associated integrated engineering environment consisting of a number of tools, which support basic functionalities, such as system modelling......, validation, and executable code generation for specific hardware platforms. Developing such an environment and the associated tools is a highly complex engineering task. Therefore, this thesis has investigated key design issues and analysed existing platforms supporting model-driven software development...

  9. Model-driven engineering of information systems principles, techniques, and practice

    CERN Document Server

    Cretu, Liviu Gabriel

    2015-01-01

    Model-driven engineering (MDE) is the automatic production of software from simplified models of structure and functionality. It mainly involves the automation of routine and technologically complex programming tasks, thus allowing developers to focus on the true value-adding functionality that the system needs to deliver. This book serves as an overview of some of the core topics in MDE. The volume is broken into two sections offering a selection of papers that helps the reader not only understand the MDE principles and techniques, but also learn from practical examples. Also covered are the

  10. Model-Based Software Testing for Object-Oriented Software

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Model-based testing is one of the best solutions for testing object-oriented software. It has a better test coverage than other testing styles. Model-based testing takes into consideration behavioural aspects of a class, which are usually unchecked in other testing methods. An increase in the complexity of software has forced the software industry…

  11. Modeling Student Software Testing Processes: Attitudes, Behaviors, Interventions, and Their Effects

    Science.gov (United States)

    Buffardi, Kevin John

    2014-01-01

    Effective software testing identifies potential bugs and helps correct them, producing more reliable and maintainable software. As software development processes have evolved, incremental testing techniques have grown in popularity, particularly with introduction of test-driven development (TDD). However, many programmers struggle to adopt TDD's…

  12. LAPSUS: soil erosion - landscape evolution model

    Science.gov (United States)

    van Gorp, Wouter; Temme, Arnaud; Schoorl, Jeroen

    2015-04-01

    LAPSUS is a soil erosion - landscape evolution model which is capable of simulating the landscape evolution of a gridded DEM using multiple water-, mass-movement- and human-driven processes on multiple temporal and spatial scales. It is able to deal with a variety of human landscape interventions, such as land-use management and tillage, and it can model their interactions with natural processes. The complex spatially explicit feedbacks the model simulates demonstrate the importance of the spatial interaction of human activity and erosion-deposition patterns. In addition, LAPSUS can model shallow landsliding, slope collapse, creep, solifluction, biological and frost weathering, and fluvial behaviour. Furthermore, an algorithm to deal with natural depressions has been added, and event-based modelling with an improved infiltration description and dust deposition has been pursued. LAPSUS has been used for case studies in many parts of the world and is continuously developing and expanding. It is now available for third-party and educational use. It has a comprehensive user interface and is accompanied by a manual and exercises. The LAPSUS model is highly suitable for quantifying and understanding catchment-scale erosion processes. More information and a download link are available on www.lapsusmodel.nl.
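The capacity-limited erosion-deposition logic typical of such models can be sketched in one dimension: transport capacity follows a stream-power-like law, cells erode where capacity exceeds the sediment in transport and receive deposition where it falls short. The constants and the single flow path are simplifications invented for illustration (LAPSUS itself works on a gridded DEM with multiple flow directions):

```python
# 1-D sketch of capacity-limited erosion/deposition along a flow path,
# in the spirit of LAPSUS-style landscape evolution models:
#   capacity C = K * Q**m * S**n,
# erode a cell when the sediment in transport is below C, deposit the
# excess when it is above C. All constants are illustrative.

def route_sediment(elevations, discharge=1.0, K=0.1, m=1.0, n=1.0, dx=1.0):
    """Return new elevations after one timestep of downslope routing."""
    z = list(elevations)
    sed = 0.0  # sediment currently in transport
    for i in range(len(z) - 1):
        slope = max((z[i] - z[i + 1]) / dx, 0.0)
        capacity = K * (discharge ** m) * (slope ** n)
        if capacity > sed:           # detachment: erode the cell
            z[i] -= capacity - sed
            sed = capacity
        else:                        # capacity drop: deposit the excess
            z[i] += sed - capacity
            sed = capacity
    z[-1] += sed                     # remaining load reaches the outlet cell
    return z
```

Because every elevation change is mirrored by an equal and opposite change in the transported load, total elevation (mass) is conserved over the sweep.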

  13. Enabling System Evolution through Configuration Management on the Hardware/Software Boundary

    NARCIS (Netherlands)

    Krikhaar, R.L.; Mosterman, W.; Veerman, N.P.; Verhoef, C.

    2009-01-01

    As the use of software and electronics in modern products is omnipresent and continuously increasing, companies in the embedded systems industry face increasing complexity in controlling and enabling the evolution of their IT-intensive products. Traditionally, product configurations and their

  14. Model-driven Privacy Assessment in the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Knirsch, Fabian [Salzburg Univ. (Austria); Engel, Dominik [Salzburg Univ. (Austria); Neureiter, Christian [Salzburg Univ. (Austria); Frincu, Marc [Univ. of Southern California, Los Angeles, CA (United States); Prasanna, Viktor [Univ. of Southern California, Los Angeles, CA (United States)

    2015-02-09

    In a smart grid, data and information are transported, transmitted, stored, and processed with various stakeholders having to cooperate effectively. Furthermore, personal data is the key to many smart grid applications, and therefore privacy impacts have to be taken into account. For an effective smart grid, well-integrated solutions are crucial, and to achieve a high degree of customer acceptance, privacy should already be considered at design time of the system. To assist system engineers in the early design phase, frameworks for the automated privacy evaluation of use cases are important. For evaluation, use cases for services and software architectures need to be formally captured in a standardized and commonly understood manner. In order to ensure this common understanding for all kinds of stakeholders, reference models have recently been developed. In this paper we present a model-driven approach for the automated assessment of such services and software architectures in the smart grid that builds on the standardized reference models. The focus of the qualitative and quantitative evaluation is on privacy. For evaluation, the framework draws on use cases from the University of Southern California microgrid.

  15. Framework for Computer-Aided Evolution of Object-Oriented Designs

    NARCIS (Netherlands)

    Ciraci, S.; van den Broek, P.M.; Aksit, Mehmet

    2008-01-01

    In this paper, we describe a framework for the computer-aided evolution of the designs of object-oriented software systems. Evolution mechanisms are software structures that prepare software for certain types of evolution. The framework uses a database which holds the evolution mechanisms, modeled

  16. A Model-Driven Co-Design Framework for Fusing Control and Scheduling Viewpoints.

    Science.gov (United States)

    Sundharam, Sakthivel Manikandan; Navet, Nicolas; Altmeyer, Sebastian; Havet, Lionel

    2018-02-20

    Model-Driven Engineering (MDE) is widely applied in the industry to develop new software functions and integrate them into the existing run-time environment of a Cyber-Physical System (CPS). The design of a software component involves designers from various viewpoints such as control theory, software engineering, safety, etc. In practice, while a designer from one discipline focuses on the core aspects of his field (for instance, a control engineer concentrates on designing a stable controller), he neglects or considers less importantly the other engineering aspects (for instance, real-time software engineering or energy efficiency). This may cause some of the functional and non-functional requirements not to be met satisfactorily. In this work, we present a co-design framework based on timing tolerance contract to address such design gaps between control and real-time software engineering. The framework consists of three steps: controller design, verified by jitter margin analysis along with co-simulation, software design verified by a novel schedulability analysis, and the run-time verification by monitoring the execution of the models on target. This framework builds on CPAL (Cyber-Physical Action Language), an MDE design environment based on model-interpretation, which enforces a timing-realistic behavior in simulation through timing and scheduling annotations. The application of our framework is exemplified in the design of an automotive cruise control system.
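The notion of a timing tolerance contract can be sketched as a simple check of observed task activations against a declared period and jitter margin; this is an illustrative sketch, not the CPAL API or the paper's schedulability analysis:

```python
# Sketch of a timing-tolerance contract check (not the CPAL API):
# the control side declares the period and the jitter margin it
# tolerates; observed task activation times are verified against
# that contract at run time.

def check_timing_contract(activations, period, jitter_margin):
    """Return True iff every activation stays within +/- jitter_margin
    of its nominal release time k * period."""
    for k, t in enumerate(activations):
        if abs(t - k * period) > jitter_margin:
            return False
    return True

# Illustrative activation trace in milliseconds for a 10 ms task.
observed = [0.0, 10.0, 20.1, 29.8]
```

A monitor built on such a predicate closes the loop between the control-theoretic assumption (tolerable jitter) and the real-time software implementation that must honour it.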

  17. A Model-Driven Co-Design Framework for Fusing Control and Scheduling Viewpoints

    Science.gov (United States)

    Navet, Nicolas; Havet, Lionel

    2018-01-01

    Model-Driven Engineering (MDE) is widely applied in the industry to develop new software functions and integrate them into the existing run-time environment of a Cyber-Physical System (CPS). The design of a software component involves designers from various viewpoints such as control theory, software engineering, safety, etc. In practice, while a designer from one discipline focuses on the core aspects of his field (for instance, a control engineer concentrates on designing a stable controller), he neglects or considers less importantly the other engineering aspects (for instance, real-time software engineering or energy efficiency). This may cause some of the functional and non-functional requirements not to be met satisfactorily. In this work, we present a co-design framework based on timing tolerance contract to address such design gaps between control and real-time software engineering. The framework consists of three steps: controller design, verified by jitter margin analysis along with co-simulation, software design verified by a novel schedulability analysis, and the run-time verification by monitoring the execution of the models on target. This framework builds on CPAL (Cyber-Physical Action Language), an MDE design environment based on model-interpretation, which enforces a timing-realistic behavior in simulation through timing and scheduling annotations. The application of our framework is exemplified in the design of an automotive cruise control system. PMID:29461489

  18. A Model-Driven Co-Design Framework for Fusing Control and Scheduling Viewpoints

    Directory of Open Access Journals (Sweden)

    Sakthivel Manikandan Sundharam

    2018-02-01

    Full Text Available Model-Driven Engineering (MDE) is widely applied in the industry to develop new software functions and integrate them into the existing run-time environment of a Cyber-Physical System (CPS). The design of a software component involves designers from various viewpoints such as control theory, software engineering, safety, etc. In practice, while a designer from one discipline focuses on the core aspects of his field (for instance, a control engineer concentrates on designing a stable controller), he neglects or considers less importantly the other engineering aspects (for instance, real-time software engineering or energy efficiency). This may cause some of the functional and non-functional requirements not to be met satisfactorily. In this work, we present a co-design framework based on timing tolerance contract to address such design gaps between control and real-time software engineering. The framework consists of three steps: controller design, verified by jitter margin analysis along with co-simulation, software design verified by a novel schedulability analysis, and the run-time verification by monitoring the execution of the models on target. This framework builds on CPAL (Cyber-Physical Action Language), an MDE design environment based on model-interpretation, which enforces a timing-realistic behavior in simulation through timing and scheduling annotations. The application of our framework is exemplified in the design of an automotive cruise control system.

  19. From requirements to Java in a snap model-driven requirements engineering in practice

    CERN Document Server

    Smialek, Michal

    2015-01-01

    This book provides a coherent methodology for Model-Driven Requirements Engineering which stresses the systematic treatment of requirements within the realm of modelling and model transformations. The underlying basic assumption is that detailed requirements models are used as first-class artefacts playing a direct role in constructing software. To this end, the book presents the Requirements Specification Language (RSL) that allows precision and formality, which eventually permits automation of the process of turning requirements into a working system by applying model transformations and co

  20. The Impact of Modeling Assumptions in Galactic Chemical Evolution Models

    Science.gov (United States)

    Côté, Benoit; O'Shea, Brian W.; Ritter, Christian; Herwig, Falk; Venn, Kim A.

    2017-02-01

    We use the OMEGA galactic chemical evolution code to investigate how the assumptions used for the treatment of galactic inflows and outflows impact numerical predictions. The goal is to determine how our capacity to reproduce the chemical evolution trends of a galaxy is affected by the choice of implementation used to include those physical processes. In pursuit of this goal, we experiment with three different prescriptions for galactic inflows and outflows and use OMEGA within a Markov Chain Monte Carlo code to recover the set of input parameters that best reproduces the chemical evolution of nine elements in the dwarf spheroidal galaxy Sculptor. This provides a consistent framework for comparing the best-fit solutions generated by our different models. Despite their different degrees of intended physical realism, we found that all three prescriptions can reproduce in an almost identical way the stellar abundance trends observed in Sculptor. This result supports the similar conclusions originally claimed by Romano & Starkenburg for Sculptor. While the three models have the same capacity to fit the data, the best values recovered for the parameters controlling the number of SNe Ia and the strength of galactic outflows are substantially different and in fact mutually exclusive from one model to another. For the purpose of understanding how a galaxy evolves, we conclude that only reproducing the evolution of a limited number of elements is insufficient and can lead to misleading conclusions. More elements or additional constraints such as the Galaxy’s star-formation efficiency and the gas fraction are needed in order to break the degeneracy between the different modeling assumptions. Our results show that the successes and failures of chemical evolution models are predominantly driven by the input stellar yields, rather than by the complexity of the Galaxy model itself. Simple models such as OMEGA are therefore sufficient to test and validate stellar yields.
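The kind of galactic inflow/outflow prescription being compared can be sketched as a one-zone gas reservoir in which outflows scale with the star-formation rate through a mass-loading factor; all parameter values are invented for illustration and are not OMEGA's:

```python
# Minimal one-zone gas-reservoir sketch in the spirit of OMEGA-like
# chemical evolution codes:
#   dM_gas/dt = inflow - (1 - R) * SFR - eta * SFR,
# with SFR = M_gas / tau_star, R the stellar return fraction, and
# eta the outflow mass-loading factor. All values are illustrative.

def evolve_gas(m_gas=1.0e8, inflow=0.0, R=0.3, eta=5.0,
               tau_star=1.0e9, dt=1.0e6, steps=1000):
    history = [m_gas]
    for _ in range(steps):
        sfr = m_gas / tau_star
        m_gas += (inflow - (1.0 - R) * sfr - eta * sfr) * dt
        m_gas = max(m_gas, 0.0)
        history.append(m_gas)
    return history
```

Varying `eta` (the parameter the abstract finds degenerate with the SN Ia normalisation) directly changes how quickly the reservoir is depleted, illustrating why different outflow prescriptions can fit the same abundance trends with mutually exclusive parameter values.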

  1. Robust recurrent neural network modeling for software fault detection and correction prediction

    International Nuclear Information System (INIS)

    Hu, Q.P.; Xie, M.; Ng, S.H.; Levitin, G.

    2007-01-01

    Software fault detection and correction processes are related although different, and they should be studied together. A practical approach is to apply software reliability growth models to model fault detection, and fault correction process is assumed to be a delayed process. On the other hand, the artificial neural networks model, as a data-driven approach, tries to model these two processes together with no assumptions. Specifically, feedforward backpropagation networks have shown their advantages over analytical models in fault number predictions. In this paper, the following approach is explored. First, recurrent neural networks are applied to model these two processes together. Within this framework, a systematic networks configuration approach is developed with genetic algorithm according to the prediction performance. In order to provide robust predictions, an extra factor characterizing the dispersion of prediction repetitions is incorporated into the performance function. Comparisons with feedforward neural networks and analytical models are developed with respect to a real data set
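The analytical baseline that the neural approach is compared against can be sketched as a Goel-Okumoto detection curve with correction modeled as detection delayed by a fixed lag; parameter values are invented for illustration:

```python
import math

# Sketch of the delayed-correction baseline contrasted with the
# neural models: detection follows the Goel-Okumoto NHPP mean value
# function
#   m_d(t) = a * (1 - exp(-b * t)),
# and correction is detection delayed by a fixed lag. The parameter
# values (a, b, lag) are invented for illustration.

def detected(t, a=100.0, b=0.05):
    """Expected cumulative number of faults detected by time t."""
    return a * (1.0 - math.exp(-b * t))

def corrected(t, lag=5.0, **kw):
    """Expected cumulative faults corrected: detection shifted by `lag`."""
    return detected(max(t - lag, 0.0), **kw)
```

The recurrent-network approach in the paper replaces these closed-form assumptions with a model learned jointly from the detection and correction data.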

  2. Model Driven Engineering with Ontology Technologies

    Science.gov (United States)

    Staab, Steffen; Walter, Tobias; Gröner, Gerd; Parreiras, Fernando Silva

    Ontologies constitute formal models of some aspect of the world that may be used for drawing interesting logical conclusions even for large models. Software models capture relevant characteristics of a software artifact to be developed, yet, most often these software models have limited formal semantics, or the underlying (often graphical) software language varies from case to case in a way that makes it hard if not impossible to fix its semantics. In this contribution, we survey the use of ontology technologies for software modeling in order to carry over advantages from ontology technologies to the software modeling domain. It will turn out that ontology-based metamodels constitute a core means for exploiting expressive ontology reasoning in the software modeling domain while remaining flexible enough to accommodate varying needs of software modelers.

  3. Test Driven Development of Scientific Models

    Science.gov (United States)

    Clune, Thomas L.

    2014-01-01

    Test-Driven Development (TDD), a software development process that promises many advantages for developer productivity and software reliability, has become widely accepted among professional software engineers. As the name suggests, TDD practitioners alternate between writing short automated tests and producing code that passes those tests. Although this overly simplified description will undoubtedly sound prohibitively burdensome to many uninitiated developers, the advent of powerful unit-testing frameworks greatly reduces the effort required to produce and routinely execute suites of tests. By testimony, many developers find TDD to be addicting after only a few days of exposure, and find it unthinkable to return to previous practices. After a brief overview of the TDD process and my experience in applying the methodology for development activities at Goddard, I will delve more deeply into some of the challenges that are posed by numerical and scientific software as well as tools and implementation approaches that should address those challenges.
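The red-green rhythm described above can be sketched with Python's unittest framework; the function under test and its Magnus-type coefficients are a worked example chosen here, not code from the talk:

```python
import math
import unittest

# Minimal illustration of the TDD rhythm for scientific code: the tests
# below are (conceptually) written first and fail; the function is then
# written to make them pass. The Magnus-type coefficients are the common
# textbook values, used purely as a worked example.

def saturation_vapor_pressure(T_celsius):
    """Magnus-type approximation of saturation vapor pressure (hPa)."""
    return 6.112 * math.exp(17.62 * T_celsius / (243.12 + T_celsius))

class TestSaturationVaporPressure(unittest.TestCase):
    def test_freezing_point(self):
        self.assertAlmostEqual(saturation_vapor_pressure(0.0), 6.112, places=3)

    def test_monotonic_in_temperature(self):
        self.assertLess(saturation_vapor_pressure(10.0),
                        saturation_vapor_pressure(20.0))

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestSaturationVaporPressure)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Once such a suite exists, it doubles as the regression safety net TDD promises: any later change to the physics routine reruns against the same assertions.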

  4. Assisted stellar suicide: the wind-driven evolution of the recurrent nova T Pyxidis

    Science.gov (United States)

    Knigge, Ch.; King, A. R.; Patterson, J.

    2000-12-01

    We show that the extremely high luminosity of the short-period recurrent nova T Pyx in quiescence can be understood if this system is a wind-driven supersoft x-ray source (SSS). In this scenario, a strong, radiation-induced wind is excited from the secondary star and accelerates the binary evolution. The accretion rate is therefore much higher than in an ordinary cataclysmic binary at the same orbital period, as is the luminosity of the white dwarf primary. In the steady state, the enhanced luminosity is just sufficient to maintain the wind from the secondary. The accretion rate and luminosity predicted by the wind-driven model for T Pyx are in good agreement with the observational evidence. X-ray observations with Chandra or XMM may be able to confirm T Pyx's status as a SSS. T Pyx's lifetime in the wind-driven state is on the order of a million years. Its ultimate fate is not certain, but the system may very well end up destroying itself, either via the complete evaporation of the secondary star, or in a Type Ia supernova if the white dwarf reaches the Chandrasekhar limit. Thus either the primary, the secondary, or both may currently be committing assisted stellar suicide.

  5. Possibilities and limitations of applying software reliability growth models to safety-critical software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Jang, Seung Cheol; Ha, Jae Joo

    2007-01-01

    It is generally known that software reliability growth models such as the Jelinski-Moranda model and the Goel-Okumoto's Non-Homogeneous Poisson Process (NHPP) model cannot be applied to safety-critical software due to a lack of software failure data. In this paper, by applying two of the most widely known software reliability growth models to sample software failure data, we demonstrate the possibility of using the software reliability growth models to prove the high reliability of safety-critical software. The high sensitivity of a piece of software's reliability to software failure data, as well as a lack of sufficient software failure data, is also identified as a possible limitation when applying the software reliability growth models to safety-critical software
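A Goel-Okumoto fit of the kind discussed can be sketched with a closed-form least-squares estimate of a for each candidate b and a coarse grid search over b; the failure data below are synthetic, generated from known parameters so the fit can be checked:

```python
import math

# Sketch of fitting the Goel-Okumoto NHPP mean value function
#   m(t) = a * (1 - exp(-b * t))
# to cumulative failure counts by least squares: for fixed b the model
# is linear in a, so the best a has a closed form, and a coarse grid
# search picks b. The failure data are synthetic (a=50, b=0.1),
# generated for illustration rather than taken from a real project.

def fit_goel_okumoto(times, counts, b_grid):
    best = None
    for b in b_grid:
        x = [1.0 - math.exp(-b * t) for t in times]
        a = sum(xi * yi for xi, yi in zip(x, counts)) / sum(xi * xi for xi in x)
        sse = sum((yi - a * xi) ** 2 for xi, yi in zip(x, counts))
        if best is None or sse < best[2]:
            best = (a, b, sse)
    return best

times = [5, 10, 20, 40, 80]
counts = [50 * (1 - math.exp(-0.1 * t)) for t in times]
b_grid = [i / 1000 for i in range(50, 201)]   # candidate b in 0.05 .. 0.20
a_hat, b_hat, sse = fit_goel_okumoto(times, counts, b_grid)
```

With only a handful of data points, small perturbations of `counts` move `a_hat` and `b_hat` substantially, which is the sensitivity to scarce failure data the abstract identifies as a limitation for safety-critical software.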

  6. Tractable flux-driven temperature, density, and rotation profile evolution with the quasilinear gyrokinetic transport model QuaLiKiz

    Science.gov (United States)

Citrin, J.; Bourdelle, C.; Casson, F. J.; Angioni, C.; Bonanomi, N.; Camenen, Y.; Garbet, X.; Garzotti, L.; Görler, T.; Gürcan, O.; Koechl, F.; Imbeaux, F.; Linder, O.; van de Plassche, K.; Strand, P.; Szepesi, G.; JET Contributors

    2017-12-01

Quasilinear turbulent transport models are a successful tool for prediction of core tokamak plasma profiles in many regimes. Their success hinges on the reproduction of local nonlinear gyrokinetic fluxes. We focus on significant progress in the quasilinear gyrokinetic transport model QuaLiKiz (Bourdelle et al 2016 Plasma Phys. Control. Fusion 58 014036), which employs an approximated solution of the mode structures to significantly speed up computation time compared to full linear gyrokinetic solvers. Optimisation of the dispersion relation solution algorithm within integrated modelling applications leads to flux calculations 10^6-10^7 times faster than local nonlinear simulations. This allows tractable simulation of flux-driven dynamic profile evolution including all transport channels: ion and electron heat, main particles, impurities, and momentum. Furthermore, QuaLiKiz now includes the impact of rotation and temperature anisotropy induced poloidal asymmetry on heavy impurity transport, important for W-transport applications. Application within the JETTO integrated modelling code results in 1 s of JET plasma simulation within 10 h using 10 CPUs. Simultaneous predictions of core density, temperature, and toroidal rotation profiles for both JET hybrid and baseline experiments are presented, covering both ion and electron turbulence scales. The simulations are successfully compared to measured profiles, with agreement mostly in the 5%-25% range according to standard figures of merit. QuaLiKiz is now open source and available at www.qualikiz.com.
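Nothing of the model itself fits in a few lines, but the class of problem it solves — flux-driven profile evolution under stiff, critical-gradient transport — can be caricatured in one dimension. The sketch below (all parameters arbitrary, not QuaLiKiz) evolves a temperature profile whose turbulent diffusivity switches on above a critical gradient; the resulting profile pins itself just above marginal stability.

```python
# Toy 1-D flux-driven profile evolution (illustration only, not QuaLiKiz):
# turbulent heat diffusivity switches on above a critical gradient, so the
# steady-state profile sits just above marginal stability ("stiffness").
N = 30                       # radial nodes 0..N; node N is the fixed edge
dx, dt, steps = 1.0 / N, 1e-4, 20000
chi0, grad_crit, chi_neo, source = 1.0, 2.0, 0.05, 5.0
T = [1.0] * (N + 1)          # flat initial profile, edge held at T = 1
for _ in range(steps):
    q = [0.0] * (N + 1)      # outward heat flux through inner face of node i
    for i in range(1, N + 1):
        g = (T[i - 1] - T[i]) / dx                    # -dT/dx, > 0 outward
        q[i] = (chi0 * max(g - grad_crit, 0.0) + chi_neo) * g
    T[0] += dt * (source - q[1] / dx)                 # symmetric core: q[0] = 0
    for i in range(1, N):
        T[i] += dt * (source - (q[i + 1] - q[i]) / dx)
edge_grad = (T[N - 1] - T[N]) / dx
print(round(T[0], 2), round(edge_grad, 2))  # edge gradient settles above grad_crit
```

However large the heat source, the steady gradient barely exceeds the critical value — the stiffness that makes quasilinear flux models predictive across regimes.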

  7. Concrete syntax definition for modeling languages

    OpenAIRE

    Fondement, Frédéric; Baar, Thomas

    2008-01-01

Model Driven Engineering (MDE) promotes the use of models as primary artefacts of a software development process, as an attempt to handle complexity through abstraction, e.g. to cope with the evolution of execution platforms. MDE follows a stepwise approach, prescribing the development of abstract models that are progressively refined with details of the final deployment platforms. Thus, the application of an MDE process results in various models residing at various levels of a...

  8. Concrete syntax definition for modeling languages

    OpenAIRE

    Fondement, Frédéric

    2007-01-01

Model Driven Engineering (MDE) promotes the use of models as primary artefacts of a software development process, as an attempt to handle complexity through abstraction, e.g. to cope with the evolution of execution platforms. MDE follows a stepwise approach, prescribing the development of abstract models that are progressively refined with details of the final deployment platforms. Thus, the application of an MDE process results in various models residing at various levels of a...

  9. Software Cost-Estimation Model

    Science.gov (United States)

    Tausworthe, R. C.

    1985-01-01

    Software Cost Estimation Model SOFTCOST provides automated resource and schedule model for software development. Combines several cost models found in open literature into one comprehensive set of algorithms. Compensates for nearly fifty implementation factors relative to size of task, inherited baseline, organizational and system environment and difficulty of task.
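Models of the family SOFTCOST combines typically estimate nominal effort as a power law in size and then adjust it by multiplicative implementation factors. A generic sketch of that shape (the constants are basic-COCOMO values for "organic" projects, used purely for illustration — they are not SOFTCOST's own algorithms):

```python
def effort_person_months(kloc, multipliers, a=2.4, b=1.05):
    """Effort = a * KLOC^b * product(cost-driver multipliers).
    Constants a, b are illustrative (basic COCOMO, organic mode)."""
    eaf = 1.0                      # effort adjustment factor
    for m in multipliers:
        eaf *= m
    return a * kloc ** b * eaf

# 32 KLOC task: experienced team (0.85) but tight timing constraints (1.11)
print(round(effort_person_months(32, [0.85, 1.11]), 1))  # ~86 person-months
```

SOFTCOST's fifty-odd implementation factors play the role of the `multipliers` list here: each one nudges the nominal size-driven estimate up or down.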

  10. Energy-driven surface evolution in beta-MnO2 structures

    Energy Technology Data Exchange (ETDEWEB)

    Yao, Wentao; Yuan, Yifei; Asayesh-Ardakani, Hasti; Huang, Zhennan; Long, Fei; Friedrich, Craig; Amine, Khalil; Lu, Jun; Shahbazian-Yassar, Reza

    2018-01-01

Exposed crystal facets directly affect the electrochemical/catalytic performance of MnO2 materials in applications such as supercapacitors, rechargeable batteries, and fuel cells. Currently, the facet-controlled synthesis of MnO2 faces serious challenges due to the lack of an in-depth understanding of its surface evolution mechanisms. Here, combining aberration-corrected scanning transmission electron microscopy (STEM) and high-resolution TEM, we revealed a mutual energy-driven mechanism between beta-MnO2 nanowires and microstructures that dominated the evolution of the lateral facets in both structures. The evolution of the lateral surfaces followed the elimination of the {100} facets and increased the occupancy of {110} facets with increasing hydrothermal retention time. Both self-growth and oriented attachment along their {100} facets were observed as two different ways to reduce the surface energies of the beta-MnO2 structures. High-density screw dislocations with the 1/2 < 100 > Burgers vector were generated consequently. The observed surface evolution phenomenon offers guidance for the facet-controlled growth of high-performance beta-MnO2 materials for applications in metal-air batteries, fuel cells, supercapacitors, etc.

  11. A Model-Driven Framework to Develop Personalized Health Monitoring

    Directory of Open Access Journals (Sweden)

    Algimantas Venčkauskas

    2016-07-01

Both distributed healthcare systems and the Internet of Things (IoT) are currently hot topics. The latter is a new computing paradigm enabling advanced capabilities in engineering various applications, including those for healthcare. For such systems, the core social requirement is the privacy/security of patient information, along with technical requirements (e.g., energy consumption) and capabilities for adaptability and personalization. Typically, the functionality of the systems is predefined by the patient’s data collected using sensor networks along with medical instrumentation; then, the data is transferred through the Internet for treatment and decision-making. Creating such systems is therefore challenging. In this paper, we propose a model-driven framework to develop an IoT-based prototype and its reference architecture for personalized health monitoring (PHM) applications. The framework contains a multi-layered structure with feature-based modeling and feature model transformations at the top and application software generation at the bottom. We have validated the framework using available tools and developed an experimental PHM to test some aspects of the functionality of the reference architecture in real time. The main contribution of the paper is the development of the model-driven computational framework with emphasis on the synergistic effect of security and energy issues.

  12. Supernova-driven outflows and chemical evolution of dwarf spheroidal galaxies.

    Science.gov (United States)

    Qian, Yong-Zhong; Wasserburg, G J

    2012-03-27

    We present a general phenomenological model for the metallicity distribution (MD) in terms of [Fe/H] for dwarf spheroidal galaxies (dSphs). These galaxies appear to have stopped accreting gas from the intergalactic medium and are fossilized systems with their stars undergoing slow internal evolution. For a wide variety of infall histories of unprocessed baryonic matter to feed star formation, most of the observed MDs can be well described by our model. The key requirement is that the fraction of the gas mass lost by supernova-driven outflows is close to unity. This model also predicts a relationship between the total stellar mass and the mean metallicity for dSphs in accord with properties of their dark matter halos. The model further predicts as a natural consequence that the abundance ratios [E/Fe] for elements such as O, Mg, and Si decrease for stellar populations at the higher end of the [Fe/H] range in a dSph. We show that, for infall rates far below the net rate of gas loss to star formation and outflows, the MD in our model is very sharply peaked at one [Fe/H] value, similar to what is observed in most globular clusters. This result suggests that globular clusters may be end members of the same family as dSphs.
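The decisive role of outflows can be illustrated with the simplest textbook ("leaky-box") chemical-evolution model — cruder than the paper's formulation, but showing the same lever: winds ejecting gas at η times the star-formation rate reduce the effective yield to p/(1+η), shifting the entire metallicity distribution to lower [Fe/H]. All numbers below are illustrative.

```python
import math

def md(feh, p_eff):
    """dN/d[Fe/H] for a leaky box: proportional to Z * exp(-Z/p_eff)."""
    z = 10.0 ** feh                      # metallicity in solar units
    return math.log(10.0) * (z / p_eff) * math.exp(-z / p_eff)

p, eta = 0.01, 9.0                       # true yield; strong outflow
p_eff = p / (1.0 + eta)                  # effective yield = 0.001
grid = [i / 100.0 for i in range(-400, 100)]
peak = max(grid, key=lambda f: md(f, p_eff))
print(peak)   # MD peaks at [Fe/H] = log10(p_eff) = -3: metal-poor, as in dSphs
```

Pushing the outflow fraction toward unity drives the peak ever more metal-poor — the qualitative behaviour the paper's fits to dSph data require.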

  13. Comparative empirical analysis of flow-weighted transit route networks in R-space and evolution modeling

    Science.gov (United States)

    Huang, Ailing; Zang, Guangzhi; He, Zhengbing; Guan, Wei

    2017-05-01

Urban public transit systems are typical mixed complex networks with dynamic flow, and their evolution should be a process coupling topological structure with flow dynamics, which has received little attention. This paper uses the R-space representation to make a comparative empirical analysis of Beijing’s flow-weighted transit route network (TRN), finding that the Beijing TRNs of both 2011 and 2015 exhibit scale-free properties. As such, we propose an evolution model driven by flow to simulate the development of TRNs with consideration of the passengers’ dynamic behaviors triggered by topological change. The model treats the evolution of a TRN as an iterative process. At each time step, a certain number of new routes are generated, driven by travel demand; this leads to dynamic evolution of the new routes’ flow and triggers perturbations in nearby routes that further impact the next round of route openings. We present a theoretical analysis based on mean-field theory, as well as numerical simulation of this model. The results obtained agree well with our empirical analysis, indicating that the model can simulate TRN evolution with scale-free properties in the distributions of node strength and degree. The purpose of this paper is to illustrate the global evolutionary mechanism of transit networks, which can be used to develop planning and design strategies for real TRNs.
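The paper's flow-driven rule needs its demand data, but the degree-based ancestor of such rules — preferential attachment, which produces the same scale-free signature in the degree distribution — fits in a few lines (seed topology and parameters are arbitrary):

```python
import random

def grow(n_nodes, m=2, seed=7):
    """Scale-free network growth by preferential attachment: each new
    node links to m existing nodes drawn with probability proportional
    to degree (every node appears in `endpoints` once per edge end)."""
    random.seed(seed)
    endpoints = []
    for i in range(m + 1):                 # seed ring on m+1 nodes
        endpoints += [i, (i + 1) % (m + 1)]
    degree = {i: 2 for i in range(m + 1)}
    for new in range(m + 1, n_nodes):
        targets = set()
        while len(targets) < m:            # m distinct, degree-biased picks
            targets.add(random.choice(endpoints))
        degree[new] = 0
        for t in targets:
            endpoints += [new, t]
            degree[new] += 1
            degree[t] += 1
    return degree

deg = grow(3000)
print(max(deg.values()))   # a few hubs dominate: heavy-tailed degree distribution
```

In the paper's model, passenger flow rather than raw degree plays the role of the attachment weight, but the mechanism that yields the heavy tail is the same "rich get richer" feedback.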

  14. A new practice-driven approach to develop software in a cyber-physical system environment

    Science.gov (United States)

    Jiang, Yiping; Chen, C. L. Philip; Duan, Junwei

    2016-02-01

Cyber-physical system (CPS) is an emerging area, which cannot work efficiently without proper software handling of the data and business logic. Software and middleware are the soul of the CPS. The software development of CPS is a critical issue because of its complexity in a large-scale realistic system. Furthermore, the object-oriented approach (OOA) is often used to develop CPS software, and it needs some improvements to fit the characteristics of CPS. To develop software in a CPS environment, a new systematic approach is proposed in this paper. It comes from practice and has evolved within software companies. It consists of (A) requirement analysis in an event-oriented way, (B) architecture design in a data-oriented way, (C) detailed design and coding in an object-oriented way and (D) testing in an event-oriented way. It is a new approach based on OOA; the difference compared with OOA is that the proposed approach has different emphases and measures at every stage. It is more in accord with the characteristics of event-driven CPS. In CPS software development, one should focus on the events more than on the functions or objects. A case study of a smart home system is designed to reveal the effectiveness of the approach, showing that the approach is also easy to apply in practice owing to some simplifications. The running result illustrates the validity of this approach.
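The event-oriented emphasis of stages (A) and (D) can be made concrete with a small publish/subscribe skeleton of the kind a smart-home CPS is naturally organized around (event names and handlers are invented for illustration, not taken from the paper's case study):

```python
class EventBus:
    """Minimal publish/subscribe bus: handlers register for named
    events; emitting an event calls them in registration order."""
    def __init__(self):
        self.handlers = {}
    def on(self, event, handler):
        self.handlers.setdefault(event, []).append(handler)
    def emit(self, event, **data):
        for handler in self.handlers.get(event, []):
            handler(**data)

bus, log = EventBus(), []
bus.on("door_opened", lambda room: log.append(f"lights on in {room}"))
bus.on("door_opened", lambda room: log.append(f"camera armed in {room}"))
bus.emit("door_opened", room="hall")
print(log)  # ['lights on in hall', 'camera armed in hall']
```

Requirements and tests are then naturally phrased per event ("when `door_opened` fires, these reactions occur") rather than per object, which is the shift in emphasis the approach argues for.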

  15. Analytical method of CIM to PIM transformation in Model Driven Architecture (MDA

    Directory of Open Access Journals (Sweden)

    Martin Kardos

    2010-06-01

Information system models at higher levels of abstraction have become a daily routine in many software companies. The concept of Model Driven Architecture (MDA), published by the standardization body OMG since 2001, has become a basis for the creation of software applications and information systems. MDA specifies four levels of abstraction: the top three levels are created as graphical models and the last one as an implementation code model. Much MDA research focuses on the lower levels and the transformations between them. The top level of abstraction, called the Computation Independent Model (CIM), and its transformation to the lower level, called the Platform Independent Model (PIM), is a less extensive research topic. Considering the great importance and usability of this level in the practice of IS development, our research activity is now focused on this highest level of abstraction, the CIM, and its possible transformation to the lower PIM level. In this article we present a possible solution for CIM modeling and an analytic method for its transformation to PIM. Keywords: transformation, MDA, CIM, PIM, UML, DFD.
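A toy of what a CIM-to-PIM transformation rule can look like (the model schemas are invented for illustration and are not the authors' method): DFD-style business entities and processes on the CIM level are mapped to classes with attributes and operations on the PIM level.

```python
# Hypothetical miniature of a CIM-to-PIM transformation. The CIM holds
# computation-independent business knowledge (entities, processes); a
# rule set maps it to a platform-independent, UML-ish class model.
cim = {
    "entities": {"Order": ["id", "total"], "Customer": ["id", "name"]},
    "processes": [{"name": "PlaceOrder", "reads": "Customer", "writes": "Order"}],
}

def cim_to_pim(cim):
    pim = {name: {"attributes": attrs, "operations": []}
           for name, attrs in cim["entities"].items()}
    for proc in cim["processes"]:
        # rule: a process that writes an entity becomes an operation on
        # that entity's class, parameterized by what the process reads
        pim[proc["writes"]]["operations"].append(
            f'{proc["name"]}({proc["reads"].lower()})')
    return pim

print(cim_to_pim(cim)["Order"]["operations"])  # ['PlaceOrder(customer)']
```

Real MDA tooling expresses such rules in transformation languages (e.g. declarative mapping rules over metamodels) rather than ad hoc code, but the shape — pattern-match the source model, emit target-model elements — is the same.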

  16. Generic domain models in software engineering

    Science.gov (United States)

    Maiden, Neil

    1992-01-01

    This paper outlines three research directions related to domain-specific software development: (1) reuse of generic models for domain-specific software development; (2) empirical evidence to determine these generic models, namely elicitation of mental knowledge schema possessed by expert software developers; and (3) exploitation of generic domain models to assist modelling of specific applications. It focuses on knowledge acquisition for domain-specific software development, with emphasis on tool support for the most important phases of software development.

  17. Particle force model effects in a shock-driven multiphase instability

    Science.gov (United States)

    Black, W. J.; Denissen, N.; McFarland, J. A.

    2018-05-01

    This work presents simulations on a shock-driven multiphase instability (SDMI) at an initial particle volume fraction of 1% with the addition of a suite of particle force models applicable in dense flows. These models include pressure-gradient, added-mass, and interparticle force terms in an effort to capture the effects neighboring particles have in non-dilute flow regimes. Two studies are presented here: the first seeks to investigate the individual contributions of the force models, while the second study focuses on examining the effect of these force models on the hydrodynamic evolution of a SDMI with various particle relaxation times (particle sizes). In the force study, it was found that the pressure gradient and interparticle forces have little effect on the instability under the conditions examined, while the added-mass force decreases the vorticity deposition and alters the morphology of the instability. The relaxation-time study likewise showed a decrease in metrics associated with the evolution of the SDMI for all sizes when the particle force models were included. The inclusion of these models showed significant morphological differences in both the particle and carrier species fields, which increased as particle relaxation times increased.

  18. Software reliability models for critical applications

    Energy Technology Data Exchange (ETDEWEB)

    Pham, H.; Pham, M.

    1991-12-01

This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  20. A generic open-source software framework supporting scenario simulations in bioterrorist crises.

    Science.gov (United States)

    Falenski, Alexander; Filter, Matthias; Thöns, Christian; Weiser, Armin A; Wigger, Jan-Frederik; Davis, Matthew; Douglas, Judith V; Edlund, Stefan; Hu, Kun; Kaufman, James H; Appel, Bernd; Käsbohrer, Annemarie

    2013-09-01

    Since the 2001 anthrax attack in the United States, awareness of threats originating from bioterrorism has grown. This led internationally to increased research efforts to improve knowledge of and approaches to protecting human and animal populations against the threat from such attacks. A collaborative effort in this context is the extension of the open-source Spatiotemporal Epidemiological Modeler (STEM) simulation and modeling software for agro- or bioterrorist crisis scenarios. STEM, originally designed to enable community-driven public health disease models and simulations, was extended with new features that enable integration of proprietary data as well as visualization of agent spread along supply and production chains. STEM now provides a fully developed open-source software infrastructure supporting critical modeling tasks such as ad hoc model generation, parameter estimation, simulation of scenario evolution, estimation of effects of mitigation or management measures, and documentation. This open-source software resource can be used free of charge. Additionally, STEM provides critical features like built-in worldwide data on administrative boundaries, transportation networks, or environmental conditions (eg, rainfall, temperature, elevation, vegetation). Users can easily combine their own confidential data with built-in public data to create customized models of desired resolution. STEM also supports collaborative and joint efforts in crisis situations by extended import and export functionalities. In this article we demonstrate specifically those new software features implemented to accomplish STEM application in agro- or bioterrorist crisis scenarios.

  1. RSYST: From nuclear reactor calculations towards a highly sophisticated scientific software integration environment

    International Nuclear Information System (INIS)

    Noack, M.; Seybold, J.; Ruehle, R.

    1996-01-01

The software environment RSYST was originally used to solve problems of reactor physics. The consideration of advanced scientific simulation requirements and the strict application of modern software design principles led to a system which is perfectly suitable to solve problems in various complex scientific problem domains. Starting with a review of the early days of RSYST, we describe its steady evolution, driven by the need for a software environment which combines the advantages of a high-performance database system with the capability to integrate sophisticated scientific technical applications. The RSYST architecture is presented and the data modelling capabilities are described. To demonstrate the powerful possibilities and flexibility of the RSYST environment, we describe a wide range of RSYST applications, e.g., mechanical simulations of multibody systems, which are used in biomechanical research, civil engineering and robotics. In addition, a hypermedia system which is used for scientific technical training and documentation is presented. (orig.) [de]

  2. Standards Interoperability: Application of Contemporary Software Safety Assurance Standards to the Evolution of Legacy Software

    National Research Council Canada - National Science Library

    Meacham, Desmond J

    2006-01-01

    .... The proposed formal model is then applied to the requirements for RTCA DO-178B and MIL-STD-498 as representative examples of contemporary and legacy software standards. The results provide guidance on how to achieve airworthiness certification for modified legacy software, whilst maximizing the use of software products from the previous development.

  3. Data Driven Economic Model Predictive Control

    Directory of Open Access Journals (Sweden)

    Masoud Kheradmandi

    2018-04-01

This manuscript addresses the problem of data-driven, model-based economic model predictive control (MPC) design. To this end, first, a data-driven Lyapunov-based MPC is designed and shown to be capable of stabilizing a system at an unstable equilibrium point. The data-driven Lyapunov-based MPC utilizes a linear time invariant (LTI) model, cognizant of the fact that the training data, owing to the unstable nature of the equilibrium point, has to be obtained from closed-loop operation or experiments. Simulation results are first presented demonstrating closed-loop stability under the proposed data-driven Lyapunov-based MPC. The underlying data-driven model is then utilized as the basis to design an economic MPC. The economic improvements yielded by the proposed method are illustrated through simulations on a nonlinear chemical process system example.
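The first step — identifying an LTI model from closed-loop data gathered around an unstable equilibrium — can be sketched for a scalar plant with ordinary least squares (the Lyapunov constraints and economic cost of the actual design are beyond a snippet; all numbers are invented):

```python
import random

random.seed(0)
# True plant x+ = a*x + b*u with a = 1.2: open-loop unstable, so the
# training data must come from closed-loop operation (as the paper notes).
a_true, b_true = 1.2, 1.0
x, X, U, Y = 0.5, [], [], []
for _ in range(200):
    u = -1.0 * x + random.gauss(0.0, 0.1)     # stabilizing law + excitation
    X.append(x)
    U.append(u)
    x = a_true * x + b_true * u
    Y.append(x)
# least squares for [a, b]: solve the 2x2 normal equations by hand
sxx = sum(xi * xi for xi in X)
sxu = sum(xi * ui for xi, ui in zip(X, U))
suu = sum(ui * ui for ui in U)
sxy = sum(xi * yi for xi, yi in zip(X, Y))
suy = sum(ui * yi for ui, yi in zip(U, Y))
det = sxx * suu - sxu * sxu
a_hat = (sxy * suy * 0 + sxy * suu - suy * sxu) / det
b_hat = (suy * sxx - sxy * sxu) / det
print(a_hat, b_hat)        # recovers a = 1.2, b = 1.0 (state equation noise-free)
# one-step use of the identified model: choose u so the predicted state is 0
u_ctrl = -(a_hat / b_hat) * x
```

The excitation term in the input is essential: without it, the perfectly correlated closed-loop signals make the normal equations singular, which is why identification from closed-loop data needs care.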

  4. Mutational landscape of EGFR-, MYC-, and Kras-driven genetically engineered mouse models of lung adenocarcinoma.

    Science.gov (United States)

    McFadden, David G; Politi, Katerina; Bhutkar, Arjun; Chen, Frances K; Song, Xiaoling; Pirun, Mono; Santiago, Philip M; Kim-Kiselak, Caroline; Platt, James T; Lee, Emily; Hodges, Emily; Rosebrock, Adam P; Bronson, Roderick T; Socci, Nicholas D; Hannon, Gregory J; Jacks, Tyler; Varmus, Harold

    2016-10-18

    Genetically engineered mouse models (GEMMs) of cancer are increasingly being used to assess putative driver mutations identified by large-scale sequencing of human cancer genomes. To accurately interpret experiments that introduce additional mutations, an understanding of the somatic genetic profile and evolution of GEMM tumors is necessary. Here, we performed whole-exome sequencing of tumors from three GEMMs of lung adenocarcinoma driven by mutant epidermal growth factor receptor (EGFR), mutant Kirsten rat sarcoma viral oncogene homolog (Kras), or overexpression of MYC proto-oncogene. Tumors from EGFR- and Kras-driven models exhibited, respectively, 0.02 and 0.07 nonsynonymous mutations per megabase, a dramatically lower average mutational frequency than observed in human lung adenocarcinomas. Tumors from models driven by strong cancer drivers (mutant EGFR and Kras) harbored few mutations in known cancer genes, whereas tumors driven by MYC, a weaker initiating oncogene in the murine lung, acquired recurrent clonal oncogenic Kras mutations. In addition, although EGFR- and Kras-driven models both exhibited recurrent whole-chromosome DNA copy number alterations, the specific chromosomes altered by gain or loss were different in each model. These data demonstrate that GEMM tumors exhibit relatively simple somatic genotypes compared with human cancers of a similar type, making these autochthonous model systems useful for additive engineering approaches to assess the potential of novel mutations on tumorigenesis, cancer progression, and drug sensitivity.

  5. Mutational landscape of EGFR-, MYC-, and Kras-driven genetically engineered mouse models of lung adenocarcinoma

    Science.gov (United States)

    McFadden, David G.; Politi, Katerina; Bhutkar, Arjun; Chen, Frances K.; Song, Xiaoling; Pirun, Mono; Santiago, Philip M.; Kim-Kiselak, Caroline; Platt, James T.; Lee, Emily; Hodges, Emily; Rosebrock, Adam P.; Bronson, Roderick T.; Socci, Nicholas D.; Hannon, Gregory J.; Jacks, Tyler; Varmus, Harold

    2016-01-01

    Genetically engineered mouse models (GEMMs) of cancer are increasingly being used to assess putative driver mutations identified by large-scale sequencing of human cancer genomes. To accurately interpret experiments that introduce additional mutations, an understanding of the somatic genetic profile and evolution of GEMM tumors is necessary. Here, we performed whole-exome sequencing of tumors from three GEMMs of lung adenocarcinoma driven by mutant epidermal growth factor receptor (EGFR), mutant Kirsten rat sarcoma viral oncogene homolog (Kras), or overexpression of MYC proto-oncogene. Tumors from EGFR- and Kras-driven models exhibited, respectively, 0.02 and 0.07 nonsynonymous mutations per megabase, a dramatically lower average mutational frequency than observed in human lung adenocarcinomas. Tumors from models driven by strong cancer drivers (mutant EGFR and Kras) harbored few mutations in known cancer genes, whereas tumors driven by MYC, a weaker initiating oncogene in the murine lung, acquired recurrent clonal oncogenic Kras mutations. In addition, although EGFR- and Kras-driven models both exhibited recurrent whole-chromosome DNA copy number alterations, the specific chromosomes altered by gain or loss were different in each model. These data demonstrate that GEMM tumors exhibit relatively simple somatic genotypes compared with human cancers of a similar type, making these autochthonous model systems useful for additive engineering approaches to assess the potential of novel mutations on tumorigenesis, cancer progression, and drug sensitivity. PMID:27702896

  6. Modelling Coupled Processes in the Evolution of Repository Engineered Barrier Systems using QPAC-EBS

    Energy Technology Data Exchange (ETDEWEB)

    Maul, Philip; Benbow, Steven; Bond, Alex; Robinson, Peter (Quintessa Limited, Henley-on-Thames (United Kingdom))

    2010-08-15

A satisfactory understanding of the evolution of repository engineered barrier systems (EBS) is an essential part of the safety case for the repository. This involves consideration of coupled Thermal (T), Hydro (H), Mechanical (M) and Chemical (C) processes. Quintessa's general-purpose modelling code QPAC is capable of representing strongly coupled non-linear processes and has been used in a wide range of applications. This code is the basis for software used by Quintessa in studies of the evolution of the EBS in a deep repository for spent nuclear fuel undertaken for SKI and then SSM since 2007. The collection of software components employed has been referred to collectively as QPAC-EBS, consisting of the core QPAC code together with relevant modules for T, H, M and C processes. QPAC-EBS employs a fundamentally different approach from dedicated codes that model such processes (although few codes can represent each type of process), enabling the specification of new processes and the associated governing equations in code input. Studies undertaken to date have demonstrated that QPAC-EBS can be used effectively to investigate both the early evolution of the EBS and important scenarios for the later evolution of the system when buffer erosion and canister corrosion may occur. A key issue for modelling EBS evolution is the satisfactory modelling of the behaviour of the bentonite buffer. Bentonite is a difficult material to model, partly because of the complex coupled mechanical, hydro and chemical processes involved in swelling during resaturation. Models employed to date have generally taken an empirical approach, but a new model developed during the EU THERESA project could be further developed to provide a better representation of these processes. QPAC-EBS could play an important role in supporting SSM's review of the forthcoming SR-Site assessment by SKB if used by Quintessa in independent supporting calculations. To date radionuclide transport calculations

  7. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  8. Applied software risk management a guide for software project managers

    CERN Document Server

    Pandian, C Ravindranath

    2006-01-01

    Few software projects are completed on time, on budget, and to their original specifications. Focusing on what practitioners need to know about risk in the pursuit of delivering software projects, Applied Software Risk Management: A Guide for Software Project Managers covers key components of the risk management process and the software development process, as well as best practices for software risk identification, risk planning, and risk analysis. Written in a clear and concise manner, this resource presents concepts and practical insight into managing risk. It first covers risk-driven project management, risk management processes, risk attributes, risk identification, and risk analysis. The book continues by examining responses to risk, the tracking and modeling of risks, intelligence gathering, and integrated risk management. It concludes with details on drafting and implementing procedures. A diary of a risk manager provides insight in implementing risk management processes.Bringing together concepts ...

  9. Footprints of directional selection in wild Atlantic salmon populations: evidence for parasite-driven evolution?

    Science.gov (United States)

    Zueva, Ksenia J; Lumme, Jaakko; Veselov, Alexey E; Kent, Matthew P; Lien, Sigbjørn; Primmer, Craig R

    2014-01-01

    Mechanisms of host-parasite co-adaptation have long been of interest in evolutionary biology; however, determining the genetic basis of parasite resistance has been challenging. Current advances in genome technologies provide new opportunities for obtaining a genome-scale view of the action of parasite-driven natural selection in wild populations and thus facilitate the search for specific genomic regions underlying inter-population differences in pathogen response. European populations of Atlantic salmon (Salmo salar L.) exhibit natural variance in susceptibility levels to the ectoparasite Gyrodactylus salaris Malmberg 1957, ranging from resistance to extreme susceptibility, and are therefore a good model for studying the evolution of virulence and resistance. However, distinguishing the molecular signatures of genetic drift and environment-associated selection in small populations such as land-locked Atlantic salmon populations presents a challenge, specifically in the search for pathogen-driven selection. We used a novel genome-scan analysis approach that enabled us to i) identify signals of selection in salmon populations affected by varying levels of genetic drift and ii) separate potentially selected loci into the categories of pathogen (G. salaris)-driven selection and selection acting upon other environmental characteristics. A total of 4631 single nucleotide polymorphisms (SNPs) were screened in Atlantic salmon from 12 different northern European populations. We identified three genomic regions potentially affected by parasite-driven selection, as well as three regions presumably affected by salinity-driven directional selection. Functional annotation of candidate SNPs is consistent with the role of the detected genomic regions in immune defence and, implicitly, in osmoregulation. These results provide new insights into the genetic basis of pathogen susceptibility in Atlantic salmon and will enable future searches for the specific genes involved.

  10. Modelling Laccoliths: Fluid-Driven Fracturing in the Lab

    Science.gov (United States)

    Ball, T. V.; Neufeld, J. A.

    2017-12-01

    Current modelling of the formation of laccoliths neglects the necessity to fracture rock layers for propagation to occur [1]. In magmatic intrusions at depth the idea of fracture toughness is used to characterise fracturing; however, an analogue for near-surface intrusions has yet to be explored [2]. We propose an analytical model for laccolith emplacement that accounts for the energy required to fracture at the tip of an intrusion. For realistic physical parameters we find that a lag region exists between the fluid magma front and the crack tip, where large negative pressures in the tip cause volatiles to exsolve from the magma. Crucially, the dynamics of this tip region controls the spreading, due to the competition between viscous forces and fracture energy. We conduct a series of complementary experiments to investigate fluid-driven fracturing of adhered layers and confirm the existence of two regimes: viscosity-dominant spreading, controlled by the pressure in the lag region, and fracture-energy-dominant spreading, controlled by the energy required to fracture layers. Our experiments provide the first observations of a vapour tip and its evolution. These experiments and our simplified model provide insight into the key physical processes in near-surface magmatic intrusions, with applications to fluid-driven fracturing more generally. [1] Michaut, J. Geophys. Res. 116(B5), B05205. [2] Bunger & Cruden, J. Geophys. Res. 116(B2), B02203.

  11. Active resources concept of computation for enterprise software

    Directory of Open Access Journals (Sweden)

    Koryl Maciej

    2017-06-01

    Full Text Available Traditional computational models for enterprise software are still largely centralized. However, the rapid growth of modern computation techniques and frameworks means that contemporary software is becoming more and more distributed. Towards the development of a new, complete and coherent solution for constructing distributed enterprise software, a synthesis of three well-grounded concepts is proposed: the Domain-Driven Design technique of software engineering, the REST architectural style, and the actor model of computation. The result is a new resource-based framework which, after its first cases of use, appears useful and worthy of further research.
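    The synthesis the author proposes, resources that own their state and are driven by messages, can be illustrated with a minimal actor-style resource in Python. The class, message, and resource names below are hypothetical illustrations, not taken from the paper's framework:

```python
import queue
import threading

class ResourceActor:
    """A domain resource as an actor: private state, sequential mailbox."""
    def __init__(self, name, stock=0):
        self.name = name
        self._stock = stock
        self._mailbox = queue.Queue()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def _run(self):
        # Messages are processed one at a time, so state needs no locking.
        while True:
            msg, args, reply = self._mailbox.get()
            if msg == "stop":
                break
            if msg == "reserve":      # REST-like verb acting on the resource
                ok = args <= self._stock
                if ok:
                    self._stock -= args
                reply.put(ok)
            elif msg == "stock":
                reply.put(self._stock)

    def ask(self, msg, args=None):
        """Send a message and block for the actor's reply."""
        reply = queue.Queue()
        self._mailbox.put((msg, args, reply))
        return reply.get()

    def stop(self):
        self._mailbox.put(("stop", None, None))
        self._thread.join()

item = ResourceActor("item-42", stock=5)
assert item.ask("reserve", 3) is True
assert item.ask("reserve", 4) is False   # only 2 left in stock
assert item.ask("stock") == 2
item.stop()
```

    Because each resource serializes its own message handling, many such actors can run distributed across processes or machines without shared-state coordination, which is the property the abstract's synthesis relies on.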

  12. CLAIRE, an event-driven simulation tool for testing software

    International Nuclear Information System (INIS)

    Raguideau, J.; Schoen, D.; Henry, J.Y.; Boulc'h, J.

    1994-06-01

    CLAIRE is a software tool created to perform validations on executable codes or on specifications of distributed real-time applications for nuclear safety. CLAIRE can be used both to verify the safety properties by modelling the specifications, and also to validate the final code by simulating the behaviour of its equipment and software interfaces. It can be used to observe and provide dynamic control of the simulation process, and also to record changes to the simulated data for off-line analysis. (R.P.)

  13. Possibilities and Limitations of Applying Software Reliability Growth Models to Safety- Critical Software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Jang, Seung Cheol; Ha, Jae Joo

    2006-01-01

    As digital systems are gradually introduced to nuclear power plants (NPPs), the need to quantitatively analyze the reliability of those digital systems is also increasing. Kang and Sung identified (1) software reliability, (2) common-cause failures (CCFs), and (3) fault coverage as the three most critical factors in the reliability analysis of digital systems. For estimating the reliability of safety-critical software (the software used in safety-critical digital systems), Bayesian Belief Networks (BBNs) seem to be the most widely used approach. The use of BBNs in reliability estimation of safety-critical software is basically a process of indirectly assigning a reliability based on various observed information and experts' opinions. When software testing results or software failure histories are available, we can instead directly estimate the reliability of the software using software reliability growth models such as the Jelinski-Moranda model and Goel-Okumoto's nonhomogeneous Poisson process (NHPP) model. Even though it is generally known that software reliability growth models cannot be applied to safety-critical software, due to the small number of failure data expected from its testing, we try to find the possibilities, and the corresponding limitations, of applying software reliability growth models to safety-critical software.
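    The Goel-Okumoto NHPP model named above is simple enough to sketch: its mean value function m(t) = a(1 − e^(−bt)) gives the expected cumulative number of failures after test time t, from which a conditional reliability over the next interval follows. A minimal Python sketch; the parameter values are illustrative, not taken from the paper:

```python
import math

def go_mean_failures(t, a, b):
    """Goel-Okumoto NHPP mean value function m(t) = a * (1 - e^(-b*t)).
    a = total expected number of faults, b = per-fault detection rate."""
    return a * (1.0 - math.exp(-b * t))

def go_reliability(x, t, a, b):
    """Probability of no failure in (t, t+x] given testing up to time t:
    R(x|t) = exp(-(m(t+x) - m(t)))."""
    return math.exp(-(go_mean_failures(t + x, a, b) - go_mean_failures(t, a, b)))

a, b = 100.0, 0.05                     # illustrative parameters only
m_40 = go_mean_failures(40, a, b)      # ~86.5 expected failures by t = 40
r = go_reliability(10, 40, a, b)       # reliability over the next 10 time units
```

    In practice a and b are fitted to observed failure times by maximum likelihood; the abstract's point is that safety-critical testing rarely yields enough failures for that fit to be stable.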

  14. Implementation, Comparison and Application of an Average Simulation Model of a Wind Turbine Driven Doubly Fed Induction Generator

    Directory of Open Access Journals (Sweden)

    Lidula N. Widanagama Arachchige

    2017-10-01

    Full Text Available Wind-turbine-driven doubly-fed induction generators (DFIGs) are widely used in the wind power industry. With the increasing penetration of wind farms, analysis of their effect on power systems has become a critical requirement. This paper presents a detailed model of a wind-turbine-driven DFIG, using conventional vector controls, in which the power electronics (PE) converters are represented with device-level models, and proposes an average model that eliminates the PE converters. The PSCAD/EMTDC™ (4.6) electromagnetic transient simulation software is used to develop both the detailed and the proposed average model of a DFIG. The comparison of the two models reveals that the designed average DFIG model is adequate for simulating and analyzing most transient conditions.

  15. A novel industry grade dataset for fault prediction based on model-driven developed automotive embedded software

    NARCIS (Netherlands)

    Altinger, H.; Siegl, S.; Dajsuren, Y.; Wotawa, F.

    2015-01-01

    In this paper, we present a novel industry dataset on static software and change metrics for Matlab/Simulink models and their corresponding auto-generated C source code. The dataset comprises data from three automotive projects developed and tested according to industry standards and restrictive

  16. Multi-Level Formation of Complex Software Systems

    Directory of Open Access Journals (Sweden)

    Hui Li

    2016-05-01

    Full Text Available We present a multi-level formation model for complex software systems. Previous works abstract software systems into software networks for further study, but usually investigate those networks at the class level. In contrast, our treatment of software systems as multi-level networks is more realistic. In particular, the software networks are organized into three levels of granularity, which represents the modularity and hierarchy in the formation process of real-world software systems. More importantly, simulations based on this model have generated more realistic structural properties of software networks, such as power laws, clustering and modularization. On the basis of this model, how the structure of software systems affects software design principles is then explored, which could be helpful for understanding software evolution and software engineering practices.
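    A common way such formation models reproduce power-law degree distributions is preferential attachment, where new nodes link to existing nodes with probability proportional to their degree. A minimal pure-Python sketch of that mechanism (a generic illustration, not the paper's actual multi-level model):

```python
import random

def grow_network(n, m=2, seed=1):
    """Preferential attachment: each new node links to m existing nodes,
    chosen with probability proportional to degree (heavy-tailed degrees)."""
    rng = random.Random(seed)
    targets = [0, 1]          # node ids, repeated once per incident edge
    edges = [(0, 1)]
    for new in range(2, n):
        chosen = set()
        while len(chosen) < min(m, new):
            chosen.add(rng.choice(targets))   # degree-proportional pick
        for old in chosen:
            edges.append((new, old))
            targets += [new, old]
    return edges

edges = grow_network(2000)
degree = {}
for u, v in edges:
    degree[u] = degree.get(u, 0) + 1
    degree[v] = degree.get(v, 0) + 1
avg = sum(degree.values()) / len(degree)
# Hubs emerge: the maximum degree far exceeds the mean, as in
# power-law-like software networks.
print(max(degree.values()), round(avg, 2))
```

    The early nodes accumulate links and become hubs, mirroring how a few core classes or modules in real software systems end up heavily depended upon.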

  17. Impact of Internet of Things on Software Business Model and Software Industry

    OpenAIRE

    Murari, Bhanu Teja

    2016-01-01

    Context: Internet of Things (IoT) technology is rapidly growing and is changing the business environment for software organizations. There is a need to understand which factors of the business model a software company should focus on to obtain benefits from the potential that IoT offers. This thesis also focuses on finding the impact of IoT on the software business model and the software industry, especially on software development. Objectives: In this thesis, we do research on IoT software b...

  18. DYNAMICALLY DRIVEN EVOLUTION OF THE INTERSTELLAR MEDIUM IN M51

    International Nuclear Information System (INIS)

    Koda, Jin; Scoville, Nick; Potts, Ashley E.; Carpenter, John M.; Corder, Stuartt A.; Patience, Jenny; Sargent, Anneila I.; Sawada, Tsuyoshi; La Vigne, Misty A.; Vogel, Stuart N.; White, Stephen M.; Zauderer, B. Ashley; Pound, Marc W.; Wright, Melvyn C. H.; Plambeck, Richard L.; Bock, Douglas C. J.; Hawkins, David; Hodges, Mark; Lamb, James W.; Kemball, Athol

    2009-01-01

    Massive star formation occurs in giant molecular clouds (GMCs); an understanding of the evolution of GMCs is a prerequisite to developing theories of star formation and galaxy evolution. We report the highest-fidelity observations of the grand-design spiral galaxy M51 in carbon monoxide (CO) emission, revealing the evolution of GMCs vis-a-vis the large-scale galactic structure and dynamics. The most massive GMCs (giant molecular associations (GMAs)) are first assembled and then broken up as the gas flows through the spiral arms. The GMAs and their H2 molecules are not fully dissociated into atomic gas as predicted in stellar feedback scenarios, but are fragmented into smaller GMCs upon leaving the spiral arms. The remnants of GMAs are detected as the chains of GMCs that emerge from the spiral arms into interarm regions. The kinematic shear within the spiral arms is sufficient to unbind the GMAs against self-gravity. We conclude that the evolution of GMCs is driven by large-scale galactic dynamics: their coagulation into GMAs is due to spiral arm streaming motions upon entering the arms, followed by fragmentation due to shear as they leave the arms on the downstream side. In M51, the majority of the gas remains molecular from arm entry through the interarm region and into the next spiral arm passage.

  19. Integrated modeling of software cost and quality

    International Nuclear Information System (INIS)

    Rone, K.Y.; Olson, K.M.

    1994-01-01

    In modeling the cost and quality of software systems, the relationship between cost and quality must be considered. This explicit relationship is dictated by the criticality of the software being developed. The balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and the developers with respect to the processes being employed.
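    The trade-off described above can be made concrete with a toy calculation. Everything below is an illustrative assumption, not one of the paper's actual models: the delivered error rate is assumed to fall with the fraction of labor spent on independent verification and validation (IV&V), while total labor grows:

```python
import math

def delivered_error_rate(base_rate, ivv_fraction, effectiveness=8.0):
    """ILLUSTRATIVE ONLY: errors per KSLOC surviving to delivery, assuming
    IV&V labor removes defects with diminishing returns (exponential form
    and 'effectiveness' constant are assumptions for this sketch)."""
    return base_rate * math.exp(-effectiveness * ivv_fraction)

def total_labor(dev_months, ivv_fraction):
    """Total person-months when IV&V is a fixed fraction of project labor."""
    return dev_months / (1.0 - ivv_fraction)

# More IV&V: fewer delivered errors, higher total cost.
for f in (0.0, 0.1, 0.2):
    print(f, round(delivered_error_rate(5.0, f), 2), round(total_labor(100, f), 1))
```

    The criticality of the software then determines where on this curve a project should sit: safety-critical systems accept the higher labor cost for the lower delivered error rate.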

  20. The MOOC Hype: Can We Ignore It? : Reflections on the Current Use of Massive Open Online Courses in Software Modeling Education

    NARCIS (Netherlands)

    Stikkolorum, D.R.; Demuth, B.; Zaytsev, V.; Boulanger, F.; Gray, J.; Demuth, B.; Stikkolorum, D.

    2014-01-01

    At the end of the 2014 MODELS Educators Symposium, a panel discussed with the audience the topic of using MOOCs (Massive Open Online Courses) in model-driven engineering education. Currently, there are no MOOCs that target software modeling. The panel debated whether it would be worthwhile to

  1. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among the various software life cycle development activities and management decision-making processes. The model is designed to be a planning tool to examine trade-offs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.

  2. Scenario driven data modelling: a method for integrating diverse sources of data and data streams

    Science.gov (United States)

    2011-01-01

    Background Biology is rapidly becoming a data-intensive, data-driven science. It is essential that data are represented and connected in ways that best represent their full conceptual content and allow both automated integration and data-driven decision-making. Recent advancements in distributed multi-relational directed graphs, implemented in the form of the Semantic Web, make it possible to deal with complicated heterogeneous data in new and interesting ways. Results This paper presents a new approach, scenario driven data modelling (SDDM), that integrates multi-relational directed graphs with data streams. SDDM can be applied to virtually any data integration challenge with widely divergent types of data and data streams. In this work, we explored integrating genetics data with reports from traditional media. SDDM was applied to the New Delhi metallo-beta-lactamase gene (NDM-1), an emerging global health threat. The SDDM process constructed a scenario, created an RDF multi-relational directed graph that linked diverse types of data to the Semantic Web, implemented RDF conversion tools (RDFizers) to bring content into the Semantic Web, identified data streams and analytical routines to analyse those streams, and identified user requirements and graph traversals to meet end-user requirements. Conclusions We provided an example where SDDM was applied to a complex data integration challenge. The process created a model of the emerging NDM-1 health threat, identified and filled gaps in that model, and constructed reliable software that monitored data streams based on the scenario-derived multi-relational directed graph. The SDDM process significantly reduced the software requirements phase by letting the scenario and resulting multi-relational directed graph define what is possible and then set the scope of the user requirements. Approaches like SDDM will be critical to the future of data-intensive, data-driven science because they automate the process of converting
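    The core SDDM data structure, a multi-relational directed graph of subject-predicate-object triples queried with wildcard patterns, can be sketched in a few lines of Python. The entity names below are hypothetical stand-ins echoing the NDM-1 scenario, not the paper's actual RDF content:

```python
# Minimal multi-relational directed graph (triple store) sketch.
triples = set()

def add(subject, predicate, obj):
    triples.add((subject, predicate, obj))

def query(subject=None, predicate=None, obj=None):
    """Match triples; None acts as a wildcard, like a SPARQL basic pattern."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# Link genetics data and a media report to the same resistance gene
# (all identifiers below are hypothetical examples).
add("gene:blaNDM-1", "confersResistanceTo", "drug:carbapenem")
add("isolate:K-pneumoniae-05-506", "carriesGene", "gene:blaNDM-1")
add("news:report-2010-08", "mentions", "gene:blaNDM-1")

# Graph traversal: which media reports mention a gene carried by an isolate?
genes = {o for _, _, o in query(predicate="carriesGene")}
reports = [s for g in genes for s, _, _ in query(predicate="mentions", obj=g)]
print(reports)   # → ['news:report-2010-08']
```

    A scenario in SDDM terms fixes which predicates and traversals matter; the monitoring software then only has to watch data streams that can produce triples matching those patterns.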

  3. Good Practices for Software Configuration Management with MDA Methodology

    OpenAIRE

    Manuel Morejón Espinosa

    2012-01-01

    Software Configuration Management (SCM) forms part of the software development process. Its principal goal is to coordinate this development and minimize all possible errors. In order to meet this goal, various activities are carried out, among which can be identified: item identification, change control, version control, audit and status reporting. Inside enterprise applications, software development can be guided by a system model, as a methodology. The name of this methodology is Model Driven ...

  4. Online Rule Generation Software Process Model

    OpenAIRE

    Sudeep Marwaha; Alka Aroa; Satma M C; Rajni Jain; R C Goyal

    2013-01-01

    For production systems like expert systems, rule generation software can facilitate faster deployment. The software process model for rule generation using a decision tree classifier refers to the various steps required to be executed for the development of a web-based software model for decision rule generation. Royce's final waterfall model has been used in this paper to explain the software development process. The paper presents the specific output of the various steps of the modified wat...

  5. An Assessment between Software Development Life Cycle Models of Software Engineering

    OpenAIRE

    Er. KESHAV VERMA; Er. PRAMOD KUMAR; Er. MOHIT KUMAR; Er.GYANESH TIWARI

    2013-01-01

    This research deals with an essential and important subject in the digital world. It is concerned with the software management processes that examine the part of software development covered by the development models, which are called the software development life cycle. It presents five of the development models, namely: waterfall, iteration, V-shaped, spiral and extreme programming. These models have advantages and disadvantages as well. So, the main objective of this research is to represent dissimilar mod...

  6. Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project

    Science.gov (United States)

    Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo

    2017-04-01

    The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in the 2.0 version (Endrizzi et al., 2014), which was released as a Free Software Open-source project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. This weakness hindered its scientific potential and its use both as a standalone package and, more importantly, in an integrated way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes as its starting point the 2.0 version, scientifically tested and published. This version, together with several test cases based on recently published or available GEOtop applications (Cordano and Rigon, 2013, WRR; Kollet et al., 2016, WRR), provides the baseline code and a number of referenced results as benchmarks. Comparison and scientific validation can then be performed for each software re-engineering activity carried out on the package. To keep track of every single change, the package is published on its own GitHub repository geotopmodel.github.io/geotop/ under the GPL v3.0 license. A Continuous Integration mechanism, by means of Travis-CI, has been enabled on the repository's master and main development branches. The usage of the CMake configuration tool

  7. Model-Driven Theme/UML

    Science.gov (United States)

    Carton, Andrew; Driver, Cormac; Jackson, Andrew; Clarke, Siobhán

    Theme/UML is an existing approach to aspect-oriented modelling that supports the modularisation and composition of concerns, including crosscutting ones, in design. To date, its lack of integration with model-driven engineering (MDE) techniques has limited its benefits across the development lifecycle. Here, we describe our work on facilitating the use of Theme/UML as part of an MDE process. We have developed a transformation tool that adopts model-driven architecture (MDA) standards. It defines a concern composition mechanism, implemented as a model transformation, to support the enhanced modularisation features of Theme/UML. We evaluate our approach by applying it to the development of mobile, context-aware applications, an application area characterised by many non-functional requirements that manifest themselves as crosscutting concerns.
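    The concern-composition idea can be illustrated with a toy model transformation that merges a crosscutting theme into a base design model. The class and operation names are hypothetical, and this simple merge is a sketch of the general idea, not the actual Theme/UML composition semantics:

```python
def compose(base, theme):
    """Merge a crosscutting theme model into a base model, where each model
    maps a class name to its set of operations. A real composition would
    also bind templates and weave behaviour; this only unions structure."""
    result = {cls: set(ops) for cls, ops in base.items()}
    for cls, ops in theme.items():
        result.setdefault(cls, set()).update(ops)
    return result

# Hypothetical base design for a context-aware mobile app...
base = {"LocationService": {"getPosition"}, "MapView": {"render"}}
# ...and a crosscutting logging concern applied to both classes.
logging_theme = {"LocationService": {"logCall"}, "MapView": {"logCall"}}
composed = compose(base, logging_theme)
```

    Expressing the composition as a transformation, rather than editing the base model by hand, is what lets the MDA toolchain re-apply the concern automatically whenever the base model changes.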

  8. SQIMSO: Quality Improvement for Small Software Organizations

    OpenAIRE

    Rabih Zeineddine; Nashat Mansour

    2005-01-01

    Software quality improvement process remains incomplete if it is not initiated and conducted through a wide improvement program that considers process quality improvement, product quality improvement and evolution of human resources. But, small software organizations are not capable of bearing the cost of establishing software process improvement programs. In this work, we propose a new software quality improvement model for small organizations, SQIMSO, based on three ...

  9. A refinement driven component-based design

    DEFF Research Database (Denmark)

    Chen, Zhenbang; Liu, Zhiming; Ravn, Anders Peter

    2007-01-01

    the work on the Common Component Modelling Example (CoCoME). This gives evidence that the formal techniques developed in rCOS can be integrated into a model-driven development process and shows where it may be integrated in computer-aided software engineering (CASE) tools for adding formally supported...

  10. The art of software modeling

    CERN Document Server

    Lieberman, Benjamin A

    2007-01-01

    Modeling complex systems is a difficult challenge and all too often one in which modelers are left to their own devices. Using a multidisciplinary approach, The Art of Software Modeling covers theory, practice, and presentation in detail. It focuses on the importance of model creation and demonstrates how to create meaningful models. Presenting three self-contained sections, the text examines the background of modeling and frameworks for organizing information. It identifies techniques for researching and capturing client and system information and addresses the challenges of presenting models to specific audiences. Using concepts from art theory and aesthetics, this broad-based approach encompasses software practices, cognitive science, and information presentation. The book also looks at perception and cognition of diagrams, view composition, color theory, and presentation techniques. Providing practical methods for investigating and organizing complex information, The Art of Software Modeling demonstrate...

  11. Positioning performance of the NTCM model driven by GPS Klobuchar model parameters

    Science.gov (United States)

    Hoque, Mohammed Mainul; Jakowski, Norbert; Berdermann, Jens

    2018-03-01

    Users of the Global Positioning System (GPS) utilize the Ionospheric Correction Algorithm (ICA), also known as the Klobuchar model, for correcting ionospheric signal delay or range error. Recently, we developed an ionosphere correction algorithm called the NTCM-Klobpar model for single-frequency GNSS applications. The model is driven by a parameter computed from the GPS Klobuchar model and can consequently be used instead of the GPS Klobuchar model for ionospheric corrections. In the presented work we compare the positioning solutions obtained using NTCM-Klobpar with those using the Klobuchar model. Our investigation, using worldwide ground GPS data from a quiet and a perturbed ionospheric and geomagnetic activity period of 17 days each, shows that the 24-hour prediction performance of NTCM-Klobpar is better than that of the GPS Klobuchar model in the global average. The root mean squared deviation of the 3D position errors is found to be about 0.24 and 0.45 m less for NTCM-Klobpar compared to the GPS Klobuchar model during quiet and perturbed conditions, respectively. The presented algorithm has the potential to continuously improve the accuracy of GPS single-frequency mass-market devices with only a little software modification.

  12. Chaste: A test-driven approach to software development for biological modelling

    KAUST Repository

    Pitt-Francis, Joe

    2009-12-01

    Chaste ('Cancer, heart and soft-tissue environment') is a software library and a set of test suites for computational simulations in the domain of biology. Current functionality has arisen from modelling in the fields of cancer, cardiac physiology and soft-tissue mechanics. It is released under the LGPL 2.1 licence. Chaste has been developed using agile programming methods. The project began in 2005 when it was reasoned that the modelling of a variety of physiological phenomena required both a generic mathematical modelling framework, and a generic computational/simulation framework. The Chaste project evolved from the Integrative Biology (IB) e-Science Project, an inter-institutional project aimed at developing a suitable IT infrastructure to support physiome-level computational modelling, with a primary focus on cardiac and cancer modelling. Program summary: Program title: Chaste. Catalogue identifier: AEFD_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFD_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: LGPL 2.1. No. of lines in distributed program, including test data, etc.: 5 407 321. No. of bytes in distributed program, including test data, etc.: 42 004 554. Distribution format: tar.gz. Programming language: C++. Operating system: Unix. Has the code been vectorised or parallelized?: Yes. Parallelized using MPI. RAM: < 90 Megabytes for two of the scenarios described in Section 6 of the manuscript (Monodomain re-entry on a slab or Cylindrical crypt simulation). Up to 16 Gigabytes (distributed across processors) for a full-resolution bidomain cardiac simulation. Classification: 3. External routines: Boost, CodeSynthesis XSD, CxxTest, HDF5, METIS, MPI, PETSc, Triangle, Xerces. Nature of problem: Chaste may be used for solving coupled ODE and PDE systems arising from modelling biological systems. Use of Chaste in two application areas is described in this paper: cardiac

  13. Adaptive Multiscale Modeling of Geochemical Impacts on Fracture Evolution

    Science.gov (United States)

    Molins, S.; Trebotich, D.; Steefel, C. I.; Deng, H.

    2016-12-01

    Understanding fracture evolution is essential for many subsurface energy applications, including subsurface storage, shale gas production, fracking, CO2 sequestration, and geothermal energy extraction. Geochemical processes in particular play a significant role in the evolution of fractures through dissolution-driven widening, fines migration, and/or fracture sealing due to precipitation. One obstacle to understanding and exploiting geochemical fracture evolution is that it is a multiscale process. However, current geochemical modeling of fractures cannot capture this multi-scale nature of geochemical and mechanical impacts on fracture evolution, and is limited to either a continuum or pore-scale representation. Conventional continuum-scale models treat fractures as preferential flow paths, with their permeability evolving as a function (often, a cubic law) of the fracture aperture. This approach has the limitation that it oversimplifies flow within the fracture in its omission of pore scale effects while also assuming well-mixed conditions. More recently, pore-scale models along with advanced characterization techniques have allowed for accurate simulations of flow and reactive transport within the pore space (Molins et al., 2014, 2015). However, these models, even with high performance computing, are currently limited in their ability to treat tractable domain sizes (Steefel et al., 2013). Thus, there is a critical need to develop an adaptive modeling capability that can account for separate properties and processes, emergent and otherwise, in the fracture and the rock matrix at different spatial scales. Here we present an adaptive modeling capability that treats geochemical impacts on fracture evolution within a single multiscale framework. Model development makes use of the high performance simulation capability, Chombo-Crunch, leveraged by high resolution characterization and experiments. The modeling framework is based on the adaptive capability in Chombo

  14. PROTOPLANETARY DISK STRUCTURE WITH GRAIN EVOLUTION: THE ANDES MODEL

    International Nuclear Information System (INIS)

    Akimkin, V.; Wiebe, D.; Pavlyuchenkov, Ya.; Zhukovska, S.; Semenov, D.; Henning, Th.; Vasyunin, A.; Birnstiel, T.

    2013-01-01

    We present a self-consistent model of a protoplanetary disk: 'ANDES' ('AccretioN disk with Dust Evolution and Sedimentation'). ANDES is based on a flexible and extendable modular structure that includes (1) a 1+1D frequency-dependent continuum radiative transfer module, (2) a module to calculate the chemical evolution using an extended gas-grain network with UV/X-ray-driven processes and surface reactions, (3) a module to calculate the gas thermal energy balance, and (4) a 1+1D module that simulates dust grain evolution. For the first time, grain evolution and time-dependent molecular chemistry are included in a protoplanetary disk model. We find that grain growth and sedimentation of large grains onto the disk midplane lead to a dust-depleted atmosphere. Consequently, dust and gas temperatures become higher in the inner disk (R ∼ 50 AU), in comparison with the disk model with pristine dust. The response of disk chemical structure to the dust growth and sedimentation is twofold. First, due to higher transparency a partly UV-shielded molecular layer is shifted closer to the dense midplane. Second, the presence of big grains in the disk midplane delays the freeze-out of volatile gas-phase species such as CO there, while in adjacent upper layers the depletion is still effective. Molecular concentrations and thus column densities of many species are enhanced in the disk model with dust evolution, e.g., CO2, NH2CN, HNO, H2O, HCOOH, HCN, and CO. We also show that time-dependent chemistry is important for a proper description of gas thermal balance.

  15. Software vulnerability: Definition, modelling, and practical evaluation for E-mail transfer software

    International Nuclear Information System (INIS)

    Kimura, Mitsuhiro

    2006-01-01

    This paper proposes a method of assessing software vulnerability quantitatively. By expanding the concept of the IPO (input-program-output) model, we first define software vulnerability and construct a stochastic model. Then we evaluate the software vulnerability of the sendmail system by analyzing actual security-hole data, which were collected from its release notes. We also show the relationship between the estimated software reliability and the vulnerability of the analyzed system.

  16. Influence of helical external driven current on nonlinear resistive tearing mode evolution and saturation in tokamaks

    Science.gov (United States)

    Zhang, W.; Wang, S.; Ma, Z. W.

    2017-06-01

    The influences of helical driven currents on nonlinear resistive tearing mode evolution and saturation are studied by using a three-dimensional toroidal resistive magnetohydrodynamic code (CLT). We considered three types of helical driven current: stationary, time-dependent in amplitude, and time-dependent in thickness. It is found that the helical driven current is much more efficient than the Gaussian driven current used in our previous study [S. Wang et al., Phys. Plasmas 23(5), 052503 (2016)]. The stationary helical driven current cannot persistently control tearing mode instabilities. For the time-dependent helical driven current with f_cd = 0.01 and δ_cd < 0.04, the island size can be reduced to a saturated level that is about one third of the initial island size. However, if the total driven current increases to about 7% of the total plasma current, tearing mode instabilities will rebound again due to the excitation of the triple tearing mode. For the helical driven current with time-dependent strength and thickness, the reduction speed of the radial perturbation component of the magnetic field increases with an increase in the driven current and then saturates at a quite low level. The tearing mode is always controlled, even for a large driven current.

  17. Trends in software testing

    CERN Document Server

    Mohanty, J; Balakrishnan, Arunkumar

    2017-01-01

    This book is focused on the advancements in the field of software testing and the innovative practices that the industry is adopting. Considering the widely varied nature of software testing, the book addresses contemporary aspects that are important for both academia and industry. There are dedicated chapters on seamless high-efficiency frameworks, automation on regression testing, software by search, and system evolution management. There are a host of mathematical models that are promising for software quality improvement by model-based testing. There are three chapters addressing this concern. Students and researchers in particular will find these chapters useful for their mathematical strength and rigor. Other topics covered include uncertainty in testing, software security testing, testing as a service, test technical debt (or test debt), disruption caused by digital advancement (social media, cloud computing, mobile application and data analytics), and challenges and benefits of outsourcing. The book w...

  18. Studies and analyses of the space shuttle main engine. Failure information propagation model data base and software

    Science.gov (United States)

    Tischer, A. E.

    1987-01-01

    The failure information propagation model (FIPM) data base was developed to store and manipulate the large amount of information anticipated for the various Space Shuttle Main Engine (SSME) FIPMs. The organization and structure of the FIPM data base is described, including a summary of the data fields and key attributes associated with each FIPM data file. The menu-driven software developed to facilitate and control the entry, modification, and listing of data base records is also discussed. The transfer of the FIPM data base and software to the NASA Marshall Space Flight Center is described. Complete listings of all of the data base definition commands and software procedures are included in the appendixes.

  19. Integration of Simulink Models with Component-based Software Models

    Directory of Open Access Journals (Sweden)

    MARIAN, N.

    2008-06-01

    Full Text Available Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical constructs and process flow, then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behavior as a means of computation, communication and constraints, using computational blocks and aggregates for both discrete and continuous behavior, different interconnection and execution disciplines for event-based and time-based controllers, and so on, to encompass the demands to more functionality, at even lower prices, and with opposite constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has to be analyzed. One way of doing that is to integrate in wrapper files the model back into Simulink S-functions, and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behavior, and the transformation of the software system into the S

  20. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  1. Characteristics and evolution of the ecosystem of software tools supporting research in molecular biology.

    Science.gov (United States)

    Pazos, Florencio; Chagoyen, Monica

    2018-01-16

    Daily work in molecular biology presently depends on a large number of computational tools. An in-depth, large-scale study of that 'ecosystem' of Web tools, its characteristics, interconnectivity, patterns of usage/citation, temporal evolution and rate of decay is crucial for understanding the forces that shape it and for informing initiatives aimed at its funding, long-term maintenance and improvement. In particular, the long-term maintenance of these tools is compromised because of their specific development model. Hundreds of published studies become irreproducible de facto, as the software tools used to conduct them become unavailable. In this study, we present a large-scale survey of >5400 publications describing Web servers within the two main bibliographic resources for disseminating new software developments in molecular biology. For all these servers, we studied their citation patterns, the subjects they address, their citation networks and the temporal evolution of these factors. We also analysed how these factors affect the availability of these servers (whether they are alive). Our results show that this ecosystem of tools is highly interconnected and adapts to the 'trendy' subjects at every moment. The servers present characteristic temporal patterns of citation/usage, and there is a worrying rate of server 'death', which is influenced by factors such as the server's popularity and the institution that hosts it. These results can inform initiatives aimed at the long-term maintenance of these resources. © The Author(s) 2018. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  2. Common Practices from Two Decades of Water Resources Modelling Published in Environmental Modelling & Software: 1997 to 2016

    Science.gov (United States)

    Ames, D. P.; Peterson, M.; Larsen, J.

    2016-12-01

    A steady flow of manuscripts describing integrated water resources management (IWRM) modelling has been published in Environmental Modelling & Software since the journal's inaugural issue in 1997. These papers represent two decades of peer-reviewed scientific knowledge regarding methods, practices, and protocols for conducting IWRM. We have undertaken to explore this specific assemblage of literature with the intention of identifying commonly reported procedures in terms of data integration methods, modelling techniques, approaches to stakeholder participation, means of communication of model results, and other elements of the model development and application life cycle. Initial results from this effort will be presented including a summary of commonly used practices, and their evolution over the past two decades. We anticipate that results will show a pattern of movement toward greater use of both stakeholder/participatory modelling methods as well as increased use of automated methods for data integration and model preparation. Interestingly, such results could be interpreted to show that the availability of better, faster, and more integrated software tools and technologies free the modeler to take a less technocratic and more human approach to water resources modelling.

  3. Testing Software Development Project Productivity Model

    Science.gov (United States)

    Lipkin, Ilya

    Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available that can guide an organization in a quest for software development, with existing estimation models often underestimating software development efforts by as much as 500 to 600 percent. To address this issue, existing models usually are calibrated using local data with a small sample size, with resulting estimates not offering improved cost analysis. This study presents a conceptual model for accurately estimating software development, based on an extensive literature review and theoretical analysis based on Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. Practical implications of this study allow practitioners to concentrate on specific constructs of interest that provide the best value for the least amount of time. This study outlines key contributing constructs that are unique for Software Size E-SLOC, Man-hours Spent, and Quality of the Product, those constructs having the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for the customers and suppliers. Theoretical contributions of this study provide an initial theory-based hypothesized project productivity model that can be used as a generic overall model across several application domains such as IT, Command and Control

  4. Climate change-driven cliff and beach evolution at decadal to centennial time scales

    Science.gov (United States)

    Erikson, Li; O'Neill, Andrea; Barnard, Patrick; Vitousek, Sean; Limber, Patrick

    2017-01-01

    Here we develop a computationally efficient method that evolves cross-shore profiles of sand beaches with or without cliffs along natural and urban coastal environments and across expansive geographic areas at decadal to centennial time-scales driven by 21st century climate change projections. The model requires projected sea level rise rates, extrema of nearshore wave conditions, bluff recession and shoreline change rates, and cross-shore profiles representing present-day conditions. The model is applied to the ~470-km long coast of the Southern California Bight, USA, using recently available projected nearshore waves and bluff recession and shoreline change rates. The results indicate that eroded cliff material, from unarmored cliffs, contribute 11% to 26% to the total sediment budget. Historical beach nourishment rates will need to increase by more than 30% for a 0.25 m sea level rise (~2044) and by at least 75% by the year 2100 for a 1 m sea level rise, if evolution of the shoreline is to keep pace with rising sea levels.

  5. Diffusive smoothing of surfzone bathymetry by gravity-driven sediment transport

    Science.gov (United States)

    Moulton, M. R.; Elgar, S.; Raubenheimer, B.

    2012-12-01

    Gravity-driven sediment transport often is assumed to have a small effect on the evolution of nearshore morphology. Here, it is shown that down-slope gravity-driven sediment transport is an important process acting to smooth steep bathymetric features in the surfzone. Gravity-driven transport can be modeled as a diffusive term in the sediment continuity equation governing temporal (t) changes in bed level (h): ∂h/∂t ≈ κ∇²h, where κ is a sediment diffusion coefficient that is a function of the bed shear stress (τ_b) and sediment properties, such as the grain size and the angle of repose. Field observations of waves, currents, and the evolution of large excavated holes (initially 10-m wide and 2-m deep, with sides as steep as 35°) in an energetic surfzone are consistent with diffusive smoothing by gravity. Specifically, comparisons of κ estimated from the measured bed evolution with those estimated with numerical model results for several transport theories suggest that gravity-driven sediment transport dominates the bed evolution, with κ proportional to a power of τ_b. The models are initiated with observed bathymetry and forced with observed waves and currents. The diffusion coefficients from the measurements and from the model simulations were on average of order 10^-5 m²/s, implying evolution time scales of days for features with length scales of 10 m. The dependence of κ on τ_b varies for different transport theories and for high and low shear stress regimes. The US Army Corps of Engineers Field Research Facility, Duck, NC, provided excellent logistical support. Funded by a National Security Science and Engineering Faculty Fellowship, a National Defense Science and Engineering Graduate Fellowship, and the Office of Naval Research.
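
    The diffusion equation above (∂h/∂t ≈ κ∇²h) can be sketched as a one-dimensional explicit finite-difference update. The grid spacing, time step, and initial hole geometry below are illustrative assumptions; only the order of magnitude of κ (~10^-5 m²/s) comes from the abstract.

```python
# Sketch of diffusive bed smoothing, dh/dt = kappa * d2h/dx2, using an
# explicit finite-difference scheme on a 1-D cross-shore profile.
def smooth_bed(h, kappa, dx, dt, steps):
    """Advance the bed profile h (metres) by `steps` explicit diffusion steps."""
    r = kappa * dt / dx**2          # must stay below 0.5 for stability
    h = list(h)
    for _ in range(steps):
        prev = h[:]
        for i in range(1, len(h) - 1):
            h[i] = prev[i] + r * (prev[i - 1] - 2 * prev[i] + prev[i + 1])
    return h

# A 2-m-deep excavated hole in an otherwise flat bed, 1-m grid spacing,
# stepped forward one hour at a time for a day.
bed = [0.0] * 5 + [-2.0] + [0.0] * 5
smoothed = smooth_bed(bed, kappa=1e-5, dx=1.0, dt=3600.0, steps=24)
# The hole fills in and its flanks slump toward the angle of repose.
```

    With these values r = 0.036, so the explicit scheme is stable, and features with ~10-m length scales relax on time scales of days, consistent with the evolution time scales reported above.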

  6. Study on Software Quality Improvement based on Rayleigh Model and PDCA Model

    OpenAIRE

    Ning Jingfeng; Hu Ming

    2013-01-01

    As the software industry gradually becomes mature, software quality is regarded as the life of a software enterprise. This article discusses how to improve the quality of software, applies Rayleigh model and PDCA model to the software quality management, combines with the defect removal effectiveness index, exerts PDCA model to solve the problem of quality management objectives when using the Rayleigh model in bidirectional quality improvement strategies of software quality management, a...

  7. Documentation Driven Development for Complex Real-Time Systems

    Science.gov (United States)

    2004-12-01

    This paper presents a novel approach for development of complex real-time systems, called the documentation-driven development (DDD) approach. This... time systems. DDD will also support automated software generation based on a computational model and some relevant techniques. DDD includes two main...stakeholders to be easily involved in development processes and, therefore, significantly improve the agility of software development for complex real

  8. Ragnarok: An Architecture Based Software Development Environment

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    of the development process. The main contributions presented in the thesis have evolved from work with two of the hypotheses: These address the problems of management of evolution, and overview, comprehension and navigation respectively. The first main contribution is the Architectural Software Configuration...... Management Model: A software configuration management model where the abstractions and hierarchy of the logical aspect of software architecture forms the basis for version control and configuration management. The second main contribution is the Geographic Space Architecture Visualisation Model......: A visualisation model where entities in a software architecture are organised geographically in a two-dimensional plane, their visual appearance determined by processing a subset of the data in the entities, and interaction with the project's underlying data performed by direct manipulation of the landscape...

  9. Concurrent Design of Embedded Control Software

    NARCIS (Netherlands)

    Groothuis, M.A.; Frijns, Raymond; Voeten, Jeroen; Broenink, Johannes F.; Margaria, T.; Padberg, J.; Taentzer, G.; Levendovszky, T.; Lengyel, L.; Karsai, G.; Hardebolle, C.

    2009-01-01

    Embedded software design for mechatronic systems is becoming an increasingly time-consuming and error-prone task. In order to cope with the heterogeneity and complexity, a systematic model-driven design approach is needed, where several parts of the system can be designed concurrently. There is

  10. NASA Software Cost Estimation Model: An Analogy Based Estimation Model

    Science.gov (United States)

    Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James

    2015-01-01

    The cost estimation of software development activities is increasingly critical for large scale integrated projects such as those at DOD and NASA, especially as the software systems become larger and more complex. As an example, MSL (Mars Scientific Laboratory), developed at the Jet Propulsion Laboratory, launched with over 2 million lines of code, making it the largest robotic spacecraft ever flown (based on the size of the software). Software development activities are also notorious for their cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly being tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods, there is very little focus on the use of models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model performance is evaluated by comparing it to COCOMO II, linear regression, and K-nearest neighbor prediction model performance on the same data set.

  11. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae

    2008-01-01

    Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics...... of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical...... constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has...

  12. An algebraic approach to modeling in software engineering

    International Nuclear Information System (INIS)

    Loegel, C.J.; Ravishankar, C.V.

    1993-09-01

    Our work couples the formalism of universal algebras with the engineering techniques of mathematical modeling to develop a new approach to the software engineering process. Our purpose in using this combination is twofold. First, abstract data types and their specification using universal algebras can be considered a common point between the practical requirements of software engineering and the formal specification of software systems. Second, mathematical modeling principles provide us with a means for effectively analyzing real-world systems. We first use modeling techniques to analyze a system and then represent the analysis using universal algebras. The rest of the software engineering process exploits properties of universal algebras that preserve the structure of our original model. This paper describes our software engineering process and our experience using it on both research and commercial systems. We need a new approach because current software engineering practices often deliver software that is difficult to develop and maintain. Formal software engineering approaches use universal algebras to describe "computer science" objects like abstract data types, but in practice software errors are often caused because "real-world" objects are improperly modeled. There is a large semantic gap between the customer's objects and abstract data types. In contrast, mathematical modeling uses engineering techniques to construct valid models for real-world systems, but these models are often implemented in an ad hoc manner. A combination of the best features of both approaches would enable software engineering to formally specify and develop software systems that better model real systems. Software engineering, like mathematical modeling, should concern itself first and foremost with understanding a real system and its behavior under given circumstances, and then with expressing this knowledge in an executable form.

  13. Weakly Nonlinear Model with Exact Coefficients for the Fluttering and Spiraling Motion of Buoyancy-Driven Bodies

    Science.gov (United States)

    Tchoufag, Joël; Fabre, David; Magnaudet, Jacques

    2015-09-01

    Gravity- or buoyancy-driven bodies moving in a slightly viscous fluid frequently follow fluttering or helical paths. Current models of such systems are largely empirical and fail to predict several of the key features of their evolution, especially close to the onset of path instability. Here, using a weakly nonlinear expansion of the full set of governing equations, we present a new generic reduced-order model based on a pair of amplitude equations with exact coefficients that drive the evolution of the first pair of unstable modes. We show that the predictions of this model for the style (e.g., fluttering or spiraling) and characteristics (e.g., frequency and maximum inclination angle) of path oscillations compare well with various recent data for both solid disks and air bubbles.

  14. On Rank Driven Dynamical Systems

    Science.gov (United States)

    Veerman, J. J. P.; Prieto, F. J.

    2014-08-01

    We investigate a class of models related to the Bak-Sneppen (BS) model, initially proposed to study evolution. The BS model is extremely simple and yet captures some forms of "complex behavior" such as self-organized criticality that is often observed in physical and biological systems. In this model, random fitnesses in [0, 1] are associated with agents located at the vertices of a graph. Their fitnesses are ranked from worst (0) to best (1). At every time-step the agent with the worst fitness and some others with a priori given rank probabilities are replaced by new agents with random fitnesses. We consider two cases: the exogenous case, where the new fitnesses are taken from an a priori fixed distribution, and the endogenous case, where the new fitnesses are taken from the current distribution as it evolves. We approximate the dynamics by making a simplifying independence assumption. We use Order Statistics and Dynamical Systems to define a rank-driven dynamical system that approximates the evolution of the distribution of the fitnesses in these rank-driven models, as well as in the BS model. For this simplified model we can find the limiting marginal distribution as a function of the initial conditions. Agreement with experimental results of the BS model is excellent.
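
    The replacement dynamics described above can be sketched in a few lines. The ring topology, the rule of replacing the worst agent together with its two neighbours, and all parameter values below are illustrative assumptions for the classic exogenous case (new fitnesses drawn uniformly), not details taken from the paper.

```python
import random

def bak_sneppen(n=50, steps=2000, seed=1):
    """Minimal exogenous Bak-Sneppen-style sketch on a ring graph: at each
    step the agent with the lowest fitness and its two neighbours receive
    fresh uniform(0, 1) fitnesses."""
    rng = random.Random(seed)
    fitness = [rng.random() for _ in range(n)]
    for _ in range(steps):
        worst = min(range(n), key=fitness.__getitem__)
        for j in (worst - 1, worst, (worst + 1) % n):  # index -1 wraps the ring
            fitness[j] = rng.random()
    return fitness

final = bak_sneppen()
# After many steps most fitnesses accumulate above a self-organized threshold.
```

    Tracking the empirical distribution of `final` over many runs is what the rank-driven dynamical system in the abstract approximates analytically.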

  15. Network evolution driven by dynamics applied to graph coloring

    International Nuclear Information System (INIS)

    Wu Jian-She; Li Li-Guang; Yu Xin; Jiao Li-Cheng; Wang Xiao-Hua

    2013-01-01

    An evolutionary network driven by dynamics is studied and applied to the graph coloring problem. From an initial structure, both the topology and the coupling weights evolve according to the dynamics. On the other hand, the dynamics of the network are determined by the topology and the coupling weights, so an interesting structure-dynamics co-evolutionary scheme appears. By providing two evolutionary strategies, a network described by the complement of a graph will evolve into several clusters of nodes according to their dynamics. The nodes in each cluster can be assigned the same color and nodes in different clusters assigned different colors. In this way, a co-evolution phenomenon is applied to the graph coloring problem. The proposed scheme is tested on several benchmark graphs for graph coloring.

  16. Object Oriented Modeling : A method for combining model and software development

    NARCIS (Netherlands)

    Van Lelyveld, W.

    2010-01-01

    When requirements for a new model cannot be met by available modeling software, new software can be developed for a specific model. Methods for the development of both model and software exist, but a method for combined development has not been found. A compatible way of thinking is required to

  17. Evolution of calculation models for the proton-therapy dose planning software

    International Nuclear Information System (INIS)

    Vidal, Marie

    2011-01-01

    This work was achieved in collaboration between the Institut Curie Proton-therapy Center of Orsay (ICPO), the DOSIsoft company, and the CREATIS laboratory, in order to develop a new dose calculation model for the new ICPO treatment room. A new accelerator and gantry room from the IBA company were installed during the upgrade project of the proton-therapy center, with the intention of enlarging the range of cancer localizations treated at ICPO. The first goal of this PhD work is to develop a package of methods and new dose calculation algorithms adapted to the specific characteristics of the beams delivered by the IBA system. They all aim to be implemented in the DOSIsoft treatment planning software, Isogray. First, the double scattering technique is treated, taking into account major differences between the IBA system and the ICPO fixed beam lines' passive system. Secondly, a model is explored for the scanned-beams modality. The second objective of this work is improving the Ray-Tracing and Pencil-Beam dose calculation models already in use. For the double scattering and uniform scanning techniques, the patient-specific collimator at the end of the beam line indeed contaminates the patient dose distribution. A method to reduce that phenomenon was set up for the passive beam system. An analytical model was developed which describes the contamination function with parameters validated through Monte-Carlo simulations on the GATE platform. It allows us to apply those methods to active scanned beams. (author) [fr]

  18. Understanding software faults and their role in software reliability modeling

    Science.gov (United States)

    Munson, John C.

    1994-01-01

    This study is a direct result of an on-going project to model the reliability of a large real-time control avionics system. In previous modeling efforts with this system, hardware reliability models were applied in modeling the reliability behavior of this system. In an attempt to enhance the performance of the adapted reliability models, certain software attributes were introduced in these models to control for differences between programs and also sequential executions of the same program. As the basic nature of the software attributes that affect software reliability become better understood in the modeling process, this information begins to have important implications on the software development process. A significant problem arises when raw attribute measures are to be used in statistical models as predictors, for example, of measures of software quality. This is because many of the metrics are highly correlated. Consider the two attributes: lines of code, LOC, and number of program statements, Stmts. In this case, it is quite obvious that a program with a high value of LOC probably will also have a relatively high value of Stmts. In the case of low level languages, such as assembly language programs, there might be a one-to-one relationship between the statement count and the lines of code. When there is a complete absence of linear relationship among the metrics, they are said to be orthogonal or uncorrelated. Usually the lack of orthogonality is not serious enough to affect a statistical analysis. However, for the purposes of some statistical analysis such as multiple regression, the software metrics are so strongly interrelated that the regression results may be ambiguous and possibly even misleading. Typically, it is difficult to estimate the unique effects of individual software metrics in the regression equation. 
The estimated values of the coefficients are very sensitive to slight changes in the data and to the addition or deletion of variables in the
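
    The multicollinearity problem described above (LOC and statement count moving together almost perfectly) can be illustrated by computing a Pearson correlation between the two metrics. The module measurements below are invented for illustration.

```python
# Pearson correlation between two software metrics; values near 1 mean the
# metrics are nearly collinear, making regression coefficients unstable.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Invented per-module measurements: lines of code vs. statement count.
loc   = [120, 340, 560, 980, 1500, 2100]
stmts = [95, 280, 470, 820, 1310, 1850]
r = pearson(loc, stmts)   # very close to 1: the predictors are redundant
```

    A correlation this high is exactly the situation in which, as the abstract notes, regression results become ambiguous and coefficient estimates become sensitive to small changes in the data.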

  19. Predicting Software Suitability Using a Bayesian Belief Network

    Science.gov (United States)

    Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.

    2005-01-01

    The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
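
    A belief network of the kind hypothesized above can be sketched as a conditional probability table over the three driving factors, marginalized to forecast quality. The network structure and every probability below are invented for illustration; the paper's actual network and quantification are not reproduced here.

```python
# Hypothetical mini belief network: three binary parent factors (team skill,
# process maturity, problem complexity) drive one binary child (good quality).
p_skill_high = 0.7
p_maturity_high = 0.6
p_complexity_high = 0.5

# P(quality = good | skill_high, maturity_high, complexity_high), one entry
# per combination of parent states. All numbers are invented.
cpt = {
    (True,  True,  False): 0.95,
    (True,  True,  True):  0.80,
    (True,  False, False): 0.75,
    (True,  False, True):  0.55,
    (False, True,  False): 0.65,
    (False, True,  True):  0.40,
    (False, False, False): 0.45,
    (False, False, True):  0.20,
}

def p_good_quality():
    """Marginalize over the parent nodes to forecast product suitability."""
    total = 0.0
    for s in (True, False):
        for m in (True, False):
            for c in (True, False):
                w = ((p_skill_high if s else 1 - p_skill_high)
                     * (p_maturity_high if m else 1 - p_maturity_high)
                     * (p_complexity_high if c else 1 - p_complexity_high))
                total += w * cpt[(s, m, c)]
    return total

forecast = p_good_quality()
```

    Observing evidence (e.g. learning that the team is highly skilled) would replace the corresponding prior with 1 before marginalizing, which is how such a network updates its suitability forecast as a project progresses.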

  20. Bump evolution driven by the x-ray ablation Richtmyer-Meshkov effect in plastic inertial confinement fusion Ablators

    Directory of Open Access Journals (Sweden)

    Loomis Eric

    2013-11-01

    Full Text Available Growth of hydrodynamic instabilities at the interfaces of inertial confinement fusion (ICF) capsules due to ablator and fuel non-uniformities is a primary concern for the ICF program. Recently, observed jetting and parasitic mix into the fuel were attributed to isolated defects on the outer surface of the capsule. Strategies for mitigation of these defects exist; however, they require reduced uncertainties in equation of state (EOS) models before they can be invoked. In light of this, we have begun a campaign to measure the growth of isolated defects (bumps) due to x-ray ablation Richtmyer-Meshkov in plastic ablators to validate these models. Experiments used hohlraums with radiation temperatures near 70 eV driven by 15 beams from the Omega laser (Laboratory for Laser Energetics, University of Rochester, NY), which sent a ~1.25 Mbar shock into a planar CH target placed over one laser entrance hole. Targets consisted of 2-D arrays of quasi-gaussian bumps (10 microns tall, 34 microns FWHM) deposited on the surface facing into the hohlraum. On-axis radiography with a saran (Cl Heα, 2.76 keV) backlighter was used to measure bump evolution prior to shock breakout. Shock speed measurements were also performed to determine target conditions. Simulations using the LEOS 5310 and SESAME 7592 models required the simulated laser power to be turned down to 80% and 88%, respectively, to match observed shock speeds. Both LEOS 5310 and SESAME 7592 simulations agreed with measured bump areal densities out to 6 ns, where ablative RM oscillations were observed in previous laser-driven experiments but did not occur in the x-ray-driven case. The QEOS model, conversely, overpredicted shock speeds and underpredicted areal density in the bump.

  1. A theory and model for the evolution of software services

    NARCIS (Netherlands)

    Andrikopoulos, V.

    2010-01-01

    Software services are subject to constant change and variation. To control service development, a service developer needs to know why a change was made, what are its implications and whether the change is complete. Typically, service clients do not perceive the upgraded service immediately. As a

  2. A two level mutation-selection model of cultural evolution and diversity.

    Science.gov (United States)

    Salazar-Ciudad, Isaac

    2010-11-21

    Cultural evolution is a complex process that can happen at several levels. At the level of individuals in a population, each human bears a set of cultural traits that he or she can transmit to his or her offspring (vertical transmission) or to other members of his or her society (horizontal transmission). The relative frequency of a cultural trait in a population or society can thus increase or decrease with the relative reproductive success of its bearers (the individual's level) or the relative success of transmission (called the idea's level). This article presents a mathematical model of the interplay between these two levels. The first aim of this article is to explore when cultural evolution is driven by the idea's level, when it is driven by the individual's level, and when it is driven by both. These three possibilities are explored in relation to (a) the amount of interchange of cultural traits between individuals, (b) the selective pressure acting on individuals, (c) the rate of production of new cultural traits, (d) the individual's capacity to remember cultural traits, and (e) the population size. A related aim is to explore the conditions in which cultural evolution does not lead to a better adaptation of individuals to the environment. This is to contrast the spread of fitness-enhancing ideas, which make individual bearers better adapted to the environment, with the spread of "selfish" ideas, which spread well simply because they are easy to remember but do not help their individual bearers (and may even hurt them). At the same time, this article explores the conditions in which the adaptation of individuals is maximal. The second aim is to explore how these factors affect cultural diversity, or the number of different cultural traits in a population. This study suggests that a larger interchange of cultural traits between populations could lead to cultural evolution not improving the adaptation of individuals to their environment, and to a decrease in cultural diversity.
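    The two transmission levels can be sketched as a toy agent-based simulation. The traits, rates, and population parameters below are illustrative assumptions, not the article's actual model:

```python
import random

random.seed(1)

# Two illustrative traits: 'useful' helps its bearer but transmits poorly;
# 'selfish' hurts its bearer slightly but is easy to remember and copy.
TRAITS = {"useful": {"fitness": 0.3, "transmit": 0.2},
          "selfish": {"fitness": -0.1, "transmit": 0.6}}

N, GENERATIONS = 200, 50
pop = [set() for _ in range(N)]  # each individual's cultural repertoire

for _ in range(GENERATIONS):
    # Horizontal transmission (idea level): copy traits from a random peer
    for ind in pop:
        peer = random.choice(pop)
        for t in list(peer):
            if random.random() < TRAITS[t]["transmit"]:
                ind.add(t)
    # Innovation: rare invention of either trait
    for ind in pop:
        if random.random() < 0.01:
            ind.add(random.choice(list(TRAITS)))
    # Selection (individual level): reproduce proportionally to fitness
    weights = [max(0.05, 1 + sum(TRAITS[t]["fitness"] for t in ind)) for ind in pop]
    pop = [set(random.choices(pop, weights=weights)[0]) for _ in range(N)]

freq = {t: sum(t in ind for ind in pop) / N for t in TRAITS}
print(freq)
```

    Depending on the transmission and selection parameters, the "selfish" trait can reach high frequency despite its fitness cost, which is the qualitative effect the article analyzes.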

  3. Hysteresis-controlled instability waves in a scale-free driven current sheet model

    Directory of Open Access Journals (Sweden)

    V. M. Uritsky

    2005-01-01

    Magnetospheric dynamics is a complex multiscale process whose statistical features can be successfully reproduced using high-dimensional numerical transport models exhibiting the phenomenon of self-organized criticality (SOC). Along this line of research, a 2-dimensional driven current sheet (DCS) model has recently been developed that incorporates an idealized current-driven instability with a resistive MHD plasma system (Klimas et al., 2004a, b). The dynamics of the DCS model is dominated by the scale-free diffusive energy transport characterized by a set of broadband power-law distribution functions similar to those governing the evolution of multiscale precipitation regions of energetic particles in the nighttime sector of the aurora (Uritsky et al., 2002b). The scale-free DCS behavior is supported by localized current-driven instabilities that can communicate in an avalanche fashion over arbitrarily long distances, thus producing current sheet waves (CSW). In this paper, we derive the analytical expression for CSW speed as a function of plasma parameters controlling local anomalous resistivity dynamics. The obtained relation indicates that CSW propagation requires sufficiently high initial current densities, and predicts a deceleration of CSWs moving from inner plasma sheet regions toward its northern and southern boundaries. We also show that the shape of the time-averaged current density profile in the DCS model is in agreement with the steady-state spatial configuration of critical avalanching models as described by the singular diffusion theory of SOC. Over shorter time scales, SOC dynamics is associated with rather complex spatial patterns and, in particular, can produce bifurcated current sheets often seen in multi-satellite observations.
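    The avalanche-style transport underlying SOC can be illustrated with the classic Bak-Tang-Wiesenbfeld sandpile, a generic SOC toy model (not the DCS model itself): sites are driven slowly, and a local threshold instability propagates to neighbors, producing avalanches of all sizes.

```python
import random

random.seed(0)
L = 20
grid = [[0] * L for _ in range(L)]  # "height" at each site

def drive_and_relax():
    """Add one grain at a random site, then topple until stable.
    Returns the avalanche size (number of topplings)."""
    i, j = random.randrange(L), random.randrange(L)
    grid[i][j] += 1
    size = 0
    unstable = [(i, j)] if grid[i][j] >= 4 else []
    while unstable:
        x, y = unstable.pop()
        if grid[x][y] < 4:
            continue
        grid[x][y] -= 4  # topple: redistribute to the four neighbors
        size += 1
        for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
            if 0 <= nx < L and 0 <= ny < L:  # grains leave at open boundaries
                grid[nx][ny] += 1
                if grid[nx][ny] >= 4:
                    unstable.append((nx, ny))
    return size

sizes = [drive_and_relax() for _ in range(5000)]
print(max(sizes), sum(s > 10 for s in sizes))
```

    Once the system reaches the critical state, avalanche sizes follow a broadband power-law-like distribution, the same statistical signature the DCS model exhibits.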

  4. Generative Models in Deep Learning: Constraints for Galaxy Evolution

    Science.gov (United States)

    Turp, Maximilian Dennis; Schawinski, Kevin; Zhang, Ce; Weigel, Anna K.

    2018-01-01

    New techniques are essential to make advances in the field of galaxy evolution. Recent developments in the fields of artificial intelligence and machine learning have proven that these tools can be applied to problems far more complex than simple image recognition. We use these purely data-driven approaches to investigate the process of star formation quenching. We show that Variational Autoencoders provide a powerful method to forward-model the process of galaxy quenching. Our results imply that simple changes in specific star formation rate and bulge-to-disk ratio cannot fully describe the properties of the quenched population.
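    The VAE machinery the authors rely on rests on two standard ingredients: the reparameterization trick and a closed-form KL divergence term. A minimal sketch of both (generic VAE mathematics, not the authors' architecture):

```python
import math
import random

random.seed(0)

def reparameterize(mu, log_var):
    """Sample z = mu + sigma * eps with eps ~ N(0, 1): the trick that keeps
    sampling differentiable with respect to mu and log_var during training."""
    return [m + math.exp(0.5 * lv) * random.gauss(0.0, 1.0)
            for m, lv in zip(mu, log_var)]

def kl_to_standard_normal(mu, log_var):
    """Closed-form KL(q(z|x) || N(0, I)) for a diagonal Gaussian encoder:
    -0.5 * sum(1 + log_var - mu^2 - exp(log_var))."""
    return -0.5 * sum(1 + lv - m * m - math.exp(lv)
                      for m, lv in zip(mu, log_var))

# A 4-dimensional latent code with illustrative encoder outputs
mu, log_var = [0.5, -0.2, 0.0, 1.0], [0.1, 0.0, -0.3, 0.2]
z = reparameterize(mu, log_var)
print(len(z), round(kl_to_standard_normal(mu, log_var), 4))
```

    The KL term vanishes exactly when the encoder outputs the prior (mu = 0, log_var = 0) and is positive otherwise, which is what regularizes the latent space the authors traverse to forward-model quenching.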

  5. Software Startups - A Research Agenda

    Directory of Open Access Journals (Sweden)

    Michael Unterkalmsteiner

    2016-10-01

    Software startup companies develop innovative, software-intensive products within limited time frames and with few resources, searching for sustainable and scalable business models. Software startups are quite distinct from traditional mature software companies, but also from micro-, small-, and medium-sized enterprises, introducing new challenges relevant for software engineering research. This paper's research agenda focuses on software engineering in startups, identifying, in particular, 70+ research questions in the areas of supporting startup engineering activities, startup evolution models and patterns, ecosystems and innovation hubs, human aspects in software startups, applying startup concepts in non-startup environments, and methodologies and theories for startup research. We connect and motivate this research agenda with past studies in software startup research, while pointing out possible future directions. While all authors of this research agenda have their main background in Software Engineering or Computer Science, their interest in software startups broadens the perspective to the challenges, but also to the opportunities, that emerge from multi-disciplinary research. Our audience is therefore primarily software engineering researchers, even though we aim at stimulating collaborations and research that crosses disciplinary boundaries. We believe that with this research agenda we cover a wide spectrum of the software startup industry's current needs.

  6. Small is beautiful: customer driven software development

    DEFF Research Database (Denmark)

    Hansen, Henrik A.B.; Koch, Christian; Pleman, Allan

    1999-01-01

    Summary form only given. The topic addressed in this paper is how networking can be used as a way for small software houses to enhance their innovative capabilities, using different kinds of collaboration to overcome their lack of knowledge and resources in developing their software. In small software houses operating in markets with complex products such as ERP (enterprise resource planning) systems, networking is necessary in order to gain the needed knowledge and resources in the product development process. Network is not seen as a magic word but leads...

  7. A weakly nonlinear model with exact coefficients for the fluttering and spiraling motions of buoyancy-driven bodies

    Science.gov (United States)

    Magnaudet, Jacques; Tchoufag, Joel; Fabre, David

    2015-11-01

    Gravity/buoyancy-driven bodies moving in a slightly viscous fluid frequently follow fluttering or helical paths. Current models of such systems are largely empirical and fail to predict several of the key features of their evolution, especially close to the onset of path instability. Using a weakly nonlinear expansion of the full set of governing equations, we derive a new generic reduced-order model of this class of phenomena based on a pair of amplitude equations with exact coefficients that drive the evolution of the first pair of unstable modes. We show that the predictions of this model for the style (e.g., fluttering or spiraling) and characteristics (e.g., frequency and maximum inclination angle) of path oscillations compare well with various recent data for both solid disks and air bubbles.
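    For a single complex mode, amplitude equations of this kind take the generic Stuart-Landau form dA/dt = (sigma + i*omega)A - (l + i*c)|A|^2 A. A minimal integration sketch with illustrative coefficients (the paper derives the exact ones from the governing equations):

```python
# Stuart-Landau amplitude equation with illustrative coefficients:
# linear growth rate sigma, frequency omega, nonlinear saturation l, detuning c.
sigma, omega, l, c = 0.5, 2.0, 1.0, 0.3
A, dt = 0.01 + 0j, 1e-3  # small initial perturbation, forward-Euler step

for _ in range(40000):  # integrate to t = 40
    A += dt * ((sigma + 1j * omega) * A - (l + 1j * c) * abs(A) ** 2 * A)

# The oscillation amplitude saturates at sqrt(sigma / l)
print(round(abs(A), 4), round((sigma / l) ** 0.5, 4))
```

    The supercritical saturation at |A| = sqrt(sigma/l) is what sets the limit-cycle amplitude of the path oscillations near onset; the fluttering-vs-spiraling selection comes from the coupling between the two unstable modes, which this single-mode sketch omits.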

  8. Designing an optimal software intensive system acquisition: A game theoretic approach

    Science.gov (United States)

    Buettner, Douglas John

    The development of schedule-constrained software-intensive space systems is challenging. Case study data from national security space programs developed at the U.S. Air Force Space and Missile Systems Center (USAF SMC) provide evidence of the strong desire by contractors to skip or severely reduce software development design and early defect detection methods in these schedule-constrained environments. The research findings suggest recommendations to fully address these issues at numerous levels. However, the observations lead us to investigate modeling and theoretical methods to fundamentally understand what motivated this behavior in the first place. As a result, Madachy's inspection-based system dynamics model is modified to include unit testing and an integration test feedback loop. This Modified Madachy Model (MMM) is used as a tool to investigate the consequences of this behavior on the observed defect dynamics for two remarkably different case study software projects. Latin Hypercube sampling of the MMM with sample distributions for quality-, schedule-, and cost-driven strategies demonstrates that the higher-cost, higher-effort quality-driven strategies provide consistently better schedule performance than the schedule-driven up-front effort-reduction strategies. Game theory reasoning for schedule-driven engineers cutting corners on inspections and unit testing is based on the case study evidence and Austin's agency model to describe the observed phenomena. Game theory concepts are then used to argue that the source of the problem, and hence the solution to developers cutting corners on quality for schedule-driven system acquisitions, ultimately lies with the government. The game theory arguments also lead to the suggestion that the use of a multi-player dynamic Nash bargaining game provides a solution for the observed "lack of quality" game between the government (the acquirer) and "large-corporation" software developers. A note is provided that argues this multi
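    Latin Hypercube sampling, used above to explore the strategy space, can be sketched in a few lines (a generic implementation, not the study's code): each dimension is split into equal-probability strata, and exactly one sample falls in each stratum.

```python
import random

random.seed(42)

def latin_hypercube(n_samples, n_dims):
    """Generate n_samples points in [0, 1]^n_dims with exactly one sample
    per equal-probability stratum in every dimension."""
    samples = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        # One jittered point per stratum, then shuffle the pairing across samples
        points = [(k + random.random()) / n_samples for k in range(n_samples)]
        random.shuffle(points)
        for i in range(n_samples):
            samples[i][d] = points[i]
    return samples

# e.g. 10 samples over a hypothetical 3-dimensional strategy-weighting space
lhs = latin_hypercube(10, 3)
print(lhs[0])
```

    Compared with plain Monte Carlo, this stratification covers the input distributions with far fewer model runs, which matters when each sample is a full system-dynamics simulation.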

  9. A Two Species Bump-On-Tail Model With Relaxation for Energetic Particle Driven Modes

    Science.gov (United States)

    Aslanyan, V.; Porkolab, M.; Sharapov, S. E.; Spong, D. A.

    2017-10-01

    Energetic particle driven Alfvén Eigenmodes (AEs) observed in present day experiments exhibit various nonlinear behaviours, varying from steady-state amplitude at a fixed frequency to bursting amplitudes and sweeping frequency. Using the appropriate action-angle variables, the problem of resonant wave-particle interaction becomes effectively one-dimensional. Previously, a simple one-dimensional Bump-On-Tail (BOT) model has proven to be one of the most effective in describing characteristic nonlinear near-threshold wave evolution scenarios. In particular, dynamical friction causes bursting mode evolution, while diffusive relaxation may give steady-state, periodic or chaotic mode evolution. BOT has now been extended to include two populations of fast particles, with one dominated by dynamical friction at the resonance and the other by diffusion; the relative size of the populations determines the temporal evolution of the resulting wave. This suggests an explanation for recent observations on the TJ-II stellarator, where a transition between steady state and bursting occurred as the magnetic configuration varied. The two species model is then applied to burning plasma with drag-dominated alpha particles and diffusion-dominated ICRH accelerated minority ions. This work was supported by the US DoE and the RCUK Energy Programme [Grant Number EP/P012450/1].

  10. PATTERN-BASED AND REUSE-DRIVEN ARCHITECTING OF MOBILE CLOUD SOFTWARE

    OpenAIRE

    Aakash Ahmad, Ahmed B. Altamimi, Abdulrahman Alreshidi, Mohammad T. Alshammari, Numra Saeed, Jamal M. Aqib

    2018-01-01

    Context: Mobile Cloud Computing (MCC) represents the state-of-the-art technology that unifies mobile computing and cloud computing to develop systems that are portable yet resource sufficient. Mobile computing allows portable communication and context-aware computation; however, due to energy and resource constraints, mobile computing lacks performance for computationally intensive tasks. The cloud computing model uses the ‘as a service’ model - providing hardware and software services - to of...
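    The offloading trade-off that motivates MCC can be sketched with a standard back-of-envelope latency model (the function names and numbers are illustrative, not from the paper): a task is offloaded when upload time plus cloud execution time beats local execution time.

```python
def local_time(cycles, f_local):
    """Seconds to run the task on the device's own CPU."""
    return cycles / f_local

def offload_time(data_bits, bandwidth, cycles, f_cloud):
    """Seconds to upload the input and run the task on the cloud."""
    return data_bits / bandwidth + cycles / f_cloud

# Illustrative task: 2e9 CPU cycles, 1 MB (8e6 bits) of input to upload
cycles, data_bits = 2e9, 8e6
f_local, f_cloud = 1e9, 10e9   # 1 GHz handset vs a 10 GHz cloud share
bandwidth = 5e6                # 5 Mbit/s uplink

t_local = local_time(cycles, f_local)
t_cloud = offload_time(data_bits, bandwidth, cycles, f_cloud)
print("offload" if t_cloud < t_local else "local", t_local, t_cloud)
```

    With these numbers the cloud wins (1.8 s vs 2.0 s), but a slower uplink quickly flips the decision, which is why MCC architectures must weigh communication cost against computation gain per task.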

  12. Evolution of the 'Trick' Dynamic Software Executive and Model Libraries for Reusable Flight Software, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — In response to a need for cost-effective small satellite missions, Odyssey Space Research is proposing the development of a common flight software executive and a...

  13. COMPARATIVE ANALYSIS OF SOFTWARE DEVELOPMENT MODELS

    OpenAIRE

    Sandeep Kaur

    2017-01-01

    No geek is unfamiliar with the concept of the software development life cycle (SDLC). This research deals with the various SDLC models, covering the waterfall, spiral, iterative, agile, V-shaped, and prototype models. In the modern era, all software systems are fallible, as none can be guaranteed with certainty. This research therefore compares all aspects of the various models, with their pros and cons, so that it is easy to choose a particular model at the time of need.

  14. Models for composing software : an analysis of software composition and objects

    NARCIS (Netherlands)

    Bergmans, Lodewijk

    1999-01-01

    In this report, we investigate component-based software construction with a focus on composition. In particular we try to analyze the requirements and issues for components and software composition. As a means to understand this research area, we introduce a canonical model for representing

  15. Building Models of Installations to Recommend Applications in IoT Software Ecosystems

    DEFF Research Database (Denmark)

    Tomlein, Matus; Grønbæk, Kaj

    2016-01-01

    Internet of Things devices are driven by applications. To stimulate innovation on their platforms, manufacturers of embedded products are creating software ecosystems that enable third-parties develop applications for their devices. Since such applications are developed externally, their seamless...

  16. NHPP-Based Software Reliability Models Using Equilibrium Distribution

    Science.gov (United States)

    Xiao, Xiao; Okamura, Hiroyuki; Dohi, Tadashi

    Non-homogeneous Poisson processes (NHPPs) have gained much popularity in actual software testing phases to estimate the software reliability, the number of remaining faults in software and the software release timing. In this paper, we propose a new modeling approach for the NHPP-based software reliability models (SRMs) to describe the stochastic behavior of software fault-detection processes. The fundamental idea is to apply the equilibrium distribution to the fault-detection time distribution in NHPP-based modeling. We also develop efficient parameter estimation procedures for the proposed NHPP-based SRMs. Through numerical experiments, it can be concluded that the proposed NHPP-based SRMs outperform the existing ones in many data sets from the perspective of goodness-of-fit and prediction performance.
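    As a baseline for what an NHPP-based SRM looks like, here is the classic Goel-Okumoto mean value function (a standard textbook model, not the equilibrium-distribution variant proposed in the paper):

```python
import math

def mean_value(t, a, b):
    """Goel-Okumoto mean value function m(t) = a * (1 - exp(-b t)):
    the expected cumulative number of faults detected by time t."""
    return a * (1.0 - math.exp(-b * t))

def intensity(t, a, b):
    """Fault-detection rate lambda(t) = m'(t) = a * b * exp(-b t)."""
    return a * b * math.exp(-b * t)

# Illustrative parameters: ~100 total faults, detection rate 0.05 per week
a, b = 100.0, 0.05
for week in (0, 10, 20, 40):
    print(week, round(mean_value(week, a, b), 1), round(intensity(week, a, b), 2))
```

    m(t) rises from zero and saturates at a, the total fault content; the remaining-fault estimate a - m(t) and the decaying intensity are what feed release-timing decisions. The paper's contribution replaces the fault-detection time distribution inside this framework with its equilibrium distribution.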

  17. Lessons Learned in Software Testing A Context-Driven Approach

    CERN Document Server

    Kaner, Cem; Pettichord, Bret

    2008-01-01

    Decades of software testing experience condensed into the most important lessons learned.The world's leading software testing experts lend you their wisdom and years of experience to help you avoid the most common mistakes in testing software. Each lesson is an assertion related to software testing, followed by an explanation or example that shows you the how, when, and why of the testing lesson. More than just tips, tricks, and pitfalls to avoid, Lessons Learned in Software Testing speeds you through the critical testing phase of the software development project without the extensive trial an

  18. Study of the nonlinear imperfect software debugging model

    International Nuclear Information System (INIS)

    Wang, Jinyong; Wu, Zhibo

    2016-01-01

    In recent years there has been a dramatic proliferation of research on imperfect software debugging phenomena. Software debugging is a complex process and is affected by a variety of factors, including the environment, resources, personnel skills, and personnel psychologies. Therefore, the simple assumption that debugging is perfect is inconsistent with the actual software debugging process, wherein a new fault can be introduced when removing a fault. Furthermore, the fault introduction process is nonlinear, and the cumulative number of nonlinearly introduced faults increases over time. Thus, this paper proposes a nonlinear, NHPP imperfect software debugging model in consideration of the fact that fault introduction is a nonlinear process. The fitting and predictive power of the NHPP-based proposed model are validated through related experiments. Experimental results show that this model displays better fitting and predicting performance than the traditional NHPP-based perfect and imperfect software debugging models. S-confidence bounds are set to analyze the performance of the proposed model. This study also examines and discusses optimal software release-time policy comprehensively. In addition, this research on the nonlinear process of fault introduction is significant given the recent surge of studies on software-intensive products, such as cloud computing and big data. - Highlights: • Fault introduction is a nonlinear changing process during the debugging phase. • The assumption that the process of fault introduction is nonlinear is credible. • Our proposed model can better fit and accurately predict software failure behavior. • Research on fault introduction case is significant to software-intensive products.
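    A simple baseline for imperfect debugging, in which each fault removal introduces a new fault with probability alpha, can be checked numerically against its closed form. This is an illustrative linear-introduction variant; the paper's point is precisely that the introduction process is nonlinear:

```python
import math

# Imperfect-debugging NHPP baseline (illustrative, not the paper's model):
# fault content a(t) = a0 + alpha * m(t), so m'(t) = b * (a(t) - m(t)).
a0, b, alpha = 100.0, 0.1, 0.2

def m_closed(t):
    """Closed-form solution m(t) = a0/(1-alpha) * (1 - exp(-b*(1-alpha)*t))."""
    return a0 / (1 - alpha) * (1 - math.exp(-b * (1 - alpha) * t))

# Forward-Euler integration of the ODE as an independent check
m, dt = 0.0, 0.01
for _ in range(int(50 / dt)):
    m += dt * b * (a0 + alpha * m - m)

print(round(m, 2), round(m_closed(50.0), 2))
```

    Note that the cumulative detected faults approach a0/(1-alpha) = 125, exceeding the initial content of 100: the signature of imperfect debugging. A nonlinear introduction term, as the paper proposes, changes the shape of this curve rather than just its asymptote.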

  19. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
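    The identification/mitigation-tree idea can be sketched as rule matching over data-flow-diagram elements. Everything below (element kinds, threat rules, mitigation costs) is hypothetical, invented for illustration; AutSEC's actual trees differ:

```python
# Hypothetical identification rules: a threat applies to an element kind
# unless all listed safeguards are present on that element.
IDENTIFICATION_TREE = {
    "data_flow": [
        {"threat": "eavesdropping", "unless": {"encrypted"}},
        {"threat": "tampering", "unless": {"encrypted", "signed"}},
    ],
    "data_store": [
        {"threat": "information disclosure", "unless": {"access_controlled"}},
    ],
}

# Hypothetical mitigation advice per threat, with a rough relative cost
MITIGATIONS = {
    "eavesdropping": ("enable TLS on the channel", 1),
    "tampering": ("sign messages end-to-end", 2),
    "information disclosure": ("add role-based access control", 3),
}

def analyze(dfd_elements):
    """Match identification rules against (name, kind, safeguards) elements
    and return findings sorted so the cheapest mitigations come first."""
    findings = []
    for name, kind, safeguards in dfd_elements:
        for rule in IDENTIFICATION_TREE.get(kind, []):
            if not rule["unless"] <= safeguards:  # some safeguard is missing
                advice, cost = MITIGATIONS[rule["threat"]]
                findings.append((name, rule["threat"], advice, cost))
    return sorted(findings, key=lambda f: f[3])

dfd = [("client->server", "data_flow", {"encrypted"}),
       ("user_db", "data_store", set())]
results = analyze(dfd)
for finding in results:
    print(finding)
```

    The encrypted flow still yields a tampering finding (it lacks signing), while the unprotected store yields a disclosure finding; sorting by cost mirrors AutSEC's stated concern for advising mitigations under cost constraints.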

  20. A SYSTEMATIC STUDY OF SOFTWARE QUALITY MODELS

    OpenAIRE

    Vilas M. Thakare; Ashwin B. Tomar

    2011-01-01

    This paper aims to provide a basis for software quality model research, through a systematic study of papers. It identifies nearly seventy software quality research papers from journals and classifies the papers as per research topic, estimation approach, study context and data set. The paper results, combined with other knowledge, provide support for recommendations in future software quality model research: to increase the area of search for relevant studies, carefully select the papers within a set ...

  1. Radiobiological modelling with MarCell software

    International Nuclear Information System (INIS)

    Hasan, J.S.; Jones, T.D.

    1996-01-01

    Jones introduced a bone marrow radiation cell kinetics model with great potential for application in the fields of health physics, radiation research, and medicine. However, until recently, only the model developers have been able to apply it because of the complex array of biological and physical assignments needed for evaluation of a particular radiation exposure protocol. The purpose of this article is to illustrate the use of MarCell (MARrow CELL Kinetics) software for MS-DOS, a user-friendly computer implementation of that mathematical model that allows almost anyone with an elementary knowledge of radiation physics and/or medical procedures to apply the model. A hands-on demonstration of the software will be given by guiding the user through evaluation of a medical total body irradiation protocol and a nuclear fallout scenario. A brief overview of the software is given in the Appendix

  2. A Python Calculator for Supernova Remnant Evolution

    Science.gov (United States)

    Leahy, D. A.; Williams, J. E.

    2017-05-01

    A freely available Python code for modeling supernova remnant (SNR) evolution has been created. This software is intended for two purposes: to understand SNR evolution and to use in modeling observations of SNR for obtaining good estimates of SNR properties. It includes all phases for the standard path of evolution for spherically symmetric SNRs. In addition, alternate evolutionary models are available, including evolution in a cloudy ISM, the fractional energy-loss model, and evolution in a hot low-density ISM. The graphical interface takes in various parameters and produces outputs such as shock radius and velocity versus time, as well as SNR surface brightness profile and spectrum. Some interesting properties of SNR evolution are demonstrated using the program.
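    The adiabatic (Sedov-Taylor) phase that such a calculator covers has a standard closed-form solution; a sketch using the textbook formula R(t) = xi * (E t^2 / rho)^(1/5) (not the Leahy & Williams code itself):

```python
XI = 1.15            # dimensionless Sedov constant for gamma = 5/3

E = 1e51             # explosion energy, erg
n0 = 1.0             # ambient density, H atoms per cm^3
rho = n0 * 1.67e-24  # mass density, g/cm^3
PC = 3.086e18        # cm per parsec
YR = 3.156e7         # seconds per year

def radius_pc(t_yr):
    """Shock radius in parsecs at remnant age t_yr (years)."""
    t = t_yr * YR
    return XI * (E * t * t / rho) ** 0.2 / PC

def velocity_kms(t_yr):
    """Shock velocity v = dR/dt = (2/5) R / t, in km/s."""
    return 0.4 * radius_pc(t_yr) * PC / (t_yr * YR) / 1e5

for age in (1e3, 1e4):
    print(age, round(radius_pc(age), 2), round(velocity_kms(age), 0))
```

    For E = 1e51 erg in a 1 cm^-3 medium this gives a radius of roughly 5 pc and a shock speed of roughly 2000 km/s at an age of 1000 years, with the characteristic R proportional to t^(2/5) scaling.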

  3. Proposing a Qualitative Approach for Corporate Competitive Capability Modeling in High-Tech Business (Case study: Software Industry

    Directory of Open Access Journals (Sweden)

    Mahmoud Saremi Saremi

    2010-09-01

    The evolution of the global business trend for ICT-based products in recent decades shows the intensive activity of pioneering developing countries to gain a powerful competitive position in the global software industry. In this research, given the importance of the competition issue for top managers of Iranian software companies, a conceptual model has been developed for the concept of Corporate Competitive Capability. First, after describing the research problem, we present a comparative review of recent theories of the firm and of competition that have been applied by different researchers in the high-tech and knowledge-intensive organization field. Afterwards, following a detailed review of the literature and previous research papers, an initial research framework and the applied research method are proposed. The final section of the paper describes the results of the research at the different steps of the qualitative modeling process. The modeling results of this paper are the agreed concepts related to corporate competitive capability, the elicited and analyzed expert cause maps, the elicited collective causal maps, and the final model proposed for the software industry.

  4. Geometry of quantal adiabatic evolution driven by a non-Hermitian Hamiltonian

    International Nuclear Information System (INIS)

    Wu Zhaoyan; Yu Ting; Zhou Hongwei

    1994-01-01

    It is shown, by using a counterexample which is exactly solvable, that the quantal adiabatic theorem does not generally hold for a non-Hermitian driving Hamiltonian, even if it varies extremely slowly. The condition for the quantal adiabatic theorem to hold for non-Hermitian driving Hamiltonians is given. The adiabatic evolutions driven by a non-Hermitian Hamiltonian provide examples of a new geometric structure, that is, a vector bundle in which the inner product of two parallel-transported vectors generally changes. A new geometric concept, the attenuation tensor, is naturally introduced to describe the decay or flourishing of the open quantum system. It is constructed in terms of the spectral projector of the Hamiltonian. (orig.)

  5. A data-driven approach for modeling post-fire debris-flow volumes and their uncertainty

    Science.gov (United States)

    Friedel, Michael J.

    2011-01-01

    This study demonstrates the novel application of genetic programming to evolve nonlinear post-fire debris-flow volume equations from variables associated with a data-driven conceptual model of the western United States. The search space is constrained using a multi-component objective function that simultaneously minimizes root-mean squared and unit errors for the evolution of fittest equations. An optimization technique is then used to estimate the limits of nonlinear prediction uncertainty associated with the debris-flow equations. In contrast to a published multiple linear regression three-variable equation, linking basin area with slopes greater or equal to 30 percent, burn severity characterized as area burned moderate plus high, and total storm rainfall, the data-driven approach discovers many nonlinear and several dimensionally consistent equations that are unbiased and have less prediction uncertainty. Of the nonlinear equations, the best performance (lowest prediction uncertainty) is achieved when using three variables: average basin slope, total burned area, and total storm rainfall. Further reduction in uncertainty is possible for the nonlinear equations when dimensional consistency is not a priority and by subsequently applying a gradient solver to the fittest solutions. The data-driven modeling approach can be applied to nonlinear multivariate problems in all fields of study.
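    The equation-discovery idea can be illustrated with a toy random search over candidate expressions, a drastically simplified stand-in for genetic programming; the target relationship, operators, and data are invented for the example:

```python
import math
import random

random.seed(7)

# Synthetic data with a hidden relationship V = 3 * x1 * sqrt(x2),
# standing in for (basin slope, burned area) -> debris-flow volume.
data = [(x1, x2, 3.0 * x1 * math.sqrt(x2))
        for x1 in (1.0, 2.0, 4.0) for x2 in (1.0, 4.0, 9.0)]

# A tiny operator set; GP would search a much richer space of expression trees
OPS = [lambda a, b: a + b,
       lambda a, b: a * b,
       lambda a, b: a * math.sqrt(abs(b))]

def random_model():
    """Draw one candidate: a random operator with a random coefficient."""
    op = random.choice(OPS)
    coef = random.uniform(0.0, 5.0)
    return (lambda x1, x2, op=op, c=coef: c * op(x1, x2)), coef

best, best_rmse = None, float("inf")
for _ in range(20000):
    model, coef = random_model()
    rmse = math.sqrt(sum((model(x1, x2) - v) ** 2
                         for x1, x2, v in data) / len(data))
    if rmse < best_rmse:
        best, best_rmse = (model, coef), rmse

print(round(best_rmse, 3), round(best[1], 3))
```

    Real genetic programming evolves the expression trees themselves under a multi-component objective, as the study does; this sketch only shows the core loop of proposing candidate equations and keeping the fittest by error.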

  6. Model-driven Service Engineering with SoaML

    Science.gov (United States)

    Elvesæter, Brian; Carrez, Cyril; Mohagheghi, Parastoo; Berre, Arne-Jørgen; Johnsen, Svein G.; Solberg, Arnor

    This chapter presents a model-driven service engineering (MDSE) methodology that uses OMG MDA specifications such as BMM, BPMN and SoaML to identify and specify services within a service-oriented architecture. The methodology takes advantage of business modelling practices and provides a guide to service modelling with SoaML. The presentation is case-driven and illuminated using the telecommunication example. The chapter focuses in particular on the use of the SoaML modelling language as a means for expressing service specifications that are aligned with business models and can be realized in different platform technologies.

  7. Model-driven approach to data collection and reporting for quality improvement.

    Science.gov (United States)

    Curcin, Vasa; Woodcock, Thomas; Poots, Alan J; Majeed, Azeem; Bell, Derek

    2014-12-01

    Continuous data collection and analysis have been shown essential to achieving improvement in healthcare. However, the data required for local improvement initiatives are often not readily available from hospital Electronic Health Record (EHR) systems or not routinely collected. Furthermore, improvement teams are often restricted in time and funding thus requiring inexpensive and rapid tools to support their work. Hence, the informatics challenge in healthcare local improvement initiatives consists of providing a mechanism for rapid modelling of the local domain by non-informatics experts, including performance metric definitions, and grounded in established improvement techniques. We investigate the feasibility of a model-driven software approach to address this challenge, whereby an improvement model designed by a team is used to automatically generate required electronic data collection instruments and reporting tools. To that goal, we have designed a generic Improvement Data Model (IDM) to capture the data items and quality measures relevant to the project, and constructed Web Improvement Support in Healthcare (WISH), a prototype tool that takes user-generated IDM models and creates a data schema, data collection web interfaces, and a set of live reports, based on Statistical Process Control (SPC) for use by improvement teams. The software has been successfully used in over 50 improvement projects, with more than 700 users. We present in detail the experiences of one of those initiatives, Chronic Obstructive Pulmonary Disease project in Northwest London hospitals. The specific challenges of improvement in healthcare are analysed and the benefits and limitations of the approach are discussed. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
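    One common SPC chart used in such live reports is the individuals (XmR) chart; a minimal sketch of the limit computation with illustrative data (the 2.66 constant is the standard XmR factor, 3/d2 with d2 = 1.128):

```python
# Illustrative weekly counts for a hypothetical improvement measure
data = [12, 15, 11, 14, 13, 16, 12, 18, 14, 13, 15, 12]

mean = sum(data) / len(data)
# Average moving range between consecutive points estimates common-cause variation
moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)

ucl = mean + 2.66 * mr_bar  # upper control limit
lcl = mean - 2.66 * mr_bar  # lower control limit

# Points outside the limits signal special-cause variation worth investigating
signals = [x for x in data if x > ucl or x < lcl]
print(round(mean, 2), round(lcl, 2), round(ucl, 2), signals)
```

    An improvement team reads such a chart by distinguishing common-cause noise (points within the limits) from special-cause signals, which is exactly the interpretation the generated reports are meant to support.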

  8. Integrating Behaviour in Software Models: An Event Coordination Notation

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2011-01-01

    One of the main problems in model-based software engineering is modelling behaviour in such a way that the behaviour models can be easily integrated with each other, with the structural software models and with pre-existing software. In this paper, we propose an event coordination notation (ECNO)...

  9. Optical modeling of induction-linac driven free-electron lasers

    International Nuclear Information System (INIS)

    Scharlemann, E.T.; Fawley, W.M.

    1986-01-01

    The free-electron laser (FEL) simulation code FRED, developed at Lawrence Livermore National Laboratory (LLNL) primarily to model single-pass FEL amplifiers driven by induction linear accelerators, is described. The main emphasis is on the modeling of optical propagation in the laser and on the differences between the requirements for modeling rf-linac-driven vs. induction-linac-driven FELs. Examples of optical guiding and mode cleanup are presented for a 50 μm FEL

  10. Multinomial-exponential reliability function: a software reliability model

    International Nuclear Information System (INIS)

    Saiz de Bustamante, Amalio; Saiz de Bustamante, Barbara

    2003-01-01

    The multinomial-exponential reliability function (MERF) was developed during a detailed study of the software failure/correction processes. Later on, MERF was approximated by a much simpler exponential reliability function (EARF), which keeps most of MERF's mathematical properties, so the two functions together make up a single reliability model. The reliability model MERF/EARF considers the software failure process as a non-homogeneous Poisson process (NHPP), and the repair (correction) process as a multinomial distribution. The model supposes that both processes are statistically independent. The paper discusses the model's theoretical basis, its mathematical properties and its application to software reliability. Applications of the model to inspection and maintenance of physical systems are also foreseen. The paper includes a complete numerical example of the application of the model to a software reliability analysis.

  11. The Program for Climate Model Diagnosis and Intercomparison (PCMDI) Software Development: Applications, Infrastructure, and Middleware/Networks

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2011-06-30

    The status of and future plans for the Program for Climate Model Diagnosis and Intercomparison (PCMDI) hinge on software that PCMDI is either currently distributing or plans to distribute to the climate community in the near future. These software products include standard conventions, national and international federated infrastructures, and community analysis and visualization tools. This report also mentions other secondary software not necessarily led by or developed at PCMDI to provide a complete picture of the overarching applications, infrastructures, and middleware/networks. Much of the software described anticipates the use of future technologies envisioned over the next 1 to 10 years. These technologies, together with the software, will be the catalyst required to address extreme-scale data warehousing, scalability issues, and service-level requirements for a diverse set of well-known projects essential for predicting climate change. These tools, unlike the static analysis tools of the past, will support the co-existence of many users in a productive, shared virtual environment. This advanced technological world driven by extreme-scale computing and the data it generates will increase scientists’ productivity, exploit national and international relationships, and push research to new levels of understanding.

  12. Evolving software products, the design of a water-related modeling software ecosystem

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    2017-01-01

    … more than 50 years ago. However, radical changes in which software products evolve disruptively in both their software engineering and their organizational and business aspects are rather rare. In this paper, we report on the transformation of one of the market-leading product series in water-related calculation and modeling from a traditional business-as-usual series of products to an evolutionary software ecosystem. We do so by relying on existing concepts of software ecosystem analysis to analyze the future ecosystem. We report and elaborate on the main focus points necessary for this transition, and we argue for the generalization of our focus points to the transition from traditional business-as-usual software products to software ecosystems.

  13. Model-Driven Approach for Body Area Network Application Development.

    Science.gov (United States)

    Venčkauskas, Algimantas; Štuikys, Vytautas; Jusas, Nerijus; Burbaitė, Renata

    2016-05-12

    This paper introduces the sensor-networked IoT model as a prototype to support the design of Body Area Network (BAN) applications for healthcare. Using the model, we analyze the synergistic effect of the functional requirements (data collection from the human body and transferring it to the top level) and non-functional requirements (trade-offs between energy-security-environmental factors, treated as Quality-of-Service (QoS)). We use feature models to represent the requirements at the earliest stage for the analysis and describe a model-driven methodology to design the possible BAN applications. Firstly, we specify the requirements as the problem domain (PD) variability model for the BAN applications. Next, we introduce the generative technology (meta-programming as the solution domain (SD)) and the mapping procedure to map the PD feature-based variability model onto the SD feature model. Finally, we create an executable meta-specification that represents the BAN functionality to describe the variability of the problem domain through transformations. The meta-specification (along with the meta-language processor) is a software generator for multiple BAN-oriented applications. We validate the methodology with experiments and a case study to generate a family of programs for the BAN sensor controllers. This makes it possible to obtain an adequate measure of QoS efficiently through the interactive adjustment of the meta-parameter values and the re-generation process for the concrete BAN application.
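    The meta-programming step can be pictured as a generator that maps a problem-domain feature selection onto solution-domain controller code. The feature names and emitted calls below are illustrative placeholders, not taken from the paper's feature models:

    ```python
    def generate_controller(features):
        # Toy generator: a feature selection (problem domain) drives which
        # code fragments (solution domain) appear in the generated controller.
        sr = features.get("sampling_hz", 50)        # hypothetical feature
        enc = features.get("encryption", "none")    # hypothetical feature
        lines = [
            "void loop() {",
            f"    read_sensors();        // every {1000 // sr} ms",
        ]
        if enc != "none":
            # The security feature is variability: this line only exists
            # in variants that selected an encryption scheme.
            lines.append(f'    encrypt_payload("{enc}");')
        lines.append("    transmit();")
        lines.append("}")
        return "\n".join(lines)
    ```

    Regenerating with different meta-parameter values (e.g. a higher sampling rate, or no encryption to save energy) yields a different member of the program family, which mirrors the QoS trade-off adjustment described above.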

  14. A model of nitrous oxide evolution from soil driven by rainfall events. I - Model structure and sensitivity. II - Model applications

    Science.gov (United States)

    Changsheng, LI; Frolking, Steve; Frolking, Tod A.

    1992-01-01

    Simulations of N2O and CO2 emissions from soils were conducted with a rain-event driven, process-oriented model (DNDC) of nitrogen and carbon cycling processes in soils. The magnitude and trends of simulated N2O (or N2O + N2) and CO2 emissions were consistent with the results obtained in field experiments. The successful simulation of these emissions from the range of soil types examined demonstrates that the DNDC will be a useful tool for the study of linkages among climate, soil-atmosphere interactions, land use, and trace gas fluxes.

  15. On Model Based Synthesis of Embedded Control Software

    OpenAIRE

    Alimguzhin, Vadim; Mari, Federico; Melatti, Igor; Salvo, Ivano; Tronci, Enrico

    2012-01-01

    Many Embedded Systems are indeed Software Based Control Systems (SBCSs), that is control systems whose controller consists of control software running on a microcontroller device. This motivates investigation on Formal Model Based Design approaches for control software. Given the formal model of a plant as a Discrete Time Linear Hybrid System and the implementation specifications (that is, number of bits in the Analog-to-Digital (AD) conversion) correct-by-construction control software can be...

  16. Refinement and verification in component-based model-driven design

    DEFF Research Database (Denmark)

    Chen, Zhenbang; Liu, Zhiming; Ravn, Anders Peter

    2009-01-01

    Modern software development is complex as it has to deal with many different and yet related aspects of applications. In practical software engineering this is now handled by a UML-like modelling approach in which different aspects are modelled by different notations. Component-based and object-oriented … be integrated in computer-aided software engineering (CASE) tools for adding formally supported checking, transformation and generation facilities.

  17. Maximum Entropy Discrimination Poisson Regression for Software Reliability Modeling.

    Science.gov (United States)

    Chatzis, Sotirios P; Andreou, Andreas S

    2015-11-01

    Reliably predicting software defects is one of the most significant tasks in software engineering. Two of the major components of modern software reliability modeling approaches are: 1) extraction of salient features for software system representation, based on appropriately designed software metrics and 2) development of intricate regression models for count data, to allow effective software reliability data modeling and prediction. Surprisingly, research in the latter frontier of count data regression modeling has been rather limited. More specifically, a lack of simple and efficient algorithms for posterior computation has made the Bayesian approaches appear unattractive, and thus underdeveloped in the context of software reliability modeling. In this paper, we try to address these issues by introducing a novel Bayesian regression model for count data, based on the concept of max-margin data modeling, effected in the context of a fully Bayesian model treatment with simple and efficient posterior distribution updates. Our novel approach yields a more discriminative learning technique, making more effective use of our training data during model inference. In addition, it allows better handling of uncertainty in the modeled data, which can be a significant problem when the training data are limited. We derive elegant inference algorithms for our model under the mean-field paradigm and exhibit its effectiveness using the publicly available benchmark data sets.
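    The baseline against which such Bayesian count-data models compete is plain maximum-likelihood Poisson regression. A minimal sketch (log link, gradient ascent on the log-likelihood; this is not the paper's max-margin model, and the learning-rate and iteration settings are illustrative):

    ```python
    import numpy as np

    def fit_poisson_regression(X, y, lr=0.1, iters=2000):
        # Maximum-likelihood Poisson regression with a log link, fitted by
        # gradient ascent on the log-likelihood (a simple frequentist
        # baseline, not the Bayesian max-margin model of the paper).
        X = np.column_stack([np.ones(len(X)), X])  # prepend an intercept column
        w = np.zeros(X.shape[1])
        for _ in range(iters):
            mu = np.exp(X @ w)                     # predicted mean defect counts
            w += lr * (X.T @ (y - mu)) / len(y)    # log-likelihood gradient step
        return w

    def predict(w, X):
        X = np.column_stack([np.ones(len(X)), X])
        return np.exp(X @ w)
    ```

    The log link guarantees non-negative predicted counts, which is exactly the property that makes Poisson-family regression natural for defect data.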

  18. Software Assurance Curriculum Project Volume 3: Master of Software Assurance Course Syllabi

    Science.gov (United States)

    2011-07-01

    …methods and process of model-driven development. • Pressman, Roger S., Software Engineering: A Practitioner's Approach, 6th ed. McGraw Hill, 2009 … • [Bishop 2002] Chapter 18 • [Allen 2008] Chapters 1, 2 • [Pressman 2009] Chapter 1 • [Merkow 2010] Chapter 3 • [Mouratidis 2007] … • [Allen 2008] Chapters 3, 4 • [Pressman 2009] Chapters 3, 4 • [Merkow 2010] Chapter 5 • [DHS 2008-2009a] • [Mellado 2010] • [CERT 2009]

  19. Toward a User Driven Innovation for Distributed Software Teams

    Science.gov (United States)

    Hossain, Liaquat; Zhou, David

    The software industry has emerged to include some of the most revolutionized distributed work groups; however, not all such groups achieve their set goals and some even fail miserably. The distributed nature of open source software project teams provides an intriguing context for the study of distributed coordination. OSS team structures have traditionally been geographically dispersed and, therefore, the coordination of post-release activities such as testing are made difficult due to the fact that the only means of communication is via electronic forms, such as e-mail or message boards and forums. Nevertheless, large scale, complex, and innovative software packages have been the fruits of labor for some OSS teams set in such coordination-unfriendly environments, while others end in flames. Why are some distributed work groups more effective than others? In our current communication-enriched environment, best practices for coordination are adopted by all software projects yet some still fall by the wayside. Does the team structure have bearing on the success of the project? How does the communication between the team and external parties affect the project's ultimate success or failure? In this study, we seek to answer these questions by applying existing theories from social networks and their analytical methods in the coordination of defect management activities found in OSS projects. We propose the social network based theoretical model for exploring distributed coordination structures and apply that for the case of the OSS defect management process for exploring the structural properties, which induce the greatest coordination performance. The outcome suggests that there is correlation between certain network measures such as density, centrality, and betweenness and coordination performance measures of defect management systems such as quality and timeliness.
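    Two of the network measures named here (density and centrality) can be computed directly from an edge list. A minimal sketch, on an invented toy graph rather than the study's OSS defect-management data:

    ```python
    def density(nodes, edges):
        # Fraction of possible undirected ties that are actually present.
        n = len(nodes)
        return 0.0 if n < 2 else 2 * len(edges) / (n * (n - 1))

    def degree_centrality(nodes, edges):
        # Normalized degree: each node's tie count over the maximum possible (n - 1).
        deg = {v: 0 for v in nodes}
        for u, v in edges:
            deg[u] += 1
            deg[v] += 1
        n = len(nodes)
        return {v: d / (n - 1) for v, d in deg.items()}
    ```

    In a star-shaped coordination structure, for example, the hub has centrality 1.0 while the periphery sits at 1/(n-1); betweenness (not sketched here) would make the hub's brokerage role even more pronounced.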

  20. Six Sigma software development

    CERN Document Server

    Tayntor, Christine B

    2002-01-01

    Since Six Sigma has had marked success in improving quality in other settings, and since the quality of software remains poor, it seems a natural evolution to apply the concepts and tools of Six Sigma to system development and the IT department. Until now however, there were no books available that applied these concepts to the system development process. Six Sigma Software Development fills this void and illustrates how Six Sigma concepts can be applied to all aspects of the evolving system development process. It includes the traditional waterfall model and in the support of legacy systems,

  1. A Comparison Between Five Models Of Software Engineering

    OpenAIRE

    Nabil Mohammed Ali Munassar; A. Govardhan

    2010-01-01

    This research deals with a vital and important issue in the computer world. It is concerned with the software management processes that examine the area of software development through development models, which are known as the software development life cycle. It presents five development models, namely waterfall, iteration, V-shaped, spiral, and extreme programming. These models have advantages as well as disadvantages. Therefore, the main objective of this research is to represent diff...

  2. Just-in-time Database-Driven Web Applications

    Science.gov (United States)

    2003-01-01

    "Just-in-time" database-driven Web applications are inexpensive, quickly-developed software that can be put to many uses within a health care organization. Database-driven Web applications garnered 73873 hits on our system-wide intranet in 2002. They enabled collaboration and communication via user-friendly Web browser-based interfaces for both mission-critical and patient-care-critical functions. Nineteen database-driven Web applications were developed. The application categories that comprised 80% of the hits were results reporting (27%), graduate medical education (26%), research (20%), and bed availability (8%). The mean number of hits per application was 3888 (SD = 5598; range, 14-19879). A model is described for just-in-time database-driven Web application development and an example given with a popular HTML editor and database program. PMID:14517109

  3. Presenting an evaluation model of the trauma registry software.

    Science.gov (United States)

    Asadi, Farkhondeh; Paydar, Somayeh

    2018-04-01

    Trauma accounts for about 10% of deaths worldwide and is considered a global concern. This problem has led healthcare policy makers and managers to adopt a basic strategy in this context. A trauma registry plays an important and basic role in decreasing mortality and the disabilities caused by trauma injuries. Today, different software systems are designed for trauma registries. Evaluating this software improves management and increases the efficiency and effectiveness of these systems. Therefore, the aim of this study is to present an evaluation model for trauma registry software. The present study is applied research. In this study, general and specific criteria of trauma registry software were identified by reviewing the literature, including books, articles, scientific documents, valid websites, and related software in this domain. According to these general and specific criteria and the related software, a model for evaluating trauma registry software was proposed. Based on the proposed model, a checklist was designed and its validity and reliability evaluated. The model was presented, using the Delphi technique, to 12 experts and specialists. To analyze the results, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, once the model was approved by the experts and professionals, the final version of the evaluation model for trauma registry software was presented. The criteria of trauma registry software were organized into two groups: 1- general criteria and 2- specific criteria. General criteria of trauma registry software were classified into four main categories: 1- usability, 2- security, 3- maintainability, and 4- interoperability. Specific criteria were divided into four main categories: 1- data submission and entry, 2- reporting, 3- quality control, and 4- decision and research support.
The model presented in this research introduces important general and specific criteria of trauma registry software

  4. Evolution and adaptation in Pseudomonas aeruginosa biofilms driven by mismatch repair system-deficient mutators.

    Directory of Open Access Journals (Sweden)

    Adela M Luján

    Full Text Available Pseudomonas aeruginosa is an important opportunistic pathogen causing chronic airway infections, especially in cystic fibrosis (CF) patients. The majority of CF patients acquire P. aeruginosa during early childhood, and most of them develop chronic infections resulting in severe lung disease, which are rarely eradicated despite intensive antibiotic therapy. Current knowledge indicates that three major adaptive strategies, biofilm development, phenotypic diversification, and mutator phenotypes [driven by a defective mismatch repair system (MRS)], play important roles in P. aeruginosa chronic infections, but the relationship between these strategies is still poorly understood. We have used the flow-cell biofilm model system to investigate the impact of the mutS-associated mutator phenotype on development, dynamics, diversification and adaptation of P. aeruginosa biofilms. Through competition experiments we demonstrate for the first time that P. aeruginosa MRS-deficient mutators had enhanced adaptability over wild-type strains when grown in structured biofilms but not as planktonic cells. This advantage was associated with enhanced micro-colony development and increased rates of phenotypic diversification, evidenced by biofilm architecture features and by a wider range and proportion of morphotypic colony variants, respectively. Additionally, morphotypic variants generated in mutator biofilms showed increased competitiveness, providing further evidence for mutator-driven adaptive evolution in the biofilm mode of growth. This work helps to understand the basis for the specific high proportion and role of mutators in chronic infections, where P. aeruginosa develops in biofilm communities.

  5. Control software architecture and operating modes of the Model M-2 maintenance system

    Energy Technology Data Exchange (ETDEWEB)

    Satterlee, P.E. Jr.; Martin, H.L.; Herndon, J.N.

    1984-04-01

    The Model M-2 maintenance system is the first completely digitally controlled servomanipulator. The M-2 system allows dexterous operations to be performed remotely using bilateral force-reflecting master/slave techniques, and its integrated operator interface takes advantage of touch-screen-driven menus to allow selection of all possible operating modes. The control system hardware for this system has been described previously. This paper describes the architecture of the overall control system. The system's various modes of operation are identified, the software implementation of each is described, system diagnostic routines are described, and highlights of the computer-augmented operator interface are discussed. 3 references, 5 figures.

  6. Control software architecture and operating modes of the Model M-2 maintenance system

    International Nuclear Information System (INIS)

    Satterlee, P.E. Jr.; Martin, H.L.; Herndon, J.N.

    1984-04-01

    The Model M-2 maintenance system is the first completely digitally controlled servomanipulator. The M-2 system allows dexterous operations to be performed remotely using bilateral force-reflecting master/slave techniques, and its integrated operator interface takes advantage of touch-screen-driven menus to allow selection of all possible operating modes. The control system hardware for this system has been described previously. This paper describes the architecture of the overall control system. The system's various modes of operation are identified, the software implementation of each is described, system diagnostic routines are described, and highlights of the computer-augmented operator interface are discussed. 3 references, 5 figures

  7. EXPLORING DATA-DRIVEN SPECTRAL MODELS FOR APOGEE M DWARFS

    Science.gov (United States)

    Lua Birky, Jessica; Hogg, David; Burgasser, Adam J.; Jessica Birky

    2018-01-01

    The Cannon (Ness et al. 2015; Casey et al. 2016) is a flexible, data-driven spectral modeling and parameter inference framework, demonstrated on high-resolution Apache Point Galactic Evolution Experiment (APOGEE; λ/Δλ~22,500, 1.5-1.7µm) spectra of giant stars to estimate stellar labels (Teff, logg, [Fe/H], and chemical abundances) to precisions higher than those of the model-grid pipeline. The lack of reliable stellar parameters reported by the APOGEE pipeline for temperatures less than ~3550K motivates extension of this approach to M dwarf stars. Using a training set of 51 M dwarfs with spectral types ranging M0-M9 obtained from SDSS optical spectra, we demonstrate that the Cannon can infer spectral types to a precision of +/-0.6 types, making it an effective tool for classifying high-resolution near-infrared spectra. We discuss the potential for extending this work to determine the physical stellar labels Teff, logg, and [Fe/H]. This work is supported by the SDSS Faculty and Student (FAST) initiative.
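    At its core, a Cannon-style data-driven model is a per-pixel polynomial in the stellar labels fitted by least squares. A minimal sketch of that idea (quadratic label expansion, no per-pixel noise weighting, entirely synthetic data; this is the spirit of the method, not the published implementation):

    ```python
    import numpy as np

    def design_matrix(labels):
        # Quadratic expansion of the label vector: [1, labels, all pairwise products],
        # in the spirit of the Cannon's polynomial-in-labels spectral model.
        n, L = labels.shape
        quad = np.einsum('ni,nj->nij', labels, labels).reshape(n, L * L)
        return np.hstack([np.ones((n, 1)), labels, quad])

    def train(labels, fluxes):
        # One least-squares fit per wavelength pixel (columns of `fluxes`).
        A = design_matrix(labels)
        coeffs, *_ = np.linalg.lstsq(A, fluxes, rcond=None)
        return coeffs

    def predict(coeffs, labels):
        # Generate model spectra for new label values.
        return design_matrix(labels) @ coeffs
    ```

    Training on stars with known labels, then inverting `predict` for new spectra, is what lets the approach transfer to regimes (here, M dwarfs) where grid-based pipelines struggle.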

  8. A Study On Traditional And Evolutionary Software Development Models

    Directory of Open Access Journals (Sweden)

    Kamran Rasheed

    2017-07-01

    Full Text Available Today computing technologies are becoming the backbone of organizations and support individual work; to be useful, a computing device needs software. A set of instructions, or a computer program, is known as software. Software is developed through traditional or newer, evolutionary models. Software development has become a key and successful business nowadays; without software, all hardware is useless. The collective steps performed in developing software are known as the software development life cycle (SDLC). There are adaptive and predictive models for developing software. Predictive models are those already well defined, such as the waterfall, spiral, prototype, and V-shaped models, while adaptive models include agile Scrum. The methodologies of both kinds have their own procedures and steps. Predictive models are static and adaptive models are dynamic: changes cannot easily be made in a predictive process, while adaptive processes accommodate change. The purpose of this study is to become familiar with all of these models and to discuss their uses and development steps. This discussion will help in deciding which model to use in which circumstances and what development steps each model includes.

  9. Agile IT: Thinking in User-Centric Models

    Science.gov (United States)

    Margaria, Tiziana; Steffen, Bernhard

    We advocate a new teaching direction for modern CS curricula: extreme model-driven development (XMDD), a new development paradigm designed to continuously involve the customer/application expert throughout the whole systems' life cycle. Based on the `One-Thing Approach', which works by successively enriching and refining one single artifact, system development becomes in essence a user-centric orchestration of intuitive service functionality. XMDD differs radically from classical software development, which, in our opinion is no longer adequate for the bulk of application programming - in particular when it comes to heterogeneous, cross organizational systems which must adapt to rapidly changing market requirements. Thus there is a need for new curricula addressing this model-driven, lightweight, and cooperative development paradigm that puts the user process in the center of the development and the application expert in control of the process evolution.

  10. A non-linear dimension reduction methodology for generating data-driven stochastic input models

    Science.gov (United States)

    Ganapathysubramanian, Baskar; Zabaras, Nicholas

    2008-06-01

    Stochastic analysis of random heterogeneous media (polycrystalline materials, porous media, functionally graded materials) provides information of significance only if realistic input models of the topology and property variations are used. This paper proposes a framework to construct such input stochastic models for the topology and thermal diffusivity variations in heterogeneous media using a data-driven strategy. Given a set of microstructure realizations (input samples) generated from given statistical information about the medium topology, the framework constructs a reduced-order stochastic representation of the thermal diffusivity. This problem of constructing a low-dimensional stochastic representation of property variations is analogous to the problem of manifold learning and parametric fitting of hyper-surfaces encountered in image processing and psychology. Denote by M the set of microstructures that satisfy the given experimental statistics. A non-linear dimension reduction strategy is utilized to map M to a low-dimensional region, A. We first show that M is a compact manifold embedded in a high-dimensional input space R^n. An isometric mapping F from M to a low-dimensional, compact, connected set A ⊂ R^d (d ≪ n) is constructed. Given only a finite set of samples of the data, the methodology uses arguments from graph theory and differential geometry to construct the isometric transformation F:M→A. Asymptotic convergence of the representation of M by A is shown. This mapping F serves as an accurate, low-dimensional, data-driven representation of the property variations. The reduced-order model of the material topology and thermal diffusivity variations is subsequently used as an input in the solution of stochastic partial differential equations that describe the evolution of dependent variables. A sparse grid collocation strategy (Smolyak algorithm) is utilized to solve these stochastic equations efficiently. We showcase the methodology by constructing low
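    The manifold-learning step described here (neighbourhood graph, geodesic distances from graph arguments, isometric low-dimensional embedding) follows the familiar Isomap recipe. A minimal numpy sketch under that assumption, not the authors' actual implementation:

    ```python
    import numpy as np

    def isomap(X, k=6, d=2):
        # Isomap-style non-linear dimension reduction:
        # 1) k-nearest-neighbour graph on the input samples,
        # 2) geodesic distances via Floyd-Warshall shortest paths,
        # 3) classical MDS embedding into R^d.
        n = len(X)
        D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        G = np.full((n, n), np.inf)
        np.fill_diagonal(G, 0.0)
        for i in range(n):
            for j in np.argsort(D[i])[1:k + 1]:      # k nearest neighbours of i
                G[i, j] = G[j, i] = D[i, j]
        for m in range(n):                           # Floyd-Warshall relaxation
            G = np.minimum(G, G[:, m:m + 1] + G[m:m + 1, :])
        # Classical MDS on squared geodesic distances
        J = np.eye(n) - np.ones((n, n)) / n
        B = -0.5 * J @ (G ** 2) @ J
        w, V = np.linalg.eigh(B)
        idx = np.argsort(w)[::-1][:d]
        return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))
    ```

    The embedding dimension d plays the role of the reduced stochastic dimension fed into the sparse grid collocation solver; k must be large enough that the neighbourhood graph of the samples is connected.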

  11. A non-linear dimension reduction methodology for generating data-driven stochastic input models

    International Nuclear Information System (INIS)

    Ganapathysubramanian, Baskar; Zabaras, Nicholas

    2008-01-01

    Stochastic analysis of random heterogeneous media (polycrystalline materials, porous media, functionally graded materials) provides information of significance only if realistic input models of the topology and property variations are used. This paper proposes a framework to construct such input stochastic models for the topology and thermal diffusivity variations in heterogeneous media using a data-driven strategy. Given a set of microstructure realizations (input samples) generated from given statistical information about the medium topology, the framework constructs a reduced-order stochastic representation of the thermal diffusivity. This problem of constructing a low-dimensional stochastic representation of property variations is analogous to the problem of manifold learning and parametric fitting of hyper-surfaces encountered in image processing and psychology. Denote by M the set of microstructures that satisfy the given experimental statistics. A non-linear dimension reduction strategy is utilized to map M to a low-dimensional region, A. We first show that M is a compact manifold embedded in a high-dimensional input space R^n. An isometric mapping F from M to a low-dimensional, compact, connected set A ⊂ R^d (d ≪ n) is constructed. Given only a finite set of samples of the data, the methodology uses arguments from graph theory and differential geometry to construct the isometric transformation F:M→A. Asymptotic convergence of the representation of M by A is shown. This mapping F serves as an accurate, low-dimensional, data-driven representation of the property variations. The reduced-order model of the material topology and thermal diffusivity variations is subsequently used as an input in the solution of stochastic partial differential equations that describe the evolution of dependent variables. A sparse grid collocation strategy (Smolyak algorithm) is utilized to solve these stochastic equations efficiently. We showcase the methodology

  12. Software for modelling groundwater transport and contaminant migration

    International Nuclear Information System (INIS)

    Gishkelyuk, I.A.

    2008-01-01

    The capabilities of modern software for modelling groundwater transport and contaminant migration are considered, and the advantages of applying such software are discussed. A comparative analysis of the modelling software 'Groundwater modeling system' and the 'Earth Science Module' of 'COMSOL Multiphysics' is carried out. (authors)

  13. A model of the evolution of larval feeding rate in Drosophila driven by conflicting energy demands.

    Science.gov (United States)

    Mueller, Laurence D; Barter, Thomas T

    2015-02-01

    Energy allocation is believed to drive trade-offs in life history evolution. We develop a physiological and genetic model of energy allocation that drives evolution of feeding rate in a well-studied model system. In a variety of stressful environments Drosophila larvae adapt by altering their rate of feeding. Drosophila larvae adapted to high levels of ammonia, urea, and the presence of parasitoids evolve lower feeding rates. Larvae adapted to crowded conditions evolve higher feeding rates. Feeding rates should affect gross food intake, metabolic rates, and efficiency of food utilization. We develop a model of larval net energy intake as a function of feeding rates. We show that when there are toxic compounds in the larval food that require energy for detoxification, larvae can maximize their energy intake by slowing their feeding rates. While the reduction in feeding rates may increase development time and decrease competitive ability, we show that genotypes with lower feeding rates can be favored by natural selection if they have a sufficiently elevated viability in the toxic environment. This work shows how a simple phenotype, larval feeding rates, may be of central importance in adaptation to a wide variety of stressful environments via its role in energy allocation.
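    The energy-allocation argument can be made concrete with a toy quadratic budget (coefficients invented for illustration, not the authors' fitted model): net energy rises with feeding rate f but is discounted by metabolic and toxin-detoxification costs, so the optimal rate falls as toxin level rises.

    ```python
    def net_energy(f, toxin=0.0, assim=1.0, metab=0.3, detox=0.5):
        # Toy energy budget: intake scales with feeding rate f, metabolic
        # cost rises quadratically with f, and detoxification cost scales
        # with both intake and the toxin concentration in the food.
        return assim * f - metab * f ** 2 - detox * toxin * f

    def optimal_rate(toxin=0.0, assim=1.0, metab=0.3, detox=0.5):
        # Analytic maximiser of the quadratic net-energy function,
        # clipped at zero (feeding rates cannot be negative).
        return max(0.0, (assim - detox * toxin) / (2 * metab))
    ```

    This reproduces the qualitative prediction of the model: in ammonia- or urea-laced food (toxin > 0) the energy-maximising feeding rate drops below the toxin-free optimum, matching the evolved lower feeding rates reported above.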

  14. Model-based engineering for medical-device software.

    Science.gov (United States)

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. By using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for i) fast and efficient construction of executable device prototypes, ii) creation of a standard, reusable baseline software architecture for a particular device family, iii) formal verification of the design against safety requirements, and iv) creation of a safety framework that reduces verification costs for future versions of the device software.

  15. Software Maintenance and Evolution: The Implication for Software ...

    African Journals Online (AJOL)

    PROF. O. E. OSUAGWU

    2013-06-01

    Jun 1, 2013 … component of software development process [25]. It is the … technologies and stand the test of time. 2.0 Background of … costing and time span, and optimization of resource allocation have made long term estimation of …

  16. Software development processes and analysis software: a mismatch and a novel framework

    International Nuclear Information System (INIS)

    Kelly, D.; Harauz, J.

    2011-01-01

    This paper discusses the salient characteristics of analysis software and the impact of those characteristics on its development. From this discussion, it can be seen that mainstream software development processes, usually characterized as Plan Driven or Agile, are built upon assumptions that are mismatched to the development and maintenance of analysis software. We propose a novel software development framework that would match the process normally observed in the development of analysis software. In the discussion of this framework, we suggest areas of research and directions for future work. (author)

  17. Hydrogeological modelling as a tool for understanding rockslides evolution

    Science.gov (United States)

    Crosta, Giovanni B.; De Caro, Mattia; Frattini, Paolo; Volpi, Giorgio

    2015-04-01

    Several case studies of large rockslides have been presented in the literature showing dependence of displacement rate on seasonal and annual changes of external factors (e.g. rainfall, snowmelt, temperature oscillations) or on human actions (e.g. impounding of landslide toe by artificial lakes, toe excavation). The study of rockslide triggering can focus on either the initial failure or the successive reactivations driven by either meteo-climatic events or other perturbations (e.g. seismic, anthropic). A correlation between groundwater level oscillations and slope movements has been observed at many different sites and in very different materials and slope settings. This seasonal dynamic behavior generally shows a delay between perturbation (e.g., groundwater recharge and increase in water table level) and system reaction (e.g., increase in displacement rate). For this reason, groundwater modeling offers the means for assessing the oscillation of groundwater level which is a major input in rockslide and deep-seated gravitational slope deformation modelling, and that could explain both the initial failure event as well the successive reactivation or the continuous slow motion. Using a finite element software (FEFLOW, WASY GmbH) we developed 2D saturated/unsaturated and steady-state/transient groundwater flow models for two case studies for which a suitable dataset is available: the Vajont rockslide (from 1960 to October 9th 1963) and the Mt. de La Saxe rockslide (2009-2012, Aosta valley; Italian Western Alps). The transient models were implemented starting from hydraulic head distributions simulated in the previous steady-state models to investigate the groundwater fluctuation within the two chosen times interval (Vajont: 1960-1963 ; La Saxe: 2009-2012). Time series of infiltration resulting from precipitation, temperature, snowmelt data (La Saxe rockslide) and reservoir level (Vajont rockslide) were applied to the models. The assumptions made during the

  18. inventory management, VMI, software agents, MDV model

    Directory of Open Access Journals (Sweden)

    Waldemar Wieczerzycki

    2012-03-01

    Full Text Available Background: As is well known, the implementation of instruments of logistics management is only possible with the use of the latest information technology. So-called agent technology is one of the most promising solutions in this area. Its essence consists in an entirely new way of software distribution on the computer network platform, in which computers exchange among themselves not only data but also software modules, called agents. The first aim is to propose an alternative method for implementing the concept of inventory management by the supplier, with the use of intelligent software agents that are able not only to transfer information but also to make autonomous decisions based on the privileges given to them. The second aim of this research was to propose a new model of a software agent offering both high mobility and high intelligence. Methods: After a brief discussion of the nature of agent technology, the most important benefits of using it to build platforms to support business are given. Then the original model of a polymorphic software agent, called the Multi-Dimensionally Versioned Software Agent (MDV), is presented, which is oriented to the specificity of IT applications in business. The MDV agent is polymorphic, which allows transmitting through the network only the most relevant parts of its code, and only when necessary. Consequently, the network nodes exchange small amounts of software code, which ensures high mobility of software agents and thus highly efficient operation of IT platforms built on the proposed model. Next, the adaptation of MDV software agents to the implementation of a well-known logistics management instrument, VMI (Vendor Managed Inventory), is illustrated.
Results: The key benefits of this approach are identified, among which one can distinguish: reduced costs, higher flexibility and efficiency, new functionality - especially addressed to business negotiation, full automation

  19. Software Engineering Laboratory (SEL) cleanroom process model

    Science.gov (United States)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  20. Stochastic Differential Equation-Based Flexible Software Reliability Growth Model

    Directory of Open Access Journals (Sweden)

    P. K. Kapur

    2009-01-01

    Full Text Available Several software reliability growth models (SRGMs) have been developed by software developers for tracking and measuring the growth of reliability. As the size of a software system is large and the number of faults detected during the testing phase becomes large, the change in the number of faults that are detected and removed through each debugging becomes sufficiently small compared with the initial fault content at the beginning of the testing phase. In such a situation, we can model the software fault detection process as a stochastic process with a continuous state space. In this paper, we propose a new software reliability growth model based on an Itô-type stochastic differential equation. We consider an SDE-based generalized Erlang model with a logistic error detection function. The model is estimated and validated on real-life data sets cited in the literature to show its flexibility. The proposed model, integrated with the concept of the stochastic differential equation, performs comparatively better than the existing NHPP-based models.
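The abstract does not give the model's equations; as an illustrative sketch only, an Euler-Maruyama simulation of a generic Itô-type SRGM with multiplicative noise and a logistic detection-rate function (the functional form and all parameter values below are assumptions, not taken from the paper):

```python
import math
import random

def simulate_srgm(a=100.0, b=0.1, beta=10.0, sigma=0.05,
                  t_end=100.0, steps=10000, seed=42):
    """Euler-Maruyama simulation of a stochastic software reliability
    growth model of the generic Ito form
        dN(t) = b(t) * (a - N(t)) dt + sigma * (a - N(t)) dW(t),
    with a logistic fault-detection rate b(t) = b / (1 + beta*exp(-b*t)).
    Here a is the initial fault content and N(t) the cumulative number
    of faults detected. All parameter values are hypothetical."""
    rng = random.Random(seed)
    dt = t_end / steps
    n = 0.0
    for i in range(steps):
        t = i * dt
        rate = b / (1.0 + beta * math.exp(-b * t))
        dw = rng.gauss(0.0, math.sqrt(dt))   # Wiener increment
        n += rate * (a - n) * dt + sigma * (a - n) * dw
    return n

detected = simulate_srgm()  # approaches the fault content a as t grows
```

Because the noise is multiplicative in the remaining fault content (a - N), the simulated trajectory stays below the initial fault content, mirroring the continuous-state approximation described in the abstract.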

  1. Presenting an Evaluation Model for the Cancer Registry Software.

    Science.gov (United States)

    Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh

    2017-12-01

    As cancer is increasingly growing, the cancer registry is of great importance as the main core of cancer control programs, and many different software products have been designed for this purpose. Therefore, establishing a comprehensive evaluation model is essential for evaluating and comparing a wide range of such software. In this study, the criteria for cancer registry software were determined by studying the documents and two functional software products in this field. The evaluation tool was a checklist, and in order to validate the model, this checklist was presented to experts in the form of a questionnaire. To analyze the results of validation, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, when the model was approved, the final version of the evaluation model for cancer registry software was presented. The evaluation model of this study comprises both the tool and the method of evaluation. The evaluation tool is a checklist including the general and specific criteria of cancer registry software along with their sub-criteria. Based on the findings, a criteria-based evaluation method was chosen as the evaluation method of this study. The model of this study encompasses various dimensions of cancer registry software and a proper method for evaluating it. The strong point of this evaluation model is the separation between the general criteria and the specific ones, while trying to ensure the comprehensiveness of the criteria. Since this model has been validated, it can be used as a standard to evaluate cancer registry software.
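The 75% agreement threshold described above is a simple vote-counting step; it can be sketched as follows (criterion names and expert votes are invented for illustration):

```python
def agreement_ratio(votes):
    """Fraction of expert votes approving a criterion (True = approve)."""
    return sum(votes) / len(votes)

def review_checklist(checklist, threshold=0.75):
    """Split checklist criteria into approved vs. needs-revision using
    the agreed 75% threshold. Input maps criterion name to a list of
    boolean expert votes (an illustrative data structure, not the
    study's actual questionnaire format)."""
    approved, revise = [], []
    for criterion, votes in checklist.items():
        target = approved if agreement_ratio(votes) >= threshold else revise
        target.append(criterion)
    return approved, revise

# Hypothetical expert votes on three candidate criteria
approved, revise = review_checklist({
    "data export": [True, True, True, False],        # 75% -> approved
    "duplicate checking": [True, True, True, True],  # 100% -> approved
    "audit trail": [True, False, False, True],       # 50% -> revise
})
```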

  2. Architecture-Driven Integration of Modeling Languages for the Design of Software-Intensive Systems

    NARCIS (Netherlands)

    Dos Santos Soares, M.

    2010-01-01

    In the research that led to this thesis a multi-disciplinary approach, combining Traffic Engineering and Software Engineering, was used. Traffic engineers come up with new control strategies and algorithms for improving traffic. Once new solutions are defined from a Traffic Engineering point of

  3. COSINE software development based on code generation technology

    International Nuclear Information System (INIS)

    Ren Hao; Mo Wentao; Liu Shuo; Zhao Guang

    2013-01-01

    Code generation technology can significantly improve the quality and productivity of software development and reduce software development risk. At present, code generators are usually based on UML model-driven technology, which cannot satisfy the development demands of nuclear power calculation software. The features of scientific computing programs were analyzed, and a FORTRAN code generator (FCG) based on C# was developed in this paper. FCG can automatically generate FORTRAN code for module variable definitions according to input metadata. FCG can also generate memory allocation interfaces for dynamic variables as well as data access interfaces. FCG was applied to the development of the core and system integrated engine for design and analysis (COSINE) software. The results show that FCG can greatly improve the development efficiency of nuclear power calculation software and reduce the defect rate of software development. (authors)
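FCG itself is a C# tool and its metadata format is not described in the abstract; the metadata-to-FORTRAN idea can be sketched in Python roughly as follows (the tuple format, type names, and emitted layout are assumptions for illustration):

```python
def generate_fortran_module(name, variables):
    """Emit a FORTRAN module with variable declarations, using
    ALLOCATABLE deferred-shape arrays for dynamic variables, from a
    metadata list of (name, type, rank) tuples. A toy illustration of
    metadata-driven generation in the spirit of FCG, not the tool itself."""
    type_map = {"int": "INTEGER", "real": "REAL(8)"}
    lines = [f"MODULE {name}", "  IMPLICIT NONE"]
    for var, kind, rank in variables:
        ftype = type_map[kind]
        if rank == 0:
            lines.append(f"  {ftype} :: {var}")
        else:
            dims = ",".join([":"] * rank)  # e.g. (:,:) for rank 2
            lines.append(f"  {ftype}, ALLOCATABLE :: {var}({dims})")
    lines.append(f"END MODULE {name}")
    return "\n".join(lines)

# Hypothetical metadata for a reactor-core state module
code = generate_fortran_module("core_state", [
    ("n_nodes", "int", 0),
    ("power_density", "real", 2),
])
```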

  4. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

    This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while enabling stakeholders to assess its performance.
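The report's specific metrics are not quoted in this abstract; CV(RMSE) and NMBE are the standard prediction-quality metrics for baseline models (defined, e.g., in ASHRAE Guideline 14) and illustrate the "predictions, not algorithms" focus:

```python
import math

def cvrmse(actual, predicted):
    """Coefficient of variation of the RMSE, in percent of mean use."""
    n = len(actual)
    mean = sum(actual) / n
    rmse = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)
    return 100.0 * rmse / mean

def nmbe(actual, predicted):
    """Normalized mean bias error, in percent (positive means the
    model under-predicts on average)."""
    n = len(actual)
    mean = sum(actual) / n
    return 100.0 * sum(a - p for a, p in zip(actual, predicted)) / (n * mean)

# Hypothetical monthly energy use (kWh): measured vs. model prediction
measured  = [100.0, 110.0, 95.0, 105.0]
predicted = [102.0, 108.0, 97.0, 101.0]
```

Both metrics can be computed from paired measured/predicted series alone, so a vendor never has to expose its fitting algorithm.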

  5. Data-Driven Modeling of Complex Systems by means of a Dynamical ANN

    Science.gov (United States)

    Seleznev, A.; Mukhin, D.; Gavrilov, A.; Loskutov, E.; Feigin, A.

    2017-12-01

    Data-driven methods for the modeling and prognosis of complex dynamical systems are becoming more and more popular in various fields due to the growth of high-resolution data. We distinguish two basic steps in such an approach: (i) determining the phase subspace of the system, or embedding, from available time series and (ii) constructing an evolution operator acting in this reduced subspace. In this work we suggest a novel approach combining these two steps by means of the construction of an artificial neural network (ANN) with a special topology. The proposed ANN-based model, on the one hand, projects the data onto a low-dimensional manifold, and, on the other hand, models a dynamical system on this manifold. In effect, this is a recurrent multilayer ANN which has internal dynamics and is capable of generating time series. A very important point of the proposed methodology is the optimization of the model, allowing us to avoid overfitting: we use a Bayesian criterion to optimize the ANN structure and estimate both the degree of evolution operator nonlinearity and the complexity of the nonlinear manifold onto which the data are projected. The proposed modeling technique will be applied to the analysis of high-dimensional dynamical systems: the Lorenz'96 model of atmospheric turbulence, producing high-dimensional space-time chaos, and a quasi-geostrophic three-layer model of the Earth's atmosphere with natural orography, describing the dynamics of synoptic vortexes as well as mesoscale blocking systems. The possibility of applying the proposed methodology to analyze real measured data is also discussed. The study was supported by the Russian Science Foundation (grant #16-12-10198).
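Of the two test systems, the Lorenz'96 model is simple enough to sketch; a minimal fourth-order Runge-Kutta integrator follows (F = 8 and the step size are typical textbook values, not taken from the abstract):

```python
def lorenz96_step(x, forcing=8.0, dt=0.01):
    """One RK4 step of the Lorenz'96 model
        dX_i/dt = (X_{i+1} - X_{i-2}) * X_{i-1} - X_i + F,
    with cyclic indexing, a standard toy model of atmospheric
    turbulence. Python's negative indices handle the wrap-around."""
    n = len(x)

    def f(v):
        return [(v[(i + 1) % n] - v[i - 2]) * v[i - 1] - v[i] + forcing
                for i in range(n)]

    k1 = f(x)
    k2 = f([xi + dt / 2 * ki for xi, ki in zip(x, k1)])
    k3 = f([xi + dt / 2 * ki for xi, ki in zip(x, k2)])
    k4 = f([xi + dt * ki for xi, ki in zip(x, k3)])
    return [xi + dt / 6 * (a + 2 * b + 2 * c + d)
            for xi, a, b, c, d in zip(x, k1, k2, k3, k4)]

# Spin up a 40-variable system from a slightly perturbed rest state;
# with F = 8 the tiny perturbation grows into space-time chaos
state = [8.0] * 40
state[0] = 8.01
for _ in range(1000):
    state = lorenz96_step(state)
```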

  6. Linear mixed models a practical guide using statistical software

    CERN Document Server

    West, Brady T; Galecki, Andrzej T

    2006-01-01

    Simplifying the often confusing array of software programs for fitting linear mixed models (LMMs), Linear Mixed Models: A Practical Guide Using Statistical Software provides a basic introduction to primary concepts, notation, software implementation, model interpretation, and visualization of clustered and longitudinal data. This easy-to-navigate reference details the use of procedures for fitting LMMs in five popular statistical software packages: SAS, SPSS, Stata, R/S-plus, and HLM. The authors introduce basic theoretical concepts, present a heuristic approach to fitting LMMs based on bo

  7. 3DVEM Software Modules for Efficient Management of Point Clouds and Photorealistic 3d Models

    Science.gov (United States)

    Fabado, S.; Seguí, A. E.; Cabrelles, M.; Navarro, S.; García-De-San-Miguel, D.; Lerma, J. L.

    2013-07-01

    Cultural heritage managers in general and information users in particular are not usually accustomed to dealing with highly technological hardware and software. On the contrary, information providers of metric surveys are most of the time applying the latest developments to real-life conservation and restoration projects. This paper addresses the software issue of handling and managing either 3D point clouds or (photorealistic) 3D models, to bridge the gap between information users and information providers as regards the management of information which users and providers share as a tool for decision-making, analysis, visualization and management. There are not many viewers specifically designed to easily handle, manage and create animations of architectural and/or archaeological 3D objects, monuments and sites, among others. The 3DVEM - 3D Viewer, Editor & Meter software will be introduced to the scientific community, as well as 3DVEM - Live and 3DVEM - Register. The advantages of managing projects with both sets of data, 3D point clouds and photorealistic 3D models, will be introduced. Different visualizations of true documentation projects in the fields of architecture, archaeology and industry will be presented. Emphasis will be placed on highlighting the features of new user-friendly software to manage virtual projects. Furthermore, the ease of creating controlled interactive animations (both walk-through and fly-through) by the user, either on-the-fly or as a traditional movie file, will be demonstrated through 3DVEM - Live.

  8. Model-Driven Approach for Body Area Network Application Development

    Science.gov (United States)

    Venčkauskas, Algimantas; Štuikys, Vytautas; Jusas, Nerijus; Burbaitė, Renata

    2016-01-01

    This paper introduces the sensor-networked IoT model as a prototype to support the design of Body Area Network (BAN) applications for healthcare. Using the model, we analyze the synergistic effect of the functional requirements (data collection from the human body and transferring it to the top level) and non-functional requirements (trade-offs between energy-security-environmental factors, treated as Quality-of-Service (QoS)). We use feature models to represent the requirements at the earliest stage for the analysis and describe a model-driven methodology to design the possible BAN applications. Firstly, we specify the requirements as the problem domain (PD) variability model for the BAN applications. Next, we introduce the generative technology (meta-programming as the solution domain (SD)) and the mapping procedure to map the PD feature-based variability model onto the SD feature model. Finally, we create an executable meta-specification that represents the BAN functionality to describe the variability of the problem domain through transformations. The meta-specification (along with the meta-language processor) is a software generator for multiple BAN-oriented applications. We validate the methodology with experiments and a case study to generate a family of programs for the BAN sensor controllers. This enables obtaining an adequate measure of QoS efficiently through the interactive adjustment of the meta-parameter values and the re-generation process for the concrete BAN application. PMID:27187394

  9. Model-Driven Approach for Body Area Network Application Development

    Directory of Open Access Journals (Sweden)

    Algimantas Venčkauskas

    2016-05-01

    Full Text Available This paper introduces the sensor-networked IoT model as a prototype to support the design of Body Area Network (BAN) applications for healthcare. Using the model, we analyze the synergistic effect of the functional requirements (data collection from the human body and transferring it to the top level) and non-functional requirements (trade-offs between energy-security-environmental factors, treated as Quality-of-Service (QoS)). We use feature models to represent the requirements at the earliest stage for the analysis and describe a model-driven methodology to design the possible BAN applications. Firstly, we specify the requirements as the problem domain (PD) variability model for the BAN applications. Next, we introduce the generative technology (meta-programming as the solution domain (SD)) and the mapping procedure to map the PD feature-based variability model onto the SD feature model. Finally, we create an executable meta-specification that represents the BAN functionality to describe the variability of the problem domain through transformations. The meta-specification (along with the meta-language processor) is a software generator for multiple BAN-oriented applications. We validate the methodology with experiments and a case study to generate a family of programs for the BAN sensor controllers. This enables obtaining an adequate measure of QoS efficiently through the interactive adjustment of the meta-parameter values and the re-generation process for the concrete BAN application.
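The meta-programming machinery itself is not shown in the abstract; the PD-to-SD mapping step can be caricatured as a feature-selection-to-configuration generator (the feature names and the emitted header format are invented for illustration):

```python
# Toy problem-domain (PD) feature model: each feature lists its
# mutually exclusive alternatives (names are hypothetical)
pd_features = {
    "sensor": ["heart_rate", "temperature"],
    "radio_duty_cycle": ["low_power", "high_throughput"],
    "encryption": ["none", "aes128"],
}

def generate_controller(selection):
    """Validate a feature selection against the PD model and map it
    onto solution-domain (SD) artifacts -- here, a configuration
    header for one member of the BAN controller program family."""
    for feature, choice in selection.items():
        allowed = pd_features[feature]
        if choice not in allowed:
            raise ValueError(f"{feature}: {choice!r} not in {allowed}")
    return "\n".join(f"#define CFG_{f.upper()} {c.upper()}"
                     for f, c in selection.items())

# One concrete variant of the application family
header = generate_controller({"sensor": "heart_rate",
                              "radio_duty_cycle": "low_power",
                              "encryption": "aes128"})
```

Re-generating with a different selection (e.g. swapping the duty cycle) is the toy analogue of the interactive QoS adjustment described above.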

  10. A Monte Carlo model for 3D grain evolution during welding

    Science.gov (United States)

    Rodgers, Theron M.; Mitchell, John A.; Tikare, Veena

    2017-09-01

    Welding is one of the most widespread processes used in metal joining. However, there are currently no open-source software implementations for the simulation of microstructural evolution during a weld pass. Here we describe a Potts Monte Carlo based model implemented in the SPPARKS kinetic Monte Carlo computational framework. The model simulates melting, solidification and solid-state microstructural evolution of material in the fusion and heat-affected zones of a weld. The model does not simulate thermal behavior, but rather utilizes user input parameters to specify weld pool and heat-affected zone properties. Weld pool shapes are specified by Bézier curves, which allow for the specification of a wide range of pool shapes. Pool shapes can range from narrow and deep to wide and shallow, representing different fluid flow conditions within the pool. Surrounding temperature gradients are calculated with the aid of a closest point projection algorithm. The model also allows simulation of pulsed-power welding through time-dependent variation of the weld pool size. Example simulation results and comparisons with laboratory weld observations demonstrate microstructural variation with weld speed, pool shape, and pulsed power.
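SPPARKS' actual input format is not reproduced here, but the use of Bézier curves to parameterize pool shapes can be illustrated with a de Casteljau evaluation (the control-point coordinates below are hypothetical):

```python
def bezier_point(control_points, t):
    """Evaluate a Bezier curve of any degree at parameter t in [0, 1]
    using de Casteljau's algorithm (repeated linear interpolation of
    (x, y) control points)."""
    pts = [tuple(p) for p in control_points]
    while len(pts) > 1:
        pts = [((1 - t) * ax + t * bx, (1 - t) * ay + t * by)
               for (ax, ay), (bx, by) in zip(pts, pts[1:])]
    return pts[0]

# Hypothetical half-outline of a wide, shallow pool cross-section (mm),
# from the surface point (0, 0) down to the root (0, -2)
control = [(0.0, 0.0), (4.0, 0.0), (4.0, -1.0), (0.0, -2.0)]
outline = [bezier_point(control, t / 10) for t in range(11)]
```

Moving the two interior control points reshapes the pool from narrow-and-deep to wide-and-shallow without changing the sampling code.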

  11. Model-driven methodology for rapid deployment of smart spaces based on resource-oriented architectures.

    Science.gov (United States)

    Corredor, Iván; Bernardos, Ana M; Iglesias, Josué; Casar, José R

    2012-01-01

    Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as the Internet of Things (IoT) and the Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need a solid understanding of specific platforms and web technologies. To ease this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  12. Model-Driven Methodology for Rapid Deployment of Smart Spaces Based on Resource-Oriented Architectures

    Directory of Open Access Journals (Sweden)

    José R. Casar

    2012-07-01

    Full Text Available Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as the Internet of Things (IoT) and the Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need a solid understanding of specific platforms and web technologies. To ease this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  13. Landscape Evolution Modelling-LAPSUS

    Energy Technology Data Exchange (ETDEWEB)

    Baartman, J. E. M.; Temme, A. J. A. M.; Schoorl, J. M.; Claessens, L.; Viveen, W.; Gorp, W. van; Veldkamp, A.

    2009-07-01

    Landscape evolution modelling can make the consequences of landscape evolution hypotheses explicit and theoretically allows for their falsification and improvement. Ideally, landscape evolution models (LEMs) combine the results of all relevant landscape-forming processes into an ever-adapting digital landscape (e.g. a DEM). These processes may act on different spatial and temporal scales. LAPSUS is such a LEM. Processes that have been included in LAPSUS in different studies are water erosion and deposition, landslide activity, creep, solifluction, weathering, tectonics and tillage. Process descriptions are as simple and generic as possible, ensuring wide applicability. (Author) 25 refs.

  14. Landscape Evolution Modelling-LAPSUS

    International Nuclear Information System (INIS)

    Baartman, J. E. M.; Temme, A. J. A. M.; Schoorl, J. M.; Claessens, L.; Viveen, W.; Gorp, W. van; Veldkamp, A.

    2009-01-01

    Landscape evolution modelling can make the consequences of landscape evolution hypotheses explicit and theoretically allows for their falsification and improvement. Ideally, landscape evolution models (LEMs) combine the results of all relevant landscape-forming processes into an ever-adapting digital landscape (e.g. a DEM). These processes may act on different spatial and temporal scales. LAPSUS is such a LEM. Processes that have been included in LAPSUS in different studies are water erosion and deposition, landslide activity, creep, solifluction, weathering, tectonics and tillage. Process descriptions are as simple and generic as possible, ensuring wide applicability. (Author) 25 refs.
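LAPSUS itself is two-dimensional and multi-process; as a one-dimensional caricature of its capacity-limited water erosion and deposition idea (all parameter values, units, and the transport law's exact form are assumptions for illustration):

```python
def water_erosion_step(z, dx=1.0, rain=1.0, k=0.05, dt=1.0):
    """One toy timestep of capacity-limited water erosion/deposition on
    a 1D downslope profile (index 0 = divide, last cell = base level).
    Discharge q accumulates downslope, transport capacity is
    C = k * q * slope, and each cell erodes (or deposits) the difference
    between capacity and the sediment already in transport. Mass is
    conserved: whatever is eroded upslope is deposited further down."""
    z = list(z)
    q = 0.0   # water discharge accumulated downslope
    s = 0.0   # sediment load in transport (elevation-equivalent units)
    for i in range(len(z) - 1):
        q += rain
        slope = max((z[i] - z[i + 1]) / dx, 0.0)
        capacity = k * q * slope
        dz = (capacity - s) * dt    # > 0: erode this cell; < 0: deposit
        z[i] -= dz
        s += dz
    z[-1] += s                      # remaining load reaches base level
    return z

# A hypothetical linear hillslope: erosion upslope, deposition at the base
before = [10.0 - i for i in range(11)]
after = water_erosion_step(before)
```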

  15. The Evolution of Software Publication in Astronomy

    Science.gov (United States)

    Cantiello, Matteo

    2018-01-01

    Software is a fundamental component of the scientific research process. As astronomical discoveries increasingly rely on complex numerical calculations and the analysis of big data sets, publishing and documenting software is a fundamental step in ensuring transparency and reproducibility of results. I will briefly discuss the recent history of software publication and highlight the challenges and opportunities ahead.

  16. Evolution of an electron-positron plasma produced by induced gravitational collapse in binary-driven hypernovae

    Directory of Open Access Journals (Sweden)

    Melon Fuksman J. D.

    2018-01-01

    Full Text Available The binary-driven hypernova (BdHN) model has been introduced in past years to explain a subfamily of gamma-ray bursts (GRBs) with energies Eiso ≥ 10^52 erg associated with type Ic supernovae. Such BdHNe have as progenitor a tight binary system composed of a carbon-oxygen (CO) core and a neutron star undergoing an induced gravitational collapse to a black hole, triggered by the CO core explosion as a supernova (SN). This collapse produces an optically thick e+e- plasma, which expands and impacts onto the SN ejecta. This process is here considered as a candidate for the production of X-ray flares, which are frequently observed following the prompt emission of GRBs. In this work we follow the evolution of the e+e- plasma as it interacts with the SN ejecta, by solving the equations of relativistic hydrodynamics numerically. Our results are compatible with the Lorentz factors estimated for the sources that produce the flares, typically Γ ≲ 4.

  17. Linear mixed models a practical guide using statistical software

    CERN Document Server

    West, Brady T; Galecki, Andrzej T

    2014-01-01

    Highly recommended by JASA, Technometrics, and other journals, the first edition of this bestseller showed how to easily perform complex linear mixed model (LMM) analyses via a variety of software programs. Linear Mixed Models: A Practical Guide Using Statistical Software, Second Edition continues to lead readers step by step through the process of fitting LMMs. This second edition covers additional topics on the application of LMMs that are valuable for data analysts in all fields. It also updates the case studies using the latest versions of the software procedures and provides up-to-date information on the options and features of the software procedures available for fitting LMMs in SAS, SPSS, Stata, R/S-plus, and HLM.New to the Second Edition A new chapter on models with crossed random effects that uses a case study to illustrate software procedures capable of fitting these models Power analysis methods for longitudinal and clustered study designs, including software options for power analyses and suggest...

  18. SOFTWARE DESIGN MODELLING WITH FUNCTIONAL PETRI NETS

    African Journals Online (AJOL)

    Dr Obe

    the system, which can be described as a set of conditions. ... FPN Software prototype proposed for the conventional programming construct: if-then-else ... mathematical modeling tool allowing for ... methods and techniques of software design.

  19. Application software profiles 2010

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2010-04-15

    This article presented information on new software applications designed to facilitate petroleum exploration, drilling and production activities. Computer modelling and analysis enables oil and gas producers to characterize reservoirs, estimate reserves, forecast production, plan operations and manage assets. Seven Calgary-based organizations were highlighted along with their sophisticated software tools, the applications and the new features available in each product. The geoSCOUT version 7.7 by GeoLOGIC Systems Ltd. integrates public and proprietary data on wells, well logs, reserves, pipelines, production, ownership and seismic location data. The Value Navigator and AFE Navigator by Energy Navigator provide control over reserves, production and cash flow forecasting. FAST Harmony, FAST Evolution, FAST CBM, FAST FieldNotes, FAST Piper, FAST RTA, FAST VirtuWell and FAST WellTest by Fekete Associates Inc. provide reserve evaluations for reservoir engineering projects and production data analysis. The esi.manage software program by 3esi improves business results for upstream oil and gas companies through enhanced decision making and workforce effectiveness. WELLFLO, PIPEFLO, FORGAS, OLGA, Drillbench, and MEPO wellbore solutions by Neotec provide unique platforms for flow simulation to optimize oil and gas production systems. Petrel, ECLIPSE, Avocet, PipeSim and Merak software tools by Schlumberger Information Solutions are petroleum systems modelling tools for geologic mapping, visualization modelling and reservoir engineering. StudioSL by Streamsim Technologies Inc. is a modelling tool for optimizing flood management. figs.

  20. Collision-model approach to steering of an open driven qubit

    Science.gov (United States)

    Beyer, Konstantin; Luoma, Kimmo; Strunz, Walter T.

    2018-03-01

    We investigate quantum steering of an open quantum system by measurements on its environment in the framework of collision models. As an example we consider a coherently driven qubit dissipatively coupled to a bath. We construct local nonadaptive and adaptive as well as nonlocal measurement scenarios specifying explicitly the measured observable on the environment. Our approach shows transparently how the conditional evolution of the open system depends on the type of the measurement scenario and the measured observables. These can then be optimized for steering. The nonlocal measurement scenario leads to maximal violation of the used steering inequality at zero temperature. Further, we investigate the robustness of the constructed scenarios against thermal noise. We find generally that steering becomes harder at higher temperatures. Surprisingly, the system can be steered even when bipartite entanglement between the system and individual subenvironments vanishes.
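The steering analysis above conditions on environment measurement outcomes, which is beyond a short sketch; the underlying unconditional (measurement-averaged) collision-model evolution of a driven qubit — a drive step followed by a partial-swap collision with a fresh zero-temperature ancilla — can, however, be sketched with plain 2x2/4x4 matrix algebra (all parameter values are illustrative, not taken from the paper):

```python
import math

def mat_mul(a, b):
    """Multiply two square complex matrices given as nested lists."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def dagger(a):
    """Conjugate transpose."""
    n = len(a)
    return [[a[j][i].conjugate() for j in range(n)] for i in range(n)]

def kron(a, b):
    """Kronecker product (system factor first, ancilla second)."""
    na, nb = len(a), len(b)
    return [[a[i // nb][j // nb] * b[i % nb][j % nb]
             for j in range(na * nb)] for i in range(na * nb)]

def ptrace_ancilla(rho4):
    """Trace out the second qubit of a two-qubit density matrix."""
    return [[rho4[2 * i][2 * j] + rho4[2 * i + 1][2 * j + 1]
             for j in range(2)] for i in range(2)]

def collision_model(n_collisions=200, omega=1.0, dt=0.1, theta=0.3):
    """Each step applies the drive exp(-i*omega*dt*sigma_x/2) to the
    system qubit, then a partial-swap collision
    U = cos(theta)*I + i*sin(theta)*SWAP with a fresh ancilla in |0><0|
    (zero temperature), and finally traces the ancilla out."""
    c, s = math.cos(omega * dt / 2), math.sin(omega * dt / 2)
    drive = [[c, -1j * s], [-1j * s, c]]
    swap = [[1, 0, 0, 0], [0, 0, 1, 0], [0, 1, 0, 0], [0, 0, 0, 1]]
    ct, st = math.cos(theta), math.sin(theta)
    u = [[ct * (i == j) + 1j * st * swap[i][j] for j in range(4)]
         for i in range(4)]
    tau = [[1, 0], [0, 0]]   # ancilla state |0><0|
    rho = [[1, 0], [0, 0]]   # system starts in the ground state
    for _ in range(n_collisions):
        rho = mat_mul(mat_mul(drive, rho), dagger(drive))
        joint = mat_mul(mat_mul(u, kron(rho, tau)), dagger(u))
        rho = ptrace_ancilla(joint)
    return rho

rho = collision_model()  # 2x2 density matrix after 200 collisions
```

Conditioning on an ancilla measurement (instead of tracing it out) is where the steering scenarios of the paper would enter.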

  1. The Arizona Universities Library Consortium patron-driven e-book model

    Directory of Open Access Journals (Sweden)

    Jeanne Richardson

    2013-03-01

    Full Text Available Building on Arizona State University's patron-driven acquisitions (PDA) initiative in 2009, the Arizona Universities Library Consortium, in partnership with the Ingram Content Group, created a cooperative patron-driven model to acquire electronic books (e-books). The model provides the opportunity for faculty and students at the universities governed by the Arizona Board of Regents (ABOR) to access a core of e-books made accessible through resource discovery services and online catalogs. These books are available for significantly less than a single ABOR university would expend for the same materials. The patron-driven model described is one of many evolving models in digital scholarship, and, although the Arizona Universities Library Consortium reports a successful experience, patron-driven models pose questions to stakeholders in the academic publishing industry.

  2. Patterns, principles, and practices of domain-driven design

    CERN Document Server

    Millett, Scott

    2015-01-01

    Methods for managing complex software construction following the practices, principles and patterns of Domain-Driven Design with code examples in C# This book presents the philosophy of Domain-Driven Design (DDD) in a down-to-earth and practical manner for experienced developers building applications for complex domains. A focus is placed on the principles and practices of decomposing a complex problem space as well as the implementation patterns and best practices for shaping a maintainable solution space. You will learn how to build effective domain models through the use of tactical pat

  3. Towards a New Paradigm of Software Development: an Ambassador Driven Process in Distributed Software Companies

    Science.gov (United States)

    Kumlander, Deniss

    The globalization of companies' operations and the competition between software vendors demand improving the quality of delivered software and decreasing the overall cost. At the same time, these factors introduce many problems into the software development process, as they produce distributed organizations that break the co-location rule of modern software development methodologies. Here we propose a reformulation of the ambassador position, increasing its productivity in order to bridge the communication and workflow gap by managing the entire communication process rather than concentrating purely on the communication result.

  4. Long-term landscape evolution of the Poços de Caldas Plateau revealed by thermokinematic numerical modeling using the software code Pecube, SE- Brazil

    Science.gov (United States)

    Doranti Tiritan, Carolina; Hackspacher, Peter C.; Glasmacher, Ulrich A.

    2014-05-01

    The Poços de Caldas Plateau is located in southeastern Brazil and is characterized by high-relief topography supported by the pre-Cambrian crystalline rocks and by the Poços de Caldas Alkaline Massif (PCAM). Ulbrich et al. (2002) determined that the ages of the predominant PCAM intermediate rocks are constrained to ~83 Ma. In addition, geologic observations indicate that the phonolites, tinguaites and nepheline syenites were emplaced in a continuous and rapid sequence lasting 1 to 2 Ma. The topography is characterized by a dissected plateau with irregular topographic ridges and peaks with elevations between 900 and 1300 m (a.s.l.) on the metamorphic basement and from 1300 to 1700 m (a.s.l.) on the PCAM region. The aim of this work was therefore to quantify the main processes responsible for the evolution of the landscape, using methods such as low-temperature thermochronology and 3D thermokinematic modeling, in order to obtain uplift and erosion rates and to correlate them with the thermal gradients of the region. The 3D thermokinematic modeling was performed using the software code PECUBE (Braun 2003).

  5. Simulating mesoscale coastal evolution for decadal coastal management: A new framework integrating multiple, complementary modelling approaches

    Science.gov (United States)

    van Maanen, Barend; Nicholls, Robert J.; French, Jon R.; Barkwith, Andrew; Bonaldo, Davide; Burningham, Helene; Brad Murray, A.; Payo, Andres; Sutherland, James; Thornhill, Gillian; Townend, Ian H.; van der Wegen, Mick; Walkden, Mike J. A.

    2016-03-01

    Coastal and shoreline management increasingly needs to consider morphological change occurring at decadal to centennial timescales, especially that related to climate change and sea-level rise. This requires the development of morphological models operating at a mesoscale, defined by time and length scales of the order of 10¹ to 10² years and 10¹ to 10² km. So-called 'reduced complexity' models that represent critical processes at scales not much smaller than the primary scale of interest, and are regulated by capturing the critical feedbacks that govern landform behaviour, are proving effective as a means of exploring emergent coastal behaviour at a landscape scale. Such models tend to be computationally efficient and are thus easily applied within a probabilistic framework. At the same time, reductionist models, built upon a more detailed description of hydrodynamic and sediment transport processes, are capable of application at increasingly broad spatial and temporal scales. More qualitative modelling approaches are also emerging that can guide the development and deployment of quantitative models, and these can be supplemented by varied data-driven modelling approaches that can achieve new explanatory insights from observational datasets. Such disparate approaches have hitherto been pursued largely in isolation by mutually exclusive modelling communities. Brought together, they have the potential to facilitate a step change in our ability to simulate the evolution of coastal morphology at scales that are most relevant to managing erosion and flood risk. Here, we advocate and outline a new integrated modelling framework that deploys coupled mesoscale reduced complexity models, reductionist coastal area models, data-driven approaches, and qualitative conceptual models. Integration of these heterogeneous approaches gives rise to model compositions that can potentially resolve decadal- to centennial-scale behaviour of diverse coupled open coast, estuary and inner

  6. Software reliability growth models with normal failure time distributions

    International Nuclear Information System (INIS)

    Okamura, Hiroyuki; Dohi, Tadashi; Osaki, Shunji

    2013-01-01

    This paper proposes software reliability growth models (SRGM) in which the software failure time follows a normal distribution. The proposed model is mathematically tractable and has sufficient ability to fit the software failure data. In particular, we consider the parameter estimation algorithm for the SRGM with normal distribution. The developed algorithm is based on an EM (expectation-maximization) algorithm and is quite simple to implement as a software application. Numerical experiments investigate the fitting ability of the SRGMs with normal distribution through 16 types of failure time data collected in real software projects
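As a rough illustration (not the authors' EM algorithm), the mean value function of such a model, m(t) = ω·Φ((t − μ)/σ), can be fitted to cumulative failure counts. The least-squares grid search and the synthetic data below are assumptions made for this sketch:

```python
import math
import numpy as np

def norm_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def mean_value(t, omega, mu, sigma):
    # Expected cumulative failures by time t: m(t) = omega * Phi((t - mu) / sigma)
    return omega * np.array([norm_cdf((ti - mu) / sigma) for ti in t])

# Synthetic (illustrative) cumulative failure-count data
true_omega, true_mu, true_sigma = 100.0, 50.0, 15.0
t = np.linspace(1.0, 100.0, 40)
rng = np.random.default_rng(0)
y = mean_value(t, true_omega, true_mu, true_sigma) + rng.normal(0.0, 1.0, t.size)

# Grid search over (mu, sigma); for each pair the optimal omega is closed-form
best = None
for mu in np.arange(40.0, 60.5, 0.5):
    for sigma in np.arange(10.0, 20.5, 0.5):
        phi = mean_value(t, 1.0, mu, sigma)
        omega = float(phi @ y / (phi @ phi))
        sse = float(((y - omega * phi) ** 2).sum())
        if best is None or sse < best[0]:
            best = (sse, omega, mu, sigma)
_, omega_hat, mu_hat, sigma_hat = best
```

An EM algorithm, as in the paper, would additionally handle the unobserved (not-yet-detected) faults explicitly; the grid search here only conveys the shape of the estimation problem.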

  7. Comparison of Problem Solving from Engineering Design to Software Design

    DEFF Research Database (Denmark)

    Ahmed-Kristensen, Saeema; Babar, Muhammad Ali

    2012-01-01

    Observational studies of engineering design activities can inform the research community on the problem solving models that are employed by professional engineers. Design is defined as an ill-defined problem which includes both engineering design and software design, hence understanding problem solving models from other design domains is of interest to the engineering design community. For this paper an observational study of two software design sessions performed for the workshop on “Studying Professional Software Design” is compared to analysis from engineering design. These findings provide useful insights of how software designers move from a problem domain to a solution domain and the commonalities between software designers’ and engineering designers’ design activities. The software designers were found to move quickly to a detailed design phase, employ co-evolution and adopt...

  9. An improved COCOMO software cost estimation model | Duke ...

    African Journals Online (AJOL)

    In this paper, we discuss the methodologies adopted previously in software cost estimation using the COnstructive COst MOdels (COCOMOs). From our analysis, COCOMOs produce very high software development efforts, which eventually produce high software development costs. Consequently, we propose its extension, ...
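For orientation, the basic COCOMO relationship is effort = a·(KLOC)^b person-months. The sketch below uses Boehm's published basic-model coefficients; it is not the extension proposed in the paper:

```python
# Basic COCOMO effort estimate: effort = a * (KLOC ** b) person-months.
# Coefficients are the standard basic-model constants (Boehm, 1981).
COEFFS = {
    "organic":      (2.4, 1.05),
    "semidetached": (3.0, 1.12),
    "embedded":     (3.6, 1.20),
}

def cocomo_effort(kloc, mode="organic"):
    a, b = COEFFS[mode]
    return a * kloc ** b

effort = cocomo_effort(10.0, "organic")   # ~26.9 person-months
```

The paper's criticism, that COCOMO produces very high effort figures, can be seen in how quickly the embedded-mode exponent (1.20) inflates estimates for large KLOC values.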

  10. Byte evolution: software transforming oilpatch operations

    International Nuclear Information System (INIS)

    Roche, P.

    2000-01-01

    Changes in the nature of computer software for tracking exploration and production companies' assets are discussed. One prediction is that 'industry-specific' software will replace the common electronic spreadsheet, while another foresees business-to-business electronic transactions, and outsourcing of software purchasing and maintenance to 'application service providers' (ASPs). To date, at least two companies have launched their own ASPs; if the trend continues, clients will pay just one monthly fee to the ASP, which will assume the headaches and hassles of software installations, upgrades and maintenance. That would spell the end of in-house networks and information technology people on staff. It is also suggested that in due course business-to-business e-commerce will far exceed in importance the consumer-oriented e-commerce of today. Procurement is a commonly cited example where the electronic exchange of funds and data could replace scores of manual processes. The idea is to simplify business processes through automatic routing among companies via the Internet, with ASPs serving as the central hub of the information flow. Experiences, current products and services, and future plans of the two existing ASP companies -- Applied Terravision Systems Inc. and QByte Services Ltd. -- are reviewed

  11. Transformation of UML Behavioral Diagrams to Support Software Model Checking

    Directory of Open Access Journals (Sweden)

    Luciana Brasil Rebelo dos Santos

    2014-04-01

    Unified Modeling Language (UML) is currently accepted as the standard for modeling (object-oriented) software, and its use is increasing in the aerospace industry. Verification and Validation of complex software developed according to UML is not trivial due to complexity of the software itself, and the several different UML models/diagrams that can be used to model behavior and structure of the software. This paper presents an approach to transform up to three different UML behavioral diagrams (sequence, behavioral state machines, and activity) into a single Transition System to support Model Checking of software developed in accordance with UML. In our approach, properties are formalized based on use case descriptions. The transformation is done for the NuSMV model checker, but we see the possibility of using other model checkers, such as SPIN. The main contribution of our work is the transformation of a non-formal language (UML) to a formal language (the language of the NuSMV model checker), towards a greater adoption in practice of formal methods in software development.
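A minimal sketch of the idea of emitting NuSMV input from a behavioral model, assuming a toy state-machine description rather than the paper's actual transformation rules:

```python
# Hedged sketch: emit a NuSMV module from a toy state-machine description.
# The state machine and its naming are illustrative, not the paper's rules.
def to_nusmv(name, states, init, transitions):
    # One enumerated variable holds the current state; transitions become
    # branches of a NuSMV case expression.
    lines = [f"MODULE {name}", "VAR",
             f"  state : {{{', '.join(states)}}};",
             "ASSIGN",
             f"  init(state) := {init};",
             "  next(state) := case"]
    for src, dst in transitions:
        lines.append(f"    state = {src} : {dst};")
    lines.append("    TRUE : state;")  # default: stay in the current state
    lines.append("  esac;")
    return "\n".join(lines)

model = to_nusmv("main", ["idle", "busy", "done"], "idle",
                 [("idle", "busy"), ("busy", "done")])
```

A real transformation must additionally merge the three diagram types into one Transition System and attach the properties derived from use case descriptions; this fragment only shows the target syntax.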

  12. Unit testing, model validation, and biological simulation.

    Science.gov (United States)

    Sarma, Gopal P; Jacobs, Travis W; Watts, Mark D; Ghayoomie, S Vahid; Larson, Stephen D; Gerkin, Richard C

    2016-01-01

    The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software which expresses scientific models.
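A hedged illustration of the distinction the article draws: a unit test checks software behavior, while a model validation test checks a model's prediction against measurement. The model function, observed value, and tolerance below are invented for the example, not taken from OpenWorm:

```python
import unittest

def resting_potential_mv():
    # Toy model prediction (illustrative value), in millivolts
    return -64.8

class ModelTests(unittest.TestCase):
    def test_unit_behavior(self):
        # Unit test: verifies software behavior only
        self.assertIsInstance(resting_potential_mv(), float)

    def test_model_validation(self):
        # Model validation test: compares the prediction to a
        # hypothetical experimental measurement within a tolerance
        observed_mv = -65.0
        self.assertAlmostEqual(resting_potential_mv(), observed_mv, delta=2.0)

suite = unittest.TestLoader().loadTestsFromTestCase(ModelTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The second test can fail even when the code is bug-free, which is exactly what makes it a scientific check rather than a software check.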

  13. A simple model for binary star evolution

    International Nuclear Information System (INIS)

    Whyte, C.A.; Eggleton, P.P.

    1985-01-01

    A simple model for calculating the evolution of binary stars is presented. Detailed stellar evolution calculations of stars undergoing mass and energy transfer at various rates are reported and used to identify the dominant physical processes which determine the type of evolution. These detailed calculations are used to calibrate the simple model and a comparison of calculations using the detailed stellar evolution equations and the simple model is made. Results of the evolution of a few binary systems are reported and compared with previously published calculations using normal stellar evolution programs. (author)

  14. Model-based version management system framework

    International Nuclear Information System (INIS)

    Mehmood, W.

    2016-01-01

    In this paper we present a model-based version management system. A Version Management System (VMS), a branch of software configuration management (SCM), aims to provide a controlling mechanism for the evolution of software artifacts created during the software development process. Controlling the evolution requires many activities, such as construction and creation of versions, identification of differences between versions, conflict detection and merging. Traditional VMSs are file-based and consider software systems as a set of text files. File-based VMSs are not adequate for performing software configuration management activities, such as version control, on software artifacts produced in earlier phases of the software life cycle. New challenges of model differencing, merging, and evolution control arise when using models as the central artifact. The goal of this work is to present a generic model-based VMS framework which can be used to overcome the problems of traditional file-based VMSs and provide model versioning services. (author)
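A minimal sketch of model-level differencing, one of the VMS activities listed above, assuming models are represented as simple element-to-properties maps (an illustrative format, not the framework's):

```python
# Hedged sketch of model differencing: a model version is a mapping from
# element identifiers to their properties. Names here are illustrative.
def diff_models(old, new):
    added   = {k: new[k] for k in new.keys() - old.keys()}
    removed = {k: old[k] for k in old.keys() - new.keys()}
    changed = {k: (old[k], new[k])
               for k in old.keys() & new.keys() if old[k] != new[k]}
    return added, removed, changed

v1 = {"ClassA": {"attrs": ["x"]}, "ClassB": {"attrs": []}}
v2 = {"ClassA": {"attrs": ["x", "y"]}, "ClassC": {"attrs": []}}
added, removed, changed = diff_models(v1, v2)
```

Unlike a textual diff, this operates on model elements, so a rename or attribute change surfaces as a structured `changed` entry rather than as unrelated line edits.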

  15. Models of the atomic nucleus. With interactive software

    International Nuclear Information System (INIS)

    Cook, N.D.

    2006-01-01

    This book-and-CD-software package supplies users with an interactive experience for nuclear visualization via a computer-graphical interface, similar in principle to the molecular visualizations already available in chemistry. Models of the Atomic Nucleus, a largely non-technical introduction to nuclear theory, explains the nucleus in a way that makes nuclear physics as comprehensible as chemistry or cell biology. The book/software supplements virtually any of the current textbooks in nuclear physics by providing a means for 3D visual display of the diverse models of nuclear structure. For the first time, easy-to-master software for scientific visualization of the nucleus makes this notoriously 'non-visual' field immediately visible. After a review of the basics, the book explores and compares the competing models, and addresses how the lattice model best resolves remaining controversies. The appendix explains how to obtain the most from the software provided on the accompanying CD. (orig.)

  16. Development of a methodology for assessing the safety of embedded software systems

    Science.gov (United States)

    Garrett, C. J.; Guarro, S. B.; Apostolakis, G. E.

    1993-01-01

    A Dynamic Flowgraph Methodology (DFM) based on an integrated approach to modeling and analyzing the behavior of software-driven embedded systems for assessing and verifying reliability and safety is discussed. DFM is based on an extension of the Logic Flowgraph Methodology to incorporate state transition models. System models which express the logic of the system in terms of causal relationships between physical variables and temporal characteristics of software modules are analyzed to determine how a certain state can be reached. This is done by developing timed fault trees which take the form of logical combinations of static trees relating the system parameters at different points in time. The resulting information concerning the hardware and software states can be used to eliminate unsafe execution paths and identify testing criteria for safety critical software functions.

  17. Microstructure evolution during homogenization of Al–Mn–Fe–Si alloys: Modeling and experimental results

    International Nuclear Information System (INIS)

    Du, Q.; Poole, W.J.; Wells, M.A.; Parson, N.C.

    2013-01-01

    Microstructure evolution during the homogenization heat treatment of Al–Mn–Fe–Si, or AA3xxx, alloys has been investigated using a combination of modeling and experimental studies. The model is fully coupled to CALculation PHAse Diagram (CALPHAD) software and has explicitly taken into account the two different length scales for diffusion encountered in modeling the homogenization process. The model is able to predict the evolution of all the important microstructural features during homogenization, including the inhomogeneous spatial distribution of dispersoids and alloying elements in solution, the dispersoid number density and the size distribution, and the type and fraction of intergranular constituent particles. Experiments were conducted using four direct chill (DC) cast AA3xxx alloys subjected to various homogenization treatments. The resulting microstructures were then characterized using a range of characterization techniques, including optical and electron microscopy, electron micro probe analysis, field emission gun scanning electron microscopy, and electrical resistivity measurements. The model predictions have been compared with the experimental measurements to validate the model. Further, it has been demonstrated that the validated model is able to predict the effects of alloying elements (e.g. Si and Mn) on microstructure evolution. It is concluded that the model provides a time and cost effective tool in optimizing and designing industrial AA3xxx alloy chemistries and homogenization heat treatments

  18. Modelling microstructural evolution under irradiation

    International Nuclear Information System (INIS)

    Tikare, V.

    2015-01-01

    Microstructural evolution of materials under irradiation is characterised by some unique features that are not typically present in other application environments. While much understanding has been achieved by experimental studies, the ability to model this microstructural evolution for complex materials states and environmental conditions not only enhances understanding, it also enables prediction of materials behaviour under conditions that are difficult to duplicate experimentally. Furthermore, reliable models enable designing materials for improved engineering performance for their respective applications. Thus, development and application of mesoscale microstructural model are important for advancing nuclear materials technologies. In this chapter, the application of the Potts model to nuclear materials will be reviewed and demonstrated, as an example of microstructural evolution processes. (author)
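As a sketch of the method class (not the chapter's nuclear-materials model), a Potts-model Monte Carlo step flips lattice sites so as to reduce grain-boundary energy; the lattice size, number of spin states `q`, and step count below are illustrative:

```python
import math
import random

def potts_step(grid, q, T, rng):
    # One attempted Metropolis spin flip on a periodic 2-D lattice
    n = len(grid)
    i, j = rng.randrange(n), rng.randrange(n)
    new_spin = rng.randrange(q)

    def site_energy(spin):
        # Number of unlike nearest neighbors (grain-boundary bonds)
        nbrs = (grid[(i - 1) % n][j], grid[(i + 1) % n][j],
                grid[i][(j - 1) % n], grid[i][(j + 1) % n])
        return sum(1 for s in nbrs if s != spin)

    dE = site_energy(new_spin) - site_energy(grid[i][j])
    if dE <= 0 or (T > 0 and rng.random() < math.exp(-dE / T)):
        grid[i][j] = new_spin

def total_energy(grid):
    # Total count of unlike-neighbor bonds (periodic boundaries)
    n = len(grid)
    return sum((grid[i][j] != grid[(i + 1) % n][j]) +
               (grid[i][j] != grid[i][(j + 1) % n])
               for i in range(n) for j in range(n))

rng = random.Random(0)
n, q = 16, 4
grid = [[rng.randrange(q) for _ in range(n)] for _ in range(n)]
e_start = total_energy(grid)
for _ in range(5000):
    potts_step(grid, q, T=0.0, rng=rng)
e_end = total_energy(grid)   # grain coarsening lowers boundary energy
```

At T = 0 every accepted flip is non-increasing in energy, so the random initial microstructure coarsens toward larger grains, the basic behavior the Potts model captures.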

  19. ROSMOD: A Toolsuite for Modeling, Generating, Deploying, and Managing Distributed Real-time Component-based Software using ROS

    Directory of Open Access Journals (Sweden)

    Pranav Srinivas Kumar

    2016-09-01

    This paper presents the Robot Operating System Model-driven development tool suite (ROSMOD), an integrated development environment for rapid prototyping of component-based software for the Robot Operating System (ROS) middleware. ROSMOD is well suited for the design, development and deployment of large-scale distributed applications on embedded devices. We present the various features of ROSMOD including the modeling language, the graphical user interface, code generators, and deployment infrastructure. We demonstrate the utility of this tool with a real-world case study: an Autonomous Ground Support Equipment (AGSE) robot that was designed and prototyped using ROSMOD for the NASA Student Launch competition, 2014–2015.

  20. A mechano-biological model of multi-tissue evolution in bone

    Science.gov (United States)

    Frame, Jamie; Rohan, Pierre-Yves; Corté, Laurent; Allena, Rachele

    2017-12-01

    Successfully simulating tissue evolution in bone is of significant importance in predicting various biological processes such as bone remodeling, fracture healing and osseointegration of implants. Each of these processes involves in different ways the permanent or transient formation of different tissue types, namely bone, cartilage and fibrous tissues. The tissue evolution in specific circumstances such as bone remodeling and fracture healing can currently be modeled. Nevertheless, it remains challenging to predict which tissue types and organization can develop without any a priori assumptions. In particular, the role of mechano-biological coupling in this selective tissue evolution has not been clearly elucidated. In this work, a multi-tissue model has been created which simultaneously describes the evolution of bone, cartilage and fibrous tissues. The coupling of the biological and mechanical factors involved in tissue formation has been modeled by defining two different tissue states: an immature state corresponding to the early stages of tissue growth and representing cell clusters in a weakly neo-formed Extra Cellular Matrix (ECM), and a mature state corresponding to well-formed connective tissues. This has allowed for the cellular processes of migration, proliferation and apoptosis to be described simultaneously with the changing ECM properties through strain driven diffusion, growth, maturation and resorption terms. A series of finite element simulations were carried out on idealized cantilever bending geometries. Starting from a tissue composition replicating a mid-diaphysis section of a long bone, a steady-state tissue formation was reached over a statically loaded period of 10,000 h (60 weeks). The results demonstrated that bone formation occurred in regions which are optimally physiologically strained.
In two additional 1000 h bending simulations both cartilaginous and fibrous tissues were shown to form under specific geometrical and loading

  1. Software Quality Assessment Tool Based on Meta-Models

    OpenAIRE

    Doneva Rositsa; Gaftandzhieva Silvia; Doneva Zhelyana; Staevsky Nevena

    2015-01-01

    In the software industry it is indisputably essential to control the quality of produced software systems in terms of capabilities for easy maintenance, reuse, portability and others in order to ensure reliability in the software development. But it is also clear that it is very difficult to achieve such a control through a ‘manual’ management of quality. There are a number of approaches for software quality assurance based typically on software quality models (e.g. ISO 9126, McCall’s, Boehm’s...

  2. Intelligent Data-Driven Reverse Engineering of Software Design Patterns

    OpenAIRE

    Alhusain, Sultan

    2016-01-01

    Recognising implemented instances of Design Patterns (DPs) in software design discloses and recovers a wealth of information about the intention of the original designers and the rationale for their design decisions. Because it is often the case that the documentation available for software systems, if any, is poor and/or obsolete, recovering such information can be of great help and importance for maintenance tasks. However, since DPs are abstractly and vaguely defined, a set of software cla...

  3. Process model for building quality software on internet time ...

    African Journals Online (AJOL)

    The competitive nature of the software construction market and the inherently exhilarating nature of software itself have hinged the success of any software development project on four major pillars: time to market, product quality, innovation and documentation. Unfortunately, however, existing software development models ...

  4. A physically based 3-D model of ice cliff evolution over debris-covered glaciers

    Science.gov (United States)

    Buri, Pascal; Miles, Evan S.; Steiner, Jakob F.; Immerzeel, Walter W.; Wagnon, Patrick; Pellicciotti, Francesca

    2016-12-01

    We use high-resolution digital elevation models (DEMs) from unmanned aerial vehicle (UAV) surveys to document the evolution of four ice cliffs on the debris-covered tongue of Lirung Glacier, Nepal, over one ablation season. Observations show that out of four cliffs, three different patterns of evolution emerge: (i) reclining cliffs that flatten during the ablation season; (ii) stable cliffs that maintain a self-similar geometry; and (iii) growing cliffs, expanding laterally. We use the insights from this unique data set to develop a 3-D model of cliff backwasting and evolution that is validated against observations and an independent data set of volume losses. The model includes ablation at the cliff surface driven by energy exchange with the atmosphere, reburial of cliff cells by surrounding debris, and the effect of adjacent ponds. The cliff geometry is updated monthly to account for the modifications induced by each of those processes. Model results indicate that a major factor affecting the survival of steep cliffs is the coupling with ponded water at its base, which prevents progressive flattening and possible disappearance of a cliff. The radial growth observed at one cliff is explained by higher receipts of longwave and shortwave radiation, calculated taking into account atmospheric fluxes, shading, and the emission of longwave radiation from debris surfaces. The model is a clear step forward compared to existing static approaches that calculate atmospheric melt over an invariant cliff geometry and can be used for long-term simulations of cliff evolution and to test existing hypotheses about cliffs' survival.

  5. Early experiences building a software quality prediction model

    Science.gov (United States)

    Agresti, W. W.; Evanco, W. M.; Smith, M. C.

    1990-01-01

    Early experiences building a software quality prediction model are discussed. The overall research objective is to establish a capability to project a software system's quality from an analysis of its design. The technical approach is to build multivariate models for estimating reliability and maintainability. Data from 21 Ada subsystems were analyzed to test hypotheses about various design structures leading to failure-prone or unmaintainable systems. Current design variables highlight the interconnectivity and visibility of compilation units. Other model variables provide for the effects of reusability and software changes. Reported results are preliminary because additional project data is being obtained and new hypotheses are being developed and tested. Current multivariate regression models are encouraging, explaining 60 to 80 percent of the variation in error density of the subsystems.
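A hedged sketch of this kind of multivariate model: ordinary least squares relating design metrics to error density, with R² computed as the explained fraction of variance. The metric names and synthetic data are assumptions, not the study's dataset:

```python
# Hedged sketch: least-squares model of error density from design metrics.
# Metric names and the synthetic dataset are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
n = 21                                   # e.g. 21 subsystems, as in the study
fan_out    = rng.uniform(1.0, 20.0, n)   # interconnectivity (illustrative)
visibility = rng.uniform(0.0, 1.0, n)    # visibility of compilation units
error_density = 0.3 * fan_out + 4.0 * visibility + rng.normal(0.0, 1.0, n)

X = np.column_stack([np.ones(n), fan_out, visibility])
beta, *_ = np.linalg.lstsq(X, error_density, rcond=None)
pred = X @ beta

ss_res = float(((error_density - pred) ** 2).sum())
ss_tot = float(((error_density - error_density.mean()) ** 2).sum())
r_squared = 1.0 - ss_res / ss_tot        # fraction of variance explained
```

The "60 to 80 percent of the variation" reported above corresponds to an R² of 0.6 to 0.8 in exactly this sense.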

  6. CFD modeling of space-time evolution of fast pyrolysis products in a bench-scale fluidized-bed reactor

    International Nuclear Information System (INIS)

    Boateng, A.A.; Mtui, P.L.

    2012-01-01

    A model for the evolution of pyrolysis products in a fluidized bed has been developed. In this study the unsteady constitutive transport equations for inert gas flow and decomposition kinetics were modeled using the commercial computational fluid dynamics (CFD) software FLUENT-12. The Eulerian-Eulerian multiphase model system described herein is a fluidized bed of sand externally heated to a predetermined temperature prior to introduction of agricultural biomass. We predict the spontaneous emergence of pyrolysis vapors, char and non-condensable (permanent) gases and confirm the observation that the kinetics are fast and that bio-oil vapor evolution is accomplished in a few seconds, and occupying two-thirds of the spatial volume of the reactor as widely reported in the open literature. The model could be advantageous in the virtual design of fast pyrolysis reactors and their optimization to meet economic scales required for distributed or satellite units. - Highlights: ► We model the evolution of pyrolysis products in a fluidized bed via CFD. ► We predict the spontaneous emergence of pyrolysis products. ► We confirm the experimental observation that the kinetics are fast. ► And that bio-oil vapor evolution is accomplished in a few seconds. ► The model is advantageous in the virtual design of fast pyrolysis reactors.

  7. A software quality model and metrics for risk assessment

    Science.gov (United States)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the basis for a discussion of risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.

  8. 4th International Conference in Software Engineering for Defence Applications

    CERN Document Server

    Sillitti, Alberto; Succi, Giancarlo; Messina, Angelo

    2016-01-01

    This book presents high-quality original contributions on new software engineering models, approaches, methods, and tools and their evaluation in the context of defence and security applications. In addition, important business and economic aspects are discussed, with a particular focus on cost/benefit analysis, new business models, organizational evolution, and business intelligence systems. The contents are based on presentations delivered at SEDA 2015, the 4th International Conference in Software Engineering for Defence Applications, which was held in Rome, Italy, in May 2015. This conference series represents a targeted response to the growing need for research that reports and debates the practical implications of software engineering within the defence environment and also for software performance evaluation in real settings through controlled experiments as well as case and field studies. The book will appeal to all with an interest in modeling, managing, and implementing defence-related software devel...

  9. A Model-Driven Development Method for Management Information Systems

    Science.gov (United States)

    Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki

    Traditionally, a Management Information System (MIS) has been developed without using formal methods. With these informal methods, the MIS is developed over its lifecycle without any models. This causes many problems, such as a lack of reliability in system design specifications. In order to overcome these problems, a model theory approach was proposed. The approach is based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, there is a model-driven development method that can flexibly correspond to changes of business logics or implementing technologies. In the model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems applying the model-driven development method to a component of the model theory approach. The experiment showed an effort reduction of more than 30%.

  10. Software design space exploration for exascale combustion co-design

    Energy Technology Data Exchange (ETDEWEB)

    Chan, Cy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Unat, Didem [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lijewski, Michael [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Zhang, Weiqun [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bell, John [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Shalf, John [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2013-09-26

    The design of hardware for next-generation exascale computing systems will require a deep understanding of how software optimizations impact hardware design trade-offs. In order to characterize how co-tuning hardware and software parameters affects the performance of combustion simulation codes, we created ExaSAT, a compiler-driven static analysis and performance modeling framework. Our framework can evaluate hundreds of hardware/software configurations in seconds, providing an essential speed advantage over simulators and dynamic analysis techniques during the co-design process. Our analytic performance model shows that advanced code transformations, such as cache blocking and loop fusion, can have a significant impact on choices for cache and memory architecture. Our modeling helped us identify tuned configurations that achieve a 90% reduction in memory traffic, which could significantly improve performance and reduce energy consumption. These techniques will also be useful for the development of advanced programming models and runtimes, which must reason about these optimizations to deliver better performance and energy efficiency.
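As a toy version of such an analytic model (not ExaSAT itself), DRAM traffic for a stencil sweep can be estimated with and without cache blocking; the reuse assumptions and problem sizes below are invented:

```python
# Toy analytic memory-traffic model for a 3-D 7-point stencil sweep.
# Assumptions (illustrative): 8-byte doubles; without blocking, z-planes
# fall out of cache so each input point is re-fetched for 3 planes;
# with cache blocking, each point is read once and written once.
BYTES_PER_DOUBLE = 8

def traffic_unblocked(nx, ny, nz):
    reads  = 3 * nx * ny * nz * BYTES_PER_DOUBLE   # repeated plane fetches
    writes = 1 * nx * ny * nz * BYTES_PER_DOUBLE
    return reads + writes

def traffic_blocked(nx, ny, nz):
    # Perfect reuse inside a cache block: one read + one write per point
    return 2 * nx * ny * nz * BYTES_PER_DOUBLE

n = 256
reduction = 1.0 - traffic_blocked(n, n, n) / traffic_unblocked(n, n, n)
```

Even this crude model shows why co-tuning matters: the payoff of a larger cache depends on whether the software applies blocking, and vice versa. The 90% traffic reduction reported above comes from combining several such transformations, not blocking alone.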

  11. CEREF: A hybrid data-driven model for forecasting annual streamflow from a socio-hydrological system

    Science.gov (United States)

    Zhang, Hongbo; Singh, Vijay P.; Wang, Bin; Yu, Yinghao

    2016-09-01

    -based forecasting, and results showed that removing the high-frequency component is an effective measure to improve forecasting precision; its use with the CEREF model is suggested for better performance. Finally, the study concluded that the CEREF model can forecast non-stationary annual streamflow change, viewed as a co-evolution of hydrologic and social systems, with better accuracy, and that the removal of the high-frequency component further improves the model's performance. It should be noted that the CEREF model is beneficial for data-driven hydrologic forecasting in complex socio-hydrologic systems and, as a simple data-driven socio-hydrologic forecasting model, deserves more attention.

  12. Model-Driven Development of Context-Aware Services

    NARCIS (Netherlands)

    Andrade Almeida, João; Iacob, Maria Eugenia; Jonkers, Henk; Quartel, Dick; Eliassen, Frank; Montresor, Alberto

    2006-01-01

    In this paper, we define a model-driven design trajectory for context-aware services consisting of three levels of models with different degrees of abstraction and platform independence. The models at the highest level of platform independence describe the behaviour of a context-aware service and

  13. Model for Simulating a Spiral Software-Development Process

    Science.gov (United States)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code
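
    The kind of discrete-event iteration loop such a model runs can be sketched as follows (pure Python; the phase names, durations, and the plus-or-minus 20% noise are illustrative assumptions, not calibrated PATT inputs):

```python
import heapq
import random

# Minimal discrete-event sketch of a spiral process: each iteration
# schedules risk assessment through evaluation in sequence, with noisy
# phase durations. Events are processed in time order via a heap.

PHASES = ["risk", "requirements", "design", "coding", "testing",
          "evaluation"]

def simulate_spiral(iterations, durations, seed=0):
    rng = random.Random(seed)
    events = []          # (completion_time, sequence_no, label)
    clock, order = 0.0, 0
    for it in range(iterations):
        for phase in PHASES:
            clock += durations[phase] * rng.uniform(0.8, 1.2)
            heapq.heappush(events, (clock, order, f"iter{it}:{phase}"))
            order += 1
    log = []
    while events:
        t, _, label = heapq.heappop(events)
        log.append((round(t, 2), label))
    return log

log = simulate_spiral(3, {p: 10.0 for p in PHASES})
```

    A real PATT-style model would add resource contention, defect injection, and rework; the point of the sketch is only the event-queue structure that lets each iteration re-enter the requirements phase.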

  14. Software process improvement in the NASA software engineering laboratory

    Science.gov (United States)

    Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon; Basili, Victor; Zelkowitz, Marvin

    1994-01-01

    The Software Engineering Laboratory (SEL) was established in 1976 for the purpose of studying and measuring software processes with the intent of identifying improvements that could be applied to the production of ground support software within the Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA)/Goddard Space Flight Center (GSFC). The SEL has three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation (CSC). The concept of process improvement within the SEL focuses on the continual understanding of both process and product as well as goal-driven experimentation and analysis of process change within a production environment.

  15. Software to model AXAF-I image quality

    Science.gov (United States)

    Ahmad, Anees; Feng, Chen

    1995-01-01

    A modular user-friendly computer program for the modeling of grazing-incidence type x-ray optical systems has been developed. This comprehensive computer software GRAZTRACE covers the manipulation of input data, ray tracing with reflectivity and surface deformation effects, convolution with x-ray source shape, and x-ray scattering. The program also includes the capabilities for image analysis, detector scan modeling, and graphical presentation of the results. A number of utilities have been developed to interface the predicted Advanced X-ray Astrophysics Facility-Imaging (AXAF-I) mirror structural and thermal distortions with the ray-trace. This software is written in FORTRAN 77 and runs on a SUN/SPARC station. An interactive command mode version and a batch mode version of the software have been developed.

  16. Neutron star evolutions using tabulated equations of state with a new execution model

    Science.gov (United States)

    Anderson, Matthew; Kaiser, Hartmut; Neilsen, David; Sterling, Thomas

    2012-03-01

    The addition of nuclear and neutrino physics to general relativistic fluid codes allows for a more realistic description of hot nuclear matter in neutron star and black hole systems. This additional microphysics requires that each processor have access to large tables of data, such as equations of state, and in large simulations the memory required to store these tables locally can become excessive unless an alternative execution model is used. In this talk we present neutron star evolution results obtained using a message driven multi-threaded execution model known as ParalleX as an alternative to using a hybrid MPI-OpenMP approach. ParalleX provides the user with a new model of computation based on message-driven flow control coordinated by lightweight synchronization elements, which improves scalability and simplifies code development. We present the spectrum of radial pulsation frequencies for a neutron star with the Shen equation of state using the ParalleX execution model. We present performance results for an open source, distributed, nonblocking ParalleX-based tabulated equation of state component capable of handling tables that may even be too large to read into the memory of a single node.
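
    The message-driven idea can be sketched with Python futures (an illustration of data-triggered tasks, not ParalleX itself; the toy equation-of-state table and all function names stand in for the distributed table component and are assumptions):

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch of message-driven flow control: each zone update is a
# lightweight task triggered by its input, and the equation-of-state
# lookup plays the role of a message to a remote table component.

EOS_TABLE = {i: 0.15 * i for i in range(1, 100)}  # key: temp in tenths

def eos_lookup(temperature):
    """Stand-in for a (possibly remote) tabulated EOS query."""
    return EOS_TABLE[round(temperature * 10)]

def update_zone(temperature):
    pressure = eos_lookup(temperature)   # "message" to the table
    return pressure * 2.0                # toy local physics update

with ThreadPoolExecutor(max_workers=4) as pool:
    # tasks run as their inputs arrive; results come back in order
    results = list(pool.map(update_zone, [0.5, 1.0, 2.5]))
```

    The contrast with hybrid MPI-OpenMP is that no bulk-synchronous phase is needed: each lookup-update pair can proceed as soon as its own data is ready.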

  17. Development and application of new quality model for software projects.

    Science.gov (United States)

    Karnavel, K; Dillibabu, R

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects.
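
    The abstract does not spell out the STDCM equations, so as a generic illustration of residual-defect estimation, the classic Lincoln-Petersen capture-recapture estimator can be sketched (an illustrative stand-in for the idea, not the STDCM itself; all numbers are invented):

```python
def lincoln_petersen(found_a, found_b, found_both):
    """Estimate total defects from two independent inspections.

    found_a / found_b: defects found by each team;
    found_both: defects found by both. Standard capture-recapture
    estimator, used here only to illustrate residual-defect estimation.
    """
    if found_both == 0:
        raise ValueError("need at least one defect found by both teams")
    return found_a * found_b / found_both

total = lincoln_petersen(25, 20, 10)   # estimated defects in total
residual = total - (25 + 20 - 10)      # estimated defects not yet found
```

    Any residual-defect model of this shape lets a project decide when further testing stops paying off, which is the decision the STDCM is built to support.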

  18. High Performance Electrical Modeling and Simulation Software Normal Environment Verification and Validation Plan, Version 1.0; TOPICAL

    International Nuclear Information System (INIS)

    WIX, STEVEN D.; BOGDAN, CAROLYN W.; MARCHIONDO JR., JULIO P.; DEVENEY, MICHAEL F.; NUNEZ, ALBERT V.

    2002-01-01

    The requirements in modeling and simulation are driven by two fundamental changes in the nuclear weapons landscape: (1) The Comprehensive Test Ban Treaty and (2) The Stockpile Life Extension Program which extends weapon lifetimes well beyond their originally anticipated field lifetimes. The move from confidence based on nuclear testing to confidence based on predictive simulation forces a profound change in the performance asked of codes. The scope of this document is to improve the confidence in the computational results by demonstration and documentation of the predictive capability of electrical circuit codes and the underlying conceptual, mathematical and numerical models as applied to a specific stockpile driver. This document describes the High Performance Electrical Modeling and Simulation software normal environment Verification and Validation Plan

  19. Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model

    International Nuclear Information System (INIS)

    J. J. Jacobson; D. E. Shropshire; W. B. West

    2005-01-01

    The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including costs estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a "living document" that will be modified over the course of the execution of this work

  20. Bending of Euler-Bernoulli nanobeams based on the strain-driven and stress-driven nonlocal integral models: a numerical approach

    Science.gov (United States)

    Oskouie, M. Faraji; Ansari, R.; Rouhi, H.

    2018-04-01

    Eringen's nonlocal elasticity theory is extensively employed for the analysis of nanostructures because it is able to capture nanoscale effects. Previous studies have revealed that using the differential form of the strain-driven version of this theory leads to paradoxical results in some cases, such as bending analysis of cantilevers, and recourse must be made to the integral version. In this article, a novel numerical approach is developed for the bending analysis of Euler-Bernoulli nanobeams in the context of strain- and stress-driven integral nonlocal models. This numerical approach is proposed for the direct solution of the integral formulation, bypassing the difficulties related to converting the integral governing equation into a differential equation. First, the governing equation is derived based on both strain-driven and stress-driven nonlocal models by means of the minimum total potential energy. Also, in each case, the governing equation is obtained in both strong and weak forms. To solve the derived equations numerically, matrix differential and integral operators are constructed based upon the finite difference technique and the trapezoidal integration rule. It is shown that the proposed numerical approach can be efficiently applied to the strain-driven nonlocal model with the aim of resolving the mentioned paradoxes. It is also able to solve the problem based on the strain-driven model without the inconsistencies in applying this model that are reported in the literature.
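
    The matrix-operator construction can be sketched as follows (a pure-Python illustration of a trapezoidal-rule integral operator on a uniform grid; the paper couples such operators with finite-difference matrices, and the grid and test function here are illustrative assumptions):

```python
# Build a lower-triangular matrix T so that (T f) approximates the
# running integral of f on a uniform grid by the trapezoidal rule.

def trapezoid_operator(n, h):
    """n x n operator mapping nodal values f_j to int_0^{x_i} f dx."""
    T = [[0.0] * n for _ in range(n)]
    for i in range(1, n):
        for j in range(i + 1):
            # endpoints of each row get weight h/2, interior nodes h
            T[i][j] = h / 2 if j in (0, i) else h
    return T

def apply(mat, vec):
    return [sum(m * v for m, v in zip(row, vec)) for row in mat]

n, h = 101, 0.01
xs = [i * h for i in range(n)]
f = [2 * x for x in xs]                   # integral of 2x is x^2
F = apply(trapezoid_operator(n, h), f)    # F[i] approximates xs[i]**2
```

    Once integral terms are expressed as matrix products like this, the nonlocal governing equation becomes a linear algebraic system that can be solved directly, which is the point of the direct numerical approach.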

  1. A computational systems biology software platform for multiscale modeling and simulation: Integrating whole-body physiology, disease biology, and molecular reaction networks

    Directory of Open Access Journals (Sweden)

    Thomas eEissing

    2011-02-01

    Today, in silico studies and trial simulations already complement experimental approaches in pharmaceutical R&D and have become indispensable tools for decision making and communication with regulatory agencies. While biology is multi-scale by nature, project work and software tools usually focus on isolated aspects of drug action, such as pharmacokinetics at the organism scale or pharmacodynamic interaction on the molecular level. We present a modeling and simulation software platform consisting of PK-Sim® and MoBi® capable of building and simulating models that integrate across biological scales. A prototypical multiscale model for the progression of a pancreatic tumor and its response to pharmacotherapy is constructed and virtual patients are treated with a prodrug activated by hepatic metabolization. Tumor growth is driven by signal transduction leading to cell cycle transition and proliferation. Free tumor concentrations of the active metabolite inhibit Raf kinase in the signaling cascade and thereby cell cycle progression. In a virtual clinical study, the individual therapeutic outcome of the chemotherapeutic intervention is simulated for a large population with heterogeneous genomic background. Thereby, the platform allows efficient model building and integration of biological knowledge and prior data from all biological scales. Experimental in vitro model systems can be linked with observations in animal experiments and clinical trials. The interplay between patients, diseases, and drugs and topics with high clinical relevance such as the role of pharmacogenomics, drug-drug or drug-metabolite interactions can be addressed using this mechanistic, insight-driven multiscale modeling approach.

  2. Test-Driven Development as an Innovation Value Chain

    Directory of Open Access Journals (Sweden)

    Ana Paula Ress

    2013-04-01

    For all companies that consider their Information Technology department to be of strategic value, it is important to incorporate an innovation value chain into the software development lifecycle to enhance their teams' performance. One such model is TDD (Test-Driven Development), a process that detects failures early and improves the productivity and quality of the team's work. Data were collected from a financial company with 3,500 employees to demonstrate that software projects requiring more than 4,000 hours of development benefit from TDD when a clear knowledge conversion step occurs between the client and the developers.
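
    TDD's test-first loop can be sketched with `unittest` (the fee function and its tests are invented for illustration and are not from the study's codebase; in practice the tests below would be written first, fail, and then drive the implementation):

```python
import unittest

# TDD illustration: the tests pin down the expected behaviour; the
# implementation is then written (or fixed) until they pass.

def settlement_fee(amount, rate=0.015, minimum=1.0):
    """Hypothetical fee on a transfer: a percentage with a floor."""
    return max(amount * rate, minimum)

class TestSettlementFee(unittest.TestCase):
    def test_percentage_applies_above_floor(self):
        self.assertAlmostEqual(settlement_fee(1000.0), 15.0)

    def test_floor_applies_to_small_amounts(self):
        self.assertEqual(settlement_fee(10.0), 1.0)

suite = unittest.TestLoader().loadTestsFromTestCase(TestSettlementFee)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

    The "failure detection" benefit the study measures comes from exactly this loop: a regression in `settlement_fee` is caught by the suite before the code reaches the client.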

  3. Software Design Modelling with Functional Petri Nets | Bakpo ...

    African Journals Online (AJOL)

    Software Design Modelling with Functional Petri Nets. ... of structured programs and an FPN software prototype proposed for the conventional programming construct: the if-then-else statement. ...

  4. Graph Based Verification of Software Evolution Requirements

    NARCIS (Netherlands)

    Ciraci, S.

    2009-01-01

    Due to market demands and changes in the environment, software systems have to evolve. However, the size and complexity of the current software systems make it time consuming to incorporate changes. During our collaboration with the industry, we observed that the developers spend much time on the

  5. Developing Project Duration Models in Software Engineering

    Institute of Scientific and Technical Information of China (English)

    Pierre Bourque; Serge Oligny; Alain Abran; Bertrand Fournier

    2007-01-01

    Based on the empirical analysis of data contained in the International Software Benchmarking Standards Group (ISBSG) repository, this paper presents software engineering project duration models based on project effort. Duration models are built for the entire dataset and for subsets of projects developed for personal computer, mid-range and mainframe platforms. Duration models are also constructed for projects requiring fewer than 400 person-hours of effort and for projects requiring more than 400 person-hours of effort. The usefulness of adding the maximum number of assigned resources as a second independent variable to explain duration is also analyzed. The opportunity to build duration models directly from project functional size in function points is investigated as well.
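
    The shape of such duration-from-effort models can be sketched with an ordinary least-squares fit in log-log space (the data points, coefficient, and exponent below are synthetic illustrations, not ISBSG records):

```python
import math

# Fit log(duration) = a + b*log(effort) by closed-form least squares,
# the usual form for effort/duration power-law models.

def fit_loglog(efforts, durations):
    xs = [math.log(e) for e in efforts]
    ys = [math.log(d) for d in durations]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def predict(a, b, effort):
    return math.exp(a) * effort ** b

# synthetic data following duration = 2 * effort**0.4 exactly
efforts = [100, 400, 1000, 4000]
durations = [2 * e ** 0.4 for e in efforts]
a, b = fit_loglog(efforts, durations)
```

    Splitting the dataset by platform or by the 400 person-hour threshold, as the paper does, simply means fitting `a` and `b` separately on each subset.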

  6. A distributed snow-evolution modeling system (SnowModel)

    Science.gov (United States)

    Glen E. Liston; Kelly. Elder

    2006-01-01

    SnowModel is a spatially distributed snow-evolution modeling system designed for application in landscapes, climates, and conditions where snow occurs. It is an aggregation of four submodels: MicroMet defines meteorological forcing conditions, EnBal calculates surface energy exchanges, SnowPack simulates snow depth and water-equivalent evolution, and SnowTran-3D...

  7. Data-driven modeling of nano-nose gas sensor arrays

    DEFF Research Database (Denmark)

    Alstrøm, Tommy Sonne; Larsen, Jan; Nielsen, Claus Højgård

    2010-01-01

    We present a data-driven approach to classification of Quartz Crystal Microbalance (QCM) sensor data. The sensor is a nano-nose gas sensor that detects concentrations of analytes down to ppm levels using plasma polymerized coatings. Each sensor experiment takes approximately one hour, hence the number of available training data is limited. We suggest a data-driven classification model which works from few examples. The paper compares a number of data-driven classification and quantification schemes able to detect the gas and the concentration level. The data-driven approaches are based on state...

  9. Development of evaluation method for software hazard identification techniques

    International Nuclear Information System (INIS)

    Huang, H. W.; Chen, M. H.; Shih, C.; Yih, S.; Kuo, C. T.; Wang, L. H.; Yu, Y. C.; Chen, C. W.

    2006-01-01

    This research evaluated the software hazard identification techniques applicable nowadays, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flow-graph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. With this proposed method, analysts can evaluate various software hazard identification combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (with transfer rate) combination is not competitive due to the difficulty of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for understanding the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are completeness, complexity, and implementation cost. This evaluation method can be a platform for reaching common consensus among the stakeholders. As software hazard identification techniques evolve, the evaluation results could change. However, the insight into software hazard identification techniques is much more important than the numbers obtained by the evaluation. (authors)
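
    The Markov chain ingredient can be sketched as follows (the states, transition probabilities, and absorbing "failed" state are illustrative assumptions, not values from the study):

```python
# Propagate state probabilities of a software component through a
# per-step transition matrix, the basic operation behind Markov chain
# modeling of software hazards.

STATES = ["ok", "degraded", "failed"]
P = [  # rows sum to 1; "failed" is absorbing
    [0.95, 0.04, 0.01],
    [0.00, 0.90, 0.10],
    [0.00, 0.00, 1.00],
]

def step(dist, P):
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0, 0.0]   # start fully operational
for _ in range(10):
    dist = step(dist, P)  # dist[2] is the 10-step failure probability
```

    The "dilemma" the abstract mentions is visible here: the whole computation hinges on transition rates that are hard to justify for software, even though the structure itself is informative.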

  10. Change impact analysis for software product lines

    Directory of Open Access Journals (Sweden)

    Jihen Maâzoun

    2016-10-01

    A software product line (SPL) represents a family of products in a given application domain. Each SPL is constructed to provide for the derivation of new products by covering a wide range of features in its domain. Nevertheless, over time, some domain features may become obsolete with the appearance of new features while others may become refined. Accordingly, the SPL must be maintained to account for the domain evolution. Such evolution requires a means for managing the impact of changes on the SPL models, including the feature model and design. This paper presents an automated method that analyzes feature model evolution, traces its impact on the SPL design, and offers a set of recommendations to ensure the consistency of both models. The proposed method defines a set of new metrics adapted to SPL evolution to identify the effort needed to maintain the SPL models consistently and with a quality as good as the original models. The method and its tool are illustrated through an example of an SPL in the Text Editing domain. In addition, they are experimentally evaluated in terms of both the quality of the maintained SPL models and the precision of the impact change management.

  11. An architectural model for software reliability quantification: sources of data

    International Nuclear Information System (INIS)

    Smidts, C.; Sova, D.

    1999-01-01

    Software reliability assessment models in use today treat software as a monolithic block. An aversion towards 'atomic' models seems to exist. These models appear to add complexity to the modeling and the data collection, and seem intrinsically difficult to generalize. In 1997, we introduced an architecturally based software reliability model called FASRE. The model is based on an architecture derived from the requirements, which captures both functional and nonfunctional requirements, and on a generic classification of functions, attributes and failure modes. The model focuses on evaluation of failure mode probabilities and uses a Bayesian quantification framework. Failure mode probabilities of functions and attributes are propagated to the system level using fault trees. It can incorporate any type of prior information, such as results of developers' testing or historical information on a specific functionality and its attributes, and is ideally suited for reusable software. By building an architecture and deriving its potential failure modes, the model forces early appraisal and understanding of the weaknesses of the software, allows reliability analysis of the structure of the system, and provides assessments at the functional level as well as at the system level. In order to quantify the probability of failure (or the probability of success) of a specific element of our architecture, data are needed. The term element of the architecture is used here in its broadest sense to mean a single failure mode or a higher level of abstraction such as a function. The paper surveys the potential sources of software reliability data available during software development. Next, the mechanisms for incorporating these sources of relevant data into the FASRE model are identified
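
    The fault-tree propagation step can be sketched as follows (AND/OR gate algebra for independent basic events; the gate structure and probabilities are illustrative assumptions, not the FASRE architecture):

```python
# Propagate failure-mode probabilities to the system level through a
# small fault tree, assuming independent basic events.

def and_gate(*probs):
    """All inputs must fail (product of probabilities)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """Any input failing suffices (complement of none failing)."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# hypothetical system: fails if a parser fails, or if both redundant
# checkers fail together
p_system = or_gate(0.001, and_gate(0.01, 0.02))
```

    In a Bayesian setting like FASRE's, the leaf probabilities would be posterior estimates rather than fixed numbers, but the propagation through the tree is the same.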

  12. Management models in the NZ software industry

    Directory of Open Access Journals (Sweden)

    Holger Spill

    This research interviewed eight innovative New Zealand software companies to find out how they manage new product development. It looked at how management used standard techniques of software development to manage product uncertainty through the theoretical lens of the Cyclic Innovation Model. The study found that while there is considerable variation, the management of innovation was largely determined by the level of complexity. Organizations with complex innovative software products had a more iterative software development style, more flexible internal processes and swifter decision-making. Organizations with less complexity in their products tended to use more formal structured approaches. Overall complexity could be inferred with reference to four key factors within the development environment.

  13. Ten recommendations for software engineering in research.

    Science.gov (United States)

    Hastings, Janna; Haug, Kenneth; Steinbeck, Christoph

    2014-01-01

    Research in the context of data-driven science requires a backbone of well-written software, but scientific researchers are typically not trained at length in software engineering, the principles for creating better software products. To address this gap, in particular for young researchers new to programming, we give ten recommendations to ensure the usability, sustainability and practicality of research software.

  14. Open Source Software Success Model for Iran: End-User Satisfaction Viewpoint

    Directory of Open Access Journals (Sweden)

    Ali Niknafs

    2012-03-01

    Open source software development is a notable option for software companies, and in recent years its many advantages have driven a move toward open source in Iran. National security concerns, international restrictions, and the cost of software and services have further intensified the importance of this kind of software. Users and their viewpoints are the critical success factor in software plans, but no appropriate success model exists for the open source case in Iran. This research developed a model for measuring open source software success in Iran. The model was tested with data gathered from open source users through an online survey. The results showed that the components with a positive effect on open source success were user satisfaction, the quality of open source community services, open source quality, copyright, and security.

  15. Development of an Environment for Software Reliability Model Selection

    Science.gov (United States)

    1992-09-01

    ... now is directed to other related problems such as tools for model selection, multiversion programming, and software fault tolerance modeling. ... Hardware can be repaired by spare modules, which is not the case for software. ... Preventive maintenance is very important ...

  16. Extracting software static defect models using data mining

    Directory of Open Access Journals (Sweden)

    Ahmed H. Yousef

    2015-03-01

    Large software projects are subject to quality risks of having defective modules that will cause failures during software execution. Several software repositories contain the source code of large projects that are composed of many modules. These software repositories include data for the software metrics of these modules and the defective state of each module. In this paper, a data mining approach is used to show the attributes that predict the defective state of software modules. A software solution architecture is proposed to convert the extracted knowledge into data mining models that can be integrated with the current software project metrics and bug data in order to enhance the prediction. The results show better prediction capabilities when all the algorithms are combined using weighted votes. When only one individual algorithm is used, the Naïve Bayes algorithm has the best results, followed by the Neural Network and Decision Tree algorithms.
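
    The weighted-vote combination can be sketched as follows (the classifier weights and per-module predictions are invented for illustration; the study's actual weights are not given in the abstract):

```python
# Combine per-module defect predictions from several classifiers with
# weighted votes: a module is flagged defective when the weighted
# score reaches half of the total weight.

def weighted_vote(predictions, weights):
    """predictions: {name: 1 for 'defective', 0 for 'clean'}."""
    score = sum(weights[name] * label
                for name, label in predictions.items())
    total = sum(weights.values())
    return 1 if score >= total / 2 else 0

weights = {"naive_bayes": 0.5, "neural_net": 0.3, "decision_tree": 0.2}
module = {"naive_bayes": 1, "neural_net": 0, "decision_tree": 1}
label = weighted_vote(module, weights)   # 0.7 of the vote: defective
```

    Giving the strongest single predictor (Naïve Bayes in the study) the largest weight is the natural choice, and the vote then lets the weaker models overrule it only when they agree.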

  17. Software cost/resource modeling: Software quality tradeoff measurement

    Science.gov (United States)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  18. Software Assurance Competency Model

    Science.gov (United States)

    2013-03-01

    COTS) software, and software as a service (SaaS). L2: Define and analyze risks in the acquisition of contracted software, COTS software, and SaaS. ... 2010a]: Application of technologies and processes to achieve a required level of confidence that software systems and services function in the ...

  19. The software-cycle model for re-engineering and reuse

    Science.gov (United States)

    Bailey, John W.; Basili, Victor R.

    1992-01-01

    This paper reports on the progress of a study which will contribute to our ability to perform high-level, component-based programming by describing means to obtain useful components, methods for the configuration and integration of those components, and an underlying economic model of the costs and benefits associated with this approach to reuse. One goal of the study is to develop and demonstrate methods to recover reusable components from domain-specific software through a combination of tools, to perform the identification, extraction, and re-engineering of components, and domain experts, to direct the applications of those tools. A second goal of the study is to enable the reuse of those components by identifying techniques for configuring and recombining the re-engineered software. This component-recovery or software-cycle model addresses not only the selection and re-engineering of components, but also their recombination into new programs. Once a model of reuse activities has been developed, the quantification of the costs and benefits of various reuse options will enable the development of an adaptable economic model of reuse, which is the principal goal of the overall study. This paper reports on the conception of the software-cycle model and on several supporting techniques of software recovery, measurement, and reuse which will lead to the development of the desired economic model.

  20. Flexible software process lines in practice: A metamodel-based approach to effectively construct and manage families of software process models

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Ternité, Thomas; Friedrich, Jan

    2017-01-01

    Process flexibility and adaptability is frequently discussed, and several proposals aim to improve software processes for a given organization-/project context. A software process line (SPrL) is an instrument to systematically construct and manage variable software processes by combining pre-defined ... to construct flexible SPrLs, and show its practical application in the German V-Modell XT. We contribute a proven approach that is presented as a metamodel fragment for reuse and implementation in further process modeling approaches. This summary refers to the paper Flexible software process lines in practice: A metamodel-based approach to effectively construct and manage families of software process models [Ku16], published as an original research article in the Journal of Systems and Software.

  1. Discrete Event Modeling and Simulation-Driven Engineering for the ATLAS Data Acquisition Network

    CERN Document Server

    Bonaventura, Matias Alejandro; The ATLAS collaboration; Castro, Rodrigo Daniel

    2016-01-01

    We present an iterative and incremental development methodology for simulation models in network engineering projects. Driven by the DEVS (Discrete Event Systems Specification) formal framework for modeling and simulation, we assist network design, test, analysis and optimization processes. A practical application of the methodology is presented for a case study in the ATLAS particle physics detector, the largest scientific experiment built by man, where scientists around the globe search for answers about the origins of the universe. The ATLAS data network conveys real-time information produced by physics detectors as beams of particles collide. The produced sub-atomic evidence must be filtered and recorded for further offline scrutiny. Due to the criticality of the transported data, networks and applications undergo careful engineering processes with stringent quality of service requirements. A tight project schedule imposes time pressure on design decisions, while rapid technology evolution widens the palette...

  2. Alignment and prediction of cis-regulatory modules based on a probabilistic model of evolution.

    Directory of Open Access Journals (Sweden)

    Xin He

    2009-03-01

    Full Text Available Cross-species comparison has emerged as a powerful paradigm for predicting cis-regulatory modules (CRMs) and understanding their evolution. The comparison requires reliable sequence alignment, which remains a challenging task for less conserved noncoding sequences. Furthermore, the existing models of DNA sequence evolution generally do not explicitly treat the special properties of CRM sequences. To address these limitations, we propose a model of CRM evolution that captures different modes of evolution of functional transcription factor binding sites (TFBSs) and the background sequences. A particularly novel aspect of our work is a probabilistic model of gains and losses of TFBSs, a process recognized as an important part of regulatory sequence evolution. We present a computational framework that uses this model to solve the problems of CRM alignment and prediction. Our alignment method is similar to existing methods of statistical alignment but uses the conserved binding sites to improve alignment. Our CRM prediction method deals with the inherent uncertainties of binding site annotations and sequence alignment in a probabilistic framework. In simulated as well as real data, we demonstrate that our program is able to improve both alignment and prediction of CRM sequences over several state-of-the-art methods. Finally, we used alignments produced by our program to study binding site conservation in genome-wide binding data of key transcription factors in the Drosophila blastoderm, with two intriguing results: (i) the factor-bound sequences are under strong evolutionary constraints even if their neighboring genes are not expressed in the blastoderm and (ii) binding sites in distal bound sequences (relative to transcription start sites) tend to be more conserved than those in proximal regions.
Our approach is implemented as software, EMMA (Evolutionary Model-based cis-regulatory Module Analysis), ready to be applied in a broad biological context.
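The gain-and-loss process the abstract describes can be illustrated with the simplest possible version, a two-state (present/absent) continuous-time Markov chain for one binding site. This is a generic sketch with illustrative rates, not EMMA's actual model:

```python
import math

def presence_prob(t, gain, loss, present_at_0=True):
    """Probability that a binding site is present after time t in a
    two-state (present/absent) continuous-time Markov chain with
    gain rate `gain` (absent -> present) and loss rate `loss`
    (present -> absent)."""
    total = gain + loss
    pi_present = gain / total          # stationary probability of presence
    decay = math.exp(-total * t)       # relaxation toward stationarity
    if present_at_0:
        return pi_present + (1.0 - pi_present) * decay
    return pi_present * (1.0 - decay)
```

For example, with `gain = 1.0` and `loss = 2.0` the presence probability of an ancestral site decays from 1 toward the stationary value 1/3 as divergence time grows.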

  3. Towards the identification of the influence of SPI on the successful evolution of software SMEs

    OpenAIRE

    Clarke, Paul; O'Connor, Rory

    2010-01-01

    Peer reviewed. Software development requires multi-stage processes in order to organise the software development effort. Each software development project should implement a development process that is appropriate to the project setting. Since business needs and technologies are subject to change, software process improvement (SPI) actions are required so as to harmonise the process with the emerging business and technology needs. SPI frameworks such as the Capability Maturity Model Integra...

  4. Semi-Empirical Models for Buoyancy-Driven Ventilation

    DEFF Research Database (Denmark)

    Terpager Andersen, Karl

    2015-01-01

    A literature study is presented on the theories and models dealing with buoyancy-driven ventilation in rooms. The models are categorised into four types according to how the physical process is conceived: column model, fan model, neutral plane model and pressure model. These models are analysed and compared with a reference model. Discrepancies and differences are shown, and the deviations are discussed. It is concluded that a reliable buoyancy model based solely on the fundamental flow equations is desirable.

  5. Usage of Modified Heuristic Model for Determination of Software Stability

    Directory of Open Access Journals (Sweden)

    Sergey Konstantinovich Marfenko

    2013-02-01

    Full Text Available The subject of this paper is an analysis method for determining the stability of software against attacks on its integrity. It is suggested to use a modified heuristic model of software reliability as the mathematical basis of this method. This model is based on the classic approach, but it takes into account the impact levels of different software errors on system integrity. It makes it possible to define critical characteristics of the software: the percentage of time in stable operation and the probability of failure.

  6. Tracer kinetic model-driven registration for dynamic contrast-enhanced MRI time-series data.

    Science.gov (United States)

    Buonaccorsi, Giovanni A; O'Connor, James P B; Caunce, Angela; Roberts, Caleb; Cheung, Sue; Watson, Yvonne; Davies, Karen; Hope, Lynn; Jackson, Alan; Jayson, Gordon C; Parker, Geoffrey J M

    2007-11-01

    Dynamic contrast-enhanced MRI (DCE-MRI) time series data are subject to unavoidable physiological motion during acquisition (e.g., due to breathing) and this motion causes significant errors when fitting tracer kinetic models to the data, particularly with voxel-by-voxel fitting approaches. Motion correction is problematic, as contrast enhancement introduces new features into postcontrast images and conventional registration similarity measures cannot fully account for the increased image information content. A methodology is presented for tracer kinetic model-driven registration that addresses these problems by explicitly including a model of contrast enhancement in the registration process. The iterative registration procedure is focused on a tumor volume of interest (VOI), employing a three-dimensional (3D) translational transformation that follows only tumor motion. The implementation accurately removes motion corruption in a DCE-MRI software phantom and it is able to reduce model fitting errors and improve localization in 3D parameter maps in patient data sets that were selected for significant motion problems. Sufficient improvement was observed in the modeling results to salvage clinical trial DCE-MRI data sets that would otherwise have to be rejected due to motion corruption. Copyright 2007 Wiley-Liss, Inc.

  7. A Distributed Snow Evolution Modeling System (SnowModel)

    Science.gov (United States)

    Liston, G. E.; Elder, K.

    2004-12-01

    A spatially distributed snow-evolution modeling system (SnowModel) has been specifically designed to be applicable over a wide range of snow landscapes, climates, and conditions. To reach this goal, SnowModel is composed of four sub-models: MicroMet defines the meteorological forcing conditions, EnBal calculates surface energy exchanges, SnowMass simulates snow depth and water-equivalent evolution, and SnowTran-3D accounts for snow redistribution by wind. While other distributed snow models exist, SnowModel is unique in that it includes a well-tested blowing-snow sub-model (SnowTran-3D) for application in windy arctic, alpine, and prairie environments where snowdrifts are common. These environments comprise 68% of the seasonally snow-covered Northern Hemisphere land surface. SnowModel also accounts for snow processes occurring in forested environments (e.g., canopy interception related processes). SnowModel is designed to simulate snow-related physical processes occurring at spatial scales of 5-m and greater, and temporal scales of 1-hour and greater. These include: accumulation from precipitation; wind redistribution and sublimation; loading, unloading, and sublimation within forest canopies; snow-density evolution; and snowpack ripening and melt. To enhance its wide applicability, SnowModel includes the physical calculations required to simulate snow evolution within each of the global snow classes defined by Sturm et al. (1995), e.g., tundra, taiga, alpine, prairie, maritime, and ephemeral snow covers. The three, 25-km by 25-km, Cold Land Processes Experiment (CLPX) mesoscale study areas (MSAs: Fraser, North Park, and Rabbit Ears) are used as SnowModel simulation examples to highlight model strengths, weaknesses, and features in forested, semi-forested, alpine, and shrubland environments.

  8. A bridge role metric model for nodes in software networks.

    Directory of Open Access Journals (Sweden)

    Bo Li

    Full Text Available A bridge role metric model is put forward in this paper. Compared with previous metric models, our treatment of a large-scale object-oriented software system as a complex network is inherently more realistic. To acquire nodes and links in an undirected network, a new model is presented that captures the crucial connectivity of a module or hub, rather than only the centrality used in previous metric models. Two previous metric models are described for comparison. In addition, the fitting curve between the Bre results and node degrees is well described by a power law. The model represents many realistic characteristics of actual software structures, and a hydropower simulation system is taken as an example. This paper makes additional contributions to an accurate understanding of the module design of software systems and is expected to be beneficial to software engineering practices.
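A power-law relation between a metric and node degree, of the kind the abstract reports, can be checked with an ordinary least-squares fit on log-log axes. This is a generic sketch, not the authors' code:

```python
import math

def fit_power_law(xs, ys):
    """Least-squares fit of y = c * x**alpha on log-log axes.
    Returns (c, alpha). Assumes all xs and ys are positive."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(lx)
    mx = sum(lx) / n
    my = sum(ly) / n
    sxx = sum((v - mx) ** 2 for v in lx)
    sxy = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    alpha = sxy / sxx                  # slope on log-log axes = exponent
    c = math.exp(my - alpha * mx)      # intercept gives the prefactor
    return c, alpha
```

Fitting exact power-law data `y = 3 * x**-1.5` recovers the exponent -1.5 and prefactor 3 up to floating-point error.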

  9. SOFTCOST - DEEP SPACE NETWORK SOFTWARE COST MODEL

    Science.gov (United States)

    Tausworthe, R. C.

    1994-01-01

    The early-on estimation of required resources and a schedule for the development and maintenance of software is usually the least precise aspect of the software life cycle. However, it is desirable to make some sort of an orderly and rational attempt at estimation in order to plan and organize an implementation effort. The Software Cost Estimation Model program, SOFTCOST, was developed to provide a consistent automated resource and schedule model which is more formalized than the often used guesswork model based on experience, intuition, and luck. SOFTCOST was developed after the evaluation of a number of existing cost estimation programs indicated that there was a need for a cost estimation program with a wide range of application and adaptability to diverse kinds of software. SOFTCOST combines several software cost models found in the open literature into one comprehensive set of algorithms that compensate for nearly fifty implementation factors relative to size of the task, inherited baseline, organizational and system environment, and difficulty of the task. SOFTCOST produces mean and variance estimates of software size, implementation productivity, recommended staff level, probable duration, amount of computer resources required, and amount and cost of software documentation. Since the confidence level for a project using mean estimates is small, the user is given the opportunity to enter risk-biased values for effort, duration, and staffing, to achieve higher confidence levels. SOFTCOST then produces a PERT/CPM file with subtask efforts, durations, and precedences defined so as to produce the Work Breakdown Structure (WBS) and schedule having the asked-for overall effort and duration. The SOFTCOST program operates in an interactive environment prompting the user for all of the required input. The program builds the supporting PERT data base in a file for later report generation or revision. 
The PERT schedule and the WBS schedule may be printed and stored in a
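The PERT-style aggregation the abstract describes can be sketched with the classic three-point (beta-PERT) estimate, where task means and variances along a chain of subtasks add. This is a generic illustration of the technique, not SOFTCOST's actual algorithm:

```python
def pert_estimate(optimistic, most_likely, pessimistic):
    """Classic beta-PERT mean and variance for one task:
    mean = (a + 4m + b) / 6, variance = ((b - a) / 6)**2."""
    mean = (optimistic + 4 * most_likely + pessimistic) / 6.0
    var = ((pessimistic - optimistic) / 6.0) ** 2
    return mean, var

def path_estimate(tasks):
    """Mean and variance for a chain of independent subtasks
    (e.g. a critical path): means add, and so do variances."""
    means, variances = zip(*(pert_estimate(*t) for t in tasks))
    return sum(means), sum(variances)
```

The variance estimate is what allows risk-biased (higher-confidence) durations to be quoted instead of bare means, as the abstract suggests.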

  10. Integrating Design Decision Management with Model-based Software Development

    DEFF Research Database (Denmark)

    Könemann, Patrick

    Design decisions are continuously made during the development of software systems and are important artifacts for design documentation. Dedicated decision management systems are often used to capture such design knowledge. Most such systems are, however, separated from the design artifacts of the system. In model-based software development, where design models are used to develop a software system, the outcomes of many design decisions have a big impact on design models. The realization of design decisions is often manual and tedious work on design models. Moreover, keeping design models consistent …, or by ignoring the causes. This substitutes manual reviews to some extent. The concepts, implemented in a tool, have been validated with design patterns, refactorings, and domain level tests that comprise a replay of a real project. This proves the applicability of the solution to realistic examples...

  11. Procedure for Application of Software Reliability Growth Models to NPP PSA

    International Nuclear Information System (INIS)

    Son, Han Seong; Kang, Hyun Gook; Chang, Seung Cheol

    2009-01-01

    As the use of software increases at nuclear power plants (NPPs), the necessity of including software reliability and/or safety in the NPP Probabilistic Safety Assessment (PSA) arises. This work proposes an application procedure for software reliability growth models (RGMs), which are most widely used to quantify software reliability, to NPP PSA. Through the proposed procedure, it can be determined whether a software reliability growth model can be applied to the NPP PSA before its actual application. The procedure proposed in this work is expected to be very helpful for incorporating software into NPP PSA.
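A widely used reliability growth model of the kind such a procedure would screen is the Goel-Okumoto NHPP model. The sketch below evaluates its mean value function and the derived reliability with illustrative parameters; it is not tied to the paper's specific procedure:

```python
import math

def go_mean_failures(t, a, b):
    """Goel-Okumoto NHPP mean value function m(t) = a * (1 - exp(-b*t)):
    expected cumulative failures by time t, where a is the total expected
    number of failures and b the per-fault detection rate."""
    return a * (1.0 - math.exp(-b * t))

def go_reliability(x, t, a, b):
    """Probability of no failure in (t, t + x] for an NHPP:
    R(x | t) = exp(-(m(t + x) - m(t)))."""
    return math.exp(-(go_mean_failures(t + x, a, b) - go_mean_failures(t, a, b)))
```

With `a = 100` and `b = 0.1`, m(t) starts at 0 and saturates at 100 as testing proceeds, and the conditional reliability over a future interval grows as more faults are removed.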

  12. ROCKETSHIP: a flexible and modular software tool for the planning, processing and analysis of dynamic MRI studies

    International Nuclear Information System (INIS)

    Barnes, Samuel R.; Ng, Thomas S. C.; Santa-Maria, Naomi; Montagne, Axel; Zlokovic, Berislav V.; Jacobs, Russell E.

    2015-01-01

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a promising technique to characterize pathology and evaluate treatment response. However, analysis of DCE-MRI data is complex and benefits from concurrent analysis of multiple kinetic models and parameters. Few software tools are currently available that specifically focus on DCE-MRI analysis with multiple kinetic models. Here, we developed ROCKETSHIP, an open-source, flexible and modular software tool for DCE-MRI analysis. ROCKETSHIP incorporates analyses with multiple kinetic models, including data-driven nested model analysis. ROCKETSHIP was implemented using the MATLAB programming language. Robustness of the software in providing reliable fits using multiple kinetic models is demonstrated using simulated data. Simulations also demonstrate the utility of the data-driven nested model analysis. Applicability of ROCKETSHIP for both preclinical and clinical studies is shown using DCE-MRI studies of the human brain and a murine tumor model. A DCE-MRI software suite was implemented and tested using simulations. Its applicability to both preclinical and clinical datasets is shown. ROCKETSHIP was designed to be easily accessible for the beginner, but flexible enough for changes or additions to be made by the advanced user as well. The availability of a flexible analysis tool will aid future studies using DCE-MRI. A public release of ROCKETSHIP is available at https://github.com/petmri/ROCKETSHIP.

  13. DAE Tools: equation-based object-oriented modelling, simulation and optimisation software

    Directory of Open Access Journals (Sweden)

    Dragan D. Nikolić

    2016-04-01

    Full Text Available In this work, the DAE Tools modelling, simulation and optimisation software, its programming paradigms and main features are presented. The current approaches to mathematical modelling, such as the use of modelling languages and general-purpose programming languages, are analysed. The common set of capabilities required by typical simulation software is discussed, and the shortcomings of the current approaches are recognised. A new hybrid approach is introduced, and the modelling languages and the hybrid approach are compared in terms of the grammar, compiler, parser and interpreter requirements, maintainability and portability. The most important characteristics of the new approach are discussed, such as: (1) support for runtime model generation; (2) support for runtime simulation set-up; (3) support for complex runtime operating procedures; (4) interoperability with third-party software packages (i.e., NumPy/SciPy); (5) suitability for embedding and use as a web application or software as a service; and (6) code-generation, model-exchange and co-simulation capabilities. The benefits of an equation-based approach to modelling, implemented in a fourth-generation object-oriented general-purpose programming language such as Python, are discussed. The architecture and the software implementation details, as well as the types of problems that can be solved using DAE Tools software, are described. Finally, some applications of the software at different levels of abstraction are presented, and its embedding capabilities and suitability for use as software as a service are demonstrated.

  14. Functional Domain Driven Design

    OpenAIRE

    Herrera Guzmán, Sergio

    2016-01-01

    Technologies are constantly expanding and evolving, with new techniques designed to serve their purpose. In software development, the tools and guidelines for building software products are a constantly evolving element, necessary for making decisions about the projects to be carried out. One of the archetypes for software development is the so-called Domain Driven Design, where it is important to have a broad understanding of the business to be modelled in the form...

  15. The ModelCC Model-Driven Parser Generator

    Directory of Open Access Journals (Sweden)

    Fernando Berzal

    2015-01-01

    Full Text Available Syntax-directed translation tools require the specification of a language by means of a formal grammar. This grammar must conform to the specific requirements of the parser generator to be used. This grammar is then annotated with semantic actions for the resulting system to perform its desired function. In this paper, we introduce ModelCC, a model-based parser generator that decouples language specification from language processing, avoiding some of the problems caused by grammar-driven parser generators. ModelCC receives a conceptual model as input, along with constraints that annotate it. It is then able to create a parser for the desired textual syntax and the generated parser fully automates the instantiation of the language conceptual model. ModelCC also includes a reference resolution mechanism so that ModelCC is able to instantiate abstract syntax graphs, rather than mere abstract syntax trees.

  16. Software testing concepts and operations

    CERN Document Server

    Mili, Ali

    2015-01-01

    Explores and identifies the main issues, concepts, principles and evolution of software testing, including software quality engineering and testing concepts, test data generation, test deployment analysis, and software test managementThis book examines the principles, concepts, and processes that are fundamental to the software testing function. This book is divided into five broad parts. Part I introduces software testing in the broader context of software engineering and explores the qualities that testing aims to achieve or ascertain, as well as the lifecycle of software testing. Part II c

  17. Understanding and Modeling the Evolution of Critical Points under Gaussian Blurring

    NARCIS (Netherlands)

    Kuijper, A.; Florack, L.M.J.; Heyden, A.; Sparr, G.; Nielsen, M.; Johansen, P.

    2002-01-01

    In order to investigate the deep structure of Gaussian scale space images, one needs to understand the behaviour of critical points under the influence of parameter-driven blurring. During this evolution two different types of special points are encountered, the so-called scale space saddles and the

  18. The Event-Driven Software Library for YARP—With Algorithms and iCub Applications

    Directory of Open Access Journals (Sweden)

    Arren Glover

    2018-01-01

    Full Text Available Event-driven (ED) cameras are an emerging technology that samples the visual signal based on changes in the signal magnitude, rather than at a fixed rate over time. The change in paradigm results in a camera with lower latency that uses less power, requires less bandwidth, and has a higher dynamic range. Such cameras offer many potential advantages for on-line, autonomous robots; however, the sensor data do not directly integrate with current “image-based” frameworks and software libraries. The iCub robot uses Yet Another Robot Platform (YARP) as middleware to provide modular processing and connectivity to sensors and actuators. This paper introduces a library that incorporates an event-based framework into the YARP architecture, allowing event cameras to be used with the iCub (and other YARP-based robots). We describe the philosophy and methods for structuring events to facilitate processing, while maintaining low latency and real-time operation. We also describe several processing modules made available open-source, and three example demonstrations that can be run on the neuromorphic iCub.
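The time-stamped address-event data such a library structures can be sketched minimally as follows; the `Event` fields and `merge_streams` helper are illustrative, not the actual YARP event-driven library API:

```python
import heapq
from collections import namedtuple

# A minimal address-event representation: time stamp, pixel address, polarity.
# Real event cameras emit events only where the log-intensity changes.
Event = namedtuple("Event", "ts x y polarity")

def merge_streams(*streams):
    """Merge several already time-ordered event streams (e.g. from two
    sensors) into one stream ordered by time stamp, which is what an
    event-driven processing pipeline consumes to preserve latency order."""
    return heapq.merge(*streams, key=lambda e: e.ts)
```

Because each input stream is already sorted, the merge is streaming and O(log k) per event for k streams, which matters for low-latency operation.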

  19. The Software Architecture of Global Climate Models

    Science.gov (United States)

    Alexander, K. A.; Easterbrook, S. M.

    2011-12-01

    It has become common to compare and contrast the output of multiple global climate models (GCMs), such as in the Climate Model Intercomparison Project Phase 5 (CMIP5). However, intercomparisons of the software architecture of GCMs are almost nonexistent. In this qualitative study of seven GCMs from Canada, the United States, and Europe, we attempt to fill this gap in research. We describe the various representations of the climate system as computer programs, and account for architectural differences between models. Most GCMs now practice component-based software engineering, where Earth system components (such as the atmosphere or land surface) are present as highly encapsulated sub-models. This architecture facilitates a mix-and-match approach to climate modelling that allows for convenient sharing of model components between institutions, but it also leads to difficulty when choosing where to draw the lines between systems that are not encapsulated in the real world, such as sea ice. We also examine different styles of couplers in GCMs, which manage interaction and data flow between components. Finally, we pay particular attention to the varying levels of complexity in GCMs, both between and within models. Many GCMs have some components that are significantly more complex than others, a phenomenon which can be explained by the respective institution's research goals as well as the origin of the model components. In conclusion, although some features of software architecture have been adopted by every GCM we examined, other features show a wide range of different design choices and strategies. These architectural differences may provide new insights into variability and spread between models.

  20. Identifying and Modeling Dynamic Preference Evolution in Multipurpose Water Resources Systems

    Science.gov (United States)

    Mason, E.; Giuliani, M.; Castelletti, A.; Amigoni, F.

    2018-04-01

    Multipurpose water systems are usually operated on a tradeoff of conflicting operating objectives. Under steady-state climatic and socioeconomic conditions, such a tradeoff is supposed to represent a fair and/or efficient preference. Extreme variability in external forcing might affect a water operator's risk aversion and force a change in his or her preference. Properly accounting for these shifts is key to any rigorous retrospective assessment of the operator's behavior, and to building descriptive models for projecting the future system evolution. In this study, we explore how the selection of different preferences is linked to variations in the external forcing. We argue that preference selection evolves according to recent, extreme variations in system performance: underperforming in one of the objectives pushes the preference toward the harmed objective. To test this assumption, we developed a rational procedure to simulate the operator's preference selection. We map this selection onto a multilateral negotiation, where multiple virtual agents independently optimize different objectives. The agents periodically negotiate a compromise policy for the operation of the system. Agents' attitudes in each negotiation step are determined by the recent system performance measured by the specific objective they maximize. We then propose a numerical model of preference dynamics that implements a concept from cognitive psychology, the availability bias. We test our modeling framework on a synthetic lake operated for flood control and water supply. Results show that our model successfully captures the operator's preference selection and its dynamic evolution driven by extreme wet and dry situations.
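The core preference-update idea — recent underperformance in an objective pulling weight toward that objective — can be sketched as a simple convex update of preference weights. The functional form and the `bias` parameter are assumptions for illustration, not the authors' availability-bias model:

```python
def update_weights(weights, deficits, bias=0.5):
    """Shift preference weights toward recently harmed objectives.
    `deficits[i]` >= 0 measures how far objective i recently fell below
    its target; `bias` in [0, 1] is the strength of the recency effect.
    Returns renormalized weights."""
    total = sum(deficits)
    if total == 0:                      # no recent harm: keep preferences
        return list(weights)
    pressure = [d / total for d in deficits]
    new = [(1 - bias) * w + bias * p for w, p in zip(weights, pressure)]
    s = sum(new)
    return [w / s for w in new]
```

Starting from equal weights, a deficit concentrated on one objective (say, flooding after an extreme wet spell) shifts the preference toward that objective while the weights remain a valid convex combination.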

  1. Method for critical software event execution reliability in high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Kidd, M.E. [Sandia National Labs., Albuquerque, NM (United States)

    1997-11-01

    This report contains viewgraphs on a method called SEER, which provides a high level of confidence that critical software-driven event execution sequences faithfully execute in the face of transient computer architecture failures, in both normal and abnormal operating environments.

  2. From current-driven to neoclassically driven tearing modes.

    Science.gov (United States)

    Reimerdes, H; Sauter, O; Goodman, T; Pochelon, A

    2002-03-11

    In the TCV tokamak, the m/n = 2/1 island is observed in low-density discharges with central electron-cyclotron current drive. The evolution of its width has two distinct growth phases, one of which can be linked to a "conventional" tearing mode driven unstable by the current profile and the other to a neoclassical tearing mode driven by a perturbation of the bootstrap current. The TCV results provide the first clear observation of such a destabilization mechanism and reconcile the theory of conventional and neoclassical tearing modes, which differ only in the dominant driving term.

  3. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere- ocean model called ModelE, written in fixed format Fortran.

  4. Grammar Maturity Model

    NARCIS (Netherlands)

    Zaytsev, V.; Pierantonio, A.; Schätz, B.; Tamzalit, D.

    2014-01-01

    The evolution of a software language (whether modelled by a grammar or a schema or a metamodel) is not limited to development of new versions and dialects. An important dimension of a software language evolution is maturing in the sense of improving the quality of its definition. In this paper, we

  5. Model-driven user interfaces for bioinformatics data resources: regenerating the wheel as an alternative to reinventing it

    Directory of Open Access Journals (Sweden)

    Swainston Neil

    2006-12-01

    Full Text Available Abstract Background The proliferation of data repositories in bioinformatics has resulted in the development of numerous interfaces that allow scientists to browse, search and analyse the data that they contain. Interfaces typically support repository access by means of web pages, but other means are also used, such as desktop applications and command line tools. Interfaces often duplicate functionality amongst each other, and this implies that associated development activities are repeated in different laboratories. Interfaces developed by public laboratories are often created with limited developer resources. In such environments, reducing the time spent on creating user interfaces allows for a better deployment of resources for specialised tasks, such as data integration or analysis. Laboratories maintaining data resources are challenged to reconcile requirements for software that is reliable, functional and flexible with limitations on software development resources. Results This paper proposes a model-driven approach for the partial generation of user interfaces for searching and browsing bioinformatics data repositories. Inspired by the Model Driven Architecture (MDA) of the Object Management Group (OMG), we have developed a system that generates interfaces designed for use with bioinformatics resources. This approach helps laboratory domain experts decrease the amount of time they have to spend dealing with the repetitive aspects of user interface development. As a result, the amount of time they can spend on gathering requirements and helping develop specialised features increases. The resulting system is known as Pierre, and has been validated through its application to use cases in the life sciences, including the PEDRoDB proteomics database and the e-Fungi data warehouse.
Conclusion MDAs focus on generating software from models that describe aspects of service capabilities, and can be applied to support rapid development of repository

  6. Strategy evolution driven by switching probabilities in structured multi-agent systems

    Science.gov (United States)

    Zhang, Jianlei; Chen, Zengqiang; Li, Zhiqi

    2017-10-01

    The evolutionary mechanism driving the commonly observed cooperation among unrelated individuals is puzzling. Related models for evolutionary games on graphs traditionally assume that players imitate their successful neighbours with higher benefits. Notably, an implicit assumption here is that players are always able to acquire the required pay-off information. To relax this restrictive assumption, a contact-based model has been proposed, where switching probabilities between strategies drive the strategy evolution. However, the explicit and quantified relation between a player's switching probability for her strategies and the number of her neighbours remains unknown. This is especially a key point in heterogeneously structured systems, where players may differ in the numbers of their neighbours. Focusing on this, here we present an augmented model by introducing an attenuation coefficient and evaluate its influence on the evolution dynamics. Results show that the individual influence on others is negatively correlated with the contact numbers specified by the network topologies. Results further provide the conditions under which the coexisting strategies can be calculated analytically.
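A contact-based switching rule with an attenuation coefficient of the kind described can be sketched with a Fermi function whose per-contact influence decays with the neighbour's degree. The exact functional form below is an assumption for illustration, not the paper's model:

```python
import math

def switch_probability(payoff_self, payoff_neighbour, degree,
                       noise=0.1, attenuation=1.0):
    """Fermi-rule probability of adopting a neighbour's strategy, scaled
    down as the neighbour's degree grows, so that highly connected players
    exert less influence per contact (the attenuation effect)."""
    fermi = 1.0 / (1.0 + math.exp(-(payoff_neighbour - payoff_self) / noise))
    return fermi / (1.0 + attenuation * (degree - 1))
```

With equal payoffs and a degree-1 neighbour the rule reduces to the standard Fermi value of 0.5, and the probability falls monotonically as the neighbour's degree increases.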

  7. Managing business compliance using model-driven security management

    Science.gov (United States)

    Lang, Ulrich; Schreiner, Rudolf

    Compliance with regulatory and governance standards is rapidly becoming one of the hot topics of information security today. This is because, especially with regulatory compliance, both business and government have to expect large financial and reputational losses if compliance cannot be ensured and demonstrated. One major difficulty of implementing such regulations is caused by the fact that they are captured at a high level of abstraction that is business-centric and not IT-centric. This means that the abstract intent needs to be translated in a trustworthy, traceable way into compliance and security policies that the IT security infrastructure can enforce. Carrying out this mapping process manually is time-consuming, maintenance-intensive, costly, and error-prone. Compliance monitoring is also critical in order to be able to demonstrate compliance at any given point in time. The problem is further complicated by the need for business-driven IT agility, where IT policies and enforcement can change frequently, e.g. in Business Process Modelling (BPM) driven Service Oriented Architecture (SOA). Model Driven Security (MDS) is an innovative technology approach that can solve these problems as an extension of identity and access management (IAM) and authorization management (also called entitlement management). In this paper we will illustrate the theory behind Model Driven Security for compliance, provide an improved and extended architecture, as well as a case study in the healthcare industry using our OpenPMF 2.0 technology.

  8. Geophysical monitoring and reactive transport modeling of ureolytically-driven calcium carbonate precipitation

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Y.; Ajo-Franklin, J.B.; Spycher, N.; Hubbard, S.S.; Zhang, G.; Williams, K.H.; Taylor, J.; Fujita, Y.; Smith, R.

    2011-07-15

    Ureolytically-driven calcium carbonate precipitation is the basis for a promising in-situ remediation method for sequestration of divalent radionuclide and trace metal ions. It has also been proposed for use in geotechnical engineering for soil strengthening applications. Monitoring the occurrence, spatial distribution, and temporal evolution of calcium carbonate precipitation in the subsurface is critical for evaluating the performance of this technology and for developing the predictive models needed for engineering application. In this study, we conducted laboratory column experiments using natural sediment and groundwater to evaluate the utility of geophysical (complex resistivity and seismic) sensing methods, dynamic synchrotron x-ray computed tomography (micro-CT), and reactive transport modeling for tracking ureolytically-driven calcium carbonate precipitation processes under site relevant conditions. Reactive transport modeling with TOUGHREACT successfully simulated the changes of the major chemical components during urea hydrolysis. Even at the relatively low level of urea hydrolysis observed in the experiments, the simulations predicted an enhanced calcium carbonate precipitation rate that was 3-4 times greater than the baseline level. Reactive transport modeling results, geophysical monitoring data and micro-CT imaging correlated well with reaction processes validated by geochemical data. In particular, increases in ionic strength of the pore fluid during urea hydrolysis predicted by geochemical modeling were successfully captured by electrical conductivity measurements and confirmed by geochemical data. The low level of urea hydrolysis and calcium carbonate precipitation suggested by the model and geochemical data was corroborated by minor changes in seismic P-wave velocity measurements and micro-CT imaging; the latter provided direct evidence of sparsely distributed calcium carbonate precipitation. 
Ion exchange processes promoted through NH{sub 4}{sup

  9. Increasing the reliability of ecological models using modern software engineering techniques

    Science.gov (United States)

    Robert M. Scheller; Brian R. Sturtevant; Eric J. Gustafson; Brendan C. Ward; David J. Mladenoff

    2009-01-01

    Modern software development techniques are largely unknown to ecologists. Typically, ecological models and other software tools are developed for limited research purposes, and additional capabilities are added later, usually in an ad hoc manner. Modern software engineering techniques can substantially increase scientific rigor and confidence in ecological models and...

  10. A solvable two-species catalysis-driven aggregation model

    CERN Document Server

    Ke Jian Hong

    2003-01-01

    We study the kinetics of a two-species catalysis-driven aggregation system, in which an irreversible aggregation between any two clusters of one species occurs only with the catalytic action of the other species. By means of a generalized mean-field rate equation, we obtain the asymptotic solutions of the cluster mass distributions in a simple process with a constant rate kernel. For the case without any consumption of the catalyst, the cluster mass distribution of either species always approaches a conventional scaling law. However, the evolutionary behaviour of the system in the case with catalyst consumption is complicated and depends crucially on the relative magnitudes of the initial concentrations of the two species.
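    For the constant-kernel case without catalyst consumption, the mean-field rate equations can be integrated numerically. The following sketch uses a truncated explicit Euler scheme for A_i + A_j --(B)--> A_{i+j}; the truncation size, step size, and initial condition are illustrative assumptions, not the authors' analytic treatment:

    ```python
    def evolve(a, b, rate=1.0, dt=1e-3, steps=2000):
        """Euler integration of the mean-field rate equations for catalysis-driven
        aggregation with a constant kernel and no catalyst consumption.
        `a[k]` is the concentration of k-mers (a[0] is unused); `b` is the
        constant catalyst concentration."""
        kmax = len(a) - 1
        for _ in range(steps):
            total = sum(a[1:])
            new = a[:]
            for k in range(1, kmax + 1):
                # gain: coagulation of an i-mer and a (k-i)-mer
                gain = 0.5 * sum(a[i] * a[k - i] for i in range(1, k))
                # loss: the k-mer merging with any other cluster
                loss = a[k] * total
                new[k] = a[k] + dt * rate * b * (gain - loss)
            a = new
        return a
    ```

    Starting from monomers only, the total cluster density decays as N(t) = N0 / (1 + N0·J·B·t / 2) for the constant kernel, while the total mass stays (approximately, up to the truncation) conserved.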

  11. The advanced software development workstation project

    Science.gov (United States)

    Fridge, Ernest M., III; Pitman, Charles L.

    1991-01-01

    The Advanced Software Development Workstation (ASDW) task is researching and developing the technologies required to support Computer Aided Software Engineering (CASE) with the emphasis on those advanced methods, tools, and processes that will be of benefit to support all NASA programs. Immediate goals are to provide research and prototype tools that will increase productivity, in the near term, in projects such as the Software Support Environment (SSE), the Space Station Control Center (SSCC), and the Flight Analysis and Design System (FADS) which will be used to support the Space Shuttle and Space Station Freedom. Goals also include providing technology for development, evolution, maintenance, and operations. The technologies under research and development in the ASDW project are targeted to provide productivity enhancements during the software life cycle phase of enterprise and information system modeling, requirements generation and analysis, system design and coding, and system use and maintenance. On-line user's guides will assist users in operating the developed information system with knowledge base expert assistance.

  12. QSO evolution in the interaction model

    International Nuclear Information System (INIS)

    De Robertis, M.

    1985-01-01

    QSO evolution is investigated according to the interaction hypothesis described most recently by Stockton (1982), in which activity results from an interaction between two galaxies resulting in the transfer of gas onto a supermassive black hole (SBH) at the center of at least one participant. Explicit models presented here for interactions in cluster environments show that a peak QSO population can be formed in this way at z ≈ 2-3, with little activity prior to this epoch. Calculated space densities match those inferred from observations for this epoch. Substantial density evolution is expected in such models, since, after virialization, conditions in the cores of rich clusters lead to the depletion of gas-rich systems through ram-pressure stripping. Density evolution parameters of 6-12 are easily accounted for. At smaller redshifts, however, QSOs should be found primarily in poor clusters or groups. Probability estimates provided by this model are consistent with local estimates for the observed number of QSOs per interaction. Significant luminosity-dependent evolution might also be expected in these models. It is suggested that the mean SBH mass increases with lookback time, leading to a statistical brightening with redshift. Undoubtedly, both forms of evolution contribute to the overall QSO luminosity function

  13. Methodologic model to scheduling on service systems: a software engineering approach

    Directory of Open Access Journals (Sweden)

    Eduyn Ramiro Lopez-Santana

    2016-06-01

    Full Text Available This paper presents a software engineering approach to a research proposal for building an expert system for scheduling in service systems, using software development methodologies and processes. We use adaptive software development as the methodology for the software architecture, based on its description as a software metaprocess that characterizes the research process. We draw UML (Unified Modeling Language) diagrams to provide a visual model that describes the research methodology and to identify the actors, elements, and interactions in the research process.

  14. Radiation-driven winds in x-ray binaries

    International Nuclear Information System (INIS)

    Friend, D.B.; Castor, J.I.

    1982-01-01

    We discuss the properties of a radiation-driven stellar wind in an X-ray binary system. The Castor, Abbott, Klein line-driven wind model is used, but the effects of the compact companion (gravity and continuum radiation pressure) and the centrifugal force due to orbital motion are included. These forces destroy the spherical symmetry of the wind and can make the mass loss and accretion strong functions of the size of the primary relative to its critical potential lobe. We find that in most systems the wind alone could power the X-ray emission. It also appears that, in the evolution of these systems, there would be a continuous transition from wind accretion to critical potential lobe overflow. The model is also used to make a prediction about the nature of a suspected binary system which is not known to be an X-ray emitter.

  15. Impact of Agile Software Development Model on Software Maintainability

    Science.gov (United States)

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burden tightly budgeted information technology (IT) organizations. The agile software development approach delivers business value early, but its implications for software maintainability are still unknown. The purpose of this quantitative study…

  16. Modeling shoreface profile evolution

    NARCIS (Netherlands)

    Stive, M.J.F.; De Vriend, H.J.

    1995-01-01

    Current knowledge of hydro-, sediment and morpho-dynamics in the shoreface environment is insufficient to undertake shoreface-profile evolution modelling on the basis of first physical principles. We propose a simple, panel-type model to map observed behaviour. The internal dynamics are determined

  17. Modelling shoreface profile evolution

    NARCIS (Netherlands)

    Stive, Marcel J.F.; de Vriend, Huib J.

    1995-01-01

    Current knowledge of hydro-, sediment and morpho-dynamics in the shoreface environment is insufficient to undertake shoreface-profile evolution modelling on the basis of first physical principles. We propose a simple, panel-type model to map observed behaviour. The internal dynamics are determined

  18. A historical dataset of software engineering conferences

    NARCIS (Netherlands)

    Vasilescu, B.N.; Serebrenik, A.; Mens, T.

    2013-01-01

    The Mining Software Repositories community typically focuses on data from software configuration management tools, mailing lists, and bug tracking repositories to uncover interesting and actionable information about the evolution of software systems. However, the techniques employed and the

  19. A Simulation Model for the Waterfall Software Development Life Cycle

    OpenAIRE

    Bassil, Youssef

    2012-01-01

    Software development life cycle or SDLC for short is a methodology for designing, building, and maintaining information and industrial systems. So far, there exist many SDLC models, one of which is the Waterfall model which comprises five phases to be completed sequentially in order to develop a software solution. However, SDLC of software systems has always encountered problems and limitations that resulted in significant budget overruns, late or suspended deliveries, and dissatisfied client...

  20. The Astringency of the GP Algorithm for Forecasting Software Failure Data Series

    Directory of Open Access Journals (Sweden)

    Yong-qiang Zhang

    2007-05-01

    Full Text Available The forecasting of software failure data series by Genetic Programming (GP) can be realized without any assumptions before modeling. This discovery has transformed traditional statistical modeling methods and improved the consistency of model applicability. The individuals' different characteristics during the evolution of generations, which change randomly, are treated as Markov random processes. This paper also proposes that a GP algorithm with an "optimal individuals reserved" strategy is the best solution to this problem, so that well-adapted individuals will finally be evolved. This allows practical applications in software reliability modeling, analysis, and forecasting of failure behaviors. Moreover, it verifies the feasibility and availability of the GP algorithm, applied to software failure data series forecasting, on a theoretical basis. The results show that the GP algorithm is the best solution for software failure behaviors in a variety of disciplines.
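    The "optimal individuals reserved" (elitist) strategy is what makes the evolutionary search converge monotonically: the best individual can never be lost, so the best fitness forms a non-increasing sequence. The sketch below substitutes a simple evolutionary parameter search for full tree-based GP; the model form, mutation scales, and failure series are illustrative assumptions:

    ```python
    import math
    import random

    def sse(params, series):
        """Sum of squared errors of an exponential reliability-growth trend
        m(t) = a * (1 - exp(-b t)) against a cumulative failure series."""
        a, b = params
        return sum((a * (1 - math.exp(-b * t)) - y) ** 2
                   for t, y in enumerate(series, start=1))

    def evolve(series, generations=300, pop=20, seed=7):
        """(1 + lambda)-style evolutionary search with the 'optimal individual
        reserved' strategy: the best solution survives every generation
        unchanged, so best fitness is monotonically non-increasing."""
        rng = random.Random(seed)
        best = (float(max(series)), 0.1)
        history = [sse(best, series)]
        for _ in range(generations):
            # elitism: the current best is always among the candidates
            candidates = [best] + [
                (abs(best[0] + rng.gauss(0, 1.0)),
                 abs(best[1] + rng.gauss(0, 0.02)))
                for _ in range(pop - 1)
            ]
            best = min(candidates, key=lambda p: sse(p, series))
            history.append(sse(best, series))
        return best, history
    ```

    Running `evolve([5, 9, 12, 14, 15, 16])` yields a fitness history that never increases, which is the convergence property the abstract attributes to reserving the optimal individuals.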

  1. Towards a distributed control system for software defined wireless sensor networks

    CSIR Research Space (South Africa)

    Kobo, Hlabishi I

    2017-10-01

    Full Text Available on the network device. The coupling stifles innovation and evolution because the network often becomes rigid. Software Defined Wireless Sensor Networks (SDWSN) is also an emerging network paradigm that infuses the SDN model into Wireless Sensor Networks (WSNs...

  2. Introduction to Lean Canvas Transformation Models and Metrics in Software Testing

    Directory of Open Access Journals (Sweden)

    Nidagundi Padmaraj

    2016-05-01

    Full Text Available Software nowadays plays a key role in all fields, from simple devices up to cutting-edge technologies, and most technology devices now run on software. Software development verification and validation have become very important for producing high-quality software according to business stakeholder requirements. Different software development methodologies have given a new dimension to software testing. In traditional waterfall software development, testing is approached near the end: it begins with resource planning, a test plan is designed, and test criteria are defined for acceptance testing. In this process most of the test plan is heavily documented, which leads to time-consuming processes. For modern software development methodologies such as agile, where long test processes and documentation are not followed strictly due to the small iterations of software development and testing, lean canvas transformation models can be a solution. This paper provides a new dimension for exploring the possibilities of adopting lean transformation models and metrics in the software test plan, in order to simplify the test process and make further use of these test metrics on the canvas.

  3. The Legacy of Space Shuttle Flight Software

    Science.gov (United States)

    Hickey, Christopher J.; Loveall, James B.; Orr, James K.; Klausman, Andrew L.

    2011-01-01

    The initial goals of the Space Shuttle Program required that the avionics and software systems blaze new trails in advancing avionics system technology. Many of the requirements placed on avionics and software were accomplished for the first time on this program. Examples include comprehensive digital fly-by-wire technology, use of a digital databus for flight critical functions, fail operational/fail safe requirements, complex automated redundancy management, and the use of a high-order software language for flight software development. In order to meet the operational and safety goals of the program, the Space Shuttle software had to be extremely high quality, reliable, robust, reconfigurable and maintainable. To achieve this, the software development team evolved a software process focused on continuous process improvement and defect elimination that consistently produced highly predictable and top quality results, providing software managers the confidence needed to sign each Certificate of Flight Readiness (COFR). This process, which has been appraised at Capability Maturity Model (CMM)/Capability Maturity Model Integration (CMMI) Level 5, has resulted in one of the lowest software defect rates in the industry. This paper will present an overview of the evolution of the Primary Avionics Software System (PASS) project and processes over thirty years, an argument for strong statistical control of software processes with examples, an overview of the success story for identifying and driving out errors before flight, a case study of the few significant software issues and how they were either identified before flight or slipped through the process onto a flight vehicle, and identification of the valuable lessons learned over the life of the project.

  4. 4D modeling of salt-sediment interactions during diapir evolution

    Energy Technology Data Exchange (ETDEWEB)

    Callot, J.P.; Rondon, D.; Letouzey, J. [IFP, Rueil Malmaison (France); Krajewski, P. [Gaz de France-PEG, Lingen (Germany); Rigollet, C. [Gaz de France, St. Denis la Plaine (France)

    2007-09-13

    salt in Oman produce oil, but constitute a major exploration risk due to the large technical difficulties of structural and seismic imagery, the complexity of deciphering their evolutionary steps, and possible unexpected overpressures and occurrences of hydrocarbons. 4D analogue modelling of cylindrical and salt-like diapirs is performed to reproduce the evolution of internal sand layers during diapir growth. The growth and geometry of the salt structure are entirely controlled and driven solely by the overburden deposition. The 3D internal geometry is reconstructed at different steps to show the progressive rise, tearing apart, and fall of the stringer pieces. Complex geometries are observed and compare well with natural examples picked from seismic imagery. It appears that stringers observed in the German salt basin may originate from cap rock pieces detaching from the diapir roof and drowning in the salt. (orig.)

  5. A dependability modeling of software under hardware faults digitized system in nuclear power plants

    International Nuclear Information System (INIS)

    Choi, Jong Gyun

    1996-02-01

    An analytic approach to the dependability evaluation of software in the operational phase is suggested in this work, with special attention to the effects of physical faults on software dependability: the physical faults considered are memory faults, and the dependability measure in question is reliability. The model is based on simple reliability theory and graph theory with the path decomposition micro model. The model represents an application software with a graph consisting of nodes and arcs that probabilistically determine the flow from node to node. Through proper transformation of nodes and arcs, the graph can be reduced to a simple two-node graph, and the software failure probability is derived from this graph. This model can be extended without modification to a software system which consists of several complete modules. The derived model is validated by computer simulation, where the software is transformed into a probabilistic control flow graph. Simulation also shows a different viewpoint of software failure behavior. Using this model, we predict the reliability of an application software and a software system in a digitized system (ILS system) in a nuclear power plant, and show the sensitivity of the software reliability to the major physical parameters which affect software failure in the normal operation phase. 
This modeling method is particularly attractive for medium size programs such as software used in digitized systems of
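    The node-to-node flow model lends itself to a small sketch: software reliability is the probability of reaching the exit node with every visited node fault-free, computed here by fixed-point iteration over a probabilistic control-flow graph. The graph shape, node reliabilities, and iteration count are illustrative assumptions, not the paper's actual model:

    ```python
    def success_probability(transitions, node_reliability,
                            start="entry", exit_node="exit"):
        """Probability of traversing the probabilistic control-flow graph from
        `start` to `exit_node` with every visited node fault-free, solving
        p(n) = r(n) * sum_m P(n -> m) * p(m) by fixed-point iteration
        (which also handles loops in the graph)."""
        nodes = set(node_reliability)
        p = {n: 0.0 for n in nodes}
        p[exit_node] = node_reliability[exit_node]
        for _ in range(200):
            for n in nodes:
                if n == exit_node:
                    continue
                p[n] = node_reliability[n] * sum(
                    w * p[m] for m, w in transitions.get(n, {}).items())
        return p[start]
    ```

    For a straight-line graph the result is simply the product of the node reliabilities, which is the two-node reduction the abstract describes; branches and cycles only change the weighting of paths.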

  6. Statistical Data Processing with R – Metadata Driven Approach

    Directory of Open Access Journals (Sweden)

    Rudi SELJAK

    2016-06-01

    Full Text Available In recent years the Statistical Office of the Republic of Slovenia has put a lot of effort into re-designing its statistical process. We replaced the classical stove-pipe oriented production system with general software solutions based on the metadata-driven approach. This means that one general program code, which is parametrized with process metadata, is used for the data processing of a particular survey. Currently, the general program code is entirely based on SAS macros, but in the future we would like to explore how successfully the statistical software R can be used for this approach. The paper describes the metadata-driven principle for data validation, the generic software solution, and the main issues connected with the use of the statistical software R for this approach.
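    The metadata-driven principle, one generic engine parametrized per survey, can be sketched in a few lines. The rule fields and survey variables below are invented for illustration (the office's actual implementation uses SAS macros, with R under evaluation):

    ```python
    # Process metadata: the only survey-specific part is this rule table.
    RULES = [
        {"field": "turnover", "type": float, "min": 0.0},
        {"field": "employees", "type": int, "min": 0, "max": 10_000},
    ]

    def validate(record, rules=RULES):
        """Generic, metadata-driven edit check: the engine never mentions a
        concrete survey; swapping the rule table re-targets it."""
        errors = []
        for rule in rules:
            value = record.get(rule["field"])
            if not isinstance(value, rule["type"]):
                errors.append((rule["field"], "type"))
                continue
            if "min" in rule and value < rule["min"]:
                errors.append((rule["field"], "min"))
            if "max" in rule and value > rule["max"]:
                errors.append((rule["field"], "max"))
        return errors
    ```

    A clean record passes with no errors; a negative turnover trips the `min` rule without any survey-specific code changing.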

  7. Simulation Modeling of Software Development Processes

    Science.gov (United States)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model representativeness, and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.

  8. Data mining, knowledge discovery and data-driven modelling

    NARCIS (Netherlands)

    Solomatine, D.P.; Velickov, S.; Bhattacharya, B.; Van der Wal, B.

    2003-01-01

    The project was aimed at exploring the possibilities of a new paradigm in modelling - data-driven modelling, often referred as "data mining". Several application areas were considered: sedimentation problems in the Port of Rotterdam, automatic soil classification on the basis of cone penetration

  9. Recombination-Driven Genome Evolution and Stability of Bacterial Species.

    Science.gov (United States)

    Dixit, Purushottam D; Pang, Tin Yau; Maslov, Sergei

    2017-09-01

    While bacteria divide clonally, horizontal gene transfer followed by homologous recombination is now recognized as an important contributor to their evolution. However, the details of how the competition between clonality and recombination shapes genome diversity remain poorly understood. Using a computational model, we find two principal regimes in bacterial evolution and identify two composite parameters that dictate the evolutionary fate of bacterial species. In the divergent regime, characterized by either a low recombination frequency or strict barriers to recombination, cohesion due to recombination is not sufficient to overcome the mutational drift. As a consequence, the divergence between pairs of genomes in the population steadily increases in the course of their evolution. The species lacks genetic coherence, with sexually isolated clonal subpopulations continuously formed and dissolved. In contrast, in the metastable regime, characterized by a high recombination frequency combined with low barriers to recombination, genomes continuously recombine with the rest of the population. The population remains genetically cohesive and temporally stable. Notably, the transition between these two regimes can be affected by relatively small changes in evolutionary parameters. Using the Multi Locus Sequence Typing (MLST) data, we classify a number of bacterial species to be either the divergent or the metastable type. Generalizations of our framework to include selection, ecologically structured populations, and horizontal gene transfer of nonhomologous regions are discussed as well. Copyright © 2017 by the Genetics Society of America.

  10. Framework Programmable Platform for the Advanced Software Development Workstation: Preliminary system design document

    Science.gov (United States)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, John W., IV; Henderson, Richard; Futrell, Michael T.

    1991-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The focus here is on the design of components that make up the FPP. These components serve as supporting systems for the Integration Mechanism and the Framework Processor and provide the 'glue' that ties the FPP together. Also discussed are the components that allow the platform to operate in a distributed, heterogeneous environment and to manage the development and evolution of software system artifacts.

  11. User-driven integrated software lives: ``Paleomag'' paleomagnetics analysis on the Macintosh

    Science.gov (United States)

    Jones, Craig H.

    2002-12-01

    "PaleoMag," a paleomagnetics analysis package originally developed for the Macintosh operating system in 1988, allows examination of demagnetization of individual samples and analysis of directional data from collections of samples. Prior to recent reinvigorated development of the software for both Macintosh and Windows, it was widely used despite not running properly on machines and operating systems sold after 1995. This somewhat surprising situation demonstrates that there is a continued need for integrated analysis software within the earth sciences, in addition to well-developed scripting and batch-mode software. One distinct advantage of software like PaleoMag is in the ability to combine quality control with analysis within a unique graphical environment. Because such demands are frequent within the earth sciences, means of nurturing the development of similar software should be found.

  12. Saphire models and software for ASP evaluations

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    The Idaho National Engineering Laboratory (INEL) over the past three years has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) a user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented

  13. A model-based software development methodology for high-end automotive components

    NARCIS (Netherlands)

    Ravanan, Mahmoud

    2014-01-01

    This report provides a model-based software development methodology for high-end automotive components. The V-model is used as a process model throughout the development of the software platform. It offers a framework that simplifies the relation between requirements, design, implementation,

  14. Menthor Editor: An Ontology-Driven Conceptual Modeling Platform

    NARCIS (Netherlands)

    Moreira, João Luiz; Sales, Tiago Prince; Guerson, John; Braga, Bernardo F.B; Brasileiro, Freddy; Sobral, Vinicius

    2016-01-01

    The lack of well-founded constructs in ontology tools can lead to the construction of non-intended models. In this demonstration we present the Menthor Editor, an ontology-driven conceptual modelling platform which incorporates the theories of the Unified Foundational Ontology (UFO). We illustrate

  15. A process improvement model for software verification and validation

    Science.gov (United States)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  16. Conceptual Software Reliability Prediction Models for Nuclear Power Plant Safety Systems

    International Nuclear Information System (INIS)

    Johnson, G.; Lawrence, D.; Yu, H.

    2000-01-01

    The objective of this project is to develop a method to predict the potential reliability of software to be used in a digital system instrumentation and control system. The reliability prediction is to make use of existing measures of software reliability such as those described in IEEE Std 982 and 982.2. This prediction must be of sufficient accuracy to provide a value for uncertainty that could be used in a nuclear power plant probabilistic risk assessment (PRA). For the purposes of the project, reliability was defined to be the probability that the digital system will successfully perform its intended safety function (for the distribution of conditions under which it is expected to respond) upon demand with no unintended functions that might affect system safety. The ultimate objective is to use the identified measures to develop a method for predicting the potential quantitative reliability of a digital system. The reliability prediction models proposed in this report are conceptual in nature. That is, possible prediction techniques are proposed and trial models are built, but in order to become a useful tool for predicting reliability, the models must be tested, modified according to the results, and validated. Using methods outlined by this project, models could be constructed to develop reliability estimates for elements of software systems. This would require careful review and refinement of the models, development of model parameters from actual experience data or expert elicitation, and careful validation. By combining these reliability estimates (generated from the validated models for the constituent parts) in structural software models, the reliability of the software system could then be predicted. Modeling digital system reliability will also require that methods be developed for combining reliability estimates for hardware and software. 
System structural models must also be developed in order to predict system reliability based upon the reliability
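    Combining validated per-component estimates in a structural model typically reduces to series/parallel composition; a minimal sketch (the component reliability values are placeholders, not project data):

    ```python
    def series(*rs):
        """All components must succeed, e.g. a chain of software modules
        or hardware in series with its software."""
        p = 1.0
        for r in rs:
            p *= r
        return p

    def parallel(*rs):
        """Redundant components: the subsystem fails only if every
        channel fails."""
        q = 1.0
        for r in rs:
            q *= (1.0 - r)
        return 1.0 - q

    # Hypothetical digital channel: hardware in series with two redundant
    # software implementations.
    system = series(0.95, parallel(0.9, 0.9))
    ```

    This is the kind of combination step the report defers to validated structural models; the element-level values would come from the trial prediction models once refined.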

  17. Driven Quantum Dynamics: Will It Blend?

    Directory of Open Access Journals (Sweden)

    Leonardo Banchi

    2017-10-01

    Full Text Available Randomness is an essential tool in many disciplines of modern sciences, such as cryptography, black hole physics, random matrix theory, and Monte Carlo sampling. In quantum systems, random operations can be obtained via random circuits thanks to so-called q-designs and play a central role in condensed-matter physics and in the fast scrambling conjecture for black holes. Here, we consider a more physically motivated way of generating random evolutions by exploiting the many-body dynamics of a quantum system driven with stochastic external pulses. We combine techniques from quantum control, open quantum systems, and exactly solvable models (via the Bethe ansatz to generate Haar-uniform random operations in driven many-body systems. We show that any fully controllable system converges to a unitary q-design in the long-time limit. Moreover, we study the convergence time of a driven spin chain by mapping its random evolution into a semigroup with an integrable Liouvillian and finding its gap. Remarkably, we find via Bethe-ansatz techniques that the gap is independent of q. We use mean-field techniques to argue that this property may be typical for other controllable systems, although we explicitly construct counterexamples via symmetry-breaking arguments to show that this is not always the case. Our findings open up new physical methods to transform classical randomness into quantum randomness, via a combination of quantum many-body dynamics and random driving.

  18. A SOFTWARE RELIABILITY ESTIMATION METHOD TO NUCLEAR SAFETY SOFTWARE

    Directory of Open Access Journals (Sweden)

    GEE-YONG PARK

    2014-02-01

    Full Text Available A method for estimating software reliability for nuclear safety software is proposed in this paper. This method is based on the software reliability growth model (SRGM), where the behavior of software failure is assumed to follow a non-homogeneous Poisson process. Two types of modeling schemes based on a particular underlying method are proposed in order to more precisely estimate and predict the number of software defects based on very rare software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating software test cases as a covariate into the model. It was identified that these models are capable of reasonably estimating the remaining number of software defects, which directly affects the reactor trip functions. The software reliability can be estimated from these modeling equations, and one approach to obtaining a software reliability value is proposed in this paper.
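The non-homogeneous Poisson process assumption can be illustrated with the classic Goel-Okumoto mean value function m(t) = a(1 − e^(−bt)); the remaining-defect estimate the abstract mentions is then a − m(t). The parameter values below are hypothetical, and the paper's Bayesian estimation with test-case covariates is not reproduced here:

```python
import math

# Hypothetical parameters: expected total defects a, detection rate b
a, b = 100.0, 0.05

def mean_value(t):
    """Goel-Okumoto NHPP mean value function: expected defects found by time t."""
    return a * (1.0 - math.exp(-b * t))

detected = mean_value(30.0)                       # expected defects found by t = 30
remaining = a - detected                          # expected residual defects
failure_intensity = a * b * math.exp(-b * 30.0)   # lambda(t) = m'(t)
```

In an SRGM the remaining-defect estimate, together with the failure intensity, is what feeds a reliability (probability of failure-free operation) figure.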

  19. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois

    2010-01-01

    The European STREP project Quasimodo1 develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made, focus...

  20. Creating a simulation model of software testing using Simulink package

    Directory of Open Access Journals (Sweden)

    V. M. Dubovoi

    2016-12-01

    Full Text Available Determining a model of software testing that allows prediction of both the whole process and its specific stages is a pressing problem for the IT industry. The article focuses on solving this problem. The aim of the article is to predict the time and improve the quality of software testing. The analysis of the software testing process shows that it can be classified as a branched cyclic technological process, because it is cyclical with decision-making on control operations. The investigation builds on the authors' previous work and a software testing process method based on a Markov model. The proposed method enables prediction for each software module, which leads to better decision-making for each controlled suboperation of the whole process. A Simulink simulation model demonstrates the implementation and verification of the proposed technique. The results of the research have been practically implemented in the IT industry.
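A branched cyclic testing process of this kind can be captured by an absorbing Markov chain; the expected number of steps to completion then follows from the fundamental matrix N = (I − Q)⁻¹, where Q holds transitions among the transient states. The transition probabilities below are invented for illustration, not taken from the article:

```python
import numpy as np

# Transient states: 0 = "test", 1 = "fix"; absorbing state: "done".
# Hypothetical transitions: test -> fix with 0.7 (defect found),
# test -> done with 0.3; fix -> test with 1.0 (retest after fixing).
Q = np.array([[0.0, 0.7],
              [1.0, 0.0]])

N = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix
expected_steps = N.sum(axis=1)     # expected steps to absorption from each state
```

Here the expected number of test/fix cycles starting from "test" is (1 + 0.7)/0.3 ≈ 5.67 steps, which is the kind of per-module time prediction the abstract describes.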

  1. Modeling of Some Chaotic Systems with AnyLogic Software

    Directory of Open Access Journals (Sweden)

    Biljana Zlatanovska

    2018-05-01

    Full Text Available Chaotic systems are well known in the theory of chaos. In our paper the following chaotic systems are analyzed: the Rossler, Chua and Chen systems. All of them are systems of ordinary differential equations. Their graphical representation as continuous dynamical systems via the mathematical software Mathematica and MatLab is already known. Using computer simulations and examples, the systems are analyzed with the AnyLogic software. We would like to present how ordinary differential equations are modeled with AnyLogic, one of the simplest software tools for this purpose.
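For readers without AnyLogic, the same kind of continuous-dynamical-system run can be sketched in a few lines; below the Rossler system is integrated with a hand-rolled RK4 stepper at the classic chaotic parameters a = b = 0.2, c = 5.7 (initial condition chosen arbitrarily):

```python
import numpy as np

def rossler(state, a=0.2, b=0.2, c=5.7):
    """Right-hand side of the Rossler system (classic chaotic parameters)."""
    x, y, z = state
    return np.array([-y - z, x + a * y, b + z * (x - c)])

def rk4_step(f, state, dt):
    """One fourth-order Runge-Kutta step."""
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

dt, steps = 0.01, 10000
trajectory = np.empty((steps, 3))
state = np.array([1.0, 1.0, 1.0])
for i in range(steps):
    state = rk4_step(rossler, state, dt)
    trajectory[i] = state
```

Plotting `trajectory[:, 0]` against `trajectory[:, 1]` reveals the familiar Rossler attractor; the Chua and Chen systems differ only in the right-hand-side function.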

  2. Development of a software for the curimeter model cdn102

    International Nuclear Information System (INIS)

    Dotres Llera, Armando

    2001-01-01

    The characteristics of the software for the Curimeter Model CD-N102 developed at CEADEN are presented. The software consists of two main parts: a basic software for the electrometer block and an application software for a PC. The basic software is totally independent of the PC and performs all the basic functions of the measurement process. The application software is optional and offers a friendlier interface and additional options to the user. Among these is the possibility to keep a statistical record of the measurements in a database, to create labels, and to introduce new isotopes and calibrate them. A more detailed explanation of both software parts is given.

  3. An Empirical Study of a Free Software Company

    OpenAIRE

    Pakusch, Cato

    2010-01-01

    Free software has matured well into the commercial software market, yet little qualitative research exists which accurately describes the state of commercial free software today. For this thesis, an instrumental case study was performed on a prominent free software company in Norway. The study found that the commercial free software market is largely driven by social networks, which have a social capital in its own that attracts more people, which in turn become members of the ...

  4. Modeling and managing risk early in software development

    Science.gov (United States)

    Briand, Lionel C.; Thomas, William M.; Hetmanski, Christopher J.

    1993-01-01

    In order to improve the quality of the software development process, we need to be able to build empirical multivariate models based on data collectable early in the software process. These models need to be both useful for prediction and easy to interpret, so that remedial actions may be taken in order to control and optimize the development process. We present an automated modeling technique which can be used as an alternative to regression techniques. We show how it can be used to facilitate the identification and aid the interpretation of the significant trends which characterize 'high risk' components in several Ada systems. Finally, we evaluate the effectiveness of our technique based on a comparison with logistic regression based models.
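The logistic-regression baseline mentioned for comparison can be sketched in a few lines of plain numpy; the component "metrics" below are synthetic stand-ins for the early-lifecycle measures the abstract refers to:

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic component metrics: low-risk components near 0, high-risk near 3.
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)),
               rng.normal(3.0, 1.0, (50, 2))])
y = np.concatenate([np.zeros(50), np.ones(50)])  # 1 = "high risk"

w, bias = np.zeros(2), 0.0
for _ in range(500):  # plain gradient descent on the logistic loss
    p = 1.0 / (1.0 + np.exp(-(X @ w + bias)))
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    bias -= 0.5 * float(np.mean(p - y))

pred = (1.0 / (1.0 + np.exp(-(X @ w + bias)))) > 0.5
accuracy = float(np.mean(pred == y))
```

The paper's point is that its automated modeling technique is easier to interpret than coefficient vectors like `w`; this sketch only shows the regression side of that comparison.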

  5. Contemporary Ecological Interactions Improve Models of Past Trait Evolution.

    Science.gov (United States)

    Hutchinson, Matthew C; Gaiarsa, Marília P; Stouffer, Daniel B

    2018-02-20

    Despite the fact that natural selection underlies both traits and interactions, evolutionary models often neglect that ecological interactions may, and in many cases do, influence the evolution of traits. Here, we explore the interdependence of ecological interactions and functional traits in the pollination associations of hawkmoths and flowering plants. Specifically, we develop an adaptation of the Ornstein-Uhlenbeck model of trait evolution that allows us to study the influence of plant corolla depth and observed hawkmoth-plant interactions on the evolution of hawkmoth proboscis length. Across diverse modelling scenarios, we find that the inclusion of contemporary interactions can provide a better description of trait evolution than the null expectation. Moreover, we show that the pollination interactions provide more likely models of hawkmoth trait evolution when interactions are considered at increasingly fine-scale groups of hawkmoths. Finally, we demonstrate how the results of best-fit modelling approaches can implicitly support the association between interactions and trait evolution that our method explicitly examines. In showing that contemporary interactions can provide insight into the historical evolution of hawkmoth proboscis length, we demonstrate the clear utility of incorporating additional ecological information to models designed to study past trait evolution.
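The Ornstein-Uhlenbeck model underlying this work describes a trait pulled toward an optimum, dX = θ(μ − X)dt + σ dW. A minimal Euler-Maruyama simulation (parameter values are invented, with μ standing in for an optimal proboscis length; this is not the authors' phylogenetic fitting procedure):

```python
import numpy as np

rng = np.random.default_rng(1)
theta, mu, sigma = 1.0, 2.0, 0.1   # hypothetical: pull strength, trait optimum, noise
dt, steps, n_lineages = 0.01, 1000, 200

X = np.zeros(n_lineages)           # all lineages start away from the optimum
for _ in range(steps):             # Euler-Maruyama discretization of the OU SDE
    X += theta * (mu - X) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_lineages)

terminal_mean = float(X.mean())    # should settle near the optimum mu
```

In the paper's adaptation, the optimum μ is informed by corolla depths of the plants each hawkmoth group actually visits, which is what lets contemporary interactions improve the fit.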

  6. Scientists' Needs in Software Ecosystem Modeling

    NARCIS (Netherlands)

    Jansen, Slinger; Handoyo, Eko; Alves, C.

    2015-01-01

    Currently the landscape of software ecosystem modelling methods and languages is like Babel after the fall of the tower: there are many methods and languages available and interchanging data between researchers and organizations that actively govern their ecosystem, is practically impossible. The

  7. The Biological Big Bang model for the major transitions in evolution.

    Science.gov (United States)

    Koonin, Eugene V

    2007-08-20

    Major transitions in biological evolution show the same pattern of sudden emergence of diverse forms at a new level of complexity. The relationships between major groups within an emergent new class of biological entities are hard to decipher and do not seem to fit the tree pattern that, following Darwin's original proposal, remains the dominant description of biological evolution. The cases in point include the origin of complex RNA molecules and protein folds; major groups of viruses; archaea and bacteria, and the principal lineages within each of these prokaryotic domains; eukaryotic supergroups; and animal phyla. In each of these pivotal nexuses in life's history, the principal "types" seem to appear rapidly and fully equipped with the signature features of the respective new level of biological organization. No intermediate "grades" or intermediate forms between different types are detectable. Usually, this pattern is attributed to cladogenesis compressed in time, combined with the inevitable erosion of the phylogenetic signal. I propose that most or all major evolutionary transitions that show the "explosive" pattern of emergence of new types of biological entities correspond to a boundary between two qualitatively distinct evolutionary phases. The first, inflationary phase is characterized by extremely rapid evolution driven by various processes of genetic information exchange, such as horizontal gene transfer, recombination, fusion, fission, and spread of mobile elements. These processes give rise to a vast diversity of forms from which the main classes of entities at the new level of complexity emerge independently, through a sampling process. In the second phase, evolution dramatically slows down, the respective process of genetic information exchange tapers off, and multiple lineages of the new type of entities emerge, each of them evolving in a tree-like fashion from that point on. 
This biphasic model of evolution incorporates the previously developed

  8. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
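The kind of fine-grained "unit" test advocated here checks one numerical kernel in isolation against a known analytic answer, with a tolerance chosen from the method's truncation error rather than an arbitrary epsilon. A minimal pytest-style sketch (the kernel is a generic finite-difference routine, not code from any climate model):

```python
import math

def central_difference(f, x, h=1e-5):
    """Second-order central finite-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

def test_derivative_of_sin():
    # Analytic truth: d/dx sin(x) = cos(x).
    # The tolerance reflects the O(h^2) truncation plus rounding error.
    approx = central_difference(math.sin, 1.0)
    assert abs(approx - math.cos(1.0)) < 1e-9

test_derivative_of_sin()
```

The point of the article is that such tests localize a defect to one kernel immediately, instead of requiring a full simulation run and system-level output comparison.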

  9. Reconfigurable network systems and software-defined networking

    OpenAIRE

    Zilberman, N.; Watts, P. M.; Rotsos, C.; Moore, A. W.

    2015-01-01

    Modern high-speed networks have evolved from relatively static networks to highly adaptive networks facilitating dynamic reconfiguration. This evolution has influenced all levels of network design and management, introducing increased programmability and configuration flexibility. This influence has extended from the lowest level of physical hardware interfaces to the highest level of network management by software. A key representative of this evolution is the emergence of software-defined n...

  10. ARC Software and Models

    Science.gov (United States)

    Most research conducted at the ARC produces software code and methodologies that are transferred to TARDEC and industry partners. These

  11. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: Earth System Modeling Software Framework Survey

    Science.gov (United States)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn; Zukor, Dorothy (Technical Monitor)

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document surveys numerous software frameworks for potential use in Earth science modeling. Several frameworks are evaluated in depth, including Parallel Object-Oriented Methods and Applications (POOMA), Cactus (from the relativistic physics community), Overture, Goddard Earth Modeling System (GEMS), the National Center for Atmospheric Research Flux Coupler, and UCLA/UCB Distributed Data Broker (DDB). Frameworks evaluated in less detail include ROOT, Parallel Application Workspace (PAWS), and Advanced Large-Scale Integrated Computational Environment (ALICE). A host of other frameworks and related tools are referenced in this context. The frameworks are evaluated individually and also compared with each other.

  12. Mathematical modeling of compression processes in air-driven boosters

    International Nuclear Information System (INIS)

    Li Zeyu; Zhao Yuanyang; Li Liansheng; Shu Pengcheng

    2007-01-01

    Compressed air at normal pressure is used as the power source of the air-driven booster. The continuous operation of air-driven boosters relies on the difference in surface area between the driven piston and the driving piston, i.e., the different forces acting on the pistons. When the working surface area of the driving piston providing power is greater than that of the driven piston compressing gas, the gas in the compression chamber is compressed. On the basis of the first law of thermodynamics, the motion of the piston is analyzed and a mathematical model of the compression process is set up. Through a calculation example, the variation of gas pressure and piston motion during the working process of the booster is obtained. The change of parameters at different working conditions is also calculated and compared. The corresponding results can serve as a reference in the design of air-driven boosters.
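The force balance described, supply pressure on the driving piston area against boosted pressure on the driven piston area, fixes the ideal boost ratio when friction and piston dynamics are neglected. The numbers below are illustrative, not from the paper's model:

```python
# Ideal static force balance on the coupled pistons of an air-driven booster:
#   P_supply * A_driving = P_boost * A_driven   (friction and dynamics neglected)
p_supply = 0.6e6      # Pa, hypothetical normal-pressure air supply
a_driving = 8.0e-3    # m^2, working surface area of the driving piston
a_driven = 2.0e-3     # m^2, working surface area of the driven piston

boost_ratio = a_driving / a_driven
p_boost_max = p_supply * boost_ratio   # maximum attainable discharge pressure
```

The paper's thermodynamic model goes further, tracking piston motion and in-chamber pressure over the cycle, but this static ratio is the design quantity the area difference buys.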

  13. A software complex intended for constructing applied models and meta-models on the basis of mathematical programming principles

    Directory of Open Access Journals (Sweden)

    Михаил Юрьевич Чернышов

    2013-12-01

    Full Text Available A software complex (SC) elaborated by the authors on the basis of the language LMPL and representing a software tool intended for synthesis of applied software models and meta-models constructed on the basis of mathematical programming (MP) principles is described. LMPL provides for an explicit form of declarative representation of MP-models, presumes automatic constructing and transformation of models and the capability of adding external software packages. The following software versions of the SC have been implemented: (1) a SC intended for representing the process of choosing an optimal hydroelectric power plant model (on the principles of meta-modeling) and (2) a SC intended for representing the logic-sense relations between the models of a set of discourse formations in the discourse meta-model.

  14. On the Problem of Attribute Selection for Software Cost Estimation: Input Backward Elimination Using Artificial Neural Networks

    OpenAIRE

    Papatheocharous , Efi; Andreou , Andreas S.

    2010-01-01

    International audience; Many parameters affect the cost evolution of software projects. In the area of software cost estimation and project management the main challenge is to understand and quantify the effect of these parameters, or 'cost drivers', on the effort expended to develop software systems. This paper aims at investigating the effect of cost attributes on software development effort using empirical databases of completed projects and building Artificial Neural Network (ANN) models ...

  15. Functional evolution of leptin of Ochotona curzoniae in adaptive thermogenesis driven by cold environmental stress.

    Directory of Open Access Journals (Sweden)

    Jie Yang

    Full Text Available BACKGROUND: Environmental stress can accelerate the directional selection and evolutionary rate of specific stress-response proteins to bring about new or altered functions, enhancing an organism's fitness to challenging environments. Plateau pika (Ochotona curzoniae), an endemic and keystone species on Qinghai-Tibetan Plateau, is a high hypoxia and low temperature tolerant mammal with high resting metabolic rate and non-shivering thermogenesis to cope in this harsh plateau environment. Leptin is a key hormone related to how these animals regulate energy homeostasis. Previous molecular evolutionary analysis helped to generate the hypothesis that adaptive evolution of plateau pika leptin may be driven by cold stress. METHODOLOGY/PRINCIPAL FINDINGS: To test the hypothesis, recombinant pika leptin was first purified. The thermogenic characteristics of C57BL/6J mice injected with pika leptin under warm (23±1°C) and cold (5±1°C) acclimation are investigated. Expression levels of genes regulating adaptive thermogenesis in brown adipose tissue and the hypothalamus are compared between pika leptin and human leptin treatment, suggesting that pika leptin has adaptively and functionally evolved. Our results show that pika leptin regulates energy homeostasis via reduced food intake and increased energy expenditure under both warm and cold conditions. Compared with human leptin, pika leptin demonstrates a superior induced capacity for adaptive thermogenesis, which is reflected in a more enhanced β-oxidation, mitochondrial biogenesis and heat production. Moreover, leptin treatment combined with cold stimulation has a significant synergistic effect on adaptive thermogenesis, more so than is observed with a single cold exposure or single leptin treatment. CONCLUSIONS/SIGNIFICANCE: These findings support the hypothesis that cold stress has driven the functional evolution of plateau pika leptin as an ecological adaptation to the Qinghai-Tibetan Plateau.

  16. Customizing Standard Software as a Business Model in the IT Industry

    DEFF Research Database (Denmark)

    Kautz, Karlheinz; Rab, Sameen M.; Sinnet, Michael

    2011-01-01

    This research studies a new business model in the IT industry, the customization of standard software as the sole foundation for a software company's earnings. Based on a theoretical background which combines the concepts of inter-organizational networks and open innovation, we provide an interpretive case study of a small software company which customizes a standard product. We investigate the company's interactions with a large global software company which is the producer of the original software product and with other companies which are involved in the software customization process. We … primarily on complex, formal partnerships, in which also opportunistic behavior occurs and where informal relations are invaluable sources of knowledge. In addition, the original software producer's view and treatment of these companies has a vital impact on the customizing company's practice which…

  17. EVOLUTION OF GASEOUS DISK VISCOSITY DRIVEN BY SUPERNOVA EXPLOSION. II. STRUCTURE AND EMISSIONS FROM STAR-FORMING GALAXIES AT HIGH REDSHIFT

    International Nuclear Information System (INIS)

    Yan Changshuo; Wang Jianmin

    2010-01-01

    High spatial resolution observations show that high-redshift galaxies are undergoing intensive evolution of dynamical structure and morphologies displayed by the Hα, Hβ, [O III], and [N II] images. It has been shown that supernova explosion (SNexp) of young massive stars during the star formation epoch, as kinetic feedback to host galaxies, can efficiently excite the turbulent viscosity. We incorporate the feedback into the dynamical equations through mass dropout and angular momentum transportation driven by the SNexp-excited turbulent viscosity. The empirical Kennicutt-Schmidt law is used for star formation rates (SFRs). We numerically solve the equations and show that there can be intensive evolution of structure of the gaseous disk. Secular evolution of the disk shows interesting characteristics: (1) high viscosity excited by SNexp can efficiently transport the gas from 10 kpc to ∼1 kpc forming a stellar disk whereas a stellar ring forms for the case with low viscosity; (2) starbursts trigger SMBH activity with a lag of ∼10^8 yr depending on SFRs, prompting the joint evolution of SMBHs and bulges; and (3) the velocity dispersion is as high as ∼100 km s^-1 in the gaseous disk. These results are likely to vary with the initial mass function (IMF) that the SNexp rates rely on. Given the IMF, we use the GALAXEV code to compute the spectral evolution of stellar populations based on the dynamical structure. In order to compare the present models with the observed dynamical structure and images, we use the incident continuum from the simple stellar synthesis and CLOUDY to calculate emission line ratios of Hα, Hβ, [O III], and [N II], and Hα brightness of gas photoionized by young massive stars formed on the disks. The models can produce the main features of emission from star-forming galaxies. We apply the present model to two galaxies, BX 389 and BX 482 observed in the SINS high-z sample, which are bulge and disk-dominated, respectively. Two successive

  18. Model-driven requirements engineering (MDRE) for real-time ultra-wide instantaneous bandwidth signal simulation

    Science.gov (United States)

    Chang, Daniel Y.; Rowe, Neil C.

    2013-05-01

    While conducting cutting-edge research in a specific domain, we realize that (1) requirements clarity and correctness are crucial to our success [1], (2) hardware is hard to change, so most work is in software requirements development, coding and testing [2], (3) requirements are constantly changing, so that configurability, reusability, scalability, adaptability, modularity and testability are important non-functional attributes [3], (4) cross-domain knowledge is necessary for complex systems [4], and (5) if our research is successful, the results could be applied to other domains with similar problems. In this paper, we propose to use model-driven requirements engineering (MDRE) to model and guide our requirements/development, since models are easy to understand, execute, and modify. The domain for our research is Electronic Warfare (EW) real-time ultra-wide instantaneous bandwidth (IBW1) signal simulation. The proposed four MDRE models are (1) Switch-and-Filter architecture, (2) multiple parallel data bit streams alignment, (3) post-ADC and pre-DAC bits re-mapping, and (4) Discrete Fourier Transform (DFT) filter bank. This research is unique since the instantaneous bandwidth we are dealing with is in gigahertz range instead of conventional megahertz.
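In its simplest form, the DFT filter bank of model (4) amounts to block-wise DFTs that split a wideband stream into K subbands. A toy numpy sketch with a complex test tone placed exactly in one channel (a rectangular window; practical channelizers add polyphase filtering, which is not shown here):

```python
import numpy as np

K = 8                                   # number of frequency channels
n = np.arange(1024)
x = np.exp(2j * np.pi * 2 * n / K)      # test tone sitting exactly in channel 2

frames = x.reshape(-1, K)               # block-wise processing, K samples per block
channels = np.fft.fft(frames, axis=1)   # simplest DFT filter bank
energy = np.abs(channels).sum(axis=0)   # per-channel energy across all blocks
dominant_channel = int(np.argmax(energy))
```

Each row of `channels` is one time slice of K subband samples; a gigahertz-rate simulator would run many such channelizers in parallel across interleaved ADC streams.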

  19. Towards a sufficiency-driven business model : Experiences and opportunities

    NARCIS (Netherlands)

    Bocken, N.M.P.; Short, SW

    2016-01-01

    Business model innovation is an important lever for change to tackle pressing sustainability issues. In this paper, ‘sufficiency’ is proposed as a driver of business model innovation for sustainability. Sufficiency-driven business models seek to moderate overall resource consumption by curbing

  20. Design Driven Testing Test Smarter, Not Harder

    CERN Document Server

    Stephens, M

    2010-01-01

    The groundbreaking book Design Driven Testing brings sanity back to the software development process by flipping around the concept of Test Driven Development (TDD) - restoring the concept of using testing to verify a design instead of pretending that unit tests are a replacement for design. Anyone who feels that TDD is "Too Damn Difficult" will appreciate this book. Design Driven Testing shows that, by combining a forward-thinking development process with cutting-edge automation, testing can be a finely targeted, business-driven, rewarding effort. In other words, you'll learn how to test

  1. Challenges in software ecosystems research

    NARCIS (Netherlands)

    Serebrenik, A.; Mens, T.; Crnkovic, I.

    2015-01-01

    The paper is a meta-analysis of the research field of software ecosystems, by method of surveying 26 authors in the field. It presents a relevant list of literature and six themes in which challenges for software ecosystems can be grouped: Architecture and Design, Governance, Dynamics and Evolution,

  2. Assessing and improving the quality of model transformations

    NARCIS (Netherlands)

    Amstel, van M.F.

    2012-01-01

    Software is pervading our society more and more and is becoming increasingly complex. At the same time, software quality demands remain at the same, high level. Model-driven engineering (MDE) is a software engineering paradigm that aims at dealing with this increasing software complexity and

  3. A Model-Driven Approach to e-Course Management

    Science.gov (United States)

    Savic, Goran; Segedinac, Milan; Milenkovic, Dušica; Hrin, Tamara; Segedinac, Mirjana

    2018-01-01

    This paper presents research on using a model-driven approach to the development and management of electronic courses. We propose a course management system which stores a course model represented as distinct machine-readable components containing domain knowledge of different course aspects. Based on this formally defined platform-independent…

  4. LHCb software strategy

    CERN Document Server

    Van Herwijnen, Eric

    1998-01-01

    This document describes the software strategy of the LHCb experiment. The main objective is to reuse designs and code wherever possible; We will implement an architecturally driven design process; This architectural process will be implemented using Object Technology; We aim for platform independence; try to take advantage of distributed computing and will use industry standards, commercial software and profit from HEP developments; We will implement a common software process and development environment. One of the major problems that we are immediately faced with is the conversion of our current code from Fortran into an Object Oriented language and the conversion of our current developers to Object technology. Some technical terms related to OO programming are defined in Annex A.1

  5. Generic analysis of kinetically driven inflation

    Science.gov (United States)

    Saitou, Rio

    2018-04-01

    We perform a model-independent analysis of kinetically driven inflation (KDI) which (partially) includes generalized G-inflation and ghost inflation. We evaluate the background evolution splitting into the inflationary attractor and the perturbation around it. We also consider the quantum fluctuation of the scalar mode with a usual scaling and derive the spectral index, ignoring the contribution from the second-order products of slow-roll parameters. Using these formalisms, we find that within our generic framework the models of KDI which possess the shift symmetry of scalar field cannot create the quantum fluctuation consistent with the observation. Breaking the shift symmetry, we obtain a few essential conditions for viable models of KDI associated with the graceful exit.

  6. Predator-driven brain size evolution in natural populations of Trinidadian killifish (Rivulus hartii)

    Science.gov (United States)

    Walsh, Matthew R.; Broyles, Whitnee; Beston, Shannon M.; Munch, Stephan B.

    2016-01-01

    Vertebrates exhibit extensive variation in relative brain size. It has long been assumed that this variation is the product of ecologically driven natural selection. Yet, despite more than 100 years of research, the ecological conditions that select for changes in brain size are unclear. Recent laboratory selection experiments showed that selection for larger brains is associated with increased survival in risky environments. Such results lead to the prediction that increased predation should favour increased brain size. Work on natural populations, however, foreshadows the opposite trajectory of evolution; increased predation favours increased boldness, slower learning, and may thereby select for a smaller brain. We tested the influence of predator-induced mortality on brain size evolution by quantifying brain size variation in a Trinidadian killifish, Rivulus hartii, from communities that differ in predation intensity. We observed strong genetic differences in male (but not female) brain size between fish communities; second generation laboratory-reared males from sites with predators exhibited smaller brains than Rivulus from sites in which they are the only fish present. Such trends oppose the results of recent laboratory selection experiments and are not explained by trade-offs with other components of fitness. Our results suggest that increased male brain size is favoured in less risky environments because of the fitness benefits associated with faster rates of learning and problem-solving behaviour. PMID:27412278

  7. Generating Protocol Software from CPN Models Annotated with Pragmatics

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars M.; Kindler, Ekkart

    2013-01-01

    and verify protocol software, but limited work exists on using CPN models of protocols as a basis for automated code generation. The contribution of this paper is a method for generating protocol software from a class of CPN models annotated with code generation pragmatics. Our code generation method consists of three main steps: automatically adding so-called derived pragmatics to the CPN model, computing an abstract template tree, which associates pragmatics with code templates, and applying the templates to generate code which can then be compiled. We illustrate our method using a unidirectional...

  8. Software life cycle dynamic simulation model: The organizational performance submodel

    Science.gov (United States)

    Tausworthe, Robert C.

    1985-01-01

    The submodel structure of a software life cycle dynamic simulation model is described. The software process is divided into seven phases, each with product, staff, and funding flows. The model is subdivided into an organizational response submodel, a management submodel, a management influence interface, and a model analyst interface. The concentration here is on the organizational response model, which simulates the performance characteristics of a software development subject to external and internal influences. These influences emanate from two sources: the model analyst interface, which configures the model to simulate the response of an implementing organization subject to its own internal influences, and the management submodel that exerts external dynamic control over the production process. A complete characterization is given of the organizational response submodel in the form of parameterized differential equations governing product, staffing, and funding levels. The parameter values and functions are allocated to the two interfaces.
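The parameterized differential equations the submodel description mentions (product, staffing, and funding flows) can be sketched with a toy Euler integration. The equations and constants below are invented stand-ins for illustration, not the submodel's actual formulation:

```python
# Toy Euler integration of staffing/product flows (hypothetical stand-in
# for the organizational response submodel's parameterized equations).
dt, steps = 0.1, 500
staff_target, ramp_rate, productivity = 10.0, 0.5, 1.0

staff, product = 0.0, 0.0
history = []
for _ in range(steps):
    d_staff = ramp_rate * (staff_target - staff)   # staffing ramps toward its target
    d_product = productivity * staff               # product accrues with staff level
    staff += d_staff * dt
    product += d_product * dt
    history.append(product)
```

In the real submodel, management influences would perturb quantities like `staff_target` dynamically through the management influence interface, which this sketch omits.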

  9. Software Piracy Detection Model Using Ant Colony Optimization Algorithm

    Science.gov (United States)

    Astiqah Omar, Nor; Zakuan, Zeti Zuryani Mohd; Saian, Rizauddin

    2017-06-01

The internet enables information to be accessible anytime and anywhere. This scenario creates an environment in which information can be easily copied. Easy access to the internet is one of the factors contributing to piracy in Malaysia as well as in the rest of the world. The 2013 Compliance Gap BSA Global Software Survey found that 43 percent of the software installed on PCs around the world was not properly licensed, and the commercial value of the unlicensed installations worldwide was reported to be 62.7 billion. Piracy can happen anywhere, including universities. Malaysia, like other countries in the world, faces piracy committed by university students. Piracy in universities concerns acts of stealing intellectual property, whether software piracy, music piracy, movie piracy, or piracy of intellectual materials such as books, articles and journals. This scenario affects the owners of intellectual property, whose property is put in jeopardy. This study developed a classification model for detecting software piracy. The model was built using a swarm intelligence algorithm, the Ant Colony Optimization algorithm. The training data were collected in a study conducted at Universiti Teknologi MARA (Perlis). Experimental results show that the model's detection accuracy is better than that of the J48 algorithm.
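To illustrate how ant colony optimization can drive rule discovery for classification, here is a heavily simplified, Ant-Miner-style sketch on an invented toy dataset. The attributes, quality measure, and parameters are hypothetical and far simpler than the model in the paper: each "ant" picks a candidate rule term by roulette-wheel selection over pheromone levels, and pheromone is reinforced in proportion to rule quality.

```python
import random

# Minimal ACO rule-discovery sketch (hypothetical toy data).
random.seed(1)

# Toy dataset: (attributes, label).
data = [({"licensed": 0, "student": 1}, "pirate"),
        ({"licensed": 0, "student": 0}, "pirate"),
        ({"licensed": 1, "student": 1}, "legal"),
        ({"licensed": 1, "student": 0}, "legal")]
terms = [("licensed", 0), ("licensed", 1), ("student", 0), ("student", 1)]
pheromone = {t: 1.0 for t in terms}

def quality(term, label):
    """Rule quality = precision * coverage of 'IF term THEN label'."""
    covered = [d for d in data if d[0][term[0]] == term[1]]
    if not covered:
        return 0.0
    correct = sum(1 for d in covered if d[1] == label)
    return (correct / len(covered)) * (len(covered) / len(data))

for _ in range(50):                      # one rule attempt per ant
    total = sum(pheromone.values())
    r, acc = random.uniform(0, total), 0.0
    chosen = terms[-1]
    for t in terms:                      # roulette-wheel term selection
        acc += pheromone[t]
        if acc >= r:
            chosen = t
            break
    pheromone[chosen] += quality(chosen, "pirate")  # reinforce
    for t in terms:                                  # evaporate
        pheromone[t] *= 0.95

best = max(terms, key=pheromone.get)
```

After the ants finish, the highest-pheromone term corresponds to the most predictive rule condition found for the "pirate" class.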

  10. Theory of resistivity-gradient-driven turbulence

    International Nuclear Information System (INIS)

    Garcia, L.; Carreras, B.A.; Diamond, P.H.; Callen, J.D.

    1984-10-01

    A theory of the nonlinear evolution and saturation of resistivity-driven turbulence, which evolves from linear rippling instabilities, is presented. The nonlinear saturation mechanism is identified both analytically and numerically. Saturation occurs when the turbulent diffusion of the resistivity is large enough so that dissipation due to parallel electron thermal conduction balances the nonlinearly modified resistivity gradient driving term. The levels of potential, resistivity, and density fluctuations at saturation are calculated. A combination of computational modeling and analytic treatment is used in this investigation

  11. Evolution of Industry Knowledge in the Public Domain: Prior Art Searching for Software Patents

    Directory of Open Access Journals (Sweden)

    Jinseok Park

    2005-03-01

    Full Text Available Searching prior art is a key part of the patent application and examination processes. A comprehensive prior art search gives the inventor ideas as to how he can improve or circumvent existing technology by providing up to date knowledge on the state of the art. It also enables the patent applicant to minimise the likelihood of an objection from the patent office. This article explores the characteristics of prior art associated with software patents, dealing with difficulties in searching prior art due to the lack of resources, and considers public contribution to the formation of prior art databases. It addresses the evolution of electronic prior art in line with technological development, and discusses laws and practices in the EPO, USPTO, and the JPO in relation to the validity of prior art resources on the Internet. This article also investigates the main features of searching sources and tools in the three patent offices as well as non-patent literature databases. Based on the analysis of various searching databases, it provides some strategies of efficient prior art searching that should be considered for software-related inventions.

  12. Improving Agile Software Practice

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte

    2006-01-01

Software process improvement in small and agile organizations is often problematic, but achieving good SPI assessments can still be necessary to stay in the market or to meet the demands of multinational owners. The traditional norm-driven, centralized and control-centered improvement approaches have…

  13. Numerical Modeling of Large-Scale Rocky Coastline Evolution

    Science.gov (United States)

    Limber, P.; Murray, A. B.; Littlewood, R.; Valvo, L.

    2008-12-01

    Seventy-five percent of the world's ocean coastline is rocky. On large scales (i.e. greater than a kilometer), many intertwined processes drive rocky coastline evolution, including coastal erosion and sediment transport, tectonics, antecedent topography, and variations in sea cliff lithology. In areas such as California, an additional aspect of rocky coastline evolution involves submarine canyons that cut across the continental shelf and extend into the nearshore zone. These types of canyons intercept alongshore sediment transport and flush sand to abyssal depths during periodic turbidity currents, thereby delineating coastal sediment transport pathways and affecting shoreline evolution over large spatial and time scales. How tectonic, sediment transport, and canyon processes interact with inherited topographic and lithologic settings to shape rocky coastlines remains an unanswered, and largely unexplored, question. We will present numerical model results of rocky coastline evolution that starts with an immature fractal coastline. The initial shape is modified by headland erosion, wave-driven alongshore sediment transport, and submarine canyon placement. Our previous model results have shown that, as expected, an initial sediment-free irregularly shaped rocky coastline with homogeneous lithology will undergo smoothing in response to wave attack; headlands erode and mobile sediment is swept into bays, forming isolated pocket beaches. As this diffusive process continues, pocket beaches coalesce, and a continuous sediment transport pathway results. However, when a randomly placed submarine canyon is introduced to the system as a sediment sink, the end results are wholly different: sediment cover is reduced, which in turn increases weathering and erosion rates and causes the entire shoreline to move landward more rapidly. The canyon's alongshore position also affects coastline morphology. 
When placed offshore of a headland, the submarine canyon captures local sediment…

  14. The Relationship of Personality Models and Development Tasks in Software Engineering

    OpenAIRE

Wiesche, Manuel; Krcmar, Helmut

    2015-01-01

Understanding the personality of software developers has been an ongoing topic in software engineering research. Software engineering researchers have applied different theoretical models to understand software developers' personalities, seeking to better predict software developers' performance, orchestrate more effective and motivated teams, and identify the person who best fits a certain job. However, empirical results have been found to be contradictory, of challenged validity, and lacking guidance for IT perso…

  15. Software for people fundamentals, trends and best practices

    CERN Document Server

    Maedche, Alexander; Neer, Ludwig

    2012-01-01

The highly competitive and globalized software market is creating pressure on software companies. Given the current boundary conditions, it is critical to continuously improve time-to-market and reduce development costs. In parallel, driven by private-life experiences with mobile computing devices, the World Wide Web and software-based services, people's general expectations with regard to software are growing. They expect software that is simple and joyful to use. In the light of the changes that have taken place in recent years, software companies need to fundamentally reconsider the way th…

  16. Data and Dynamics Driven Approaches for Modelling and Forecasting the Red Sea Chlorophyll

    KAUST Repository

    Dreano, Denis

    2017-01-01

…concentration and have practical applications for fisheries operation and harmful algae bloom monitoring. Modelling approaches can be divided between physics-driven (dynamical) approaches and data-driven (statistical) approaches. Dynamical models are based…

  17. In silico modelling of directed evolution: Implications for experimental design and stepwise evolution

    OpenAIRE

    Wedge , David C.; Rowe , William; Kell , Douglas B.; Knowles , Joshua

    2009-01-01

In silico modelling of directed evolution: Implications for experimental design and stepwise evolution. Manchester Interdisciplinary Biocentre, University of Manchester, 131 Princess Street, Manchester, M1 7ND, United Kingdom.

  18. Model-Driven Development of Safety Architectures

    Science.gov (United States)

    Denney, Ewen; Pai, Ganesh; Whiteside, Iain

    2017-01-01

    We describe the use of model-driven development for safety assurance of a pioneering NASA flight operation involving a fleet of small unmanned aircraft systems (sUAS) flying beyond visual line of sight. The central idea is to develop a safety architecture that provides the basis for risk assessment and visualization within a safety case, the formal justification of acceptable safety required by the aviation regulatory authority. A safety architecture is composed from a collection of bow tie diagrams (BTDs), a practical approach to manage safety risk by linking the identified hazards to the appropriate mitigation measures. The safety justification for a given unmanned aircraft system (UAS) operation can have many related BTDs. In practice, however, each BTD is independently developed, which poses challenges with respect to incremental development, maintaining consistency across different safety artifacts when changes occur, and in extracting and presenting stakeholder specific information relevant for decision making. We show how a safety architecture reconciles the various BTDs of a system, and, collectively, provide an overarching picture of system safety, by considering them as views of a unified model. We also show how it enables model-driven development of BTDs, replete with validations, transformations, and a range of views. Our approach, which we have implemented in our toolset, AdvoCATE, is illustrated with a running example drawn from a real UAS safety case. The models and some of the innovations described here were instrumental in successfully obtaining regulatory flight approval.

  19. Data-Driven Model Order Reduction for Bayesian Inverse Problems

    KAUST Repository

    Cui, Tiangang; Youssef, Marzouk; Willcox, Karen

    2014-01-01

One of the major challenges in using MCMC for the solution of inverse problems is the repeated evaluation of computationally expensive numerical models. We develop a data-driven projection-based model order reduction technique to reduce…

  20. Multi-physics fluid-structure interaction modelling software

    CSIR Research Space (South Africa)

    Malan, AG

    2008-11-01

Full Text Available Multi-physics fluid-structure interaction modelling software. AG Malan and O Oxtoby, CSIR Defence, Peace, Safety and Security, PO Box 395, Pretoria, 0001. Email: amalan@csir.co.za – www.csir.co.za. Internationally leading aerospace company Airbus sponsored key components of the development of the CSIR fluid-structure interaction (FSI) software. Below are extracts from their evaluation of the developed technology: “The field of FSI covers a massive range of engineering problems, each with their own multi-parameter, individual…”

  1. Modeling Temporal Evolution and Multiscale Structure in Networks

    DEFF Research Database (Denmark)

    Herlau, Tue; Mørup, Morten; Schmidt, Mikkel Nørgaard

    2013-01-01

Many real-world networks exhibit both temporal evolution and multiscale structure. We propose a model for temporally correlated multifurcating hierarchies in complex networks which jointly captures both effects. We use the Gibbs fragmentation tree as prior over multifurcating trees and a change-point model to account for the temporal evolution of each vertex. We demonstrate that our model is able to infer time-varying multiscale structure in synthetic as well as three real-world time-evolving complex networks. Our modeling of the temporal evolution of hierarchies brings new insights…

  2. Profile-driven regression for modeling and runtime optimization of mobile networks

    DEFF Research Database (Denmark)

    McClary, Dan; Syrotiuk, Violet; Kulahci, Murat

    2010-01-01

Computer networks often display nonlinear behavior when examined over a wide range of operating conditions. There are few strategies available for modeling such behavior and optimizing such systems as they run. Profile-driven regression is developed and applied to modeling and runtime optimization of throughput in a mobile ad hoc network, a self-organizing collection of mobile wireless nodes without any fixed infrastructure. The intermediate models generated in profile-driven regression are used to fit an overall model of throughput, and are also used to optimize controllable factors at runtime. Unlike…
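The idea of intermediate per-profile models feeding an overall throughput model can be sketched minimally as follows. The profiles, data, and linear form are invented for illustration and far simpler than the actual technique: a local linear model is fitted for each operating "profile" (a sub-range of node speed), then the local models are used both to predict and to pick the profile with the best predicted throughput.

```python
# Hedged sketch of per-profile regression (invented data and profiles).
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Synthetic profiles: throughput responds differently in each speed range.
profiles = {
    "low speed":  ([1, 2, 3, 4],     [10.0, 12.0, 14.0, 16.0]),
    "high speed": ([10, 12, 14, 16], [15.0, 13.0, 11.0, 9.0]),
}
models = {name: fit_line(xs, ys) for name, (xs, ys) in profiles.items()}

def predict(name, x):
    a, b = models[name]
    return a + b * x

# Runtime optimization step: choose the profile whose local model
# predicts the highest throughput at that profile's mean speed.
best_profile = max(
    profiles,
    key=lambda n: predict(n, sum(profiles[n][0]) / len(profiles[n][0])),
)
```

In the actual method, these intermediate fits would additionally be combined into one overall response model rather than used in isolation.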

  3. Modular modeling with a computational twist in Metamod

    NARCIS (Netherlands)

    Sutii, A.-M.; Verhoeff, T.; Van Den Brand, M.G.J.

    2016-01-01

    Model-driven engineering (MDE) is a software development methodology that promises to alleviate the complex task of writing software. To achieve its goals, MDE makes use of models. Although models are concise representations of the knowledge in a domain, they can become large and complex. In dealing

  4. Beyond Reactive Planning: Self Adaptive Software and Self Modeling Software in Predictive Deliberation Management

    National Research Council Canada - National Science Library

    Lenahan, Jack; Nash, Michael P; Charles, Phil

    2008-01-01

We present the following hypothesis: predictive deliberation management using self-adapting and self-modeling software will be required to provide mission planning adjustments after the start of a mission…

  5. Unit testing, model validation, and biological simulation [version 1; referees: 2 approved, 1 approved with reservations

    Directory of Open Access Journals (Sweden)

    Gopal P. Sarma

    2016-08-01

    Full Text Available The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software which expresses scientific models.
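The distinction the article draws between ordinary unit tests and model validation tests can be shown with a small, self-contained example. The model, measured value, and tolerance below are hypothetical and not taken from OpenWorm: a unit test checks code correctness exactly, while a model validation test checks a simulation output against experimental data within a stated tolerance.

```python
# Illustrative contrast between a unit test and a model validation test
# (hypothetical model and measurement; not from OpenWorm).

def membrane_time_constant(resistance_mohm, capacitance_nf):
    """Toy model: tau = R * C (megaohms * nanofarads gives milliseconds)."""
    return resistance_mohm * capacitance_nf

def test_unit_tau_formula():
    # Unit test: exact expectation on the code itself.
    assert membrane_time_constant(2.0, 3.0) == 6.0

def test_validate_tau_against_experiment():
    # Model validation test: hypothetical measured value with uncertainty.
    measured_tau_ms, tolerance_ms = 5.8, 0.5
    simulated = membrane_time_constant(2.0, 3.0)
    assert abs(simulated - measured_tau_ms) <= tolerance_ms

test_unit_tau_formula()
test_validate_tau_against_experiment()
```

The first test fails only if the code is wrong; the second can fail even for correct code, signaling that the scientific model no longer matches the data.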

  6. A software product certification model

    NARCIS (Netherlands)

    Heck, P.M.; Klabbers, M.D.; van Eekelen, Marko

    2010-01-01

    Certification of software artifacts offers organizations more certainty and confidence about software. Certification of software helps software sales, acquisition, and can be used to certify legislative compliance or to achieve acceptable deliverables in outsourcing. In this article, we present a

  7. REVEAL - A tool for rule driven analysis of safety critical software

    International Nuclear Information System (INIS)

    Miedl, H.; Kersken, M.

    1998-01-01

As the determination of ultra-high reliability figures for safety-critical software is hardly possible, national and international guidelines and standards mainly give requirements for the qualitative evaluation of software. Analyzing whether all these requirements are fulfilled is time- and effort-consuming and prone to errors if performed manually by analysts, and should instead be delegated to tools as far as possible. There are many ''general-purpose'' software analysis tools, both static and dynamic, which help analyze the source code. However, they are not designed to assess adherence to the specific requirements of guidelines and standards in the nuclear field. Against the background of the development of I&C systems in the nuclear field which are based on digital techniques and implemented in a high-level language, it is essential that the assessor or licenser has a tool with which he can automatically and uniformly qualify as many aspects as possible of the high-level language software. For this purpose the software analysis tool REVEAL has been developed at ISTec and the Halden Reactor Project. (author)

  8. Model Driven Integrated Decision-Making in Manufacturing Enterprises

    Directory of Open Access Journals (Sweden)

    Richard H. Weston

    2012-01-01

Full Text Available Decision making requirements and solutions are observed in four world-class Manufacturing Enterprises (MEs). The observations focus on deployed methods of complexity handling that facilitate multi-purpose, distributed decision making. Also observed are examples of partially deficient “integrated decision making” which stem from a lack of understanding about how ME structural relations enable and/or constrain reachable ME behaviours. To begin to address this deficiency the paper outlines the use of a “reference model of ME decision making” which can inform the structural design of decision making systems in MEs. Also outlined is a “systematic model driven approach to modelling ME systems” which can particularise the reference model in specific case enterprises and thereby can “underpin integrated ME decision making”. Coherent decomposition and representational mechanisms have been incorporated into the model driven approach to systemise complexity handling. The paper also describes in outline an application of the modelling method in a case study ME and explains how its use has improved the integration of previously distinct planning functions. The modelling approach is particularly innovative with respect to the way it structures the coherent creation and experimental re-use of “fit for purpose” discrete event (predictive) simulation models at multiple levels of abstraction.

  9. Taylor dispersion in wind-driven current

    Science.gov (United States)

    Li, Gang; Wang, Ping; Jiang, Wei-Quan; Zeng, Li; Li, Zhi; Chen, G. Q.

    2017-12-01

    Taylor dispersion associated with wind-driven currents in channels, shallow lakes and estuaries is essential to hydrological environmental management. For solute dispersion in a wind-driven current, presented in this paper is an analytical study of the evolution of concentration distribution. The concentration moments are intensively derived for an accurate presentation of the mean concentration distribution, up to the effect of kurtosis. The vertical divergence of concentration is then deduced by Gill's method of series expansion up to the fourth order. Based on the temporal evolution of the vertical concentration distribution, the dispersion process in the wind-driven current is concretely characterized. The uniform shear leads to a special symmetrical distribution of mean concentration free of skewness. The non-uniformity of vertical concentration is caused by convection and smeared out gradually by the effect of diffusion, but fails to disappear even at large times.
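The moment-method setting the abstract refers to has, in its generic form, a standard structure: solute in a unidirectional shear flow u(z) over depth h obeys an advection-diffusion equation, and at long times the depth-averaged concentration follows a one-dimensional equation with an enhanced (Taylor) dispersivity. The equations below show only this generic structure, with the profile-dependent constant k left unspecified; they are not the paper's exact derivation.

```latex
% Generic Taylor dispersion setting (structure only; not the paper's
% exact equations).
\frac{\partial c}{\partial t} + u(z)\,\frac{\partial c}{\partial x}
  = D\left(\frac{\partial^2 c}{\partial x^2}
         + \frac{\partial^2 c}{\partial z^2}\right)

% Long-time equation for the depth-averaged concentration \bar{c},
% with depth-averaged velocity \bar{u}, velocity scale U, depth h,
% and a profile-dependent dimensionless constant k:
\frac{\partial \bar{c}}{\partial t}
  + \bar{u}\,\frac{\partial \bar{c}}{\partial x}
  = D_{\mathrm{T}}\,\frac{\partial^2 \bar{c}}{\partial x^2},
\qquad D_{\mathrm{T}} = D + k\,\frac{U^2 h^2}{D}
```

The paper's contribution is the higher-order refinement of this picture: concentration moments up to kurtosis and a fourth-order series expansion of the vertical concentration divergence.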

  10. Generic Software Architecture for Launchers

    Science.gov (United States)

    Carre, Emilien; Gast, Philippe; Hiron, Emmanuel; Leblanc, Alain; Lesens, David; Mescam, Emmanuelle; Moro, Pierre

    2015-09-01

The definition and reuse of generic software architecture for launchers is not so usual, for several reasons: the number of European launcher families is very small (Ariane 5 and Vega for these last decades); the real-time constraints (reactivity and determinism needs) are very hard; and low levels of versatility are required (often implying an ad hoc development of the launcher mission). In comparison, satellites are often built on a generic platform made up of reusable hardware building blocks (processors, star-trackers, gyroscopes, etc.) and reusable software building blocks (middleware, TM/TC, On-Board Control Procedure, etc.). Even if some of these reasons are still valid (e.g. the limited number of developments), the increase in available CPU power today makes achievable an approach based on a generic time-triggered middleware (ensuring the full determinism of the system) and a centralised mission and vehicle management (offering more flexibility in the design and facilitating long-term maintenance). This paper presents an example of generic software architecture which could be envisaged for future launchers, based on the previously described principles and supported by model-driven engineering and automatic code generation.
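The determinism argument behind a time-triggered middleware can be illustrated with a static dispatch table: each minor cycle runs a fixed, statically defined task list, so the execution order never varies between runs. All task names below are hypothetical; a real middleware would release each slot on a hardware timer tick rather than in a plain loop.

```python
# Hedged sketch of a time-triggered dispatch table (hypothetical tasks).
log = []

def navigation():  log.append("nav")
def guidance():    log.append("gui")
def control():     log.append("ctl")
def telemetry():   log.append("tm")

# Static schedule: tasks per minor cycle within one major cycle.
SCHEDULE = [
    [navigation, control],             # minor cycle 0
    [navigation, guidance, control],   # minor cycle 1
    [navigation, control, telemetry],  # minor cycle 2
]

def run_major_cycle():
    for minor_cycle in SCHEDULE:       # in flight, each minor cycle would
        for task in minor_cycle:       # be released by a timer interrupt
            task()

run_major_cycle()
```

Because the schedule is data, not control flow, the same table can be reused across missions, which is the flexibility argument the paper makes for a generic architecture.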

  11. Combining Domain-driven Design and Mashups for Service Development

    Science.gov (United States)

    Iglesias, Carlos A.; Fernández-Villamor, José Ignacio; Del Pozo, David; Garulli, Luca; García, Boni

    This chapter presents the Romulus project approach to Service Development using Java-based web technologies. Romulus aims at improving productivity of service development by providing a tool-supported model to conceive Java-based web applications. This model follows a Domain Driven Design approach, which states that the primary focus of software projects should be the core domain and domain logic. Romulus proposes a tool-supported model, Roma Metaframework, that provides an abstraction layer on top of existing web frameworks and automates the application generation from the domain model. This metaframework follows an object centric approach, and complements Domain Driven Design by identifying the most common cross-cutting concerns (security, service, view, ...) of web applications. The metaframework uses annotations for enriching the domain model with these cross-cutting concerns, so-called aspects. In addition, the chapter presents the usage of mashup technology in the metaframework for service composition, using the web mashup editor MyCocktail. This approach is applied to a scenario of the Mobile Phone Service Portability case study for the development of a new service.

  12. The evolution of Lachancea thermotolerans is driven by geographical determination, anthropisation and flux between different ecosystems.

    Directory of Open Access Journals (Sweden)

    Ana Hranilovic

Full Text Available The yeast Lachancea thermotolerans (formerly Kluyveromyces thermotolerans) is a species with remarkable, yet underexplored, biotechnological potential. This ubiquist occupies a range of natural and anthropic habitats covering a wide geographic span. To gain an insight into L. thermotolerans population diversity and structure, 172 isolates sourced from diverse habitats worldwide were analysed using a set of 14 microsatellite markers. The resultant clustering revealed that the evolution of L. thermotolerans has been driven by the geography and ecological niche of the isolation sources. Isolates originating from anthropic environments, in particular grapes and wine, were genetically close, thus suggesting domestication events within the species. The observed clustering was further validated by several means, including population structure analysis, F-statistics, Mantel's test and the analysis of molecular variance (AMOVA). Phenotypic performance of isolates was tested using several growth substrates and physicochemical conditions, providing added support for the clustering. Altogether, this study sheds light on the genotypic and phenotypic diversity of L. thermotolerans, contributing to a better understanding of the population structure, ecology and evolution of this non-Saccharomyces yeast.

  13. MATHEMATICAL MODEL FOR SOFTWARE USABILITY AUTOMATED EVALUATION AND ASSURANCE

    Directory of Open Access Journals (Sweden)

    І. Гученко

    2011-04-01

Full Text Available The subject of the research is software usability, and the aim is the construction of a mathematical model for evaluating and assuring a specified level of usability. The research uses the methodology of structural analysis, methods of multicriteria optimization and decision-making theory, the convolution method, and the scientific methods of analysis and analogy. The result of the work is a model for automated software usability evaluation and assurance that allows not only estimating the current level of usability during every iteration of agile development but also managing the usability of the software products being created. The results can be used to build automated support systems for managing software usability.

  14. Software Code Smell Prediction Model Using Shannon, Rényi and Tsallis Entropies

    Directory of Open Access Journals (Sweden)

    Aakanshi Gupta

    2018-05-01

Full Text Available The current era demands high-quality software in a limited time period to achieve new goals and heights. To meet user requirements, source code undergoes frequent modifications, which can introduce bad smells into software that deteriorate its quality and reliability. The source code of open source software is easily accessible by any developer and thus frequently modified. In this paper, we have proposed a mathematical model to predict bad smells using the concept of entropy as defined by information theory. The open-source software Apache Abdera is taken into consideration for calculating the bad smells. Bad smells are collected using a detection tool from subcomponents of the Apache Abdera project, and different measures of entropy (Shannon, Rényi and Tsallis) are computed. By applying non-linear regression techniques, the bad smells that can arise in future versions of the software are predicted based on the observed bad smells and entropy measures. The proposed model has been validated using goodness-of-fit parameters (prediction error, bias, variation, and Root Mean Squared Prediction Error (RMSPE)). The values of the model performance statistics (R², adjusted R², Mean Square Error (MSE) and standard error) also justify the proposed model. We have compared the results of the prediction model with the observed results on real data. The results of the model might be helpful for software development industries and future researchers.
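The three entropy measures named in the abstract can be computed directly from a probability distribution, for example the share of changes falling in each source file; the distribution below is invented for illustration. As the order α approaches 1, both the Rényi and the Tsallis entropies reduce to the Shannon entropy.

```python
import math

# The three entropy measures, computed from a probability distribution
# (here an invented, uniform change distribution over two files).
def shannon(p):
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def renyi(p, alpha):
    assert alpha > 0 and alpha != 1
    return math.log2(sum(pi ** alpha for pi in p)) / (1 - alpha)

def tsallis(p, alpha):
    assert alpha != 1
    return (1 - sum(pi ** alpha for pi in p)) / (alpha - 1)

p = [0.5, 0.5]
print(shannon(p))      # 1.0 bit
print(renyi(p, 2.0))   # 1.0 bit
print(tsallis(p, 2.0)) # 0.5
```

A version-by-version series of such entropy values is what the paper's regression step would then fit against the observed bad-smell counts.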

  15. Two-dimensional simulations of magnetically-driven instabilities

    International Nuclear Information System (INIS)

    Peterson, D.; Bowers, R.; Greene, A.E.; Brownell, J.

    1986-01-01

    A two-dimensional Eulerian MHD code is used to study the evolution of magnetically-driven instabilities in cylindrical geometry. The code incorporates an equation of state, resistivity, and radiative cooling model appropriate for an aluminum plasma. The simulations explore the effects of initial perturbations, electrical resistivity, and radiative cooling on the growth and saturation of the instabilities. Comparisons are made between the 2-D simulations, previous 1-D simulations, and results from the Pioneer experiments of the Los Alamos foil implosion program

  16. Visual data mining and analysis of software repositories

    NARCIS (Netherlands)

    Voinea, S.L.; Telea, A.C.

    2007-01-01

    In this article we describe an ongoing effort to integrate information visualization techniques into the process of configuration management for software systems. Our focus is to help software engineers manage the evolution of large and complex software systems by offering them effective and

  17. 3D MODELLING BY LOW-COST RANGE CAMERA: SOFTWARE EVALUATION AND COMPARISON

    Directory of Open Access Journals (Sweden)

    R. Ravanelli

    2017-11-01

Full Text Available The aim of this work is to present a comparison among three software applications currently available for the Occipital Structure Sensor™; all of these applications were developed for collecting 3D models of objects easily and in real time with this structured-light range camera. The SKANECT, itSeez3D and Scanner applications were thus tested: a DUPLO™ brick construction was scanned with the three applications and the obtained models were compared to the model virtually generated with a standard CAD software package, which served as reference. The results demonstrate that all the software applications are generally characterized by the same level of geometric accuracy, which amounts to very few millimetres. However, the itSeez3D software, which requires a payment of $7 to export each model, surely represents the best solution, both in terms of geometric accuracy and, above all, in terms of color restitution. On the other hand, Scanner, which is free software, offers an accuracy comparable to that of itSeez3D. At the same time, though, the colors are often smoothed and not perfectly aligned with the corresponding parts of the model. Lastly, SKANECT is the software that generates the highest number of points, but it also has some issues with the rendering of colors.

  18. ORBITAL AND MASS RATIO EVOLUTION OF PROTOBINARIES DRIVEN BY MAGNETIC BRAKING

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Bo; Li, Zhi-Yun [Astronomy Department, University of Virginia, Charlottesville, VA 22904 (United States)

    2013-01-20

    The majority of stars reside in multiple systems, especially binaries. The formation and early evolution of binaries is a longstanding problem in star formation that is not yet fully understood. In particular, how the magnetic field observed in star-forming cores shapes the binary characteristics remains relatively unexplored. We demonstrate numerically, using an MHD version of the ENZO AMR hydro code, that a magnetic field of the observed strength can drastically change two of the basic quantities that characterize a binary system: the orbital separation and mass ratio of the two components. Our calculations focus on the protostellar mass accretion phase, after a pair of stellar 'seeds' have already formed. We find that in dense cores magnetized to a realistic level, the angular momentum of the material accreted by the protobinary is greatly reduced by magnetic braking. Accretion of strongly braked material shrinks the protobinary separation by a large factor compared to the non-magnetic case. The magnetic braking also changes the evolution of the mass ratio of unequal-mass protobinaries by producing material of low specific angular momentum that accretes preferentially onto the more massive primary star rather than the secondary. This is in contrast with the preferential mass accretion onto the secondary previously found numerically for protobinaries accreting from an unmagnetized envelope, which tends to drive the mass ratio toward unity. In addition, the magnetic field greatly modifies the morphology and dynamics of the protobinary accretion flow. It suppresses the traditional circumstellar and circumbinary disks that feed the protobinary in the non-magnetic case; the binary is fed instead by a fast collapsing pseudodisk whose rotation is strongly braked. The magnetic braking-driven inward migration of binaries from their birth locations may be constrained by high-resolution observations of the orbital distribution of deeply embedded protobinaries

  19. Data-driven non-Markovian closure models

    Science.gov (United States)

    Kondrashov, Dmitri; Chekroun, Mickaël D.; Ghil, Michael

    2015-03-01

    This paper has two interrelated foci: (i) obtaining stable and efficient data-driven closure models by using a multivariate time series of partial observations from a large-dimensional system; and (ii) comparing these closure models with the optimal closures predicted by the Mori-Zwanzig (MZ) formalism of statistical physics. Multilayer stochastic models (MSMs) are introduced as both a generalization and a time-continuous limit of existing multilevel, regression-based approaches to closure in a data-driven setting; these approaches include empirical model reduction (EMR), as well as more recent multi-layer modeling. It is shown that the multilayer structure of MSMs can provide a natural Markov approximation to the generalized Langevin equation (GLE) of the MZ formalism. A simple correlation-based stopping criterion for an EMR-MSM model is derived to assess how well it approximates the GLE solution. Sufficient conditions are derived on the structure of the nonlinear cross-interactions between the constitutive layers of a given MSM to guarantee the existence of a global random attractor. This existence ensures that no blow-up can occur for a broad class of MSM applications, a class that includes non-polynomial predictors and nonlinearities that do not necessarily preserve quadratic energy invariants. The EMR-MSM methodology is first applied to a conceptual, nonlinear, stochastic climate model of coupled slow and fast variables, in which only slow variables are observed. It is shown that the resulting closure model with energy-conserving nonlinearities efficiently captures the main statistical features of the slow variables, even when there is no formal scale separation and the fast variables are quite energetic. Second, an MSM is shown to successfully reproduce the statistics of a partially observed, generalized Lotka-Volterra model of population dynamics in its chaotic regime. The challenges here include the rarity of strange attractors in the model's parameter
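
    The layered regression idea behind EMR and MSMs can be illustrated with a toy sketch (not the authors' code; the synthetic system, coefficients, and variable names are invented). A main-level model is fit to the observed variable, the leftover tendency is then modeled as its own stochastic layer, and the correlation-based stopping idea corresponds to checking that the last layer's residual is essentially white:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic truth: observed slow variable x forced by a hidden AR(1) process r.
A, B, n = 0.9, 0.5, 50000
x = np.zeros(n); r = np.zeros(n)
for t in range(n - 1):
    r[t + 1] = B * r[t] + rng.normal()
    x[t + 1] = A * x[t] + r[t]

def ar1_fit(y_next, y):
    """Least-squares slope of y_next on y, plus the residual."""
    coef = np.dot(y, y_next) / np.dot(y, y)
    return coef, y_next - coef * y

def lag1_autocorr(e):
    return np.corrcoef(e[:-1], e[1:])[0, 1]

# Level 0: regress x_{t+1} on x_t; the leftover tendency is the
# "unresolved" forcing to be modeled by the next layer.
A_hat, resid0 = ar1_fit(x[1:], x[:-1])
# Level 1: model the level-0 residual itself as an AR(1) layer.
B_hat, resid1 = ar1_fit(resid0[1:], resid0[:-1])

# Correlation-based stopping criterion: add layers until the last
# residual is essentially decorrelated (white noise).
print(lag1_autocorr(resid0), lag1_autocorr(resid1))
```

    Here the level-0 residual retains substantial memory (lag-1 autocorrelation around 0.4), while the level-1 residual is close to white noise, which is the signal to stop adding layers.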

  20. Beginning SQL Server Modeling Model-driven Application Development in SQL Server

    CERN Document Server

    Weller, Bart

    2010-01-01

    Get ready for model-driven application development with SQL Server Modeling! This book covers Microsoft's SQL Server Modeling (formerly known under the code name "Oslo") in detail and contains the information you need to be successful with designing and implementing workflow modeling. Beginning SQL Server Modeling will help you gain a comprehensive understanding of how to apply DSLs and other modeling components in the development of SQL Server implementations. Most importantly, after reading the book and working through the examples, you will have considerable experience using SQL M

  1. The NTeQ ISD Model: A Tech-Driven Model for Digital Natives (DNs)

    Science.gov (United States)

    Williams, C.; Anekwe, J. U.

    2017-01-01

    The Integrating Technology for Enquiry (NTeQ) instructional development (ISD) model is believed to be a technology-driven model. The authors x-rayed the ten-step model to reaffirm its ICT knowledge demands on the learner and the educator; hence, computer-based activities at various stages of the model are core elements. The model is also conscious of…

  2. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management, and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  3. Path generation algorithm for UML graphic modeling of aerospace test software

    Science.gov (United States)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Chen, Chao

    2018-03-01

    Traditionally, aerospace software test engineers have relied on their own experience and on communication with the development staff to describe the software under test and to write test cases by hand, which is time-consuming, inefficient, and prone to gaps. With the high-reliability MBT (model-based testing) tool developed by our company, a single modeling pass automatically generates the test case documents, which is efficient and accurate. Describing the software accurately with a UML model depends on generating the paths that can be reached. Existing path generation algorithms are either too simple, unable to combine branch paths and loop paths into complete paths, or too cumbersome, generating elaborate arrangements of meaningless paths that are superfluous for aerospace software testing. Drawing on our aerospace engineering experience, we developed a tailored path generation algorithm for UML graphical models of aerospace test software.
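
    The trade-off described here (covering branch and loop paths without generating endless or meaningless repetitions) can be sketched as a depth-first traversal that takes each edge at most once, so every loop body is exercised exactly once. This is a simplified stand-in for the authors' algorithm; the example graph is invented.

```python
from collections import defaultdict

def generate_paths(edges, start, end):
    """Enumerate start->end paths in a control-flow graph, taking each
    edge at most once so that loops are exercised exactly one time."""
    graph = defaultdict(list)
    for u, v in edges:
        graph[u].append(v)
    paths = []

    def dfs(node, used, path):
        if node == end:
            paths.append(list(path))
            return
        for nxt in graph[node]:
            if (node, nxt) not in used:          # each edge only once
                used.add((node, nxt))
                path.append(nxt)
                dfs(nxt, used, path)
                path.pop()
                used.remove((node, nxt))

    dfs(start, set(), [start])
    return paths

# A tiny activity graph: branch at B, loop C -> B.
edges = [("A", "B"), ("B", "C"), ("C", "B"), ("B", "D"), ("C", "D")]
for p in generate_paths(edges, "A", "D"):
    print(" -> ".join(p))
```

    On this example graph the traversal yields three paths: the direct branch, the loop-free path through C, and exactly one pass around the C -> B loop.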

  4. Evaluating Sustainability Models for Interoperability through Brokering Software

    Science.gov (United States)

    Pearlman, Jay; Benedict, Karl; Best, Mairi; Fyfe, Sue; Jacobs, Cliff; Michener, William; Nativi, Stefano; Powers, Lindsay; Turner, Andrew

    2016-04-01

    Sustainability of software and research support systems is an element of innovation that is not often discussed. Yet, sustainment is essential if we expect research communities to make the time investment to learn and adopt new technologies. As the Research Data Alliance (RDA) is developing new approaches to interoperability, the question of uptake and sustainability is important. Brokering software sustainability is one of the areas that is being addressed in RDA. The Business Models Team of the Research Data Alliance Brokering Governance Working Group examined several support models proposed to promote the long-term sustainability of brokering middleware. The business model analysis includes examination of funding source, implementation frameworks and challenges, and policy and legal considerations. Results of this comprehensive analysis highlight advantages and disadvantages of the various models with respect to the specific requirements for brokering services. We offer recommendations based on the outcomes of this analysis that suggest that hybrid funding models present the most likely avenue to long term sustainability.

  5. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  6. Bacterial community evolutions driven by organic matter and powder activated carbon in simultaneous anammox and denitrification (SAD) process.

    Science.gov (United States)

    Ge, Cheng-Hao; Sun, Na; Kang, Qi; Ren, Long-Fei; Ahmad, Hafiz Adeel; Ni, Shou-Qing; Wang, Zhibin

    2018-03-01

    A distinct shift of bacterial community driven by organic matter (OM) and powder activated carbon (PAC) was discovered in the simultaneous anammox and denitrification (SAD) process which was operated in an anti-fouling submerged anaerobic membrane bio-reactor. Based on anammox performance, optimal OM dose (50 mg/L) was advised to start up SAD process successfully. The results of qPCR and high throughput sequencing analysis indicated that OM played a key role in microbial community evolutions, impelling denitrifiers to challenge anammox's dominance. The addition of PAC not only mitigated the membrane fouling, but also stimulated the enrichment of denitrifiers, accounting for the predominant phylum changing from Planctomycetes to Proteobacteria in SAD process. Functional genes forecasts based on KEGG database and COG database showed that the expressions of full denitrification functional genes were highly promoted in R_C, which demonstrated the enhanced full denitrification pathway driven by OM and PAC under low COD/N value (0.11). Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Software sensors based on the grey-box modelling approach

    DEFF Research Database (Denmark)

    Carstensen, J.; Harremoës, P.; Strube, Rune

    1996-01-01

    In recent years the grey-box modelling approach has been applied to wastewater transportation and treatment. Grey-box models are characterized by the combination of deterministic and stochastic terms to form a model where all the parameters are statistically identifiable from the on-line measurements. … A grey-box model for the specific dynamics is identified. Similarly, an on-line software sensor for detecting the occurrence of backwater phenomena can be developed by comparing the dynamics of a flow measurement with a nearby level measurement. For treatment plants it is found that grey-box models applied to on-line measurements … With respect to the development of software sensors, the grey-box models possess two important features. Firstly, the on-line measurements can be filtered according to the grey-box model in order to remove noise deriving from the measuring equipment and controlling devices. Secondly, the grey…
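
    A minimal sketch of such a software sensor, assuming a scalar grey-box model with a deterministic first-order term plus noise (all numbers are illustrative, not from the paper): the filter's prediction step is the deterministic model, and its update step weighs in the noisy on-line measurement.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated true water level: deterministic first-order dynamics toward an
# inflow-driven setpoint, plus a small stochastic term; the sensor adds noise.
n, dt = 200, 1.0
level = np.zeros(n)
for t in range(n - 1):
    level[t + 1] = level[t] + dt * 0.1 * (5.0 - level[t]) + 0.05 * rng.normal()
meas = level + 0.5 * rng.normal(size=n)

# Grey-box software sensor: a scalar Kalman filter whose prediction step is
# the deterministic part of the model and whose update step uses the data.
q, r_var = 0.05**2, 0.5**2      # process / measurement noise variances
xhat, p = 0.0, 1.0
est = np.empty(n)
for t in range(n):
    xhat = xhat + dt * 0.1 * (5.0 - xhat)      # predict with the model
    p = (1 - dt * 0.1) ** 2 * p + q
    k = p / (p + r_var)                        # update with the measurement
    xhat = xhat + k * (meas[t] - xhat)
    p = (1 - k) * p
    est[t] = xhat

rmse_raw = np.sqrt(np.mean((meas - level) ** 2))
rmse_filt = np.sqrt(np.mean((est - level) ** 2))
print(rmse_filt, rmse_raw)
```

    The filtered estimate tracks the true level far more closely than the raw measurement, which is the first of the two features the abstract highlights; the residual meas - est could likewise be monitored to flag sensor faults or backwater events.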

  8. General Purpose Data-Driven Monitoring for Space Operations

    Science.gov (United States)

    Iverson, David L.; Martin, Rodney A.; Schwabacher, Mark A.; Spirkovska, Liljana; Taylor, William McCaa; Castle, Joseph P.; Mackey, Ryan M.

    2009-01-01

    As modern space propulsion and exploration systems improve in capability and efficiency, their designs are becoming increasingly sophisticated and complex. Determining the health state of these systems, using traditional parameter limit checking, model-based, or rule-based methods, is becoming more difficult as the number of sensors and component interactions grow. Data-driven monitoring techniques have been developed to address these issues by analyzing system operations data to automatically characterize normal system behavior. System health can be monitored by comparing real-time operating data with these nominal characterizations, providing detection of anomalous data signatures indicative of system faults or failures. The Inductive Monitoring System (IMS) is a data-driven system health monitoring software tool that has been successfully applied to several aerospace applications. IMS uses a data mining technique called clustering to analyze archived system data and characterize normal interactions between parameters. The scope of IMS based data-driven monitoring applications continues to expand with current development activities. Successful IMS deployment in the International Space Station (ISS) flight control room to monitor ISS attitude control systems has led to applications in other ISS flight control disciplines, such as thermal control. It has also generated interest in data-driven monitoring capability for Constellation, NASA's program to replace the Space Shuttle with new launch vehicles and spacecraft capable of returning astronauts to the moon, and then on to Mars. Several projects are currently underway to evaluate and mature the IMS technology and complementary tools for use in the Constellation program. These include an experiment on board the Air Force TacSat-3 satellite, and ground systems monitoring for NASA's Ares I-X and Ares I launch vehicles. 
The TacSat-3 Vehicle System Management (TVSM) project is a software experiment to integrate fault
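
    The clustering approach described above can be sketched in a few lines (this illustrates the general technique, not IMS itself; the telemetry and thresholds are invented): archived nominal data is summarized as cluster centers, and a real-time point is flagged when its distance to every cluster exceeds anything seen in nominal operation.

```python
import numpy as np

rng = np.random.default_rng(7)

def kmeans(data, k, iters=25):
    """Plain k-means with farthest-point seeding: characterizes archived
    nominal data as a set of cluster centers."""
    centers = [data[0]]
    for _ in range(k - 1):
        d = np.min(np.linalg.norm(data[:, None] - np.array(centers), axis=2), axis=1)
        centers.append(data[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(np.linalg.norm(data[:, None] - centers, axis=2), axis=1)
        centers = np.array([data[labels == j].mean(axis=0) for j in range(k)])
    return centers

# "Archived" nominal telemetry: two operating modes of a 2-parameter system.
nominal = np.vstack([
    rng.normal([0.0, 0.0], 0.1, size=(500, 2)),
    rng.normal([1.0, 1.0], 0.1, size=(500, 2)),
])
centers = kmeans(nominal, k=2)

def score(x):
    """Deviation score: distance to the nearest nominal cluster."""
    return np.min(np.linalg.norm(centers - x, axis=1))

# Alarm threshold: just beyond what nominal data ever produced.
threshold = np.percentile([score(x) for x in nominal], 99.5)
print(score(np.array([0.05, -0.02])) <= threshold)   # nominal-looking point
print(score(np.array([0.5, 2.0])) > threshold)       # anomalous signature
```

    Both checks print True: the nominal-looking point falls inside the learned envelope, while the off-nominal signature is flagged.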

  9. On integrating modeling software for application to total-system performance assessment

    International Nuclear Information System (INIS)

    Lewis, L.C.; Wilson, M.L.

    1994-05-01

    We examine the processes and methods used to facilitate collaboration in software development between two organizations at separate locations -- Lawrence Livermore National Laboratory (LLNL) in California and Sandia National Laboratories (SNL) in New Mexico. Our software development process integrated the efforts of these two laboratories. Software developed at LLNL to model corrosion and failure of waste packages and subsequent releases of radionuclides was incorporated as a source term into SNL's computer models for fluid flow and radionuclide transport through the geosphere.

  10. Teaching and learning the Hodgkin-Huxley model based on software developed in NEURON's programming language hoc.

    Science.gov (United States)

    Hernández, Oscar E; Zurek, Eduardo E

    2013-05-15

    We present a software tool called SENB, which allows the geometric and biophysical neuronal properties in a simple computational model of a Hodgkin-Huxley (HH) axon to be changed. The aim of this work is to develop a didactic and easy-to-use computational tool in the NEURON simulation environment, which allows graphical visualization of both the passive and active conduction parameters and the geometric characteristics of a cylindrical axon with HH properties. The SENB software offers several advantages for teaching and learning electrophysiology. First, SENB offers ease and flexibility in determining the number of stimuli. Second, SENB allows immediate and simultaneous visualization, in the same window and time frame, of the evolution of the electrophysiological variables. Third, SENB calculates parameters such as time and space constants, stimuli frequency, cellular area and volume, sodium and potassium equilibrium potentials, and propagation velocity of the action potentials. Furthermore, it allows the user to see all this information immediately in the main window. Finally, with just one click SENB can save an image of the main window as evidence. The SENB software is didactic and versatile, and can be used to improve and facilitate the teaching and learning of the underlying mechanisms in the electrical activity of an axon using the biophysical properties of the squid giant axon.
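
    Two of the quantities SENB reports can be computed directly from standard formulas; the sketch below uses common textbook values for the squid giant axon (not necessarily SENB's actual defaults):

```python
import math

def nernst(z, conc_out, conc_in, T=279.45):
    """Nernst equilibrium potential in volts for an ion of valence z;
    T defaults to ~6.3 degC, the classic squid-axon temperature."""
    R, F = 8.314, 96485.0
    return (R * T) / (z * F) * math.log(conc_out / conc_in)

# Typical squid giant axon concentrations in mM (textbook values).
E_K = nernst(1, 20.0, 400.0)    # potassium: a negative potential
E_Na = nernst(1, 440.0, 50.0)   # sodium: a positive potential

# Passive constants for a cylindrical axon (illustrative parameters):
Rm, Cm = 1000.0, 1e-6     # membrane resistance (ohm*cm^2), capacitance (F/cm^2)
Ri, radius = 35.0, 0.025  # axoplasm resistivity (ohm*cm), radius (cm)
tau = Rm * Cm                             # membrane time constant, seconds
lam = math.sqrt(Rm * radius / (2 * Ri))   # space constant, cm

print(E_K, E_Na, tau, lam)
```

    With these numbers the potassium and sodium equilibrium potentials come out near -72 mV and +52 mV, the membrane time constant is 1 ms, and the space constant is about 6 mm.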

  11. The effectiveness and efficiency of model driven game design

    NARCIS (Netherlands)

    Dormans, Joris

    2012-01-01

    In order for techniques from Model Driven Engineering to be accepted at large by the game industry, it is critical that the effectiveness and efficiency of these techniques are proven for game development. There is no lack of game design models, but there is no model that has surfaced as an industry

  12. Modeling of laser-driven hydrodynamics experiments

    Science.gov (United States)

    di Stefano, Carlos; Doss, Forrest; Rasmus, Alex; Flippo, Kirk; Desjardins, Tiffany; Merritt, Elizabeth; Kline, John; Hager, Jon; Bradley, Paul

    2017-10-01

    Correct interpretation of hydrodynamics experiments driven by a laser-produced shock depends strongly on an understanding of the time-dependent effect of the irradiation conditions on the flow. In this talk, we discuss the modeling of such experiments using the RAGE radiation-hydrodynamics code. The focus is an instability experiment consisting of a period of relatively-steady shock conditions in which the Richtmyer-Meshkov process dominates, followed by a period of decaying flow conditions, in which the dominant growth process changes to Rayleigh-Taylor instability. The use of a laser model is essential for capturing the transition. also University of Michigan.

  13. A Technology-Neutral Role-Based Collaboration Model for Software Ecosystems

    DEFF Research Database (Denmark)

    Stanciulescu, Stefan; Rabiser, Daniela; Seidl, Christoph

    2016-01-01

    by contributing a role-based collaboration model for software ecosystems to make such implicit similarities explicit and to raise awareness among developers during their ongoing efforts. We extract this model based on realization artifacts in a specific programming language located in a particular source code......In large-scale software ecosystems, many developers contribute extensions to a common software platform. Due to the independent development efforts and the lack of a central steering mechanism, similar functionality may be developed multiple times by different developers. We tackle this problem...... efforts and information of ongoing development efforts. Finally, using the collaborations defined in the formalism we model real artifacts from Marlin, a firmware for 3D printers, and we show that for the selected scenarios, the five collaborations were sufficient to raise awareness and make implicit...

  14. Test Driven Development of a Parameterized Ice Sheet Component

    Science.gov (United States)

    Clune, T.

    2011-12-01

    Test driven development (TDD) is a software development methodology that offers many advantages over traditional approaches including reduced development and maintenance costs, improved reliability, and superior design quality. Although TDD is widely accepted in many software communities, the suitability to scientific software is largely undemonstrated and warrants a degree of skepticism. Indeed, numerical algorithms pose several challenges to unit testing in general, and TDD in particular. Among these challenges are the need to have simple, non-redundant closed-form expressions to compare against the results obtained from the implementation as well as realistic error estimates. The necessity for serial and parallel performance raises additional concerns for many scientific applications. In previous work I demonstrated that TDD performed well for the development of a relatively simple numerical model that simulates the growth of snowflakes, but the results were anecdotal and of limited relevance to far more complex software components typical of climate models. This investigation has now been extended by successfully applying TDD to the implementation of a substantial portion of a new parameterized ice sheet component within a full climate model. After a brief introduction to TDD, I will present techniques that address some of the obstacles encountered with numerical algorithms. I will conclude with some quantitative and qualitative comparisons against climate components developed in a more traditional manner.
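
    The closed-form-comparison obstacle can be made concrete with a small generic example (unrelated to the actual ice sheet code): a numerical kernel developed test-first, with one test against a simple closed form and one asserting the theoretically expected error scaling, i.e. a realistic error estimate.

```python
import math
import unittest

def trapezoid(f, a, b, n):
    """Composite trapezoid rule: the numerical kernel under test."""
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

class TestTrapezoid(unittest.TestCase):
    def test_against_closed_form(self):
        # Simple, non-redundant closed form: the integral of sin on [0, pi] is 2.
        self.assertAlmostEqual(trapezoid(math.sin, 0.0, math.pi, 1000), 2.0, places=5)

    def test_error_scales_as_h_squared(self):
        # Second-order method: halving the step should cut the error ~4x.
        e1 = abs(trapezoid(math.sin, 0.0, math.pi, 100) - 2.0)
        e2 = abs(trapezoid(math.sin, 0.0, math.pi, 200) - 2.0)
        self.assertAlmostEqual(e1 / e2, 4.0, delta=0.1)

unittest.main(argv=["tdd-sketch"], exit=False)
```

    In a TDD workflow both tests are written before the kernel; the error-scaling test is the kind of property check that remains meaningful even when no exact closed form exists for the production configuration.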

  15. AWARE: Adaptive Software Monitoring and Dynamic Reconfiguration for Critical Infrastructure Protection

    Science.gov (United States)

    2015-04-29

    in which we applied these adaptation patterns to an adaptive news web server intended to tolerate extremely heavy, unexpected loads. To address… collection of existing models used as benchmarks for OO-based refactoring and an existing web-based repository called REMODD to provide users with model… invariant properties. Specifically, we developed Avida-MDE (based on the Avida digital evolution platform) to support the automatic generation of software

  16. Model-Driven Policy Framework for Data Centers

    DEFF Research Database (Denmark)

    Caba, Cosmin Marius; Kentis, Angelos Mimidis; Soler, José

    2016-01-01

    … Moreover, the lack of simple solutions for managing the configuration and behavior of the DC components makes the DC hard to configure and slow in adapting to changes in business needs. In this paper, we propose a model-driven framework for policy-based management for DCs, to simplify not only the service…

  17. CyberGIS software: a synthetic review and integration roadmap

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Shaowen [University of Illinois, Urbana-Champaign; Anselin, Luc [Arizona State University; Bhaduri, Budhendra L [ORNL; Cosby, Christopher [University Navstar Consortium, Boulder, CO; Goodchild, Michael [University of California, Santa Barbara; Liu, Yan [University of Illinois, Urbana-Champaign; Nygers, Timothy L. [University of Washington, Seattle

    2013-01-01

    CyberGIS, defined as cyberinfrastructure-based geographic information systems (GIS), has emerged as a new generation of GIS representing an important research direction for both cyberinfrastructure and geographic information science. This study introduces a 5-year effort funded by the US National Science Foundation to advance the science and applications of CyberGIS, particularly for enabling the analysis of big spatial data, computationally intensive spatial analysis and modeling (SAM), and collaborative geospatial problem-solving and decision-making, simultaneously conducted by a large number of users. Several fundamental research questions are raised and addressed while a set of CyberGIS challenges and opportunities are identified from scientific perspectives. The study reviews several key CyberGIS software tools that are used to elucidate a vision and roadmap for CyberGIS software research. The roadmap focuses on software integration and synthesis of cyberinfrastructure, GIS, and SAM by defining several key integration dimensions and strategies. CyberGIS, based on this holistic integration roadmap, exhibits the following key characteristics: high-performance and scalable, open and distributed, collaborative, service-oriented, user-centric, and community-driven. As a major result of the roadmap, two key CyberGIS modalities, gateway and toolkit, combined with a community-driven and participatory approach, have laid a solid foundation to achieve scientific breakthroughs across many geospatial communities that would be otherwise impossible.

  18. Evolutive masing model, cyclic plasticity, ageing and memory effects

    International Nuclear Information System (INIS)

    Sidoroff, F.

    1987-01-01

    Many models have been proposed for the mechanical description of the cyclic behaviour of metals and are used for structural analysis under cyclic loading. Such a model must include two basic features: dissipative behaviour on each cycle (the hysteresis loop), and the evolution of this behaviour during the material's life (cyclic hardening or softening, aging, ...). However, while both aspects are present in most existing models, the balance between them may be quite different. Many metallurgical investigations have been performed on the microstructure and its evolution during cyclic loading, and it is desirable to introduce this information into phenomenological models. The evolutive Masing model has been proposed to combine: the accuracy of hereditary models for the description of hysteresis on each cycle; the versatility of internal variables for the state description and evolution; and a sufficient microstructural basis to ease interaction with microstructural investigations. The purpose of the present work is to discuss this model and to compare different evolution assumptions with respect to some memory effects (cyclic hardening and softening, multilevel tests, aging). Attention is limited to uniaxial, rate-independent elasto-plastic behaviour.
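
    In its simplest uniaxial, rate-independent form, a single Masing-type element reduces to elastoplasticity with linear kinematic hardening; the sketch below (with invented material parameters) traces the resulting hysteresis loop under symmetric strain cycling:

```python
import numpy as np

def cyclic_response(strain_path, E=200e3, H=20e3, sigma_y=250.0):
    """1D elastoplastic model with linear kinematic hardening (a single
    Masing-type element) driven through a strain history.
    Units: MPa for stresses, dimensionless strain."""
    stress, alpha, eps_p = [], 0.0, 0.0    # back stress, plastic strain
    for eps in strain_path:
        sig_trial = E * (eps - eps_p)
        f = abs(sig_trial - alpha) - sigma_y
        if f > 0:                          # plastic step: return mapping
            dgamma = f / (E + H)
            sign = np.sign(sig_trial - alpha)
            eps_p += dgamma * sign
            alpha += H * dgamma * sign
        stress.append(E * (eps - eps_p))
    return np.array(stress)

# Symmetric strain cycling produces a closed hysteresis loop.
t = np.linspace(0, 4 * np.pi, 2000)
strain = 0.005 * np.sin(t)
sigma = cyclic_response(strain)
print(sigma.max(), sigma.min())
```

    Each strain reversal retraces a scaled copy of the first loading branch (the Masing behaviour), and the stress stays bounded by the yield stress plus the accumulated back stress.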

  19. The mineralogic evolution of the Martian surface through time: Implications from chemical reaction path modeling studies

    Science.gov (United States)

    Plumlee, G. S.; Ridley, W. I.; Debraal, J. D.; Reed, M. H.

    1993-01-01

    Chemical reaction path calculations were used to model the minerals that might have formed at or near the Martian surface as a result of volcano- or meteorite-impact-driven hydrothermal systems; weathering at the Martian surface during an early warm, wet climate; and near-zero or sub-zero °C brine-regolith reactions in the current cold climate. Although the chemical reaction path calculations carried out do not define the exact mineralogical evolution of the Martian surface over time, they do place valuable geochemical constraints on the types of minerals that formed from an aqueous phase under various surficial and geochemically complex conditions.

  20. Speed Geometric Quantum Logical Gate Based on Double-Hamiltonian Evolution under Large-Detuning Cavity QED Model

    International Nuclear Information System (INIS)

    Chen Changyong; Liu Zongliang; Kang Shuai; Li Shaohua

    2010-01-01

    We introduce a double-Hamiltonian evolution technique to investigate the unconventional geometric quantum logical gate with dissipation under a model of many identical three-level atoms in a cavity, driven by a classical field. Our concrete calculation is made for the case of two atoms with large-detuning interaction between the atoms and the cavity mode. The main advantage of our scheme is the elimination of photon fluctuation in the cavity mode during the gating. The corresponding analytical results will be helpful for the experimental realization of the speed geometric quantum logical gate in real cavities. (general)

  1. Software engineering the mixed model for genome-wide association studies on large samples.

    Science.gov (United States)

    Zhang, Zhiwu; Buckler, Edward S; Casstevens, Terry M; Bradbury, Peter J

    2009-11-01

    Mixed models improve the ability to detect phenotype-genotype associations in the presence of population stratification and multiple levels of relatedness in genome-wide association studies (GWAS), but for large data sets the resource consumption becomes impractical. At the same time, the sample size and number of markers used for GWAS is increasing dramatically, resulting in greater statistical power to detect those associations. The use of mixed models with increasingly large data sets depends on the availability of software for analyzing those models. While multiple software packages implement the mixed model method, no single package provides the best combination of fast computation, ability to handle large samples, flexible modeling and ease of use. Key elements of association analysis with mixed models are reviewed, including modeling phenotype-genotype associations using mixed models, population stratification, kinship and its estimation, variance component estimation, use of best linear unbiased predictors or residuals in place of raw phenotype, improving efficiency and software-user interaction. The available software packages are evaluated, and suggestions made for future software development.
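
    Of the key elements reviewed, kinship estimation is easy to sketch numerically; below is a VanRaden-style genomic relationship matrix computed from simulated genotypes (simulated data and a simplified formula, not any particular package's implementation):

```python
import numpy as np

rng = np.random.default_rng(3)

def kinship(genotypes):
    """VanRaden-style genomic relationship matrix from a (samples x markers)
    genotype matrix coded 0/1/2."""
    p = genotypes.mean(axis=0) / 2.0      # estimated allele frequencies
    Z = genotypes - 2.0 * p               # center each marker by 2p
    denom = 2.0 * np.sum(p * (1.0 - p))
    return Z @ Z.T / denom

# 50 unrelated samples, 1000 markers in Hardy-Weinberg proportions.
n, m = 50, 1000
freqs = rng.uniform(0.1, 0.9, size=m)
G = rng.binomial(2, freqs, size=(n, m)).astype(float)
K = kinship(G)
print(np.diag(K).mean())   # close to 1 for unrelated individuals
```

    For unrelated individuals the diagonal averages close to 1 and off-diagonal entries hover near 0; in a mixed model, K then serves as the covariance structure of the random polygenic effect.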

  2. Confronting Models of Massive Star Evolution and Explosions with Remnant Mass Measurements

    Science.gov (United States)

    Raithel, Carolyn A.; Sukhbold, Tuguldur; Özel, Feryal

    2018-03-01

    The mass distribution of compact objects provides a fossil record that can be studied to uncover information on the late stages of massive star evolution, the supernova explosion mechanism, and the dense matter equation of state. Observations of neutron star masses indicate a bimodal Gaussian distribution, while the observed black hole mass distribution decays exponentially for stellar-mass black holes. We use these observed distributions to directly confront the predictions of stellar evolution models and the neutrino-driven supernova simulations of Sukhbold et al. We find strong agreement between the black hole and low-mass neutron star distributions created by these simulations and the observations. We show that a large fraction of the stellar envelope must be ejected, either during the formation of stellar-mass black holes or prior to the implosion through tidal stripping due to a binary companion, in order to reproduce the observed black hole mass distribution. We also determine the origins of the bimodal peaks of the neutron star mass distribution, finding that the low-mass peak (centered at ∼1.4 M⊙) originates from progenitors with M_ZAMS ≈ 9–18 M⊙. The simulations fail to reproduce the observed peak of high-mass neutron stars (centered at ∼1.8 M⊙) and we explore several possible explanations. We argue that the close agreement between the observed and predicted black hole and low-mass neutron star mass distributions provides new, promising evidence that these stellar evolution and explosion models capture the majority of relevant stellar, nuclear, and explosion physics involved in the formation of compact objects.

  3. The Robust Software Feedback Model: An Effective Waterfall Model Tailoring for Space SW

    Science.gov (United States)

    Tipaldi, Massimo; Gotz, Christoph; Ferraguto, Massimo; Troiano, Luigi; Bruenjes, Bernhard

    2013-08-01

    The selection of the most suitable software life cycle process is of paramount importance in any space SW project. Despite being the preferred choice, the waterfall model is often exposed to some criticism. As a matter of fact, its main assumption of moving to a phase only when the preceding one is completed and perfected (especially under demanding SW schedule constraints) is not easily attainable. In this paper, a tailoring of the software waterfall model (named "Robust Software Feedback Model") is presented. The proposed methodology sorts out these issues by combining a SW waterfall model with a SW prototyping approach. The former is aligned with the SW main production line and is based on the full ECSS-E-ST-40C life-cycle reviews, whereas the latter is carried out in advance of the main SW streamline (so as to inject its lessons learnt into the main streamline) and is based on a lightweight approach.

  4. Chaos and unpredictability in evolution.

    Science.gov (United States)

    Doebeli, Michael; Ispolatov, Iaroslav

    2014-05-01

    The possibility of complicated dynamic behavior driven by nonlinear feedbacks in dynamical systems has revolutionized science in the latter part of the last century. Yet despite examples of complicated frequency dynamics, the possibility of long-term evolutionary chaos is rarely considered. The concept of "survival of the fittest" is central to much evolutionary thinking and embodies a perspective of evolution as a directional optimization process exhibiting simple, predictable dynamics. This perspective is adequate for simple scenarios, when frequency-independent selection acts on scalar phenotypes. However, in most organisms many phenotypic properties combine in complicated ways to determine ecological interactions, and hence frequency-dependent selection. Therefore, it is natural to consider models for evolutionary dynamics generated by frequency-dependent selection acting simultaneously on many different phenotypes. Here we show that complicated, chaotic dynamics of long-term evolutionary trajectories in phenotype space is very common in a large class of such models when the dimension of phenotype space is large, and when there are selective interactions between the phenotypic components. Our results suggest that the perspective of evolution as a process with simple, predictable dynamics covers only a small fragment of long-term evolution. © 2014 The Author(s). Evolution © 2014 The Society for the Study of Evolution.
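
    A one-dimensional caricature of this mechanism (far simpler than the high-dimensional models studied in the paper, with invented numbers): let a scaled trait frequency p experience overcompensating frequency-dependent selection, giving a logistic-type recursion. The estimated Lyapunov exponent is positive, so nearby evolutionary trajectories diverge exponentially:

```python
import math

R = 3.9  # overcompensation strength (illustrative)

def step(p):
    # Fitness declines as the trait becomes common, yielding a
    # logistic-type recursion for the scaled frequency p.
    return R * p * (1.0 - p)

# Largest Lyapunov exponent along the orbit: average log of the local
# stretching factor |f'(p)|; positive means exponential divergence.
p, total, n = 0.123, 0.0, 20000
for _ in range(n):
    total += math.log(abs(R * (1.0 - 2.0 * p)))
    p = step(p)
lyap = total / n
print(lyap)

# Sensitive dependence: two populations starting 1e-9 apart soon disagree.
a, b, max_gap = 0.123, 0.123 + 1e-9, 0.0
for i in range(80):
    a, b = step(a), step(b)
    if i >= 40:
        max_gap = max(max_gap, abs(a - b))
print(max_gap)
```

    The positive exponent (roughly 0.5 for R = 3.9) is the quantitative sense in which "survival of the fittest" fails as a predictable optimization: after a few dozen generations, two all-but-identical populations have evolved in completely different directions.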

  5. Multiphysics software and the challenge to validating physical models

    International Nuclear Information System (INIS)

    Luxat, J.C.

    2008-01-01

    This paper discusses multiphysics software and the validation of physical models in the nuclear industry. The major challenge is to convert the general-purpose software package into a robust application-specific solution. This requires greater knowledge of the underlying solution techniques and the limitations of the packages. Good user interfaces and neat graphics do not compensate for any deficiencies.

  6. A Survey of Software Reliability Modeling and Estimation

    Science.gov (United States)

    1983-09-01

    considered include: the Jelinski-Moranda Model, the Geometric Model, and Musa's Model. A Monte Carlo study of the behavior of the least squares...ceedings Number 261, 1979, pp. 34-1, 34-11. Sukert, Alan and Goel, Amrit, "A Guidebook for Software Reliability Assessment," 1980

  7. Software Updating in Wireless Sensor Networks: A Survey and Lacunae

    Directory of Open Access Journals (Sweden)

    Cormac J. Sreenan

    2013-11-01

    Full Text Available Wireless Sensor Networks are moving out of the laboratory and into the field. For a number of reasons there is often a need to update sensor node software, or node configuration, after deployment. The need for over-the-air updates is driven both by the scale of deployments, and by the remoteness and inaccessibility of sensor nodes. This need has been recognized since the early days of sensor networks, and research results from the related areas of mobile networking and distributed systems have been applied to this area. In order to avoid any manual intervention, the update process needs to be autonomous. This paper presents a comprehensive survey of software updating in Wireless Sensor Networks, and analyses the features required to make these updates autonomous. A new taxonomy of software update features and a new model for fault detection and recovery are presented. The paper concludes by identifying the lacunae relating to autonomous software updates, providing direction for future research.

  8. Coevolution of variability models and related software artifacts

    DEFF Research Database (Denmark)

    Passos, Leonardo; Teixeira, Leopoldo; Dinztner, Nicolas

    2015-01-01

    models coevolve with other artifact types, we study a large and complex real-world variant-rich software system: the Linux kernel. Specifically, we extract variability-coevolution patterns capturing changes in the variability model of the Linux kernel with subsequent changes in Makefiles and C source...

  9. The Evolution of Open Magnetic Flux Driven by Photospheric Dynamics

    Science.gov (United States)

    Linker, Jon A.; Lionello, Roberto; Mikic, Zoran; Titov, Viacheslav S.; Antiochos, Spiro K.

    2010-01-01

    The coronal magnetic field is of paramount importance in solar and heliospheric physics. Two profoundly different views of the coronal magnetic field have emerged. In quasi-steady models, the predominant source of open magnetic field is in coronal holes. In contrast, in the interchange model, the open magnetic flux is conserved, and the coronal magnetic field can only respond to the photospheric evolution via interchange reconnection. In this view the open magnetic flux diffuses through the closed, streamer belt fields, and substantial open flux is present in the streamer belt during solar minimum. However, Antiochos and co-workers, in the form of a conjecture, argued that truly isolated open flux cannot exist in a configuration with one heliospheric current sheet (HCS) - it will connect via narrow corridors to the polar coronal hole of the same polarity. This contradicts the requirements of the interchange model. We have performed an MHD simulation of the solar corona up to 20 R_sun to test both the interchange model and the Antiochos conjecture. We use a synoptic map for Carrington Rotation 1913 as the boundary condition for the model, with two small bipoles introduced into the region where a positive polarity extended coronal hole forms. We introduce flows at the photospheric boundary surface to see if open flux associated with the bipoles can be moved into the closed-field region. Interchange reconnection does occur in response to these motions. However, we find that the open magnetic flux cannot be simply injected into closed-field regions - the flux eventually closes down and disconnected flux is created. Flux either opens or closes, as required, to maintain topologically distinct open and closed field regions, with no indiscriminate mixing of the two. The early evolution conforms to the Antiochos conjecture in that a narrow corridor of open flux connects the portion of the coronal hole that is nearly detached by one of the bipoles. In the later evolution, a

  10. THE EVOLUTION OF OPEN MAGNETIC FLUX DRIVEN BY PHOTOSPHERIC DYNAMICS

    International Nuclear Information System (INIS)

    Linker, Jon A.; Lionello, Roberto; Mikic, Zoran; Titov, Viacheslav S.; Antiochos, Spiro K.

    2011-01-01

    The coronal magnetic field is of paramount importance in solar and heliospheric physics. Two profoundly different views of the coronal magnetic field have emerged. In quasi-steady models, the predominant source of open magnetic field is in coronal holes. In contrast, in the interchange model, the open magnetic flux is conserved, and the coronal magnetic field can only respond to the photospheric evolution via interchange reconnection. In this view, the open magnetic flux diffuses through the closed, streamer belt fields, and substantial open flux is present in the streamer belt during solar minimum. However, Antiochos and coworkers, in the form of a conjecture, argued that truly isolated open flux cannot exist in a configuration with one heliospheric current sheet - it will connect via narrow corridors to the polar coronal hole of the same polarity. This contradicts the requirements of the interchange model. We have performed an MHD simulation of the solar corona up to 20 R_sun to test both the interchange model and the Antiochos conjecture. We use a synoptic map for Carrington rotation 1913 as the boundary condition for the model, with two small bipoles introduced into the region where a positive polarity extended coronal hole forms. We introduce flows at the photospheric boundary surface to see if open flux associated with the bipoles can be moved into the closed-field region. Interchange reconnection does occur in response to these motions. However, we find that the open magnetic flux cannot be simply injected into closed-field regions - the flux eventually closes down and disconnected flux is created. Flux either opens or closes, as required, to maintain topologically distinct open- and closed-field regions, with no indiscriminate mixing of the two. The early evolution conforms to the Antiochos conjecture in that a narrow corridor of open flux connects the portion of the coronal hole that is nearly detached by one of the bipoles. In the later evolution, a detached

  11. Data-Driven Decision Making as a Tool to Improve Software Development Productivity

    Science.gov (United States)

    Brown, Mary Erin

    2013-01-01

    The worldwide software project failure rate, based on a survey of information technology software managers' views of user satisfaction, product quality, and staff productivity, is estimated to be between 24% and 36%, and software project success has not kept pace with advances in hardware. The problem addressed by this study was the limited…

  12. Tag-Driven Online Novel Recommendation with Collaborative Item Modeling

    Directory of Open Access Journals (Sweden)

    Fenghuan Li

    2018-04-01

    Full Text Available Online novel recommendation recommends attractive novels according to the preferences and characteristics of users or novels, and is increasingly touted as an indispensable service of many online stores and websites. The interests of the majority of users remain stable over a certain period. However, the initial recommendation list produced by collaborative filtering (CF) spans broad categories, so it is very likely to contain many inappropriate novels. Meanwhile, most algorithms assume that users can provide an explicit preference. This assumption does not always hold, especially in online novel reading. To address these issues, a tag-driven algorithm with collaborative item modeling (TDCIM) is proposed for online novel recommendation. Online novel reading differs from traditional book marketing and lacks explicit preference ratings. In addition, collaborative filtering frequently suffers from the Matthew effect, leading to neglected personalized recommendations and serious long-tail problems. Therefore, item-based CF is improved by latent preference ratings with a punishment mechanism based on novel popularity. On this basis, a tag-driven algorithm is constructed by means of collaborative item modeling and tag extension. Experimental results show that the tag-driven algorithm with collaborative item modeling greatly improves online novel recommendation.
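The item-based CF with a popularity punishment that the abstract describes can be sketched roughly as follows. The toy interaction matrix, the cosine similarity, and the popularity**alpha penalty are illustrative assumptions, not TDCIM's published formulation:

```python
import numpy as np

# Hypothetical toy data: implicit user-novel interactions (1 = read).
# Rows are users, columns are novels; no explicit ratings exist, as the
# abstract notes for online novel reading.
interactions = np.array([
    [1, 1, 0, 0, 1],
    [1, 0, 1, 0, 0],
    [0, 1, 1, 1, 0],
    [1, 1, 0, 1, 0],
], dtype=float)

def item_similarity(R, alpha=0.5):
    """Cosine similarity between items, damped by item popularity.

    Dividing by popularity**alpha punishes very popular novels, one
    plausible reading of the abstract's 'punishment mechanism'."""
    norms = np.linalg.norm(R, axis=0) + 1e-12
    sim = (R.T @ R) / np.outer(norms, norms)   # plain cosine similarity
    popularity = R.sum(axis=0)                 # readers per novel
    return sim / (np.outer(popularity, popularity) ** alpha + 1e-12)

def recommend(R, user, k=2):
    sim = item_similarity(R)
    np.fill_diagonal(sim, 0.0)                 # a novel should not vote for itself
    scores = R[user] @ sim                     # latent preference from novels already read
    scores[R[user] > 0] = -np.inf              # never re-recommend a read novel
    return np.argsort(scores)[::-1][:k]

print(recommend(interactions, user=1))
```

A real system would add the tag-extension step on top of these scores, restricting or re-ranking candidates by tags shared with the user's reading history.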

  13. Studies in Software Cost Model Behavior: Do We Really Understand Cost Model Performance?

    Science.gov (United States)

    Lum, Karen; Hihn, Jairus; Menzies, Tim

    2006-01-01

    While there exists extensive literature on software cost estimation techniques, industry practice continues to rely upon standard regression-based algorithms. These software effort models are typically calibrated or tuned to local conditions using local data. This paper cautions that current approaches to model calibration often produce sub-optimal models, because of the large variance problem inherent in cost data and because far more effort multipliers are included than the data supports. Building optimal models requires that a wider range of models be considered, while correctly calibrating these models requires rejection rules that prune variables and records and use multiple criteria for evaluating model performance. The main contribution of this paper is to document a standard method that integrates formal model identification, estimation, and validation. It also documents what we call the large variance problem, a leading cause of cost model brittleness or instability.
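The kind of regression-based effort model and validation regime the paper discusses can be sketched as below. The project data, the single power-law form, and the lone MMRE criterion are illustrative assumptions (the paper argues for multiple evaluation criteria and variable pruning):

```python
import numpy as np

# Hypothetical project data: size in KSLOC and actual effort in person-months.
# A COCOMO-style log-linear model: effort = a * size**b, fit in log space.
ksloc  = np.array([ 2.0,  5.0,  8.0, 12.0, 20.0, 35.0, 50.0, 80.0])
effort = np.array([ 5.2, 14.0, 24.0, 41.0, 79.0, 150., 230., 410.])

def fit_loglinear(x, y):
    # Least squares on log(effort) = log(a) + b*log(size).
    b, log_a = np.polyfit(np.log(x), np.log(y), 1)
    return np.exp(log_a), b

def loo_mmre(x, y):
    """Leave-one-out mean magnitude of relative error: one of the
    validation criteria that calibration should report."""
    errs = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        a, b = fit_loglinear(x[mask], y[mask])
        pred = a * x[i] ** b
        errs.append(abs(pred - y[i]) / y[i])
    return float(np.mean(errs))

a, b = fit_loglinear(ksloc, effort)
print(f"effort ~ {a:.2f} * KSLOC^{b:.2f}, LOO-MMRE = {loo_mmre(ksloc, effort):.2%}")
```

With real cost data the leave-one-out errors are far larger and far more variable than on this tidy synthetic set, which is precisely the large variance problem the paper documents.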

  14. Modelling dune evolution and dynamic roughness in rivers

    NARCIS (Netherlands)

    Paarlberg, Andries

    2008-01-01

    Accurate river flow models are essential tools for water managers, but these hydraulic simulation models often lack a proper description of dynamic roughness due to hysteresis effects in dune evolution. To incorporate the effects of dune evolution directly into the resistance coefficients of

  15. Staying in the Light: Evaluating Sustainability Models for Brokering Software

    Science.gov (United States)

    Powers, L. A.; Benedict, K. K.; Best, M.; Fyfe, S.; Jacobs, C. A.; Michener, W. K.; Pearlman, J.; Turner, A.; Nativi, S.

    2015-12-01

    The Business Models Team of the Research Data Alliance Brokering Governance Working Group examined several support models proposed to promote the long-term sustainability of brokering middleware. The business model analysis includes examination of funding sources, implementation frameworks and obstacles, and policy and legal considerations. The issue of sustainability is not unique to brokering software, and these models may be relevant to many applications. Results of this comprehensive analysis highlight advantages and disadvantages of the various models with respect to the specific requirements for brokering services. We offer recommendations based on the outcomes of this analysis while recognizing that all software is part of an evolutionary process and has a lifespan.

  16. Benefits of reverse engineering technologies in software development makerspace

    Directory of Open Access Journals (Sweden)

    Aabidi M.H.

    2017-01-01

    Full Text Available In recent decades, the amount of data produced by scientific, engineering, and life science applications has increased by several orders of magnitude. In parallel with this development, the applications themselves have become increasingly complex in terms of functionality, structure, and behavior. At the same time, development and production cycles of such applications tend to become increasingly shorter, due to factors such as market pressure and the rapid evolution of supporting and enabling technologies. As a consequence, an increasing fraction of the cost of creating new applications and manufacturing processes shifts from the creation of new artifacts to the adaptation of existing ones. A key component of this activity is understanding the design, operation, and behavior of existing manufactured artifacts, such as software code bases, hardware systems, and mechanical assemblies. For instance, in the software industry, it is estimated that maintenance costs exceed 80% of the total costs of a software product's lifecycle, and software understanding accounts for as much as half of these maintenance costs. To facilitate the software development process, it would be ideal to have tools that automatically generate, or help to generate, UML (Unified Modeling Language) models from source code. Reverse engineering the software architecture from source code provides a valuable service to software practitioners. CASE tools implementing MDA and reverse engineering therefore constitute an important opportunity for software development engineers, and together they are key to making a makerspace more productive and more efficient.
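As an illustration of the reverse-engineering step the abstract motivates, the minimal sketch below recovers a class model (the skeleton of a UML class diagram) from Python source using the standard `ast` module. The example classes are hypothetical:

```python
import ast

# Hypothetical source to reverse-engineer; in practice this would be read
# from the files of an existing code base.
source = '''
class Sensor:
    def read(self): ...

class TemperatureSensor(Sensor):
    def read(self): ...
    def calibrate(self, offset): ...
'''

def extract_class_model(code):
    """Map class name -> (base class names, method names), i.e. the
    generalization edges and operations of a UML class diagram."""
    model = {}
    for node in ast.walk(ast.parse(code)):
        if isinstance(node, ast.ClassDef):
            bases = [b.id for b in node.bases if isinstance(b, ast.Name)]
            methods = [f.name for f in node.body
                       if isinstance(f, ast.FunctionDef)]
            model[node.name] = (bases, methods)
    return model

# Print one line per class, PlantUML-style ("--|>" marks inheritance).
for name, (bases, methods) in extract_class_model(source).items():
    arrow = f" --|> {bases[0]}" if bases else ""
    print(f"{name}{arrow}: {', '.join(methods)}")
```

Production CASE tools do much more (associations, multiplicities, behavior), but the principle is the same: parse the code into a syntax tree and map its constructs onto model elements.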

  17. Data-Driven Model Order Reduction for Bayesian Inverse Problems

    KAUST Repository

    Cui, Tiangang

    2014-01-06

    One of the major challenges in using MCMC for the solution of inverse problems is the repeated evaluation of computationally expensive numerical models. We develop a data-driven projection-based model order reduction technique to reduce the computational cost of numerical PDE evaluations in this context.
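A common projection-based reduction technique (though not necessarily the one these authors use) is proper orthogonal decomposition: build a reduced basis from the SVD of solution snapshots, then project the full-order operator onto it. A minimal sketch with synthetic snapshot data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Snapshots: full-order states (dimension n) at m parameter samples,
# constructed here with hidden rank-r structure plus small noise.
n, m, r = 200, 30, 5
modes = rng.standard_normal((n, r))
snapshots = modes @ rng.standard_normal((r, m))
snapshots += 1e-6 * rng.standard_normal((n, m))

# POD basis: leading left singular vectors of the snapshot matrix.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
V = U[:, :r]                                   # n x r reduced basis

# Reduce a full-order operator A (n x n) to r x r: A_r = V^T A V.
# Inside MCMC, every model evaluation then works in r dimensions.
A = rng.standard_normal((n, n))
A_r = V.T @ A @ V

# Relative error of projecting a snapshot onto the reduced basis.
x = snapshots[:, 0]
err = np.linalg.norm(x - V @ (V.T @ x)) / np.linalg.norm(x)
print(A_r.shape, f"relative projection error = {err:.2e}")
```

The payoff in an MCMC setting is that each of the thousands of likelihood evaluations solves an r-dimensional system instead of an n-dimensional one.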

  18. Modeling the Thermosphere as a Driven-Dissipative Thermodynamic System

    Science.gov (United States)

    2013-03-01

    (Figure captions: Figure 2, illustration of the geocentric solar magnetospheric coordinate system; Figure 3, diagram of the...) ...to test new methods of modeling the thermospheric environment. Thermosphere as a Driven-Dissipative Thermodynamic System: One approach for modeling... approach uses empirical coupling and relaxation constants to model the input of energy to the thermosphere from the solar wind during

  19. EREM: Parameter Estimation and Ancestral Reconstruction by Expectation-Maximization Algorithm for a Probabilistic Model of Genomic Binary Characters Evolution.

    Science.gov (United States)

    Carmel, Liran; Wolf, Yuri I; Rogozin, Igor B; Koonin, Eugene V

    2010-01-01

    Evolutionary binary characters are features of species or genes, indicating the absence (value zero) or presence (value one) of some property. Examples include eukaryotic gene architecture (the presence or absence of an intron in a particular locus), gene content, and morphological characters. In many studies, the acquisition of such binary characters is assumed to represent a rare evolutionary event, and consequently, their evolution is analyzed using various flavors of parsimony. However, when gain and loss of the character are not rare enough, a probabilistic analysis becomes essential. Here, we present a comprehensive probabilistic model to describe the evolution of binary characters on a bifurcating phylogenetic tree. A fast software tool, EREM, is provided, using maximum likelihood to estimate the parameters of the model and to reconstruct ancestral states (presence and absence in internal nodes) and events (gain and loss events along branches).
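The core computation behind such a model can be sketched as follows: a two-state (absence/presence) gain/loss Markov chain on each branch, combined over the tree by Felsenstein's pruning algorithm. The rates and the toy tree are illustrative assumptions, not EREM's actual interface:

```python
import numpy as np

def transition_matrix(gain, loss, t):
    """P[i, j] = probability of state j at the end of a branch of length t
    given state i at its start (0 = absent, 1 = present)."""
    s = gain + loss
    e = np.exp(-s * t)
    return np.array([
        [(loss + gain * e) / s, gain * (1 - e) / s],
        [loss * (1 - e) / s, (gain + loss * e) / s],
    ])

def prune(node, gain, loss):
    """Return [L(absent), L(present)]: likelihood of the subtree below
    `node`, conditional on each state at `node` (Felsenstein's pruning)."""
    if "state" in node:                        # leaf with observed 0/1
        L = np.zeros(2)
        L[node["state"]] = 1.0
        return L
    L = np.ones(2)
    for child, t in node["children"]:
        P = transition_matrix(gain, loss, t)
        L *= P @ prune(child, gain, loss)      # sum over child states
    return L

# Toy tree ((A:0.1, B:0.1):0.2, C:0.3): character present in A and B,
# absent in C; rates are hypothetical.
tree = {"children": [
    ({"children": [({"state": 1}, 0.1), ({"state": 1}, 0.1)]}, 0.2),
    ({"state": 0}, 0.3),
]}
gain, loss = 0.5, 1.0
root = prune(tree, gain, loss)
pi = np.array([loss, gain]) / (gain + loss)    # stationary prior at the root
print("log-likelihood:", np.log(pi @ root))
```

An EM loop like EREM's would alternate this likelihood computation (E-step, including posterior state probabilities at internal nodes) with re-estimation of the gain and loss rates (M-step).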

  20. Orthographic Software Modelling: A Novel Approach to View-Based Software Engineering

    Science.gov (United States)

    Atkinson, Colin

    The need to support multiple views of complex software architectures, each capturing a different aspect of the system under development, has been recognized for a long time. Even the very first object-oriented analysis/design methods such as the Booch method and OMT supported a number of different diagram types (e.g. structural, behavioral, operational), and subsequent methods such as Fusion, Kruchten's 4+1 views and the Rational Unified Process (RUP) have added many more views over time. Today's leading modeling languages such as the UML and SysML are also oriented towards supporting different views (i.e. diagram types), each able to portray a different facet of a system's architecture. More recently, so-called enterprise architecture frameworks such as the Zachman Framework, TOGAF and RM-ODP have become popular. These add a whole set of new non-functional views to the views typically emphasized in traditional software engineering environments.