WorldWideScience

Sample records for expandable software model

  1. Model-based Software Engineering

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea works already today. But, there are still difficulties when it comes to behaviour. Actually, there is no lack in models...

  2. Software Cost-Estimation Model

    Science.gov (United States)

    Tausworthe, R. C.

    1985-01-01

    Software Cost Estimation Model SOFTCOST provides automated resource and schedule model for software development. Combines several cost models found in open literature into one comprehensive set of algorithms. Compensates for nearly fifty implementation factors relative to size of task, inherited baseline, organizational and system environment and difficulty of task.
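
    For illustration, a minimal sketch of the general shape of such a parametric cost model, assuming a power-law size term with multiplicative adjustment factors; the form and all coefficients below are hypothetical stand-ins, not SOFTCOST's actual algorithms:

      # Hedged sketch of a generic parametric software cost model.
      # The power-law form and all coefficients are illustrative
      # assumptions, not SOFTCOST's actual algorithms.

      def estimate_effort(ksloc, factors):
          """Effort (person-months) = a * KSLOC^b * product of factor multipliers."""
          a, b = 2.8, 1.05  # hypothetical nominal coefficients
          effort = a * ksloc ** b
          for multiplier in factors.values():
              effort *= multiplier
          return effort

      # Example: a 30 KSLOC task with a few (hypothetical) adjustment factors.
      factors = {"inherited_baseline": 0.9, "task_difficulty": 1.2, "environment": 1.1}
      print(round(estimate_effort(30.0, factors), 1), "person-months")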

  3. Explaining Embedded Software Modelling Decisions

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    As today’s devices, gadgets and machines become more intelligent, the complexity of embedded software controlling them grows enormously. To deal with this complexity, embedded software is designed using model-based paradigms. The process of modelling is a combination of formal and creative, design

  4. Modelling Software for Structure Metrics

    NARCIS (Netherlands)

    van den Broek, P.M.; van den Berg, Klaas

    In the approach to structural software metrics, software is modelled by means of flowgraphs. A tacit assumption in this approach is that the structure of a program is reflected by the structure of the flowgraph. When only the flow of control between commands is considered this assumption is valid;

  5. Energy modelling software

    CSIR Research Space (South Africa)

    Osburn, L

    2010-01-01

    Full Text Available The construction industry has turned to energy modelling in order to assist it in reducing the amount of energy consumed by buildings. However, while the energy loads of buildings can be accurately modelled, energy models often under...

  6. The art of software modeling

    CERN Document Server

    Lieberman, Benjamin A

    2007-01-01

    Modeling complex systems is a difficult challenge and all too often one in which modelers are left to their own devices. Using a multidisciplinary approach, The Art of Software Modeling covers theory, practice, and presentation in detail. It focuses on the importance of model creation and demonstrates how to create meaningful models. Presenting three self-contained sections, the text examines the background of modeling and frameworks for organizing information. It identifies techniques for researching and capturing client and system information and addresses the challenges of presenting models to specific audiences. Using concepts from art theory and aesthetics, this broad-based approach encompasses software practices, cognitive science, and information presentation. The book also looks at perception and cognition of diagrams, view composition, color theory, and presentation techniques. Providing practical methods for investigating and organizing complex information, The Art of Software Modeling demonstrate...

  7. FOAM: Expanding the horizons of climate modeling

    Energy Technology Data Exchange (ETDEWEB)

    Tobis, M.; Foster, I.T.; Schafer, C.M. [and others]

    1997-10-01

    We report here on a project that expands the applicability of dynamic climate modeling to very long time scales. The Fast Ocean Atmosphere Model (FOAM) is a coupled ocean atmosphere model that incorporates physics of interest in understanding decade to century time scale variability. It addresses the high computational cost of this endeavor with a combination of improved ocean model formulation, low atmosphere resolution, and efficient coupling. It also uses message passing parallel processing techniques, allowing for the use of cost effective distributed memory platforms. The resulting model runs over 6000 times faster than real time with good fidelity, and has yielded significant results.
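
    For illustration, a minimal sketch of the coupling pattern described above, in which the atmosphere and ocean components advance in step and periodically exchange boundary fields; the classes and field names are invented for this sketch and are not FOAM's actual interfaces:

      # Hedged sketch of a coupled ocean-atmosphere time-stepping loop.
      # Component classes, field names, and step counts are illustrative.

      class Atmosphere:
          def __init__(self):
              self.sst = 288.0  # sea-surface temperature seen by the atmosphere (K)

          def step(self):
              return {"wind_stress": 0.1, "heat_flux": 5.0}

      class Ocean:
          def __init__(self):
              self.forcing = None

          def step(self):
              return {"sst": 288.2}

      def run_coupled(n_days, couple_every=1):
          atm, ocn = Atmosphere(), Ocean()
          for day in range(n_days):
              atm_fields = atm.step()          # atmosphere advances one day
              ocn.forcing = atm_fields         # pass fluxes down to the ocean
              ocn_fields = ocn.step()          # ocean advances one day
              if day % couple_every == 0:
                  atm.sst = ocn_fields["sst"]  # return SST to the atmosphere

      run_coupled(365)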

  8. Software Development Risk Management Model

    OpenAIRE

    Islam, Shareeful

    2011-01-01

    Risk management is an effective tool to control risks in software projects and increases the likelihood of project success. Risk management needs to be integrated as early as possible in the project. This dissertation proposes a Goal-driven Software Development Risk Management Model (GSRM) and explicitly integrates it into requirements engineering phase. This integration provides an early warning of potential problems so that both preventive and corrective actions can be undertaken to avoid t...

  9. Model-integrating software components engineering flexible software systems

    CERN Document Server

    Derakhshanmanesh, Mahdi

    2015-01-01

    In his study, Mahdi Derakhshanmanesh builds on the state of the art in modeling by proposing to integrate models into running software at the component level without translating them to code. Such so-called model-integrating software exploits all advantages of models: models implicitly support a good separation of concerns; they are self-documenting and thus improve understandability and maintainability; and, in contrast to model-driven approaches, there is no synchronization problem anymore between the models and the code generated from them. Using model-integrating components, software will be

  10. DFI Computer Modeling Software (CMS)

    Energy Technology Data Exchange (ETDEWEB)

    Cazalet, E.G.; Deziel, L.B. Jr.; Haas, S.M.; Martin, T.W.; Nesbitt, D.M.; Phillips, R.L.

    1979-10-01

    The database management system used to create, edit, and store model data and solutions for the LEAP system is described. The software is written entirely in FORTRAN-G for the IBM 370 series of computers and provides an interface to the commercial database system SYSTEM-2000.

  11. Motorola Secure Software Development Model

    Directory of Open Access Journals (Sweden)

    Francis Mahendran

    2008-08-01

    Full Text Available In today's world, the key to meeting the demand for improved security is to implement repeatable processes that reliably deliver measurably improved security. While many organizations have announced efforts to institutionalize a secure software development process, there is little or no industry acceptance for a common process improvement framework for secure software development. Motorola has taken the initiative to develop such a framework, and plans to share this with the Software Engineering Institute for possible inclusion into its Capability Maturity Model Integration (CMMI®). This paper will go into the details of how Motorola is addressing this issue. The model that is being developed is designed as an extension of the existing CMMI structure. The assumption is that the audience will have a basic understanding of the SEI CMM® / CMMI® process framework. The paper will not describe implementation details of a security process model or improvement framework, but will address WHAT security practices are required for a company with many organizations operating at different maturity levels. It is left to the implementing organization to answer the HOW, WHEN, WHO and WHERE aspects. The paper will discuss how the model is being implemented in the Motorola Software Group.

  12. Expanding on Successful Concepts, Models, and Organization

    Energy Technology Data Exchange (ETDEWEB)

    Teeguarden, Justin G.; Tan, Yu-Mei; Edwards, Stephen W.; Leonard, Jeremy A.; Anderson, Kim A.; Corley, Richard A.; Kile, Molly L.; L. Massey Simonich, Staci; Stone, David; Tanguay, Robert L.; Waters, Katrina M.; Harper, Stacey L.; Williams, David E.

    2016-09-06

    In her letter to the editor1 regarding our recent Feature Article “Completing the Link between Exposure Science and Toxicology for Improved Environmental Health Decision Making: The Aggregate Exposure Pathway Framework” 2, Dr. von Göetz expressed several concerns about terminology, and the perception that we propose the replacement of successful approaches and models for exposure assessment with a concept. We are glad to have the opportunity to address these issues here. If the goal of the AEP framework was to replace existing exposure models or databases for organizing exposure data with a concept, we would share Dr. von Göetz's concerns. Instead, the outcome we promote is broader use of an organizational framework for exposure science. The framework would support improved generation, organization, and interpretation of data as well as modeling and prediction, not replacement of models. The field of toxicology has seen the benefits of wide use of one or more organizational frameworks (e.g., mode and mechanism of action, adverse outcome pathway). These frameworks influence how experiments are designed, data are collected, curated, stored and interpreted and ultimately how data are used in risk assessment. Exposure science is poised to similarly benefit from broader use of a parallel organizational framework, which Dr. von Göetz correctly points out, is currently used in the exposure modeling community. In our view, the concepts used so effectively in the exposure modeling community, expanded upon in the AEP framework, could see wider adoption by the field as a whole. The value of such a framework was recognized by the National Academy of Sciences.3 Replacement of models, databases, or any application with the AEP framework was not proposed in our article. The positive role broader, more consistent use of such a framework might have in enabling and advancing “general activities such as data acquisition, organization…,” and exposure modeling was discussed.

  13. Software Model Checking for Verifying Distributed Algorithms

    Science.gov (United States)

    2014-10-28

    Software Model Checking for Verifying Distributed Algorithms. Sagar Chaki, James Edmondson. Carnegie Mellon University, October 28, 2014. (Standard report-documentation fields omitted.) A program written in a domain-specific language is passed to a software model checker (CBMC, BLAST, etc.), which reports success or failure; an automatic verification technique for finite

  14. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities... of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) strong social network of actors, (iv) robust business... This thesis documents the groundwork towards addressing the challenges faced by telemedical technologies today and establishing telemedicine as a means of patient diagnosis and treatment. Furthermore, it serves as an empirical example of designing a software ecosystem...

  15. Numerical modelling of multi-vane expander operating conditions in ORC system

    Science.gov (United States)

    Rak, Józef; Błasiak, Przemysław; Kolasiński, Piotr

    2017-11-01

    Multi-vane expanders are positive displacement volumetric machines which are nowadays considered for application in micro-power domestic ORC systems as a promising alternative to micro turbines and other volumetric expanders. Multi-vane expanders feature a very simple design, low gas flow capacity, low expansion ratios and an advantageous ratio of power output to external dimensions, and are insensitive to the negative influence of gas-liquid mixture expansion. Moreover, the multi-vane expander can be easily hermetically sealed, which is one of the key issues in the ORC system design. A literature review indicates that issues concerning the application of multi-vane expanders in such systems, especially related to operating of multi-vane expander with different low-boiling working fluids, are innovative, not fully scientifically described and have the potential for practical implementation. In this paper the results of numerical investigations on multi-vane expander operating conditions are presented. The analyses were performed on a three-dimensional numerical model of the expander in ANSYS CFX software. The numerical model of the expander was validated using the data obtained from the experiment carried out on a lab test-stand. Then a series of computational analyses was performed using the expander's numerical model in order to determine its operating conditions under various flow conditions of different working fluids.

  16. Numerical modelling of multi-vane expander operating conditions in ORC system

    Directory of Open Access Journals (Sweden)

    Rak Józef

    2017-01-01

    Full Text Available Multi-vane expanders are positive displacement volumetric machines which are nowadays considered for application in micro-power domestic ORC systems as a promising alternative to micro turbines and other volumetric expanders. Multi-vane expanders feature a very simple design, low gas flow capacity, low expansion ratios and an advantageous ratio of power output to external dimensions, and are insensitive to the negative influence of gas-liquid mixture expansion. Moreover, the multi-vane expander can be easily hermetically sealed, which is one of the key issues in the ORC system design. A literature review indicates that issues concerning the application of multi-vane expanders in such systems, especially related to operating of multi-vane expander with different low-boiling working fluids, are innovative, not fully scientifically described and have the potential for practical implementation. In this paper the results of numerical investigations on multi-vane expander operating conditions are presented. The analyses were performed on a three-dimensional numerical model of the expander in ANSYS CFX software. The numerical model of the expander was validated using the data obtained from the experiment carried out on a lab test-stand. Then a series of computational analyses was performed using the expander's numerical model in order to determine its operating conditions under various flow conditions of different working fluids.
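
    As a back-of-envelope complement to the CFD analyses described in the two records above, a sketch of an ideal-gas isentropic estimate of expander shaft power; the working-fluid properties and operating point are assumed values, and this is not the ANSYS CFX model from the papers:

      # Hedged sketch: isentropic power estimate for a volumetric expander.
      # P = m_dot * cp * T_in * (1 - (p_out/p_in)^((gamma-1)/gamma)) * eta_s
      # Fluid properties (cp, gamma) and the operating point are assumed values.

      def expander_power(m_dot, t_in, p_in, p_out, cp, gamma, eta_s):
          """Shaft power (W) from an isentropic expansion, scaled by efficiency."""
          pressure_ratio = p_out / p_in
          dh_isentropic = cp * t_in * (1.0 - pressure_ratio ** ((gamma - 1.0) / gamma))
          return m_dot * dh_isentropic * eta_s

      # Example: 0.05 kg/s of a hypothetical low-boiling working fluid.
      p = expander_power(m_dot=0.05, t_in=360.0, p_in=8e5, p_out=2e5,
                         cp=900.0, gamma=1.1, eta_s=0.6)
      print(f"{p:.0f} W")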

  17. Deep space network software cost estimation model

    Science.gov (United States)

    Tausworthe, R. C.

    1981-01-01

    A parametric software cost estimation model prepared for Jet Propulsion Laboratory (JPL) Deep Space Network (DSN) Data System implementation tasks is described. The resource estimation model modifies and combines a number of existing models. The model calibrates the task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software life-cycle statistics.

  18. Software reliability models for critical applications

    Energy Technology Data Exchange (ETDEWEB)

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  19. Software reliability models for critical applications

    Energy Technology Data Exchange (ETDEWEB)

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.
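
    For illustration, a minimal sketch of one classic software reliability growth model, the Goel-Okumoto NHPP, in which the expected cumulative number of failures by time t is m(t) = a(1 - exp(-b t)); the failure data below are made up, and this is a textbook example rather than the state-of-the-art model proposed in the report:

      # Hedged sketch: fitting the Goel-Okumoto NHPP reliability growth model
      # m(t) = a * (1 - exp(-b * t)) to hypothetical cumulative failure counts.
      import numpy as np
      from scipy.optimize import curve_fit

      def goel_okumoto(t, a, b):
          return a * (1.0 - np.exp(-b * t))

      # Made-up test data: cumulative failures observed at weekly intervals.
      weeks = np.arange(1, 11, dtype=float)
      failures = np.array([8, 15, 20, 24, 27, 29, 31, 32, 33, 34], dtype=float)

      (a_hat, b_hat), _ = curve_fit(goel_okumoto, weeks, failures, p0=(40.0, 0.2))
      print(f"estimated total faults a = {a_hat:.1f}, detection rate b = {b_hat:.2f}")
      print(f"predicted residual faults: {a_hat - goel_okumoto(10.0, a_hat, b_hat):.1f}")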

  20. Understanding software faults and their role in software reliability modeling

    Science.gov (United States)

    Munson, John C.

    1994-01-01

    This study is a direct result of an on-going project to model the reliability of a large real-time control avionics system. In previous modeling efforts with this system, hardware reliability models were applied in modeling the reliability behavior of this system. In an attempt to enhance the performance of the adapted reliability models, certain software attributes were introduced in these models to control for differences between programs and also sequential executions of the same program. As the basic nature of the software attributes that affect software reliability become better understood in the modeling process, this information begins to have important implications on the software development process. A significant problem arises when raw attribute measures are to be used in statistical models as predictors, for example, of measures of software quality. This is because many of the metrics are highly correlated. Consider the two attributes: lines of code, LOC, and number of program statements, Stmts. In this case, it is quite obvious that a program with a high value of LOC probably will also have a relatively high value of Stmts. In the case of low level languages, such as assembly language programs, there might be a one-to-one relationship between the statement count and the lines of code. When there is a complete absence of linear relationship among the metrics, they are said to be orthogonal or uncorrelated. Usually the lack of orthogonality is not serious enough to affect a statistical analysis. However, for the purposes of some statistical analysis such as multiple regression, the software metrics are so strongly interrelated that the regression results may be ambiguous and possibly even misleading. Typically, it is difficult to estimate the unique effects of individual software metrics in the regression equation. The estimated values of the coefficients are very sensitive to slight changes in the data and to the addition or deletion of variables in the
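
    For illustration, a minimal sketch of the multicollinearity problem described above: when two metrics such as LOC and Stmts are nearly collinear (simulated here with synthetic data), regression coefficients swing between resamples even though their combined effect stays stable:

      # Hedged sketch: unstable regression coefficients under multicollinearity.
      # LOC and Stmts are simulated to be almost perfectly correlated.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 100
      loc = rng.uniform(100, 2000, n)
      stmts = 0.8 * loc + rng.normal(0, 10, n)       # nearly collinear with LOC
      defects = 0.01 * loc + rng.normal(0, 2, n)     # "true" model uses LOC only

      for trial in range(3):
          idx = rng.choice(n, n, replace=True)       # bootstrap resample
          X = np.column_stack([np.ones(n), loc[idx], stmts[idx]])
          coef, *_ = np.linalg.lstsq(X, defects[idx], rcond=None)
          print(f"trial {trial}: b_LOC = {coef[1]:+.4f}, b_Stmts = {coef[2]:+.4f}")
      # The individual coefficients swing between resamples while their
      # combined effect (b_LOC + 0.8 * b_Stmts) remains comparatively stable.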

  1. Modeling software behavior a craftsman's approach

    CERN Document Server

    Jorgensen, Paul C

    2009-01-01

    A common problem with most texts on requirements specifications is that they emphasize structural models to the near exclusion of behavioral models, focusing on what the software is rather than what it does. If they do cover behavioral models, the coverage is brief and usually focused on a single model. Modeling Software Behavior: A Craftsman's Approach provides detailed treatment of various models of software behavior that support early analysis, comprehension, and model-based testing. Based on the popular and continually evolving course on requirements specification models taught by the auth

  2. The Expanded Capabilities Of The Cementitious Barriers Partnership Software Toolbox Version 2.0 - 14331

    Energy Technology Data Exchange (ETDEWEB)

    Burns, Heather; Flach, Greg; Smith, Frank; Langton, Christine; Brown, Kevin; Kosson, David; Samson, Eric; Mallick, Pramod

    2014-01-10

    The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the U.S. Department of Energy (US DOE) Office of Tank Waste Management. The CBP program has developed a set of integrated tools (based on state-of-the-art models and leaching test methods) that help improve understanding and predictions of the long-term structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. The CBP Software Toolbox – “Version 1.0” was released early in FY2013 and was used to support DOE-EM performance assessments in evaluating various degradation mechanisms that included sulfate attack, carbonation and constituent leaching. The sulfate attack analysis predicted the extent of the damage that sulfate ingress will have on concrete vaults over extended time (i.e., > 1000 years) and the carbonation analysis provided concrete degradation predictions from rebar corrosion. The new release “Version 2.0” includes upgraded carbonation software and a new software module to evaluate degradation due to chloride attack. Also included in the newer version are a dual regime module allowing evaluation of contaminant release in two regimes – both fractured and un-fractured. The integrated software package has also been upgraded with new plotting capabilities and many other features that increase the “user-friendliness” of the package. Experimental work has been performed to provide data to calibrate the models to improve the credibility of the analysis and reduce the uncertainty. Tools selected for and developed under this program have been used to evaluate and predict the behavior of cementitious barriers used in near-surface engineered waste disposal systems for periods of performance up to or longer than 100 years for operating facilities and longer than 1000 years for waste disposal. The CBP Software Toolbox is and will continue to produce tangible benefits to the working DOE

  3. The Ragnarok Architectural Software Configuration Management Model

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    1999-01-01

    The architecture is the fundamental framework for designing and implementing large scale software, and the ability to trace and control its evolution is essential. However, many traditional software configuration management tools view 'software' merely as a set of files, not as an architecture... This introduces an unfortunate impedance mismatch between the design domain (architecture level) and the configuration management domain (file level). This paper presents a software configuration management model that allows tight version control and configuration management of the architecture of a software system...

  4. Software Process Models and Analysis on Failure of Software Development Projects

    OpenAIRE

    Kaur, Rupinder; Sengupta, Jyotsna

    2013-01-01

    The software process model consists of a set of activities undertaken to design, develop and maintain software systems. A variety of software process models have been designed to structure, describe and prescribe the software development process. The software process models play a very important role in software development, so it forms the core of the software product. Software project failure is often devastating to an organization. Schedule slips, buggy releases and missing features can me...

  5. Evaluating software architecture using fuzzy formal models

    Directory of Open Access Journals (Sweden)

    Payman Behbahaninejad

    2012-04-01

    Full Text Available Unified Modeling Language (UML) has been recognized as one of the most popular techniques to describe static and dynamic aspects of software systems. One of the primary issues in designing software packages is the existence of uncertainty associated with such models. Fuzzy-UML describes software architecture from both static and dynamic perspectives simultaneously. Evaluating software architecture at the design phase always helps to uncover additional requirements, which helps reduce the cost of design. In this paper, we use a fuzzy data model to describe the static aspects of software architecture and the fuzzy sequence diagram to illustrate the dynamic aspects of software architecture. We also transform these diagrams into Petri Nets and evaluate the reliability of the architecture. A web-based hotel reservation system is studied as an illustrative example.

  6. Model Checking Software Systems: A Case Study.

    Science.gov (United States)

    1995-03-10

    Model checking is a proven successful technology for verifying hardware. It works, however, on only finite state machines, and most software systems have infinitely many states. Our approach to applying model checking to software hinges on identifying appropriate abstractions that exploit the
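
    For illustration, a minimal sketch of explicit-state safety checking over a finite state machine; the toy two-process model below is a generic textbook abstraction, not the case study from the report:

      # Hedged sketch: explicit-state safety checking by BFS over a finite
      # state space. The toy "model" lets each of two processes move from
      # idle -> trying -> critical with no locking, so the checker finds the
      # bad state where both are critical.
      from collections import deque

      def successors(state):
          for i in (0, 1):
              pc = list(state)
              if pc[i] == "idle":
                  pc[i] = "trying"
              elif pc[i] == "trying":
                  pc[i] = "critical"
              else:
                  pc[i] = "idle"
              yield tuple(pc)

      def check_safety(initial, is_bad):
          seen, queue = {initial}, deque([initial])
          while queue:
              state = queue.popleft()
              if is_bad(state):
                  return state              # counterexample state
              for nxt in successors(state):
                  if nxt not in seen:
                      seen.add(nxt)
                      queue.append(nxt)
          return None                       # property holds

      bad = check_safety(("idle", "idle"), lambda s: s == ("critical", "critical"))
      print("violation:" if bad else "safe:", bad)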

  7. Model-driven software migration a methodology

    CERN Document Server

    Wagner, Christian

    2014-01-01

    Today, reliable software systems are the basis of any business or company. The continuous further development of those systems is the central component in software evolution. It requires a huge amount of time and manpower, as well as financial resources. The challenges are size, seniority and heterogeneity of those software systems. Christian Wagner addresses software evolution: the inherent problems and uncertainties in the process. He presents a model-driven method which leads to a synchronization between source code and design. As a result the model layer will be the central part in further e

  8. EPA's Benchmark Dose Modeling Software

    Science.gov (United States)

    The EPA developed the Benchmark Dose Software (BMDS) as a tool to help Agency risk assessors facilitate applying benchmark dose (BMD) methods to EPA's human health risk assessment (HHRA) documents. The application of BMD methods overcomes many well-known limitations ...

  9. An Expanded Model of Distributed Leadership in Organizational Knowledge Creation

    OpenAIRE

    Cannatelli, B.; Smith, B. J.; Giudici, A.; Jones, J.; Conger, M.

    2016-01-01

    Based on a three-year qualitative, longitudinal case study of a social venture partnership, we extend the understanding of distributed leadership in organizational knowledge creation. We develop an expanded model of distributed leadership that identifies the antecedents, different forms, and enablers of distributed leadership in knowledge creation. Our findings move beyond a static and monolithic understanding of distributed leadership to illustrate how an expanded model informs the situation...

  10. Software cost/resource modeling: Software quality tradeoff measurement

    Science.gov (United States)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  11. Impact of Agile Software Development Model on Software Maintainability

    Science.gov (United States)

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burden tightly budgeted information technology (IT) organizations. The agile software development approach delivers business value early, but its implications for software maintainability are still unknown. The purpose of this quantitative study…

  12. Expanding Human Capabilities through the Adoption and Utilization of Free, Libre, and Open Source Software

    Science.gov (United States)

    Simpson, James Daniel

    2014-01-01

    Free, libre, and open source software (FLOSS) is software that is collaboratively developed. FLOSS provides end-users with the source code and the freedom to adapt or modify a piece of software to fit their needs (Deek & McHugh, 2008; Stallman, 2010). FLOSS has a 30 year history that dates to the open hacker community at the Massachusetts…

  13. Expanding the Model of Organizational Learning: Scope, Contingencies, and Dynamics

    Directory of Open Access Journals (Sweden)

    Barbara Grah

    2016-05-01

    Full Text Available Our paper seeks to contribute to the understanding of organizational learning by (a) integrating existing models of organizational learning into a single model and (b) expanding the model to include inter-organizational learning, adding key contingencies suggested by the growing literature on neuroleadership, and incorporating a process dimension to reflect the fact that organizational learning is continuous and dynamic. The resulting expanded model of organizational learning encompasses four levels on which learning can occur: individual, team, organizational, and inter-organizational. The overall validity of the model is illustrated by applying it to two knowledge-intensive Slovenian firms. Implications for theory and practice are discussed.

  14. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  15. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

    Full Text Available The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
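
    For illustration, a minimal sketch of the two data structures named above, identification trees and mitigation trees; the node fields and the example threat are assumptions, since the papers do not spell out AutSEC's internal representation:

      # Hedged sketch: tree structures for identifying threats in a design
      # and advising mitigations. Field names and the example are assumptions.
      from dataclasses import dataclass, field

      @dataclass
      class Node:
          label: str
          children: list = field(default_factory=list)
          cost: float = 0.0     # only meaningful on mitigation leaves

      def cheapest_mitigation(node):
          """Walk a mitigation tree and return the lowest-cost leaf."""
          if not node.children:
              return node
          return min((cheapest_mitigation(c) for c in node.children),
                     key=lambda leaf: leaf.cost)

      # Identification tree: a threat decomposed into preconditions.
      ident = Node("tampering with data flow", [
          Node("flow crosses trust boundary"),
          Node("no integrity protection on channel"),
      ])

      # Mitigation tree: alternative countermeasures with rough costs.
      mitig = Node("protect channel", [
          Node("enable TLS", cost=1.0),
          Node("sign each message", cost=3.0),
      ])

      print("advise:", cheapest_mitigation(mitig).label)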

  16. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage the large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analysis core photos and images; waveforms and NMR; and external files documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software that features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  17. Fuzzy Cognitive Map Modelling Educational Software Adoption

    Science.gov (United States)

    Hossain, Sarmin; Brooks, Laurence

    2008-01-01

    Educational software adoption across UK secondary schools is seen as unsatisfactory. Based on stakeholders' perceptions, this paper uses fuzzy cognitive maps (FCMs) to model this adoption context. It discusses the development of the FCM model, using a mixed-methods approach and drawing on participants from three UK secondary schools. The study…

  18. Software Model Checking Without Source Code

    Science.gov (United States)

    Chaki, Sagar; Ivers, James

    2009-01-01

    We present a framework, called AIR, for verifying safety properties of assembly language programs via software model checking. AIR extends the applicability of predicate abstraction and counterexample guided abstraction refinement to the automated verification of low-level software. By working at the assembly level, AIR allows verification of programs for which source code is unavailable (such as legacy and COTS software) and programs that use features that are problematic for source-level software verification tools (such as pointers, structures, and object-orientation). In addition, AIR makes no assumptions about the underlying compiler technology. We have implemented a prototype of AIR and present encouraging results on several non-trivial examples.

  19. Management models in the NZ software industry

    Directory of Open Access Journals (Sweden)

    Holger Spill

    Full Text Available This research interviewed eight innovative New Zealand software companies to find out how they manage new product development. It looked at how management used standard techniques of software development to manage product uncertainty through the theoretical lens of the Cyclic Innovation Model. The study found that while there is considerable variation, the management of innovation was largely determined by the level of complexity. Organizations with complex innovative software products had a more iterative software development style, more flexible internal processes and swifter decision-making. Organizations with less complexity in their products tended to use more formal structured approaches. Overall complexity could be inferred with reference to four key factors within the development environment.

  20. A software risk management capability model for medical device software

    OpenAIRE

    Burton, John

    2008-01-01

    peer-reviewed The Medical Device industry is currently one of the fastest growing industries in the world and a guarantee of the integrity of medical device software has become increasingly important. Failure of the software can have potentially catastrophic effects, leading to injury of patients or even death. Consequently there is a tremendous onus on medical device manufacturers to demonstrate that sufficient attention is devoted to the area of software risk management throughout the so...

  1. inventory management, VMI, software agents, MDV model

    Directory of Open Access Journals (Sweden)

    Waldemar Wieczerzycki

    2012-03-01

    Full Text Available Background: As is well known, the implementation of logistics management instruments is only possible with the use of the latest information technology. So-called agent technology is one of the most promising solutions in this area. Its essence consists in an entirely new way of distributing software on a computer network platform, in which computers exchange among themselves not only data but also software modules, called agents. The first aim is to propose an alternative method for implementing the concept of inventory management by the supplier, with the use of intelligent software agents that are able not only to transfer information but also to make autonomous decisions based on the privileges given to them. The second aim of this research was to propose a new model of a software agent that combines high mobility with high intelligence. Methods: After a brief discussion of the nature of agent technology, the most important benefits of using it to build platforms to support business are given. Then the original model of a polymorphic software agent, called the Multi-Dimensionally Versioned Software Agent (MDV), is presented, which is oriented to the specifics of IT applications in business. The MDV agent is polymorphic, which allows transmitting through the network only the most relevant parts of its code, and only when necessary. Consequently, the network nodes exchange small amounts of software code, which ensures high mobility of software agents, and thus highly efficient operation of IT platforms built on the proposed model. Next, the adaptation of MDV software agents to the implementation of a well-known logistics management instrument, VMI (Vendor Managed Inventory), is illustrated. Results: The key benefits of this approach are identified, among which one can distinguish: reduced costs, higher flexibility and efficiency, new functionality, especially addressed to business negotiation, full automation

  2. Bioinactivation: Software for modelling dynamic microbial inactivation.

    Science.gov (United States)

    Garre, Alberto; Fernández, Pablo S; Lindqvist, Roland; Egea, Jose A

    2017-03-01

    This contribution presents the bioinactivation software, which implements functions for the modelling of isothermal and non-isothermal microbial inactivation. This software offers features such as user-friendliness, modelling of dynamic conditions, possibility to choose the fitting algorithm and generation of prediction intervals. The software is offered in two different formats: Bioinactivation core and Bioinactivation SE. Bioinactivation core is a package for the R programming language, which includes features for the generation of predictions and for the fitting of models to inactivation experiments using non-linear regression or a Markov Chain Monte Carlo algorithm (MCMC). The calculations are based on inactivation models common in academia and industry (Bigelow, Peleg, Mafart and Geeraerd). Bioinactivation SE supplies a user-friendly interface to selected functions of Bioinactivation core, namely the model fitting of non-isothermal experiments and the generation of prediction intervals. The capabilities of bioinactivation are presented in this paper through a case study, modelling the non-isothermal inactivation of Bacillus sporothermodurans. This study has provided a full characterization of the response of the bacteria to dynamic temperature conditions, including confidence intervals for the model parameters and a prediction interval of the survivor curve. We conclude that the MCMC algorithm produces a better characterization of the biological uncertainty and variability than non-linear regression. The bioinactivation software can be relevant to the food and pharmaceutical industry, as well as to regulatory agencies, as part of a (quantitative) microbial risk assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.
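
    For illustration, a minimal sketch of the Bigelow model that the package implements, extended to dynamic conditions by integrating the instantaneous inactivation rate 1/D(T(t)); the parameters and temperature profile are hypothetical, and this is a Python sketch rather than the package's R implementation:

      # Hedged sketch: Bigelow log-linear inactivation under a dynamic
      # temperature profile. log10 N(t) = log10 N0 - integral_0^t dt'/D(T(t'))
      # with D(T) = D_ref * 10^((T_ref - T)/z). Parameter values are made up.
      import numpy as np

      def d_value(temp, d_ref=1.5, t_ref=120.0, z=10.0):
          """Decimal reduction time (min) at temperature temp (deg C)."""
          return d_ref * 10.0 ** ((t_ref - temp) / z)

      def log_survivors(times, temps, log10_n0=6.0):
          """Integrate 1/D(T(t)) with the trapezoidal rule."""
          rates = 1.0 / d_value(np.asarray(temps))
          reductions = np.concatenate(([0.0], np.cumsum(
              0.5 * (rates[1:] + rates[:-1]) * np.diff(times))))
          return log10_n0 - reductions

      # Hypothetical linear come-up from 100 to 125 deg C over 10 minutes.
      t = np.linspace(0.0, 10.0, 101)
      profile = 100.0 + 2.5 * t
      print(f"final log10 count: {log_survivors(t, profile)[-1]:.2f}")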

  3. Testing Software Development Project Productivity Model

    Science.gov (United States)

    Lipkin, Ilya

    Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available that can guide an organization in estimating software development, with existing estimation models often underestimating software development efforts by as much as 500 to 600 percent. To address this issue, existing models usually are calibrated using local data with a small sample size, with resulting estimates not offering improved cost analysis. This study presents a conceptual model for accurately estimating software development, based on an extensive literature review and theoretical analysis grounded in Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. Practical implications of this study allow practitioners to concentrate on specific constructs of interest that provide the best value for the least amount of time. This study outlines key contributing constructs that are unique for Software Size E-SLOC, Man-hours Spent, and Quality of the Product, those constructs having the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for the customers and suppliers. Theoretical contributions of this study provide an initial theory-based hypothesized project productivity model that can be used as a generic overall model across several application domains such as IT, Command and Control

  4. Architecture design in global and model-centric software development

    NARCIS (Netherlands)

    Heijstek, Werner

    2012-01-01

    This doctoral dissertation describes a series of empirical investigations into representation, dissemination and coordination of software architecture design in the context of global software development. A particular focus is placed on model-centric and model-driven software development.

  5. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed format Fortran.

  6. Model Driven Software Development for Agricultural Robotics

    DEFF Research Database (Denmark)

    Larsen, Morten

    The design and development of agricultural robots consists of mechanical, electrical and software components. All these components must be designed and combined such that the overall goal of the robot is fulfilled. The design and development of these systems require collaboration between... processing, control engineering, etc. This thesis proposes a Model-Driven Software Development based approach to model, analyse and partially generate the software implementation of an agricultural robot. Furthermore, guidelines for modelling the architecture of agricultural robots are provided..., assisting with bridging the different engineering disciplines. Timing plays an important role in agricultural robotic applications; synchronisation of robot movement and implement actions is important in order to achieve precision spraying, mechanical weeding, individual feeding, etc. Discovering

  7. Software Engineering Laboratory (SEL) cleanroom process model

    Science.gov (United States)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the clean room methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  8. Expanding the Functional Assessment Model for Naturalistic Intervention Design.

    Science.gov (United States)

    Evans, Ian M.

    2000-01-01

    This article comments on a study that used functional assessment to reduce behavior problems in a child with multiple disabilities (Kern and Vorndran, 2000). It suggests additional principles need to be incorporated into an expanded model if functional assessment is to have a truly positive influence on naturalistic treatment planning. (Contains…

  9. Memoised Garbage Collection for Software Model Checking

    NARCIS (Netherlands)

    Nguyen, V.Y.; Ruys, T.C.; Kowalewski, S.; Philippou, A.

    Virtual machine based software model checkers like JPF and MoonWalker spend up to half of their verification time on garbage collection. This is no surprise as after nearly each transition the heap has to be cleaned from garbage. To improve this, this paper presents the Memoised Garbage Collection

  10. Herramientas libres para modelar software (Free tools to model software)

    Directory of Open Access Journals (Sweden)

    Mauro Callejas Cuervo; Óscar Yovany Baquero Moreno

    2010-11-01

    Full Text Available An observation on free software and its implication in software development processes with 4G tools, carried out by entities or individuals without astronomical capital and without the monopolistic mentality of dominating the market with costly products that make their vendors multimillionaires while offering no real guarantee, nor even the possibility of knowing the software one has paid for, much less of modifying it if it does not meet our expectations.

  11. A Model of Foam Density Prediction for Expanded Perlite Composites

    OpenAIRE

    Arifuzzaman Md; Kim Ho Sung

    2015-01-01

    Multiple sets of variables associated with expanded perlite particle consolidation in foam manufacturing were analyzed to develop a model for predicting perlite foam density. The consolidation of perlite particles based on the flotation method and compaction involves numerous variables leading to the final perlite foam density. The variables include binder content, compaction ratio, perlite particle size, various perlite particle densities and porosities, and various volumes of perlite at dif...

  12. Artificial Intelligence Software Engineering (AISE) model

    Science.gov (United States)

    Kiss, Peter A.

    1990-01-01

    The American Institute of Aeronautics and Astronautics has initiated a committee on standards for Artificial Intelligence. Presented are the initial efforts of one of the working groups of that committee. A candidate model is presented for the development life cycle of knowledge based systems (KBSs). The intent is for the model to be used by the aerospace community and eventually be evolved into a standard. The model is rooted in the evolutionary model, borrows from the spiral model, and is embedded in the standard Waterfall model for software development. Its intent is to satisfy the development of both stand-alone and embedded KBSs. The phases of the life cycle are shown and detailed as are the review points that constitute the key milestones throughout the development process. The applicability and strengths of the model are discussed along with areas needing further development and refinement by the aerospace community.

  13. NASA Software Cost Estimation Model: An Analogy Based Estimation Model

    Science.gov (United States)

    Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James

    2015-01-01

    The cost estimation of software development activities is increasingly critical for large scale integrated projects such as those at DOD and NASA, especially as the software systems become larger and more complex. As an example, MSL (Mars Science Laboratory), developed at the Jet Propulsion Laboratory, launched with over 2 million lines of code, making it the largest robotic spacecraft ever flown (based on the size of the software). Software development activities are also notorious for their cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly being tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods there is very little focus on the use of models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model performance is evaluated by comparing it to COCOMO II, linear regression, and K-nearest neighbor prediction model performance on the same data set.
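
    For illustration, a minimal sketch of estimation by analogy using a k-nearest-neighbor lookup over normalized project features; the feature set and the tiny historical dataset are fabricated, and this is not the NASA Software Cost Model itself:

      # Hedged sketch: analogy-based effort estimation via k-nearest neighbors.
      # Historical projects and features (KSLOC, team size) are fabricated.
      import numpy as np

      history = np.array([[10, 5], [25, 8], [40, 12], [60, 15], [120, 30]], float)
      efforts = np.array([80, 210, 340, 500, 1100], float)  # person-months

      def estimate_by_analogy(features, k=2):
          lo, hi = history.min(axis=0), history.max(axis=0)
          scaled_hist = (history - lo) / (hi - lo)           # min-max normalize
          scaled_new = (np.asarray(features, float) - lo) / (hi - lo)
          dists = np.linalg.norm(scaled_hist - scaled_new, axis=1)
          nearest = np.argsort(dists)[:k]                    # k closest analogues
          return efforts[nearest].mean()

      print(f"estimated effort: {estimate_by_analogy([35, 10]):.0f} person-months")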

  14. Applying reliability models to the maintenance of Space Shuttle software

    Science.gov (United States)

    Schneidewind, Norman F.

    1992-01-01

    Software reliability models provide the software manager with a powerful tool for predicting, controlling, and assessing the reliability of software during maintenance. We show how a reliability model can be effectively employed for reliability prediction and the development of maintenance strategies using the Space Shuttle Primary Avionics Software Subsystem as an example.

  15. Software cost/resource modeling: Deep space network software cost estimation model

    Science.gov (United States)

    Tausworthe, R. J.

    1980-01-01

    A parametric software cost estimation model prepared for JPL deep space network (DSN) data systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models, such as those of the General Research Corporation, Doty Associates, IBM (Walston-Felix), Rome Air Force Development Center, University of Maryland, and Rayleigh-Norden-Putnam. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software lifecycle statistics. The estimation model output scales a standard DSN work breakdown structure skeleton, which is then input to a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.

  16. Saphire models and software for ASP evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Sattison, M.B. [Idaho National Engineering Lab., Idaho Falls, ID (United States)

    1997-02-01

    The Idaho National Engineering Laboratory (INEL) over the past three years has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented.

  17. An expanded model of faculty vitality in academic medicine.

    Science.gov (United States)

    Dankoski, Mary E; Palmer, Megan M; Nelson Laird, Thomas F; Ribera, Amy K; Bogdewic, Stephen P

    2012-12-01

    Many faculty in today's academic medical centers face high levels of stress and low career satisfaction. Understanding faculty vitality is critically important for the health of our academic medical centers, yet the concept is ill-defined and lacking a comprehensive model. Expanding on previous research that examines vital faculty in higher education broadly and in academic medical centers specifically, this study proposes an expanded model of the unique factors that contribute to faculty vitality in academic medicine. We developed an online survey on the basis of a conceptual model (N = 564) and used linear regression to investigate the fit of the model. We examined the relationships of two predictor variables measuring Primary Unit Climate and Leadership and Career and Life Management with an overall Faculty Vitality index comprised of three measures: Professional Engagement, Career Satisfaction, and Productivity. The findings revealed significant predictive relationships between Primary Unit Climate and Leadership, Career and Life Management, and Faculty Vitality. The overall model accounted for 59% of the variance in the overall Faculty Vitality Index. The results provide new insights into the developing model of faculty vitality and inform initiatives to support faculty in academic medical centers. Given the immense challenges faced by faculty, now more than ever do we need reliable evidence regarding what sustains faculty vitality.

  18. Integration of Simulink Models with Component-based Software Models

    Directory of Open Access Journals (Sweden)

    MARIAN, N.

    2008-06-01

    Full Text Available Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, plant and controller behavior is simulated using graphical blocks to represent mathematical and logical constructs and process flow; then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behavior as a means of computation, communication and constraints, using computational blocks and aggregates for both discrete and continuous behavior, different interconnection and execution disciplines for event-based and time-based controllers, and so on, to meet the demands for more functionality at ever lower prices and under opposing constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of the Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has to be analyzed. One way of doing that is to integrate the model, via wrapper files, back into Simulink S-functions and use Simulink's extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behavior, and the transformation of the software system into the S-functions.

  19. Support for an expanded tripartite influence model with gay men.

    Science.gov (United States)

    Tylka, Tracy L; Andorka, Michael J

    2012-01-01

    This study investigated whether an expanded tripartite influence model would represent gay men's experiences. This model was extended by adding partners and gay community involvement as sources of social influence and considering dual body image pathways (muscularity and body fat dissatisfaction) to muscularity enhancement and disordered eating behaviors. Latent variable structural equation modeling analyses upheld this model for 346 gay men. Dual body image pathways to body change behaviors were supported, although three unanticipated interrelationships emerged, suggesting that muscularity and body fat concerns and behaviors may be more integrated for gay men. Internalization of the mesomorphic ideal, appearance comparison, muscularity dissatisfaction, and body fat dissatisfaction were key mediators in the model. Of the sources of social influence, friend and media pressure to be lean, gay community involvement, and partner, friend, media, and family pressures to be muscular made incremental contributions. Unexpectedly, certain sources were directly connected to body change behaviors. Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. Reaction Wheel Disturbance Model Extraction Software - RWDMES

    Science.gov (United States)

    Blaurock, Carl

    2009-01-01

    The RWDMES is a tool for modeling the disturbances imparted on spacecraft by spinning reaction wheels. Reaction wheels are usually the largest disturbance source on a precision pointing spacecraft, and can be the dominating source of pointing error. Accurate knowledge of the disturbance environment is critical to accurate prediction of the pointing performance. In the past, it has been difficult to extract an accurate wheel disturbance model since the forcing mechanisms are difficult to model physically, and the forcing amplitudes are filtered by the dynamics of the reaction wheel. RWDMES captures the wheel-induced disturbances using a hybrid physical/empirical model that is extracted directly from measured forcing data. The empirical models capture the tonal forces that occur at harmonics of the spin rate, and the broadband forces that arise from random effects. The empirical forcing functions are filtered by a physical model of the wheel structure that includes spin-rate-dependent moments (gyroscopic terms). The resulting hybrid model creates a highly accurate prediction of wheel-induced forces. It accounts for variation in disturbance frequency, as well as the shifts in structural amplification by the whirl modes, as the spin rate changes. This software provides a point-and-click environment for producing accurate models with minimal user effort. Where conventional approaches may take weeks to produce a model of variable quality, RWDMES can create a demonstrably high accuracy model in two hours. The software consists of a graphical user interface (GUI) that enables the user to specify all analysis parameters, to evaluate analysis results and to iteratively refine the model. Underlying algorithms automatically extract disturbance harmonics, initialize and tune harmonic models, and initialize and tune broadband noise models. The component steps are described in the RWDMES user's guide and include converting time domain data to waterfall PSDs (power spectral densities).
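
    The tonal model described above matches the standard empirical wheel-disturbance form, in which each harmonic force amplitude scales with the square of the wheel speed. The sketch below implements that generic form; the harmonic numbers, amplitude coefficients, and phases are hypothetical placeholders, whereas RWDMES extracts them from measured data and then filters them through the structural wheel model.

```python
# Standard empirical tonal wheel-disturbance model:
#   F_i(t) = C_i * Omega^2 * sin(2*pi*h_i*Omega*t + phi_i),
# where h_i are harmonic numbers of the wheel spin rate Omega (rev/s).
import numpy as np

def tonal_disturbance(t, omega_hz, harmonics, amplitudes, phases):
    """Sum of tonal forces at harmonics of the wheel spin rate."""
    f = np.zeros_like(t)
    for h, c, phi in zip(harmonics, amplitudes, phases):
        f += c * omega_hz**2 * np.sin(2 * np.pi * h * omega_hz * t + phi)
    return f

t = np.linspace(0.0, 1.0, 2000)
force = tonal_disturbance(t, omega_hz=30.0,
                          harmonics=[1.0, 2.0, 5.2],      # fundamental + two harmonics
                          amplitudes=[1e-4, 3e-5, 2e-5],  # N/(rev/s)^2, hypothetical
                          phases=[0.0, 0.5, 1.0])
```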

  1. Next-Generation Lightweight Mirror Modeling Software

    Science.gov (United States)

    Arnold, William R., Sr.; Fitzgerald, Mathew; Rosa, Rubin Jaca; Stahl, Phil

    2013-01-01

    The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process, and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space based mirrors. The software deals with any current mirror manufacturing technique: single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 5-10 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size, and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any editor; all the key shell thickness parameters are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS. This makes integration of these models into large telescope or satellite models possible.

  2. Next Generation Lightweight Mirror Modeling Software

    Science.gov (United States)

    Arnold, William R., Sr.; Fitzgerald, Mathew; Rosa, Rubin Jaca; Stahl, H. Philip

    2013-01-01

    The advances in manufacturing techniques for lightweight mirrors, such as EXELSIS deep core low temperature fusion, Corning's continued improvements in the Frit bonding process, and the ability to cast large complex designs, combined with water-jet and conventional diamond machining of glasses and ceramics, have created the need for more efficient means of generating finite element models of these structures. Traditional methods of assembling 400,000+ element models can take weeks of effort, severely limiting the range of possible optimization variables. This paper will introduce model generation software developed under NASA sponsorship for the design of both terrestrial and space based mirrors. The software deals with any current mirror manufacturing technique: single substrates, multiple arrays of substrates, as well as the ability to merge submodels into a single large model. The modeler generates both mirror and suspension system elements; suspensions can be created either for each individual petal or for the whole mirror. A typical model generation of 250,000 nodes and 450,000 elements takes only 5-10 minutes, much of that time being variable input time. The program can create input decks for ANSYS, ABAQUS and NASTRAN. An archive/retrieval system permits creation of complete trade studies, varying cell size, depth, petal size, and suspension geometry, with the ability to recall a particular set of parameters and make small or large changes with ease. The input decks created by the modeler are text files which can be modified by any editor; all the key shell thickness parameters are accessible, and comments in the deck identify which groups of elements are associated with these parameters. This again makes optimization easier. With ANSYS decks, the nodes representing support attachments are grouped into components; in ABAQUS these are SETS and in NASTRAN GRIDPOINT SETS. This makes integration of these models into large telescope or satellite models easier.

  3. Automated Environment Generation for Software Model Checking

    Science.gov (United States)

    Tkachuk, Oksana; Dwyer, Matthew B.; Pasareanu, Corina S.

    2003-01-01

    A key problem in model checking open systems is environment modeling (i.e., representing the behavior of the execution context of the system under analysis). Software systems are fundamentally open since their behavior is dependent on patterns of invocation of system components and values defined outside the system but referenced within the system. Whether reasoning about the behavior of whole programs or about program components, an abstract model of the environment can be essential in enabling sufficiently precise yet tractable verification. In this paper, we describe an approach to generating environments of Java program fragments. This approach integrates formally specified assumptions about environment behavior with sound abstractions of environment implementations to form a model of the environment. The approach is implemented in the Bandera Environment Generator (BEG) which we describe along with our experience using BEG to reason about properties of several non-trivial concurrent Java programs.

  4. Graphical modelling software in R - status

    DEFF Research Database (Denmark)

    Dethlefsen, Claus; Højsgaard, Søren; Lauritzen, Steffen L.

    Graphical models in their modern form have been around for nearly a quarter of a century. Various computer programs for inference in graphical models have been developed over that period. Some examples of free software programs are BUGS (Thomas 1994), CoCo (Badsberg 2001), Digram (Klein, Keiding, and Kreiner 1995), MIM (Edwards 2000), and Tetrad (Glymour, Scheines, Spirtes, and Kelley 1987). The gR initiative (Lauritzen 2002) aims at making graphical models available in R (R Development Core Team 2006). A small grant from the Danish Science Foundation supported this initiative. We will summarize the results of the initiative so far. Specifically, we will illustrate some of the R packages for graphical modelling currently on CRAN and discuss their strengths and weaknesses.

  5. Process model for building quality software on internet time ...

    African Journals Online (AJOL)

    The competitive nature of the software construction market and the inherently exhilarating nature of software itself have hinged the success of any software development project on four major pillars: time to market, product quality, innovation and documentation. Unfortunately, however, existing software development models ...

  6. Hubble Diagram Test of Expanding and Static Cosmological Models: The Case for a Slowly Expanding Flat Universe

    Directory of Open Access Journals (Sweden)

    Laszlo A. Marosi

    2013-01-01

    Full Text Available We present a new redshift (RS) versus photon travel time test including 171 supernovae RS data points. We extended the Hubble diagram to a range of z = 0.0141-8.1 in the hope that at high RSs, the fitting of the calculated RS/photon-travel-time diagrams to the observed RS data would, as predicted by different cosmological models, set constraints on alternative cosmological models. The Lambda cold dark matter (ΛCDM) model, the static universe model, and the case for a slowly expanding flat universe (SEU) are considered. We show that on the basis of the Hubble diagram test, the static and the slowly expanding models are favored.
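
    For reference, the ΛCDM side of such a test compares observed redshifts against the standard flat-ΛCDM lookback (photon travel) time; a minimal sketch, with illustrative cosmological parameters, follows.

```python
# Lookback (photon travel) time vs. redshift in flat LambdaCDM -- the standard
# relation a Hubble-diagram test of this kind compares against.
import numpy as np
from scipy.integrate import quad

H0 = 70.0                      # km/s/Mpc (illustrative)
H0_per_gyr = H0 / 978.0        # ~978 converts km/s/Mpc to 1/Gyr
om, ol = 0.3, 0.7              # matter and dark-energy densities

def lookback_time_gyr(z):
    integrand = lambda zp: 1.0 / ((1 + zp) * np.sqrt(om * (1 + zp)**3 + ol))
    val, _ = quad(integrand, 0.0, z)
    return val / H0_per_gyr

for z in (0.0141, 1.0, 8.1):   # endpoints of the RS range used in the paper
    print(z, round(lookback_time_gyr(z), 2))
```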

  7. Supporting customized failure models for distributed software

    Science.gov (United States)

    Hiltunen, Matti A.; Immanuel, Vijaykumar; Schlichting, Richard D.

    1999-09-01

    The cost of employing software fault tolerance techniques in distributed systems is strongly related to the type of failures to be tolerated. For example, in terms of the amount of redundancy required and execution time, tolerating a processor crash is much cheaper than tolerating arbitrary (or Byzantine) failures. This paper describes an approach to constructing configurable services for distributed systems that allows easy customization of the type of failures to tolerate. Using this approach, it is possible to configure custom services across a spectrum of possibilities, from a very efficient but unreliable server group that does not tolerate any failures, to a less efficient but reliable group that tolerates crash, omission, timing, or arbitrary failures. The approach is based on building configurable services as collections of software modules called micro-protocols. Each micro-protocol implements a different semantic property or property variant, and interacts with other micro-protocols using an event-driven model provided by a runtime system. In addition to facilitating the choice of failure model, the approach allows service properties such as message ordering and delivery atomicity to be customized for each application.
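
    A minimal sketch of the event-driven micro-protocol idea follows: handlers bind to runtime events, and choosing a failure model amounts to choosing which micro-protocols to compose. All names and events are hypothetical, not the paper's API.

```python
# Sketch of event-driven micro-protocol composition: each micro-protocol
# registers handlers for runtime events; configuring the service's failure
# model is a matter of which micro-protocols get loaded.
from collections import defaultdict

class Runtime:
    def __init__(self):
        self.handlers = defaultdict(list)
    def bind(self, event, handler):
        self.handlers[event].append(handler)
    def raise_event(self, event, **args):
        for h in self.handlers[event]:
            h(**args)

def crash_detection(rt):
    rt.bind("msg_timeout", lambda sender, **_: print(f"suspect crash of {sender}"))

def omission_tolerance(rt):
    rt.bind("msg_timeout", lambda sender, seq, **_: print(f"retransmit {seq} to {sender}"))

rt = Runtime()
for micro_protocol in (crash_detection, omission_tolerance):  # the chosen failure model
    micro_protocol(rt)
rt.raise_event("msg_timeout", sender="replica-2", seq=17)
```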

  8. A Model of Foam Density Prediction for Expanded Perlite Composites

    Directory of Open Access Journals (Sweden)

    Arifuzzaman Md

    2015-01-01

    Full Text Available Multiple sets of variables associated with expanded perlite particle consolidation in foam manufacturing were analyzed to develop a model for predicting perlite foam density. The consolidation of perlite particles based on the flotation method and compaction involves numerous variables leading to the final perlite foam density. The variables include binder content, compaction ratio, perlite particle size, various perlite particle densities and porosities, and various volumes of perlite at different stages of the process. The developed model was found to be useful not only for prediction of foam density but also for optimization between compaction ratio and binder content to achieve a desired density. Experimental verification was conducted using a range of foam densities (0.15-0.5 g/cm3) produced with a range of compaction ratios (1.5-3.5), a range of sodium silicate contents (0.05-0.35 g/ml in dilution), a range of expanded perlite particle sizes (1-4 mm), and various perlite densities (such as skeletal, material, bulk, and envelope densities). A close agreement between predictions and experimental results was found.
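
    The published model itself is not reproduced in the abstract; as a rough illustration of how such a prediction can work, the sketch below estimates foam density from a simple mass balance over binder uptake and compaction. The function and all values are assumptions, not the paper's model.

```python
# Minimal mass-balance sketch (not the paper's model): estimate foam density
# from perlite bulk density, binder uptake, and compaction ratio.
def foam_density(bulk_density, binder_fraction, compaction_ratio):
    """
    bulk_density      : g/cm^3 of loose expanded perlite
    binder_fraction   : grams of retained binder per gram of perlite
    compaction_ratio  : initial volume / compacted volume
    """
    mass_per_cm3_loose = bulk_density * (1.0 + binder_fraction)
    return mass_per_cm3_loose * compaction_ratio

# e.g. 0.09 g/cm^3 perlite, 30% binder uptake, compaction ratio 2.5
print(round(foam_density(0.09, 0.3, 2.5), 3))  # ~0.29, inside the 0.15-0.5 range
```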

  9. Evaluating Educational Software Authoring Environments Using a Model Based on Software Engineering and Instructional Design Principles.

    Science.gov (United States)

    Collis, Betty A.; Gore, Marilyn

    1987-01-01

    This study suggests a new model for the evaluation of educational software authoring systems and applies this model to a particular authoring system, CSR Trainer 4000. The model used is based on an integrated set of software engineering and instructional design principles. (Author/LRW)

  10. A Model for Joint Software Reviews

    Science.gov (United States)

    1998-10-01

    DSTO-TR-0735. Introduction: Joint software reviews form an important part of the Defence Acquisition Process [MIL-STD-1521B, 1985; MIL-STD-498, 1994]... This section provides an introduction to software reviews by comparing inspections and joint software reviews. Joint software reviews were chosen for... Decision Making (MCDM) system and discusses the conflict between the objectives [Ulkucu, 1989]. There are several classes of system objectives, and conflict

  11. Model requirements for Biobank Software Systems.

    Science.gov (United States)

    Tukacs, Edit; Korotij, Agnes; Maros-Szabo, Zsuzsanna; Molnar, Agnes Marta; Hajdu, Andras; Torok, Zsolt

    2012-01-01

    Biobanks are essential tools in diagnostics and therapeutics research and development related to personalized medicine. Several international recommendations, standards and guidelines exist that discuss the legal, ethical, technological, and management requirements of biobanks. Today's biobanks are much more than just collections of biospecimens. They also store a huge amount of data related to biological samples which can be either clinical data or data coming from biochemical experiments. A well-designed biobank software system also provides the possibility of finding associations between stored elements. Modern research biobanks are able to manage multicenter sample collections while fulfilling all requirements of data protection and security. While developing several biobanks and analyzing the data stored in them, our research group recognized the need for a well-organized, easy-to-check requirements guideline that can be used to develop biobank software systems. International best practices along with relevant ICT standards were integrated into a comprehensive guideline: The Model Requirements for the Management of Biological Repositories (BioReq), which covers the full range of activities related to biobank development. The guideline is freely available on the Internet for the research community. The database is available for free at http://bioreq.astridbio.com/bioreq_v2.0.pdf.

  12. Radiobiological modeling with MarCell software

    Energy Technology Data Exchange (ETDEWEB)

    Hasan, J.S.; Jones, T.D. [Oak Ridge National Lab., TN (United States). Health Sciences Research Div.

    1999-01-01

    A nonlinear system of differential equations that models the bone marrow cellular kinetics associated with radiation injury, molecular repair, and compensatory cell proliferation has been extensively documented. Recently, that model has been implemented as MarCell, a user-friendly MS-DOS computer program that allows users with little knowledge of the original model to evaluate complex radiation exposure scenarios. The software allows modeling with the following radiations: tritium beta, 100 kVp X, 250 kVp X, 22 MV X, {sup 60}Co, {sup 137}Cs, 2 MeV electrons, TRIGA neutrons, D-T neutrons, and 3 blends of mixed-field fission radiations. The possible cell lineages are stem, stroma, and leukemia/lymphoma, and the available species include mouse, rat, dog, sheep, swine, burro, and man. An attractive mathematical feature is that any protracted protocol can be expressed as an equivalent prompt dose for either the source used or for a reference, such as 250 kVp X rays or {sup 60}Co. Output from MarCell includes: risk of 30-day mortality; risk of cancer and leukemia based either on cytopenia or compensatory cell proliferation; cell survival plots as a function of time or dose; and 4-week recovery kinetics following treatment. In this article, the program's applicability and ease of use are demonstrated by evaluating a medical total body irradiation protocol and a nuclear fallout scenario.
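
    The full MarCell equation system is larger, but its injury/repair/proliferation structure can be illustrated with a toy two-state ODE integrated over a protracted exposure; all rate constants below are hypothetical.

```python
# Toy one-compartment version of the injury/repair/compensatory-proliferation
# kinetics (the documented MarCell system is larger); rates are hypothetical.
from scipy.integrate import solve_ivp

def marrow_kinetics(t, y, dose_rate, k_injury, k_repair, k_prolif, capacity):
    healthy, injured = y
    d_healthy = (-k_injury * dose_rate(t) * healthy        # radiation injury
                 + k_repair * injured                      # molecular repair
                 + k_prolif * healthy * (1 - healthy / capacity))  # regrowth
    d_injured = k_injury * dose_rate(t) * healthy - k_repair * injured
    return [d_healthy, d_injured]

dose_rate = lambda t: 0.5 if t < 2.0 else 0.0              # 2-day protracted exposure
sol = solve_ivp(marrow_kinetics, (0, 28), [1.0, 0.0],
                args=(dose_rate, 0.8, 0.3, 0.25, 1.0))
print(sol.y[0, -1])  # surviving healthy fraction after 4-week recovery
```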

  13. Expanding the toolkit for membrane protein modeling in Rosetta.

    Science.gov (United States)

    Koehler Leman, Julia; Mueller, Benjamin K; Gray, Jeffrey J

    2017-03-01

    A range of membrane protein modeling tools has been developed in the past 5-10 years, yet few of these tools are integrated and make use of existing functionality for soluble proteins. To extend existing methods in the Rosetta biomolecular modeling suite for membrane proteins, we recently implemented RosettaMP, a general framework for membrane protein modeling. While RosettaMP facilitates implementation of new methods, addressing real-world biological problems also requires a set of accessory tools that are used to carry out standard modeling tasks. Here, we present six modeling tools, including de novo prediction of single trans-membrane helices, making mutations and refining the structure with different amounts of flexibility, transforming a protein into membrane coordinates and optimizing its embedding, computing a Rosetta energy score, and visualizing the protein in the membrane bilayer. We present these methods with complete protocol captures that allow non-expert modelers to carry out the computations. The presented tools are part of the Rosetta software suite, available at www.rosettacommons.org. Contact: julia.koehler.leman@gmail.com. Supplementary data are available at Bioinformatics online.

  14. Model-driven dependability assessment of software systems

    CERN Document Server

    Bernardi, Simona; Petriu, Dorina C

    2013-01-01

    In this book, the authors present cutting-edge model-driven techniques for modeling and analysis of software dependability. Most of them are based on the use of UML as software specification language. From the software system specification point of view, such techniques exploit the standard extension mechanisms of UML (i.e., UML profiling). UML profiles enable software engineers to add non-functional properties to the software model, in addition to the functional ones. The authors detail the state of the art on UML profile proposals for dependability specification and rigorously describe the t

  15. Modeling of Plasma Irregularities in Expanding Ionospheric Dust Clouds

    Science.gov (United States)

    Fu, H.; Scales, W.; Mahmoudian, A.; Bordikar, M. R.

    2009-12-01

    Natural dust layers occur in the Earth's mesosphere (50-85 km). Plasma irregularities associated with these natural dust layers produce radar echoes. Recently, an ionospheric sounding rocket experiment was performed to investigate the plasma irregularities in upper atmospheric dust layers. The Charged Aerosol Release Experiment (CARE) uses a rocket payload injection of particles into the ionosphere to determine the mechanisms for enhanced radar scatter from plasma irregularities embedded in artificial dusty plasma in space. A 2-D hybrid computational model is described that may be used to study a variety of irregularities in dusty space plasmas which may lead to radar echoes. In this model, the dust and ions are both treated with the Particle-In-Cell method, while the dust charge varies with time based on the standard dust Orbit Motion Limited charging model. A stochastic model is adopted to remove ion particles collected by the dust during the charging process. Electrons are treated with a fluid model that includes dynamics parallel to the magnetic field, solved using Fourier spectral methods with a predictor-corrector time advance. This numerical model will be used to investigate the electrodynamics and several possible plasma irregularity generation mechanisms after the creation of an artificial dust layer. The first is the dust ion-acoustic instability due to the drift of dust relative to the plasma. The instability saturates by trapping some ions. The effects of dust radius and dust drift velocity on plasma irregularities will be analyzed further. Also, a shear-driven instability in expanding dusty clouds is investigated.
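
    The Orbit Motion Limited (OML) charging model mentioned above is standard: the grain charge evolves as dQ/dt = I_e + I_i, with thermal electron and ion currents modified by the grain's surface potential. A minimal sketch with illustrative plasma parameters:

```python
# Standard OML charging of a spherical dust grain: dQ/dt = I_e + I_i.
# All plasma parameters are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

e, me, mi, kB = 1.602e-19, 9.109e-31, 1.673e-27, 1.381e-23
a = 50e-9                       # grain radius (m)
ne = ni = 1e11                  # plasma density (m^-3)
Te = Ti = 1000.0                # temperatures (K)
C = 4 * np.pi * 8.854e-12 * a   # grain capacitance

def currents(q):
    phi = q / C                 # grain surface potential
    area = 4 * np.pi * a**2
    Ie0 = -e * ne * np.sqrt(kB * Te / (2 * np.pi * me)) * area
    Ii0 = e * ni * np.sqrt(kB * Ti / (2 * np.pi * mi)) * area
    if phi < 0:                 # negative grain: repels electrons, attracts ions
        return Ie0 * np.exp(e * phi / (kB * Te)) + Ii0 * (1 - e * phi / (kB * Ti))
    return Ie0 * (1 + e * phi / (kB * Te)) + Ii0 * np.exp(-e * phi / (kB * Ti))

sol = solve_ivp(lambda t, q: currents(q[0]), (0.0, 1.0), [0.0], method="LSODA")
print(sol.y[0, -1] / e)         # equilibrium charge in electron units (negative)
```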

  16. Traceability for Model Driven, Software Product Line Engineering

    NARCIS (Netherlands)

    Anquetil, N.; Grammel, B.; Galvao, I.; Noppen, J.A.R.; Shakil Khan, S.; Arboleda, H.; Rashid, A.; Garcia, A.

    Traceability is an important challenge for software organizations. This is true for traditional software development and even more so in new approaches that introduce more variety of artefacts such as Model Driven development or Software Product Lines. In this paper we look at some aspect of the

  17. COVAMOF : A framework for modeling variability in software product families

    NARCIS (Netherlands)

    Sinnema, M; Deelstra, S; Nijhuis, J; Bosch, J; Nord, RL

    2004-01-01

    A key aspect of variability management in software product families is the explicit representation of the variability. Experiences at several industrial software development companies have shown that a software variability model should do four things: (1) uniformly represent variation points as

  18. An improved COCOMO software cost estimation model | Duke ...

    African Journals Online (AJOL)

    In this paper, we discuss the methodologies adopted previously in software cost estimation using the COnstructive COst MOdels (COCOMOs). From our analysis, COCOMOs produce very high software development efforts, which eventually produce high software development costs. Consequently, we propose its extension, ...

  19. Model-driven specification of software services

    NARCIS (Netherlands)

    Shishkov, Boris; van Sinderen, Marten J.; Tekinerdogan, B.

    2007-01-01

    Adequately aligning business requirements with software functionality, as well as achieving ‘loose coupling’ for service functionalities, are identified challenges relevant to service-oriented software design. Furthering previous related work, we propose in this paper an application design process

  20. The purely functional software deployment model

    NARCIS (Netherlands)

    Dolstra, E.

    2006-01-01

    Software deployment is the set of activities related to getting software components to work on the machines of end users. It includes activities such as installation, upgrading, uninstallation, and so on. Many tools have been developed to support deployment, but they all have serious

  1. Software Engineering with Process Algebra: Modelling Client / Server Architectures

    OpenAIRE

    Diertens, B.

    2009-01-01

    In previous work we described how the process algebra based language PSF can be used in software engineering, using the ToolBus, a coordination architecture also based on process algebra, as implementation model. We also described this software development process more formally by presenting the tools we use in this process in a CASE setting, leading to the PSF-ToolBus software engineering environment. In this article we summarize that work and describe a similar software development process ...

  2. A multi-layered software architecture model for building software solutions in an urbanized information system

    Directory of Open Access Journals (Sweden)

    Sana Guetat

    2013-01-01

    Full Text Available The concept of Information Systems urbanization has been proposed since the late 1990s in order to help organizations build agile information systems. Nevertheless, despite the advantages of this concept, it remains too descriptive and presents many weaknesses. In particular, there is a lack of useful architecture models dedicated to defining software solutions compliant with information systems urbanization principles and rules. Moreover, well-known software architecture models do not provide sufficient resources to address the requirements and constraints of urbanized information systems. In this paper, we draw on the “information city” framework to propose a model of software architecture - called the 5+1 Software Architecture Model - which is compliant with information systems urbanization principles and helps organizations build urbanized software solutions. This framework improves on well-established software architecture models and allows the integration of new architectural paradigms. Furthermore, the proposed model contributes to the implementation of information systems urbanization in several ways. On the one hand, this model devotes a specific layer to application integration and software reuse. On the other hand, it contributes to the information system's agility and scalability due to its conformity with the separation of concerns principle.

  3. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae; Top, Søren

    2008-01-01

    of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical......, communication and constraints, using computational blocks and aggregates for both discrete and continuous behaviour, different interconnection and execution disciplines for event-based and time-based controllers, and so on, to encompass the demands to more functionality, at even lower prices, and with opposite...... of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behaviour, and the transformation of the software system into the S-functions. The general aim of this work is the improvement of multi-disciplinary development of embedded systems with the focus on the relation...

  4. Model-driven and software product line engineering

    CERN Document Server

    Royer, Jean-Claude

    2013-01-01

    Many approaches to creating Software Product Lines have emerged that are based on Model-Driven Engineering. This book introduces both Software Product Lines and Model-Driven Engineering, which have separate success stories in industry, and focuses on the practical combination of them. It describes the challenges and benefits of merging these two software development trends and provides the reader with a novel approach and practical mechanisms to improve software development productivity.The book is aimed at engineers and students who wish to understand and apply software product lines

  5. Models and metrics for software management and engineering

    Science.gov (United States)

    Basili, V. R.

    1988-01-01

    This paper attempts to characterize and present a state of the art view of several quantitative models and metrics of the software life cycle. These models and metrics can be used to aid in managing and engineering software projects. They deal with various aspects of the software process and product, including resources allocation and estimation, changes and errors, size, complexity and reliability. Some indication is given of the extent to which the various models have been used and the success they have achieved.

  6. A software quality model and metrics for risk assessment

    Science.gov (United States)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the model for the basis for a discussion on risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.

  7. dMODELS: A software package for modeling volcanic deformation

    Science.gov (United States)

    Battaglia, Maurizio

    2017-04-01

    dMODELS is a software package that includes the most common source models used to interpret deformation measurements near active volcanic centers. The emphasis is on estimating the parameters of analytical models of deformation by inverting data from the Global Positioning System (GPS), Interferometric Synthetic Aperture Radar (InSAR), tiltmeters and strainmeters. Source models include: (a) pressurized spherical, ellipsoidal and sill-like magma chambers in an elastic, homogeneous, flat half-space; (b) pressurized spherical magma chambers with topography corrections; and (c) the solutions for a dislocation (fracture) in an elastic, homogeneous, flat half-space. All of the equations have been extended to include deformation and strain within the Earth's crust (as opposed to only at the Earth's surface) and verified against finite element models. Although actual volcanic sources are not embedded cavities of simple shape, we assume that these models may reproduce the stress field created by the actual magma intrusion or hydrothermal fluid injection. The dMODELS software employs a nonlinear inversion algorithm to determine the best-fit parameters for the deformation source by searching for the minimum of the cost function χ²ν (chi-square per degree of freedom). The nonlinear inversion algorithm is a combination of local optimization (interior-point method) and random search. This approach is more efficient for hyper-parameter optimization than trials on a grid. The software has been developed using MATLAB, but compiled versions that can be run using the free MATLAB Compiler Runtime (MCR) module are available for Windows 64-bit operating systems. The MATLAB scripts and compiled files are open source and intended for teaching and research. The software package includes both functions for forward modeling and scripts for data inversion. A software demonstration will be available during the meeting. You are welcome to contact the author at mbattaglia@usgs.gov for further information.
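
    The simplest of the listed sources, the pressurized sphere (Mogi) model, has a well-known closed-form surface solution. A sketch follows, assuming a Poisson ratio of 0.25 and illustrative source parameters; this mirrors the textbook formula, not dMODELS's own MATLAB code.

```python
# Surface displacement of the classic Mogi (pressurized sphere) source:
# point-like source of radius a at depth d in an elastic half-space.
import numpy as np

def mogi(r, d, a, dP, mu):
    """Radial distance r, source depth d, cavity radius a, pressure change dP,
    shear modulus mu -> (u_r, u_z) surface displacements in metres."""
    k = 3.0 * a**3 * dP / (4.0 * mu)        # prefactor for Poisson ratio 0.25
    R3 = (d**2 + r**2) ** 1.5
    return k * r / R3, k * d / R3

r = np.linspace(0.0, 10e3, 5)                       # profile out to 10 km
ur, uz = mogi(r, d=3e3, a=500.0, dP=10e6, mu=30e9)  # illustrative values
print(np.round(uz * 100, 2))                        # uplift in cm, peaked at r=0
```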

  8. Model Averaging Software for Dichotomous Dose Response Risk Estimation

    Directory of Open Access Journals (Sweden)

    Matthew W. Wheeler

    2008-02-01

    Full Text Available Model averaging has been shown to be a useful method for incorporating model uncertainty in quantitative risk estimation. In certain circumstances this technique is computationally complex, requiring sophisticated software to carry out the computation. We introduce software that implements model averaging for risk assessment based upon dichotomous dose-response data. This software, which we call Model Averaging for Dichotomous Response Benchmark Dose (MADr-BMD), fits the quantal response models which are also used in the US Environmental Protection Agency benchmark dose software suite, and generates a model-averaged dose-response model to produce benchmark dose and benchmark dose lower bound estimates. The software fulfills a need for risk assessors, allowing them to go beyond a single model in risk assessments based on quantal data by focusing on a set of models that describes the experimental data.
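
    In the spirit of the approach (though not the MADr-BMD code itself), the sketch below fits two quantal models by maximum likelihood, forms Akaike weights, and averages the predicted response; the dose-response data are simulated for illustration.

```python
# AIC-weight model averaging over two quantal dose-response models.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

dose = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
n    = np.array([50, 50, 50, 50, 50])        # animals per dose group
k    = np.array([2, 5, 12, 24, 40])          # responders (simulated)

models = {
    "logistic": lambda d, a, b: 1.0 / (1.0 + np.exp(-(a + b * d))),
    "probit":   lambda d, a, b: norm.cdf(a + b * d),
}

def neg_loglik(theta, f):
    p = np.clip(f(dose, *theta), 1e-9, 1 - 1e-9)
    return -np.sum(k * np.log(p) + (n - k) * np.log(1 - p))

fits, aics = {}, {}
for name, f in models.items():
    res = minimize(neg_loglik, x0=[-2.0, 0.5], args=(f,), method="Nelder-Mead")
    fits[name], aics[name] = res.x, 2 * res.fun + 2 * len(res.x)

# Akaike weights, then the model-averaged response at a given dose.
w = {m: np.exp(-0.5 * (aics[m] - min(aics.values()))) for m in models}
z = sum(w.values())
avg = lambda d: sum(w[m] / z * models[m](d, *fits[m]) for m in models)
print(round(avg(3.0), 3))
```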

  9. Whole earth modeling: developing and disseminating scientific software for computational geophysics.

    Science.gov (United States)

    Kellogg, L. H.

    2016-12-01

    Historically, a great deal of specialized scientific software for modeling and data analysis has been developed by individual researchers or small groups of scientists working on their own specific research problems. As the magnitude of available data and computer power has increased, so has the complexity of scientific problems addressed by computational methods, creating both a need to sustain existing scientific software, and expand its development to take advantage of new algorithms, new software approaches, and new computational hardware. To that end, communities like the Computational Infrastructure for Geodynamics (CIG) have been established to support the use of best practices in scientific computing for solid earth geophysics research and teaching. Working as a scientific community enables computational geophysicists to take advantage of technological developments, improve the accuracy and performance of software, build on prior software development, and collaborate more readily. The CIG community, and others, have adopted an open-source development model, in which code is developed and disseminated by the community in an open fashion, using version control and software repositories like Git. One emerging issue is how to adequately identify and credit the intellectual contributions involved in creating open source scientific software. The traditional method of disseminating scientific ideas, peer reviewed publication, was not designed for review or crediting scientific software, although emerging publication strategies such software journals are attempting to address the need. We are piloting an integrated approach in which authors are identified and credited as scientific software is developed and run. Successful software citation requires integration with the scholarly publication and indexing mechanisms as well, to assign credit, ensure discoverability, and provide provenance for software.

  10. Extracting software static defect models using data mining

    Directory of Open Access Journals (Sweden)

    Ahmed H. Yousef

    2015-03-01

    Full Text Available Large software projects are subject to quality risks of having defective modules that will cause failures during the software execution. Several software repositories contain source code of large projects that are composed of many modules. These software repositories include data for the software metrics of these modules and the defective state of each module. In this paper, a data mining approach is used to show the attributes that predict the defective state of software modules. A software solution architecture is proposed to convert the extracted knowledge into data mining models that can be integrated with the current software project metrics and bug data in order to enhance the prediction. The results show better prediction capabilities when all the algorithms are combined using weighted votes. When only one individual algorithm is used, the Naïve Bayes algorithm has the best results, followed by the Neural Network and the Decision Tree algorithms.
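
    The weighted-vote combination described above can be sketched with scikit-learn's VotingClassifier; the synthetic data stand in for module metrics and defect labels, and the weights (favoring Naïve Bayes, per the reported ranking) are illustrative.

```python
# Weighted-vote ensemble over the three algorithm families named in the paper.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

# Stand-in for module metrics (features) and defect labels (target).
X, y = make_classification(n_samples=600, n_features=20, random_state=0)

ensemble = VotingClassifier(
    estimators=[("nb", GaussianNB()),
                ("nn", MLPClassifier(max_iter=1000, random_state=0)),
                ("dt", DecisionTreeClassifier(random_state=0))],
    voting="soft",
    weights=[3, 2, 1],   # Naive Bayes weighted highest, per the reported ranking
)
print(cross_val_score(ensemble, X, y, cv=5).mean())
```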

  11. An algebraic approach to modeling in software engineering

    Energy Technology Data Exchange (ETDEWEB)

    Loegel, G.J. [Superconducting Super Collider Lab., Dallas, TX (United States)]|[Michigan Univ., Ann Arbor, MI (United States); Ravishankar, C.V. [Michigan Univ., Ann Arbor, MI (United States)

    1993-09-01

    Our work couples the formalism of universal algebras with the engineering techniques of mathematical modeling to develop a new approach to the software engineering process. Our purpose in using this combination is twofold. First, abstract data types and their specification using universal algebras can be considered a common point between the practical requirements of software engineering and the formal specification of software systems. Second, mathematical modeling principles provide us with a means for effectively analyzing real-world systems. We first use modeling techniques to analyze a system and then represent the analysis using universal algebras. The rest of the software engineering process exploits properties of universal algebras that preserve the structure of our original model. This paper describes our software engineering process and our experience using it on both research and commercial systems. We need a new approach because current software engineering practices often deliver software that is difficult to develop and maintain. Formal software engineering approaches use universal algebras to describe "computer science" objects like abstract data types, but in practice software errors are often caused because "real-world" objects are improperly modeled. There is a large semantic gap between the customer's objects and abstract data types. In contrast, mathematical modeling uses engineering techniques to construct valid models for real-world systems, but these models are often implemented in an ad hoc manner. A combination of the best features of both approaches would enable software engineering to formally specify and develop software systems that better model real systems. Software engineering, like mathematical modeling, should concern itself first and foremost with understanding a real system and its behavior under given circumstances, and then with expressing this knowledge in an executable form.

  12. A Work Psychological Model that Works: Expanding the Job Demands-Resources Model

    NARCIS (Netherlands)

    Xanthopoulou, D.

    2007-01-01

    The main purpose of the current thesis was to test and expand the recently developed Job Demands-Resources (JD-R) model. The advantage of this model is that it recognizes the uniqueness of each work environment, which has its own specific job demands and job resources. According to the JD-R model,

  13. Beyond Reactive Planning: Self Adaptive Software and Self Modeling Software in Predictive Deliberation Management

    National Research Council Canada - National Science Library

    Lenahan, Jack; Nash, Michael P; Charles, Phil

    2008-01-01

    .... We present the following hypothesis: predictive deliberation management using self-adapting and self-modeling software will be required to provide mission planning adjustments after the start of a mission...

  14. [Numerical modeling of shape memory alloy vascular stent's self-expandable progress and "optimized grid" of stent].

    Science.gov (United States)

    Xu, Qiang; Liu, Yulan; Wang, Biao; He, Jin

    2008-10-01

    The vascular stent is an important medical device for cardiovascular disease. Its key deformation process is the self-expansion of the stent within the vessel; this deformation behaviour corresponds to two mechanical quantities, deformation and stress. This paper is devoted to the research and development of a vascular stent with proprietary intellectual property rights. The design of a NiTinol self-expandable stent is optimized by means of finite element software: ANSYS is used to build the finite element simulation model of the vascular stent, with NiTinol shape memory alloy as the material. To account for the factors that affect the structure of the stent, such as the shape of the grid, the self-expanding process of the Nitinol stent is simulated on computer. By comparing two kinds of stents with similar grid structures, we present a new concept of the "Optimized Grid" of a stent.

  15. Software Quality Assessment Tool Based on Meta-Models

    OpenAIRE

    Doneva Rositsa; Gaftandzhieva Silvia; Doneva Zhelyana; Staevsky Nevena

    2015-01-01

    In the software industry it is indisputably essential to control the quality of produced software systems in terms of capabilities for easy maintenance, reuse, portability and others, in order to ensure reliability in software development. But it is also clear that it is very difficult to achieve such control through ‘manual’ management of quality. There are a number of approaches for software quality assurance based typically on software quality models (e.g. ISO 9126, McCall’s, Boehm’s...

  16. Linear mixed models a practical guide using statistical software

    CERN Document Server

    West, Brady T; Galecki, Andrzej T

    2006-01-01

    Simplifying the often confusing array of software programs for fitting linear mixed models (LMMs), Linear Mixed Models: A Practical Guide Using Statistical Software provides a basic introduction to primary concepts, notation, software implementation, model interpretation, and visualization of clustered and longitudinal data. This easy-to-navigate reference details the use of procedures for fitting LMMs in five popular statistical software packages: SAS, SPSS, Stata, R/S-plus, and HLM. The authors introduce basic theoretical concepts, present a heuristic approach to fitting LMMs based on bo
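
    A minimal random-intercept example of the kind of model the book covers, here using Python's statsmodels rather than the five packages the book treats; the data are simulated.

```python
# Linear mixed model with a random intercept per cluster.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
groups = np.repeat(np.arange(30), 10)            # 30 clusters x 10 observations
u = rng.normal(scale=0.8, size=30)[groups]       # cluster random intercepts
x = rng.normal(size=300)
y = 1.0 + 0.5 * x + u + rng.normal(size=300)
df = pd.DataFrame({"y": y, "x": x, "g": groups})

lmm = smf.mixedlm("y ~ x", df, groups=df["g"]).fit()
print(lmm.summary())   # fixed effect near 0.5, group variance near 0.64
```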

  17. Traditions of the Sun, One Model for Expanding Audience Access

    Science.gov (United States)

    Hawkins, I.; Paglierani, R.

    2006-12-01

    The Internet is a powerful tool with which to expand audience access, bringing students, teachers and the public to places and resources they might not otherwise visit or make use of. We will present Traditions of the Sun, an experiential Web site that invites exploration of the world's ancient observatories, with special emphasis on Chaco Culture National Historic Park in the Four Corners region of the US and several sites in the Yucatan Peninsula in Mexico. Traditions of the Sun includes resources in English and Spanish along with a unique trilingual on-line book, "Traditions of the Sun, A Photographic Journal," containing explanatory text in Yucatec Maya as well. Traditions of the Sun offers rich opportunities for virtual visits to ancient sites used for solar observing while learning about current NASA research on the Sun and indigenous solar practices within a larger historical and cultural context. The site contains hundreds of photographs, historic images and rich multimedia to help tell the story of the Sun-Earth Connection. Visitors to the site can zoom in on the great Mayan cities of Chichen Itza, Uxmal, Dzibilchaltun, and Mayapan to learn about Mayan astronomy, history, culture, and science. They can also visit Chaco Canyon to watch sunrise over Pueblo Bonito on the summer solstice, take a virtual reality tour of the great kiva at Casa Rinconada or see panoramic vistas from Fajada Butte, an area which, for preservation purposes, is closed to the public. Traditions of the Sun provides one model of how exploration and discovery can come to life for both formal and informal audiences via the Internet. Traditions of the Sun is a collaborative project between NASA's Sun-Earth Connection Education Forum, the National Park Service, Instituto Nacional de Antropología e Historia, Universidad Nacional Autónoma de México, and Ideum.

  18. A Case Study in Model Checking Software Systems.

    Science.gov (United States)

    1996-04-01

    Model checking is a proven, successful technology for verifying hardware. It works, however, only on finite state machines, and most software systems have infinitely many states. Our approach to applying model checking to software hinges on identifying appropriate abstractions that exploit the

  19. Development of an Environment for Software Reliability Model Selection

    Science.gov (United States)

    1992-09-01

    now is directed to other related problems such as tools for model selection, multiversion programming, and software fault tolerance modeling... Hardware can be repaired by spare modules, which is not the case for software. Preventive maintenance is very important

  20. Software Modeling Studies. Volume 1. Summary of Technical Progress

    Science.gov (United States)

    1978-01-01

    of software modeling problems, and in addition to develop models that truly reflect the actual software development process, and thus provide more...

  1. Validation and Verification of LADEE Models and Software

    Science.gov (United States)

    Gundy-Burlet, Karen

    2013-01-01

    The Lunar Atmosphere Dust Environment Explorer (LADEE) mission will orbit the moon in order to measure the density, composition and time variability of the lunar dust environment. The ground-side and onboard flight software for the mission is being developed using a Model-Based Software methodology. In this technique, models of the spacecraft and flight software are developed in a graphical dynamics modeling package. Flight Software requirements are prototyped and refined using the simulated models. After the model is shown to work as desired in this simulation framework, C-code software is automatically generated from the models. The generated software is then tested in real time Processor-in-the-Loop and Hardware-in-the-Loop test beds. Travelling Road Show test beds were used for early integration tests with payloads and other subsystems. Traditional techniques for verifying computational sciences models are used to characterize the spacecraft simulation. A lightweight set of formal methods analysis, static analysis, formal inspection and code coverage analyses are utilized to further reduce defects in the onboard flight software artifacts. These techniques are applied early and often in the development process, iteratively increasing the capabilities of the software and the fidelity of the vehicle models and test beds.

  2. Generating Protocol Software from CPN Models Annotated with Pragmatics

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars M.; Kindler, Ekkart

    2013-01-01

    and verify protocol software, but limited work exists on using CPN models of protocols as a basis for automated code generation. The contribution of this paper is a method for generating protocol software from a class of CPN models annotated with code generation pragmatics. Our code generation method...

  3. Presenting an Evaluation Model for the Cancer Registry Software.

    Science.gov (United States)

    Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh

    2017-12-01

    As cancer incidence is increasing, the cancer registry is of great importance as the core of cancer control programs, and much different software has been designed for this purpose. Therefore, establishing a comprehensive evaluation model is essential to evaluate and compare a wide range of such software. In this study, the criteria for cancer registry software were determined by studying the documentation and two functional software products in this field. The evaluation tool was a checklist, and in order to validate the model, this checklist was presented to experts in the form of a questionnaire. To analyze the results of validation, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, when the model was approved, the final version of the evaluation model for cancer registry software was presented. The evaluation model of this study contains the tool and method of evaluation. The evaluation tool is a checklist including the general and specific criteria of cancer registry software along with their sub-criteria. The evaluation method of this study was a criteria-based evaluation method, chosen based on the findings. The model of this study encompasses various dimensions of cancer registry software and a proper method for evaluating it. The strong point of this evaluation model is the separation between general criteria and specific ones, while aiming for comprehensiveness of the criteria. Since this model has been validated, it can be used as a standard to evaluate cancer registry software.

  4. Maximum Entropy Discrimination Poisson Regression for Software Reliability Modeling.

    Science.gov (United States)

    Chatzis, Sotirios P; Andreou, Andreas S

    2015-11-01

    Reliably predicting software defects is one of the most significant tasks in software engineering. Two of the major components of modern software reliability modeling approaches are: 1) extraction of salient features for software system representation, based on appropriately designed software metrics, and 2) development of intricate regression models for count data, to allow effective software reliability data modeling and prediction. Surprisingly, research in the latter frontier of count data regression modeling has been rather limited. More specifically, a lack of simple and efficient algorithms for posterior computation has made the Bayesian approaches appear unattractive, and thus underdeveloped in the context of software reliability modeling. In this paper, we try to address these issues by introducing a novel Bayesian regression model for count data, based on the concept of max-margin data modeling, effected in the context of a fully Bayesian model treatment with simple and efficient posterior distribution updates. Our novel approach yields a more discriminative learning technique, making more effective use of our training data during model inference. In addition, it allows better handling of uncertainty in the modeled data, which can be a significant problem when the training data are limited. We derive elegant inference algorithms for our model under the mean-field paradigm and exhibit its effectiveness using the publicly available benchmark data sets.
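
    The paper's max-margin Bayesian model is not available as a stock library call, but the count-regression setting it improves on can be sketched with a plain Poisson GLM over software metrics; the data below are simulated.

```python
# Baseline (non-Bayesian) Poisson regression for defect counts -- the kind of
# count-data model the paper's max-margin Bayesian approach builds upon.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 400
X = rng.normal(size=(n, 3))                 # e.g. size, complexity, churn metrics
beta = np.array([0.4, 0.3, -0.2])
y = rng.poisson(np.exp(0.2 + X @ beta))     # defect counts per module

glm = sm.GLM(y, sm.add_constant(X), family=sm.families.Poisson()).fit()
print(glm.params)                           # recovers the simulated coefficients
```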

  5. Software for Mathematical Modeling of Plastic Deformation in FCC Metals

    Science.gov (United States)

    Petelin, A. E.; Eliseev, A. S.

    2017-08-01

    The necessity of software support for studying plastic deformation in FCC metals by mathematical modeling methods is investigated. This article describes the implementation features and the possibilities of the software Dislocation Dynamics of Crystallographic Slip (DDCS). The software has an advanced user interface and is designed for users without extensive experience in IT. Parameter values of the mathematical model, obtained from field experiments and accumulated in a special database, are used in DDCS to carry out computational experiments. Moreover, the software is capable of accumulating the bibliographic information used in research.

  6. Expanding the Four Resources Model: Reading Visual and Multi-Modal Texts

    Science.gov (United States)

    Serafini, Frank

    2012-01-01

    Freebody and Luke proffered an expanded conceptualization of the resources readers utilize when reading and the roles readers adopt during the act of reading. The four resources model, and its associated four roles of the reader, expanded the definition of reading from a simple model of decoding printed texts to a model of constructing meaning and…

  7. Development and application of new quality model for software projects.

    Science.gov (United States)

    Karnavel, K; Dillibabu, R

    2014-01-01

    The IT industry employs a number of models to identify defects in the construction of software projects. In this paper, we present COQUALMO and its limitations, and aim to increase quality without increasing cost and time. The computation time, cost, and effort needed to predict residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows a significant improvement in the quality of the software projects.

  8. A bridge role metric model for nodes in software networks.

    Directory of Open Access Journals (Sweden)

    Bo Li

    Full Text Available A bridge role metric model is put forward in this paper. Compared with previous metric models, our treatment of a large-scale object-oriented software system as a complex network is inherently more realistic. To acquire nodes and links in an undirected network, we present a new model that captures the crucial connectivity of a module or hub, rather than only centrality as in previous metric models. Two previous metric models are described for comparison. In addition, the relationship between the Bre results and node degrees is well fitted by a power law. The model represents many realistic characteristics of actual software structures, and a hydropower simulation system is taken as an example. This paper makes additional contributions to an accurate understanding of the module design of software systems and is expected to be beneficial to software engineering practice.
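
    The Bre metric itself is not defined in the abstract; as a stand-in, the sketch below builds a scale-free graph (a common proxy for software dependency networks), computes a bridging-type measure (betweenness centrality), and checks the power-law relationship against node degree.

```python
# Bridging-type metric vs. degree on a software-like undirected network.
import networkx as nx
import numpy as np

g = nx.barabasi_albert_graph(500, 2, seed=0)   # scale-free, software-like topology
bet = nx.betweenness_centrality(g)
deg = dict(g.degree())

# Log-log slope of metric vs. degree ~ power-law exponent (as reported for Bre).
d = np.array([deg[v] for v in g if bet[v] > 0])
b = np.array([bet[v] for v in g if bet[v] > 0])
slope, intercept = np.polyfit(np.log(d), np.log(b), 1)
print(round(slope, 2))
```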

  9. MATHEMATICAL MODEL FOR SOFTWARE USABILITY AUTOMATED EVALUATION AND ASSURANCE

    Directory of Open Access Journals (Sweden)

    І. Гученко

    2011-04-01

    Full Text Available The subject of the research is software usability, and the aim is the construction of a mathematical model for estimating and assuring a specified level of usability. The research uses the methodology of structural analysis, methods of multicriteria optimization and decision-making theory, the convolution method, and scientific methods of analysis and analogy. The result of the work is a model for automated evaluation and assurance of software usability that allows not only estimating the current level of usability during every iteration of agile development but also managing the usability of the software products being created. The results can be used to build automated support systems for managing software usability.

  10. Software engineering with process algebra: Modelling client/server architectures

    NARCIS (Netherlands)

    Diertens, B.

    2009-01-01

    In previous work we described how the process algebra based language PSF can be used in software engineering, using the ToolBus, a coordination architecture also based on process algebra, as implementation model. We also described this software development process more formally by presenting the

  11. Aligning the economic modeling of software reuse with reuse practices

    NARCIS (Netherlands)

    Postmus, D.; Meijler

    In contrast to current practices where software reuse is applied recursively and reusable assets are tailored through parameterization or specialization, existing reuse economic models assume that (i) the cost of reusing a software asset depends on its size and (ii) reusable assets are developed from

  12. Software process assessment using multiple process assessment models

    OpenAIRE

    Peldžius, Stasys

    2014-01-01

    Many software companies face such problems as projects being behind schedule, exceeding the budget, customer dissatisfaction with product quality. Most of the problems arise due to immature software process of the company. The most popular process assessment models worldwide are ISO/IEC 15504 and CMMI. Companies seeking wider official recognition choose between these two models. Companies face the problem that different customers require process assessment according to different models....

  13. A Study On Traditional And Evolutionary Software Development Models

    Directory of Open Access Journals (Sweden)

    Kamran Rasheed

    2017-07-01

    Full Text Available Today, computing technologies are becoming the backbone of organizations and helpful for individual work; to make a computing device useful, we need to add software. A set of instructions, or a computer program, is known as software. Software is developed through traditional or newer, evolutionary models. Software development is becoming a key and successful business nowadays; without software, all hardware is useless. The collective steps performed in development are known as the software development life cycle (SDLC). There are adaptive and predictive models for developing software. Predictive means fixed in advance, like the Waterfall, Spiral, Prototype, and V-shaped models, while adaptive models include agile Scrum. The methodologies of both kinds have their own procedures and steps. Predictive models are static and adaptive models are dynamic: changes cannot be made to a predictive model, while adaptive models have the capability of changing. The purpose of this study is to become familiar with all of these models and to discuss their uses and development steps. This discussion will be helpful in deciding which model to use in which circumstances and what development steps each model includes.

  14. Transformation of UML Behavioral Diagrams to Support Software Model Checking

    Directory of Open Access Journals (Sweden)

    Luciana Brasil Rebelo dos Santos

    2014-04-01

    Full Text Available Unified Modeling Language (UML) is currently accepted as the standard for modeling (object-oriented) software, and its use is increasing in the aerospace industry. Verification and validation of complex software developed according to UML is not trivial due to the complexity of the software itself and the several different UML models/diagrams that can be used to model behavior and structure of the software. This paper presents an approach to transform up to three different UML behavioral diagrams (sequence, behavioral state machines, and activity) into a single transition system to support model checking of software developed in accordance with UML. In our approach, properties are formalized based on use case descriptions. The transformation is done for the NuSMV model checker, but we see the possibility of using other model checkers, such as SPIN. The main contribution of our work is the transformation of a non-formal language (UML) to a formal language (the language of the NuSMV model checker), towards a greater adoption in practice of formal methods in software development.
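
    To give a flavor of the target formalism (not the paper's actual transformation tool), the sketch below emits a NuSMV module from a transition system held as a plain Python dict; the states and the LTL property are invented.

```python
# Toy sketch: print a NuSMV model for a transition system kept in a dict.
# The UML-to-transition-system step of the paper is not reproduced here.
transitions = {           # state -> list of successor states
    "idle":    ["sending"],
    "sending": ["sending", "done"],
    "done":    ["done"],
}
initial = "idle"
prop = "F state = done"   # LTL: eventually the 'done' state is reached

lines = ["MODULE main", "VAR",
         f"  state : {{{', '.join(transitions)}}};",
         "ASSIGN", f"  init(state) := {initial};",
         "  next(state) := case"]
for s, succs in transitions.items():
    target = succs[0] if len(succs) == 1 else "{" + ", ".join(succs) + "}"
    lines.append(f"    state = {s} : {target};")
lines += ["  esac;", f"LTLSPEC {prop}"]
print("\n".join(lines))
```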

  15. Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model

    Energy Technology Data Exchange (ETDEWEB)

    J. J. Jacobson; D. E. Shropshire; W. B. West

    2005-11-01

    The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including costs estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a “living document” that will be modified over the course of the execution of this work.

  16. A Reference Model for Mobile Social Software for Learning

    NARCIS (Netherlands)

    De Jong, Tim; Specht, Marcus; Koper, Rob

    2007-01-01

    De Jong, T., Specht, M., & Koper, R. (2008). A reference model for mobile social software for learning. International Journal of Continuing Engineering Education and Life-Long Learning, 18(1), 118-138.

  17. Dependability modeling and assessment in UML-based software development.

    Science.gov (United States)

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results.

  18. Model-Driven Software Evolution : A Research Agenda

    NARCIS (Netherlands)

    Van Deursen, A.; Visser, E.; Warmer, J.

    2007-01-01

    Software systems need to evolve, and systems built using model-driven approaches are no exception. What complicates model-driven engineering is that it requires multiple dimensions of evolution. In regular evolution, the modeling language is used to make the changes. In meta-model evolution, changes

  19. Modeling software with finite state machines a practical approach

    CERN Document Server

    Wagner, Ferdinand; Wagner, Thomas; Wolstenholme, Peter

    2006-01-01

    Modeling Software with Finite State Machines: A Practical Approach explains how to apply finite state machines to software development. It provides a critical analysis of using finite state machines as a foundation for executable specifications to reduce software development effort and improve quality. This book discusses the design of a state machine and of a system of state machines. It also presents a detailed analysis of development issues relating to behavior modeling with design examples and design rules for using finite state machines. This volume describes a coherent and well-tested fr
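
    As a minimal sketch of the book's subject - software behaviour expressed as a finite state machine - the following table-driven implementation uses the classic turnstile example; the class and event names are our own, not the book's.

```python
# A table-driven finite state machine: (state, event) -> next state.
class StateMachine:
    def __init__(self, initial, table):
        self.state = initial
        self.table = table

    def handle(self, event):
        key = (self.state, event)
        if key not in self.table:
            raise ValueError(f"event {event!r} not allowed in state {self.state!r}")
        self.state = self.table[key]
        return self.state

turnstile = StateMachine("locked", {
    ("locked",   "coin"): "unlocked",
    ("locked",   "push"): "locked",
    ("unlocked", "push"): "locked",
    ("unlocked", "coin"): "unlocked",
})
for e in ["coin", "push", "push"]:
    print(e, "->", turnstile.handle(e))
```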

  20. Model-based engineering for medical-device software.

    Science.gov (United States)

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. By using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for (i) fast and efficient construction of executable device prototypes, (ii) creation of a standard, reusable baseline software architecture for a particular device family, (iii) formal verification of the design against safety requirements, and (iv) creation of a safety framework that reduces verification costs for future versions of the device software.

  1. A New Software Quality Model for Evaluating COTS Components

    OpenAIRE

    Adnan Rawashdeh; Bassem Matalkah

    2006-01-01

    Studies show that COTS-based (commercial off-the-shelf) systems built recently exceed 40% of the total developed software systems. Therefore, a model that ensures the quality characteristics of such systems becomes a necessity. Among the most critical processes in COTS-based systems are the evaluation and selection of the COTS components. There are several existing quality models used to evaluate software systems in general; however, none of them is dedicated to COTS-based s...

  2. Object Oriented Modeling : A method for combining model and software development

    NARCIS (Netherlands)

    Van Lelyveld, W.

    2010-01-01

    When requirements for a new model cannot be met by available modeling software, new software can be developed for a specific model. Methods for the development of both model and software exist, but a method for combined development has not been found. A compatible way of thinking is required to

  3. Integrating Design Decision Management with Model-based Software Development

    DEFF Research Database (Denmark)

    Könemann, Patrick

    Design decisions are continuously made during the development of software systems and are important artifacts for design documentation. Dedicated decision management systems are often used to capture such design knowledge. Most such systems are, however, separated from the design artifacts...... of the system. In model-based software development, where design models are used to develop a software system, outcomes of many design decisions have big impact on design models. The realization of design decisions is often manual and tedious work on design models. Moreover, keeping design models consistent...... to specify realizations of decisions in design models. This way, recurring realization work of design decisions can be automated. Since the concepts are generic and not bound to design decisions, other recurring work on models can be automated as well, for instance, design patterns and refactorings...

  4. Composable Framework Support for Software-FMEA Through Model Execution

    Science.gov (United States)

    Kocsis, Imre; Patricia, Andras; Brancati, Francesco; Rossi, Francesco

    2016-08-01

    Performing Failure Modes and Effect Analysis (FMEA) during software architecture design is becoming a basic requirement in an increasing number of domains; however, due to the lack of standardized early design phase model execution, classic SW-FMEA approaches carry significant risks and are human effort-intensive even in processes that use Model-Driven Engineering. Recently, modelling languages with standardized executable semantics have emerged. Building on earlier results, this paper describes framework support for generating executable error propagation models from such models during software architecture design. The approach carries the promise of increased precision, decreased risk and more automated execution for SW-FMEA during dependability-critical system development.

  5. Suitability of Modern Software Development Methodologies for Model Driven Development

    Directory of Open Access Journals (Sweden)

    Ruben Picek

    2009-12-01

    Full Text Available As an answer to today's growing challenges in the software industry, a wide spectrum of new approaches to software development has emerged. One prominent direction is the currently most promising software development paradigm called Model Driven Development (MDD). Despite a lot of skepticism and problems, the MDD paradigm is being used and improved to realize its many inherent potential benefits. A methodological approach to software development requires some kind of development process. Modern methodologies can be classified into two main categories: formal or heavyweight, and agile or lightweight. But when it comes to MDD and a development process for MDD, currently known methodologies are very poor or, better said, offer no explanation of an MDD process. As the result of this research, the author examines in this paper the possibilities of using existing modern software methodologies in the context of the MDD paradigm.

  6. Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project

    Science.gov (United States)

    Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo

    2017-04-01

    The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in the 2.0 version (Endrizzi et al, 2014), which was released as a Free Software Open-source project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. Such weakness hindered its scientific potential and its use both as a standalone package and, more importantly, in an integrated way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes as a starting point the 2.0 version, scientifically tested and published. This version, together with several test cases based on recent published or available GEOtop applications (Cordano and Rigon, 2013, WRR; Kollet et al., 2016, WRR), provides the baseline code and a certain number of referenced results as benchmark. Comparison and scientific validation can then be performed for each software re-engineering activity performed on the package. To keep track of every single change, the package is published on its own github repository geotopmodel.github.io/geotop/ under the GPL v3.0 license. A Continuous Integration mechanism by means of Travis-CI has been enabled on the github repository on master and main development branches. The usage of CMake configuration tool

  7. Aspect-Oriented Model-Driven Software Product Line Engineering

    Science.gov (United States)

    Groher, Iris; Voelter, Markus

    Software product line engineering aims to reduce development time, effort, cost, and complexity by taking advantage of the commonality within a portfolio of similar products. The effectiveness of a software product line approach directly depends on how well feature variability within the portfolio is implemented and managed throughout the development lifecycle, from early analysis through maintenance and evolution. This article presents an approach that facilitates variability implementation, management, and tracing by integrating model-driven and aspect-oriented software development. Features are separated in models and composed using aspect-oriented composition techniques at the model level. Model transformations support the transition from problem-space to solution-space models. Aspect-oriented techniques enable the explicit expression and modularization of variability at the model, template, and code levels. The presented concepts are illustrated with a case study of a home automation system.

  8. Use and application of MADYMO 5.3 foam material model for expanded polypropylene foam

    NARCIS (Netherlands)

    Kant, A.R.; Suffis, B.; Lüsebrink, H.

    1998-01-01

    The dynamic material characteristics of expanded polypropylene are discussed. The in-depth studies, carried out by JSP International, in cooperation with TNO, are used to validate the MADYMO foam material model. The dynamic compression of expanded polypropylene follows a highly non-linear

  9. Business Model Exploration for Software Defined Networks

    NARCIS (Netherlands)

    Xu, Yudi; Jansen, Slinger; España, Sergio; Zhang, Dong; Gao, Xuesong

    2017-01-01

    Business modeling is becoming a foundational process in the information technology industry. Many ICT companies are constructing their business models to stay competitive on the cutting edge of the technology world. However, when it comes to new technologies or emerging markets, it remains difficult

  10. Model-based testing for software safety

    NARCIS (Netherlands)

    Gurbuz, Havva Gulay; Tekinerdogan, Bedir

    2017-01-01

    Testing safety-critical systems is crucial since a failure or malfunction may result in death or serious injuries to people, equipment, or environment. An important challenge in testing is the derivation of test cases that can identify the potential faults. Model-based testing adopts models of a

  11. The STAMP Software for State Space Models

    Directory of Open Access Journals (Sweden)

    Roy Mendelssohn

    2011-05-01

    Full Text Available This paper reviews the use of STAMP (Structural Time Series Analyser, Modeler and Predictor) for modeling time series data using state-space methods with unobserved components. STAMP is a commercial, GUI-based program that runs on Windows, Linux and Macintosh computers as part of the larger OxMetrics System. STAMP can estimate a wide variety of both univariate and multivariate state-space models, provides a wide array of diagnostics, and has a batch mode capability. The use of STAMP is illustrated for the Nile river data which is analyzed throughout this issue, as well as by modeling a variety of oceanographic and climate related data sets. The analyses of the oceanographic and climate data illustrate the breadth of models available in STAMP, and that state-space methods produce results that provide new insights into important scientific problems.
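
    STAMP itself is commercial; purely to give the flavor of the same methodology in open-source tooling, here is a hedged sketch fitting the basic "local level" unobserved-components model with Python's statsmodels on simulated data (the series, seed, and variances are arbitrary, not the Nile data).

```python
# Fit a local level (random walk + noise) state-space model to simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
level = np.cumsum(rng.normal(0, 0.5, n))   # latent random-walk level
y = level + rng.normal(0, 1.0, n)          # observations = level + noise

model = sm.tsa.UnobservedComponents(y, level="local level")
res = model.fit(disp=False)
print(res.summary())
smoothed_level = res.smoothed_state[0]     # Kalman-smoothed level estimate
```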

  12. Building a Flexible Software Factory Using Partial Domain Specific Models

    NARCIS (Netherlands)

    Warmer, J.B.; Kleppe, A.G.

    2006-01-01

    This paper describes some experiences in building a software factory by defining multiple small domain specific languages (DSLs) and having multiple small models per DSL. This is in high contrast with traditional approaches using monolithic models, e.g. written in UML. In our approach, models behave

  13. Hardware Ports - Getting Rid of Sandboxed Modelled Software

    NARCIS (Netherlands)

    Bezemer, M.M.; Welch, P.H.; Barnes, F.R.M.; Broenink, Johannes F.; Chalmers, K.; Gibson-Robinson, T.; Ivimey-Cook, R.; McEwan, A.A.; Pedersen, J.B.; Sampson, A,; Smith, M.L.

    2014-01-01

    Software that is used to control machines and robots must be predictable and reliable. Model-Driven Design (MDD) techniques are used to comply with both the technical and business needs. This paper introduces a CSP meta-model that is suitable for these MDD techniques. The meta-model describes the

  14. Introduction to Financial Projection Models. Business Management Instructional Software.

    Science.gov (United States)

    Pomeroy, Robert W., III

    This guidebook and teacher's guide accompany a personal computer software program and introduce the key elements of financial projection modeling to project the financial statements of an industrial enterprise. The student will then build a model on an electronic spreadsheet. The guidebook teaches the purpose of a financial model and the steps…

  15. Linear mixed models a practical guide using statistical software

    CERN Document Server

    West, Brady T; Galecki, Andrzej T

    2014-01-01

    Highly recommended by JASA, Technometrics, and other journals, the first edition of this bestseller showed how to easily perform complex linear mixed model (LMM) analyses via a variety of software programs. Linear Mixed Models: A Practical Guide Using Statistical Software, Second Edition continues to lead readers step by step through the process of fitting LMMs. This second edition covers additional topics on the application of LMMs that are valuable for data analysts in all fields. It also updates the case studies using the latest versions of the software procedures and provides up-to-date information on the options and features of the software procedures available for fitting LMMs in SAS, SPSS, Stata, R/S-plus, and HLM. New to the second edition: a new chapter on models with crossed random effects that uses a case study to illustrate software procedures capable of fitting these models; power analysis methods for longitudinal and clustered study designs, including software options for power analyses and suggest...
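
    For readers working outside the five packages the book covers, a random-intercept LMM can also be fitted with Python's statsmodels; a hedged sketch on simulated clustered data follows (the column names, parameters, and seed are invented).

```python
# Random-intercept linear mixed model on simulated clustered data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
groups = np.repeat(np.arange(10), 20)         # 10 clusters, 20 obs each
u = rng.normal(0, 1.0, 10)[groups]            # cluster random intercepts
x = rng.normal(size=groups.size)
y = 2.0 + 0.5 * x + u + rng.normal(0, 0.8, groups.size)
df = pd.DataFrame({"y": y, "x": x, "g": groups})

result = smf.mixedlm("y ~ x", df, groups=df["g"]).fit()
print(result.summary())
```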

  16. Stochastic Differential Equation-Based Flexible Software Reliability Growth Model

    Directory of Open Access Journals (Sweden)

    P. K. Kapur

    2009-01-01

    Full Text Available Several software reliability growth models (SRGMs) have been developed for tracking and measuring the growth of reliability. When the software system is large and the number of faults detected during the testing phase becomes large, the change in the number of faults detected and removed through each debugging becomes small compared with the initial fault content at the beginning of the testing phase. In such a situation, we can model the software fault detection process as a stochastic process with a continuous state space. In this paper, we propose a new software reliability growth model based on an Itô-type stochastic differential equation. We consider an SDE-based generalized Erlang model with a logistic error detection function. The model is estimated and validated on real-life data sets cited in the literature to show its flexibility. The proposed model, integrated with the concept of stochastic differential equations, performs comparatively better than the existing NHPP-based models.
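
    For orientation, a common Itô-type formulation from the SDE-based SRGM literature is sketched below; the paper's generalized Erlang variant may differ in detail, so treat this as a representative form rather than the exact model.

```latex
% Representative It\^o-type SRGM; not necessarily the paper's exact model.
\begin{align*}
  dN(t) &= b(t)\bigl[a - N(t)\bigr]\,dt + \sigma\bigl[a - N(t)\bigr]\,dW(t),\\
  b(t)  &= \frac{b}{1 + c\,e^{-bt}} \qquad \text{(logistic fault-detection rate)}
\end{align*}
% N(t): faults detected by time t;  a: initial fault content;
% sigma: noise intensity;  W(t): standard Brownian motion.
```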

  17. Software Support of Modelling using Ergonomic Tools in Engineering

    Directory of Open Access Journals (Sweden)

    Darina Dupláková

    2017-08-01

    Full Text Available One of the preconditions for the correct development of industrial production is the continuous interconnection of the virtual and real worlds by computer software. Computer software is used for product modelling, creation of technical documentation, scheduling, management and optimization of manufacturing processes, and increasing the efficiency of human work in manufacturing plants. This article describes the frequently used ergonomic software packages that help improve human work by reducing error rates, risk factors of the working environment, and workplace injuries, and by eliminating emerging occupational diseases. They are categorized in the field of micro-ergonomics and are applicable at the manufacturing level, with a flexible approach to solving the established problems.

  18. An Evaluation of Software Cost Estimating Models.

    Science.gov (United States)

    1981-06-01

    and Model Outputs - Wolverton. Table 7 is a summary comparison of the model outputs with the needs described in Section 3. A liberal interpretation... Computer Program Development Costs, Tecolote Research, Inc., TM-7, Dec. 1974. [remainder of scanned abstract illegible]

  19. The option to expand a project: its assessment with the binomial options pricing model

    Directory of Open Access Journals (Sweden)

    Salvador Cruz Rambaud

    Full Text Available Traditional methods of investment appraisal, like the Net Present Value, are not able to include the value of the operational flexibility of the project. In this paper, real options, and more specifically the option to expand, are assumed to be included in the project information in addition to the expected cash flows. Thus, to calculate the total value of the project, we are going to apply the methodology of the Net Present Value to the different scenarios derived from the existence of the real option to expand. Taking into account the analogy between real and financial options, the value of including an option to expand is explored by using the binomial options pricing model. In this way, estimating the value of the option to expand is a tool which facilitates the control of the uncertainty element implicit in the project. Keywords: Real options, Option to expand, Binomial options pricing model, Investment project appraisal
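
    A hedged sketch of the valuation idea follows: a Cox-Ross-Rubinstein binomial lattice over project value, where at each node the holder may pay an expansion cost K to scale the project by a factor e. Every number below is illustrative, not from the paper.

```python
# Value a project carrying an (American-style) option to expand on a
# Cox-Ross-Rubinstein binomial lattice. All figures are illustrative.
import math

V0    = 100.0   # present value of the project's expected cash flows
K     = 25.0    # cost of expanding
e     = 1.30    # expansion scales project value by 30%
r     = 0.05    # risk-free rate (continuous, per year)
sigma = 0.30    # volatility of project value
T, n  = 3.0, 3  # horizon (years) and lattice steps

dt = T / n
u = math.exp(sigma * math.sqrt(dt))
d = 1.0 / u
p = (math.exp(r * dt) - d) / (u - d)     # risk-neutral probability
disc = math.exp(-r * dt)

# Terminal nodes: expand only if it adds value.
vals = [max(V0 * u**j * d**(n - j), e * V0 * u**j * d**(n - j) - K)
        for j in range(n + 1)]

# Roll back, keeping the option to expand alive at every node.
for step in range(n - 1, -1, -1):
    vals = [max(disc * (p * vals[j + 1] + (1 - p) * vals[j]),
                e * V0 * u**j * d**(step - j) - K)
            for j in range(step + 1)]

print(f"project value with option to expand: {vals[0]:.2f}")
print(f"value added by the option:           {vals[0] - V0:.2f}")
```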

  20. Expand the Modeling Capabilities of DOE's EnergyPlus Building Energy Simulation Program

    Energy Technology Data Exchange (ETDEWEB)

    Don Shirey

    2008-02-28

    EnergyPlus™ is a new-generation computer software analysis tool that has been developed, tested, and commercialized to support DOE's Building Technologies (BT) Program in terms of whole-building, component, and systems R&D (http://www.energyplus.gov). It is also being used to support evaluation and decision making of zero energy building (ZEB) energy efficiency and supply technologies during new building design and existing building retrofits. Version 1.0 of EnergyPlus was released in April 2001, followed by semiannually updated versions over the ensuing seven-year period. This report summarizes work performed by the University of Central Florida's Florida Solar Energy Center (UCF/FSEC) to expand the modeling capabilities of EnergyPlus. The project tasks involved implementing, testing, and documenting the following new features or enhancements of existing features: (1) a model for packaged terminal heat pumps; (2) a model for gas engine-driven heat pumps with waste heat recovery; (3) proper modeling of window screens; (4) integrating and streamlining EnergyPlus air flow modeling capabilities; (5) comfort-based controls for cooling and heating systems; and (6) an improved model for microturbine power generation with heat recovery. UCF/FSEC located existing mathematical models or generated new models for these features and incorporated them into EnergyPlus. The existing or new models were (re)written using the Fortran 90/95 programming language and were integrated within EnergyPlus in accordance with the EnergyPlus Programming Standard and Module Developer's Guide. Each model/feature was thoroughly tested and identified errors were repaired. Upon completion of each model implementation, the existing EnergyPlus documentation (e.g., Input Output Reference and Engineering Document) was updated with information describing the new or enhanced feature. Reference data sets were generated for several of the features to aid program users in selecting proper

  1. The Accuracy of RADIANCE Software in Modelling Overcast Sky Condition

    OpenAIRE

    Baharuddin

    2013-01-01

    A validation study of the sky models of the RADIANCE simulation software against the overcast sky condition has been carried out in order to test the accuracy of the RADIANCE sky model for modeling the overcast sky condition in Hong Kong. Two sets of data have been analysed. Firstly, data collected from a set of experiments using a physical scale model; in this experiment, the illuminance at four points inside the model was measured under real sky conditions. Secondly, the RADIANCE simulation has ...

  2. Capabilities and accuracy of energy modelling software

    CSIR Research Space (South Africa)

    Osburn, L

    2010-11-01

    Full Text Available consumption of future building will be. This can aid in selecting design changes to reduce energy consumption. Finally, calibrated energy models can be developed using the actual energy consumption results from a real building. Such a process can yield highly...

  3. A cognitive model for software architecture complexity

    NARCIS (Netherlands)

    Bouwers, E.; Lilienthal, C.; Visser, J.; Van Deursen, A.

    2010-01-01

    Evaluating the complexity of the architecture of a software system is a difficult task. Many aspects have to be considered to come to a balanced assessment. Several architecture evaluation methods have been proposed, but very few define a quality model to be used during the evaluation process. In

  4. The laws of software process a new model for the production and management of software

    CERN Document Server

    Armour, Phillip G

    2003-01-01

    The Nature of Software and The Laws of Software Process; A Brief History of Knowledge; The Characteristics of Knowledge Storage Media; The Nature of Software Development; The Laws of Software Process and the Five Orders of Ignorance; The Laws of Software Process; The First Law of Software Process; The Corollary to the First Law of Software Process; The Reflexive Creation of Systems and Processes; The Lemma of Eternal Lateness; The Second Law of Software Process; The Rule of Process Bifurcation; The Dual Hypotheses of Knowledge Discovery; Armour's Observation on Software Process; The Third Law of Software Process (also kn

  5. Architectural design of experience based factory model for software ...

    African Journals Online (AJOL)

    A model based on the experience factory approach, called EBF-SD, has been proposed earlier to overcome the limitations of experience management in the software development domain. An application prototype, called SDeX, was then developed based on the proposed model. The study on correlation analysis ...

  6. A Team Building Model for Software Engineering Courses Term Projects

    Science.gov (United States)

    Sahin, Yasar Guneri

    2011-01-01

    This paper proposes a new model for team building, which enables teachers to build coherent teams rapidly and fairly for the term projects of software engineering courses. Moreover, the model can also be used to build teams for any type of project, if the team member candidates are students, or if they are inexperienced on a certain subject. The…

  7. Invention software support by integrating function and mathematical modeling

    NARCIS (Netherlands)

    Chechurin, L.S.; Wits, Wessel Willems; Bakker, H.M.

    2015-01-01

    New idea generation is imperative for successful product innovation and technology development. This paper presents the development of a novel type of invention support software. The support tool integrates both function modeling and mathematical modeling, thereby enabling quantitative analyses on a

  8. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    Energy Technology Data Exchange (ETDEWEB)

    D. E. Shropshire; W. H. West

    2005-11-01

    The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies.

  9. Aspects of system modelling in Hardware/Software partitioning

    DEFF Research Database (Denmark)

    Knudsen, Peter Voigt; Madsen, Jan

    1996-01-01

    This paper addresses fundamental aspects of system modelling and partitioning algorithms in the area of Hardware/Software Codesign. Three basic system models for partitioning are presented and the consequences of partitioning according to each of these are analyzed. The analysis shows...

  10. Application of Process Modeling in a Software- Engineering Course

    Directory of Open Access Journals (Sweden)

    Gabriel Alberto García Mireles

    2001-11-01

    Full Text Available Coordination in a software development project is a critical issue in delivering a successful software product, within the constraints of time, functionality and budget agreed upon with the customer. One of the strategies for approaching this problem consists in the use of process modeling to document, evaluate, and redesign the software development process. The appraisal of the projects done in the Engineering and Methodology course of a program given at the Ensenada Center of Scientific Research and Higher Education (CICESE), from a process perspective, facilitated the identification of strengths and weaknesses in the development process used. This paper presents the evaluation of the practical portion of the course, the improvements made, and the preliminary results of using the process approach in the analysis phase of a software-development project.

  11. The RAGE Software Asset Model and Metadata Model

    NARCIS (Netherlands)

    Georgiev, Atanas; Grigorov, Alexander; Bontchev, Boyan; Boytchev, Pavel; Stefanov, Krassen; Bahreini, Kiavash; Nyamsuren, Enkhbold; Van der Vegt, Wim; Westera, Wim; Prada, Rui; Hollins, Paul; Moreno, Pablo

    2016-01-01

    Software assets are key output of the RAGE project and they can be used by applied game developers to enhance the pedagogical and educational value of their games. These software assets cover a broad spectrum of functionalities – from player analytics including emotion detection to intelligent

  12. Coevolution of variability models and related software artifacts

    DEFF Research Database (Denmark)

    Passos, Leonardo; Teixeira, Leopoldo; Dinztner, Nicolas

    2015-01-01

    Variant-rich software systems offer a large degree of customization, allowing users to configure the target system according to their preferences and needs. Facing high degrees of variability, these systems often employ variability models to explicitly capture user-configurable features (e... ... to the evolution of different kinds of software artifacts, it is not surprising that industry reports existing tools and solutions ineffective, as they do not handle the complexity found in practice. Attempting to mitigate this overall lack of knowledge and to support tool builders with insights on how variability models coevolve with other artifact types, we study a large and complex real-world variant-rich software system: the Linux kernel. Specifically, we extract variability-coevolution patterns capturing changes in the variability model of the Linux kernel with subsequent changes in Makefiles and C source...

  13. The expanding epigenetic landscape of non-model organisms.

    Science.gov (United States)

    Bonasio, Roberto

    2015-01-01

    Epigenetics studies the emergence of different phenotypes from a single genotype. Although these processes are essential to cellular differentiation and transcriptional memory, they are also widely used in all branches of the tree of life by organisms that require plastic but stable adaptation to their physical and social environment. Because of the inherent flexibility of epigenetic regulation, a variety of biological phenomena can be traced back to evolutionary adaptations of few conserved molecular pathways that converge on chromatin. For these reasons chromatin biology and epigenetic research have a rich history of chasing discoveries in a variety of model organisms, including yeast, flies, plants and humans. Many more fascinating examples of epigenetic plasticity lie outside the realm of model organisms and have so far been only sporadically investigated at a molecular level; however, recent progress on sequencing technology and genome editing tools have begun to blur the lines between model and non-model organisms, opening numerous new avenues for investigation. Here, I review examples of epigenetic phenomena in non-model organisms that have emerged as potential experimental systems, including social insects, fish and flatworms, and are becoming accessible to molecular approaches. © 2015. Published by The Company of Biologists Ltd.

  14. Expanded Medical Home Model Works for Children in Foster Care

    Science.gov (United States)

    Jaudes, Paula Kienberger; Champagne, Vince; Harden, Allen; Masterson, James; Bilaver, Lucy A.

    2012-01-01

    The Illinois Child Welfare Department implemented a statewide health care system to ensure that children in foster care obtain quality health care by providing each child with a medical home. This study demonstrates that the Medical Home model works for children in foster care, providing better health outcomes such as higher immunization rates. These…

  15. Computing and software

    OpenAIRE

    White, G C; Hines, J.E.

    2004-01-01

    The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up–to–date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential ...

  17. Software Piracy Detection Model Using Ant Colony Optimization Algorithm

    Science.gov (United States)

    Astiqah Omar, Nor; Zakuan, Zeti Zuryani Mohd; Saian, Rizauddin

    2017-06-01

    The Internet enables information to be accessible anytime and anywhere. This scenario creates an environment whereby information can be easily copied. Easy access to the Internet is one of the factors that contribute towards piracy in Malaysia as well as the rest of the world. The BSA Global Software Survey ("The Compliance Gap") conducted in 2013 found that 43 percent of the software installed on PCs around the world was not properly licensed, and the commercial value of the unlicensed installations worldwide was reported to be US$62.7 billion. Piracy can happen anywhere, including universities. Malaysia, like other countries in the world, faces piracy committed by students in universities. Piracy in universities concerns acts of stealing intellectual property. It can take the form of software piracy, music piracy, movie piracy, and piracy of intellectual materials such as books, articles and journals. This scenario affects the owners of intellectual property, as their property is put in jeopardy. This study developed a classification model for detecting software piracy. The model was developed using a swarm intelligence algorithm called the Ant Colony Optimization algorithm. The training data were collected in a study conducted at Universiti Teknologi MARA (Perlis). Experimental results show that the model's detection accuracy is better than that of the J48 algorithm.

  18. Model-Based Testing: The New Revolution in Software Testing

    Directory of Open Access Journals (Sweden)

    Hitesh KUMAR SHARMA

    2014-05-01

    Full Text Available The efforts spent on testing are enormous due to the continuing quest for better software quality and the ever-growing complexity of software systems. The situation is aggravated by the fact that the complexity of testing tends to grow faster than the complexity of the systems being tested, in the worst case even exponentially. Whereas development and construction methods allow the building of ever larger and more complex systems, there is a real danger that testing methods cannot keep pace with construction, and hence these new systems cannot be tested sufficiently fast and thoroughly. This may seriously hamper the development of future generations of software systems. One of the new technologies to meet the challenges imposed on software testing is model-based testing (MBT). Models can be utilized in many ways throughout the product life-cycle, including: improved quality of specifications, code generation, reliability analysis, and test generation. This paper focuses on the testing benefits of MBT methods, reviews some of the historical challenges that prevented model-based testing, and presents solutions that can overcome these challenges.
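
    As a toy illustration of the core MBT idea - deriving tests from a behavioural model rather than writing them by hand - the sketch below computes one test sequence per transition of a small state-machine model; the login-flow model and event names are invented.

```python
# Derive test sequences achieving transition coverage from a model via BFS.
from collections import deque

model = {   # state -> {event: next state}; an invented login flow
    "start": {"open": "login"},
    "login": {"ok": "home", "fail": "login"},
    "home":  {"logout": "start"},
}
initial = "start"

def tests_for_transition_coverage(model, initial):
    """One test per transition: shortest event path to its source, then the event."""
    paths, queue = {initial: []}, deque([initial])
    while queue:                      # BFS for shortest path to each state
        s = queue.popleft()
        for event, t in model[s].items():
            if t not in paths:
                paths[t] = paths[s] + [event]
                queue.append(t)
    return [paths[s] + [event] for s in model for event in model[s]]

for test in tests_for_transition_coverage(model, initial):
    print(" -> ".join(test))
```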

  19. Sharing Research Models: Using Software Engineering Practices for Facilitation.

    Science.gov (United States)

    Bryant, Stephanie P; Solano, Eric; Cantor, Susanna; Cooley, Philip C; Wagener, Diane K

    2011-03-01

    Increasingly, researchers are turning to computational models to understand the interplay of important variables on systems' behaviors. Although researchers may develop models that meet the needs of their investigation, application limitations - such as nonintuitive user interface features and data input specifications - may limit the sharing of these tools with other research groups. By removing these barriers, other research groups that perform related work can leverage these work products to expedite their own investigations. The use of software engineering practices can enable managed application production and shared research artifacts among multiple research groups by promoting consistent models, reducing redundant effort, encouraging rigorous peer review, and facilitating research collaborations that are supported by a common toolset. This report discusses three established software engineering practices - the iterative software development process, object-oriented methodology, and Unified Modeling Language - and the applicability of these practices to computational model development. Our efforts to modify the MIDAS TranStat application to make it more user-friendly are presented as an example of how computational models that are based on research and developed using software engineering practices can benefit a broader audience of researchers.

  20. Design and modeling balloon-expandable coronary stent for manufacturability

    Science.gov (United States)

    Suryawan, D.; Suyitno

    2017-02-01

    Coronary artery disease (CAD) is a disease caused by the narrowing of the coronary artery. The narrowing is usually caused by a cholesterol-containing deposit (plaque), which can cause a heart attack. CAD is the most common cause of mortality in Indonesia. The common CAD treatment uses a stent to open the narrowed coronary artery. In this study, the stent design is optimized for manufacturability. Modeling is used to determine the free stent expansion due to pressure applied on the inner surface of the stent. The stress distribution, outer diameter change, and dogboning phenomenon are investigated in the simulation. The modeling and simulation results were analyzed and used to optimize the stent design before it is manufactured using EDM (electrical discharge machining) in follow-up research.

  1. USER STORY SOFTWARE ESTIMATION:A SIMPLIFICATION OF SOFTWARE ESTIMATION MODEL WITH DISTRIBUTED EXTREME PROGRAMMING ESTIMATION TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Ridi Ferdiana

    2011-01-01

    Full Text Available Software estimation is an area of software engineering concerned with the identification, classification and measurement of features of software that affect the cost of developing and sustaining computer programs [19]. Measuring software through estimation serves to gauge the complexity of the software, to estimate the human resources required, and to get better visibility of execution and the process model. There are many estimation techniques that work sufficiently well in certain conditions or steps of software engineering, for example lines of code, function points, COCOMO, or use case points. This paper proposes another estimation technique called Distributed eXtreme Programming Estimation (DXP Estimation). DXP estimation provides a basic technique for teams using the eXtreme Programming method in onsite or distributed development. To the writers' knowledge, this is the first estimation technique applied to the agile method of eXtreme Programming.
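
    The DXP technique itself is only named in this record, so the sketch below shows nothing more than generic user-story estimation arithmetic - summing story points and dividing by an observed velocity - with invented numbers throughout.

```python
# Generic story-point estimation arithmetic; not the DXP technique itself.
stories = {                      # story -> points agreed by the team
    "user login": 3,
    "profile page": 5,
    "report export": 8,
    "admin console": 13,
}
velocity = 10                    # points the team finishes per iteration

total = sum(stories.values())
iterations = -(-total // velocity)   # ceiling division
print(f"{total} points -> about {iterations} iterations at velocity {velocity}")
```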

  2. A Reference Model for Software and System Inspections. White Paper

    Science.gov (United States)

    He, Lulu; Shull, Forrest

    2009-01-01

    Software Quality Assurance (SQA) is an important component of the software development process. SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities to provide adequate confidence that quality is being built into the software. Typical techniques include: (1) testing, (2) simulation, (3) model checking, (4) symbolic execution, (5) management reviews, (6) technical reviews, (7) inspections, (8) walk-throughs, (9) audits, (10) analysis (complexity analysis, control flow analysis, algorithmic analysis), and (11) formal methods. Our work over the last few years has resulted in substantial knowledge about SQA techniques, especially in the areas of technical reviews and inspections. But can we apply the same QA techniques to the system development process? If yes, what kind of tailoring do we need before applying them in the system engineering context? If not, what types of QA techniques are actually used at the system level? And is there any room for improvement? After a brief examination of the system engineering literature (especially focused on NASA and DoD guidance) we found that: (1) the system and software development processes interact with each other at different phases through the development life cycle; (2) reviews are emphasized in both system and software development (Fig. 1.3), and for some reviews (e.g. SRR, PDR, CDR) there are both system versions and software versions; (3) analysis techniques are emphasized (e.g. Fault Tree Analysis, Preliminary Hazard Analysis) and some details are given about how to apply them; (4) reviews are expected to use the outputs of the analysis techniques; in other words, these particular analyses are usually conducted in preparation for (before) reviews. The goal of our work is to explore the interaction between the Quality Assurance (QA) techniques at the system level and the software level.

  3. [Development and verification of Chinese dietary exposure evaluation model software].

    Science.gov (United States)

    Liu, Pei; Li, Jing-xin; Sun, Jin-fang; Xue, Jianping; Chen, Bing-wei; Zhang, Hong; Yu, Xiao-jin; Wang, Can-nan; Yuan, Bao-jun; Ma, Yong-jian; Tian, Zi-hua

    2010-03-01

    To develop dietary exposure evaluation model software with independent Chinese intellectual property rights, and to verify the rationality and accuracy of the results of its probabilistic model according to international standards. SAS was used to build the various evaluation models, based on data from the Chinese dietary survey and from surveillance of chemical compounds in food, and to design an operation interface. The results from the probabilistic dietary exposure model for children 2-7 years old were compared with those from a duplicate portion study of dietary exposure in children aged 2-7 years in Jinhu, Jiangsu province, in order to analyze the rationality of the model. The results from the probabilistic model were also compared with the results from the @Risk software, using the same data of 10 000 study subjects randomly selected from the national dietary survey, to verify the correctness of the probabilistic model. The mean drift was used as an internal index to illustrate the accuracy of the computation. The Chinese dietary exposure evaluation software was developed successfully. On rationality, the results from the probabilistic model were lower than those from point estimation (e.g., for cucumber, the point estimate of acephate exposure was 4.78 µg·kg⁻¹·d⁻¹, while the probabilistic-model result was 0.39 µg·kg⁻¹·d⁻¹), and higher than the results of the duplicate portion study (at P95, the probabilistic-model estimate of Pb exposure in children was 11.08 µg·kg⁻¹·d⁻¹, while the duplicate portion study gave 5.75 µg·kg⁻¹·d⁻¹). On accuracy, the results from @Risk and the probabilistic model were highly consistent (at P95, the probabilistic assessment of acephate dietary exposure was 4.27 µg·kg⁻¹·d⁻¹, while the @Risk result was 4.24 µg·kg⁻¹·d⁻¹).
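
    A hedged sketch of the probabilistic approach described above follows: exposure = consumption x concentration / body weight, simulated over random draws and summarized at the mean and P95. The distributions and parameters are invented, not taken from the Chinese survey data.

```python
# Monte Carlo dietary exposure: draw consumption, concentration, and body
# weight, compute exposure per draw, and report mean and P95.
import numpy as np

rng = np.random.default_rng(42)
n = 10_000
consumption = rng.lognormal(mean=3.0, sigma=0.6, size=n)     # g/day of a food
concentration = rng.lognormal(mean=-4.0, sigma=0.8, size=n)  # mg/kg in the food
body_weight = rng.normal(18.0, 3.0, size=n).clip(8, 40)      # kg, children

# (g/day) * (mg/kg food) / (kg bw) works out to microgram per kg bw per day
exposure = consumption * concentration / body_weight
print(f"mean = {exposure.mean():.3f}, "
      f"P95 = {np.percentile(exposure, 95):.3f} µg·kg⁻¹·d⁻¹")
```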

  4. Dynamics Modelling of Tensegrity Structures with Expanding Properties

    Science.gov (United States)

    Abdulkareem, Musa; Mahfouf, M.; Theilliol, D.

    Given the prestress level of a tensegrity structural system obtained from any form-finding method, an important step in the design process is to develop mathematical models that describe the behaviour of the system. Moreover, tensegrity structures are strongly dependent on their geometric, or kinematic, configurations. As such, except for small scale tensegrity structures with a few structural members, resorting to the use of computational techniques for analysis is a necessity. Because tensegrity structures are kinematically and statically indeterminate structures, a free standing tensegrity structure has at least one rigid body mode apart from the six rigid body modes that can be eliminated, for example, by applying boundary conditions assuming the structure is attached to a base. In this paper, a new general tool (applicable to small and large systems) for systematic and efficient formulation of structural models for tensegrity systems is proposed. Current tools are limited to structures with a few degrees of freedom (DOF), however, this new tool simplifies the analyses of tensegrity structures with several DOFs and provides a new insight into the behaviour of these interesting and yet challenging structures, at least from a control systems' viewpoint.

  5. Modelling and Evaluating Software Project Risks with Quantitative Analysis Techniques in Planning Software Development

    OpenAIRE

    Elzamly, Abdelrafe; Hussin, Burairah

    2015-01-01

    Risk is not always avoidable, but it is controllable. The aim of this paper is to present new techniques which use stepwise regression analysis to model and evaluate the risks in planning software development and reducing risk with software process improvement. Top ten software risk factors in the planning software development phase and thirty control factors were presented to respondents. This study incorporates a risk management approach and planning software development to mitigate software p...

  6. Application of the AHP method in modeling the trust and reputation of software agents

    Science.gov (United States)

    Zytniewski, Mariusz; Klementa, Marek; Skorupka, Dariusz; Stanek, Stanislaw; Duchaczek, Artur

    2016-06-01

    Given the unique characteristics of cyberspace and, in particular, the number of inherent security threats, communication between software agents becomes a highly complex issue and a major challenge that, on the one hand, needs to be continuously monitored and, on the other, awaits new solutions addressing its vulnerabilities. An approach that has recently come into view mimics mechanisms typical of social systems and is based on trust and reputation that assist agents in deciding which other agents to interact with. The paper offers an enhancement to existing trust and reputation models, involving the application of the AHP method that is widely used for decision support in social systems, notably for risks analysis. To this end, it is proposed to expand the underlying conceptual basis by including such notions as self-trust and social trust, and to apply these to software agents. The discussion is concluded with an account of an experiment aimed at testing the effectiveness of the proposed solution.
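
    A short sketch of the AHP step the paper builds on: derive priority weights for trust criteria from a pairwise-comparison matrix via the principal eigenvector, then check consistency. The criteria and judgments below are invented for illustration.

```python
# AHP: priority vector from a pairwise-comparison matrix + consistency check.
import numpy as np

criteria = ["self-trust", "social trust", "reputation"]
A = np.array([[1.0, 3.0, 5.0],       # judgments on Saaty's 1-9 scale
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                          # normalized priority vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)  # consistency index
cr = ci / 0.58                        # random index RI = 0.58 for n = 3
for name, weight in zip(criteria, w):
    print(f"{name}: {weight:.3f}")
print(f"consistency ratio = {cr:.3f} (acceptable if < 0.1)")
```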

  7. New Results in Software Model Checking and Analysis

    Science.gov (United States)

    Pasareanu, Corina S.

    2010-01-01

    This introductory article surveys new techniques, supported by automated tools, for the analysis of software to ensure reliability and safety. Special focus is on model checking techniques. The article also introduces the five papers that are enclosed in this special journal volume.

  8. An Evaluation of ADLs on Modeling Patterns for Software Architecture

    NARCIS (Netherlands)

    Waqas Kamal, Ahmad; Avgeriou, Paris

    2007-01-01

    Architecture patterns provide solutions to recurring design problems at the architecture level. In order to model patterns during software architecture design, one may use a number of existing Architecture Description Languages (ADLs), including the UML, a generic language but also a de facto

  9. Software-engineering-based model for mitigating Repetitive Strain ...

    African Journals Online (AJOL)

    The incorporation of Information and Communication Technology (ICT) in virtually all facets of human endeavour has fostered the use of computers. This has induced Repetitive Strain Injury (RSI) in continuous and persistent computer users. This paper proposes a software engineering model capable of enacting RSI force breaks ...

  10. Advanced quality prediction model for software architectural knowledge sharing

    NARCIS (Netherlands)

    Liang, Peng; Jansen, Anton; Avgeriou, Paris; Tang, Antony; Xu, Lai

    In the field of software architecture, a paradigm shift is occurring from describing the outcome of architecting process to describing the Architectural Knowledge (AK) created and used during architecting. Many AK models have been defined to represent domain concepts and their relationships, and

  11. Advances in Games Technology: Software, Models, and Intelligence

    Science.gov (United States)

    Prakash, Edmond; Brindle, Geoff; Jones, Kevin; Zhou, Suiping; Chaudhari, Narendra S.; Wong, Kok-Wai

    2009-01-01

    Games technology has undergone tremendous development. In this article, the authors report the rapid advancement that has been observed in the way games software is being developed, as well as in the development of games content using game engines. One area that has gained special attention is modeling the game environment such as terrain and…

  12. ERP Software Selection Model using Analytic Network Process

    OpenAIRE

    Lesmana , Andre Surya; Astanti, Ririn Diar; Ai, The Jin

    2014-01-01

    During the implementation of Enterprise Resource Planning (ERP) in any company, one of the most important issues is the selection of ERP software that can satisfy the needs and objectives of the company. This issue is crucial since it may affect the duration of ERP implementation and the costs incurred for it. This research tries to construct a model of ERP software selection that is beneficial to the company in carrying out the selection of the right ERP sof...

  13. Software Engineering Laboratory (SEL) relationships, models, and management rules

    Science.gov (United States)

    Decker, William; Hendrick, Robert; Valett, Jon D.

    1991-01-01

    Over 50 individual Software Engineering Laboratory (SEL) research results, extracted from a review of published SEL documentation, that can be applied directly to managing software development projects are captured. Four basic categories of results are defined and discussed - environment profiles, relationships, models, and management rules. In each category, research results are presented as a single page that summarizes the individual result, lists potential uses of the result by managers, and references the original SEL documentation where the result was found. The document serves as a concise reference summary of applicable research for SEL managers.

  14. Opensource Software for MLR-Modelling of Solar Collectors

    DEFF Research Database (Denmark)

    Bacher, Peder; Perers, Bengt

    2011-01-01

    A first research version is now in operation of a software package for multiple linear regression (MLR) modeling and analysis of solar collectors, following ideas originating from Walletun et al. (1986) and Perers (1987 and 1993). The tool has been implemented in the free and open source program R (http://www.r-project.org/). Applications of the software package include: visual validation, resampling and conversion of data, collector performance testing analysis according to the European Standard EN 12975 (Fischer et al., 2004), and statistical validation of results. Together with a new Excel tool connected to EN 12975 (Kovacs, 2011), this built-in validation gives an extra quality assurance.
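
    As a concrete illustration of the MLR idea (a minimal sketch, not the authors' R package; the simplified steady-state collector equation and all parameter values are assumptions), the collector coefficients can be estimated by ordinary least squares:

```python
import numpy as np

# Synthetic measurements: irradiance G [W/m^2], collector-minus-ambient
# temperature difference dT [K], and measured collector power q [W/m^2].
rng = np.random.default_rng(0)
G = rng.uniform(200, 1000, 200)
dT = rng.uniform(0, 60, 200)
q = 0.75 * G - 3.5 * dT - 0.015 * dT**2 + rng.normal(0, 5, 200)

# Design matrix of a simplified steady-state collector equation:
#   q = eta0 * G - a1 * dT - a2 * dT^2
X = np.column_stack([G, -dT, -dT**2])

# Ordinary least-squares estimate of the collector parameters.
eta0, a1, a2 = np.linalg.lstsq(X, q, rcond=None)[0]
print(f"eta0 = {eta0:.3f}, a1 = {a1:.2f} W/m2K, a2 = {a2:.4f} W/m2K2")
```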

  15. Modeling the geographical studies with GeoGebra-software

    Directory of Open Access Journals (Sweden)

    Ionica Soare

    2010-01-01

    The problem of mathematical modeling in geography is one of the most important strategies for establishing the evolution and prediction of geographical phenomena. Models must have a simplified structure, reflect essential components, and be selective, structured, suggestive and approximate to reality. Models can be static or dynamic, and may be developed in a theoretical, symbolic, conceptual or mental way, and modeled mathematically. The present paper focuses on a virtual model which uses the GeoGebra software, free and available at www.geogebra.org, in order to establish new methods of geographical analysis in a dynamic, didactic way.

  16. Frame-rate performance modeling of software MPEG decoder

    Science.gov (United States)

    Ramamoorthy, Victor

    1997-01-01

    A software MPEG decoder, though attractive in terms of performance and cost, opens up new technical challenges. The most critical questions are: When does a software decoder drop a frame? How can its timing performance be predicted well ahead of implementation? These questions are not easy to answer without introducing a stochastic model of the decoding time. With a double buffering scheme, fluctuations in decoding time can be smoothed out to a large extent; however, the dropping of frames cannot be totally eliminated. New ideas of slip and asymptotic synchronous locking are shown to answer critical design questions of a software decoder. Beneath the troubled world of frame droppings lies the beauty and harmony of our stochastic formulation.
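
    A toy Monte Carlo rendering of such a stochastic timing model (hypothetical decode-time distribution and buffering policy, not the paper's formulation) shows how buffering smooths fluctuations without eliminating drops:

```python
import random

FRAME_PERIOD = 1 / 30                    # display deadline spacing, seconds
MEAN_DECODE, JITTER = 0.030, 0.012       # assumed decode-time distribution

def drop_rate(frames=10000, buffers=2, seed=1):
    random.seed(seed)
    ready, drops = 0.0, 0                # time when the decoder becomes free
    for i in range(frames):
        deadline = (i + buffers) * FRAME_PERIOD  # slack from buffering
        ready += max(0.0, random.gauss(MEAN_DECODE, JITTER))
        if ready > deadline:             # frame not decoded in time
            drops += 1
            ready = deadline             # decoder skips ahead
    return drops / frames

for b in (1, 2, 3):
    print(f"{b} buffer(s): drop rate = {drop_rate(buffers=b):.3%}")
```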

  17. A software for parameter estimation in dynamic models

    Directory of Open Access Journals (Sweden)

    M. Yuceer

    2008-12-01

    A common problem in dynamic systems is to determine parameters in an equation used to represent experimental data. The goal is to determine the values of model parameters that provide the best fit to measured data, generally based on some type of least squares or maximum likelihood criterion. In the most general case, this requires the solution of a nonlinear and frequently non-convex optimization problem. Some of the available software lacks generality, while other software does not provide ease of use. A user-interactive parameter estimation software was needed for identifying kinetic parameters. In this work we developed an integration-based optimization approach to provide a solution to such problems. For easy implementation of the technique, a parameter estimation software (PARES) has been developed in the MATLAB environment. When tested with extensive example problems from the literature, the suggested approach is shown to provide good agreement between predicted and observed data with relatively little computing time and few iterations.
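
    The integration-based estimation idea can be sketched in a few lines, here fitting a hypothetical first-order kinetic model with SciPy rather than the PARES/MATLAB implementation:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

# Hypothetical data: concentration decaying in a batch reactor.
t_obs = np.linspace(0, 10, 11)
c_obs = np.array([1.00, 0.78, 0.62, 0.49, 0.40, 0.31,
                  0.25, 0.20, 0.16, 0.13, 0.10])

def model(t, c, k):           # dc/dt = -k * c
    return -k * c

def residuals(p):
    # Integrate the ODE for the trial parameter, then compare with data.
    sol = solve_ivp(model, (t_obs[0], t_obs[-1]), [c_obs[0]],
                    t_eval=t_obs, args=(p[0],))
    return sol.y[0] - c_obs

fit = least_squares(residuals, x0=[0.1])
print(f"estimated rate constant k = {fit.x[0]:.3f} 1/h")
```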

  18. Model for Simulating a Spiral Software-Development Process

    Science.gov (United States)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code

  19. Effective Team Support: From Modeling to Software Agents

    Science.gov (United States)

    Remington, Roger W. (Technical Monitor); John, Bonnie; Sycara, Katia

    2003-01-01

    The purpose of this research contract was to perform multidisciplinary research between CMU psychologists, computer scientists and engineers and NASA researchers to design a next generation collaborative system to support a team of human experts and intelligent agents. To achieve robust performance enhancement of such a system, we had proposed to perform task and cognitive modeling to thoroughly understand the impact technology makes on the organization and on key individual personnel. Guided by cognitively-inspired requirements, we would then develop software agents that support the human team in decision making, information filtering, information distribution and integration to enhance team situational awareness. During the period covered by this final report, we made substantial progress in modeling infrastructure and task infrastructure. Work is continuing under a different contract to complete empirical data collection, cognitive modeling, and the building of software agents to support the team's task.

  20. ESPC Common Model Architecture Earth System Modeling Framework (ESMF) Software and Application Development

    Science.gov (United States)

    2015-09-30

    The National Unified Operational Prediction Capability (NUOPC) was established between NOAA and the Navy to develop a common software architecture for easy and efficient interoperability. The ... model architecture and other software-related standards in this project. OBJECTIVES: NUOPC proposes to accelerate improvement of our national ...

  1. Software for multimodal battlefield signal modeling and optimal sensor placement

    Science.gov (United States)

    Yamamoto, Kenneth K.; Vecherin, Sergey N.; Wilson, D. Keith; Borden, Christian T.; Bettencourt, Elizabeth; Pettit, Chris L.

    2012-05-01

    Effective use of passive and active sensors for surveillance, security, and intelligence must consider terrain and atmospheric effects on the sensor performance. Several years ago, U.S. Army ERDC undertook development of software for modeling environmental effects on target signatures, signal propagation, and battlefield sensors for many signal modalities (e.g., optical, acoustic, seismic, magnetic, radio-frequency, chemical, biological, and nuclear). Since its inception, the software, called Environmental Awareness for Sensor and Emitter Employment (EASEE), has matured and evolved significantly for simulating a broad spectrum of signal-transmission and sensing scenarios. The underlying software design involves a flexible, object-oriented approach to the various stages of signal modeling from emission through processing into inferences. A sensor placement algorithm has also been built in for optimizing sensor selections and placements based on specification of sensor supply limitations, coverage priorities, and wireless sensor communication requirements. Some recent and ongoing enhancements are described, including modeling of active sensing scenarios and signal reflections, directivity of signal emissions and sensors, improved handling of signal feature dependencies, extensions to realistically model additional signal modalities such as infrared and RF, and XML-based communication with other calculation and display engines.
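
    Sensor placement under supply limits is commonly approached with greedy coverage maximization; the sketch below is a generic version of that idea (hypothetical locations and coverage sets), not the EASEE algorithm itself:

```python
def greedy_placement(candidates, coverage, budget):
    """candidates: iterable of location ids.
    coverage: dict location -> set of covered grid cells.
    budget: maximum number of sensors that may be placed."""
    chosen, covered = [], set()
    for _ in range(budget):
        best = max(candidates,
                   key=lambda loc: len(coverage[loc] - covered))
        if not coverage[best] - covered:
            break                        # no location adds new coverage
        chosen.append(best)
        covered |= coverage[best]
    return chosen, covered

# Tiny example: 4 candidate locations covering cells of a priority area.
coverage = {"A": {1, 2, 3}, "B": {3, 4}, "C": {5}, "D": {1, 5, 6}}
print(greedy_placement(list(coverage), coverage, budget=2))
```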

  2. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
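
    The fine-grained "unit" tests advocated above can be illustrated with a self-contained example; the routine (a Magnus-type saturation vapor pressure formula) and tolerances are chosen for illustration, not taken from any particular climate model:

```python
import math

def saturation_vapor_pressure(t_kelvin):
    """Simplified Magnus-type formula, returns hPa (illustrative only)."""
    t_c = t_kelvin - 273.15
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def test_known_value():
    # At 0 degrees C the formula should return ~6.112 hPa.
    assert abs(saturation_vapor_pressure(273.15) - 6.112) < 1e-9

def test_monotonicity():
    # Physical sanity check: pressure must rise with temperature.
    temps = [260.0, 270.0, 280.0, 290.0, 300.0]
    values = [saturation_vapor_pressure(t) for t in temps]
    assert all(a < b for a, b in zip(values, values[1:]))

if __name__ == "__main__":
    test_known_value()
    test_monotonicity()
    print("all unit tests passed")
```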

  3. Duplication Detection When Evolving Feature Models of Software Product Lines

    Directory of Open Access Journals (Sweden)

    Amal Khtira

    2015-10-01

    After the derivation of specific applications from a software product line, the applications keep evolving with respect to new customer requirements. In general, evolutions in most industrial projects are expressed using natural language, because it is the easiest and the most flexible way for customers to express their needs. However, the use of this means of communication has shown its limits in detecting defects, such as inconsistency and duplication, when evolving the existing models of the software product line. The aim of this paper is to transform the natural language specifications of new evolutions into a more formal representation using natural language processing. Then, an algorithm is proposed to automatically detect duplication between these specifications and the existing product line feature models. To instantiate the proposed solution, a tool was developed to automate the two operations.
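
    A minimal stand-in for such duplication detection is token-overlap (Jaccard) similarity after light normalization; the paper's actual NLP pipeline is richer, and the stop-word list and threshold below are arbitrary assumptions:

```python
import re

STOP = {"the", "a", "an", "shall", "be", "to", "of", "and", "or"}

def tokens(text):
    return {w for w in re.findall(r"[a-z]+", text.lower())} - STOP

def jaccard(a, b):
    ta, tb = tokens(a), tokens(b)
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

new_spec = "The user shall be able to export the report as PDF"
existing_features = [
    "Export report to PDF format",
    "Import customer data from CSV",
]
for feat in existing_features:
    score = jaccard(new_spec, feat)
    flag = "DUPLICATE?" if score > 0.4 else "ok"
    print(f"{score:.2f}  {flag}  {feat}")
```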

  4. Usage models in reliability assessment of software-based systems

    Energy Technology Data Exchange (ETDEWEB)

    Haapanen, P.; Pulkkinen, U. [VTT Automation, Espoo (Finland); Korhonen, J. [VTT Electronics, Espoo (Finland)

    1997-04-01

    This volume in the OHA-project report series deals with the statistical reliability assessment of software based systems on the basis of dynamic test results and qualitative evidence from the system design process. Other reports to be published later on in the OHA-project report series will handle the diversity requirements in safety critical software-based systems, generation of test data from operational profiles and handling of programmable automation in plant PSA-studies. In this report the issues related to the statistical testing and especially automated test case generation are considered. The goal is to find an efficient method for building usage models for the generation of statistically significant set of test cases and to gather practical experiences from this method by applying it in a case study. The scope of the study also includes the tool support for the method, as the models may grow quite large and complex. (32 refs., 30 figs.).
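
    Usage models for statistical testing are often expressed as Markov chains from which test cases are sampled; a minimal sketch with hypothetical states and transition probabilities:

```python
import random

# Hypothetical usage model: state -> list of (next state, probability).
usage_model = {
    "Start":  [("Login", 1.0)],
    "Login":  [("Query", 0.7), ("Logout", 0.3)],
    "Query":  [("Query", 0.5), ("Report", 0.3), ("Logout", 0.2)],
    "Report": [("Query", 0.4), ("Logout", 0.6)],
    "Logout": [],                        # terminal state
}

def generate_test_case(seed=None):
    rng = random.Random(seed)
    state, path = "Start", ["Start"]
    while usage_model[state]:
        nxt, = rng.choices([s for s, _ in usage_model[state]],
                           weights=[p for _, p in usage_model[state]])
        state = nxt
        path.append(state)
    return path

for i in range(3):
    print(" -> ".join(generate_test_case(seed=i)))
```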

  5. Software analysis for modeling the parameters of shunting locomotives chassis

    Directory of Open Access Journals (Sweden)

    Falendysh Anatoliy

    2017-01-01

    The article provides an overview of software designed to simulate structures, calculate their states, and evaluate their response to loads applied at any point in the model. In this case, we are interested in the possibility of modeling locomotive chassis frames, with the possibility of determining the weakest points of their construction and the remaining life of the structure. For this purpose, the article presents a developed model for calculating the frame of the diesel locomotive chassis, taking into account technical, economic and other parameters.

  6. Transfer Learning for Improving Model Predictions in Highly Configurable Software

    OpenAIRE

    Jamshidi, Pooyan; Velez, Miguel; Kästner, Christian; Siegmund, Norbert; Kawthekar, Prasad

    2017-01-01

    Modern software systems are built to be used in dynamic environments using configuration capabilities to adapt to changes and external uncertainties. In a self-adaptation context, we are often interested in reasoning about the performance of the systems under different configurations. Usually, we learn a black-box model based on real measurements to predict the performance of the system given a specific configuration. However, as modern systems become more complex, there are many configuratio...
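
    Learning a black-box performance model from measured configurations can be as simple as a linear regression over option indicators (a deliberately small sketch with made-up data; the transfer-learning aspect studied in the paper is not shown):

```python
import numpy as np

# Hypothetical measurements: each row is a configuration of three binary
# options (cache, compression, encryption); y is the observed latency in ms.
X = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1],
              [1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1]])
y = np.array([100.0, 62.0, 118.0, 131.0, 80.0, 93.0, 149.0, 111.0])

# Fit latency ~ intercept + weighted sum of option indicators.
A = np.column_stack([np.ones(len(X)), X])
w, *_ = np.linalg.lstsq(A, y, rcond=None)
print("intercept and per-option effects:", np.round(w, 1))

# Predict latency for a configuration directly from its option vector.
config = np.array([1.0, 1, 1, 0])        # cache + compression enabled
print("predicted latency:", round(config @ w, 1), "ms")
```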

  7. APPLICATION OF THE TECHNOLOGY ACCEPTANCE MODEL IN THE USE OF AUDIT SOFTWARE BY AUDITORS

    Directory of Open Access Journals (Sweden)

    Dhini Suryandini

    2012-03-01

    The aim of this research is to test the factors influencing auditor acceptance of audit software, using the Technology Acceptance Model (TAM). The data were collected using a survey method, through mail and email delivered to auditors in the big 4 CPA firms in Indonesia. The data were analyzed using the Partial Least Squares (PLS) method, an alternative method of Structural Equation Modeling (SEM), applied with the Smart PLS application program. The results of this research indicate that there are positive relationships between perceived usefulness (PU) and attitude to the use of the audit software (ATT), between perceived usefulness (PU) and actual use (AU), between experience (EXP) and perceived usefulness (PU), and between computer self-efficacy (CSE) and perceived ease of use (PEOU). Two factors have a significant influence on auditor acceptance of audit software, both directly and indirectly. Perceived usefulness has a direct positive influence on actual use (AU).

  8. Induction of continuous expanding infrarenal aortic aneurysms in a large porcine animal model

    Directory of Open Access Journals (Sweden)

    Brian O. Kloster

    2015-03-01

    Conclusion: In pigs it is possible to induce continuously expanding AAAs based upon proteolytic degradation and pathological flow, resembling the real-life dynamics of human aneurysms. Because the lumbar arteries are preserved, it is also a potential model for further studies of novel endovascular devices and their complications.

  9. Predicting Examination Performance Using an Expanded Integrated Hierarchical Model of Test Emotions and Achievement Goals

    Science.gov (United States)

    Putwain, Dave; Deveney, Carolyn

    2009-01-01

    The aim of this study was to examine an expanded integrative hierarchical model of test emotions and achievement goal orientations in predicting the examination performance of undergraduate students. Achievement goals were theorised as mediating the relationship between test emotions and performance. 120 undergraduate students completed…

  10. Modeling of ion exchange expanded-bed chromatography for the purification of C-phycocyanin.

    Science.gov (United States)

    Moraes, Caroline Costa; Mazutti, Marcio A; Maugeri, Francisco; Kalil, Susana Juliano

    2013-03-15

    This work is focused on the experimental evaluation and mathematical modeling of ion exchange expanded-bed chromatography for the purification of C-phycocyanin from crude fermentative broth containing Spirulina platensis cells. Experiments were carried out at different expansion degrees to evaluate the process performance. The experimental breakthrough curves were used to estimate the mass transfer and kinetic parameters of the proposed model, using the Particle Swarm Optimization (PSO) algorithm. The proposed model satisfactorily fitted the experimental data. The results from the model application pointed out that an increase in the initial bed height does not influence the process efficiency, but enables operation of the expanded-bed column at high volumetric flow rates, improving productivity. It was also shown that mathematical modeling is a good and promising tool for the optimization of chromatographic processes.

  11. Software sensors based on the grey-box modelling approach

    DEFF Research Database (Denmark)

    Carstensen, J.; Harremoës, P.; Strube, Rune

    1996-01-01

    In this paper, it is demonstrated that many storage basins in sewer systems can be used as an on-line flow measurement provided that the basin is monitored on-line with a level transmitter and that a grey-box model for the specific dynamics is identified. Similarly, an on-line software sensor for detecting the occurrence of backwater phenomena can be developed by comparing the dynamics of a flow measurement with a nearby level measurement. For treatment plants it is found that grey-box models applied to on-line measurements ... With respect to the development of software sensors, the grey-box models possess two important features. Firstly, the on-line measurements can be filtered according to the grey-box model in order to remove noise deriving from the measuring equipment and controlling devices. Secondly, the grey-box models may contain terms which can be estimated on-line by use of the models and measurements.
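
    The noise-filtering feature of such software sensors can be illustrated with a scalar Kalman filter applied to a noisy level measurement (a generic sketch under assumed noise variances, not the authors' grey-box model):

```python
import random

def kalman_filter(measurements, process_var=1e-4, meas_var=4e-2):
    """Scalar Kalman filter: estimate the true basin level from noisy data."""
    x, p = measurements[0], 1.0          # initial estimate and its variance
    estimates = []
    for z in measurements:
        p += process_var                 # predict (random-walk level model)
        k = p / (p + meas_var)           # Kalman gain
        x += k * (z - x)                 # update with the measurement
        p *= (1 - k)
        estimates.append(x)
    return estimates

random.seed(42)
true_level = [1.0 + 0.01 * t for t in range(50)]      # slowly rising level
noisy = [lvl + random.gauss(0, 0.2) for lvl in true_level]
filtered = kalman_filter(noisy)
print(f"last noisy reading:  {noisy[-1]:.3f}")
print(f"last filtered value: {filtered[-1]:.3f} (true {true_level[-1]:.3f})")
```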

  12. A Formal Model and Verification Problems for Software Defined Networks

    Directory of Open Access Journals (Sweden)

    V. A. Zakharov

    2013-01-01

    Software-defined networking (SDN) is an approach to building computer networks that separates and abstracts the data plane and control plane of these systems. In an SDN, a centralized controller manages a distributed set of switches. A set of open commands for packet forwarding and flow-table updating was defined in the form of a protocol known as OpenFlow. In this paper we describe an abstract formal model of SDN, introduce a tentative language for the specification of SDN forwarding policies, and formally set up model-checking problems for SDN.
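
    One core question such models must answer is reachability through the composed flow tables; a tiny explicit-state check over a hypothetical topology (far simpler than the paper's formal model):

```python
from collections import deque

# Hypothetical flow tables: switch -> {destination host: next hop}.
flow_tables = {
    "s1": {"h2": "s2", "h3": "s3"},
    "s2": {"h2": "h2", "h3": "s3"},
    "s3": {"h3": "h3", "h2": "s1"},   # sends h2 traffic back via s1
}

def reaches(src_switch, dst_host):
    """BFS over forwarding hops; tracking visited nodes detects loops."""
    seen, queue = set(), deque([src_switch])
    while queue:
        node = queue.popleft()
        if node == dst_host:
            return True
        if node in seen or node not in flow_tables:
            continue                  # loop, or a host that is not the target
        seen.add(node)
        nxt = flow_tables[node].get(dst_host)
        if nxt:
            queue.append(nxt)
    return False

print("h3 reachable from s1:", reaches("s1", "h3"))   # True
print("h2 reachable from s3:", reaches("s3", "h2"))   # True (via s1 -> s2)
```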

  13. A Software Development Simulation Model of a Spiral Process

    Science.gov (United States)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    There is a need for simulation models of software development processes other than the waterfall because processes such as spiral development are becoming more and more popular. The use of a spiral process can make the inherently difficult job of cost and schedule estimation even more challenging due to its evolutionary nature, but this allows for a more flexible process that can better meet customers' needs. This paper will present a discrete event simulation model of spiral development that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process.
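
    The contrast can be made concrete with a very small Monte Carlo comparison (hypothetical effort and rework figures; the paper's model is a full discrete-event simulation with many more inputs):

```python
import random

def waterfall(effort=100, seed=None):
    """All requirements fixed up front; late changes trigger heavy rework."""
    rng = random.Random(seed)
    late_changes = rng.randint(0, 5)
    return effort + late_changes * 10         # costly late rework

def spiral(effort=100, iterations=4, overhead=6, seed=None):
    """Same scope split into iterations; changes absorbed cheaply per cycle."""
    rng = random.Random(seed)
    total = iterations * overhead             # per-iteration planning/risk cost
    for _ in range(iterations):
        total += effort / iterations + rng.randint(0, 2) * 2
    return total

n = 10000
wf = sum(waterfall(seed=i) for i in range(n)) / n
sp = sum(spiral(seed=i) for i in range(n)) / n
print(f"mean waterfall effort: {wf:.1f} person-weeks")
print(f"mean spiral effort:    {sp:.1f} person-weeks")
```

    With these made-up numbers the spiral run costs somewhat more on average, mirroring the abstract's point that a spiral process may cost more yet absorbs requirement changes far more gracefully.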

  14. COMPUTATIONAL MODELLING OF BUFFETING EFFECTS USING OPENFOAM SOFTWARE PACKAGE

    Directory of Open Access Journals (Sweden)

    V. T. Kalugin

    2015-01-01

    In this paper, preliminary results of the computational modeling of an aircraft with the airbrake deployed are presented. The calculations were performed with the OpenFOAM software package. The results outlined are part of a research project to optimise aircraft performance using a perforated airbrake. Within this stage of the project, the OpenFOAM software package with a hybrid RANS-LES approach was tested on a given configuration of the aircraft and airbrake, and the results were compared with test data. For the worst case, the amplitude of the peak force acting on the tail fin can be up to 6 times higher than the average value without the airbrake deployed. To reduce the unsteady loads acting on the tail fin, perforation of the airbrake was proposed.

  15. Model-Driven Reengineering Legacy Software Systems to Web Services

    National Research Council Canada - National Science Library

    Cao, Fei; Bryant, Barrett R; Zhao, Wei; Burt, Carol C; Raje, Rajeev R; Olson, Andrew M; Auguston, Mikhail

    2005-01-01

    .... In this paper, we present a comprehensive, systematic, automatable approach toward reengineering legacy software systems to WS applications, rather than rewriting the whole legacy software system...

  16. Evolving a Simulation Model Product Line Software Architecture from Heterogeneous Model Representations

    Science.gov (United States)

    2003-09-01

    "Crouching Dragon, Hidden Software: Software in DoD Weapon Systems," IEEE Software, Vol. 18, No. 4, July-August 2001, pp. 105-107. [FG99] ... objects hidden ... Generalization is a form of abstraction establishing an is-a relationship between the specialized and generic object, when similar ... concept. [CGG98] explained the JSIMS Military Modeling Framework initiative, a Department enterprise-wide Tiger Team initiative to develop Mission ...

  17. Home-based chronic care. An expanded integrative model for home health professionals.

    Science.gov (United States)

    Suter, Paula; Hennessey, Beth; Harrison, Gregory; Fagan, Martha; Norman, Barbara; Suter, W Newton

    2008-04-01

    The Chronic Care Model (CCM) developed by Wagner is an influential and accepted guide for the care of patients with chronic disease. Wagner acknowledges a current healthcare focus on acute care needs that often circumvents chronic care coordination. He identifies the need for a "division of labor" to assist the primary care physician with this neglected function. This article posits that the role of chronic care coordination assistance and disease management fits within the purview of home healthcare and should be central to home health chronic care delivery. An expanded Home-Based Chronic Care Model (HBCCM) is described that builds on Wagner's model and integrates salient theories from fields beyond medicine. The expanded model maximizes the potential for disease self-management success and is intended to provide a foundation for home health's integral role in chronic disease management.

  18. Plans for performance and model improvements in the LISE++ software

    Science.gov (United States)

    Kuchera, M. P.; Tarasov, O. B.; Bazin, D.; Sherrill, B. M.; Tarasova, K. V.

    2016-06-01

    The LISE++ software for fragment separator simulations is undergoing a major update. LISE++ is the standard software used at in-flight separator facilities for predicting beam intensity and purity. The code simulates nuclear physics experiments where fragments are produced and then selected with a fragment separator. A set of modifications to improve the functionality of the code is discussed in this work. These modifications include a port to a modern graphics framework and updated compilers, to aid the performance and sustainability of the code. To accommodate the diversity of our users' computer platform preferences, we are extending the software from Windows to a cross-platform application. The calculations of beam transport and isotope production are becoming more computationally intense with the new large-scale facilities. Planned new features include new types of optimization, for example optimization of ion optics, improvements in reaction models, and new event generator options. In addition, LISE++ interfaces with control systems are planned. Computational improvements, as well as the schedule for updating this large package, will be discussed.

  19. Combining Static Analysis and Model Checking for Software Analysis

    Science.gov (United States)

    Brat, Guillaume; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2003-01-01

    We present an iterative technique in which model checking and static analysis are combined to verify large software systems. The role of the static analysis is to compute partial-order information which the model checker uses to reduce the state space. During exploration, the model checker also computes aliasing information that it gives to the static analyzer, which can then refine its analysis. The result of this refined analysis is then fed back to the model checker, which updates its partial-order reduction. At each step of this iterative process, the static analysis computes optimistic information which results in an unsafe reduction of the state space. However, we show that the process converges to a fixed point, at which time the partial-order information is safe and the whole state space is explored.
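
    The iterate-until-fixed-point structure of the combined analysis can be shown abstractly; in this schematic sketch the exchanged partial-order and aliasing information is reduced to plain sets of facts (the real analyses are, of course, far more involved):

```python
def combined_analysis(initial_facts, static_refine, model_check):
    """Alternate two analyses until neither adds new information."""
    facts = set(initial_facts)
    while True:
        refined = static_refine(facts)       # static-analysis pass
        explored = model_check(refined)      # model-checking pass
        if explored <= facts:                # nothing new: fixed point
            return facts
        facts |= explored

# Toy stand-ins: each pass derives one-step successors of known facts.
def static_refine(facts):
    return facts | {f + 1 for f in facts if f < 3}

def model_check(facts):
    return facts | {f * 2 for f in facts if f < 5}

print(sorted(combined_analysis({0}, static_refine, model_check)))
```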

  20. An Extensive Study on the Future of Modeling in Software Development

    OpenAIRE

    Avadhesh Kumar Gupta; Satish Kumar

    2014-01-01

    In recent years, software modeling has received much attention in software research and development, owing to its demonstrated ability to decrease the time and cost of software development and to improve the overall quality of the development process in terms of understanding, communication, correctness, traceability, and developer productivity. Modeling has great potential to become the mainstream of software development, especially after using analysis and design phase to im...

  1. Using software metrics and software reliability models to attain acceptable quality software for flight and ground support software for avionic systems

    Science.gov (United States)

    Lawrence, Stella

    1992-01-01

    This paper is concerned with methods of measuring and developing quality software. Reliable flight and ground support software is a highly important factor in the successful operation of the space shuttle program. Reliability is probably the most important of the characteristics inherent in the concept of 'software quality'. It is the probability of failure free operation of a computer program for a specified time and environment.

  2. Accurate Estimation of Target amounts Using Expanded BASS Model for Demand-Side Management

    Science.gov (United States)

    Kim, Hyun-Woong; Park, Jong-Jin; Kim, Jin-O.

    2008-10-01

    The electricity demand in Korea has rapidly increased along with steady economic growth since the 1970s. Korea has therefore actively pursued not only SSM (Supply-Side Management) but also DSM (Demand-Side Management) activities, to reduce the investment cost of generating units and to save electricity supply costs through the enhancement of whole national energy utilization efficiency. However, research on rebates, which influence the success or failure of DSM programs, is not sufficient. This paper mathematically models an expanded Bass model that considers rebates, which influence penetration amounts for a DSM program. To reflect the rebate effect more precisely, the pricing function used in the expanded Bass model directly reflects the response of potential participants to the rebate level.
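
    For reference, the classic Bass diffusion model underlying the expansion has the closed form below; the rebate multiplier shown is a hypothetical placeholder, not the paper's pricing function:

```python
import math

def bass_adopters(t, m, p, q):
    """Cumulative adopters at time t in the classic Bass model.
    m: market potential, p: innovation coeff., q: imitation coeff."""
    e = math.exp(-(p + q) * t)
    return m * (1 - e) / (1 + (q / p) * e)

def expanded_bass(t, m, p, q, rebate, alpha=0.5):
    """Illustrative expansion: rebate scales effective market potential.
    The multiplier (1 + alpha*rebate) is a placeholder assumption."""
    return bass_adopters(t, m * (1 + alpha * rebate), p, q)

for rebate in (0.0, 0.2, 0.4):
    n = expanded_bass(t=5, m=10000, p=0.03, q=0.4, rebate=rebate)
    print(f"rebate {rebate:.0%}: {n:,.0f} participants after 5 years")
```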

  3. Charging Customers or Making Profit? Business Model Change in the Software Industry

    National Research Council Canada - National Science Library

    Margit Malmmose Peyton; Rainer Lueg; Sevar Khusainova; Patrick Sønderskov Iversen; Seth Boampong Panti

    2014-01-01

    ...), and software-specific frameworks for Business Models have emerged (Popp, 2011; Rajala et al., 2003; Rajala et al., 2004; Stahl, 2004). This paper attempts to illustrate Business Model change in the software industry...

  4. Design of Multithreaded Software The Entity-Life Modeling Approach

    CERN Document Server

    Sandén, Bo I

    2011-01-01

    This book assumes familiarity with threads (in a language such as Ada, C#, or Java) and introduces the entity-life modeling (ELM) design approach for certain kinds of multithreaded software. ELM focuses on "reactive systems," which continuously interact with the problem environment. These "reactive systems" include embedded systems, as well as such interactive systems as cruise controllers and automated teller machines.Part I covers two fundamentals: program-language thread support and state diagramming. These are necessary for understanding ELM and are provided primarily for reference. P

  5. Modeling of ultrasonic processes utilizing a generic software framework

    Science.gov (United States)

    Bruns, P.; Twiefel, J.; Wallaschek, J.

    2017-06-01

    Modeling of ultrasonic processes is typically characterized by a high degree of complexity. Different domains and size scales must be regarded, so that it is rather difficult to build up a single detailed overall model. Developing partial models is a common approach to overcome this difficulty. In this paper a generic but simple software framework is presented which allows arbitrary partial models to be coupled by slave modules with well-defined interfaces and a master module for coordination. Two examples are given to present the developed framework. The first one is the parameterization of a load model for ultrasonically-induced cavitation. The piezoelectric oscillator, its mounting, and the process load are described individually by partial models. These partial models are then coupled using the framework. The load model is composed of spring-damper elements which are parameterized by experimental results. In the second example, the ideal mounting position for an oscillator utilized in ultrasonic-assisted machining of stone is determined. Partial models for the ultrasonic oscillator, its mounting, the simplified contact process, and the workpiece's material characteristics are presented. For both applications, input and output variables are defined to meet the requirements of the framework's interface.
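
    The master/slave coupling idea can be miniaturized as follows (hypothetical interface and toy dynamics; the actual framework defines richer interfaces for the ultrasonic partial models):

```python
class Slave:
    """Interface every partial model must implement."""
    def step(self, t, inputs):           # advance one step, return outputs
        raise NotImplementedError

class Oscillator(Slave):
    def step(self, t, inputs):
        load = inputs.get("load_force", 0.0)
        return {"velocity": 1.0 - 0.01 * load}   # toy dynamics

class ProcessLoad(Slave):
    def step(self, t, inputs):
        v = inputs.get("velocity", 0.0)
        return {"load_force": 5.0 * abs(v)}      # toy load model

class Master:
    """Coordinates slaves: routes each slave's outputs to the others."""
    def __init__(self, slaves):
        self.slaves, self.signals = slaves, {}
    def run(self, steps, dt=1e-3):
        for i in range(steps):
            for slave in self.slaves:
                self.signals.update(slave.step(i * dt, self.signals))
        return self.signals

print(Master([Oscillator(), ProcessLoad()]).run(steps=100))
```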

  6. A TAXONOMY OF CLOSED AND OPEN SOURCE SOFTWARE INDUSTRY BUSINESS MODELS

    OpenAIRE

    THOMAS A. HEMPHILL

    2006-01-01

    This paper explores the closed source/open source software model dialectic and presents a conceptual framework of an emerging "both source" software business model useful for formulating firm technology strategy. The author reviews the academic literature and popular business press, extracting definitions and concepts underlying these software models; explains the concept of a business model and makes the case for the knowledge-based view of the firm as the approach to software industry busin...

  7. Practical Application of Model Checking in Software Verification

    Science.gov (United States)

    Havelund, Klaus; Skakkebaek, Jens Ulrik

    1999-01-01

    This paper presents our experiences in applying JAVA PATHFINDER (JPF), a recently developed JAVA-to-SPIN translator, to the finding of synchronization bugs in a Chinese Chess game server application written in JAVA. We give an overview of JPF and the subset of JAVA that it supports, and describe the abstraction and verification of the game server. Finally, we analyze the results of the effort. We argue that abstraction by under-approximation is necessary for obtaining sufficiently small models for verification purposes; that user guidance is crucial for effective abstraction; and that current model checkers do not conveniently support the computational models of software in general and JAVA in particular.

  8. A novel breast software phantom for biomechanical modeling of elastography.

    Science.gov (United States)

    Bhatti, Syeda Naema; Sridhar-Keralapura, Mallika

    2012-04-01

    In developing breast imaging technologies, testing is done with phantoms. Physical phantoms are normally used, but their size, shape, composition, and detail cannot be modified readily. These difficulties can be avoided by creating a software breast phantom. Researchers have created software breast phantoms using geometric and/or mathematical methods for applications like image fusion. The authors report a 3D software breast phantom that was built using a mechanical design tool to investigate the biomechanics of elastography using finite element modeling (FEM). The authors propose this phantom as an intermediate assessment tool for elastography simulation, for use after testing with commonly used phantoms and before clinical testing. The authors design the phantom to be flexible in both the breast geometry and the biomechanical parameters, to make it a useful tool for elastography simulation. The authors developed the 3D software phantom using a mechanical design tool, based on illustrations of normal breast anatomy. The software phantom does not use geometric primitives or imaging data. The authors discuss how to create this phantom and how to modify it. The authors demonstrate a typical elastography experiment of applying a static stress to the top surface of the breast just above a simulated tumor, and calculate normal strains in 3D and in 2D with plane strain approximations with linear solvers. In particular, they investigate contrast transfer efficiency (CTE) by designing a parametric study based on location, shape, and stiffness of simulated tumors. The authors also compare their findings to a commonly used elastography phantom. The 3D breast software phantom is flexible in shape, size, and location of tumors, glandular to fatty content, and the ductal structure. Residual modulus, maps, and profiles served as a guide to optimize meshing of this geometrically nonlinear phantom for biomechanical modeling of elastography. At best, low residues (around 1-5 kPa) were

  9. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae

    2008-01-01

    ... communication and constraints, using computational blocks and aggregates for both discrete and continuous behaviour, different interconnection and execution disciplines for event-based and time-based controllers, and so on, to encompass the demands for more functionality, at even lower prices, and with opposite ... of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, plant and controller behaviour is simulated using graphical blocks to represent mathematical and logical ... to be analyzed. One way of doing that is to integrate the model back into Simulink S-functions in wrapper files, and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set ...

  10. BioSTAR, a New Biomass and Yield Modeling Software

    Science.gov (United States)

    Kappas, M.; Degener, J.; Bauboeck, R.

    2013-12-01

    BioSTAR (Biomass Simulation Tool for Agricultural Resources) is a new crop model which has been developed at the University of Göttingen for the assessment of agricultural biomass potentials in Lower Saxony, Germany. Lower Saxony is a major agricultural producer in Germany and in the EU, and biogas facilities which use agricultural crops, manure, or both have seen a strong boom in the last decade. Modeling the potentials of these agricultural bioenergy crops was the objective of developing the BioSTAR model. BioSTAR is kept simple enough to be usable even for non-scientific users, e.g. staff in planning offices or farmers. The software of the model is written in Java and uses a Microsoft Access database connection to read its input data and write its output data. In this sense the software architecture is something entirely new as far as existing crop models are concerned. The database connection enables very fast editing of the various data sources which are needed to run a crop simulation and fosters the organization of this data. Due to the software setup, the number of individual sites which can be processed with a few clicks is only limited by the maximum size of an Access database (2 GB), which allows datasets of 10^5 sites or more to be stored and processed. Data can easily be copied or imported from Excel. Capabilities of the crop model are: simulation of single- or multiple-year crop growth with total biomass production, evapotranspiration, the soil water budget of a 16-layered soil profile and the nitrogen budget. The original growth engine of the model was carbon based (Azam-Ali et al., 1994), but a radiation use efficiency engine and two transpiration-based growth engines were added at a later point. Before each simulation run, the user can choose between these four growth engines and four different ET0 methods, or use an ensemble of them. To date (07/2013), the model has been calibrated for several winter and spring cereals, canola, maize
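
    One of the growth engines mentioned, radiation use efficiency (RUE), reduces to a short daily loop; the sketch below uses the generic textbook form with made-up parameter values, not BioSTAR's calibration:

```python
import math

# Daily biomass accumulation with a radiation-use-efficiency growth engine:
#   dB = RUE * fPAR * incident PAR, with fPAR from leaf area (Beer's law).
RUE = 3.0        # g dry matter per MJ PAR (assumed)
K = 0.6          # canopy light-extinction coefficient (assumed)
SLA = 0.02       # m^2 leaf per g biomass (assumed)

biomass, lai = 1.0, 0.02                 # g/m^2 and leaf area index
for day in range(120):
    par = 8.0 + 4.0 * math.sin(math.pi * day / 120)  # MJ/m2/day, stylized
    fpar = 1.0 - math.exp(-K * lai)                  # fraction intercepted
    biomass += RUE * fpar * par                      # g/m2/day growth
    lai = biomass * SLA * 0.5            # half of biomass assumed as leaf

print(f"above-ground biomass after 120 days: {biomass / 100:.1f} t/ha")
```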

  11. Modeling Psychologists' Ethical Intention: Application of an Expanded Theory of Planned Behavior.

    Science.gov (United States)

    Ferencz-Kaddari, Michall; Shifman, Annie; Koslowsky, Meni

    2016-06-01

    At the core of all therapeutic and medical practice lies ethics. By applying an expanded Ajzen's Theory of Planned Behavior formulation, the present investigation tested a model for explaining psychologists' intention to behave ethically. In the pretest, dual relationships and money conflicts were seen as the most prevalent dilemmas. A total of 395 clinical psychologists filled out questionnaires containing either a dual relationship dilemma describing a scenario where a psychologist was asked to treat a son of a colleague or a money-focused dilemma where he or she was asked to treat a patient unable to pay for the service. Results obtained from applying the expanded Ajzen's model to each dilemma, generally, supported the study hypotheses. In particular, attitudes were seen as the most important predictor in both dilemmas followed by a morality component, defined here as the commitment of the psychologist to the patient included here as an additional predictor in the model. The expanded model provided a better understanding of ethical intention. Practical implications were also discussed. © The Author(s) 2016.

  12. Induction of continuous expanding infrarenal aortic aneurysms in a large porcine animal model

    DEFF Research Database (Denmark)

    Kloster, Brian Ozeraitis; Lund, Lars; Lindholt, Jes S.

    2015-01-01

    BACKGROUND: A large animal model with a continuous expanding infrarenal aortic aneurysm gives access to a more realistic AAA model with anatomy and physiology similar to humans, and thus allows for new experimental research in the natural history and treatment options of the disease. METHODS: 10 pigs ... ultrasound; hereafter the pigs were euthanized for inspection and AAA wall sampling for histological analysis. RESULTS: In group A, all pigs developed continuously expanding AAAs with a mean increase in AP-diameter to 16.26 ± 0.93 mm, equivalent to a 57% increase. In group B the AP-diameters increased to 11.33 ± 0.13 mm, equivalent to 9.3%, which was significantly less than in group A (p < ...). Histology showed more or less complete resolution of the elastic tissue in the tunica media in group A. The most ...


  14. Mathematical model and software for control of commissioning blast furnace

    Science.gov (United States)

    Spirin, N. A.; Onorin, O. P.; Shchipanov, K. A.; Lavrov, V. V.

    2016-09-01

    Blowing-in is the starting period of blast furnace operation after construction or major repair. The current approximate methods of blowing-in burden analysis are based on the blowing-in practice of previously commissioned blast furnaces. This area is theoretically underexplored; there are no common scientifically based methods for selection of the burden composition and blast parameters. The purpose of this paper is the development and scientific substantiation of methods for selection of the burden composition and blast parameters in the blast furnace during the blowing-in period. The research methods are based on the physical regularities of the main processes running in the blast furnace, system analysis, and the application of modern principles for the development and construction of mathematical models, algorithms and software designed for automated control of complex production processes in metallurgy. The research yielded the following results: 1. A set of mathematical models for analysis of burden arrangement throughout the height of the blast furnace and for selection of optimal blast and gas-dynamic parameters has been developed. 2. General principles for selection of the blowing-in burden composition and blast and gas-dynamic parameters have been set up. 3. The software for the engineering and process staff of the blast furnace has been developed and introduced in the industry.

  15. A framework for scientific data modeling and automated software development.

    Science.gov (United States)

    Fogh, Rasmus H; Boucher, Wayne; Vranken, Wim F; Pajon, Anne; Stevens, Tim J; Bhat, T N; Westbrook, John; Ionides, John M C; Laue, Ernest D

    2005-04-15

    The lack of standards for the storage and exchange of data is a serious hindrance to the large-scale data deposition, data mining and program interoperability that are becoming increasingly important in bioinformatics. The problem lies not only in defining and maintaining the standards, but also in convincing scientists and application programmers with a wide variety of backgrounds and interests to adhere to them. We present a UML-based programming framework for the modeling of data and the automated production of software to manipulate that data. Our approach allows one to make an abstract description of the structure of the data used in a particular scientific field and then use it to generate fully functional computer code for data access and input/output routines for data storage, together with accompanying documentation. This code can be generated simultaneously for different programming languages from a single model, together with, for example, format descriptions and I/O libraries for XML and various relational databases. The framework is entirely general and could be applied in any subject area. We have used this approach to generate a data exchange standard for structural biology and analysis software for macromolecular NMR spectroscopy. The framework is available under the GPL license, and the data exchange standard with generated subroutine libraries under the LGPL license. Both may be found at http://www.ccpn.ac.uk; http://sourceforge.net/projects/ccpn. Contact: ccpn@mole.bio.cam.ac.uk.
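
    The generate-code-from-a-model idea can be miniaturized as a toy generator emitting Python classes (the CCPN framework itself generates full subroutine libraries and I/O code in several languages from UML; the entities below are hypothetical):

```python
# Toy model description: entity name -> {attribute: type}.
model = {
    "Spectrum": {"name": "str", "dimension_count": "int"},
    "Peak": {"position": "float", "height": "float"},
}

def generate_class(entity, attrs):
    lines = [f"class {entity}:"]
    args = ", ".join(f"{a}: {t}" for a, t in attrs.items())
    lines.append(f"    def __init__(self, {args}):")
    for attr in attrs:
        lines.append(f"        self.{attr} = {attr}")
    return "\n".join(lines)

source = "\n\n".join(generate_class(e, a) for e, a in model.items())
print(source)            # inspect the generated data-access code
exec(source)             # ...or load the generated classes directly
print(Peak(position=7.2, height=1.5e6).position)
```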

  16. Software Model Checking of ARINC-653 Flight Code with MCP

    Science.gov (United States)

    Thompson, Sarah J.; Brat, Guillaume; Venet, Arnaud

    2010-01-01

    The ARINC-653 standard defines a common interface for Integrated Modular Avionics (IMA) code. In particular, ARINC-653 Part 1 specifies a process- and partition-management API that is analogous to POSIX threads, but with certain extensions and restrictions intended to support the implementation of high-reliability flight code. MCP is a software model checker, developed at NASA Ames, that provides capabilities for model checking C and C++ source code. In this paper, we present recent work aimed at implementing extensions to MCP that support ARINC-653, and we discuss the challenges and opportunities that consequently arise. Providing support for ARINC-653's time and space partitioning is nontrivial, though there are implicit benefits for partial order reduction possible as a consequence of the API's strict interprocess communication policy.

  17. Evolution of the 'Trick' Dynamic Software Executive and Model Libraries for Reusable Flight Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In response to a need for cost-effective small satellite missions, Odyssey Space Research is proposing the development of a common flight software executive and a...

  18. Study of the Warranty Cost Model for Software Reliability with an Imperfect Debugging Phenomenon

    OpenAIRE

    WILLIAMS, D. R. PRINCE

    2014-01-01

    Software reliability is one of the most important characteristics of software quality. Its measurement and management technologies employed during the software life-cycle are essential for producing and maintaining quality/reliable software systems. Herein, we discuss a modified approach to calculating the delivery cost of a software product, when warranty is to be provided, with an imperfect debugging phenomenon. Unlike existing cost models, here the strategy was to consider mainte...
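
    Warranty cost models of this kind typically combine a software reliability growth model with per-failure costs; a generic sketch using the Goel-Okumoto NHPP mean value function follows (the paper's imperfect-debugging modification is not reproduced, and all cost coefficients are assumptions):

```python
import math

def expected_failures(t, a, b):
    """Goel-Okumoto NHPP mean value function m(t) = a * (1 - e^(-b*t))."""
    return a * (1 - math.exp(-b * t))

def delivery_cost(t_release, t_warranty, a=100, b=0.05,
                  c_test=1.0, c_fix_test=0.5, c_fix_field=5.0):
    """Testing cost + in-test fixes + (more expensive) warranty fixes.
    All cost coefficients are illustrative assumptions."""
    in_test = expected_failures(t_release, a, b)
    in_field = expected_failures(t_release + t_warranty, a, b) - in_test
    return c_test * t_release + c_fix_test * in_test + c_fix_field * in_field

# Sweep release times to find the cheapest point under a 50-unit warranty.
best = min(range(10, 200, 5), key=lambda t: delivery_cost(t, 50))
print(f"cost-minimizing release time ~ {best}, "
      f"cost = {delivery_cost(best, 50):.1f}")
```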

  19. A Software Process Framework for the Software Engineering Institute’s (SEI) Capability Maturity Model

    Science.gov (United States)

    1994-09-01

    Library of software process-related documentation. (L3-45, A4, 5) Organization's software process database. (L3-39, C1, 4) Organization's standard ... whether to terminate the effort, proceed with broad-scale implementation of the technology, or replan and continue the pilot effort. Appropriate new ... (table continued from the previous page: Output / Org. Output / References) Decision whether to terminate the effort (pilot effort for improving technology

  20. A Step towards Software Corrective Maintenance: Using RCM model

    OpenAIRE

    Shahid Hussain; Bashir Ahmad; Muhammad Zubair Asghar

    2009-01-01

    Since the early stages of software engineering, the selection and enforcement of appropriate standards has remained a challenge for stakeholders throughout the entire software development cycle, but it can reduce the effort required in the software maintenance phase. Corrective maintenance is the reactive modification of a software product performed after delivery to correct discovered faults. Studies conducted by different researchers reveal that approximately 50 to 75% of the effort is spent on ...


  2. An investigation of modelling and design for software service applications.

    Science.gov (United States)

    Anjum, Maria; Budgen, David

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the 'design model'. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model.

  3. Increasing the reliability of ecological models using modern software engineering techniques

    Science.gov (United States)

    Robert M. Scheller; Brian R. Sturtevant; Eric J. Gustafson; Brendan C. Ward; David J. Mladenoff

    2009-01-01

    Modern software development techniques are largely unknown to ecologists. Typically, ecological models and other software tools are developed for limited research purposes, and additional capabilities are added later, usually in an ad hoc manner. Modern software engineering techniques can substantially increase scientific rigor and confidence in ecological models and...

  4. TriBITS lifecycle model. Version 1.0, a lean/agile software lifecycle model for research-based computational science and engineering and applied mathematical software.

    Energy Technology Data Exchange (ETDEWEB)

    Willenbring, James M.; Bartlett, Roscoe Ainsworth (Oak Ridge National Laboratory, Oak Ridge, TN); Heroux, Michael Allen

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  5. A generic data-driven software reliability model with model mining technique

    Energy Technology Data Exchange (ETDEWEB)

    Yang Bo, E-mail: yangbo@uestc.edu.c [School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu, Sichuan 611731 (China); Li Xiang; Xie Min [Department of Industrial and Systems Engineering, National University of Singapore, 119260 Singapore (Singapore); Tan Feng [Department of Industrial Engineering, University of Electronic Science and Technology of China, Chengdu, Sichuan 611731 (China)

    2010-06-15

    Complex systems contain both hardware and software, and software reliability becomes more and more essential in system reliability context. In recent years, data-driven software reliability models (DDSRMs) with multiple-delayed-input single-output (MDISO) architecture have been proposed and studied. For these models, the software failure process is viewed as a time series and it is assumed that a software failure is strongly correlated with the most recent failures. In reality, this assumption may not be valid and hence the model performance would be affected. In this paper, we propose a generic DDSRM with MDISO architecture by relaxing this unrealistic assumption. The proposed model can cater for various failure correlations and existing DDSRMs are special cases of the proposed model. A hybrid genetic algorithm (GA)-based algorithm is developed which adopts the model mining technique to discover the correlation of failures and to obtain optimal model parameters. Numerical examples are presented and results reveal that the proposed model outperforms existing DDSRMs.
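
    The MDISO idea, predicting the next inter-failure time from several delayed past values, can be emulated with a small autoregressive sketch (plain least squares in place of the paper's GA-based model mining, with made-up failure data):

```python
import numpy as np

# Hypothetical inter-failure times (growing as reliability improves).
times = np.array([5, 7, 6, 9, 11, 10, 14, 15, 18, 17, 21, 24], float)

def fit_mdiso(series, lags=3):
    """Multiple-delayed-input single-output: y_t from y_{t-1..t-lags}."""
    X = np.column_stack([series[i:len(series) - lags + i]
                         for i in range(lags)])
    y = series[lags:]
    w, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(y)), X]), y,
                            rcond=None)
    return w

w = fit_mdiso(times)
last = times[-3:]                        # the three most recent delays
prediction = w[0] + last @ w[1:]
print(f"predicted next inter-failure time: {prediction:.1f}")
```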

  6. A framework to introduce flexibility in crop modelling: from conceptual modelling to software engineering and back

    NARCIS (Netherlands)

    Adam, M.Y.O.

    2010-01-01

    Keywords: model structure, uncertainty, modularity, software design patterns, good modelling practices, crop growth and development. This thesis is an account of the development and use of a framework to introduce flexibility in crop modelling. The construction of such a framework is supported by

  7. Understanding Cervical Cancer Screening Intentions Among Latinas Using An Expanded Theory of Planned Behavior Model

    Science.gov (United States)

    Roncancio, Angelica M.; Ward, Kristy K.; Fernandez, Maria E.

    2016-01-01

    We examined the utility of an expanded Theory of Planned Behavior (TPB) model in predicting cervical cancer screening intentions among Latinas. The model included acculturation and past cervical cancer screening behavior along with attitude, subjective norms, and perceived behavioral control. This cross-sectional study included a sample of 206 Latinas who responded to a self-administered survey. Structural equation modeling was employed to test the expanded TPB model. Acculturation (p = .025) and past screening behavior (p = .001), along with attitude (p = .019), subjective norms (p = .028), and perceived behavioral control (p = .014), predicted the intention to be screened for cervical cancer. Our findings suggest that the TPB is a useful model for understanding cervical cancer screening intentions among Latinas when both past behavior and culture are included. This highlights the influence of culture on behavior and indicates a need to develop culturally sensitive, theory-based interventions to encourage screening and reduce cervical cancer-related health disparities in Latinas. PMID:23930898

  8. An information model for use in software management estimation and prediction

    Science.gov (United States)

    Li, Ningda R.; Zelkowitz, Marvin V.

    1993-01-01

    This paper describes the use of cluster analysis for determining the information model within collected software engineering development data at the NASA/GSFC Software Engineering Laboratory. We describe the Software Management Environment tool that allows managers to predict development attributes during early phases of a software project and the modifications we propose to allow it to develop dynamic models for better predictions of these attributes.
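    As an illustration of the cluster-analysis idea (not the SME tool itself), the following sketch groups completed projects by early-phase measures and predicts an attribute for a new project from its nearest cluster; it assumes scikit-learn is available, and all numbers are invented.

      # Group past projects by early-phase measures, then estimate a
      # development attribute for a new project from its nearest cluster.
      import numpy as np
      from sklearn.cluster import KMeans

      # columns: hypothetical early-phase measures (size in KLOC, staff count)
      past_projects = np.array([[20, 4], [25, 5], [80, 12], [95, 14], [50, 8]])
      effort_months = np.array([30.0, 36.0, 120.0, 140.0, 70.0])

      km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(past_projects)

      new_project = np.array([[85, 13]])
      cluster = km.predict(new_project)[0]
      estimate = effort_months[km.labels_ == cluster].mean()
      print(f"predicted effort: {estimate:.1f} staff-months")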

  9. Determination of expander apparatus displacements and contact pressures on the mucosa using FEM modelling considering mandibular asymmetries.

    Science.gov (United States)

    Braga, Iracema Utsch; Rocha, Daniel Neves; Utsch, Ricardo Luiz; Las Casas, Estevam Barbosa; Andrade, Roberto Márcio; Jorge, Renato Natal; Braga, Rafael Utsch

    2013-01-01

    This paper presents a method for predicting forces and displacements in the expansion screw of a modified mandibular Schwarz appliance and the contact pressure distributions on the mucosa during malocclusion treatment. A 3D finite element biomechanical model of the complete mandible-mucosa-apparatus set was built using computerised tomographic images of a patient's mandible and constructive solid geometry software. An iterative procedure was developed to handle a boundary condition that takes the mandibular asymmetries into account. The results showed asymmetries in the contact pressure distributions that indicated the patient's malocclusion diagnosis with precision. In vivo measurements of contact pressure using piezoelectric sensors agreed with the computational results. It was shown that the left and right ends of the expansion screw move differently with respect to the patient's mandible, even though the expansion screw has an opening mechanism intended to ensure equal stretching at both ends. The contact pressures between the apparatus and the mucosa vary linearly with the applied forces, which can simplify the analysis of the biomechanical behaviour of the mandibular expander apparatus. The biomechanical modelling proposed in this paper can be a useful tool to improve malocclusion treatment, safely avoiding forces on live structures beyond biological tolerance, which could have traumatic effects.

  10. COZOIL: coastal zone oil spill model (for microcomputers). Software

    Energy Technology Data Exchange (ETDEWEB)

    Roy, J.

    1988-01-01

    COZOIL, developed and tested for use by the Department of the Interior Minerals Management Service, was developed as a generic, computer-based model for the simulation of oil spills entering the surf zone, impacting a shoreline, and transforming through time as a result of physical and chemical processes. COZOIL builds on previous oil-spill trajectory and fates models, which typically end with contact at the coastline. It also includes explicit representation of as many known, active processes as possible, partitioning oil quantities among air, water surface, water column, and the substrate/groundwater systems in or near the surf zone. Eight shoreline types with varying oil holding capacities and seven oil types encompassing a range of viscosities can be simulated. Software Description: The model is written in the FORTRAN 77 programming language for implementation on an IBM PC or compatible microcomputer using the MS-DOS or PC-DOS operating system. An 8087 math coprocessor, a hard disk and 640K RAM are required.

  11. Software development infrastructure for the HYBRID modeling and simulation project

    Energy Technology Data Exchange (ETDEWEB)

    Epiney, Aaron S. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kinoshita, Robert A. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kim, Jong Suk [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Greenwood, M. Scott [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    One of the goals of the HYBRID modeling and simulation project is to assess the economic viability of hybrid systems in a market that contains renewable energy sources like wind. The idea is that the nuclear plant can sell non-electric energy, which cushions (at least partially) the volatility introduced by the renewable energy sources. This system is currently modeled in the Modelica programming language. To assess the economics of the system, an optimization procedure searches for the minimal cost of electricity production. The RAVEN code is used as a driver for the whole problem. It is assumed that at this stage, the HYBRID modeling and simulation framework can be classified as non-safety "research and development" software. The associated quality level is Quality Level 3 software, which imposes low requirements on quality control, testing and documentation. The quality level could change as application development continues. Despite the low quality requirement level, a workflow for the HYBRID developers has been defined that includes a coding standard and some documentation and testing requirements. The repository performs automated unit testing of contributed models. The automated testing is achieved via an open-source Python script called BuildingsPy from Lawrence Berkeley National Lab. BuildingsPy runs Modelica simulation tests using Dymola in an automated manner and generates and runs unit tests from Modelica scripts written by developers. In order to ensure effective communication between the different national laboratories, a biweekly videoconference has been set up, where developers can report their progress and issues. In addition, periodic face-to-face meetings are organized to discuss high-level strategy decisions with management. A second means of communication is the developer email list, to which anybody can send emails that will be received by the collective of the developers and managers.
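    A minimal sketch of how such a BuildingsPy-driven test run is typically scripted is shown below; the exact Tester options and return value depend on the BuildingsPy version installed, so treat the calls as assumptions to check against the BuildingsPy documentation.

      # Drive BuildingsPy's regression tester from Python: it generates unit
      # tests from the developers' Modelica scripts and runs them with Dymola.
      import buildingspy.development.regressiontest as regressiontest

      ut = regressiontest.Tester(tool="dymola")  # use Dymola as simulation tool
      ut.batchMode(True)                         # suppress interactive prompts
      retval = ut.run()                          # generate and run the unit tests
      exit(retval)                               # nonzero if any test failed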

  12. Exponentiated Weibull distribution approach based inflection S-shaped software reliability growth model

    Directory of Open Access Journals (Sweden)

    B.B. Sagar

    2016-09-01

    Full Text Available The aim of this paper is to estimate the number of defects in software and remove them successfully. The paper incorporates a Weibull distribution approach into an inflection S-shaped Software Reliability Growth Model (SRGM); in this combination, a two-parameter Weibull distribution is used. The Relative Prediction Error (RPE) is calculated as the validity criterion of the developed model. Experimental results on actual data from five data sets are compared with two other existing models, and they show that the proposed software reliability growth model yields better estimates of the defects to be removed. The paper thus presents a software reliability growth model that combines the features of the Weibull distribution and the inflection S-shaped SRGM to estimate the defects of a software system, helping researchers and software industries to develop highly reliable software products.
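    For reference, the classic inflection S-shaped SRGM that the paper builds on has the mean value function m(t) = a(1 - e^(-bt)) / (1 + beta*e^(-bt)); the sketch below evaluates it together with the RPE validity criterion, using purely illustrative parameter values (the paper's Weibull extension is not reproduced here).

      # Inflection S-shaped SRGM mean value function and the Relative
      # Prediction Error used as the validity criterion. Parameters invented.
      import math

      def m(t, a, b, beta):
          """Expected cumulative number of faults detected by time t."""
          e = math.exp(-b * t)
          return a * (1.0 - e) / (1.0 + beta * e)

      def rpe(predicted, actual):
          """Relative Prediction Error: (predicted - actual) / actual."""
          return (predicted - actual) / actual

      a, b, beta = 120.0, 0.15, 2.0   # total faults, detection rate, inflection factor
      observed_at_20 = 70             # hypothetical observed fault count at t = 20
      print(m(20, a, b, beta), rpe(m(20, a, b, beta), observed_at_20))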

  13. Software Development Risk Management Model – A Goal Driven Approach

    OpenAIRE

    Islam, Shareeful

    2009-01-01

    Software development projects often face unanticipated problems which pose potential risks within the development environment. Controlling these risks, which arise from both technical and non-technical development components, from the early stages of development is crucial to arriving at a successful project. Therefore, software development risk management is becoming recognized as a best practice in the software industry for reducing these risks before they occur. This the...

  14. Young women's anterior cruciate ligament injuries: an expanded model and prevention paradigm.

    Science.gov (United States)

    Elliot, Diane L; Goldberg, Linn; Kuehl, Kerry S

    2010-05-01

    Anterior cruciate ligament (ACL) injuries among young female athletes occur at rates three- to eight-times greater than in male competitors and, in general, females experience more sports injuries than males, when balanced for activity and playing time. ACL injuries are a particular concern, as they result in immediate morbidity, high economic costs and may have long-term adverse effects. While several closely monitored ACL injury preventive programmes have been effective, those efforts have not been uniformly protective nor have they achieved widespread use. To date, ACL injury prevention has focused on neuromuscular and anatomical factors without including issues relating more broadly to the athlete. Coincident with greater female sport participation are other influences that may heighten their injury risk. We review those factors, including early single sport specialization, unhealthy dietary behaviours, chronic sleep deprivation and higher levels of fatigue, substance use and abuse, and psychological issues. We augment existing models of ACL injury with these additional dimensions. The proposed expanded injury model has implications for designing injury prevention programmes. High school athletic teams are natural settings for bonded youth and influential coaches to promote healthy lifestyles, as decisions that result in better athletes also promote healthy lifestyles. As an example of how sport teams could be vehicles to address an expanded injury model, we present an existing evidenced-based sport team-centered health promotion and harm reduction programme for female athletes. Widening the lens on factors influencing ACL injury expands the prevention paradigm to combine existing training with activities to promote psychological well-being and a healthy lifestyle. If developed and shown to be effective, those programmes might better reduce injuries and, in addition, provide life skills that would benefit young female athletes both on and off the playing field.

  15. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: Earth System Modeling Software Framework Survey

    Science.gov (United States)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn; Zukor, Dorothy (Technical Monitor)

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document surveys numerous software frameworks for potential use in Earth science modeling. Several frameworks are evaluated in depth, including Parallel Object-Oriented Methods and Applications (POOMA), Cactus (from the relativistic physics community), Overture, Goddard Earth Modeling System (GEMS), the National Center for Atmospheric Research Flux Coupler, and UCLA/UCB Distributed Data Broker (DDB). Frameworks evaluated in less detail include ROOT, Parallel Application Workspace (PAWS), and Advanced Large-Scale Integrated Computational Environment (ALICE). A host of other frameworks and related tools are referenced in this context. The frameworks are evaluated individually and also compared with each other.

  16. Model-driven design of simulation support for the TERRA robot software tool suite

    NARCIS (Netherlands)

    Lu, Zhou; Bezemer, M.M.; Broenink, Johannes F.

    2015-01-01

    Model-Driven Development (MDD) – based on the concepts of model, meta-model and model transformation – is an approach to develop predictable and re- liable software for Cyber-Physical Systems (CPS). The work presented here concerns a methodology to design simulation software based on MDD techniques,

  17. Experimental Evaluation of a Serious Game for Teaching Software Process Modeling

    Science.gov (United States)

    Chaves, Rafael Oliveira; von Wangenheim, Christiane Gresse; Furtado, Julio Cezar Costa; Oliveira, Sandro Ronaldo Bezerra; Santos, Alex; Favero, Eloi Luiz

    2015-01-01

    Software process modeling (SPM) is an important area of software engineering because it provides a basis for managing, automating, and supporting software process improvement (SPI). Teaching SPM is a challenging task, mainly because it lays great emphasis on theory and offers few practical exercises. Furthermore, as yet few teaching approaches…

  18. Using UML Modeling to Facilitate Three-Tier Architecture Projects in Software Engineering Courses

    Science.gov (United States)

    Mitra, Sandeep

    2014-01-01

    This article presents the use of a model-centric approach to facilitate software development projects conforming to the three-tier architecture in undergraduate software engineering courses. Many instructors intend that such projects create software applications for use by real-world customers. While it is important that the first version of these…

  19. Way of Working for Embedded Control Software using Model-Driven Development Techniques

    NARCIS (Netherlands)

    Bezemer, M.M.; Groothuis, M.A.; Brugali, D.; Schlegel, C.; Broenink, Johannes F.

    2011-01-01

    Embedded targets normally do not have much resources to aid developing and debugging the software. So model-driven development (MDD) is used for designing embedded software with a `first time right' approach. For such an approach, a good way of working (WoW) is required for embedded software

  20. A Model for Quality Optimization in Software Design Processes

    NARCIS (Netherlands)

    Noppen, J.A.R.; van den Broek, P.M.; Aksit, Mehmet

    The main objective of software engineers is to design and implement systems that satisfy all functional and non-functional requirements. Unfortunately, it is very difficult, or even generally impossible, to deliver a software system that satisfies all the requirements. Even more seriously, failures

  1. Pragmatic quality metrics for evolutionary software development models

    Science.gov (United States)

    Royce, Walker

    1990-01-01

    Due to the large number of product, project, and people parameters which impact large custom software development efforts, measurement of software product quality is a complex undertaking. Furthermore, the absolute perspective from which quality is measured (customer satisfaction) is intangible. While we probably can't say what the absolute quality of a software product is, we can determine the relative quality, the adequacy of this quality with respect to pragmatic considerations, and identify good and bad trends during development. While no two software engineers will ever agree on an optimum definition of software quality, they will agree that the most important perspective of software quality is its ease of change. We can call this flexibility, adaptability, or some other vague term, but the critical characteristic of software is that it is soft. The easier the product is to modify, the easier it is to achieve any other software quality perspective. This paper presents objective quality metrics derived from consistent lifecycle perspectives of rework which, when used in concert with an evolutionary development approach, can provide useful insight to produce better quality per unit cost/schedule or to achieve adequate quality more efficiently. The usefulness of these metrics is evaluated by applying them to a large, real world, Ada project.

  2. A multiobjective module-order model for software quality enhancement

    NARCIS (Netherlands)

    Khoshgoftaar, TM; Liu, Y; Seliya, N

    2004-01-01

    The knowledge, prior to system operations, of which program modules are problematic is valuable to a software quality assurance team, especially when there is a constraint on software quality enhancement resources. A cost-effective approach for allocating such resources is to obtain a prediction in

  3. Software Security and the "Building Security in Maturity" Model

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    Using the framework described in my book "Software Security: Building Security In" I will discuss and describe the state of the practice in software security. This talk is peppered with real data from the field, based on my work with several large companies as a Cigital consultant. As a discipline, software security has made great progress over the last decade. Of the sixty large-scale software security initiatives we are aware of, thirty-two---all household names---are currently included in the BSIMM study. Those companies among the thirty-two who graciously agreed to be identified include: Adobe, Aon, Bank of America, Capital One, The Depository Trust & Clearing Corporation (DTCC), EMC, Google, Intel, Intuit, McKesson, Microsoft, Nokia, QUALCOMM, Sallie Mae, Standard Life, SWIFT, Symantec, Telecom Italia, Thomson Reuters, VMware, and Wells Fargo. The BSIMM was created by observing and analyzing real-world data from thirty-two leading software security initiatives. The BSIMM can...

  4. Expanding versus non expanding universe

    CERN Document Server

    Alfonso-Faus, Antonio

    2012-01-01

    In cosmology the number of scientists using the framework of an expanding universe is very high. This model, the big-bang, is now overwhelmingly present in almost all aspects of society. It is the mainstream cosmology of today. A small number of scientists are researching the possibility of a non-expanding universe. The existence of these two groups, one very large and the other very small, is good proof of the use of the scientific method: it does not lead to absolute certainty. All models have to be permanently validated, falsified. Ockham's razor, a powerful philosophical tool, will probably change the number of scientists working in each of these groups. We present here a model where a big-bang is unnecessary. It ends, in a finite time, in a second INFLATION, or a disaggregation to infinity. We also discuss the possibilities of a non-expanding universe model. Only a few references will be cited, mainly concerned with our own work in the past, thus purposely avoiding citing the many thousands of ...

  5. Methodologic model to scheduling on service systems: a software engineering approach

    Directory of Open Access Journals (Sweden)

    Eduyn Ramiro Lopez-Santana

    2016-06-01

    Full Text Available This paper presents a software engineering approach to a research proposal to build an expert system for scheduling in service systems using methodologies and processes of software development. We use adaptive software development as the methodology for the software architecture, based on its description as a software metaprocess that characterizes the research process. We draw UML (Unified Modeling Language) diagrams to provide a visual model that describes the research methodology, in order to identify the actors, elements and interactions in the research process.

  6. The self-aware diabetic patient software agent model.

    Science.gov (United States)

    Wang, Zhanle; Paranjape, Raman

    2013-11-01

    This work presents a self-aware diabetic patient software agent for representing a human diabetic patient. To develop a 24h, stochastic and self-aware patient agent, we extend the original seminal work of Ackerman et al. [1] in creating a mathematical model of human blood glucose levels in three aspects. (1) We incorporate the stochastic and unpredictable effects of daily living. (2) The Ackerman model is extended into the period of night-time. (3) Patients' awareness of their own conditions is incorporated. Simulation results are quantitatively assessed to demonstrate the effectiveness of lifestyle management, such as adjusting the amount of food consumed, meal schedule, intensity of exercise and level of medication. In this work we show through the simulation that the average blood glucose can be reduced by as much as 51% due to careful lifestyle management. Self monitoring blood glucose is also quantitatively evaluated. The simulation results show that the average blood glucose is further dropped by 25% with the assistance of blood glucose samples. In addition, the blood glucose is perfectly controlled in the target range during the simulation period as a result of joint efforts of lifestyle management and self monitoring blood glucose. This study focuses on demonstrating how human patients' behavior, specifically lifestyle and self monitoring of blood glucose, affects blood glucose controls on a daily basis. This work does not focus on the insulin-glucose interaction of an individual human patient. Our conclusion is that this self-aware patient agent model is capable of adequately representing diabetic patients and of evaluating their dynamic behaviors. It can also be incorporated into a multi-agent system by introducing other healthcare components so that more interesting insights such as the healthcare quality, cost and performance can be observed. © 2013 Published by Elsevier Ltd.
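    The linear Ackerman-style core that the agent builds on can be sketched in a few lines of ODE integration; the coefficients and meal profile below are illustrative assumptions, not the paper's fitted values.

      # Linear Ackerman-type glucose regulation model: g and h are deviations
      # of blood glucose and net hormone level from their fasting baselines,
      # and meal(t) is an external glucose input. Coefficients are invented.
      import numpy as np
      from scipy.integrate import odeint

      m1, m2, m3, m4 = 0.03, 0.02, 0.05, 0.04

      def meal(t):
          return 2.0 if 60.0 <= t < 90.0 else 0.0   # glucose input, mg/dl/min

      def ackerman(y, t):
          g, h = y
          dg = -m1 * g - m2 * h + meal(t)
          dh = -m3 * h + m4 * g
          return [dg, dh]

      t = np.linspace(0, 360, 361)                  # six hours, 1-minute steps
      g, h = odeint(ackerman, [0.0, 0.0], t).T
      print(f"peak glucose deviation: {g.max():.1f} mg/dl")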

  7. Non-symmetric approach to single-screw expander and compressor modeling

    Science.gov (United States)

    Ziviani, Davide; Groll, Eckhard A.; Braun, James E.; Horton, W. Travis; De Paepe, M.; van den Broek, M.

    2017-08-01

    Single-screw type volumetric machines are employed both as compressors in refrigeration systems and, more recently, as expanders in organic Rankine cycle (ORC) applications. The single-screw machine is characterized by a central grooved rotor and two mating toothed starwheels that isolate the working chambers. One of the main features of such a machine is the simultaneous occurrence of the compression or expansion processes on both sides of the main rotor, which results in more balanced loading on the main shaft bearings with respect to twin-screw machines. However, the meshing between starwheels and main rotor is a critical aspect, as it heavily affects the volumetric performance of the machine. To allow flow interactions between the two sides of the rotor, a non-symmetric modelling approach has been established to obtain a more comprehensive model of the single-screw machine. The resulting mechanistic model includes in-chamber governing equations, leakage flow models, heat transfer mechanisms, and viscous and mechanical losses. Force and moment balances are used to estimate the loads on the main shaft bearings as well as on the starwheel bearings. An 11 kWe single-screw expander (SSE) adapted from an air compressor operating with R245fa as working fluid is used to validate the model. A total of 60 steady-state points at four different rotational speeds were collected to characterize the performance of the machine. The maximum electrical power output and overall isentropic efficiency measured were 7.31 kW and 51.91%, respectively.
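    The overall isentropic efficiency quoted above is conventionally computed as eta = W_el / (m_dot * (h_su - h_ex,s)); the sketch below evaluates it for R245fa using CoolProp, with an invented operating point (the paper's measured mass flow and inlet state are not reproduced here).

      # Overall isentropic efficiency of an expander from an invented
      # operating point; CoolProp supplies the R245fa fluid properties.
      from CoolProp.CoolProp import PropsSI

      W_el = 7310.0                # electrical power output, W
      m_dot = 0.45                 # mass flow rate, kg/s (assumed)
      p_su, T_su = 10.0e5, 390.0   # supply pressure (Pa) and temperature (K)
      p_ex = 1.8e5                 # exhaust pressure, Pa

      h_su = PropsSI('H', 'P', p_su, 'T', T_su, 'R245fa')
      s_su = PropsSI('S', 'P', p_su, 'T', T_su, 'R245fa')
      h_ex_s = PropsSI('H', 'P', p_ex, 'S', s_su, 'R245fa')  # isentropic exhaust

      eta = W_el / (m_dot * (h_su - h_ex_s))
      print(f"overall isentropic efficiency: {eta:.1%}")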

  8. A multiple-scales model of the shock-cell structure of imperfectly expanded supersonic jets

    Science.gov (United States)

    Tam, C. K. W.; Jackson, J. A.; Seiner, J. M.

    1985-01-01

    The present investigation is concerned with the development of an analytical model of the quasi-periodic shock-cell structure of an imperfectly expanded supersonic jet. The investigation represents a part of a program to develop a mathematical theory of broadband shock-associated noise of supersonic jets. Tam and Tanna (1982) have suggested that this type of noise is generated by the weak interaction between the quasi-periodic shock cells and the downstream-propagating large turbulence structures in the mixing layer of the jet. In the model developed in this paper, the effect of turbulence in the mixing layer of the jet is simulated by the addition of turbulent eddy-viscosity terms to the momentum equation. Attention is given to the mean-flow profile and the numerical solution, and a comparison of the numerical results with experimental data.

  9. Software Quality Evaluation Models Applicable in Health Information and Communications Technologies. A Review of the Literature.

    Science.gov (United States)

    Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa

    2016-01-01

    The use of Information and Communications Technologies in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation and health, we selected and analysed 20 original research papers published from 2005-2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software and improve use outcomes. The ISO/IEC 25000 standard is shown to be the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition.

  10. An Educational Look at an alternative to the Expanding Universe Model

    Science.gov (United States)

    Kriske, Richard

    2009-11-01

    The author often toys with an alternative view to the expanding universe model and believes it would be a good way to teach the scientific method. In the author's (R.M. Kriske) model the red shift is a result of magnifying the horizon of a 4-dimensional surface. On a two-dimensional surface such as the Earth, the horizon is not magnifiable, since things on the surface naturally tilt away from the observer in every direction and everything is transformed into a curved line, the horizon (the students can verify this with a globe and some pins in it, for example). Likewise one would expect this signature of curvature to show up in three curved space dimensions, with a perpendicular time dimension instead of pins. As the observer looks toward the pins they tilt away from him/her, and in four dimensions this means they are accelerating away from him/her even though the globe is standing still. At each point a pair is being produced with its attendant gamma ray emission, but the points are of course seen as accelerating away, simply due to the curvature of the globe and nothing else, resulting in a red shift. This author-produced model has never been suggested before and never presented to the scientific community. The students would then need to compare this to the current simpler model in which point sources accelerating away from the observer undergo a redshift due to the Doppler effect. The students would then have to review these models and determine the size of the globe from the amount of red shift seen in the two competing models. One model has a cut-off mode, since the pins not only tip backward in the curved space model but are also cut off. How does this cut-off show up, is it simply dimming, and can an experiment be done for it? The last step of this exercise is to see if one could tell the difference between these models, and if a mixed model is better, since the globe could also be expanding (Of course the instructor could also ask what the result

  11. Experiences in Teaching a Graduate Course on Model-Driven Software Development

    Science.gov (United States)

    Tekinerdogan, Bedir

    2011-01-01

    Model-driven software development (MDSD) aims to support the development and evolution of software intensive systems using the basic concepts of model, metamodel, and model transformation. In parallel with the ongoing academic research, MDSD is more and more applied in industrial practices. After being accepted both by a broad community of…

  12. Prowess–A Software Model for the Ooty Wide Field Array

    Indian Academy of Sciences (India)

    A software model for OWFA has been developed with a view to understanding the instrument-induced systematics, by describing a complete software model for the instrument. This model has been implemented through a suite of programs, together called Prowess, which has been conceived with the dual role of an ...

  13. Methods of Modelling Marketing Activity on Software Sales

    Directory of Open Access Journals (Sweden)

    Bashirov Islam H.

    2013-11-01

    Full Text Available The article addresses the development of methods for modelling marketing activity in software sales to achieve efficient enterprise operation. On the basis of an analysis of the market for the studied CloudLinux OS product, the article identifies the market structure type: monopolistic competition. To provide the information basis for marketing activity in the target market segment, the article proposes a survey method and presents a questionnaire, containing specific questions about the studied market segment of hosting services, for an online survey with the help of the Survio service. In accordance with the systems approach, the CloudLinux OS has the properties of systems, namely diversity. Economic differences are non-price indicators that have no numeric expression and are qualitative descriptions; the market analysis and the conducted survey allow them to be obtained. The combination of price and non-price indicators provides a complete description of the product's properties. To calculate an integral indicator of competitiveness, the article proposes a model based on the direct algebraic addition of the weighted measures of individual indicators, normalisation of formalised indicators, and the use of fuzzy sets for the identification of non-formalised indicators. The calculated indicator allows not only assessment of the current level of competitiveness but also identification of the influence of changes in various indicators, which increases the efficiency of marketing decisions. Also, having identified the target customers of the hosting OS and formalised the non-price parameters, it is possible to search for a set of optimal characteristics of the product. As a result, an optimal strategy for advancing the product to market is formed.
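    A minimal sketch of the weighted algebraic summation underlying such an integral competitiveness indicator is given below; the indicator values, benchmarks and weights are invented, and the fuzzy-set treatment of non-formalised indicators is only hinted at in a comment.

      # Integral competitiveness indicator as a weighted sum of normalised
      # indicators. Non-formalised (fuzzy) indicators would first be mapped
      # to [0, 1] membership values and included in the same sum.
      def integral_indicator(values, benchmarks, weights):
          normalised = [v / b for v, b in zip(values, benchmarks)]
          return sum(w * n for w, n in zip(weights, normalised))

      values     = [0.9, 0.7, 0.8]   # product's price and quality scores (invented)
      benchmarks = [1.0, 1.0, 1.0]   # best-in-segment reference values
      weights    = [0.5, 0.3, 0.2]   # importance weights, summing to 1
      print(f"competitiveness: {integral_indicator(values, benchmarks, weights):.2f}")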

  14. Parameter-expanded data augmentation for Bayesian analysis of capture-recapture models

    Science.gov (United States)

    Royle, J. Andrew; Dorazio, Robert M.

    2012-01-01

    Data augmentation (DA) is a flexible tool for analyzing closed and open population models of capture-recapture data, especially models which include sources of hetereogeneity among individuals. The essential concept underlying DA, as we use the term, is based on adding "observations" to create a dataset composed of a known number of individuals. This new (augmented) dataset, which includes the unknown number of individuals N in the population, is then analyzed using a new model that includes a reformulation of the parameter N in the conventional model of the observed (unaugmented) data. In the context of capture-recapture models, we add a set of "all zero" encounter histories which are not, in practice, observable. The model of the augmented dataset is a zero-inflated version of either a binomial or a multinomial base model. Thus, our use of DA provides a general approach for analyzing both closed and open population models of all types. In doing so, this approach provides a unified framework for the analysis of a huge range of models that are treated as unrelated "black boxes" and named procedures in the classical literature. As a practical matter, analysis of the augmented dataset by MCMC is greatly simplified compared to other methods that require specialized algorithms. For example, complex capture-recapture models of an augmented dataset can be fitted with popular MCMC software packages (WinBUGS or JAGS) by providing a concise statement of the model's assumptions that usually involves only a few lines of pseudocode. In this paper, we review the basic technical concepts of data augmentation, and we provide examples of analyses of closed-population models (M0, Mh, distance sampling, and spatial capture-recapture models) and open-population models (Jolly-Seber) with individual effects.
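    The "few lines of pseudocode" the authors mention look, for the simplest closed-population model M0 under data augmentation, roughly like the BUGS/JAGS fragment below (held in a Python string; running it would require a sampler such as JAGS via pyjags or rjags). M is the size of the augmented dataset and J the number of capture occasions.

      # Model M0 with data augmentation in the BUGS/JAGS language: the
      # augmented encounter counts y[i] follow a zero-inflated binomial.
      model_M0_DA = """
      model {
        psi ~ dunif(0, 1)            # inclusion probability
        p   ~ dunif(0, 1)            # detection probability
        for (i in 1:M) {
          z[i] ~ dbern(psi)          # 1 if pseudo-individual i is real
          y[i] ~ dbin(p * z[i], J)   # all-zero histories have z[i] = 0
        }
        N <- sum(z[])                # estimated population size
      }
      """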

  15. SOFTWARE FOR FAULT DIAGNOSIS USING KNOWLEDGE MODELS IN PETRI NETS

    National Research Council Canada - National Science Library

    ADRIAN ARBOLEDA; GERMAN ZAPATA; JOSÉ VELÁSQUEZ; LUIS MARÍN

    2012-01-01

    ... signals from a process that devices such as programmable logic controllers (PLCs) can provide. This article proposes a novel model-driven software system based on Petri nets and integrated with...

  16. The UNIX Operating System: A Model for Software Design.

    Science.gov (United States)

    Kernighan, Brian W.; Morgan, Samuel P.

    1982-01-01

    Describes UNIX time-sharing operating system, including the program environment, software development tools, flexibility and ease of change, portability and other advantages, and five applications and three nonapplications of the system. (JN)

  17. System Engineering Software Assessment Model for Exploration (SESAME) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Concept phase space-systems architecture evaluations typically use mass estimates as the primary means of ranking potential mission architectures. Software does not...

  18. A Model of Software Maintenance for Large Scale Military Systems

    OpenAIRE

    Mostov, Isaak

    1990-01-01

    Approved for public release, distribution unlimited. The maintenance of large military software systems is complex, involves users as well as software professionals, and requires appropriate management, which is one of the most important factors in efficient maintenance. Maintenance management requires information about the current state of the maintenance process that should be organized within a maintenance-oriented Engineering Database. This database should include all the necessary dat...

  19. Expanding signaling-molecule wavefront model of cell polarization in the Drosophila wing primordium.

    Science.gov (United States)

    Wortman, Juliana C; Nahmad, Marcos; Zhang, Peng Cheng; Lander, Arthur D; Yu, Clare C

    2017-07-01

    In developing tissues, cell polarization and proliferation are regulated by morphogens and signaling pathways. Cells throughout the Drosophila wing primordium typically show subcellular localization of the unconventional myosin Dachs on the distal side of cells (nearest the center of the disc). Dachs localization depends on the spatial distribution of bonds between the protocadherins Fat (Ft) and Dachsous (Ds), which form heterodimers between adjacent cells; and the Golgi kinase Four-jointed (Fj), which affects the binding affinities of Ft and Ds. The Fj concentration forms a linear gradient while the Ds concentration is roughly uniform throughout most of the wing pouch with a steep transition region that propagates from the center to the edge of the pouch during the third larval instar. Although the Fj gradient is an important cue for polarization, it is unclear how the polarization is affected by cell division and the expanding Ds transition region, both of which can alter the distribution of Ft-Ds heterodimers around the cell periphery. We have developed a computational model to address these questions. In our model, the binding affinity of Ft and Ds depends on phosphorylation by Fj. We assume that the asymmetry of the Ft-Ds bond distribution around the cell periphery defines the polarization, with greater asymmetry promoting cell proliferation. Our model predicts that this asymmetry is greatest in the radially-expanding transition region that leaves polarized cells in its wake. These cells naturally retain their bond distribution asymmetry after division by rapidly replenishing Ft-Ds bonds at new cell-cell interfaces. Thus we predict that the distal localization of Dachs in cells throughout the pouch requires the movement of the Ds transition region and the simple presence, rather than any specific spatial pattern, of Fj.
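    One simple way to quantify such bond-distribution asymmetry (our illustration; the paper's precise polarization measure may differ) is the normalized resultant of bond counts treated as weights on unit vectors around the cell periphery:

      # Polarization as the length of the resultant of Ft-Ds bond counts b_k
      # placed on unit vectors u_k evenly spaced around the cell periphery.
      import numpy as np

      def polarization(bond_counts):
          """0 = perfectly symmetric bonds, approaching 1 = all on one side."""
          b = np.asarray(bond_counts, dtype=float)
          angles = 2.0 * np.pi * np.arange(len(b)) / len(b)
          resultant = np.array([np.sum(b * np.cos(angles)),
                                np.sum(b * np.sin(angles))])
          return np.linalg.norm(resultant) / np.sum(b)

      print(polarization([10, 10, 10, 10, 10, 10]))   # symmetric interfaces -> 0.0
      print(polarization([30, 12, 5, 3, 5, 12]))      # distally biased -> about 0.5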

  20. Stem thrust prediction model for W-K-M double wedge parallel expanding gate valves

    Energy Technology Data Exchange (ETDEWEB)

    Eldiwany, B.; Alvarez, P.D. [Kalsi Engineering Inc., Sugar Land, TX (United States); Wolfe, K. [Electric Power Research Institute, Palo Alto, CA (United States)

    1996-12-01

    An analytical model for determining the required valve stem thrust during opening and closing strokes of W-K-M parallel expanding gate valves was developed as part of the EPRI Motor-Operated Valve Performance Prediction Methodology (EPRI MOV PPM) Program. The model was validated against measured stem thrust data obtained from in-situ testing of three W-K-M valves. Model predictions show favorable, bounding agreement with the measured data for valves with Stellite 6 hardfacing on the disks and seat rings for water flow in the preferred flow direction (gate downstream). The maximum required thrust to open and to close the valve (excluding wedging and unwedging forces) occurs at a slightly open position and not at the fully closed position. In the nonpreferred flow direction, the model shows that premature wedging can occur during ΔP closure strokes even when the coefficients of friction at different sliding surfaces are within the typical range. This paper summarizes the model description and comparison against test data.

  1. Deception and Cognitive Load: Expanding our Horizon with a Working Memory Model

    Directory of Open Access Journals (Sweden)

    Siegfried Ludwig Sporer

    2016-04-01

    Full Text Available Recently, studies on deception and its detection have increased dramatically. Many of these studies rely on the cognitive load approach as the sole explanatory principle to understand deception. These studies have been exclusively on lies about negative actions (usually lies of suspects of [mock] crimes). Instead, we need to re-focus more generally on the cognitive processes involved in generating both lies and truths, not just on manipulations of cognitive load. Using Baddeley's (2000, 2007, 2012) working memory model, which integrates verbal and visual processes in working memory with retrieval from long-term memory and control of action, not only verbal content cues but also nonverbal, paraverbal and linguistic cues can be investigated within a single framework. The proposed model considers long-term semantic, episodic and autobiographical memory and their connections with working memory and action. It also incorporates ironic processes of mental control (Wegner, 1994, 2009), the role of scripts and schemata, and retrieval cues and retrieval processes. Specific predictions of the model are outlined and support from selected studies is presented. The model is applicable to different types of reports, particularly lies and truths about complex events, and to different modes of production (oral, hand-written, typed). Predictions regarding several moderator variables and methods to investigate them are proposed.

  2. Modeling Large Deformation and Failure of Expanded Polystyrene Crushable Foam Using LS-DYNA

    Directory of Open Access Journals (Sweden)

    Qasim H. Shah

    2014-01-01

    Full Text Available In the initial phase of the research work, quasi-static compression tests were conducted on expanded polystyrene (EPS) crushable foam for material characterisation at low strain rates (8.3×10⁻³ to 8.3×10⁻² s⁻¹) to obtain the stress-strain curves. The resulting stress-strain curves compare well with the ones found in the literature. Numerical analysis of the compression tests was carried out to validate them against experimental results. Additionally, gravity-driven drop tests were carried out using a long rod projectile with a semispherical end that penetrated into the EPS foam block. The long rod projectile drop tests were simulated in LS-DYNA using suggested parameter enhancements that were able to compute the material damage and failure response precisely. The material parameter adjustments for successful modelling have been reported.

  3. Introduction to Lean Canvas Transformation Models and Metrics in Software Testing

    Directory of Open Access Journals (Sweden)

    Nidagundi Padmaraj

    2016-05-01

    Full Text Available Software now plays a key role in all fields, from simple to cutting-edge technologies, and most technological devices run on software. Verification and validation in software development have become very important for producing high-quality software that meets business stakeholder requirements. Different software development methodologies have given a new dimension to software testing. In traditional waterfall software development, testing comes at the end and begins with resource planning; a test plan is designed and test criteria are defined for acceptance testing. In this process most of the test plan is thoroughly documented, which makes the process time-consuming. For modern software development methodologies such as agile, where long test processes and extensive documentation are not strictly followed due to the short iterations of software development and testing, lean canvas transformation models can be a solution. This paper provides a new dimension for exploring the possibilities of adopting lean transformation models and metrics in the software test plan, to simplify the test process and enable further use of these test metrics on the canvas.

  4. Expanding Ambulatory Care Pharmacy Residency Education Through a Multisite University-Affiliated Model.

    Science.gov (United States)

    Schweiss, Sarah K; Westberg, Sarah M; Moon, Jean Y; Sorensen, Todd D

    2017-12-01

    As the health-care system evolves and shifts to value-based payment systems, there is a recognized need to increase the number of ambulatory care trained pharmacists. The objective of this article is to describe the administrative structure of the University of Minnesota Postgraduate Year 1 (PGY1) Pharmacy Residency program and to encourage adoption of similar models nationally in order to expand ambulatory care residency training opportunities and meet the demand for pharmacist practitioners. Program Structure: The University of Minnesota PGY1 Pharmacy Residency program is a multisite program centered on the practice of pharmaceutical care and provision of comprehensive medication management (CMM) services in ambulatory care settings. The centralized administration of a multisite academic-affiliated training model creates efficiency in the administration process, while allowing sites to focus on clinical training. This model also offers many innovative and unique opportunities to residents. A multisite university-affiliated ambulatory care residency training model provides efficiency in program administration, while successfully accelerating the growth of quality ambulatory care residency training and supporting innovative delivery of shared core learning experiences. Consequently, practice sites grow in their service delivery capacity and quality of care.

  5. Quantitative Analysis of Probabilistic Models of Software Product Lines with Statistical Model Checking

    Directory of Open Access Journals (Sweden)

    Maurice H. ter Beek

    2015-04-01

    Full Text Available We investigate the suitability of statistical model checking techniques for analysing quantitative properties of software product line models with probabilistic aspects. For this purpose, we enrich the feature-oriented language FLan with action rates, which specify the likelihood of exhibiting particular behaviour or of installing features at a specific moment or in a specific order. The enriched language (called PFLan) allows us to specify models of software product lines with probabilistic configurations and behaviour, e.g. by considering a PFLan semantics based on discrete-time Markov chains. The Maude implementation of PFLan is combined with the distributed statistical model checker MultiVeStA to perform quantitative analyses of a simple product line case study. The presented analyses include the likelihood of certain behaviour of interest (e.g. product malfunctioning) and the expected average cost of products.
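    In miniature, statistical model checking of such a probabilistic model amounts to estimating a property's probability from many simulated runs; the sketch below does this for an invented three-state discrete-time Markov chain rather than an actual PFLan model.

      # Estimate the probability that a product malfunctions within k steps
      # by simulating a small DTMC many times (statistical model checking).
      import random

      # transition probabilities of a 3-state chain: ok -> degraded -> failed
      P = {"ok":       [("ok", 0.95), ("degraded", 0.05)],
           "degraded": [("ok", 0.20), ("degraded", 0.70), ("failed", 0.10)],
           "failed":   [("failed", 1.0)]}

      def simulate(k):
          state = "ok"
          for _ in range(k):
              r, acc = random.random(), 0.0
              for nxt, prob in P[state]:
                  acc += prob
                  if r < acc:
                      state = nxt
                      break
          return state == "failed"

      runs = 10_000
      est = sum(simulate(50) for _ in range(runs)) / runs
      print(f"P(failure within 50 steps) is approximately {est:.3f}")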

  6. Quantitative Analysis of Probabilistic Models of Software Product Lines with Statistical Model Checking

    DEFF Research Database (Denmark)

    ter Beek, Maurice H.; Legay, Axel; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking techniques for analysing quantitative properties of software product line models with probabilistic aspects. For this purpose, we enrich the feature-oriented language FLAN with action rates, which specify the likelihood of exhibiting particular behaviour or of installing features at a specific moment or in a specific order. The enriched language (called PFLAN) allows us to specify models of software product lines with probabilistic configurations and behaviour, e.g. by considering a PFLAN semantics based on discrete-time Markov chains. The Maude implementation of PFLAN is combined with the distributed statistical model checker MultiVeStA to perform quantitative analyses of a simple product line case study. The presented analyses include the likelihood of certain behaviour of interest (e.g. product malfunctioning) and the expected average...

  7. Bringing Model Checking Closer to Practical Software Engineering

    CERN Document Server

    AUTHOR|(CDS)2079681; Templon, J A; Willemse, T.A.C.

    Software grows in size and complexity, making it increasingly challenging to ensure that it behaves correctly. This is especially true for distributed systems, where a multitude of components are running concurrently, making it difficult to anticipate all the possible behaviors emerging in the system as a whole. Certain design errors, such as deadlocks and race-conditions, can often go unnoticed when testing is the only form of verification employed in the software engineering life-cycle. Even when bugs are detected in running software, revealing the root cause and reproducing the behavior can be time consuming (and even impossible), given the lack of control the engineer has over the execution of the concurrent components, as well as the number of possible scenarios that could have produced the problem. This is especially pronounced for large-scale distributed systems such as the Worldwide Large Hadron Collider Computing Grid. Formal verification methods offer more rigorous means of determining whether a system sat...

  8. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    Energy Technology Data Exchange (ETDEWEB)

    Genser, Krzysztof [Fermilab; Hatcher, Robert [Fermilab; Kelsey, Michael [SLAC; Perdue, Gabriel [Fermilab; Wenzel, Hans [Fermilab; Wright, Dennis H. [SLAC; Yarba, Julia [Fermilab

    2017-02-17

    The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with physically motivated parameters that are tuned to cover many application domains. To study what uncertainties are associated with the Geant4 physics models we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. flexible run-time configurable workflow, comprehensive bookkeeping, easy to expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.

  9. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    Science.gov (United States)

    Genser, Krzysztof; Hatcher, Robert; Kelsey, Michael; Perdue, Gabriel; Wenzel, Hans; Wright, Dennis H.; Yarba, Julia

    2017-10-01

    The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with the physically motivated parameters that are tuned to cover many application domains. To study what uncertainties are associated with the Geant4 physics models we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. flexible run-time configurable workflow, comprehensive bookkeeping, easy to expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.

  10. Executable Behavioral Modeling of System and Software Architecture Specifications to Inform Resourcing Decisions

    Science.gov (United States)

    2016-09-01

    Executable Behavioral Modeling of System- and Software-Architecture Specifications to Inform Resourcing Decisions, by Monica F. Farah-Stapleton. ... intellectual, programmatic, and organizational resources. Precise behavioral modeling offers a way to assess architectural design decisions prior to

  11. A Systematic Evaluation of Enterprise Modelling Approaches on Their Applicability to Automatically Generate ERP Software

    NARCIS (Netherlands)

    Schunselaar, D. M M; Gulden, J.; van Schuur, W.H.; Reijers, H. A.

    2016-01-01

    Customising Enterprise Resource Planning (ERP) software to the enterprise's needs is still a technical endeavour often involving enabling/disabling modules, modifying configuration files, etc. This is quite surprising given the large body of work on Enterprise Modelling and Model-Driven Software

  12. Quantitative hardware prediction modeling for hardware/software co-design

    NARCIS (Netherlands)

    Meeuws, R.J.

    2012-01-01

    Hardware estimation is an important factor in Hardware/Software Co-design. In this dissertation, we present the Quipu Modeling Approach, a high-level quantitative prediction model for HW/SW Partitioning using statistical methods. Our approach uses linear regression between software complexity

  13. Open source software engineering for geoscientific modeling applications

    Science.gov (United States)

    Bilke, L.; Rink, K.; Fischer, T.; Kolditz, O.

    2012-12-01

    OpenGeoSys (OGS) is a scientific open source project for numerical simulation of thermo-hydro-mechanical-chemical (THMC) processes in porous and fractured media. The OGS software development community is distributed all over the world and people with different backgrounds are contributing code to a complex software system. The following points have to be addressed for successful software development: - Platform independent code - A unified build system - A version control system - A collaborative project web site - Continuous builds and testing - Providing binaries and documentation for end users OGS should run on a PC as well as on a computing cluster regardless of the operating system. Therefore the code should not include any platform specific feature or library. Instead open source and platform independent libraries like Qt for the graphical user interface or VTK for visualization algorithms are used. A source code management and version control system is a definite requirement for distributed software development. For this purpose Git is used, which enables developers to work on separate versions (branches) of the software and to merge those versions at some point to the official one. The version control system is integrated into an information and collaboration website based on a wiki system. The wiki is used for collecting information such as tutorials, application examples and case studies. Discussions take place in the OGS mailing list. To improve code stability and to verify code correctness a continuous build and testing system, based on the Jenkins Continuous Integration Server, has been established. This server is connected to the version control system and does the following on every code change: - Compiles (builds) the code on every supported platform (Linux, Windows, MacOS) - Runs a comprehensive test suite of over 120 benchmarks and verifies the results Runs software development related metrics on the code (like compiler warnings, code complexity

  14. DAE Tools: equation-based object-oriented modelling, simulation and optimisation software

    Directory of Open Access Journals (Sweden)

    Dragan D. Nikolić

    2016-04-01

    Full Text Available In this work, DAE Tools modelling, simulation and optimisation software, its programming paradigms and main features are presented. The current approaches to mathematical modelling such as the use of modelling languages and general-purpose programming languages are analysed. The common set of capabilities required by the typical simulation software are discussed, and the shortcomings of the current approaches recognised. A new hybrid approach is introduced, and the modelling languages and the hybrid approach are compared in terms of the grammar, compiler, parser and interpreter requirements, maintainability and portability. The most important characteristics of the new approach are discussed, such as: (1) support for the runtime model generation; (2) support for the runtime simulation set-up; (3) support for complex runtime operating procedures; (4) interoperability with third party software packages (i.e. NumPy/SciPy); (5) suitability for embedding and use as a web application or software as a service; and (6) code-generation, model exchange and co-simulation capabilities. The benefits of an equation-based approach to modelling, implemented in a fourth generation object-oriented general purpose programming language such as Python are discussed. The architecture and the software implementation details as well as the type of problems that can be solved using DAE Tools software are described. Finally, some applications of the software at different levels of abstraction are presented, and its embedding capabilities and suitability for use as software as a service are demonstrated.

  15. Abstract delta modeling : software product lines and beyond

    NARCIS (Netherlands)

    Helvensteijn, Michiel

    2014-01-01

    To prevent a large software system from collapsing under its own complexity, its code needs to be well-structured. Ideally we want all code related to a certain feature to be grouped together, called feature modularization, and code belonging to different features not to mix, called separation of concerns...

  16. A multiphysics and multiscale software environment for modeling astrophysical systems

    NARCIS (Netherlands)

    Portegies Zwart, S.; McMillan, S.; Harfst, S.; Groen, D.; Fujii, M.; Ó Nualláin, B.; Glebbeek, E.; Heggie, D.; Lombardi, J.; Hut, P.; Angelou, V.; Banerjee, S.; Belkus, H.; Fragos, T.; Fregeau, J.; Gaburov, E.; Izzard, R.; Jurić, M.; Justham, S.; Sottoriva, A.; Teuben, P.; van Bever, J.; Yaron, O.; Zemp, M.

    2009-01-01

    We present MUSE, a software framework for combining existing computational tools for different astrophysical domains into a single multiphysics, multiscale application. MUSE facilitates the coupling of existing codes written in different languages by providing inter-language tools and by specifying

  17. Constraint driven software design: an escape from the waterfall model

    NARCIS (Netherlands)

    de Hoog, Robert; de Jong, Anthonius J.M.; de Vries, Frits

    1994-01-01

    This paper presents the principles of a development methodology for software design. The methodology is based on a nonlinear, product-driven approach that integrates quality aspects. The principles are made more concrete in two examples: one for developing educational simulations and one for

  18. Software Tools For Large Scale Interactive Hydrodynamic Modeling

    NARCIS (Netherlands)

    Donchyts, G.; Baart, F.; van Dam, A; Jagers, B; van der Pijl, S.; Piasecki, M.

    2014-01-01

    Developing easy-to-use software that combines components for simultaneous visualization, simulation and interaction is a great challenge. Mainly, because it involves a number of disciplines, like computational fluid dynamics, computer graphics, high-performance computing. One of the main

  19. Revenue Management and Demand Fulfillment: Matching Applications, Models, and Software

    NARCIS (Netherlands)

    R. Quante (Rainer); H. Meyr (Herbert); M. Fleischmann (Moritz)

    2007-01-01

    textabstractRecent years have seen great successes of revenue management, notably in the airline, hotel, and car rental business. Currently, an increasing number of industries, including manufacturers and retailers, are exploring ways to adopt similar concepts. Software companies are taking an

  20. Expanding the role of reactive transport models in critical zone processes

    Science.gov (United States)

    Li, Li; Maher, Kate; Navarre-Sitchler, Alexis; Druhan, Jennifer; Meile, Christof; Lawrence, Corey; Moore, Joel; Perdrial, Julia; Sullivan, Pamela; Thompson, Aaron; Jin, Lixin; Bolton, Edward W.; Brantley, Susan L.; Dietrich, William E.; Mayer, K. Ulrich; Steefel, Carl; Valocchi, Albert J.; Zachara, John M.; Kocar, Benjamin D.; McIntosh, Jennifer; Tutolo, Benjamin M.; Kumar, Mukesh; Sonnenthal, Eric; Bao, Chen; Beisman, Joe

    2017-01-01

    Models test our understanding of processes and can reach beyond the spatial and temporal scales of measurements. Multi-component Reactive Transport Models (RTMs), initially developed more than three decades ago, have been used extensively to explore the interactions of geothermal, hydrologic, geochemical, and geobiological processes in subsurface systems. Driven by extensive data sets now available from intensive measurement efforts, there is a pressing need to couple RTMs with other community models to explore non-linear interactions among the atmosphere, hydrosphere, biosphere, and geosphere. Here we briefly review the history of RTM development, summarize the current state of RTM approaches, and identify new research directions, opportunities, and infrastructure needs to broaden the use of RTMs. In particular, we envision the expanded use of RTMs in advancing process understanding in the Critical Zone, the veneer of the Earth that extends from the top of vegetation to the bottom of groundwater. We argue that, although parsimonious models are essential at larger scales, process-based models offer tools to explore the highly nonlinear coupling that characterizes natural systems. We present seven testable hypotheses that emphasize the unique capabilities of process-based RTMs for (1) elucidating chemical weathering and its physical and biogeochemical drivers; (2) understanding the interactions among roots, micro-organisms, carbon, water, and minerals in the rhizosphere; (3) assessing the effects of heterogeneity across spatial and temporal scales; and (4) integrating the vast quantity of novel data, including “omics” data (genomics, transcriptomics, proteomics, metabolomics), elemental concentration and speciation data, and isotope data into our understanding of complex earth surface systems. With strong support from data-driven sciences, we are now in an exciting era where integration of the RTM framework into other community models will facilitate process understanding.
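
    As a minimal illustration of the computations at the core of an RTM, the sketch below integrates one-dimensional advection-dispersion with a first-order reaction by explicit finite differences. All coefficients are invented for illustration and are not taken from any code discussed above.

        import numpy as np

        nx, dx, dt = 100, 1.0, 0.1   # cells, cell size [m], time step [d]
        v, D, k = 0.5, 1.0, 0.01     # velocity [m/d], dispersion [m2/d], decay [1/d]
        c = np.zeros(nx)
        c[0] = 1.0                   # constant-concentration inlet

        for _ in range(1000):
            adv = -v * (c[1:-1] - c[:-2]) / dx                  # upwind advection
            dis = D * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2  # dispersion
            c[1:-1] += dt * (adv + dis - k * c[1:-1])           # transport + reaction
            c[0], c[-1] = 1.0, c[-2]                            # boundary conditions
        print(c[::10].round(3))                                 # coarse column profile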

  1. A software complex intended for constructing applied models and meta-models on the basis of mathematical programming principles

    Directory of Open Access Journals (Sweden)

    Михаил Юрьевич Чернышов

    2013-12-01

    A software complex (SC) elaborated by the authors on the basis of the language LMPL, representing a software tool intended for the synthesis of applied software models and meta-models constructed on the basis of mathematical programming (MP) principles, is described. LMPL provides an explicit form of declarative representation of MP-models, presumes automatic construction and transformation of models, and supports the addition of external software packages. The following software versions of the SC have been implemented: (1) an SC intended for representing the process of choosing an optimal hydroelectric power plant model (on the principles of meta-modeling), and (2) an SC intended for representing the logic-sense relations between the models of a set of discourse formations in the discourse meta-model.

  2. Embedding the concept of service oriented architecture into software sustainability evaluation model

    Science.gov (United States)

    Ahmad, Ruzita; Hussain, Azham; Baharom, Fauziah

    2017-10-01

    Software sustainability evaluation is a measurement mechanism that involves several criteria of software development, expressed through characteristics and sub-characteristics, with the requirement to meet present needs through to future generations. Such a measurement mechanism can support developing software from a sustainability perspective covering environmental, economic and social dimensions. This paper embeds the concept of Service-Oriented Architecture (SOA) into a software sustainability evaluation model to support measurement criteria that help build software flexibility, reusability and agility. The objective is to propose several characteristics of software development that utilize the concept of sustainability combined with the SOA concept. The mapping between SOA criteria and software development characteristics significantly improves the measurement criteria that can be addressed in the measurement model.

  3. Continuous-time random-walk model for anomalous diffusion in expanding media

    Science.gov (United States)

    Le Vot, F.; Abad, E.; Yuste, S. B.

    2017-09-01

    Expanding media are typical in many different fields, e.g., in biology and cosmology. In general, a medium expansion (contraction) brings about dramatic changes in the behavior of diffusive transport properties such as the set of positional moments and the Green's function. Here, we focus on the characterization of such effects when the diffusion process is described by the continuous-time random-walk (CTRW) model. As is well known, when the medium is static this model yields anomalous diffusion for a proper choice of the probability density function (pdf) for the jump length and the waiting time, but the behavior may change drastically if a medium expansion is superimposed on the intrinsic random motion of the diffusing particle. For the case where the jump length and the waiting time pdfs are long-tailed, we derive a general bifractional diffusion equation which reduces to a normal diffusion equation in the appropriate limit. We then study some particular cases of interest, including Lévy flights and subdiffusive CTRWs. In the former case, we find an analytical exact solution for the Green's function (propagator). When the expansion is sufficiently fast, the contribution of the diffusive transport becomes irrelevant at long times and the propagator tends to a stationary profile in the comoving reference frame. In contrast, for a contracting medium a competition between the spreading effect of diffusion and the concentrating effect of contraction arises. In the specific case of a subdiffusive CTRW in an exponentially contracting medium, the latter effect prevails for sufficiently long times, and all the particles are eventually localized at a single point in physical space. This "big crunch" effect, totally absent in the case of normal diffusion, stems from inefficient particle spreading due to subdiffusion. We also derive a hierarchy of differential equations for the moments of the transport process described by the subdiffusive CTRW model in an expanding medium
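
    As a worked equation for the simplest case mentioned above, the normal-diffusion limit to which the bifractional equation reduces, consider a uniform expansion with scale factor a(t), so that physical and comoving coordinates are related by x = a(t) y. Ordinary Brownian motion with diffusivity D then obeys, in the comoving frame,

        \[
        \frac{\partial P(y,t)}{\partial t} = \frac{D}{a^{2}(t)}\,\frac{\partial^{2} P(y,t)}{\partial y^{2}},
        \qquad
        \langle x^{2}(t)\rangle = 2 D\, a^{2}(t) \int_{0}^{t} \frac{dt'}{a^{2}(t')},
        \]

    which makes explicit how a sufficiently fast expansion (rapidly growing a(t)) freezes the comoving spreading, consistent with the stationary comoving profile described in the abstract.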

  4. 2016 KIVA-hpFE Development: A Robust and Accurate Engine Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Carrington, David Bradley [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Waters, Jiajia [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-10-25

    Los Alamos National Laboratory and its collaborators are facilitating engine modeling by improving accuracy and robustness of the modeling, and improving the robustness of software. We also continue to improve the physical modeling methods. We are developing and implementing new mathematical algorithms, those that represent the physics within an engine. We provide software that others may use directly or that they may alter with various models e.g., sophisticated chemical kinetics, different turbulent closure methods or other fuel injection and spray systems.

  5. Comparison of the Two-Hemisphere Model-Driven Approach to Other Methods for Model-Driven Software Development

    Directory of Open Access Journals (Sweden)

    Nikiforova Oksana

    2015-12-01

    Models are widely used not only in the computer science field, but also in other fields; they are an effective way to show relevant information in a convenient form. Model-driven software development uses models and transformations as first-class citizens. That makes software development phases more closely related to each other, and those links later help to change or modify the software product more freely. At the moment there are many methods and techniques to create those models and transform them into each other. Since 2004, the authors have been developing the so-called 2HMD approach to bridge the gap between the problem domain and software components by using models and model transformations. The goal of this research is to compare different methods positioned for performing the same tasks as the 2HMD approach and to understand the state of the art in the area of model-driven software development.

  6. Software Development Cost and Time Forecasting Using a High Performance Artificial Neural Network Model

    Science.gov (United States)

    Attarzadeh, Iman; Ow, Siew Hock

    Nowadays, mature software companies are increasingly interested in having precise estimates of software metrics such as project time, cost, quality, and risk at the early stages of the software development process. The ability to precisely estimate project time and costs is one of the essential tasks of project managers in software development activities, and it is named software effort estimation. The effort estimated at the early stage of the project development process is uncertain, vague, and often the least accurate, because very little information is available at the beginning of a project. Therefore, a reliable and precise effort estimation model is an ongoing challenge for project managers and software engineers. This research work proposes a novel soft computing model incorporating the Constructive Cost Model (COCOMO) to improve the precision of software time and cost estimation. The proposed artificial neural network model has good generalisation and adaptation capability, and it can be interpreted and validated by software engineers. The experimental results show that applying the desirable features of artificial neural networks to the algorithmic estimation model improves the accuracy of time and cost estimation, and the estimated effort can be very close to the actual effort.
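
    For reference, the algorithmic baseline that the neural network refines is the basic COCOMO effort equation; the sketch below uses the standard organic-mode constants, while the network layer itself is beyond this illustration.

        def cocomo_effort(kloc, a=2.4, b=1.05):
            """Basic COCOMO, organic mode: effort in person-months for `kloc` KLOC."""
            return a * kloc ** b

        for size in (10, 50, 100):
            print(size, "KLOC ->", round(cocomo_effort(size), 1), "person-months")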

  7. The Use of Modeling for Flight Software Engineering on SMAP

    Science.gov (United States)

    Murray, Alexander; Jones, Chris G.; Reder, Leonard; Cheng, Shang-Wen

    2011-01-01

    The Soil Moisture Active Passive (SMAP) mission proposes to deploy an Earth-orbiting satellite with the goal of obtaining global maps of soil moisture content at regular intervals. Launch is currently planned in 2014. The spacecraft bus would be built at the Jet Propulsion Laboratory (JPL), incorporating both new avionics as well as hardware and software heritage from other JPL projects. [4] provides a comprehensive overview of the proposed mission.

  8. The Integration of Architectural Design and Energy Modelling Software

    OpenAIRE

    Hetherington, Robina

    2013-01-01

    Intelligent and integrated architectural design can substantially reduce carbon dioxide emissions from energy used in buildings. However, architects need new tools to help them to design enjoyable, comfortable, attractive and yet technically rigorous, low energy buildings. This thesis investigates, by means of a Research Through Design approach, how architectural software could be better designed to fulfil this need by the integration of design, energy simulation and decision support systems....

  9. Expanding the developmental models of writing: A direct and indirect effects model of developmental writing (DIEW)

    Science.gov (United States)

    Kim, Young-Suk Grace; Schatschneider, Christopher

    2016-01-01

    We investigated direct and indirect effects of component skills on writing (DIEW) using data from 193 children in Grade 1. In this model, working memory was hypothesized to be a foundational cognitive ability for language and cognitive skills as well as transcription skills, which, in turn, contribute to writing. Foundational oral language skills (vocabulary and grammatical knowledge) and higher-order cognitive skills (inference and theory of mind) were hypothesized to be component skills of text generation (i.e., discourse-level oral language). Results from structural equation modeling largely supported a complete mediation model among four variations of the DIEW model. Discourse-level oral language, spelling, and handwriting fluency completely mediated the relations of higher-order cognitive skills, foundational oral language, and working memory to writing. Moreover, language and cognitive skills had both direct and indirect relations to discourse-level oral language. Total effects, including direct and indirect effects, were substantial for discourse-level oral language (.46), working memory (.43), and spelling (.37), followed by vocabulary (.19), handwriting (.17), theory of mind (.12), inference (.10), and grammatical knowledge (.10). The model explained approximately 67% of variance in writing quality. These results indicate that multiple language and cognitive skills make direct and indirect contributions, and it is important to consider both direct and indirect pathways of influences when considering skills that are important to writing. PMID:28260812
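
    In such path models, an indirect effect is the product of the coefficients along a mediated path, and the total effect is the direct path plus the sum of those products. The toy sketch below uses invented coefficients, not the values reported above.

        direct = 0.20                 # hypothetical direct path X -> writing
        paths = [(0.50, 0.46)]        # hypothetical X -> mediator, mediator -> writing
        indirect = sum(a * b for a, b in paths)
        print(f"indirect = {indirect:.2f}, total = {direct + indirect:.2f}")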

  10. Expanding the Developmental Models of Writing: A Direct and Indirect Effects Model of Developmental Writing (DIEW)

    Science.gov (United States)

    Kim, Young-Suk Grace; Schatschneider, Christopher

    2017-01-01

    We investigated direct and indirect effects of component skills on writing (DIEW) using data from 193 children in Grade 1. In this model, working memory was hypothesized to be a foundational cognitive ability for language and cognitive skills as well as transcription skills, which, in turn, contribute to writing. Foundational oral language skills…

  11. A Technology-Neutral Role-Based Collaboration Model for Software Ecosystems

    DEFF Research Database (Denmark)

    Stanciulescu, Stefan; Rabiser, Daniela; Seidl, Christoph

    2016-01-01

    In large-scale software ecosystems, many developers contribute extensions to a common software platform. Due to the independent development efforts and the lack of a central steering mechanism, similar functionality may be developed multiple times by different developers. We tackle this problem by contributing a role-based collaboration model for software ecosystems to make such implicit similarities explicit and to raise awareness among developers during their ongoing efforts. We extract this model based on realization artifacts in a specific programming language located in a particular source code repository and present it in a technology-neutral way. We capture five essential collaborations as independent role models that may be composed to present the developer collaborations of a software ecosystem in their entirety, which fosters an overview of the software ecosystem and analyses of duplicated development.

  12. Introduction of new road pavement response modelling software by means of benchmarking

    CSIR Research Space (South Africa)

    Maina, JW

    2008-07-01

    In the Pavement Analysis and Design Software (ME-PADS v1.1), the previous ELSYM5 analysis engine was replaced by the General Analysis of Multi-layered Elastic Systems (GAMES) software. Among the advantages of GAMES is the ability to model multiple pavement layers...

  13. A Unified Component Modeling Approach for Performance Estimation in Hardware/Software Codesign

    DEFF Research Database (Denmark)

    Grode, Jesper Nicolai Riis; Madsen, Jan

    1998-01-01

    This paper presents an approach for abstract modeling of hardware/software architectures using Hierarchical Colored Petri Nets. The approach is able to capture complex behavioral characteristics often seen in software and hardware architectures, thus it is suitable for high level codesign issues...

  14. Specification and Generation of Environment for Model Checking of Software Components

    Czech Academy of Sciences Publication Activity Database

    Pařízek, P.; Plášil, František

    2007-01-01

    Vol. 176 (2007), pp. 143-154. ISSN 1571-0661. R&D Projects: GA AV ČR 1ET400300504. Institutional research plan: CEZ:AV0Z10300504. Keywords: software components; behavior protocols; model checking; automated generation of environment. Subject RIV: JC - Computer Hardware; Software.

  15. Healthy Community and Healthy Commons: ‘Opensourcing’ as a Sustainable Model of Software Production

    Directory of Open Access Journals (Sweden)

    Damrongsak Naparat

    2015-11-01

    Many commercial software firms rely on opensourcing as a viable model of software production. Opensourcing is a specific form of interaction between firms and open source software (OSS) communities for collaboratively producing software. The existing literature has identified opensourcing as a viable form of software production that can substitute for "in-house" or "outsourced" software development. However, little is known about how opensourcing works or is sustained in the long term. The objective of this research is to explain the factors affecting the sustainability of opensourcing as a model of software production. The study employs a single case study of hospital software in Thailand to understand how firms and communities can live symbiotically and sustain their collaboration to peer-produce vertical domain software. The analysis reveals six mechanisms (positive experience, trust in the leadership of the project leader, the demonstration of reciprocity, marketing the community, enriching knowledge, and face-to-face meetings) and demonstrates how they operate in conjunction with each other to sustain opensourcing.

  16. Disordered eating among Asian American college women: A racially expanded model of objectification theory.

    Science.gov (United States)

    Cheng, Hsiu-Lan; Tran, Alisia G T T; Miyake, Elisa R; Kim, Helen Youngju

    2017-03-01

    Objectification theory has been applied to understand disordered eating among college women. A recent extension of objectification theory (Moradi, 2010) conceptualizes racism as a socialization experience that shapes women of color's objectification experiences, yet limited research has examined this theoretical assertion. The present study proposed and examined a racially expanded model of objectification theory that postulated perceived racial discrimination, perpetual foreigner racism, and racial/ethnic teasing as correlates of Asian American college women's (N = 516) self-objectification processes and eating disorder symptomatology. Perceived racial discrimination, perpetual foreigner racism, and racial/ethnic teasing were indirectly associated with eating disordered symptomatology through self-objectification processes of internalization of media ideals of beauty (media internalization), body surveillance, and body shame. Results support the inclusion of racial stressors as contexts of objectification for Asian American women. The present findings also underscore perceived racial discrimination, racial/ethnic teasing, and perpetual foreigner racism as group-specific risk factors with major theoretical, empirical, and clinical relevance to eating disorder research and treatment with Asian American college women. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  17. Can we clinically recognize a vascular depression? The role of personality in an expanded threshold model.

    Science.gov (United States)

    Turk, Bela R; Gschwandtner, Michael E; Mauerhofer, Michaela; Löffler-Stastka, Henriette

    2015-05-01

    The vascular depression (VD) hypothesis postulates that cerebrovascular disease may "predispose, precipitate, or perpetuate" a depressive syndrome in elderly patients. The clinical presentation of VD has been shown to differ from major depression in quantitative disability; however, as little research has addressed qualitative phenomenological differences in the personality aspects of the symptom profile, clinical diagnosis remains a challenge. We attempted to identify differences in clinical presentation between depression patients (n = 50) with (n = 25) and without (n = 25) vascular disease, using questionnaires to assess depression, affect regulation, object relations, aggressiveness, alexithymia, personality functioning, personality traits, and countertransference. We were able to show that patients with vascular dysfunction and depression exhibit significantly higher aggressive and auto-aggressive tendencies due to a lower tolerance threshold. These data indicate, first, that VD is a separate clinical entity and, second, that personality itself may be a component of the disease process. We propose an expanded threshold disease model incorporating personality functioning and mood changes. Such findings might also aid the development of a screening program by serving as differential criteria, improving the diagnostic procedure.

  18. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    NARCIS (Netherlands)

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is discussed.

  19. Software for modeling effluent emissions in rivers

    Directory of Open Access Journals (Sweden)

    Márcio Bezerra Machado

    2008-09-01

    This work presents a three-dimensional Computational Fluid Dynamics (CFD) model to simulate the dispersion of soluble substances in rivers. The model can predict the impact caused by multiple emission points in the studied river section. The numerical code for the mathematical model was developed in Fortran. There are several models in the literature, some of which even analyze complex flows, but they are restricted to small river sections. The main contribution of this work is a new software package capable of predicting the dispersion of effluents in very large open channels. The software is very fast, an unusual feature for CFD models, especially when compared with other commercially available CFD packages. Due to this, it is possible to predict the dispersion of substances in river sections several kilometers in extension. Moreover, multiple emissions can be analyzed by the model, allowing its use as a predictive tool to analyze and guide management decisions on future industrial installations near rivers. The numerical results were validated against experimental data collected in the Atibaia river, near Campinas (Brazil), and showed good agreement with the measurements.

  20. A Comparison and Evaluation of Real-Time Software Systems Modeling Languages

    Science.gov (United States)

    Evensen, Kenneth D.; Weiss, Kathryn Anne

    2010-01-01

    A model-driven approach to real-time software systems development enables the conceptualization of software, fostering a more thorough understanding of its often complex architecture and behavior while promoting the documentation and analysis of concerns common to real-time embedded systems such as scheduling, resource allocation, and performance. Several modeling languages have been developed to assist in the model-driven software engineering effort for real-time systems, and these languages are beginning to gain traction with practitioners throughout the aerospace industry. This paper presents a survey of several real-time software system modeling languages, namely the Architectural Analysis and Design Language (AADL), the Unified Modeling Language (UML), Systems Modeling Language (SysML), the Modeling and Analysis of Real-Time Embedded Systems (MARTE) UML profile, and the AADL for UML profile. Each language has its advantages and disadvantages, and in order to adequately describe a real-time software system's architecture, a complementary use of multiple languages is almost certainly necessary. This paper aims to explore these languages in the context of understanding the value each brings to the model-driven software engineering effort and to determine if it is feasible and practical to combine aspects of the various modeling languages to achieve more complete coverage in architectural descriptions. To this end, each language is evaluated with respect to a set of criteria such as scope, formalisms, and architectural coverage. An example is used to help illustrate the capabilities of the various languages.

  1. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth models.

  2. Expanding the Early and Late Starter Model of Criminal Justice Involvement for Forensic Mental Health Clients.

    Science.gov (United States)

    Crocker, Anne G; Martin, Michael S; Leclair, Marichelle C; Nicholls, Tonia L; Seto, Michael C

    2017-11-27

    The early and late starter model provides one of the most enduring frameworks for understanding the developmental course and severity of violence and criminality among individuals with severe mental illness. We expanded the model to account for differences in the age of onset of criminal behavior and added a group with no prior contact with the justice or mental health systems. We sampled 1,800 men and women found Not Criminally Responsible on account of Mental Disorder in 3 Canadian provinces. Using a retrospective file-based study, we explored differences in criminal, health, demographic, and social functioning characteristics, processing through the forensic psychiatric system and recidivism outcomes of 5 groups. We replicated prior research, finding more typical criminogenic needs among those with early onset crime. Those with crime onset after mental illness were more likely to show fewer criminogenic needs and to have better outcomes upon release than those who had crime onset during adulthood, before mental illness. Individuals with no prior contact with mental health or criminal justice had higher functioning prior to their crime and had a lower risk of reoffending. Given little information is needed to identify the groups, computing the distribution of these groups within forensic mental health services or across services can provide estimates of potential intensity or duration of services that might be needed. This study suggests that distinguishing subgroups of forensic clients based on the sequence of onset of mental illness and criminal behavior and on the age of onset of criminal behavior may be useful to identify criminogenic needs and predict outcomes upon release. This updated framework can be useful for planning organization of services, understanding case mix, as well as patient flow in forensic services and flow of mentally disordered offenders in correctional services. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  3. Modeling of adsorption of toxic chromium on natural and surface modified lightweight expanded clay aggregate (LECA)

    Energy Technology Data Exchange (ETDEWEB)

    Kalhori, Ebrahim Mohammadi, E-mail: zarrabi62@yahoo.com [Department of Environmental Health Engineering, Faculty of Health, Alborz University of Medical Sciences, P.O. Box No: 31485/561, Alborz, Karaj (Iran, Islamic Republic of); Yetilmezsoy, Kaan, E-mail: yetilmez@yildiz.edu.tr [Department of Environmental Engineering, Faculty of Civil Engineering, Yildiz Technical University, 34220 Davutpasa, Esenler, Istanbul (Turkey); Uygur, Nihan, E-mail: uygur.n@gmail.com [Department of Environmental Engineering, Faculty of Engineering, Adiyaman University, 02040 Altinsehir, Adiyaman (Turkey); Zarrabi, Mansur, E-mail: mansor62@gmail.com [Department of Environmental Health Engineering, Faculty of Health, Alborz University of Medical Sciences, P.O. Box No: 31485/561, Alborz, Karaj (Iran, Islamic Republic of); Shmeis, Reham M. Abu, E-mail: r.abushmeis@yahoo.com [Department of Basic Pharmaceutical Sciences, Faculty of Pharmacy, Isra University, PO Box 140753, code 11814, Amman (Jordan)

    2013-12-15

    Lightweight Expanded Clay Aggregate (LECA) modified with an aqueous solution of magnesium chloride (MgCl₂) and hydrogen peroxide (H₂O₂) was used to remove Cr(VI) from aqueous solutions. The adsorption properties of the adsorbents were investigated through batch studies, Scanning Electron Microscopy (SEM), X-ray Diffraction (XRD), X-ray Fluorescence Spectroscopy (XRF), and Fourier Transform Infrared (FTIR) spectroscopy. The effect of magnesium chloride on the modification of the LECA surface was greater than that of the hydrogen peroxide solution and showed a substantial increase in the specific surface area, which reached 76.12 m²/g for magnesium chloride modified LECA, while values of 53.72 m²/g and 11.53 m²/g were found for hydrogen peroxide modified LECA and natural LECA, respectively. The extent of surface modification with enhanced porosity in modified LECA was apparent from the recorded SEM patterns. XRD and FTIR studies of the modified LECA surface did not show any structural distortion. The adsorption kinetics was found to follow the modified Freundlich kinetic model, and the equilibrium data fitted the Sips and Dubinin-Radushkevich equations better than other models. Maximum sorption capacities were found to be 198.39, 218.29 and 236.24 mg/g for natural LECA, LECA surface modified with H₂O₂ and LECA surface modified with MgCl₂, respectively. The adsorbents were found to have only a weak effect on the conductivity and turbidity of the aqueous solutions. Spent natural LECA and LECA surface modified with MgCl₂ were best regenerated with HCl solution, while LECA surface modified with H₂O₂ was best regenerated with concentrated HNO₃ solution. A thermal method showed a lower regeneration percentage for all spent adsorbents.
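
    For reference, one common parameterization of the Sips isotherm mentioned above is

        \[
        q_e = \frac{q_m \left(K_s C_e\right)^{n}}{1 + \left(K_s C_e\right)^{n}},
        \]

    where q_e is the equilibrium uptake, C_e the equilibrium concentration, q_m the maximum capacity, K_s the affinity constant and n a heterogeneity exponent; it reduces to the Langmuir isotherm for n = 1 and approaches Freundlich behavior at low concentrations. The fitted parameter values are not reported in the abstract.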

  4. A testing-coverage software reliability model considering fault removal efficiency and error generation

    National Research Council Canada - National Science Library

    Qiuying Li; Hoang Pham

    2017-01-01

    In this paper, we propose a software reliability model that considers not only error generation but also fault removal efficiency, combined with testing-coverage information, based on a nonhomogeneous Poisson process (NHPP).
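
    For context, NHPP-based reliability models describe the cumulative number of faults detected by time t through a mean value function m(t),

        \[
        \Pr\{N(t) = n\} = \frac{[m(t)]^{n}}{n!}\, e^{-m(t)},
        \qquad
        m(t) = a\left(1 - e^{-bt}\right),
        \]

    where the Goel-Okumoto form of m(t) shown here is only the classic special case (a is the expected total number of faults, b the detection rate); the model proposed in this paper augments the mean value function with testing-coverage and imperfect-debugging terms whose exact form is not given in the abstract.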

  5. Validation of mission critical software design and implementation using model checking

    Science.gov (United States)

    Pingree, P. J.; Mikk, E.; Holzmann, G.; Smith, M.; Dams, D.

    2002-01-01

    Model Checking conducts an exhaustive exploration of all possible behaviors of a software system design and as such can be used to detect defects in designs that are typically difficult to discover with conventional testing approaches.
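
    The exhaustive exploration referred to above can be pictured as a breadth-first search over the system's state graph, checking a property in every reachable state. The toy transition system below is invented for illustration.

        from collections import deque

        def successors(state):
            # Toy system: two counters that may independently count up to 3.
            x, y = state
            nxt = []
            if x < 3:
                nxt.append((x + 1, y))
            if y < 3:
                nxt.append((x, y + 1))
            return nxt

        def safe(state):
            return sum(state) <= 6        # safety property to verify

        frontier, seen = deque([(0, 0)]), {(0, 0)}
        while frontier:
            s = frontier.popleft()
            if not safe(s):
                print("counterexample:", s)
                break
            for t in successors(s):
                if t not in seen:
                    seen.add(t)
                    frontier.append(t)
        else:
            print("property holds in all", len(seen), "reachable states")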

  6. Model-Checking of Component-Based Event-Driven Real-Time Embedded Software

    National Research Council Canada - National Science Library

    Gu, Zonghua; Shin, Kang G

    2005-01-01

    .... We discuss application of model-checking to verify system-level concurrency properties of component-based real-time embedded software based on CORBA Event Service, using Avionics Mission Computing...

  7. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    Science.gov (United States)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
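
    The execution semantics underlying such Petri net-based process models fits in a few lines: a transition is enabled when every input place holds enough tokens, and firing consumes and produces tokens. The places and transitions below are hypothetical, not taken from the chapter.

        marking = {"received": 1, "checked": 0, "shipped": 0}
        transitions = {
            "check": ({"received": 1}, {"checked": 1}),   # (input places, output places)
            "ship":  ({"checked": 1},  {"shipped": 1}),
        }

        def enabled(name):
            pre, _ = transitions[name]
            return all(marking[p] >= n for p, n in pre.items())

        def fire(name):
            pre, post = transitions[name]
            assert enabled(name), f"{name} is not enabled"
            for p, n in pre.items():
                marking[p] -= n
            for p, n in post.items():
                marking[p] += n

        fire("check")
        fire("ship")
        print(marking)  # {'received': 0, 'checked': 0, 'shipped': 1}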

  8. Dynamic modeling and simulation of sheave damper based on AMESim software

    National Research Council Canada - National Science Library

    BI Ke; LI Xiang; TANG Zhiyin; OUYANG Bin; HE Haitao; WANG Qi; WU Gang

    2017-01-01

    This paper presents a sheave damper with variable damping according to piston displacement as a replacement for the traditional sheave damper; AMESim software is used for the modeling and simulation...

  9. Benchmarking of dynamic simulation predictions in two software platforms using an upper limb musculoskeletal model

    Science.gov (United States)

    Saul, Katherine R.; Hu, Xiao; Goehler, Craig M.; Vidt, Meghan E.; Daly, Melissa; Velisar, Anca; Murray, Wendy M.

    2014-01-01

    Several open-source or commercially available software platforms are widely used to develop dynamic simulations of movement. While computational approaches are conceptually similar across platforms, technical differences in implementation may influence output. We present a new upper limb dynamic model as a tool to evaluate potential differences in predictive behavior between platforms. We evaluated to what extent differences in technical implementations in popular simulation software environments result in differences in kinematic predictions for single and multijoint movements using EMG- and optimization-based approaches for deriving control signals. We illustrate the benchmarking comparison using SIMM-Dynamics Pipeline-SD/Fast and OpenSim platforms. The most substantial divergence results from differences in muscle model and actuator paths. This model is a valuable resource and is available for download by other researchers. The model, data, and simulation results presented here can be used by future researchers to benchmark other software platforms and software upgrades for these two platforms. PMID:24995410

  10. GIS based model interfacing : incorporating existing software and new techniques into a streamlined interface package

    Science.gov (United States)

    2000-01-01

    The ability to visualize data has grown immensely as the speed and functionality of Geographic Information Systems (GIS) have increased. Now, with modeling software and GIS, planners are able to view a prediction of the future traffic demands in their...

  11. Software Infrastructure to Enable Modeling & Simulation as a Service (M&SaaS) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This SBIR Phase 2 project will produce a software service infrastructure that enables most modeling and simulation (M&S) activities from code development and...

  12. The role of reliability graph models in assuring dependable operation of complex hardware/software systems

    Science.gov (United States)

    Patterson-Hine, F. A.; Davis, Gloria J.; Pedar, A.

    1991-01-01

    The complexity of computer systems currently being designed for critical applications in the scientific, commercial, and military arenas requires the development of new techniques for utilizing models of system behavior in order to assure 'ultra-dependability'. The complexity of these systems, such as Space Station Freedom and the Air Traffic Control System, stems from their highly integrated designs containing both hardware and software as critical components. Reliability graph models, such as fault trees and digraphs, are used frequently to model hardware systems. Their applicability for software systems has also been demonstrated for software safety analysis and the analysis of software fault tolerance. This paper discusses further uses of graph models in the design and implementation of fault management systems for safety critical applications.
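
    As a small quantitative illustration of such graph models, the sketch below evaluates a two-level fault tree by combining basic-event failure probabilities through AND/OR gates, assuming independent events; the system structure and numbers are hypothetical.

        from functools import reduce

        def and_gate(*p):   # all inputs must fail
            return reduce(lambda a, b: a * b, p)

        def or_gate(*p):    # any single failing input suffices
            return 1.0 - reduce(lambda a, b: a * (1.0 - b), p, 1.0)

        # Hypothetical system: hardware fails, or both software replicas fail.
        hw, sw1, sw2 = 1e-4, 1e-3, 1e-3
        print(f"top-event probability = {or_gate(hw, and_gate(sw1, sw2)):.3e}")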

  13. A reference model and technical framework for mobile social software for learning

    NARCIS (Netherlands)

    De Jong, Tim; Specht, Marcus; Koper, Rob

    2008-01-01

    De Jong, T., Specht, M., & Koper, R. (2008). A reference model and technical framework for mobile social software for learning. Presented at the IADIS m-learning 2008 Conference. April, 11-13, 2008, Carvoeiro, Portugal.

  14. An Approach for the Implementation of Software Quality Models Adpoting CERTICS and CMMI-DEV

    Directory of Open Access Journals (Sweden)

    GARCIA, F.W.

    2015-12-01

    This paper proposes a mapping between two software process and product quality models used in industry: the CERTICS national model and the CMMI-DEV international model. The stages of the mapping are presented step by step, as well as the mapping review, which had the cooperation of a specialist in the CERTICS and CMMI-DEV models. The mapping aims to correlate the structures of the two models in order to facilitate and reduce implementation time and costs, and to stimulate multi-model implementations in software development companies.

  15. Reliability modeling of digital RPS with consideration of undetected software faults

    Energy Technology Data Exchange (ETDEWEB)

    Khalaquzzaman, M.; Lee, Seung Jun; Jung, Won Dea [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Man Cheol [Chung Ang Univ., Seoul (Korea, Republic of)

    2013-10-15

    This paper provides an overview of different software reliability methodologies and proposes a technique for estimating the reliability of an RPS with consideration of undetected software faults. Software reliability analysis of safety-critical software has been challenging: despite a huge effort spent on developing a large number of software reliability models, no consensus has yet been reached on an appropriate modeling methodology. However, it is realized that the combined application of a BBN-based SDLC fault prediction method and random black-box testing of software would provide a better ground for reliability estimation of safety-critical software. Digitalization of the reactor protection systems of nuclear power plants was initiated several decades ago, and full digitalization has now been adopted in the new generation of NPPs around the world, because digital I&C systems have many better technical features, such as easier configurability and maintainability, than analog I&C systems. Digital I&C systems are also drift-free, and the incorporation of new features is much easier. Rules and regulations for the safe operation of NPPs are established and practiced by the operators as well as the regulators of NPPs to ensure safety. The failure mechanisms of hardware and analog systems are well understood, and the risk analysis methods for these components and systems are well established. However, the digitalization of I&C systems in NPPs introduces difficulties and uncertainty into the reliability analysis of the digital systems/components, because software failure mechanisms are still unclear.

  16. A software development and evolution model based on decision-making

    Science.gov (United States)

    Wild, J. Christian; Dong, Jinghuan; Maly, Kurt

    1991-01-01

    Design is a complex activity whose purpose is to construct an artifact that satisfies a set of constraints and requirements; however, the design process is not well understood. The software design and evolution process is the focus of interest here, and a three-dimensional software development space organized around a decision-making paradigm is presented. An initial instantiation of this model, called 3DPM_p, which was partly implemented, is presented, and the use of this model in software reuse and process management is discussed.

  17. A Novel Approach to Modeling Tunnel Junction Diodes Using Silvaco Atlas Software

    Science.gov (United States)

    2005-12-01

    Modeling Tunnel Junction Diodes using Silvaco Atlas Software: thesis by Robert J. Gelinas, December 2005 (Thesis Advisor: Sherif Michael). The work investigates the ability to model a tunnel junction device using the ATLAS device simulator by Silvaco International. The tunnel junction is a critical component of a...

  18. pyLIMA: An Open-source Package for Microlensing Modeling. I. Presentation of the Software and Analysis of Single-lens Models

    Science.gov (United States)

    Bachelet, E.; Norbury, M.; Bozza, V.; Street, R.

    2017-11-01

    Microlensing is a unique tool, capable of detecting the “cold” planets between ˜1 and 10 au from their host stars and even unbound “free-floating” planets. This regime has been poorly sampled to date owing to the limitations of alternative planet-finding methods, but a watershed in discoveries is anticipated in the near future thanks to the planned microlensing surveys of WFIRST-AFTA and Euclid's Extended Mission. Of the many challenges inherent in these missions, the modeling of microlensing events will be of primary importance, yet it is often time-consuming, complex, and perceived as a daunting barrier to participation in the field. The large scale of future survey data products will require thorough but efficient modeling software, but, unlike other areas of exoplanet research, microlensing currently lacks a publicly available, well-documented package to conduct this type of analysis. We present version 1.0 of the python Lightcurve Identification and Microlensing Analysis (pyLIMA). This software is written in Python and uses existing packages as much as possible to make it widely accessible. In this paper, we describe the overall architecture of the software and the core modules for modeling single-lens events. To verify the performance of this software, we use it to model both real data sets from events published in the literature and generated test data produced using pyLIMA's simulation module. The results demonstrate that pyLIMA is an efficient tool for microlensing modeling. We will expand pyLIMA to consider more complex phenomena in the following papers.
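
    For orientation, the single-lens model class analysed in this first paper is the standard point-source point-lens light curve. The sketch below evaluates the Paczynski (1986) magnification directly; it is independent of pyLIMA's own API.

        import numpy as np

        def magnification(t, t0, u0, tE):
            """Point-source point-lens magnification at times t (Paczynski 1986)."""
            u = np.sqrt(u0**2 + ((t - t0) / tE) ** 2)        # lens-source separation
            return (u**2 + 2.0) / (u * np.sqrt(u**2 + 4.0))

        t = np.linspace(-30.0, 30.0, 7)                      # days around the peak
        print(magnification(t, t0=0.0, u0=0.1, tE=20.0).round(2))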

  19. Functionality Evaluation of a Novel Smart Expandable Pedicle Screw to Mitigate Osteoporosis Effect in Bone Fixation: Modeling and Experimentation

    Directory of Open Access Journals (Sweden)

    Ahmadreza Eshghinejad

    2013-01-01

    This paper proposes a novel expandable-retractable pedicle screw and analyzes its functionality. A specially designed pedicle screw is described which has the ability to expand and retract using nitinol elements. The screw is designed to expand at body temperature and retract when the screw is cooled. This expansion-retraction function is verified in an experiment designed at larger scale using a nitinol antagonistic assembly. The results of this experiment are compared to the results of a finite element model developed in Abaqus in combination with a user material subroutine (UMAT); this code has been developed to analyze the nonlinear thermomechanical behavior of shape memory alloy materials. The functionality of the proposed screw is also evaluated with simulation and experimentation in a pullout test. The pullout force of a normal screw inserted in normal bone was simulated, and the result is compared with the results of the expandable screw in osteoporotic bone. Lastly, the strength of the designed pedicle screw in a foam block is also verified experimentally. The reported finite element simulations and experiments provide proof of concept for nitinol expandable-retractable elements on a pedicle screw and validate the functionality in a pullout test.

  20. Multi-criteria decision making approach for the selection of software effort estimation model

    Directory of Open Access Journals (Sweden)

    Ashu Bansal

    2017-06-01

    Software development with minimum effort has become a challenging task for software developers. Software effort may be defined as the prediction of the effort required to develop any software. Many software effort estimation models have been developed in the past, but it is observed that none of them can be applied successfully to all kinds of projects in different environments, which raises the problem of software effort estimation model selection. To select a suitable software effort estimation model, many conflicting selection criteria must be considered in the decision process. The present study emphasizes the development of a fuzzy multi-criteria decision-making approach by integrating Fuzzy Set Theory and Weighted Distance Based Approximation. To show the consistency of the proposed approach, a methodology validation is performed by comparison with existing methodologies, and a sensitivity analysis is performed to check the criticality of the selection criteria.
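
    A stripped-down version of the distance-based ranking idea (without the fuzzy layer the paper adds) might look as follows; the criteria, weights and scores are invented for illustration.

        import numpy as np

        # Rows: candidate effort models; columns: accuracy, simplicity, data needs.
        scores = np.array([[0.9, 0.4, 0.5],
                           [0.7, 0.8, 0.6],
                           [0.6, 0.9, 0.9]])
        weights = np.array([0.5, 0.2, 0.3])
        ideal = scores.max(axis=0)                   # best value per criterion

        # Weighted Euclidean distance to the ideal point; smaller is better.
        dist = np.sqrt((weights * (scores - ideal) ** 2).sum(axis=1))
        print("ranking (best first):", np.argsort(dist))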

  1. RT 24 - Architecture, Modeling & Simulation, and Software Design

    Science.gov (United States)

    2010-11-01

    Focus on tool extensions (UPDM, SysML, SoaML, BPMN); leverage "best of breed" architecture methodologies; provide tooling to support the methodology. Examples cover mapping the BPMN metamodel to the DoDAF 2.0 metamodel and mapping SysML diagrams (e.g., Requirement diagrams) to DoDAF 2.0 models such as OV-2.

  2. Mathematical Modeling of Thermofrictional Milling Process Using ANSYS WB Software

    National Research Council Canada - National Science Library

    K.T. Sherov; M.R. Sikhimbayev; A.K. Sherov; B.S. Donenbayev; A.K. Rakishev; A.B. Mazdubai; M.M. Musayev; A.M. Abeuova

    2017-01-01

    This article presents ANSYS WB-based mathematical modelling of the thermofrictional milling process, which allows studying the dynamics of the thermal and physical processes occurring during processing.

  3. Software engineering the mixed model for genome-wide association studies on large samples.

    Science.gov (United States)

    Zhang, Zhiwu; Buckler, Edward S; Casstevens, Terry M; Bradbury, Peter J

    2009-11-01

    Mixed models improve the ability to detect phenotype-genotype associations in the presence of population stratification and multiple levels of relatedness in genome-wide association studies (GWAS), but for large data sets the resource consumption becomes impractical. At the same time, the sample size and number of markers used for GWAS is increasing dramatically, resulting in greater statistical power to detect those associations. The use of mixed models with increasingly large data sets depends on the availability of software for analyzing those models. While multiple software packages implement the mixed model method, no single package provides the best combination of fast computation, ability to handle large samples, flexible modeling and ease of use. Key elements of association analysis with mixed models are reviewed, including modeling phenotype-genotype associations using mixed models, population stratification, kinship and its estimation, variance component estimation, use of best linear unbiased predictors or residuals in place of raw phenotype, improving efficiency and software-user interaction. The available software packages are evaluated, and suggestions made for future software development.
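
    The mixed model reviewed here is commonly written as

        \[
        \mathbf{y} = X\boldsymbol{\beta} + Z\mathbf{u} + \mathbf{e},
        \qquad
        \mathbf{u} \sim N(\mathbf{0},\, 2K\sigma_{g}^{2}),
        \quad
        \mathbf{e} \sim N(\mathbf{0},\, I\sigma_{e}^{2}),
        \]

    where X holds the fixed effects (tested markers and population-structure covariates), K is the kinship matrix capturing relatedness, and the variance components are estimated, for example, by restricted maximum likelihood. This is the standard form; individual packages differ in parameterization and in the efficiency tricks discussed above.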

  4. Flexible software process lines in practice: A metamodel-based approach to effectively construct and manage families of software process models

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Ternité, Thomas; Friedrich, Jan

    2017-01-01

    Process flexibility and adaptability is frequently discussed, and several proposals aim to improve software processes for a given organization-/project context. A software process line (SPrL) is an instrument to systematically construct and manage variable software processes, by combining pre-defined and standardized process assets that can be reused, modified, and extended using a well-defined customization approach. Hence, process engineers can ground context-specific process variants in a standardized or domain-specific reference model that can be adapted to the respective context. We present an approach: A metamodel-based approach to effectively construct and manage families of software process models [Ku16]. This paper was published as an original research article in the Journal of Systems and Software.

  5. Software Reuse of Mobile Systems based on Modelling

    Directory of Open Access Journals (Sweden)

    Guo Ping

    2016-01-01

    This paper presents an architectural-style-based modelling approach for the architectural design and analysis of mobile systems. The approach is developed based on UML-like meta-models and graph transformation techniques to support sound methodological principles, formal analysis and refinement. The approach can support mobile system development.

  6. Benchmark Dose Software Development and Maintenance Ten Berge Cxt Models

    Science.gov (United States)

    This report is intended to provide an overview of beta version 1.0 of the implementation of a concentration-time (CxT) model originally programmed and provided by Wil ten Berge (referred to hereafter as the ten Berge model). The recoding and development described here represent ...

  7. Software Design Modelling with Functional Petri Nets | Bakpo ...

    African Journals Online (AJOL)

    Petri Nets use two basic primitives: events and conditions to view or model a system. Events are the actions that take place in the system. The occurrence of events is controlled by the "state" of the system, which can be described as a set of conditions. An immediate application of such a model is in the control structures of ...

  8. Modelling the X-ray powder diffraction of nitrogen-expanded austenite using the Debye formula

    DEFF Research Database (Denmark)

    Oddershede, Jette; Christiansen, Thomas; Ståhl, Kenny

    2008-01-01

    Stress-free and homogeneous samples of nitrogen-expanded austenite, a defect-rich f.c.c. structure with a high interstitial nitrogen occupancy (between 0.36 and 0.61), have been studied using X-ray powder diffraction and Debye simulations. The simulations confirm the presence of deformation stacking faults...
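
    The Debye formula used for such simulations sums the scattered intensity over all atom pairs in a model cluster,

        \[
        I(Q) = \sum_{i}\sum_{j} f_{i}(Q)\, f_{j}(Q)\, \frac{\sin(Q r_{ij})}{Q r_{ij}},
        \]

    where r_ij is the distance between atoms i and j and f_i(Q) are the atomic scattering factors. Because it assumes no periodicity, it can accommodate defect-rich structures such as expanded austenite; the specific cluster models used in the paper are not detailed in the abstract.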

  9. Harmonic Domain Modeling of a Distribution System Using the DIgSILENT PowerFactory Software

    DEFF Research Database (Denmark)

    Wasilewski, J.; Wiechowski, Wojciech Tomasz; Bak, Claus Leth

    The first part of this paper presents a comparison between two models of a distribution system created in the computer simulation software PowerFactory (PF). Model A is an existing simplified equivalent model of the distribution system used by the Transmission System Operator (TSO) Eltra for balanced load...

  10. Design and Use of CSP Meta-Model for Embedded Control Software Development

    NARCIS (Netherlands)

    Bezemer, M.M.; Wilterdink, R.J.W.; Broenink, Johannes F.; Welch, Peter H.; Barnes, Frederick R.M.; Chalmers, Kevin; Baekgaard Pedersen, Jan; Sampson, Adam T.

    Software that is used to control machines and robots must be predictable and reliable. Model-Driven Design (MDD) techniques are used to comply with both the technical and business needs. This paper introduces a CSP meta-model that is suitable for these MDD techniques. The meta-model describes the

  11. Statistical analysis of probabilistic models of software product lines with quantitative constraints

    DEFF Research Database (Denmark)

    Beek, M.H. ter; Legay, A.; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking for the analysis of probabilistic models of software product lines with complex quantitative constraints and advanced feature installation options. Such models are specified in the feature-oriented language QFLan, a rich process algebra...
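
    The statistical model checking idea can be sketched as Monte Carlo estimation: sample random runs of the probabilistic model and estimate the probability that a property holds, together with a confidence interval. The toy model below is invented and unrelated to QFLan.

        import random

        def run_satisfies_property():
            # Toy probabilistic configuration: three feature installations,
            # each succeeding with probability 0.9; property = all succeed.
            return all(random.random() < 0.9 for _ in range(3))

        n = 10_000
        p = sum(run_satisfies_property() for _ in range(n)) / n
        half = 1.96 * (p * (1.0 - p) / n) ** 0.5   # normal-approximation 95% CI
        print(f"P(property) ~ {p:.3f} +/- {half:.3f}")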

  12. BioModels.net Web Services, a free and integrated toolkit for computational modelling software.

    Science.gov (United States)

    Li, Chen; Courtot, Mélanie; Le Novère, Nicolas; Laibe, Camille

    2010-05-01

    Exchanging and sharing scientific results are essential for researchers in the field of computational modelling. BioModels.net defines agreed-upon standards for model curation. A fundamental one, MIRIAM (Minimum Information Requested in the Annotation of Models), standardises the annotation and curation process of quantitative models in biology. To support this standard, MIRIAM Resources maintains a set of standard data types for annotating models, and provides services for manipulating these annotations. Furthermore, BioModels.net creates controlled vocabularies, such as SBO (Systems Biology Ontology) which strictly indexes, defines and links terms used in Systems Biology. Finally, BioModels Database provides a free, centralised, publicly accessible database for storing, searching and retrieving curated and annotated computational models. Each resource provides a web interface to submit, search, retrieve and display its data. In addition, the BioModels.net team provides a set of Web Services which allows the community to programmatically access the resources. A user is then able to perform remote queries, such as retrieving a model and resolving all its MIRIAM Annotations, as well as getting the details about the associated SBO terms. These web services use established standards. Communications rely on SOAP (Simple Object Access Protocol) messages and the available queries are described in a WSDL (Web Services Description Language) file. Several libraries are provided in order to simplify the development of client software. BioModels.net Web Services make one step further for the researchers to simulate and understand the entirety of a biological system, by allowing them to retrieve biological models in their own tool, combine queries in workflows and efficiently analyse models.

  13. The OPAL Project: Open source Procedure for Assessment of Loss using Global Earthquake Modelling software

    Science.gov (United States)

    Daniell, James

    2010-05-01

    This paper provides a comparison between Earthquake Loss Estimation (ELE) software packages and their application using an "Open Source Procedure for Assessment of Loss using Global Earthquake Modelling software" (OPAL). The OPAL procedure has been developed to provide a framework for optimisation of a Global Earthquake Modelling process through: 1) Overview of current and new components of earthquake loss assessment (vulnerability, hazard, exposure, specific cost and technology); 2) Preliminary research, acquisition and familiarisation with all available ELE software packages; 3) Assessment of these 30+ software packages in order to identify the advantages and disadvantages of the ELE methods used; and 4) Loss analysis for a deterministic earthquake (Mw7.2) for the Zeytinburnu district, Istanbul, Turkey, by applying 3 software packages (2 new and 1 existing): a modified displacement-based method based on DBELA (Displacement Based Earthquake Loss Assessment), a capacity spectrum based method HAZUS (HAZards United States) and the Norwegian HAZUS-based SELENA (SEismic Loss EstimatioN using a logic tree Approach) software which was adapted for use in order to compare the different processes needed for the production of damage, economic and social loss estimates. The modified DBELA procedure was found to be more computationally expensive, yet had less variability, indicating the need for multi-tier approaches to global earthquake loss estimation. Similar systems planning and ELE software produced through the OPAL procedure can be applied to worldwide applications, given exposure data. Keywords: OPAL, displacement-based, DBELA, earthquake loss estimation, earthquake loss assessment, open source, HAZUS

  14. Open Source Procedure for Assessment of Loss using Global Earthquake Modelling software (OPAL)

    Science.gov (United States)

    Daniell, J. E.

    2011-07-01

    This paper provides a comparison between Earthquake Loss Estimation (ELE) software packages and their application using an "Open Source Procedure for Assessment of Loss using Global Earthquake Modelling software" (OPAL). The OPAL procedure was created to provide a framework for optimisation of a Global Earthquake Modelling process through: 1. overview of current and new components of earthquake loss assessment (vulnerability, hazard, exposure, specific cost, and technology); 2. preliminary research, acquisition, and familiarisation for available ELE software packages; 3. assessment of these software packages in order to identify the advantages and disadvantages of the ELE methods used; and 4. loss analysis for a deterministic earthquake (Mw = 7.2) for the Zeytinburnu district, Istanbul, Turkey, by applying 3 software packages (2 new and 1 existing): a modified displacement-based method based on DBELA (Displacement Based Earthquake Loss Assessment, Crowley et al., 2006), a capacity spectrum based method HAZUS (HAZards United States, FEMA, USA, 2003) and the Norwegian HAZUS-based SELENA (SEismic Loss EstimatioN using a logic tree Approach, Lindholm et al., 2007) software which was adapted for use in order to compare the different processes needed for the production of damage, economic, and social loss estimates. The modified DBELA procedure was found to be more computationally expensive, yet had less variability, indicating the need for multi-tier approaches to global earthquake loss estimation. Similar systems planning and ELE software produced through the OPAL procedure can be applied to worldwide applications, given exposure data.

  15. Control-Theoretic Decision Support for Mitigation of Modeled Software Project Cost Overruns

    OpenAIRE

    Miller, Scott David

    2013-01-01

    Despite sixty years of practice, the production of software remains an endeavor that is difficult to manage according to a schedule. Control theory studies the ability to influence the dynamical behavior of systems to achieve desired behaviors or eliminate undesired behaviors. In this work, the management problem of software project schedule adherence is re-cast as a problem in control theory. Below, a modeling framework is proposed for capturing the constraints and dependencies found in t...

  16. Computational Software for Fitting Seismic Data to Epidemic-Type Aftershock Sequence Models

    Science.gov (United States)

    Chu, A.

    2014-12-01

    Modern earthquake catalogs are often analyzed using spatial-temporal point process models such as the epidemic-type aftershock sequence (ETAS) models of Ogata (1998). My work introduces software to implement two of ETAS models described in Ogata (1998). To find the Maximum-Likelihood Estimates (MLEs), my software provides estimates of the homogeneous background rate parameter and the temporal and spatial parameters that govern triggering effects by applying the Expectation-Maximization (EM) algorithm introduced in Veen and Schoenberg (2008). Despite other computer programs exist for similar data modeling purpose, using EM-algorithm has the benefits of stability and robustness (Veen and Schoenberg, 2008). Spatial shapes that are very long and narrow cause difficulties in optimization convergence and problems with flat or multi-modal log-likelihood functions encounter similar issues. My program uses a robust method to preset a parameter to overcome the non-convergence computational issue. In addition to model fitting, the software is equipped with useful tools for examining modeling fitting results, for example, visualization of estimated conditional intensity, and estimation of expected number of triggered aftershocks. A simulation generator is also given with flexible spatial shapes that may be defined by the user. This open-source software has a very simple user interface. The user may execute it on a local computer, and the program also has potential to be hosted online. Java language is used for the software's core computing part and an optional interface to the statistical package R is provided.
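
    To make the model structure concrete, the following minimal sketch evaluates an ETAS-style conditional intensity in the general form of Ogata (1998): a background rate plus magnitude-scaled Omori-law triggering with a Gaussian spatial kernel. The kernels and parameter values are illustrative assumptions, not the fitted model of the software described above.

```python
# Sketch of an ETAS conditional intensity lambda(t, x, y), following the
# general form in Ogata (1998). Kernels and parameter values are
# illustrative assumptions only.
import numpy as np

# Illustrative parameters: background rate mu, productivity K and alpha,
# Omori-law c and p, Gaussian spatial bandwidth d, reference magnitude m0.
mu, K, alpha, c, p, d, m0 = 0.1, 0.05, 1.0, 0.01, 1.2, 0.5, 3.0

def etas_intensity(t, x, y, events):
    """events: array of rows (t_i, x_i, y_i, m_i); only t_i < t contribute."""
    ti, xi, yi, mi = events.T
    past = ti < t
    ti, xi, yi, mi = ti[past], xi[past], yi[past], mi[past]
    productivity = K * np.exp(alpha * (mi - m0))              # magnitude scaling
    temporal = (p - 1.0) / c * (1.0 + (t - ti) / c) ** (-p)   # Omori decay
    r2 = (x - xi) ** 2 + (y - yi) ** 2
    spatial = np.exp(-r2 / (2 * d ** 2)) / (2 * np.pi * d ** 2)  # Gaussian kernel
    return mu + np.sum(productivity * temporal * spatial)

events = np.array([[0.0, 0.0, 0.0, 5.0], [1.0, 0.2, -0.1, 4.2]])  # synthetic
print(etas_intensity(2.0, 0.0, 0.0, events))
```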

  17. Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing

    Energy Technology Data Exchange (ETDEWEB)

    Price, Phillip N.; Granderson, Jessica; Sohn, Michael; Addy, Nathan; Jump, David

    2013-09-01

    The overarching goal of this work is to advance the capabilities of technology evaluators in evaluating the building-level baseline modeling capabilities of Energy Management and Information System (EMIS) software. Through their customer engagement platforms and products, EMIS software products have the potential to produce whole-building energy savings through multiple strategies: building system operation improvements, equipment efficiency upgrades and replacements, and inducement of behavioral change among the occupants and operations personnel. Some offerings may also automate the quantification of whole-building energy savings, relative to a baseline period, using empirical models that relate energy consumption to key influencing parameters, such as ambient weather conditions and building operation schedule. These automated baseline models can be used to streamline the whole-building measurement and verification (M&V) process, and therefore are of critical importance in the context of multi-measure whole-building focused utility efficiency programs. This report documents the findings of a study that was conducted to begin answering critical questions regarding quantification of savings at the whole-building level, and the use of automated and commercial software tools. To evaluate the modeling capabilities of EMIS software particular to the use case of whole-building savings estimation, four research questions were addressed: 1. What is a general methodology that can be used to evaluate baseline model performance, both in terms of a) overall robustness, and b) relative to other models? 2. How can that general methodology be applied to evaluate proprietary models that are embedded in commercial EMIS tools? How might one handle practical issues associated with data security, intellectual property, appropriate testing ‘blinds’, and large data sets? 3. How can buildings be pre-screened to identify those that are the most model-predictable, and therefore those
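
    As context for what a whole-building baseline model can look like, the sketch below fits a generic change-point regression of load against outdoor temperature (base load plus a cooling-slope term above a balance point). This is a common illustrative form only, not one of the proprietary models evaluated in the report.

```python
# Minimal sketch of a weather-based whole-building baseline model:
# energy = base load + cooling slope * max(T - balance point, 0).
# A generic change-point regression used for illustration, not a model
# from the report.
import numpy as np

def fit_change_point(temps, loads, candidate_points):
    """Grid-search the balance point; fit intercept and slope by least squares."""
    best = None
    for bp in candidate_points:
        X = np.column_stack([np.ones_like(temps),
                             np.maximum(temps - bp, 0.0)])  # cooling-degree term
        coef, *_ = np.linalg.lstsq(X, loads, rcond=None)
        sse = np.sum((loads - X @ coef) ** 2)
        if best is None or sse < best[0]:
            best = (sse, bp, coef)
    return best  # (sse, balance point, [base load, cooling slope])

temps = np.array([10.0, 15.0, 20.0, 25.0, 30.0])   # synthetic daily temperatures
loads = np.array([100.0, 101.0, 103.0, 120.0, 141.0])  # synthetic daily loads
print(fit_change_point(temps, loads, candidate_points=np.arange(12, 28, 1.0)))
```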

  18. From project-oriented to service-oriented software development: an industrial experience guided by a service reference model

    National Research Council Canada - National Science Library

    Kalinowski, Marcos; Biffl, Stefan; Spínola, Rodrigo Oliveira; Reinehr, Sheila

    2014-01-01

    ...-neutral project-oriented software deliveries.This article reports on the industrial experience of restructuring the supplier-side software development process into a value-based service-oriented format, guided by a service reference model...

  19. Multi-physics fluid-structure interaction modelling software

    CSIR Research Space (South Africa)

    Malan, AG

    2008-11-01

    Full Text Available The CSIR researchers developed new ground-breaking software modelling technologies to be used in the design of safe and efficient next-generation aircraft. The field of fluid-structure interaction (FSI) covers a massive range of engineering problems...

  20. Development of a plug-in for Variability Modeling in Software Product Lines

    Directory of Open Access Journals (Sweden)

    María Lucía López-Araujo

    2012-03-01

    Full Text Available Software Product Lines (SPL) take economic advantage of commonality and variability among a set of software systems that exist within a specific domain. Therefore, Software Product Line Engineering defines a series of processes for the development of a SPL that consider commonality and variability during the software life cycle. Variability modeling is therefore an essential activity in a Software Product Line Engineering approach. There are several techniques for variability modeling nowadays. COVAMOF stands out among them since it allows the modeling of variation points, variants and dependencies as first-class elements. COVAMOF, therefore, provides a uniform manner for representing such concepts in different levels of abstraction within a SPL. In order to take advantage of COVAMOF benefits, it is necessary to have a computer-aided tool, otherwise variability modeling and management may become an arduous task for the software engineer. This work presents the development of a COVAMOF plug-in for Eclipse.

  1. A testing-coverage software reliability model considering fault removal efficiency and error generation.

    Directory of Open Access Journals (Sweden)

    Qiuying Li

    Full Text Available In this paper, we propose a software reliability model that considers not only error generation but also fault removal efficiency combined with testing coverage information based on a nonhomogeneous Poisson process (NHPP). During the past four decades, many software reliability growth models (SRGMs) based on NHPP have been proposed to estimate the software reliability measures, most of which have the same following agreements: 1) it is a common phenomenon that during the testing phase, the fault detection rate always changes; 2) as a result of imperfect debugging, fault removal has been related to a fault re-introduction rate. But there are few SRGMs in the literature that differentiate between fault detection and fault removal, i.e. they seldom consider the imperfect fault removal efficiency. But in practical software developing process, fault removal efficiency cannot always be perfect, i.e. the failures detected might not be removed completely and the original faults might still exist and new faults might be introduced meanwhile, which is referred to as imperfect debugging phenomenon. In this study, a model aiming to incorporate fault introduction rate, fault removal efficiency and testing coverage into software reliability evaluation is developed, using testing coverage to express the fault detection rate and using fault removal efficiency to consider the fault repair. We compare the performance of the proposed model with several existing NHPP SRGMs using three sets of real failure data based on five criteria. The results exhibit that the model can give a better fitting and predictive performance.

  2. A testing-coverage software reliability model considering fault removal efficiency and error generation.

    Science.gov (United States)

    Li, Qiuying; Pham, Hoang

    2017-01-01

    In this paper, we propose a software reliability model that considers not only error generation but also fault removal efficiency combined with testing coverage information based on a nonhomogeneous Poisson process (NHPP). During the past four decades, many software reliability growth models (SRGMs) based on NHPP have been proposed to estimate the software reliability measures, most of which have the same following agreements: 1) it is a common phenomenon that during the testing phase, the fault detection rate always changes; 2) as a result of imperfect debugging, fault removal has been related to a fault re-introduction rate. But there are few SRGMs in the literature that differentiate between fault detection and fault removal, i.e. they seldom consider the imperfect fault removal efficiency. But in practical software developing process, fault removal efficiency cannot always be perfect, i.e. the failures detected might not be removed completely and the original faults might still exist and new faults might be introduced meanwhile, which is referred to as imperfect debugging phenomenon. In this study, a model aiming to incorporate fault introduction rate, fault removal efficiency and testing coverage into software reliability evaluation is developed, using testing coverage to express the fault detection rate and using fault removal efficiency to consider the fault repair. We compare the performance of the proposed model with several existing NHPP SRGMs using three sets of real failure data based on five criteria. The results exhibit that the model can give a better fitting and predictive performance.
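
    For readers unfamiliar with NHPP SRGMs, the sketch below fits the classic Goel-Okumoto mean value function m(t) = a(1 - e^(-bt)) to cumulative fault counts. It is a generic stand-in for illustration; the proposed model additionally couples testing coverage, fault removal efficiency and error generation, which are not reproduced here.

```python
# Sketch of fitting a basic NHPP software reliability growth model.
# The Goel-Okumoto mean value function is a generic stand-in, not the
# paper's coverage-based model. Data below are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def mean_value(t, a, b):
    """Expected cumulative number of detected faults by testing time t."""
    return a * (1.0 - np.exp(-b * t))

t = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 32.0])            # testing time
faults = np.array([12.0, 21.0, 34.0, 49.0, 60.0, 66.0])   # cumulative faults
(a_hat, b_hat), _ = curve_fit(mean_value, t, faults, p0=(80.0, 0.1))
print(f"estimated total faults a={a_hat:.1f}, detection rate b={b_hat:.3f}")
```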

  3. Task Models and System Models as a Bridge Between HCI and Software Engineering

    Science.gov (United States)

    Navarre, David; Palanque, Philippe; Winckler, Marco

    This chapter claims that task models per se do not contain sufficient and necessary information to permit automatic generation of interactive systems. Beyond this, we claim that they must not contain sufficient and necessary information, otherwise they could no longer be considered task models. Instead, we propose a way of exploiting task models synergistically with other models built during the development process. This chapter presents a set of tools supporting the development of interactive systems using two different notations. One of these notations, called ConcurTaskTree (CTT), is used for task modeling. The other notation, called Interactive Cooperative Objects (ICO), is used for system modeling. Even though these two kinds of models represent two different views of the same world (a user interacting with an interactive system), they are built by different people (human factors specialists for the task models and software engineers for the system models) and are used independently. The aim of this chapter is to propose the use of scenarios as a bridge between these two views. On the task modeling side, scenarios are seen as a possible trace of user’s activity. On the system side, scenarios are seen as a trace of user’s actions. This generic approach is presented on a case study in the domain of Air Traffic Control. As both CTT and ICO notations are tool supported (the environments are respectively CTTE and PetShop), an integration tool based on this notion of scenarios is presented. Its use on the selected case study is also presented in detail.

  4. Development of a new model to predict indoor daylighting: Integration in CODYRUN software and validation

    Energy Technology Data Exchange (ETDEWEB)

    Fakra, A.H., E-mail: fakra@univ-reunion.f [Physics and Mathematical Engineering Laboratory for Energy and Environment (PIMENT), University of La Reunion, 117 rue du General Ailleret, 97430 Le Tampon (French Overseas Dpt.), Reunion (France); Miranville, F.; Boyer, H.; Guichard, S. [Physics and Mathematical Engineering Laboratory for Energy and Environment (PIMENT), University of La Reunion, 117 rue du General Ailleret, 97430 Le Tampon (French Overseas Dpt.), Reunion (France)

    2011-07-15

    Research highlights: This study presents a new model capable of simulating indoor daylighting. The model was introduced into the research software CODYRUN. The validation of the code was carried out on a large set of test cases. -- Abstract: Many models exist in the scientific literature for determining indoor daylighting values. They are classified in three categories: numerical, simplified and empirical models. Nevertheless, none of these categories of models is convenient for every application. Indeed, the numerical models require long calculation times; the conditions of use of the simplified models are limited; and experimental models need not only substantial financial resources but also perfect control of the experimental devices (e.g. scale models), as well as of the climatic characteristics of the location (e.g. in situ experiments). In this article, a new model based on a combination of multiple simplified models is established. The objective is to improve this category of model. The originality of our paper lies in the coupling of several simplified models for indoor daylighting calculations. The accuracy of the simulation code, introduced into the CODYRUN software to correctly simulate indoor illuminance, is then verified. The software is a numerical building simulation code developed in the Physics and Mathematical Engineering Laboratory for Energy and Environment (PIMENT) at the University of Reunion. Initially dedicated to thermal, airflow and hydrous phenomena in buildings, the software has been extended with the calculation of indoor daylighting. New models and algorithms - which rely on a semi-detailed approach - will be presented in this paper. In order to validate the accuracy of the integrated models, many test cases have been considered: analytical tests, inter-software comparisons and experimental comparisons. In order to prove the accuracy of the new model - which can properly simulate the illuminance - a

  5. Optical Thin Film Modeling: Using FTG's FilmStar Software

    Science.gov (United States)

    Freese, Scott

    2009-01-01

    Every material has basic optical properties that define its interaction with light: the index of refraction (n) and extinction coefficient (k) vary for the material as a function of the wavelength of the incident light. Also significant are the phase velocity and polarization of the incident light. These inherent properties allow for the accurate modeling of light's behavior upon contact with a surface: Reflectance, Transmittance, Absorptance.

  6. Mathematical Modeling of Thermofrictional Milling Process Using ANSYS WB Software

    Science.gov (United States)

    Sherov, K. T.; Sikhimbayev, M. R.; Sherov, A. K.; Donenbayev, B. S.; Rakishev, A. K.; Mazdubai, A. B.; Musayev, M. M.; Abeuova, A. M.

    2017-06-01

    This article presents ANSYS WB-based mathematical modelling of the thermofrictional milling process, which allowed studying the dynamics of the thermal and physical processes occurring during machining. The technique used also allows determination of the optimal cutting conditions of thermofrictional milling for processing various materials, in particular steels 40CN2MA, 30CGSA, 45 and 3sp. In our study, from among a number of existing models of cutting fracture, we chose the criterion first proposed by prof. V. L. Kolmogorov. In order to increase calculation performance, a mathematical model was proposed that used only two objects: a parallelepiped-shaped workpiece and a cutting insert in the form of a pentagonal prism. In addition, the work takes into account the friction coefficient between the cutting insert and the workpiece, taken equal to 0.4. To determine the temperature in the subcontact layer of the workpiece, we introduced the coordinates of nine characteristic points with the same interval in the local coordinate system. As a result, the temperature values were obtained for different materials at the studied points during the cutter speed change. The research results showed the possibility of controlling thermal processes during processing by choosing the optimum cutting modes.

  7. Expanding wave solutions of the Einstein equations that induce an anomalous acceleration into the Standard Model of Cosmology.

    Science.gov (United States)

    Temple, Blake; Smoller, Joel

    2009-08-25

    We derive a system of three coupled equations that implicitly defines a continuous one-parameter family of expanding wave solutions of the Einstein equations, such that the Friedmann universe associated with the pure radiation phase of the Standard Model of Cosmology is embedded as a single point in this family. By approximating solutions near the center to leading order in the Hubble length, the family reduces to an explicit one-parameter family of expanding spacetimes, given in closed form, that represents a perturbation of the Standard Model. By introducing a comoving coordinate system, we calculate the correction to the Hubble constant as well as the exact leading order quadratic correction to the redshift vs. luminosity relation for an observer at the center. The correction to redshift vs. luminosity entails an adjustable free parameter that introduces an anomalous acceleration. We conclude (by continuity) that corrections to the redshift vs. luminosity relation observed after the radiation phase of the Big Bang can be accounted for, at the leading order quadratic level, by adjustment of this free parameter. The next order correction is then a prediction. Since nonlinearities alone could actuate dissipation and decay in the conservation laws associated with the highly nonlinear radiation phase and since noninteracting expanding waves represent possible time-asymptotic wave patterns that could result, we propose to further investigate the possibility that these corrections to the Standard Model might be the source of the anomalous acceleration of the galaxies, an explanation not requiring the cosmological constant or dark energy.

  8. Computer modeling and software development for unsteady chemical technological systems

    Directory of Open Access Journals (Sweden)

    Dolganov Igor

    2016-01-01

    Full Text Available The paper deals with mathematical modeling in transient conditions to create a computer system that can reflect the behavior of real industrial plants. Such systems can respond to complex and pressing questions about the stability of industrial facilities and the time spent on transients passing through unstable regimes. In addition, such systems have a kind of intelligence and predictive ability, as they solve systems of partial integro-differential equations based on the physical and chemical nature of the processes occurring in the devices of technological systems.

  9. Open-source Software for Exoplanet Atmospheric Modeling

    Science.gov (United States)

    Cubillos, Patricio; Blecic, Jasmina; Harrington, Joseph

    2018-01-01

    I will present a suite of self-standing open-source tools to model and retrieve exoplanet spectra implemented for Python. These include: (1) a Bayesian-statistical package to run Levenberg-Marquardt optimization and Markov-chain Monte Carlo posterior sampling, (2) a package to compress line-transition data from HITRAN or Exomol without loss of information, (3) a package to compute partition functions for HITRAN molecules, (4) a package to compute collision-induced absorption, and (5) a package to produce radiative-transfer spectra of transit and eclipse exoplanet observations and atmospheric retrievals.

  10. Algorithms and Software for Predictive and Perceptual Modeling of Speech

    CERN Document Server

    Atti, Venkatraman

    2010-01-01

    From the early pulse code modulation-based coders to some of the recent multi-rate wideband speech coding standards, the area of speech coding has made several significant strides with the objective of attaining high quality of speech at the lowest possible bit rate. This book presents some of the recent advances in linear prediction (LP)-based speech analysis that employ perceptual models for narrow- and wide-band speech coding. The LP analysis-synthesis framework has been successful for speech coding because it fits well the source-system paradigm for speech synthesis. Limitations associated with th
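
    The core of the LP-based analysis mentioned above is the estimation of predictor coefficients from a windowed speech frame. A minimal sketch, assuming the standard autocorrelation method with the Levinson-Durbin recursion (illustrative, not code from the book):

```python
# Sketch of linear prediction (LP) analysis: estimate LP coefficients
# from the autocorrelation sequence via the Levinson-Durbin recursion.
import numpy as np

def levinson_durbin(r, order):
    """Solve the Toeplitz normal equations for LP coefficients a (a[0] = 1)."""
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        k = -np.dot(a[:i], r[i:0:-1]) / err     # reflection coefficient
        a[1:i + 1] = a[1:i + 1] + k * a[:i][::-1]
        err *= (1.0 - k * k)                    # prediction-error update
    return a, err

def lpc(frame, order=10):
    """LP coefficients of one windowed speech frame (autocorrelation method)."""
    r = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    return levinson_durbin(r[:order + 1], order)

frame = np.sin(0.1 * np.arange(240)) * np.hamming(240)  # synthetic frame
a, err = lpc(frame)
print(a, err)
```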

  11. APPLYING TEACHING-LEARNING TO ARTIFICIAL BEE COLONY FOR PARAMETER OPTIMIZATION OF SOFTWARE EFFORT ESTIMATION MODEL

    Directory of Open Access Journals (Sweden)

    THANH TUNG KHUAT

    2017-05-01

    Full Text Available Artificial Bee Colony, inspired by the foraging behaviour of honey bees, is a novel meta-heuristic optimization algorithm in the community of swarm intelligence algorithms. Nevertheless, it still falls short in convergence speed and solution quality. This paper proposes an approach to tackle these downsides by combining the positive aspects of Teaching-Learning-Based Optimization and Artificial Bee Colony. The performance of the proposed method is assessed on the software effort estimation problem, a complex and important issue in project management. Software developers often carry out effort estimation in the early stages of the software development life cycle to derive the required cost and schedule for a project. There are a large number of methods for effort estimation, among which COCOMO II is one of the most widely used models. However, this model has some restrictions because its parameters have not been optimized yet. In this work, therefore, we present an approach to overcome this limitation of the COCOMO II model. The experiments have been conducted on the NASA software project dataset and the obtained results indicated that the improved parameters provided better estimation capabilities compared to the original COCOMO II model.
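
    For reference, the COCOMO II post-architecture effort equation whose constants such approaches tune is PM = A * Size^E * prod(EM_i), with E = B + 0.01 * sum(SF_j). The sketch below evaluates it with the published nominal calibration; treat the constants as illustrative values that an optimizer would replace.

```python
# Sketch of the COCOMO II post-architecture effort equation. A and B are
# the constants being optimized; 2.94 and 0.91 are the published nominal
# calibration, used here for illustration only.
from math import prod

def cocomo_ii_effort(size_ksloc, scale_factors, effort_multipliers,
                     A=2.94, B=0.91):
    """Effort in person-months: PM = A * Size^E * prod(EM_i)."""
    E = B + 0.01 * sum(scale_factors)   # exponent from the five scale factors
    return A * size_ksloc ** E * prod(effort_multipliers)

# Nominal project: 100 KSLOC, nominal scale factors and 17 nominal multipliers.
print(cocomo_ii_effort(100.0,
                       scale_factors=[3.72, 3.04, 4.24, 3.29, 4.68],
                       effort_multipliers=[1.0] * 17))
```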

  12. A methodology for model-based development and automated verification of software for aerospace systems

    Science.gov (United States)

    Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.

    Today's software for aerospace systems typically is very complex. This is due to the increasing number of features as well as the high demand for safety, reliability, and quality. This complexity also leads to significant higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking become more popular. The latter examine the whole state space and, consequently, result in a full test coverage. Nevertheless, despite the obvious advantages, this technique is rarely yet used for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive work flow for model-based software development as well as automated verification in compliance to the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use the tool SCADE Suite (Esterel Technology), an integrated design environment that covers all the requirements for our methodology. The SCADE Suite is well established in avionics and defense, rail transportation, energy and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.

  13. Visco-elastic modelling for asphalt pavements : Software ViscoRoute

    OpenAIRE

    CHABOT, Armelle; TAMAGNY, Philippe; POCHE, Didier; DUHAMEL, Denis

    2006-01-01

    The Huet-Sayegh model (1963) gives a set of constitutive equations for a visco-elastic material which accounts well for the behaviour of asphalt pavement layers, especially regarding thermal effects. This model allows rather good predictions of experimental data. The French pavement design method consists in a pavement mechanistic analysis based on the Burmister multilayer elastic model (1943), implemented in the LCPC software ALIZE (1982). In that model the Huet-Sayegh behaviour is taken into account with its equiv...

  14. A Systems Thinking Model for Open Source Software Development in Social Media

    OpenAIRE

    Mustaquim, Moyen

    2010-01-01

    In this paper a social media model, based on systems thinking methodology, is proposed to understand the behavior of the open source software development community working in social media. The proposed model is focused on relational influences of two different systems: social media and the open source community. This model can be useful for taking decisions which are complicated and where solutions are not apparent. Based on the proposed model, an efficient way of working in open source developm...

  15. Prediction Model for Object Oriented Software Development Effort Estimation Using One Hidden Layer Feed Forward Neural Network with Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Chandra Shekhar Yadav

    2014-01-01

    Full Text Available The budget computation for software development is affected by the prediction of software development effort and schedule. Software development effort and schedule can be predicted precisely on the basis of past software project data sets. In this paper, a model for object-oriented software development effort estimation using a one hidden layer feed forward neural network (OHFNN) has been developed. The model has been further optimized with the help of a genetic algorithm by taking the weight vector obtained from the OHFNN as the initial population for the genetic algorithm. Convergence has been obtained by minimizing the sum of squared errors of each input vector, and the optimal weight vector has been determined to predict the software development effort. The model has been empirically validated on the PROMISE software engineering repository dataset. Performance of the model is more accurate than the well-established constructive cost model (COCOMO).
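
    A minimal sketch of the prediction side of such a model: a one-hidden-layer feed-forward network mapping project attributes to effort, with the sum of squared errors as the quantity a genetic algorithm would minimize over the flattened weight vector. Network sizes and data are illustrative assumptions.

```python
# Sketch of a one-hidden-layer feed-forward effort predictor and the SSE
# fitness a genetic algorithm would minimize. Shapes and data are invented.
import numpy as np

def predict(weights, X, n_hidden):
    """Forward pass: sigmoid hidden layer, linear output (predicted effort)."""
    n_in = X.shape[1]
    W1 = weights[:n_in * n_hidden].reshape(n_in, n_hidden)
    w2 = weights[n_in * n_hidden:n_in * n_hidden + n_hidden]
    b2 = weights[-1]
    hidden = 1.0 / (1.0 + np.exp(-(X @ W1)))  # sigmoid activations
    return hidden @ w2 + b2

def sse_fitness(weights, X, y, n_hidden):
    """GA fitness: sum of squared prediction errors (lower is better)."""
    return np.sum((predict(weights, X, n_hidden) - y) ** 2)

rng = np.random.default_rng(0)
X = rng.random((20, 4))              # 20 projects, 4 attributes (synthetic)
y = rng.random(20) * 100             # observed effort (synthetic)
w = rng.normal(size=4 * 3 + 3 + 1)   # flattened weights, 3 hidden units
print(sse_fitness(w, X, y, n_hidden=3))
```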

  16. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    Full Text Available The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up–to–date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementation of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non–linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK require that each new model’s likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine

  17. Computing and software

    Science.gov (United States)

    White, Gary C.; Hines, J.E.

    2004-01-01

    The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up–to–date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementation of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non–linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK require that each new model’s likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine pieces of likelihood

  18. On the Use of Variability Operations in the V-Modell XT Software Process Line

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel; Ternité, Thomas

    2016-01-01

    Software process lines provide a systematic approach to develop and manage software processes. It defines a reference process containing general process assets, whereas a well-defined customization approach allows process engineers to create new process variants, e.g., by extending or modifying....... In this article, we present a study on the feasibility of variability operations to support the development of software process lines in the context of the V-Modell XT. We analyze which variability operations are defined and practically used. We provide an initial catalog of variability operations...... as an improvement proposal for other process models. Our findings show that 69 variability operation types are defined across several metamodel versions of which, however, 25 remain unused. The found variability operations allow for systematically modifying the content of process model elements and the process...

  19. On Fundamental Evaluation Using UAV Imagery and 3D Modeling Software

    Science.gov (United States)

    Nakano, K.; Suzuki, H.; Tamino, T.; Chikatsu, H.

    2016-06-01

    Unmanned aerial vehicles (UAVs), which have been widely used in recent years, can acquire high-resolution images with resolutions in millimeters; such images cannot be acquired with manned aircraft. Moreover, it has become possible to obtain a surface reconstruction of a realistic 3D model using high-overlap images and 3D modeling software such as Context capture, Pix4Dmapper and Photoscan, based on computer vision technology such as structure from motion and multi-view stereo. 3D modeling software has many applications. However, most of them do not seem to provide appropriate accuracy control grounded in the knowledge of photogrammetry and/or computer vision. Therefore, we performed flight tests in a test field using a UAV equipped with a gimbal stabilizer and a consumer-grade digital camera. Our UAV is a hexacopter that can fly along waypoints for autonomous flight and can record flight logs. We acquired images from different altitudes such as 10 m, 20 m, and 30 m. We obtained 3D reconstruction results of orthoimages, point clouds, and textured TIN models for accuracy evaluation in some cases with different image scale conditions using 3D modeling software. Moreover, the accuracy aspect was evaluated for different units of input image: course unit and flight unit. This paper describes the fundamental accuracy evaluation for 3D modeling using UAV imagery and 3D modeling software from the viewpoint of close-range photogrammetry.

  20. ON FUNDAMENTAL EVALUATION USING UAV IMAGERY AND 3D MODELING SOFTWARE

    Directory of Open Access Journals (Sweden)

    K. Nakano

    2016-06-01

    Full Text Available Unmanned aerial vehicles (UAVs), which have been widely used in recent years, can acquire high-resolution images with resolutions in millimeters; such images cannot be acquired with manned aircraft. Moreover, it has become possible to obtain a surface reconstruction of a realistic 3D model using high-overlap images and 3D modeling software such as Context capture, Pix4Dmapper and Photoscan, based on computer vision technology such as structure from motion and multi-view stereo. 3D modeling software has many applications. However, most of them do not seem to provide appropriate accuracy control grounded in the knowledge of photogrammetry and/or computer vision. Therefore, we performed flight tests in a test field using a UAV equipped with a gimbal stabilizer and a consumer-grade digital camera. Our UAV is a hexacopter that can fly along waypoints for autonomous flight and can record flight logs. We acquired images from different altitudes such as 10 m, 20 m, and 30 m. We obtained 3D reconstruction results of orthoimages, point clouds, and textured TIN models for accuracy evaluation in some cases with different image scale conditions using 3D modeling software. Moreover, the accuracy aspect was evaluated for different units of input image: course unit and flight unit. This paper describes the fundamental accuracy evaluation for 3D modeling using UAV imagery and 3D modeling software from the viewpoint of close-range photogrammetry.
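
    A back-of-envelope check that links the flying heights in the study to image resolution is the standard close-range photogrammetry relation GSD = flying height * pixel pitch / focal length. The camera parameters below are assumptions for a typical consumer-grade camera, not those of the paper.

```python
# Ground sampling distance (GSD) from flying height, focal length and
# sensor pixel pitch. Camera values are illustrative assumptions.
def ground_sampling_distance(height_m, focal_mm, pixel_pitch_um):
    """GSD in meters per pixel: H * pixel pitch / focal length."""
    return height_m * (pixel_pitch_um * 1e-6) / (focal_mm * 1e-3)

for h in (10, 20, 30):  # the altitudes flown in the study
    gsd = ground_sampling_distance(h, focal_mm=16.0, pixel_pitch_um=4.8)
    print(f"{h} m altitude -> GSD = {gsd * 1000:.1f} mm/pixel")
```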

  1. Software Uncertainty in Integrated Environmental Modelling: the role of Semantics and Open Science

    Science.gov (United States)

    de Rigo, Daniele

    2013-04-01

    Computational aspects increasingly shape environmental sciences [1]. Actually, transdisciplinary modelling of complex and uncertain environmental systems is challenging computational science (CS) and also the science-policy interface [2-7]. Large spatial-scale problems falling within this category - i.e. wide-scale transdisciplinary modelling for environment (WSTMe) [8-10] - often deal with factors (a) for which deep-uncertainty [2,11-13] may prevent usual statistical analysis of modelled quantities and need different ways for providing policy-making with science-based support. Here, practical recommendations are proposed for tempering a peculiar - not infrequently underestimated - source of uncertainty. Software errors in complex WSTMe may subtly affect the outcomes with possible consequences even on collective environmental decision-making. Semantic transparency in CS [2,8,10,14,15] and free software [16,17] are discussed as possible mitigations (b). Software uncertainty, black-boxes and free software. Integrated natural resources modelling and management (INRMM) [29] frequently exploits chains of nontrivial data-transformation models (D-TM), each of them affected by uncertainties and errors. Those D-TM chains may be packaged as monolithic specialized models, maybe only accessible as black-box executables (if accessible at all) [50]. For end-users, black-boxes merely transform inputs into the final outputs, relying on classical peer-reviewed publications for describing the internal mechanism. While software tautologically plays a vital role in CS, it is often neglected in favour of more theoretical aspects. This paradox has been provocatively described as "the invisibility of software in published science. Almost all published papers required some coding, but almost none mention software, let alone include or link to source code" [51]. Recently, this primacy of theory over reality [52-54] has been challenged by new emerging hybrid approaches [55] and by the

  2. Comparison of Software Models for Energy Savings from Cool Roofs

    Energy Technology Data Exchange (ETDEWEB)

    New, Joshua Ryan [ORNL; Miller, William A [ORNL; Huang, Yu (Joe) [White Box Technologies; Levinson, Ronnen [Lawrence Berkeley National Laboratory (LBNL)

    2014-01-01

    A web-based Roof Savings Calculator (RSC) has been deployed for the United States Department of Energy as an industry-consensus tool to help building owners, manufacturers, distributors, contractors and researchers easily run complex roof and attic simulations. This tool employs modern web technologies, usability design, and national average defaults as an interface to annual simulations of hour-by-hour, whole-building performance using the world-class simulation tools DOE-2.1E and AtticSim in order to provide estimated annual energy and cost savings. In addition to cool reflective roofs, RSC simulates multiple roof and attic configurations including different roof slopes, above sheathing ventilation, radiant barriers, low-emittance roof surfaces, duct location, duct leakage rates, multiple substrate types, and insulation levels. A base case and an energy-efficient alternative can be compared side-by-side to estimate monthly energy. RSC was benchmarked against field data from demonstration homes in Ft. Irwin, California; while cooling savings were similar, the heating penalty varied significantly across different simulation engines. RSC results reduce cool-roofing cost-effectiveness, thus mitigating expected economic incentives for this countermeasure to the urban heat island effect. This paper consolidates comparisons of RSC's projected energy savings with other simulation engines, including DOE-2.1E, AtticSim, Micropas, and EnergyPlus, and presents preliminary analyses. RSC's algorithms for capturing radiant heat transfer and duct interaction in the attic assembly are considered major contributing factors to the increased cooling savings and heating penalties. Comparisons to previous simulation-based studies, an analysis of the force multiplier of RSC cooling savings and heating penalties, the role of radiative heat exchange in an attic assembly, and changes made to increase the accuracy of the duct model are included.

  3. End-to-end observatory software modeling using domain specific languages

    Science.gov (United States)

    Filgueira, José M.; Bec, Matthieu; Liu, Ning; Peng, Chien; Soto, José

    2014-07-01

    The Giant Magellan Telescope (GMT) is a 25-meter extremely large telescope that is being built by an international consortium of universities and research institutions. Its software and control system is being developed using a set of Domain Specific Languages (DSL) that supports a model driven development methodology integrated with an Agile management process. This approach promotes the use of standardized models that capture the component architecture of the system, that facilitate the construction of technical specifications in a uniform way, that facilitate communication between developers and domain experts and that provide a framework to ensure the successful integration of the software subsystems developed by the GMT partner institutions.

  4. Towards a Complete Model for Software Component Deployment on Heterogeneous Platform

    Directory of Open Access Journals (Sweden)

    Švogor Ivan

    2014-12-01

    Full Text Available This report briefly describes ongoing research related to the optimization of allocating software components to a heterogeneous computing platform (which includes CPU, GPU and FPGA). The research goal is also presented, along with current hot topics of the research area, related research teams, and finally the results and contribution of my research. It involves mathematical modelling which results in a goal function, an optimization method which finds a suboptimal solution to the goal function, and a software modeling tool which enables graphical representation of the problem at hand and helps developers determine component placement in the system design phase.
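
    As a toy illustration of the allocation problem described above (not the report's actual goal function), the sketch below exhaustively assigns each component to CPU, GPU or FPGA and minimizes a simple additive cost; the cost table is invented.

```python
# Toy component-to-processing-unit allocation by exhaustive search over
# a simple additive goal function. All costs are invented for illustration.
from itertools import product

COSTS = {  # component -> {unit: cost}; hypothetical numbers
    "filter":  {"CPU": 5.0, "GPU": 2.0, "FPGA": 1.5},
    "planner": {"CPU": 3.0, "GPU": 4.0, "FPGA": 6.0},
    "logger":  {"CPU": 1.0, "GPU": 3.0, "FPGA": 4.0},
}

def best_allocation(costs):
    """Return the assignment minimizing the summed per-component cost."""
    comps = list(costs)
    units = ["CPU", "GPU", "FPGA"]
    return min(
        (dict(zip(comps, assign)) for assign in product(units, repeat=len(comps))),
        key=lambda alloc: sum(costs[c][u] for c, u in alloc.items()),
    )

print(best_allocation(COSTS))
```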

  5. Investigation of the stator inductances of the expanded Park model and an approach on parameter identification using the evolution strategy

    Directory of Open Access Journals (Sweden)

    Schmuelling Christoph

    2016-09-01

    Full Text Available Commonly, the Park model is used to calculate transients or steady-state operations of synchronous machines. The expanded Park theory derives the Park equations from the phase-domain model of the synchronous machine by the use of transformations. Thereby, several hypotheses are made, which are investigated in this article with respect to the main inductances of two different types of synchronous machines. It is shown that the derivation of the Park equations from the phase-domain model does not lead to constant inductances, as is usually assumed for these equations. Nevertheless, the Park model is the most common analytic model of synchronous machines. Therefore, in the second part of this article, a method using the evolution strategy is presented to obtain the parameters of the Park model.
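
    The article does not spell out its evolution strategy, so the following is a minimal sketch of a comma-selection ES applied to a generic least-squares parameter identification objective, standing in for the Park-model fitting problem.

```python
# Minimal comma-selection evolution strategy for parameter identification.
# The objective below is a generic least-squares toy problem, a stand-in
# for fitting Park-model parameters to a reference machine response.
import numpy as np

def evolve(objective, x0, sigma=0.1, pop=20, parents=5, generations=100,
           seed=0):
    """Mutate around the current estimate, keep the best offspring, recombine."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(generations):
        offspring = x + sigma * rng.normal(size=(pop, x.size))
        scores = np.array([objective(o) for o in offspring])
        elite = offspring[np.argsort(scores)[:parents]]
        x = elite.mean(axis=0)   # recombine by averaging the elite
        sigma *= 0.98            # slow step-size decay
    return x

# Toy objective: recover parameters of y = p0 * exp(-p1 * t).
t = np.linspace(0, 1, 50)
target = 2.0 * np.exp(-3.0 * t)
obj = lambda p: np.sum((p[0] * np.exp(-p[1] * t) - target) ** 2)
print(evolve(obj, x0=[1.0, 1.0]))
```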

  6. Critical Thinking Skills of Students through Mathematics Learning with ASSURE Model Assisted by Software Autograph

    Science.gov (United States)

    Kristianti, Y.; Prabawanto, S.; Suhendra, S.

    2017-09-01

    This study aims to examine the critical thinking ability of students who learn mathematics with the ASSURE learning model assisted by the Autograph software. The design of this study was experimental, with pre-test and post-test control groups. The experimental group received mathematics instruction with the Autograph-assisted ASSURE model and the control group received mathematics instruction with the conventional model. The data were obtained through critical thinking skills tests. This research was conducted at the junior high school level; the population comprised students of a junior high school in Subang Regency in the 2016/2017 school year, and the sample consisted of two grade VIII classes of a junior high school in Subang Regency. The research data were analysed quantitatively: the normalized gain levels of the two sample groups were compared using a one-way ANOVA test. The results show that mathematics learning with the Autograph-assisted ASSURE model can improve the critical thinking ability of junior high school students, and is significantly better at improving their critical thinking skills than the conventional model.
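
    The normalized gain referred to above is commonly computed per student as g = (post - pre) / (max score - pre); this formula choice is an assumption, since the abstract does not define it.

```python
# Normalized gain for pre/post test designs; the Hake-style formula is an
# assumption, and the scores below are synthetic.
import numpy as np

def normalized_gain(pre, post, max_score=100.0):
    """g = (post - pre) / (max_score - pre), computed element-wise."""
    return (post - pre) / (max_score - pre)

pre = np.array([40.0, 55.0, 60.0])
post = np.array([70.0, 80.0, 75.0])
print(normalized_gain(pre, post))
```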

  7. Study on the usage of a commercial software (Comsol-Multiphysics®) for dislocation multiplication model

    Science.gov (United States)

    Gallien, B.; Albaric, M.; Duffar, T.; Kakimoto, K.; M'Hamdi, M.

    2017-01-01

    Elaboration of silicon ingots for photovoltaic applications in a directional solidification furnace leads to the formation of dislocations, mainly due to thermoelastic stresses, which impact the photovoltaic conversion rate. Several research teams have created numerical simulation models using home-made software in order to study dislocation multiplication and predict the dislocation density and residual stresses inside ingots after elaboration. In this study, the commercial software Comsol-Multiphysics® is used to calculate the evolution of dislocation density during ingot solidification and cooling. The thermo-elastic stress, due to the temperature field inside the ingot during elaboration, is linked to the evolution of the dislocation density by the Alexander and Haasen model (A&H model). The purpose of this study is to show the relevance of commercial software for predicting dislocation density in ingots. In a first approach, the A&H physical model is introduced for a 2D axisymmetric geometry. After a short introduction, the modification of the Comsol® software is presented in order to include the A&H equations. This numerical model calculates dislocation density and plastic stress continuously during ingot solidification and cooling. The results of this model are then compared to home-made simulations created by the teams at Kyushu University and NTNU. The results are also compared to the characterization of a silicon ingot elaborated in a gradient freeze furnace. Both of these comparisons show the relevance of using a commercial code, such as Comsol®, to predict dislocation multiplication in a silicon ingot during elaboration.
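
    For orientation, the A&H dislocation multiplication law is commonly written with an effective stress reduced by a back-stress term proportional to the square root of the dislocation density. The sketch below integrates such a system with explicit Euler; all parameter values are placeholders, not calibrated silicon constants.

```python
# Sketch of an Alexander & Haasen (A&H) style dislocation multiplication
# law, integrated with explicit Euler:
#   tau_eff = max(tau - D * sqrt(N), 0)
#   v       = k0 * tau_eff**m * exp(-Q / (kB * T))
#   dN/dt   = K * tau_eff**l * N * v
# All parameter values are placeholders for illustration.
import numpy as np

kB = 8.617e-5  # Boltzmann constant, eV/K

def integrate_ah(N0, tau, T, dt=1.0, steps=1000,
                 k0=1e-3, m=1.0, l=1.0, K=1e-3, D=1e-4, Q=1.0):
    N = N0
    history = []
    for _ in range(steps):
        tau_eff = max(tau - D * np.sqrt(N), 0.0)         # back-stress reduction
        v = k0 * tau_eff ** m * np.exp(-Q / (kB * T))    # dislocation velocity
        N += K * tau_eff ** l * N * v * dt               # multiplication step
        history.append(N)
    return np.array(history)

print(integrate_ah(N0=1e6, tau=10.0, T=1400.0)[-1])
```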

  8. Contribution of quality and maturity models to software process improvement

    Directory of Open Access Journals (Sweden)

    Antonio Carlos Tonini

    2008-01-01

    Full Text Available Many software development companies have developed their own work method. Due to the fast software market growth, the competition focuses more on cost than on differentiation. To achieve competitive advantage, software developer organizations must continually update their technology, reach high-level process maturity and eliminate all operational inefficiency. These procedures involve people, processes and the whole organization. The aim of the paper is to discuss software process improvement implementation according to the most important quality and maturity models. Based on a Multiple Case Study, it is verified that software process improvement first requires improvement by each individual developer, then involves the developer teams and, finally, the whole organization. The research concludes that the quality and maturity models must be used as improvement process drivers.

  9. Towards the Significance of Decision Aid in Building Information Modeling (BIM) Software Selection Process

    Directory of Open Access Journals (Sweden)

    Omar Mohd Faizal

    2014-01-01

    Full Text Available Building Information Modeling (BIM) has been considered a solution in the construction industry to numerous problems such as delays, increased lead times and increased costs. This is due to the concept and characteristics of BIM, which will reshape the way construction project teams work together to increase productivity and improve the final project outcomes (cost, time, quality, safety, functionality, maintainability, etc.). As a result, the construction industry has witnessed numerous BIM software packages becoming available in the market. Each of these packages offers different functions and features. Furthermore, the adoption of BIM requires high investment in software and hardware as well as training expenses. Thus, there is a need for decision aid in selecting the BIM software that best fulfills the project needs. However, research indicates that only limited studies attempt to guide decisions in the BIM software selection problem. Thus, this paper highlights the importance of decision making and support for BIM software selection, as it is vital to increasing productivity throughout the building lifecycle.

  10. The review of the modeling methods and numerical analysis software for nanotechnology in material science

    Directory of Open Access Journals (Sweden)

    SMIRNOV Vladimir Alexeevich

    2014-10-01

    Full Text Available Due to the high demand for building materials with a universal set of properties which extends their application area, research efforts are focusing on nanotechnology in material science. The rational combination of theoretical studies, mathematical modeling and simulation can favour reduced resource and time consumption when nanomodified materials are being developed. The development of a composite material is based on the principles of system analysis, which provides for the necessity of criteria determination and further classification of modeling methods. In this work the criteria of spatial scale, dominant type of interaction and heterogeneity are used for such classification. The presented classification became a framework for the analysis of methods and software which can be applied to the development of building materials. For each of the selected spatial levels - from the atomistic one to the macrostructural level of a constructional coarse-grained composite - existing theories, modeling algorithms and tools have been considered. At the level of macrostructure, which is formed under the influence of gravity and exterior forces, one can apply probabilistic and geometrical methods to study the obtained structure. The existing models are suitable for packing density analysis and the solution of percolation problems at the macroscopic level, but there are still no software tools which could be applied in nanotechnology to carry out systematic investigations. At the microstructure level it is possible to use the particle method along with probabilistic and statistical methods to explore structure formation, but available software tools are only partially suitable for numerical analysis of microstructure models. Therefore, modeling of the microstructure is rather complicated; the model has to include a pairwise interaction potential. After the model has been constructed and the parameters of the pairwise potential have been determined, many software packages for solution of ordinary

  11. Dependencies among Architectural Views Got from Software Requirements Based on a Formal Model

    Directory of Open Access Journals (Sweden)

    Osis Janis

    2014-12-01

    A system architect has software requirements and some unspecified knowledge about a problem domain (e.g., an enterprise) as source information for assessing and evaluating possible solutions and arriving at the target point, a preliminary software design. The deciding factor is the architect's experience and expertise in the problem domain ("AS-IS"). The proposed approach is dedicated to assisting a system architect in making an appropriate decision on the solution ("TO-BE"). It is based on a formal mathematical model, the Topological Functioning Model (TFM). Compliant TFMs can be transformed into software architectural views. The paper demonstrates and discusses tracing dependency links from the requirements to and between the architectural views.

  12. New Modelling Capabilities in Commercial Software for High-Gain Antennas

    DEFF Research Database (Denmark)

    Jørgensen, Erik; Lumholt, Michael; Meincke, Peter

    2012-01-01

    This paper presents an overview of selected new modelling algorithms and capabilities in commercial software tools developed by TICRA. A major new area is design and analysis of printed reflectarrays, where a fully integrated design environment is under development, allowing fast and accurate characterization of the reflectarray element, an initial phase-only synthesis, followed by a full optimization procedure taking into account the near-field from the feed and the finite extent of the array. Another interesting new modelling capability is made available through the DIATOOL software, which is a new type of EM software tool aimed at extending the ways engineers can use antenna measurements in the antenna design process. The tool allows reconstruction of currents and near fields on a 3D surface conformal to the antenna, by using the measured antenna field as input. The currents on the antenna...

  13. A MODEL FOR INTEGRATED SOFTWARE TO IMPROVE COMMUNICATION POLICY IN DENTAL TECHNICAL LABS

    Directory of Open Access Journals (Sweden)

    Minko M. Milev

    2017-06-01

    Introduction: Integrated marketing communications (IMC) are all kinds of communications between organisations and customers, partners, other organisations and society. Aim: To develop and present an integrated software model which can improve the effectiveness of communications in dental technical services. Material and Methods: The model of integrated software is based on the recommendations of a total of 700 respondents (students of dental technology, dental physicians, dental technicians and patients of dental technical laboratories) in Northeastern Bulgaria. Results and Discussion: We present the benefits of future integrated software for improving the communication policy in the dental technical laboratory, meeting the need for fast cooperation and a well-built communication network between dental physicians, dental technicians, patients and students. Conclusion: The use of integrated communications could be a powerful unified approach to improving the communication policy between all players in the market for dental technical services.

  14. Entrepreneurial model based technology creative industries sector software through the use of free open source software for Universitas Pendidikan Indonesia students

    Science.gov (United States)

    Hasan, B.; Hasbullah; Purnama, W.; Hery, A.

    2016-04-01

    Development of the creative industries in the software sector using Free Open Source Software (FOSS) is expected to be one of the solutions for fostering new entrepreneurs among students, who can then open job opportunities and contribute to economic development in Indonesia. This study aims to create an entrepreneurial coaching model for the software-based creative industries utilizing FOSS, as well as to provide understanding of and foster entrepreneurship in this field for students of Universitas Pendidikan Indonesia. The activity begins with identifying the entrepreneurs or software technology businesses to be developed, followed by training and mentoring, an apprenticeship at industrial partners, creation of business plans, and monitoring and evaluation. The activity involves 30 UPI students who are motivated towards self-employment and competent in the field of information technology. The expected results and outcomes of these activities are the emergence of a number of new student entrepreneurs in the software industry, covering both commerce (e-commerce) and education/learning (e-learning/LMS) as well as games.

  15. Semantic Model of Variability and Capabilities of IoT Applications for Embedded Software Ecosystems

    DEFF Research Database (Denmark)

    Tomlein, Matus; Grønbæk, Kaj

    2016-01-01

    We have also identified a need to model their deployment topology and functionality in order to enable their seamless integration into the platform. In this paper, we draw from research in related fields and present a model of IoT applications. It is built using semantic annotations and uses semantic... We conclude that it is suitable for modeling applications in IoT software ecosystems, since it is more adaptable and expressive than the alternatives.

  16. Implementation of Software Configuration Management Process by Models: Practical Experiments and Learned Lessons

    Directory of Open Access Journals (Sweden)

    Bartusevics Arturs

    2014-12-01

    Nowadays the software configuration management process is not just a dilemma of which system should be used for version control or how to merge changes from one source code branch to another. There are multiple tasks, such as version control, build management, deploy management, status accounting, bug tracking and many others, that should be solved to support the full configuration management process according to the most popular quality standards. The main scope of the process is to include only valid and tested software items in the final version of the product and to prepare a new version as soon as possible. To implement the different tasks of the software configuration management process, a set of different tools, scripts and utilities should be used. The current paper provides a new model-based approach to the implementation of configuration management. Using different models, the new approach helps to organize existing solutions and develop new ones in a parameterized way, thus increasing the reuse of solutions. The study provides a general description of the new model-based conception and definitions of all models needed to implement the new approach. The second part of the paper contains an overview of criteria, practical experiments and lessons learned from using the new models in software configuration management. Finally, further work is defined based on the results of the practical experiments and lessons learned.

  17. DBSolve Optimum: a software package for kinetic modeling which allows dynamic visualization of simulation results

    Directory of Open Access Journals (Sweden)

    Gizzatkulov Nail M

    2010-08-01

    Background: Systems biology research and applications require the creation, validation and extensive use of mathematical models, and the visualization of simulation results by end-users. Our goal is to develop a novel method for the visualization of simulation results and to implement it in a simulation software package equipped with sophisticated mathematical and computational techniques for model development, verification and parameter fitting. Results: We present the mathematical simulation workbench DBSolve Optimum, which is a significantly improved and extended successor of the well-known simulation software DBSolve5. The concept of "dynamic visualization" of simulation results has been developed and implemented in DBSolve Optimum. Within this concept, graphical objects representing metabolite concentrations and reactions change their volume and shape in accordance with the simulation results. This technique is applied to visualize both the kinetic response of the model and the dependence of its steady state on a parameter. The use of dynamic visualization is illustrated with a kinetic model of the Krebs cycle. Conclusion: DBSolve Optimum is a user-friendly simulation software package that simplifies the construction, verification, analysis and visualization of kinetic models. The dynamic visualization tool implemented in the software allows the user to animate simulation results and thereby present them in a more comprehensible way. DBSolve Optimum and the built-in dynamic visualization module are free for both academic and commercial use. They can be downloaded directly from http://www.insysbio.ru.
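
    Kinetic models of the kind DBSolve simulates are systems of ODEs built from reaction rate laws. As a minimal sketch (not DBSolve's API; the two-step pathway and rate constants are invented), such a model can be integrated with SciPy:

    ```python
    # Hypothetical two-reaction pathway A -> B -> C with mass-action
    # kinetics, the kind of ODE system a kinetic-modeling workbench solves.
    import numpy as np
    from scipy.integrate import solve_ivp

    k1, k2 = 0.5, 0.2  # invented rate constants (1/s)

    def rhs(t, y):
        a, b, c = y
        v1 = k1 * a          # rate of A -> B
        v2 = k2 * b          # rate of B -> C
        return [-v1, v1 - v2, v2]

    sol = solve_ivp(rhs, (0.0, 30.0), [1.0, 0.0, 0.0], dense_output=True)
    t = np.linspace(0.0, 30.0, 7)
    print(sol.sol(t).round(3))  # concentrations of A, B, C over time
    ```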

  18. Expanding subjectivities

    DEFF Research Database (Denmark)

    Lundgaard Andersen, Linda; Soldz, Stephen

    2012-01-01

    A major theme in recent psychoanalytic thinking concerns the use of therapist subjectivity, especially “countertransference,” in understanding patients. This thinking converges with and expands developments in qualitative research regarding the use of researcher subjectivity as a tool to understa...

  19. Expander Codes

    Indian Academy of Sciences (India)

    Expander Codes - The Sipser-Spielman Construction. Resonance - Journal of Science Education, Volume 10, Issue 1. Priti Shankar, Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India.

  20. HydroShare: Applying professional software engineering to a new NSF-funded large software project

    Science.gov (United States)

    Idaszak, R.; Tarboton, D. G.; Ames, D.; Saleem Arrigo, J. A.; Band, L. E.; Bedig, A.; Castronova, A. M.; Christopherson, L.; Coposky, J.; Couch, A.; Dash, P.; Gan, T.; Goodall, J.; Gustafson, K.; Heard, J.; Hooper, R. P.; Horsburgh, J. S.; Jackson, S.; Johnson, H.; Maidment, D. R.; Mbewe, P.; Merwade, V.; Miles, B.; Reeder, S.; Russell, T.; Song, C.; Taylor, A.; Thakur, S.; Valentine, D. W.; Whiteaker, T. L.

    2013-12-01

    HydroShare is an online, collaborative system being developed for sharing hydrologic data and models as part of the NSF's Software Infrastructure for Sustained Innovation (SI2) program (NSF collaborative award numbers 1148453 and 1148090). HydroShare involves a large software development effort requiring cooperative research and distributed software development between domain scientists, professional software engineers (here 'professional' denotes previous commercial experience in the application of modern software engineering), and university software developers. HydroShare expands upon the data sharing capabilities of the Hydrologic Information System of the Consortium of Universities for the Advancement of Hydrologic Sciences, Inc. (CUAHSI) by broadening the classes of data accommodated, expanding capability to include the sharing of models and model components, and taking advantage of emerging social media functionality to enhance information about and collaboration around hydrologic data and models. With a goal of enabling better science concomitant with improved sustainable software practices, we will describe our approach, experiences, and lessons learned thus far in applying professional software engineering to a large NSF-funded software project from the project's onset.

  1. Built To Last: Using Iterative Development Models for Sustainable Scientific Software Development

    Science.gov (United States)

    Jasiak, M. E.; Truslove, I.; Savoie, M.

    2013-12-01

    In scientific research, software development exists fundamentally for the results it creates. The core research must remain the focus. It seems natural to researchers, driven by grant deadlines, that every dollar invested in software development should be used to push the boundaries of problem solving. This system of values is frequently misaligned with creating software in a sustainable fashion; short-term optimizations create longer-term sustainability issues. The National Snow and Ice Data Center (NSIDC) has taken bold cultural steps in using agile and lean development and management methodologies to help its researchers meet critical deadlines, while building in the necessary support structure for the code to live far beyond its original milestones. Agile and lean software development methodologies, including Scrum, Kanban, Continuous Delivery and Test-Driven Development, have seen widespread adoption within NSIDC. This focus on development methods is combined with an emphasis on explaining to researchers why these methods produce more desirable results for everyone, as well as promoting interaction between developers and researchers. This presentation will describe NSIDC's current scientific software development model, how it addresses the short-term versus sustainability dichotomy, the lessons learned and successes realized by transitioning to this agile- and lean-influenced model, and the current challenges faced by the organization.

  2. Towards a framework for deriving platform-independent model-driven software product lines

    Directory of Open Access Journals (Sweden)

    Andrés Paz

    2013-05-01

    Model-driven software product lines (MD-SPLs) are created from domain models which are transformed, merged and composed with reusable core assets until software products are produced. Model transformation chains (MTCs) must be specified to generate such MD-SPLs. This paper presents a framework for creating platform-independent MD-SPLs; the framework includes a domain-specific language (DSL) for platform-independent MTC specification and facilitates platform-specific MTC generation for several of the most widely used model transformation frameworks. The DSL also allows product line architects to compose the generation, taking the need for model transformation strategy and technology interoperability into account and specifying several types of variability involved in such generation.

  3. Accuracy of open-source software segmentation and paper-based printed three-dimensional models.

    Science.gov (United States)

    Szymor, Piotr; Kozakiewicz, Marcin; Olszewski, Raphael

    2016-02-01

    In this study, we aimed to verify the accuracy of models created with the help of open-source Slicer 3.6.3 software (Surgical Planning Lab, Harvard Medical School, Harvard University, Boston, MA, USA) and the Mcor Matrix 300 paper-based 3D printer. Our study focused on the accuracy of recreating the walls of the right orbit of a cadaveric skull. Cone beam computed tomography (CBCT) of the skull was performed (0.25-mm pixel size, 0.5-mm slice thickness). Acquired DICOM data were imported into Slicer 3.6.3 software, where segmentation was performed. A virtual model was created and saved as an .STL file and imported into Netfabb Studio professional 4.9.5 software. Three different virtual models were created by cutting the original file along three different planes (coronal, sagittal, and axial). All models were printed with a Selective Deposition Lamination Technology Matrix 300 3D printer using 80 gsm A4 paper. The models were printed so that their cutting plane was parallel to the paper sheets creating the model. Each model (coronal, sagittal, and axial) consisted of three separate parts (∼200 sheets of paper each) that were glued together to form a final model. The skull and created models were scanned with a three-dimensional (3D) optical scanner (Breuckmann smart SCAN) and were saved as .STL files. Comparisons of the orbital walls of the skull, the virtual model, and each of the three paper models were carried out with GOM Inspect 7.5SR1 software. Deviations measured between the models analysed were presented in the form of a colour-labelled map and covered with an evenly distributed network of points automatically generated by the software. An average of 804.43 ± 19.39 points for each measurement was created. Differences measured in each point were exported as a .csv file. The results were statistically analysed using Statistica 10, with statistical significance set at p < 0.05. The average number of points created on models for each measurement was 804

  4. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  5. On model-driven design of robot software using co-simulation

    NARCIS (Netherlands)

    Broenink, Johannes F.; Ni, Yunyun; Groothuis, M.A.; Menegatti, E.

    2010-01-01

    In this paper we show that using co-simulation for robot software design is more efficient than designing without it. Using the example of a plotter, we show how co-simulation helps with the design process. We believe that a collaborative methodology based on model-driven design will

  6. GSEVM v.2: MCMC software to analyse genetically structured environmental variance models

    DEFF Research Database (Denmark)

    Ibáñez-Escriche, N; Garcia, M; Sorensen, D

    2010-01-01

    This note provides a description of software that allows fitting Bayesian genetically structured environmental variance models using Markov chain Monte Carlo (MCMC). The gsevm v.2 program was written in Fortran 90. The DOS and Unix executable programs, the user's guide, and some example files are freely availab...

  7. A quantitative analysis of model-driven code generation through software experimentation

    NARCIS (Netherlands)

    Papotti, Paulo Eduardo; do Prado, Antonio Francisco; Lopes de Souza, Wanderley; Cirilo, Carlos Eduardo; Ferreira Pires, Luis; Salinesi, C.; Norrie, M.C.; Pastor, O

    Recent research results have shown that Model-Driven Development (MDD) is a beneficial approach for developing software systems. The reduction of development time enabled by code generation mechanisms is often acknowledged as an important benefit to be further explored. This paper reports on an

  8. A reference model and technical framework for mobile social software for learning

    NARCIS (Netherlands)

    De Jong, Tim; Specht, Marcus; Koper, Rob

    2008-01-01

    De Jong,T., Specht, M., & Koper, R. (2008). A reference model and technical framework for mobile social software for learning. In I. A. Sánchez & P. Isaías (Eds.), Proceedings of the IADIS Mobile Learning Conference 2008 (pp. 206-210). April, 11-13, 2008, Carvoeiro, Portugal.

  9. Accumulation of biomass and mineral elements with calendar time by corn: application of the expanded growth model.

    Directory of Open Access Journals (Sweden)

    Allen R Overman

    The expanded growth model is developed to describe the accumulation of plant biomass (Mg ha⁻¹) and mineral elements (kg ha⁻¹) with calendar time (wk). Accumulation of plant biomass with calendar time occurs as a result of photosynthesis for green land-based plants. A corresponding accumulation of mineral elements such as nitrogen, phosphorus, and potassium occurs from the soil through plant roots. In this analysis, the expanded growth model is tested against high-quality published data on corn (Zea mays L.) growth. Data from a field study in South Carolina were used to evaluate the application of the model, where the planting time of April 2 in the field study maximized the capture of solar energy for biomass production. The growth model predicts a simple linear relationship between biomass yield and the growth quantifier, which is confirmed with the data. The growth quantifier incorporates the unit processes of the distribution of solar energy, which drives biomass accumulation by photosynthesis, the partitioning of biomass between light-gathering and structural components of the plants, and an aging function. A hyperbolic relationship between plant nutrient uptake and biomass yield is assumed, and is confirmed for the mineral elements nitrogen (N), phosphorus (P), and potassium (K). It is concluded that the rate-limiting process in the system is biomass accumulation by photosynthesis and that nutrient accumulation occurs in virtual equilibrium with biomass accumulation.
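
    The two functional forms named in the abstract can be written compactly (a sketch in generic notation; the symbols A, N_m and K_y are my own labels, not necessarily those of the paper):

    ```latex
    % Linear relation between biomass yield Y and the growth quantifier Q:
    Y = A\,Q
    % Hyperbolic relation between nutrient uptake N_u and biomass yield Y:
    N_u = \frac{N_m\,Y}{K_y + Y}
    ```

    where A is a yield factor, N_m the maximum nutrient uptake, and K_y a yield response coefficient; the hyperbolic form saturates at N_m as yield grows, consistent with nutrient accumulation tracking biomass accumulation.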

  10. Large-scale computation of the exponentially expanding universe in a simplified Lorentzian type IIB matrix model

    CERN Document Server

    Ito, Yuta; Tsuchiya, Asato

    2015-01-01

    The type IIB matrix model is a conjectured nonperturbative formulation of superstring theory. Recent studies on the Lorentzian version of the model have shown that only three out of nine spatial directions start to expand after some critical time. On the other hand, due to the unbounded action of the Lorentzian model, one has to introduce infrared (IR) cutoffs in order to make the partition function finite. In this work we investigate whether the effects of the IR cutoffs disappear in the infinite volume limit. For that purpose, we study a simplified model with large matrix size up to $N=256$ by Monte Carlo simulation. First we confirm the exponentially expanding behavior of the "universe". Then we generalize the form of the IR cutoffs by one parameter, and find that the results become universal in some region of the parameter. This suggests that the effects of the IR cutoffs disappear in this region, which is also confirmed by studies of the Schwinger-Dyson equations.

  11. Charging Customers or Making Profit? Business Model Change in the Software Industry

    Directory of Open Access Journals (Sweden)

    Margit Malmmose Peyton

    2014-08-01

    Purpose: Advancements in technology, changing customer demands or new market entrants are often seen as a necessary condition to trigger the creation of new Business Models, or disruptive change in existing ones. Yet the sufficient condition is often determined by pricing and how customers are willing to pay for the technology (Chesbrough and Rosenbloom, 2002). As a consequence, much research on Business Models has focused on innovation and technology management (Rajala et al., 2012; Zott et al., 2011), and software-specific frameworks for Business Models have emerged (Popp, 2011; Rajala et al., 2003; Rajala et al., 2004; Stahl, 2004). This paper attempts to illustrate Business Model change in the software industry. Design: Drawing on Rajala et al. (2003), this case study explores the (1) antecedents and (2) consequences of a Business Model change in a logistics software company. The company decided to abolish its profitable fee-based licensing for an internet-based version of its core product and to offer it as freeware including unlimited service. Findings: Firstly, we illustrate how external developments in technology and customer demands (pricing), as well as the desire for a sustainable Business Model, have led to this drastic change. Secondly, we initially find that much of the company's new Business Model is congruent with the company-focused framework of Rajala et al. (2003) [product strategy; distribution model; services and implementation; revenue logic]. Value: The existing frameworks for Business Models in the software industry cannot fully explain the disruptive change in the Business Model. Therefore, we suggest extending the framework by the element of 'innovation'.

  12. AN ENHANCED MODEL TO ESTIMATE EFFORT, PERFORMANCE AND COST OF THE SOFTWARE PROJECTS

    Directory of Open Access Journals (Sweden)

    M. Pauline

    2013-04-01

    The authors propose a model that first captures the fundamentals of software metrics in phase 1, consisting of three primitive primary software engineering metrics: person-months (PM), function points (FP), and lines of code (LOC). Phase 2 consists of the proposed function point, which is obtained by grouping the adjustment factors to simplify the adjustment process and to ensure more consistency in the adjustments. In the proposed method, fuzzy logic is used to quantify the quality of requirements, which is added as one of the adjustment factors; thus a fuzzy-based approach to the Enhanced General System Characteristics for estimating the effort of software projects using productivity is obtained. Phase 3 takes the calculated function point and gives it as input to the static single-variable models (i.e., Intermediate COCOMO and COCOMO II) for cost estimation. The authors have tailored the cost factors in Intermediate COCOMO, and both the cost and scale factors in COCOMO II, to suit the individual development environment, which is very important for the accuracy of the cost estimates. The software performance indicators - project duration, schedule predictability, requirements completion ratio and post-release defect density - are also measured for the software projects in this work. A comparative study of effort, performance measurement and cost estimation of software projects is carried out between the existing model and the authors' proposed work. The work thus analyzes the interactional process through which the estimation tasks were collectively accomplished.
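
    For reference, the Intermediate COCOMO form that phase 3 feeds into estimates effort as a power law of size scaled by an effort adjustment factor. A minimal sketch follows; the mode coefficients are the standard published Intermediate COCOMO values, while the sample size and driver multipliers are invented:

    ```python
    # Intermediate COCOMO: effort (person-months) = a * KLOC^b * EAF,
    # where EAF is the product of the rated cost-driver multipliers.
    COEFFS = {            # standard Intermediate COCOMO mode coefficients
        "organic":       (3.2, 1.05),
        "semi-detached": (3.0, 1.12),
        "embedded":      (2.8, 1.20),
    }

    def cocomo_effort(kloc: float, mode: str, multipliers: list[float]) -> float:
        a, b = COEFFS[mode]
        eaf = 1.0
        for m in multipliers:  # one multiplier per rated cost driver
            eaf *= m
        return a * kloc ** b * eaf

    # Example: a 32 KLOC organic project with two invented driver ratings.
    print(round(cocomo_effort(32.0, "organic", [1.15, 0.91]), 1))
    ```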

  13. A systematic literature review of open source software quality assessment models.

    Science.gov (United States)

    Adewumi, Adewole; Misra, Sanjay; Omoregbe, Nicholas; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    Many open source software (OSS) quality assessment models have been proposed and are available in the literature. However, there is little or no adoption of these models in practice. In order to guide the formulation of newer models so they can be accepted by practitioners, there is a need for clear discrimination among the existing models based on their specific properties. The aim of this study is therefore to perform a systematic literature review investigating the properties of the existing OSS quality assessment models by classifying them with respect to their quality characteristics, the methodology they use for assessment, and their domain of application, so as to guide the formulation and development of newer models. Searches in IEEE Xplore, ACM, Science Direct, Springer and Google Search were performed to retrieve all relevant primary studies. Journal and conference papers between 2003 and 2015 were considered, since the first known OSS quality model emerged in 2003. A total of 19 OSS quality assessment model papers were selected. To select these models we developed assessment criteria to evaluate the quality of the existing studies. The quality assessment models are classified into five categories based on the quality characteristics they possess, namely: single-attribute, rounded-category, community-only-attribute, non-community-attribute and non-quality-in-use models. Our study shows that software selection based on hierarchical structures is the most popular selection method in the existing OSS quality assessment models. Furthermore, we found that the majority (47%) of the existing models do not specify any domain of application. In conclusion, our study is a valuable contribution to the community and helps quality assessment model developers in formulating newer models, and practitioners (software evaluators) in selecting suitable OSS from among alternatives.

  14. An expanded One Health model: integrating social science and One Health to inform study of the human-animal interface.

    Science.gov (United States)

    Woldehanna, Sara; Zimicki, Susan

    2015-03-01

    Zoonotic disease emergence is not a purely biological process mediated only by ecologic factors; opportunities for transmission of zoonoses from animals to humans also depend on how people interact with animals. While exposure is conditioned by the type of animal and the location in which interactions occur, these in turn are influenced by human activity. The activities people engage in are determined by social as well as contextual factors including gender, age, socio-economic status, occupation, social norms, settlement patterns and livelihood systems, family and community dynamics, as well as national and global influences. This paper proposes an expanded "One Health" conceptual model for human-animal exposure that accounts for social as well as epidemiologic factors. The expanded model informed a new study approach to document the extent of human exposure to animals and explore the interplay of social and environmental factors that influence risk of transmission at the individual and community level. The approach includes a formative phase using qualitative and participatory methods, and a representative, random sample survey to quantify exposure to animals in a variety of settings. The paper discusses the different factors that were considered in developing the approach, including the range of animals asked about and the parameters of exposure that are included, as well as factors to be considered in local adaptation of the generic instruments. Illustrative results from research using this approach in Lao PDR are presented to demonstrate the effect of social factors on how people interact with animals. We believe that the expanded model can be similarly operationalized to explore the interactions of other social and policy-level determinants that may influence transmission of zoonoses. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Proposed Robot Scheme with 5 DoF and Dynamic Modelling Using Maple Software

    Directory of Open Access Journals (Sweden)

    Shala Ahmet

    2017-11-01

    This paper presents the dynamical modelling of robots, which is commonly the first important step in the modelling, analysis and control of robotic systems. The paper focuses on using the Denavit-Hartenberg (DH) convention for the kinematics and the Newton-Euler formulation for the dynamic modelling of a 5-DoF (degree of freedom) 3D robot. The dynamical model is derived using the Maple software. The derived dynamical model of the 5-DoF robot is converted for MATLAB use, for future analysis, control and simulation.
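
    In the DH convention the paper relies on, each link i contributes one homogeneous transform built from the four DH parameters (theta_i, d_i, a_i, alpha_i). The standard form, given here for context rather than taken from the paper, is:

    ```latex
    A_i =
    \begin{bmatrix}
    \cos\theta_i & -\sin\theta_i\cos\alpha_i &  \sin\theta_i\sin\alpha_i & a_i\cos\theta_i\\
    \sin\theta_i &  \cos\theta_i\cos\alpha_i & -\cos\theta_i\sin\alpha_i & a_i\sin\theta_i\\
    0            &  \sin\alpha_i             &  \cos\alpha_i             & d_i\\
    0            &  0                        &  0                        & 1
    \end{bmatrix}
    ```

    so the end-effector pose of a 5-DoF arm is the product T = A_1 A_2 A_3 A_4 A_5, which is the kinematic input to the Newton-Euler dynamics.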

  16. Proposed Robot Scheme with 5 DoF and Dynamic Modelling Using Maple Software

    OpenAIRE

    Shala Ahmet; Bruçi Mirlind

    2017-01-01

    This paper presents the dynamical modelling of robots, which is commonly the first important step in the modelling, analysis and control of robotic systems. The paper focuses on using the Denavit-Hartenberg (DH) convention for the kinematics and the Newton-Euler formulation for the dynamic modelling of a 5-DoF (degree of freedom) 3D robot. The dynamical model is derived using the Maple software. The derived dynamical model of the 5-DoF robot is converted for MATLAB use, for future analysis, control ...

  17. An executable software architecture model for response time and reliability assessment

    Directory of Open Access Journals (Sweden)

    Zahra Norouzi

    2014-09-01

    With the increasing use of Unified Modeling Language (UML) diagrams to describe software architecture, and the importance of evaluating nonfunctional requirements at the level of the software architecture, creating an executable model from these diagrams is essential. On the other hand, UML diagrams do not directly provide features to evaluate nonfunctional system requirements; such capabilities can be added to UML diagrams by applying efficiency and reliability stereotypes. Because standard UML techniques can only deal with crisp (certain) information, we develop uncertain UML stereotypes and tags. In this paper, the architecture of a software system is described using the use case, sequence and deployment diagrams of the Unified Modeling Language, annotated with fuzzy stereotypes related to response time and reliability. The proposed methods for calculating response time and reliability based on fuzzy rules are introduced, and the algorithm is implemented as an executable model based on a colored fuzzy Petri net.

  18. Virtual Systems Pharmacology (ViSP) software for mechanistic system-level model simulations

    Directory of Open Access Journals (Sweden)

    Sergey eErmakov

    2014-10-01

    Multiple software programs are available for designing and running large-scale system-level pharmacology models used in the drug development process. Depending on the problem, scientists may be forced to use several modeling tools, which can increase model development time, IT costs and so on. Therefore it is desirable to have a single platform that allows setting up and running large-scale simulations for models that have been developed with different modeling tools. We developed a workflow and a software platform in which a model file is compiled into a self-contained executable that is no longer dependent on the software that was used to create the model. At the same time the full model specifics are preserved by presenting all model parameters as input parameters for the executable. This platform was implemented as a model-agnostic, therapeutic-area-agnostic, web-based application with a database back-end that can be used to configure, manage and execute large-scale simulations for multiple models by multiple users. The user interface is designed to be easily configurable to reflect the specifics of the model and the user's particular needs, and the back-end database has been implemented to store and manage all aspects of the system, such as Models, Virtual Patients, User Interface Settings, and Results. The platform can be adapted and deployed on an existing cluster or cloud computing environment. Its use was demonstrated with a metabolic disease systems pharmacology model that simulates the effects of two antidiabetic drugs, metformin and fasiglifam, in type 2 diabetes mellitus patients.

  19. Virtual Systems Pharmacology (ViSP) software for simulation from mechanistic systems-level models.

    Science.gov (United States)

    Ermakov, Sergey; Forster, Peter; Pagidala, Jyotsna; Miladinov, Marko; Wang, Albert; Baillie, Rebecca; Bartlett, Derek; Reed, Mike; Leil, Tarek A

    2014-01-01

    Multiple software programs are available for designing and running large scale system-level pharmacology models used in the drug development process. Depending on the problem, scientists may be forced to use several modeling tools that could increase model development time, IT costs and so on. Therefore, it is desirable to have a single platform that allows setting up and running large-scale simulations for the models that have been developed with different modeling tools. We developed a workflow and a software platform in which a model file is compiled into a self-contained executable that is no longer dependent on the software that was used to create the model. At the same time the full model specifics are preserved by presenting all model parameters as input parameters for the executable. This platform was implemented as a model-agnostic, therapeutic-area-agnostic and web-based application with a database back-end that can be used to configure, manage and execute large-scale simulations for multiple models by multiple users. The user interface is designed to be easily configurable to reflect the specifics of the model and the user's particular needs, and the back-end database has been implemented to store and manage all aspects of the system, such as Models, Virtual Patients, User Interface Settings, and Results. The platform can be adapted and deployed on an existing cluster or cloud computing environment. Its use was demonstrated with a metabolic disease systems pharmacology model that simulates the effects of two antidiabetic drugs, metformin and fasiglifam, in type 2 diabetes mellitus patients.
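
    The pattern both ViSP records describe, a self-contained executable whose model parameters are all exposed as inputs, can be sketched in miniature. This toy (not ViSP's tooling; the PK model and parameter names are invented) shows a simulator driven entirely by command-line parameters:

    ```python
    # Minimal sketch of the "all model parameters as executable inputs" idea:
    # a self-contained toy simulator configured purely from the command line.
    import argparse

    def simulate(dose_mg: float, ka: float, ke: float, hours: int) -> list[float]:
        """Toy one-compartment PK model, first-order absorption/elimination."""
        gut, plasma, series = dose_mg, 0.0, []
        dt = 0.1
        for step in range(int(hours / dt)):
            absorbed = ka * gut * dt
            eliminated = ke * plasma * dt
            gut -= absorbed
            plasma += absorbed - eliminated
            if step % int(1 / dt) == 0:
                series.append(round(plasma, 2))   # hourly samples
        return series

    if __name__ == "__main__":
        p = argparse.ArgumentParser(description="Self-contained toy PK simulator")
        p.add_argument("--dose-mg", type=float, default=500.0)
        p.add_argument("--ka", type=float, default=1.0)   # absorption rate (1/h)
        p.add_argument("--ke", type=float, default=0.2)   # elimination rate (1/h)
        p.add_argument("--hours", type=int, default=24)
        args = p.parse_args()
        print(simulate(args.dose_mg, args.ka, args.ke, args.hours))
    ```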

  20. APIS: a software for model identification, simulation and dosage regimen calculations in clinical and experimental pharmacokinetics.

    Science.gov (United States)

    Iliadis, A; Brown, A C; Huggins, M L

    1992-08-01

    APIS is a software package based on mathematical modelling which provides a reliable approach to optimizing drug therapy. It was designed to assist clinicians in interpreting blood drug levels so that drug therapy may be better and more cost-effective. It is a methodological approach to describing, predicting and controlling the kinetic behaviour of a drug. The software incorporates the principle of Bayesian procedures, i.e. one can use all available patient information (population) to determine patient-specific parameter estimates. These estimates can then be used to design an optimal and individualized drug regimen. APIS is an attractive and useful tool for clinical and experimental pharmacokinetics. APIS may be used on any IBM-compatible computer running the Microsoft Windows environment. The software is menu-driven to provide a very user-friendly tool for analysing pharmacokinetic data and for designing dosage regimens.

  1. Modelling the critical success factors of agile software development projects in South Africa

    Directory of Open Access Journals (Sweden)

    Tawanda B. Chiyangwa

    2017-01-01

    Background: The continued failure of agile and traditional software development projects has led to consideration of, attention to and dispute over the critical success factors - the aspects that are most vital to making a software engineering methodology fruitful. Although there is an increasing variety of critical success factors and methodologies, conceptual frameworks that capture their causal relationships are limited. Objective: The objective of this study was to identify and provide insights into the critical success factors that influence the success of software development projects using agile methodologies in South Africa. Method: A quantitative method of data collection was used. Data were collected in South Africa through a web-based survey using structured questionnaires. Results: The results show that organisational factors have a great influence on performance expectancy characteristics. Conclusion: The results of this study yielded a comprehensive model that can provide guidelines to the agile community and to agile professionals.

  2. Partition expanders

    Czech Academy of Sciences Publication Activity Database

    Gavinsky, Dmitry; Pudlák, Pavel

    2017-01-01

    Roč. 60, č. 3 (2017), s. 378-395 ISSN 1432-4350 R&D Projects: GA ČR GBP202/12/G061 Institutional support: RVO:67985840 Keywords : expanders * pseudorandomness * communication complexity Subject RIV: BA - General Mathematics Impact factor: 0.645, year: 2016 http://link.springer.com/article/10.1007%2Fs00224-016-9738-5

  3. THE MODEL OF FORMATION OF RESEARCH COMPETENCE OF FUTURE SOFTWARE ENGINEERS

    Directory of Open Access Journals (Sweden)

    N. Osipova

    2014-06-01

    The article analyzes practical experience and the theoretical and methodological background of the formation of research competence of future software engineers. We define the content, structure, criteria and indicators of the research competence of future software engineers, characterize the levels of its formation, and explain the main phases of its formation. In consideration of the specific character of the formation of research competence of software engineers, job market requirements and the social order, much attention is paid in the article to student participation in the research projects of the chair, particularly in international projects and projects commissioned by the Ministry of Education and Science of Ukraine. An important factor in the effective formation of research competence of future software engineers is students' work in the scientific schools of the chair and their training in IT companies and in the IT departments of higher education institutions and other educational establishments, including abroad. We also pay attention to the need for group work among participants of the educational process, which can be provided by their participation in scientific problem groups, scientific schools and joint research projects. The conducted research confirms the effectiveness of the implementation of the proposed model of formation of research competence of future software engineers.

  4. Development of an Open Source Image-Based Flow Modeling Software - SimVascular

    Science.gov (United States)

    Updegrove, Adam; Merkow, Jameson; Schiavazzi, Daniele; Wilson, Nathan; Marsden, Alison; Shadden, Shawn

    2014-11-01

    SimVascular (www.simvascular.org) is currently the only comprehensive software package that provides a complete pipeline from medical image data segmentation to patient-specific blood flow simulation. This software and its derivatives have been used in hundreds of conference abstracts and peer-reviewed journal articles, and have served as the foundation of medical startups. SimVascular was initially released in August 2007, yet major challenges and deterrents for new adopters were the requirement to license three expensive commercial libraries utilized by the software, a complicated build process, and a lack of documentation, support and organized maintenance. In the past year, the SimVascular team has made significant progress in integrating open source alternatives for the linear solver, solid modeling, and mesh generation commercial libraries required by the original public release. In addition, the build system, available distributions, and graphical user interface have been significantly enhanced. Finally, the software has been updated to enable users to directly run simulations using models and boundary condition values included in the Vascular Model Repository (vascularmodel.org). In this presentation we briefly overview the capabilities of the new SimVascular 2.0 release. National Science Foundation.

  5. End-to-end Information Flow Security Model for Software-Defined Networks

    Directory of Open Access Journals (Sweden)

    D. Ju. Chaly

    2015-01-01

    Software-defined networks (SDNs) are a novel networking paradigm that has become an enabling technology for many modern applications such as network virtualization, policy-based access control and many others. Software can provide flexibility and fast-paced innovation in networking; however, it has a complex nature. In this connection there is an increasing need for means of assuring its correctness and security. Abstract models for SDN can tackle these challenges. This paper addresses confidentiality and some integrity properties of SDNs. These are critical properties for multi-tenant SDN environments, since the network management software must ensure that no confidential data of one tenant are leaked to other tenants in spite of using the same physical infrastructure. We define a notion of end-to-end security in the context of software-defined networks and propose a semantic model in which one can reason about confidentiality and check that confidential information flows do not interfere with non-confidential ones. We show that the model can be extended to reason about networks with secure and insecure links, which can arise, for example, in wireless environments.
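
    As a rough illustration of the kind of check such a model enables (my own toy encoding, not the paper's semantics; the topology and tenant labels are invented): label each host with a tenant, then verify that no forwarding path carries a tenant's confidential traffic to a host of another tenant.

    ```python
    # Toy end-to-end confidentiality check over an SDN forwarding graph:
    # flag a leak if any path from a tenant's confidential host reaches a
    # host owned by a different tenant.
    from collections import deque

    edges = {        # forwarding rules as a directed graph (invented topology)
        "h1": ["s1"], "s1": ["s2"], "s2": ["h2", "h3"], "h2": [], "h3": [],
    }
    tenant = {"h1": "A", "h2": "A", "h3": "B", "s1": None, "s2": None}

    def leaks(source: str) -> list[str]:
        """Return hosts of other tenants reachable from a confidential source."""
        seen, queue, bad = {source}, deque([source]), []
        while queue:
            node = queue.popleft()
            for nxt in edges.get(node, []):
                if nxt in seen:
                    continue
                seen.add(nxt)
                owner = tenant.get(nxt)
                if owner is not None and owner != tenant[source]:
                    bad.append(nxt)
                queue.append(nxt)
        return bad

    print(leaks("h1"))  # ['h3']: confidential traffic from tenant A reaches B
    ```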

  6. Examples of testing global identifiability of biological and biomedical models with the DAISY software.

    Science.gov (United States)

    Saccomani, Maria Pia; Audoly, Stefania; Bellu, Giuseppina; D'Angiò, Leontina

    2010-04-01

    DAISY (Differential Algebra for Identifiability of SYstems) is a recently developed computer algebra software tool which can be used to automatically check the global identifiability of (linear and) nonlinear dynamic models described by differential equations involving polynomial or rational functions. Global identifiability is a fundamental prerequisite for model identification, which is important not only for biological or medical systems but also for many physical and engineering systems derived from first principles. Lack of identifiability implies that the parameter estimation techniques may not fail, but any obtained numerical estimates will be meaningless. The software does not require understanding of the underlying mathematical principles and can be used by researchers in applied fields with a minimum of mathematical background. We illustrate the DAISY software by checking the a priori global identifiability of two benchmark nonlinear models taken from the literature. The analysis of these two examples includes a comparison with other methods and demonstrates how identifiability analysis is simplified by this tool. We then illustrate the identifiability analysis of two further examples, including a discussion of some specific aspects related to the role of observability and knowledge of initial conditions in testing identifiability, and of the computational complexity of the software. The main focus of this paper is not on the description of the mathematical background of the algorithm, which has been presented elsewhere, but on illustrating its use and some of its more interesting features. DAISY is available on the web site http://www.dei.unipd.it/~pia/. 2010 Elsevier Ltd. All rights reserved.
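
    A textbook illustration of the property DAISY tests (this example is mine, not one of the paper's benchmarks): consider a one-compartment model with a known bolus dose D,

    ```latex
    \dot{x}(t) = -k\,x(t), \qquad x(0) = D, \qquad y(t) = \frac{x(t)}{V}
    ```

    so that y(t) = (D/V) e^{-kt}. The observed output determines k uniquely (from the decay rate) and V uniquely (from y(0) with D known), hence both parameters are globally identifiable; if D were unknown, only the ratio D/V could be recovered.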

  7. Malware Propagation and Prevention Model for Time-Varying Community Networks within Software Defined Networks

    OpenAIRE

    Lan Liu; Ryan K. L. Ko; Guangming Ren; Xiaoping Xu

    2017-01-01

    As the adoption of Software Defined Networks (SDNs) grows, the security of SDN still has several unaddressed limitations. A key network security research area is the study of malware propagation across SDN-enabled networks. To analyze the spreading processes of network malware (e.g., viruses) in SDN, we propose a dynamic model with a time-varying community network, inspired by research models on the spread of epidemics in complex networks across communities. We assume subnets of the ne...

  8. TChem - A Software Toolkit for the Analysis of Complex Kinetic Models

    Energy Technology Data Exchange (ETDEWEB)

    Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Najm, Habib N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Knio, Omar [Johns Hopkins Univ., Baltimore, MD (United States)

    2011-05-01

    The TChem toolkit is a software library that enables numerical simulations using complex chemistry and facilitates the analysis of detailed kinetic models. The toolkit provides capabilities for computing thermodynamic properties based on NASA polynomials and species production/consumption rates. It incorporates methods that can selectively modify reaction parameters for sensitivity analysis. The library contains several functions that provide analytically computed Jacobian matrices necessary for the efficient time advancement and analysis of detailed kinetic models.
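
    For context, the NASA polynomials mentioned here are the standard seven-coefficient fits that express dimensionless thermodynamic properties as polynomials in temperature (the general form, independent of TChem):

    ```latex
    \frac{C_p}{R} = a_1 + a_2 T + a_3 T^2 + a_4 T^3 + a_5 T^4
    \qquad
    \frac{H}{RT} = a_1 + \frac{a_2}{2} T + \frac{a_3}{3} T^2 + \frac{a_4}{4} T^3 + \frac{a_5}{5} T^4 + \frac{a_6}{T}
    ```

    with an analogous form for entropy using a_1 ln T and the seventh coefficient a_7; each species carries two such coefficient sets for low- and high-temperature ranges.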

  9. Software for Generating Troposphere Corrections for InSAR Using GPS and Weather Model Data

    Science.gov (United States)

    Moore, Angelyn W.; Webb, Frank H.; Fishbein, Evan F.; Fielding, Eric J.; Owen, Susan E.; Granger, Stephanie L.; Bjoerndahl, Fredrik; Loefgren, Johan; Fang, Peng; Means, James D.; hide

    2013-01-01

    Atmospheric errors due to the troposphere are a limiting error source for spaceborne interferometric synthetic aperture radar (InSAR) imaging. This software generates tropospheric delay maps that can be used to correct atmospheric artifacts in InSAR data. The software automatically acquires all needed GPS (Global Positioning System), weather, and digital elevation model data, and generates a tropospheric correction map using a novel algorithm for combining GPS and weather information while accounting for terrain. Existing JPL software was prototypical in nature, required a MATLAB license, required additional steps to acquire and ingest the needed GPS and weather data, and did not account for topography in interpolation. The previous software also did not achieve a level of automation suitable for integration in a Web portal. This software overcomes these issues. GPS estimates of tropospheric delay are a source of corrections that can be used to form correction maps to be applied to InSAR data, but the spacing of GPS stations is insufficient to remove short-wavelength tropospheric artifacts. This software combines interpolated GPS delay with weather model precipitable water vapor (PWV) and a digital elevation model to account for terrain, increasing the spatial resolution of the tropospheric correction maps and thus removing short-wavelength tropospheric artifacts to a greater extent. It will be integrated into a Web portal request system, allowing use in a future L-band SAR Earth radar mission data system. This will be a significant contribution to its technology readiness, building on existing investments in in situ space geodetic networks, and improving the timeliness, quality, and science value of the collected data.
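
    A much-simplified sketch of the combination step (my own toy scheme, not JPL's algorithm; the station data, inverse-distance weighting, and the 2 km wet-delay scale height are illustrative assumptions):

    ```python
    # Toy version of the terrain-aware interpolation: inverse-distance
    # interpolation of GPS zenith wet delays, rescaled to pixel elevation
    # with an exponential vertical profile. Real processing also blends in
    # weather-model PWV and is far more elaborate.
    import numpy as np

    SCALE_HEIGHT_M = 2000.0  # assumed wet-delay e-folding height (illustrative)

    def delay_at(pixel_xy, pixel_elev_m, stations):
        """stations: list of (x, y, elev_m, zwd_m) tuples from GPS processing."""
        num = den = 0.0
        for sx, sy, s_elev, zwd in stations:
            # refer each station delay to sea level, then weight by distance
            zwd_sea = zwd * np.exp(s_elev / SCALE_HEIGHT_M)
            w = 1.0 / (np.hypot(pixel_xy[0] - sx, pixel_xy[1] - sy) + 1e-6)
            num += w * zwd_sea
            den += w
        # rescale the interpolated sea-level delay to the pixel's elevation
        return (num / den) * np.exp(-pixel_elev_m / SCALE_HEIGHT_M)

    stations = [(0, 0, 100.0, 0.12), (10_000, 0, 900.0, 0.08)]  # invented data
    print(round(delay_at((5_000, 2_000), 400.0, stations), 4))
    ```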

  10. Design of complete software GPS signal simulator with low complexity and precise multipath channel model

    Directory of Open Access Journals (Sweden)

    G. Arul Elango

    2016-09-01

    The need for GPS data simulators has become important due to the tremendous growth in the design of versatile GPS receivers. Commercial hardware- and software-based GPS simulators are expensive and time consuming. In this work, a low-cost, simple, novel GPS L1 signal simulator is designed for testing and evaluating the performance of a software GPS receiver in a laboratory environment. A typical real-time paradigm, similar to an actual satellite-derived GPS signal, is created in a computer-generated scenario. The proposed GPS software simulator offers considerable analysis and testing flexibility to researchers and developers, as it is totally software based, running primarily on a laptop/personal computer without the requirement of any hardware. The simulator allows for re-configurability and test repeatability and is developed on the VC++ platform to minimize simulation time. It also incorporates a Rayleigh multipath channel fading model under non-line-of-sight (NLOS) conditions. In this work, to design the simulator efficiently, several Rayleigh fading models, viz. Inverse Discrete Fourier Transform (IDFT), Filtered White Gaussian Noise (FWGN) and modified Sum of Sinusoids (SOS) simulators, are tested and compared in terms of the accuracy of their first- and second-order statistical metrics and execution time, and the latter is found to be the most appropriate Rayleigh multipath model for incorporation into the GPS simulator. The fading model, written in the MATLAB engine, has been linked with the software GPS simulator module, enabling testing of the GPS receiver's functionality in different fading environments.
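
    A bare-bones sum-of-sinusoids Rayleigh fading generator, for orientation (a generic Jakes-style construction; the paper's modified SOS variant differs in how the angles and phases are drawn):

    ```python
    # Generic sum-of-sinusoids Rayleigh fading gain (Jakes-style sketch).
    import numpy as np

    def sos_rayleigh(fd_hz, fs_hz, n_samples, n_sinusoids=16, seed=0):
        """Complex fading gain with maximum Doppler fd_hz, sampled at fs_hz."""
        rng = np.random.default_rng(seed)
        t = np.arange(n_samples) / fs_hz
        # random arrival angles and phases for each sinusoid, per I/Q branch
        theta = rng.uniform(0, 2 * np.pi, n_sinusoids)
        phi_i = rng.uniform(0, 2 * np.pi, n_sinusoids)
        phi_q = rng.uniform(0, 2 * np.pi, n_sinusoids)
        doppler = 2 * np.pi * fd_hz * np.cos(theta)      # per-path Doppler shift
        i = np.sum(np.cos(np.outer(t, doppler) + phi_i), axis=1)
        q = np.sum(np.sin(np.outer(t, doppler) + phi_q), axis=1)
        return (i + 1j * q) / np.sqrt(n_sinusoids)       # unit average power

    g = sos_rayleigh(fd_hz=100.0, fs_hz=10_000.0, n_samples=5)
    print(np.abs(g))  # Rayleigh-distributed envelope samples
    ```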

  11. Expanding (3+1)-dimensional universe from a lorentzian matrix model for superstring theory in (9+1) dimensions.

    Science.gov (United States)

    Kim, Sang-Woo; Nishimura, Jun; Tsuchiya, Asato

    2012-01-06

    We reconsider the matrix model formulation of type IIB superstring theory in (9+1)-dimensional space-time. Unlike the previous works in which the Wick rotation was used to make the model well defined, we regularize the Lorentzian model by introducing infrared cutoffs in both the spatial and temporal directions. Monte Carlo studies reveal that the two cutoffs can be removed in the large-N limit and that the theory thus obtained has no parameters other than one scale parameter. Moreover, we find that three out of nine spatial directions start to expand at some "critical time," after which the space has SO(3) symmetry instead of SO(9).

  12. EMMC guidance on quality assurance for academic materials modelling software engineering

    OpenAIRE

    European Materials Modelling Council

    2015-01-01

    Proposed recommendations for software development in LEIT projects. This document presents the advice of software owners, both commercial and academic, on what academic software developers could do to produce better-quality software, ready to be used by third parties.

  13. Robust optimal design of experiments for model discrimination using an interactive software tool.

    Directory of Open Access Journals (Sweden)

    Johannes Stegmaier

    Mathematical modeling of biochemical processes contributes significantly to a better understanding of biological functionality and underlying dynamic mechanisms. To support time-consuming and costly lab experiments, kinetic reaction equations can be formulated as a set of ordinary differential equations, which in turn allows hypothetical models to be simulated and compared in silico. To identify new experimental designs that are able to discriminate between investigated models, the approach used in this work solves a semi-infinite constrained nonlinear optimization problem using derivative-based numerical algorithms. The method takes parameter variability into account, so that new experimental designs are robust against parameter changes while maintaining the optimal potential to discriminate between hypothetical models. In this contribution we present a newly developed software tool that offers a convenient graphical user interface for model discrimination. We demonstrate the beneficial operation of the discrimination approach and the usefulness of the software tool by analyzing a realistic benchmark experiment from the literature. New robust optimal designs that allow discrimination between the investigated model hypotheses of the benchmark experiment are successfully calculated and yield promising results. The robustification approach provides maximally discriminating experiments for the worst parameter configurations, which can be used to estimate the meaningfulness of upcoming experiments. A major benefit of the graphical user interface is the ability to interactively investigate the model behavior and the clear arrangement of numerous variables. In addition to a brief theoretical overview of the discrimination method and the functionality of the software tool, the importance of the robustness of experimental designs against parameter variability is demonstrated on a biochemical benchmark problem. The software is licensed under the GNU

  14. Challenges encountered when expanding activated sludge models: a case study based on N2O production

    DEFF Research Database (Denmark)

    Snip, Laura; Boiocchi, Riccardo; Flores Alsina, Xavier

    2014-01-01

    (WWTPs). As a consequence, these experiments might not be representative for full-scale performance, and unexpected behaviour may be observed when simulating WWTP models using the derived process equations. In this paper we want to highlight problems encountered using a simplified case study: a modified...... version of the Activated Sludge Model No. 1 (ASM1) is upgraded with nitrous oxide (N2O) formation by ammonia-oxidizing bacteria. Four different model structures have been implemented in the Benchmark Simulation Model No. 1 (BSM1). The results of the investigations revealed two typical difficulties...

  15. Application of DIgSILENT POWERFACTORY software for modeling of industrial network relay protection system

    Directory of Open Access Journals (Sweden)

    Sučević Nikola

    2016-01-01

    This paper presents the modeling of an industrial network relay protection system using the DIgSILENT PowerFactory software. The basis for the protection system model is a model of a single substation in an industrial network. The paper presents the procedure for modeling the protective devices of 6 kV asynchronous motors and 6/0.4 kV transformers, as well as bus coupler protection and busbar protection. The protective relay system response to the simulated disturbances is shown in the paper.

  16. Empirical and Face Validity of Software Maintenance Defect Models Used at the Jet Propulsion Laboratory

    Science.gov (United States)

    Taber, William; Port, Dan

    2014-01-01

    At the Mission Design and Navigation Software Group at the Jet Propulsion Laboratory we make use of finite exponential based defect models to aid in maintenance planning and management for our widely used critical systems. However a number of pragmatic issues arise when applying defect models for a post-release system in continuous use. These include: how to utilize information from problem reports rather than testing to drive defect discovery and removal effort, practical model calibration, and alignment of model assumptions with our environment.

  17. Modeling density-driven flow in porous media principles, numerics, software

    CERN Document Server

    Holzbecher, Ekkehard O

    1998-01-01

    Modeling of flow and transport in groundwater has become an important focus of scientific research in recent years. Most contributions to this subject deal with flow situations where density and viscosity changes in the fluid are neglected. This restriction may not always be justified. The models presented in the book demonstrate impressively that the flow pattern may be completely different when density changes are taken into account. The main applications of the models are: thermal and saline convection, geothermal flow, saltwater intrusion, flow through salt formations, etc. This book not only presents the basic theory; the reader can also test their knowledge by applying the included software and can set up their own models.

  18. Optical Network Models and Their Application to Software-Defined Network Management

    Directory of Open Access Journals (Sweden)

    Thomas Szyrkowiec

    2017-01-01

    Full Text Available Software-defined networking is finding its way into optical networks, where it promises a simplification and unification of network management, allowing the automation of operational tasks despite the highly diverse and vendor-specific commercial systems and the complexity and analog nature of optical transmission. Common abstractions and interfaces are a fundamental component of software-defined optical networking. Currently, a number of models for optical networks are available, all claiming to provide open and vendor-agnostic management of optical equipment. In this work, we survey and compare the most important models and propose an intent interface for creating virtual topologies that is integrated into the existing model ecosystem.

  19. 3DVEM SOFTWARE MODULES FOR EFFICIENT MANAGEMENT OF POINT CLOUDS AND PHOTOREALISTIC 3D MODELS

    Directory of Open Access Journals (Sweden)

    S. Fabado

    2013-07-01

    Full Text Available Cultural heritage managers in general, and information users in particular, are not usually accustomed to dealing with high-technology hardware and software. On the contrary, providers of metric surveys are most of the time applying the latest developments to real-life conservation and restoration projects. This paper addresses the software issue of handling and managing either 3D point clouds or (photorealistic) 3D models to bridge the gap between information users and information providers as regards the management of information which users and providers share as a tool for decision-making, analysis, visualization and management. There are not many viewers specifically designed to handle, manage and easily create animations of architectural and/or archaeological 3D objects, monuments and sites, among others. 3DVEM – 3D Viewer, Editor & Meter software will be introduced to the scientific community, as well as 3DVEM – Live and 3DVEM – Register. The advantages of managing projects with both sets of data, 3D point clouds and photorealistic 3D models, will be introduced. Different visualizations of true documentation projects in the fields of architecture, archaeology and industry will be presented. Emphasis will be placed on highlighting the features of the new user-friendly software for managing virtual projects. Furthermore, the ease of creating controlled interactive animations (both walk-through and fly-through) by the user, either on-the-fly or as a traditional movie file, will be demonstrated through 3DVEM – Live.

  20. 3DVEM Software Modules for Efficient Management of Point Clouds and Photorealistic 3d Models

    Science.gov (United States)

    Fabado, S.; Seguí, A. E.; Cabrelles, M.; Navarro, S.; García-De-San-Miguel, D.; Lerma, J. L.

    2013-07-01

    Cultural heritage managers in general, and information users in particular, are not usually accustomed to dealing with high-technology hardware and software. On the contrary, providers of metric surveys are most of the time applying the latest developments to real-life conservation and restoration projects. This paper addresses the software issue of handling and managing either 3D point clouds or (photorealistic) 3D models to bridge the gap between information users and information providers as regards the management of information which users and providers share as a tool for decision-making, analysis, visualization and management. There are not many viewers specifically designed to handle, manage and easily create animations of architectural and/or archaeological 3D objects, monuments and sites, among others. 3DVEM - 3D Viewer, Editor & Meter software will be introduced to the scientific community, as well as 3DVEM - Live and 3DVEM - Register. The advantages of managing projects with both sets of data, 3D point clouds and photorealistic 3D models, will be introduced. Different visualizations of true documentation projects in the fields of architecture, archaeology and industry will be presented. Emphasis will be placed on highlighting the features of the new user-friendly software for managing virtual projects. Furthermore, the ease of creating controlled interactive animations (both walk-through and fly-through) by the user, either on-the-fly or as a traditional movie file, will be demonstrated through 3DVEM - Live.

  1. Specialization of the Land Administration Domain Model (LADM) : An Option for Expanding the Legal Profiles

    NARCIS (Netherlands)

    Paasch, J.; Van Oosterom, P.; Paulsson, J.; Lemmen, C.

    2013-01-01

    The Land Administration Domain Model, LADM, unanimously passed the final vote towards becoming an international standard, ISO 19152, on 1 November 2012. Based on the standard, this paper proposes a more detailed classification of interests in land as modelled within LADM and an

  2. Evaluating Economic Impacts of Expanded Global Wood Energy Consumption with the USFPM/GFPM Model

    Science.gov (United States)

    Peter J. Ince; Andrew Kramp; Kenneth E. Skog

    2012-01-01

    A U.S. forest sector market module was developed within the general Global Forest Products Model. The U.S. module tracks regional timber markets, timber harvests by species group, and timber product outputs in greater detail than does the global model. This hybrid approach provides detailed regional market analysis for the United States while retaining the...

  3. An automation of design and modelling tasks in NX Siemens environment with original software - generator module

    Science.gov (United States)

    Zbiciak, M.; Grabowik, C.; Janik, W.

    2015-11-01

    Nowadays the constructional design process is almost exclusively aided by CAD/CAE/CAM systems. It is estimated that nearly 80% of design activities are routine in nature, and these routine design tasks are highly susceptible to automation. Design automation is usually achieved with API tools that allow original software to be built for automating different engineering activities. This paper presents original software developed to automate engineering tasks at the stage of designing a product's geometrical shape. The software works exclusively in the Siemens NX CAD/CAM/CAE environment and was prepared in Microsoft Visual Studio using the .NET technology and the NX SNAP library. Its functionality allows the design and modelling of spur and helical involute gears; moreover, it is possible to estimate relative manufacturing costs. With the Generator module it is possible to design and model both standard and non-standard gear wheels. The main advantage of a model generated in this way is its better representation of the involute curve in comparison with those drawn by the standard tools of specialized CAD systems. This stems from the fact that CAD systems usually draw an involute curve through 3 points, corresponding to points on the addendum circle, the reference diameter of the gear and the base circle, respectively. In the Generator module the involute curve is drawn through 11 involute points located on and above the base and addendum circles, so the 3D gear wheel models are highly accurate. Application of the Generator module makes the modelling process very rapid, reducing the gear wheel modelling time to several seconds. During the conducted research, the differences between the standard 3-point and the 11-point involutes were analysed. The results and conclusions drawn from this analysis are presented in detail.
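
    As a sketch of the geometric point made above, the snippet below samples 11 points along an involute between the base circle and the addendum circle, the same sampling density the Generator module uses. The parametrization is the standard involute-of-a-circle form; the radii are hypothetical values, not figures from the paper.

```python
# Illustrative sketch: sampling an involute with 11 points between the base
# and addendum circles, as opposed to a coarse 3-point construction.
import numpy as np

def involute_points(r_base, r_max, n_points=11):
    # Involute of a circle: x = rb(cos t + t sin t), y = rb(sin t - t cos t).
    # The roll angle t reaching radius r satisfies r = rb * sqrt(1 + t^2).
    t_max = np.sqrt((r_max / r_base) ** 2 - 1.0)
    t = np.linspace(0.0, t_max, n_points)
    x = r_base * (np.cos(t) + t * np.sin(t))
    y = r_base * (np.sin(t) - t * np.cos(t))
    return np.column_stack([x, y])

# Hypothetical base and addendum radii for a small spur gear (mm)
pts = involute_points(r_base=20.0, r_max=24.0)
print(pts.round(3))
```

    Passing a spline through these 11 points approximates the true curve far better than a 3-point construction, which is the accuracy gain the abstract attributes to the Generator module.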

  4. Model analysis of the chemical conversion of exhaust species in the expanding plumes of subsonic aircraft

    Energy Technology Data Exchange (ETDEWEB)

    Moellhoff, M.; Hendricks, J.; Lippert, E.; Petry, H. [Koeln Univ. (Germany). Inst. fuer Geophysik und Meteorologie; Sausen, R. [Deutsche Forschungsanstalt fuer Luft- und Raumfahrt e.V. (DLR), Oberpfaffenhofen (Germany). Inst. fuer Physik der Atmosphaere

    1997-12-31

    A box model and two different one-dimensional models are used to investigate the chemical conversion of exhaust species in the dispersing plume of a subsonic aircraft flying at cruise altitude. The effect of varying the time of day of release, as well as the impact of changing dispersion time, is studied with particular attention to aircraft-induced O{sub 3} production. Effective emission amounts for consideration in mesoscale and global models are calculated. Simulations with modified photolysis rates are performed to show the sensitivity of the photochemistry to the occurrence of cirrus clouds. (author) 8 refs.

  5. Application of Gaseous Sphere Injection Method for Modeling Under-expanded H2 Injection

    Energy Technology Data Exchange (ETDEWEB)

    Whitesides, R; Hessel, R P; Flowers, D L; Aceves, S M

    2010-12-03

    A methodology for modeling gaseous injection has been refined and applied to recent experimental data from the literature. This approach uses a discrete-phase analogy to handle gaseous injection, allowing it to be added to a CFD grid without needing to resolve the injector nozzle. This paper focuses on model testing to provide the basis for simulation of hydrogen direct-injected internal combustion engines. The model has been updated to be more applicable to full engine simulations, and shows good agreement with experiments for jet penetration and time-dependent axial mass fraction, while the available radial mass fraction data are less well predicted.

  6. Deception and Cognitive Load: Expanding Our Horizon with a Working Memory Model

    National Research Council Canada - National Science Library

    Sporer, Siegfried L

    2016-01-01

    ...) working memory model, which integrates verbal and visual processes in working memory with retrieval from long-term memory and control of action, not only verbal content cues but also nonverbal...

  7. Ear-Shaped Stable Auricular Cartilage Engineered from Extensively Expanded Chondrocytes in an Immunocompetent Experimental Animal Model

    Science.gov (United States)

    Pomerantseva, Irina; Bichara, David A.; Tseng, Alan; Cronce, Michael J.; Cervantes, Thomas M.; Kimura, Anya M.; Neville, Craig M.; Roscioli, Nick; Vacanti, Joseph P.; Randolph, Mark A.

    2016-01-01

    Advancement of engineered ear in clinical practice is limited by several challenges. The complex, largely unsupported, three-dimensional auricular neocartilage structure is difficult to maintain. Neocartilage formation is challenging in an immunocompetent host due to active inflammatory and immunological responses. The large number of autologous chondrogenic cells required for engineering an adult human-sized ear presents an additional challenge because primary chondrocytes rapidly dedifferentiate during in vitro culture. The objective of this study was to engineer a stable, human ear-shaped cartilage in an immunocompetent animal model using expanded chondrocytes. The impact of basic fibroblast growth factor (bFGF) supplementation on achieving clinically relevant expansion of primary sheep chondrocytes by in vitro culture was determined. Chondrocytes expanded in standard medium were either combined with cryopreserved, primary passage 0 chondrocytes at the time of scaffold seeding or used alone as control. Disk and human ear-shaped scaffolds were made from porous collagen; ear scaffolds had an embedded, supporting titanium wire framework. Autologous chondrocyte-seeded scaffolds were implanted subcutaneously in sheep after 2 weeks of in vitro incubation. The quality of the resulting neocartilage and its stability and retention of the original ear size and shape were evaluated at 6, 12, and 20 weeks postimplantation. Neocartilage produced from chondrocytes that were expanded in the presence of bFGF was superior, and its quality improved with increased implantation time. In addition to characteristic morphological cartilage features, its glycosaminoglycan content was high and marked elastin fiber formation was present. The overall shape of engineered ears was preserved at 20 weeks postimplantation, and the dimensional changes did not exceed 10%. The wire frame within the engineered ear was able to withstand mechanical forces during wound healing and neocartilage

  8. Expanded modeling of temperature-dependent dielectric properties for microwave thermal ablation.

    Science.gov (United States)

    Ji, Zhen; Brace, Christopher L

    2011-08-21

    Microwaves are a promising source for thermal tumor ablation due to their ability to rapidly heat dispersive biological tissues, often to temperatures in excess of 100 °C. At these high temperatures, tissue dielectric properties change rapidly and, thus, so do the characteristics of energy delivery. Precise knowledge of how tissue dielectric properties change during microwave heating promises to facilitate more accurate simulation of device performance and helps optimize device geometry and energy delivery parameters. In this study, we measured the dielectric properties of liver tissue during high-temperature microwave heating. The resulting data were compiled into either a sigmoidal function of temperature or an integration of the time-temperature curve for both relative permittivity and effective conductivity. Coupled electromagnetic-thermal simulations of heating produced by a single monopole antenna using the new models were then compared to simulations with existing linear and static models, and experimental temperatures in liver tissue. The new sigmoidal temperature-dependent model more accurately predicted experimental temperatures when compared to temperature-time integrated or existing models. The mean percent differences between simulated and experimental temperatures over all times were 4.2% for sigmoidal, 10.1% for temperature-time integration, 27.0% for linear and 32.8% for static models at the antenna input power of 50 W. Correcting for tissue contraction improved agreement for powers up to 75 W. The sigmoidal model also predicted substantial changes in heating pattern due to dehydration. We can conclude from these studies that a sigmoidal model of tissue dielectric properties improves prediction of experimental results. More work is needed to refine and generalize this model.
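
    To make the sigmoidal idea concrete, below is a minimal sketch of a temperature-dependent relative permittivity with a sigmoidal drop around the water-vaporization range. The functional form is generic and the parameter values are placeholders, not the fitted liver-tissue coefficients from the study.

```python
# A sketch of a sigmoidal temperature dependence for tissue relative
# permittivity, in the spirit of the model described above. All parameter
# values are illustrative placeholders.
import numpy as np

def sigmoidal_permittivity(temp_c, eps_low_t=48.0, eps_high_t=5.0,
                           t_mid=96.0, width=4.0):
    # Permittivity falls from its low-temperature value toward a dehydrated
    # value as tissue water is driven off near and above 100 degrees C.
    frac = 1.0 / (1.0 + np.exp(-(temp_c - t_mid) / width))
    return eps_low_t + (eps_high_t - eps_low_t) * frac

for t in (37, 60, 90, 100, 110):
    print(t, round(float(sigmoidal_permittivity(t)), 2))
```

    In a coupled electromagnetic-thermal simulation, a function of this shape would be re-evaluated at each element's temperature on every time step, which is how the changing dielectric properties feed back into the predicted heating pattern.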

  9. Expanding a dynamic flux balance model of yeast fermentation to genome-scale

    OpenAIRE

    Vargas, Felipe A; Pizarro, Francisco; Pérez-Correa, J Ricardo; Agosin, Eduardo

    2011-01-01

    Background: Yeast is considered to be a workhorse of the biotechnology industry for the production of many value-added chemicals, alcoholic beverages and biofuels. Optimization of the fermentation is a challenging task that greatly benefits from dynamic models able to accurately describe and predict the fermentation profile and resulting products under different genetic and environmental conditions. In this article, we developed and validated a genome-scale dynamic flux balance model,...

  10. A Comparison of Four Software Programs for Implementing Decision Analytic Cost-Effectiveness Models.

    Science.gov (United States)

    Hollman, Chase; Paulden, Mike; Pechlivanoglou, Petros; McCabe, Christopher

    2017-08-01

    The volume and technical complexity of both academic and commercial research using decision analytic modelling has increased rapidly over the last two decades. The range of software programs used for their implementation has also increased, but it remains true that a small number of programs account for the vast majority of cost-effectiveness modelling work. We report a comparison of four software programs: TreeAge Pro, Microsoft Excel, R and MATLAB. Our focus is on software commonly used for building Markov models and decision trees to conduct cohort simulations, given their predominance in the published literature around cost-effectiveness modelling. Our comparison uses three qualitative criteria as proposed by Eddy et al.: "transparency and validation", "learning curve" and "capability". In addition, we introduce the quantitative criterion of processing speed. We also consider the cost of each program to academic users and commercial users. We rank the programs based on each of these criteria. We find that, whilst Microsoft Excel and TreeAge Pro are good programs for educational purposes and for producing the types of analyses typically required by health technology assessment agencies, the efficiency and transparency advantages of programming languages such as MATLAB and R become increasingly valuable when more complex analyses are required.
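
    For context, the workhorse calculation the four programs were compared on is a Markov cohort simulation: a cohort distributed across health states is pushed through a transition matrix each cycle while costs and QALYs accumulate. The sketch below uses Python for concreteness (the paper itself compares TreeAge Pro, Excel, R and MATLAB); the states, probabilities, costs and utilities are invented.

```python
# Minimal sketch of a Markov cohort model of the kind the compared programs
# implement; all numbers are invented for illustration.
import numpy as np

states = ["well", "sick", "dead"]
transition = np.array([[0.90, 0.08, 0.02],   # rows: from-state, cols: to-state
                       [0.10, 0.75, 0.15],
                       [0.00, 0.00, 1.00]])  # dead is absorbing
cost = np.array([100.0, 1500.0, 0.0])        # per-cycle cost per state
utility = np.array([0.85, 0.50, 0.0])        # per-cycle QALY weight per state

cohort = np.array([1.0, 0.0, 0.0])           # whole cohort starts well
total_cost = total_qaly = 0.0
for cycle in range(40):                      # 40 yearly cycles
    cohort = cohort @ transition             # advance the cohort one cycle
    total_cost += cohort @ cost
    total_qaly += cohort @ utility
print(f"expected cost {total_cost:.0f}, expected QALYs {total_qaly:.2f}")
```

    The loop above is a handful of lines in any of the four programs; the paper's point is that transparency, learning curve, processing speed and cost diverge sharply once models become more complex than this.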

  11. A free software for the evaluation and comparison of dose response models in clinical radiotherapy (DORES).

    Science.gov (United States)

    Tsougos, Ioannis; Grout, Ioannis; Theodorou, Kyriaki; Kappas, Constantin

    2009-03-01

    The aim of this work was to develop a user-friendly and simple tool for fast and accurate estimation of Normal Tissue Complication Probabilities (NTCP) for several radiobiological models, which can be used as a valuable complement to clinical experience. The software, named DORES (Dose Response Evaluation Software), has been developed in Visual Basic and includes three NTCP models (Lyman-Kutcher-Burman (LKB), Relative Seriality and Parallel). Required input information includes the Dose-Volume Histogram (DVH) for the Organs at Risk (OAR) of each treatment, the number of fractions and the total dose of therapy. NTCP values are computed and subsequently placed in a spreadsheet file for further analysis. A dose-response curve for every model is automatically generated. Every patient of the study population can be found on the curve, since by definition their corresponding dose-response points fall exactly on the theoretical dose-response curve when plotted on the same diagram. Distributions of absorbed dose alone do not provide information on the biological response of tissues to irradiation, so the use of this software may aid in the comparison of outcomes for different treatment plans or types of treatment, and also aid the evaluation of the sensitivity of different model predictions to uncertainties in parameter values. This was illustrated in a clinical case of breast cancer radiotherapy.
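
    The LKB model named above is conventionally computed as a standard normal CDF applied to the generalized equivalent uniform dose (gEUD) derived from the DVH. The sketch below shows that standard form; the DVH bins and the parameters TD50, m and n are hypothetical, and this is not the DORES implementation itself.

```python
# Standard-form sketch of the Lyman-Kutcher-Burman NTCP calculation from a
# differential DVH; parameter values are hypothetical.
import numpy as np
from math import erf, sqrt

def g_eud(doses, volumes, n):
    # Generalized equivalent uniform dose; fractional volumes must sum to 1.
    return float(np.sum(volumes * doses ** (1.0 / n))) ** n

def lkb_ntcp(doses, volumes, td50=70.0, m=0.14, n=0.25):
    eud = g_eud(np.asarray(doses, float), np.asarray(volumes, float), n)
    t = (eud - td50) / (m * td50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))   # standard normal CDF

doses = [20.0, 40.0, 60.0, 70.0]    # DVH bin doses (Gy)
volumes = [0.4, 0.3, 0.2, 0.1]      # fractional organ volume per bin
print(f"NTCP = {lkb_ntcp(doses, volumes):.3f}")
```

    Varying td50, m and n over their published uncertainty ranges and re-running this calculation is exactly the kind of sensitivity evaluation the abstract says the software supports.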

  12. Model-Based GN and C Simulation and Flight Software Development for Orion Missions beyond LEO

    Science.gov (United States)

    Odegard, Ryan; Milenkovic, Zoran; Henry, Joel; Buttacoli, Michael

    2014-01-01

    For Orion missions beyond low Earth orbit (LEO), the Guidance, Navigation, and Control (GN&C) system is being developed using a model-based approach for simulation and flight software. Lessons learned from the development of GN&C algorithms and flight software for the Orion Exploration Flight Test One (EFT-1) vehicle have been applied to the development of further capabilities for Orion GN&C beyond EFT-1. Continuing the use of a Model-Based Development (MBD) approach with the Matlab®/Simulink® tool suite, the process for GN&C development and analysis has been greatly improved. Furthermore, a model-based simulation environment in Simulink, rather than an external C-based simulation, greatly eases the development of flight algorithms. The benefits seen by employing lessons learned from EFT-1 are described, as well as the approach for implementing additional MBD techniques. Also detailed are the key enablers for improvements to the MBD process, including enhanced configuration management techniques for model-based software systems, automated code and artifact generation, and automated testing and integration.

  13. Comparative exploration of multidimensional flow cytometry software: a model approach evaluating T cell polyfunctional behavior.

    Science.gov (United States)

    Spear, Timothy T; Nishimura, Michael I; Simms, Patricia E

    2017-08-01

    Advancement in flow cytometry reagents and instrumentation has allowed for simultaneous analysis of large numbers of lineage/functional immune cell markers. Highly complex datasets generated by polychromatic flow cytometry require proper analytical software to answer investigators' questions. A problem among many investigators and flow cytometry Shared Resource Laboratories (SRLs), including our own, is a lack of access to a flow cytometry-knowledgeable bioinformatics team, making it difficult to learn and choose appropriate analysis tool(s). Here, we comparatively assess various multidimensional flow cytometry software packages for their ability to answer a specific biologic question and provide graphical representation output suitable for publication, as well as their ease of use and cost. We assessed the polyfunctional potential of TCR-transduced T cells, serving as a model evaluation, using multidimensional flow cytometry to analyze 6 intracellular cytokines and degranulation on a per-cell basis. Analysis of 7 parameters resulted in 128 possible combinations of positivity/negativity, far too complex for basic flow cytometry software to analyze fully. Various software packages were used, the analysis methods used in each are described, and representative output is displayed. Of the tools investigated, automated classification of cellular expression by nonlinear stochastic embedding (ACCENSE) and coupled analysis in Pestle/simplified presentation of incredibly complex evaluations (SPICE) provided the most user-friendly manipulations and readable output, evaluating the effects of altered antigen-specific stimulation on T cell polyfunctionality. This detailed approach may serve as a model for other investigators/SRLs in selecting the most appropriate software to analyze complex flow cytometry datasets. Further development and awareness of available tools will help guide proper data analysis to answer difficult biologic questions arising from incredibly complex datasets.

  14. Expanding Model Independent Approaches for Measuring the CKM angle $\\gamma$ at LHCb

    CERN Multimedia

    Prouve, Claire

    2017-01-01

    Model-independent approaches to measuring the CKM angle $\\gamma$ in $B\\rightarrow DK$ decays at LHCb are explored. In particular, we consider the case where the $D$ meson decays into a final state with four hadrons. Using four-body final states such as $\\pi^+ \\pi^- \\pi^+ \\pi^-$, $K^+ \\pi^- \\pi^+ \\pi^-$ and $K^+ K^- \\pi^+ \\pi^-$, in addition to traditional two- and three-body states, has the potential to significantly improve the overall constraint on $\\gamma$. There is a significant systematic uncertainty associated with modelling the complex phase of the $D$ decay amplitude across the five-dimensional phase space of the four-body decay. It is therefore important to replace these model-dependent quantities with model-independent parameters as input for the $\\gamma$ measurement. These model-independent parameters have been measured using quantum-correlated $\\psi(3770) \\rightarrow D^0 \\overline{D^0}$ decays collected by the CLEO-c experiment, and, for $D\\rightarrow K^+ \\pi^- \\pi^+ \\pi^-$, with $D^0-\\overline{D^0...

  15. Mobile Agent-Based Software Systems Modeling Approaches: A Comparative Study

    Directory of Open Access Journals (Sweden)

    Aissam Belghiat

    2016-06-01

    Full Text Available Mobile agent-based applications are a special type of software system which takes advantage of mobile agents in order to provide a new, beneficial paradigm for solving multiple complex problems in several fields and areas such as network management, e-commerce, e-learning, etc. At the same time, we note a lack of real applications based on this paradigm and a lack of serious evaluations of their modeling approaches. Hence, this paper provides a comparative study of modeling approaches for mobile agent-based software systems. The objective is to give the reader an overview and a thorough understanding of the work that has been done and where the gaps in the research are.

  16. Specialization of the Land Administration Domain Model (LADM): An Option for Expanding the Legal Profiles

    OpenAIRE

    Paasch, J.; Van Oosterom, P.; Paulsson, J.; C. Lemmen

    2013-01-01

    The Land Administration Domain Model, LADM, unanimously passed the final vote towards becoming an international standard, ISO 19152, on 1 November 2012. Based on the standard, this paper proposes a more detailed classification of interests in land as modelled within LADM and attempts to raise awareness of the possibilities to further develop the LADM's "right", "restriction" and "responsibility" (RRR) classes. The current standardised classification of RRRs in the LADM i...

  17. Mathematical model and software for investigation of internal ballistic processes in high-speed projectile installations

    Science.gov (United States)

    Diachkovskii, A. S.; Zykova, A. I.; Ishchenko, A. N.; Kasimov, V. Z.; Rogaev, K. S.; Sidorov, A. D.

    2017-11-01

    This paper describes a software package that makes it possible to explore the interior ballistics processes occurring in a shot scheme with bulk charges using pasty propellant substances under various loading schemes, etc. As the mathematical model, a model of a polydisperse mixture of non-deformable particles and a carrier gas phase is used in the quasi-one-dimensional approximation. The form in which the model equations are written allows them to describe a broad class of interior ballistics processes. Features of the approach are illustrated by calculating the ignition period for a charge of tubular propellant.

  18. Meta-Model and UML Profile for Requirements Management of Software and Embedded Systems

    Directory of Open Access Journals (Sweden)

    Arpinen Tero

    2011-01-01

    Full Text Available Software and embedded system companies today encounter problems related to requirements management tool integration, incorrect tool usage, and lack of traceability. This is due to the use of tools with no clear meta-model or semantics for communicating requirements between different stakeholders. This paper presents a comprehensive meta-model for requirements management. The focus is on the software and embedded system domains. The goal is to define generic requirements management domain concepts and abstract interfaces between requirements management and system development. This leads to a portable requirements management meta-model which can be adapted to various system modeling languages. The created meta-model is prototyped by translating it into a UML profile. The profile is imported into a UML tool which is used for rapid evaluation of the meta-model concepts in practice. The developed profile is associated with a proof-of-concept report generator tool that automatically produces up-to-date documentation from the models in the form of web pages. Finally, the profile is used to create an example model of an embedded system requirement specification.

  19. Igpet software for modeling igneous processes: examples of application using the open educational version

    Science.gov (United States)

    Carr, Michael J.; Gazel, Esteban

    2017-04-01

    We provide here an open version of the Igpet software, called t-Igpet to emphasize its application for teaching and research in forward modeling of igneous geochemistry. There are three programs: a norm utility, a petrologic mixing program using least squares, and Igpet, a graphics program that includes many forms of numerical modeling. Igpet is a multifaceted tool that provides the following basic capabilities: igneous rock identification using the IUGS (International Union of Geological Sciences) classification and several supplementary diagrams; tectonic discrimination diagrams; pseudo-quaternary projections; least-squares fitting of lines, polynomials and hyperbolae; magma mixing using two endmembers; and histograms, x-y plots, ternary plots and spider-diagrams. The advanced capabilities of Igpet are multi-element mixing and magma evolution modeling. Mixing models are particularly useful for understanding the isotopic variations in rock suites that evolved by mixing different sources. The important melting models include batch melting, fractional melting and aggregated fractional melting. Crystallization models include equilibrium and fractional crystallization and AFC (assimilation and fractional crystallization). Theses, reports and proposals concerning igneous petrology are improved by numerical modeling, and for peer-reviewed publications some elements of modeling are practically a requirement. Our intention in providing this software is to facilitate improved communication and lower entry barriers to research, especially for students.
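
    For orientation, the melting models listed above are built on standard trace-element equations. The sketch below evaluates the textbook batch and instantaneous fractional melting forms with an invented source concentration, partition coefficient D and melt fractions F; it illustrates the kind of forward calculation Igpet automates, not Igpet's own code.

```python
# Standard trace-element melting equations (textbook forms); the numbers
# are illustrative, not from the t-Igpet distribution.
import numpy as np

def batch_melting(c0, D, F):
    # C_L / C_0 = 1 / (D + F (1 - D))
    return c0 / (D + F * (1.0 - D))

def fractional_melting(c0, D, F):
    # Instantaneous fractional melt: C_L / C_0 = (1/D) (1 - F)^(1/D - 1)
    return (c0 / D) * (1.0 - F) ** (1.0 / D - 1.0)

F = np.linspace(0.01, 0.3, 4)   # melt fractions
print("batch:     ", batch_melting(100.0, 0.05, F).round(1))
print("fractional:", fractional_melting(100.0, 0.05, F).round(1))
```

    Comparing such curves against measured incompatible-element concentrations is the routine forward-modeling exercise that the abstract argues theses and publications increasingly require.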

  20. MODSARE-V: Validation of Dependability and Safety Critical Software Components with Model Based Requirements

    Science.gov (United States)

    Silveira, Daniel T. de M. M.; Schoofs, Tobias; Alana Salazar, Elena; Rodriguez Rodriguez, Ana Isabel; Devic, Marie-Odile

    2010-08-01

    The wide use of RAMS methods and techniques [1] (e.g. SFMECA, SFTA, HAZOP, HA...) in critical software development has resulted in the specification of new software requirements, design constraints and other issues such as mandatory coding rules. Given the large variety of RAMS requirements and techniques, different types of Verification and Validation (V&V) [14] are spread over the phases of the software engineering process. As a result, the V&V process becomes complex, and the cost and time required for a complete and consistent V&V process increase. By introducing the concept of a model-based approach to facilitate the RAMS requirements definition process, the V&V time and effort may be reduced. MODSARE-V demonstrates the feasibility of this concept through case studies applied to ground or on-board space software projects with critical functions/components. This paper describes the approach adopted in MODSARE-V to realize the concept in a prototype and summarizes the results and conclusions reached after applying the prototype to the case studies.

  1. Expanding the "ports of entry" for speech-language pathologists: a relational and reflective model for clinical practice.

    Science.gov (United States)

    Geller, Elaine; Foley, Gilbert M

    2009-02-01

    To outline an expanded framework for clinical practice in speech-language pathology. This framework broadens the focus on discipline-specific knowledge and infuses mental health constructs within the study of communication sciences and disorders, with the objective of expanding the potential "ports or points of entry" (D. Stern, 1995) for clinical intervention with young children who are language impaired. Specific mental health constructs are highlighted in this article. These include relationship-based learning, attachment theory, working dyadically (the client is the child and parent), reflective practice, transference-countertransference, and the use of self. Each construct is explored as to the way it has been applied in traditional and contemporary models of clinical practice. The underlying premise in this framework is that working from a relationally based and reflective perspective augments change and growth in both client and parent(s). The challenge is for speech-language pathologists to embed mental health constructs within their discipline-specific expertise. This leads to paying attention to both observable aspects of clients' behaviors as well as their internal affective states.

  2. Anti-leukemia activity of in vitro-expanded human gamma delta T cells in a xenogeneic Ph+ leukemia model.

    Directory of Open Access Journals (Sweden)

    Gabrielle M Siegers

    Full Text Available Gamma delta T cells (GDTc) lyse a variety of hematological and solid tumour cells in vitro and in vivo, and are thus promising candidates for cellular immunotherapy. We have developed a protocol to expand human GDTc in vitro, yielding highly cytotoxic Vgamma9/Vdelta2 CD27/CD45RA double-negative effector memory cells. These cells express CD16, CD45RO, CD56, CD95 and NKG2D. Flow cytometric, clonogenic, and chromium release assays confirmed their specific cytotoxicity against Ph(+) cell lines in vitro. We have generated a fluorescent and bioluminescent Ph(+) cell line, EM-2eGFPluc, and established a novel xenogeneic leukemia model. Intravenous injection of EM-2eGFPluc into NOD.Cg-Prkdcscid Il2rgtm1Wjl/SzJ (NSG) mice resulted in significant dose-dependent bone marrow engraftment; lower levels engrafted in blood, lung, liver and spleen. In vitro-expanded human GDTc injected intraperitoneally were found at higher levels in blood and organs compared to those injected intravenously; GDTc survived at least 33 days post-injection. In therapy experiments, we documented decreased bone marrow leukemia burden in mice treated with GDTc. Live GDTc were found in spleen and bone marrow at endpoint, suggesting the potential usefulness of this therapy.

  3. Enhanced Bayesian modelling in BAPS software for learning genetic structures of populations

    Directory of Open Access Journals (Sweden)

    Sirén Jukka

    2008-12-01

    Full Text Available Background: During the most recent decade many Bayesian statistical models and software for answering questions related to the genetic structure underlying population samples have appeared in the scientific literature. Most of these methods utilize molecular markers for the inferences, while some are also capable of handling DNA sequence data. In a number of earlier works, we have introduced an array of statistical methods for population genetic inference that are implemented in the software BAPS. However, the complexity of biological problems related to genetic structure analysis keeps increasing, such that in many cases the current methods may provide either inappropriate or insufficient solutions. Results: We discuss the necessity of enhancing the statistical approaches to face the challenges posed by the ever-increasing amounts of molecular data generated by scientists over a wide range of research areas, and introduce an array of new statistical tools implemented in the most recent version of BAPS. With these methods it is possible, e.g., to fit genetic mixture models using user-specified numbers of clusters and to estimate levels of admixture under a genetic linkage model. Also, alleles representing a different ancestry compared to the average observed genomic positions can be tracked for the sampled individuals, and a priori specified hypotheses about genetic population structure can be directly compared using Bayes' theorem. In general, we have further improved the computational characteristics of the algorithms behind the methods implemented in BAPS, facilitating the analyses of large and complex datasets. In particular, analysis of a single dataset can now be spread over multiple computers using a script interface to the software. Conclusion: The Bayesian modelling methods introduced in this article represent an array of enhanced tools for learning the genetic structure of populations. Their implementations in the BAPS software are
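
    The direct hypothesis comparison mentioned above reduces, in generic form, to a posterior-odds calculation via Bayes' theorem (shown here in standard notation, not as an excerpt from BAPS):

```latex
% Posterior odds of two a priori specified structure hypotheses M_1, M_2
% given marker data X (standard form):
\frac{P(M_1 \mid X)}{P(M_2 \mid X)}
  = \frac{P(X \mid M_1)}{P(X \mid M_2)} \cdot \frac{P(M_1)}{P(M_2)}
```

    The first factor on the right is the Bayes factor comparing how well each proposed population structure explains the observed markers; with equal priors, it alone decides which hypothesis the data favor.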

  4. Expanding Health Care Access Through Education: Dissemination and Implementation of the ECHO Model.

    Science.gov (United States)

    Katzman, Joanna G; Galloway, Kevin; Olivas, Cynthia; McCoy-Stafford, Kimberly; Duhigg, Daniel; Comerci, George; Kalishman, Summers; Buckenmaier, Chester C; McGhee, Laura; Joltes, Kristin; Bradford, Andrea; Shelley, Brian; Hernandez, Jessica; Arora, Sanjeev

    2016-03-01

    Project ECHO (Extension for Community Healthcare Outcomes) is an evidence-based model that provides high-quality medical education for common and complex diseases through telementoring and comanagement of patients with primary care clinicians. In a one-to-many knowledge network, the ECHO model helps to bridge the gap between primary care clinicians and specialists by enhancing the knowledge, skills, confidence, and practice of primary care clinicians in their local communities. As a result, patients in rural and urban underserved areas are able to receive best-practice care without long waits or having to travel long distances. The ECHO model has been replicated in 43 university hubs in the United States and five other countries. A new replication tool was developed by the Project ECHO Pain team and U.S. Army Medical Command to ensure a high-fidelity replication of the model. The adoption of the tool led to successful replication of ECHO in the Army Pain initiative. This replication tool has the potential to improve the fidelity of ECHO replication efforts around the world. Reprint & Copyright © 2016 Association of Military Surgeons of the U.S.

  5. Deception and Cognitive Load: Expanding Our Horizon with a Working Memory Model

    Science.gov (United States)

    Sporer, Siegfried L.

    2016-01-01

    Recently, studies on deception and its detection have increased dramatically. Many of these studies rely on the “cognitive load approach” as the sole explanatory principle to understand deception. These studies have been exclusively on lies about negative actions (usually lies of suspects of [mock] crimes). Instead, we need to re-focus more generally on the cognitive processes involved in generating both lies and truths, not just on manipulations of cognitive load. Using Baddeley’s (2000, 2007, 2012) working memory model, which integrates verbal and visual processes in working memory with retrieval from long-term memory and control of action, not only verbal content cues but also nonverbal, paraverbal, and linguistic cues can be investigated within a single framework. The proposed model considers long-term semantic, episodic and autobiographical memory and their connections with working memory and action. It also incorporates ironic processes of mental control (Wegner, 1994, 2009), the role of scripts and schemata and retrieval cues and retrieval processes. Specific predictions of the model are outlined and support from selective studies is presented. The model is applicable to different types of reports, particularly about lies and truths about complex events, and to different modes of production (oral, hand-written, typed). Predictions regarding several moderator variables and methods to investigate them are proposed. PMID:27092090

  6. Deception and Cognitive Load: Expanding Our Horizon with a Working Memory Model.

    Science.gov (United States)

    Sporer, Siegfried L

    2016-01-01

    Recently, studies on deception and its detection have increased dramatically. Many of these studies rely on the "cognitive load approach" as the sole explanatory principle to understand deception. These studies have been exclusively on lies about negative actions (usually lies of suspects of [mock] crimes). Instead, we need to re-focus more generally on the cognitive processes involved in generating both lies and truths, not just on manipulations of cognitive load. Using Baddeley's (2000, 2007, 2012) working memory model, which integrates verbal and visual processes in working memory with retrieval from long-term memory and control of action, not only verbal content cues but also nonverbal, paraverbal, and linguistic cues can be investigated within a single framework. The proposed model considers long-term semantic, episodic and autobiographical memory and their connections with working memory and action. It also incorporates ironic processes of mental control (Wegner, 1994, 2009), the role of scripts and schemata and retrieval cues and retrieval processes. Specific predictions of the model are outlined and support from selective studies is presented. The model is applicable to different types of reports, particularly about lies and truths about complex events, and to different modes of production (oral, hand-written, typed). Predictions regarding several moderator variables and methods to investigate them are proposed.

  7. Expanding a dynamic flux balance model of yeast fermentation to genome-scale

    Directory of Open Access Journals (Sweden)

    Agosin Eduardo

    2011-05-01

    Full Text Available Background: Yeast is considered to be a workhorse of the biotechnology industry for the production of many value-added chemicals, alcoholic beverages and biofuels. Optimization of the fermentation is a challenging task that greatly benefits from dynamic models able to accurately describe and predict the fermentation profile and resulting products under different genetic and environmental conditions. In this article, we developed and validated a genome-scale dynamic flux balance model, using experimentally determined kinetic constraints. Results: Appropriate equations for maintenance, biomass composition, anaerobic metabolism and nutrient uptake are key to improving model performance, especially for predicting glycerol and ethanol synthesis. Prediction profiles of synthesis and consumption of the main metabolites involved in alcoholic fermentation closely agreed with experimental data obtained from numerous lab and industrial fermentations under different environmental conditions. Finally, fermentation simulations of genetically engineered yeasts closely reproduced previously reported experimental results regarding final concentrations of the main fermentation products such as ethanol and glycerol. Conclusion: A useful tool to describe, understand and predict metabolite production in batch yeast cultures was developed. The resulting model, if used wisely, could help in the search for new metabolic engineering strategies to manage ethanol content in batch fermentations.

  8. Expanding a dynamic flux balance model of yeast fermentation to genome-scale.

    Science.gov (United States)

    Vargas, Felipe A; Pizarro, Francisco; Pérez-Correa, J Ricardo; Agosin, Eduardo

    2011-05-19

    Yeast is considered to be a workhorse of the biotechnology industry for the production of many value-added chemicals, alcoholic beverages and biofuels. Optimization of the fermentation is a challenging task that greatly benefits from dynamic models able to accurately describe and predict the fermentation profile and resulting products under different genetic and environmental conditions. In this article, we developed and validated a genome-scale dynamic flux balance model, using experimentally determined kinetic constraints. Appropriate equations for maintenance, biomass composition, anaerobic metabolism and nutrient uptake are key to improve model performance, especially for predicting glycerol and ethanol synthesis. Prediction profiles of synthesis and consumption of the main metabolites involved in alcoholic fermentation closely agreed with experimental data obtained from numerous lab and industrial fermentations under different environmental conditions. Finally, fermentation simulations of genetically engineered yeasts closely reproduced previously reported experimental results regarding final concentrations of the main fermentation products such as ethanol and glycerol. A useful tool to describe, understand and predict metabolite production in batch yeast cultures was developed. The resulting model, if used wisely, could help to search for new metabolic engineering strategies to manage ethanol content in batch fermentations.
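
    A dynamic flux balance model of the kind described couples an inner flux-balance optimization to an outer time integration: at each step a linear program chooses fluxes subject to uptake bounds, and biomass and external metabolites are then integrated forward. The toy sketch below shows that loop with a single invented uptake reaction, a fixed growth yield, and a Michaelis-Menten uptake bound standing in for the genome-scale network; none of the numbers come from the paper.

```python
# Toy dynamic flux balance loop (invented single-reaction "network", not the
# paper's genome-scale model): LP inside, explicit Euler integration outside.
from scipy.optimize import linprog

def dfba(biomass=0.05, glucose=10.0, dt=0.1, steps=100):
    for _ in range(steps):
        # Kinetic constraint: Michaelis-Menten bound on glucose uptake
        v_max = 10.0 * glucose / (0.5 + glucose)
        # Inner LP: maximize growth = 0.1 * uptake (linprog minimizes, so negate)
        res = linprog(c=[-0.1], bounds=[(0.0, v_max)], method="highs")
        growth = -res.fun        # growth rate, 1/h
        uptake = res.x[0]        # glucose uptake, mmol/gDW/h
        # Outer integration: update biomass and external glucose
        biomass += growth * biomass * dt
        glucose = max(glucose - uptake * biomass * dt, 0.0)
    return biomass, glucose

x, s = dfba()
print(f"final biomass {x:.3f} g/L, residual glucose {s:.3f} mmol/L")
```

    In the real genome-scale version, the single bounded variable becomes thousands of reaction fluxes constrained by stoichiometry, and the abstract's point is that the maintenance, biomass-composition and uptake equations wrapped around that LP are what make or break predictive accuracy.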

  9. Expanding business-to-business customer relationships : modeling the customer's upgrade decision

    NARCIS (Netherlands)

    Bolton, R.; Lemon, K.N.; Verhoef, P.C.

    This article develops a model of a business customer's decision to upgrade service contracts conditional on the decision to renew the contract. It proposes that the firm's upgrade decision is influenced by (1) decision-maker perceptions of the relationship with the supplier, (2) contract-level

  10. Assessment of Software Modeling Techniques for Wireless Sensor Networks: A Survey

    Directory of Open Access Journals (Sweden)

    John Khalil Jacoub

    2012-03-01

    Full Text Available Wireless Sensor Networks (WSNs) monitor environmental phenomena and in some cases react in response to the observed phenomena. The distributed nature of WSNs and the interaction between software and hardware components make it difficult to correctly design and develop WSN systems. One solution to the WSN design challenges is system modeling. In this paper we present a survey of nine WSN modeling techniques and show how each technique models different parts of the system, such as sensor behavior, sensor data and hardware. Furthermore, we consider how each modeling technique represents the network behavior and network topology. We also consider the available supporting tools for each of the modeling techniques. Based on the survey, we classify the modeling techniques and derive examples of the surveyed modeling techniques using the SensIV system.

  11. 3D modeling of high-Tc superconductors by finite element software

    Science.gov (United States)

    Zhang, Min; Coombs, T. A.

    2012-01-01

    A three-dimensional (3D) numerical model is proposed to solve the electromagnetic problems involving transport current and background field of a high-Tc superconducting (HTS) system. The model is characterized by the E-J power law and the H-formulation, and is successfully implemented using finite element software. We first discuss the model in detail, including the mesh methods, boundary conditions and computing time. To validate the 3D model, we calculate the ac loss and trapped field solution for a bulk material and compare the results with previously verified 2D solutions and an analytical solution. We then apply our model to some typical problems, such as superconducting bulk arrays and twisted conductors, which cannot be tackled by 2D models. The new 3D model could be a powerful tool for researchers and engineers to investigate problems of greater complexity.
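
    For readers unfamiliar with the formulation named above, the relations typically combined in an H-formulation model with an E-J power law are, in standard form (E_0, J_c and n are material parameters; this is the textbook form, not an excerpt from the paper):

```latex
% E-J power law (nonlinear resistivity) and the H-formulation field equations
\mathbf{E} = E_0 \left( \frac{\lVert \mathbf{J} \rVert}{J_c} \right)^{n}
             \frac{\mathbf{J}}{\lVert \mathbf{J} \rVert}, \qquad
\nabla \times \mathbf{E} = -\mu_0 \, \frac{\partial \mathbf{H}}{\partial t}, \qquad
\mathbf{J} = \nabla \times \mathbf{H}
```

    Solving for H with this strongly nonlinear resistivity is what makes ac loss and trapped-field calculations tractable in finite element software, and extending it from 2D cross sections to full 3D geometries is the contribution the record describes.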

  12. Integration of drinking water treatment plant process models and emulated process automation software

    OpenAIRE

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system. The use of the system is investigated in the three identified applications, i) optimization of process control, ii) training of operation supervisors and iii) virtual commissioning of process autom...

  13. Software Tools For Building Decision-support Models For Flood Emergency Situations

    Science.gov (United States)

    Garrote, L.; Molina, M.; Ruiz, J. M.; Mosquera, J. C.

    The SAIDA decision-support system was developed by the Spanish Ministry of the Environment to provide assistance to decision-makers during flood situations. SAIDA has been tentatively implemented in two test basins, Jucar and Guadalhorce, and the Ministry is currently planning to have it implemented in all major Spanish basins in a few years' time. During the development cycle of SAIDA, the need for providing assistance to end-users in model definition and calibration was clearly identified. System developers usually emphasise abstraction and generality with the goal of providing a versatile software environment. End users, on the other hand, require concretion and specificity to adapt the general model to their local basins. As decision-support models become more complex, the gap between model developers and users gets wider: who takes care of model definition, calibration and validation? Initially, model developers perform these tasks, but the scope is usually limited to a few small test basins. Before the model enters the operational stage, end users must get involved in model construction and calibration in order to gain confidence in the model recommendations. However, getting the users involved in these activities is a difficult task. The goal of this research is to develop representation techniques for simulation and management models in order to define, develop and validate a mechanism, supported by a software environment, oriented to provide assistance to the end-user in building decision models for the prediction and management of river floods in real time. The system is based on three main building blocks: a library of simulators of the physical system, an editor to assist the user in building simulation models, and a machine learning method to calibrate decision models based on the simulation models provided by the user.

  14. Expanding the Range of Plant Functional Diversity Represented in Global Vegetation Models: Towards Lineage-based Plant Functional Types

    Science.gov (United States)

    Still, C. J.; Griffith, D.; Edwards, E.; Forrestel, E.; Lehmann, C.; Anderson, M.; Craine, J.; Pau, S.; Osborne, C.

    2014-12-01

    Variation in plant species traits, such as photosynthetic and hydraulic properties, can indicate vulnerability or resilience to climate change, and feed back to broad-scale spatial and temporal patterns in biogeochemistry, demographics, and biogeography. Yet, predicting how vegetation will respond to future environmental changes is severely limited by the inability of our models to represent species-level trait variation in processes and properties, as current generation process-based models are mostly based on the generalized and abstracted concept of plant functional types (PFTs) which were originally developed for hydrological modeling. For example, there are close to 11,000 grass species, but most vegetation models have only a single C4 grass and one or two C3 grass PFTs. However, while species trait databases are expanding rapidly, they have been produced mostly from unstructured research, with a focus on easily researched traits that are not necessarily the most important for determining plant function. Additionally, implementing realistic species-level trait variation in models is challenging. Combining related and ecologically similar species in these models might ameliorate this limitation. Here we argue for an intermediate, lineage-based approach to PFTs, which draws upon recent advances in gene sequencing and phylogenetic modeling, and where trait complex variations and anatomical features are constrained by a shared evolutionary history. We provide an example of this approach with grass lineages that vary in photosynthetic pathway (C3 or C4) and other functional and structural traits. We use machine learning approaches and geospatial databases to infer the most important environmental controls and climate niche variation for the distribution of grass lineages, and utilize a rapidly expanding grass trait database to demonstrate examples of lineage-based grass PFTs. For example, grasses in the Andropogoneae are typically tall species that dominate wet and

  15. An expanded Notch-Delta model exhibiting long-range patterning and incorporating MicroRNA regulation.

    Directory of Open Access Journals (Sweden)

    Jerry S Chen

    2014-06-01

    Full Text Available Notch-Delta signaling is a fundamental cell-cell communication mechanism that governs the differentiation of many cell types. Most existing mathematical models of Notch-Delta signaling are based on a feedback loop between Notch and Delta leading to lateral inhibition of neighboring cells. These models result in a checkerboard spatial pattern whereby adjacent cells express opposing levels of Notch and Delta, leading to alternate cell fates. However, a growing body of biological evidence suggests that Notch-Delta signaling produces other patterns that are not checkerboard, and therefore a new model is needed. Here, we present an expanded Notch-Delta model that builds upon previous models, adding a local Notch activity gradient, which affects long-range patterning, and the activity of a regulatory microRNA. This model is motivated by our experiments in the ascidian Ciona intestinalis showing that the peripheral sensory neurons, whose specification is in part regulated by the coordinate activity of Notch-Delta signaling and the microRNA miR-124, exhibit a sparse spatial pattern whereby consecutive neurons may be spaced over a dozen cells apart. We perform rigorous stability and bifurcation analyses, and demonstrate that our model is able to accurately explain and reproduce the neuronal pattern in Ciona. Using Monte Carlo simulations of our model along with miR-124 transgene over-expression assays, we demonstrate that the activity of miR-124 can be incorporated into the Notch decay rate parameter of our model. Finally, we motivate the general applicability of our model to Notch-Delta signaling in other animals by providing evidence that microRNAs regulate Notch-Delta signaling in analogous cell types in other organisms, and by discussing evidence in other organisms of sparse spatial patterns in tissues where Notch-Delta signaling is active.
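
    To ground the modeling discussion, the sketch below implements the classic two-cell lateral-inhibition core (Collier-style) that expanded Notch-Delta models such as this one build upon, with an extra Notch decay term standing in for miR-124-like regulation. All functional forms and parameter values are illustrative inventions, not the paper's equations; with a small initial asymmetry the two cells diverge to opposite high-Notch/high-Delta fates.

```python
# Illustrative two-cell lateral-inhibition loop: each cell's Notch is
# activated by its neighbor's Delta, and its own Delta is repressed by its
# own Notch. The mir_decay term crudely mimics microRNA-enhanced Notch decay.
from scipy.integrate import solve_ivp

def notch_delta(t, y, k=2.0, h=2.0, mir_decay=0.2):
    n1, d1, n2, d2 = y
    f = lambda d: d**k / (0.1 + d**k)        # Notch activation by neighbor Delta
    g = lambda n: 1.0 / (1.0 + 10.0 * n**h)  # Delta repression by own Notch
    return [f(d2) - (1.0 + mir_decay) * n1,
            g(n1) - d1,
            f(d1) - (1.0 + mir_decay) * n2,
            g(n2) - d2]

# Small initial asymmetry in Delta breaks the symmetry between the cells
sol = solve_ivp(notch_delta, (0, 50), [0.5, 0.51, 0.5, 0.49])
n1, d1, n2, d2 = sol.y[:, -1]
print(f"cell 1: N={n1:.2f} D={d1:.2f} | cell 2: N={n2:.2f} D={d2:.2f}")
```

    The expanded model in the record adds a local Notch activity gradient on top of this core, which is what lets it produce sparse, long-range patterns instead of the strict checkerboard this minimal loop generates.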

  16. Expanding the Role of Systems Modeling: Considering Byproduct Generation from Biofuel Production

    Directory of Open Access Journals (Sweden)

    Kurt A. Rosentrater

    2006-06-01

    Full Text Available The bioethanol industry has been experiencing rapid growth over the past several years and is expected to continue to increase production for the foreseeable future. A vital component of the success of this industry is the sale and marketing of processing residues, which are primarily sold as dried distillers grains with solubles (DDGS). Systems modeling, a technique that has been used to predict future demand for bioethanol, can also be used to determine potential byproduct generation rates. This paper discusses the development of one such model and presents the predicted generation of DDGS, as well as carbon dioxide emissions, from this industry through 2100. These simulation results underscore the growing need to actively pursue research focused on value-added alternatives for the use of bioethanol byproduct streams.

  17. Expanding the Technology Acceptance Model to Examine Internet Banking Adoption in Tunisia Country

    OpenAIRE

    Wadie Nasri; Charfeddine Lanouar; Anis Allagui

    2013-01-01

    This paper aims to empirically examine the factors that affect the adoption of Internet banking in Tunisia. In order to explain these factors, this paper extends the "Technology Acceptance Model" by adding external factors such as security and privacy, self-efficacy, social influence, and awareness of the services and their benefits. The findings of the study suggest that security and privacy, self-efficacy, social influence, and awareness of the services and their benefits have signific...

  18. Enhanced Resilience Through Expanded Community Preparedness in the United States: Application of Israeli Models

    Science.gov (United States)

    2014-03-01

    model in the United States include education and training for youth, as well as mandatory national service for most citizens. Based upon the findings...and recovery. FEMA Corps is composed of approximately 1,000 members, who are 18–24 years of age, and have committed to a year of national service within...President’s Call to Service Award given in recognition of 4,000 hours of volunteer service over a lifetime. Expansion of national service in the United

  19. Expanded Dengue.

    Science.gov (United States)

    Kadam, D B; Salvi, Sonali; Chandanwale, Ajay

    2016-07-01

    The World Health Organization (WHO) has coined the term expanded dengue to describe cases which do not fall into either dengue shock syndrome or dengue hemorrhagic fever. This term incorporates several atypical findings of dengue. Dengue virus has not been listed as a common etiological agent in several conditions such as encephalitis and Guillain-Barré syndrome. Moreover, it is a great mimic of co-existing epidemics like malaria, chikungunya and Zika virus disease, which are also mosquito-borne diseases. The atypical manifestations noted in dengue can be multisystemic and multifaceted. In clinical practice, the occurrence of an atypical presentation should prompt us to investigate for dengue. Knowledge of expanded dengue helps to clinch the diagnosis of dengue early, especially during ongoing epidemics, avoiding a further battery of investigations. Dengue has proved to be an epidemic with the ability to recur and has a diverse array of presentations, as seen in large series from India, Sri Lanka, Indonesia and Taiwan. WHO has given the case definition of dengue fever in its comprehensive guidelines. Accordingly, a probable case is defined as acute febrile illness with two or more of any of the following findings: headache, retro-orbital pain, myalgia, arthralgia, rash, hemorrhagic manifestations, leucopenia and supportive serology. There have been cases of patients admitted with fever and altered mentation, with or without neck stiffness and pyramidal tract signs. Some had seizures or status epilepticus as the presentation. When tested, dengue serology was positive. After ruling out other causes, dengue remained the only culprit. We have come across varied presentations of dengue fever in clinical practice, and the present article throws light on the atypical manifestations of dengue. © Journal of the Association of Physicians of India 2011.

  20. Actinomycin D Specifically Reduces Expanded CUG Repeat RNA in Myotonic Dystrophy Models

    Directory of Open Access Journals (Sweden)

    Ruth B. Siboni

    2015-12-01

    Full Text Available Myotonic dystrophy type 1 (DM1) is an inherited disease characterized by the inability to relax contracted muscles. Affected individuals carry large CTG expansions that are toxic when transcribed. One possible treatment approach is to reduce or eliminate transcription of CTG repeats. Actinomycin D (ActD) is a potent transcription inhibitor and FDA-approved chemotherapeutic that binds GC-rich DNA with high affinity. Here, we report that ActD decreased CUG transcript levels in a dose-dependent manner in DM1 cell and mouse models at significantly lower concentrations (nanomolar) compared to its use as a general transcription inhibitor or chemotherapeutic. ActD also significantly reversed DM1-associated splicing defects in a DM1 mouse model, and did so within the currently approved human treatment range. RNA-seq analyses showed that low concentrations of ActD did not globally inhibit transcription in a DM1 mouse model. These results indicate that transcription inhibition of CTG expansions is a promising treatment approach for DM1.

  1. Reducing Software Failures: addressing the ethical risks of the software development lifecycle

    OpenAIRE

    Don Gotterbarn

    2002-01-01

    A narrow approach to risk analysis and understanding the scope of a software project has contributed to significant software failures. A process is presented which expands the concept of software risk to include social, professional, and ethical risks that lead to software failure. Using an expanded risk analysis will enlarge the project scope considered by software developers. This process also is incorporated into a software development life cycle. A tool to develop Software Development Imp...

  2. An expanded model: flood-inundation maps for the Leaf River at Hattiesburg, Mississippi, 2013

    Science.gov (United States)

    Storm, John B.

    2014-01-01

    Digital flood-inundation maps for a 6.8-mile reach of the Leaf River at Hattiesburg, Mississippi (Miss.), were created by the U.S. Geological Survey (USGS) in cooperation with the City of Hattiesburg, City of Petal, Forrest County, Mississippi Emergency Management Agency, Mississippi Department of Homeland Security, and the Emergency Management District. The inundation maps, which can be accessed through the USGS Flood Inundation Mapping Science Web site at http://water.usgs.gov/osw/flood_inundation/, depict estimates of the areal extent and depth of flooding corresponding to selected water levels (stages) at the USGS streamgage at Leaf River at Hattiesburg, Miss. (station no. 02473000). Current conditions for estimating near-real-time areas of inundation by use of USGS streamgage information may be obtained on the Internet at http://waterdata.usgs.gov/. In addition, the information has been provided to the National Weather Service (NWS) for incorporation into their Advanced Hydrologic Prediction Service (AHPS) flood warning system (http://water.weather.gov/ahps/). The NWS forecasts flood hydrographs at many places that are often colocated with USGS streamgages. NWS-forecasted peak-stage information may be used in conjunction with the maps developed in this study to show predicted areas of flood inundation. In this study, flood profiles were computed for the stream reach by means of a one-dimensional step-backwater model. The model was calibrated by using the most current stage-discharge relations at the Leaf River at Hattiesburg, Miss. streamgage (02473000) and documented high-water marks from recent and historical floods. The hydraulic model was then used to determine 13 water-surface profiles for flood stages at 1.0-foot intervals referenced to the streamgage datum and ranging from bankfull to approximately the highest recorded water level at the streamgage. The simulated water-surface profiles were then combined with a geographic information system (GIS

  3. Using Mathematica software for coal gasification simulations – Selected kinetic model application

    Directory of Open Access Journals (Sweden)

    Sebastian Iwaszenko

    2015-01-01

    Full Text Available Coal gasification is recognized as one of the promising Clean Coal Technologies. As the process itself is complicated and technologically demanding, it is the subject of much research. In this paper, the use of the volumetric, non-reactive core and Johnson models for coal gasification and underground coal gasification is considered. The use of Mathematica software for solving and analysing the models' equations is presented. Coal parameters were estimated for five Polish mines: Piast, Ziemowit, Janina, Szczygłowice and Bobrek. For each coal the models' parameters were determined. The determination of parameters was based on reactivity assessment at 50% char conversion. The calculations show relatively small differences between the conversions predicted by the volumetric and non-reactive core models. More significant differences were observed for the Johnson model, but they did not exceed 10% for final char conversion. A conceptual model for underground coal gasification is also presented.
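
    For reference, the three kinetic models named above are commonly written as dX/dt = k(1-X) for the volumetric model, dX/dt = k(1-X)^(2/3) for the non-reactive (shrinking) core model, and a Johnson-type rate with an additional exp(-αX²) retardation factor. Below is a minimal sketch of the comparison the authors describe, with purely illustrative k and α; the paper used Mathematica, and Python/SciPy is substituted here only to keep this document's examples in one language.

        # Sketch: integrate three char-conversion rate laws and compare them.
        # k and alpha are illustrative, not fitted to any of the five coals.
        import numpy as np
        from scipy.integrate import solve_ivp

        k, alpha = 0.05, 0.5   # assumed rate constant [1/min], Johnson parameter

        models = {
            "volumetric": lambda t, X: [k * max(1.0 - X[0], 0.0)],
            "non-reactive core": lambda t, X: [k * max(1.0 - X[0], 0.0) ** (2 / 3)],
            "Johnson": lambda t, X: [k * max(1.0 - X[0], 0.0) ** (2 / 3)
                                     * np.exp(-alpha * X[0] ** 2)],
        }

        for name, rhs in models.items():
            sol = solve_ivp(rhs, (0.0, 120.0), [0.0],
                            t_eval=np.linspace(0.0, 120.0, 241))
            X = sol.y[0]
            t50 = np.interp(0.5, X, sol.t)       # time to 50% conversion
            print(f"{name:>18}: X(120 min) = {X[-1]:.3f}, t(X=0.5) = {t50:.1f} min")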

  4. Spreading and vanishing in a West Nile virus model with expanding fronts

    Science.gov (United States)

    Tarboush, Abdelrazig K.; Lin, ZhiGui; Zhang, MengYun

    2017-05-01

    In this paper, we study a simplified version of a West Nile virus model discussed by Lewis et al. [28], which was considered as a first approximation for the spatial spread of WNv. The basic reproduction number $R_0$ for the non-spatial epidemic model is defined, and a threshold parameter $R_0^D$ for the corresponding problem with null Dirichlet boundary condition is introduced. We consider a free boundary problem for a coupled system, which describes the diffusion of birds by a PDE and the movement of mosquitoes by an ODE. The risk index $R_0^F(t)$ associated with the disease in the spatial setting is presented. Sufficient conditions for the WNv to be eradicated or to spread are given. The asymptotic behavior of the solution to the system when spreading occurs is considered. It is shown that the initial number of infected populations, the diffusion rate of birds and the length of the initial habitat have important impacts on the vanishing or spreading of the virus. Numerical simulations are presented to illustrate the analytical results.
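
    The structure of the coupled system (a PDE for diffusing birds, a per-site ODE for immobile mosquitoes) can be sketched with an explicit finite-difference scheme. This is only a schematic: the paper's free (expanding) boundary is replaced by a fixed interval with null Dirichlet conditions, and the reaction terms and parameters are illustrative assumptions.

        # Schematic fixed-domain discretization of a birds(PDE)/mosquitoes(ODE)
        # cross-infection system. All rates are illustrative placeholders.
        import numpy as np

        L, N, dt, T = 10.0, 101, 0.001, 5.0
        dx = L / (N - 1)
        D = 1.0                                   # bird diffusion rate
        a1, a2, m1, m2 = 2.0, 1.5, 0.5, 0.4       # assumed infection/recovery rates

        b = np.zeros(N)                           # infected bird density
        m = np.zeros(N)                           # infected mosquito density
        b[N // 2] = 1.0                           # outbreak at the habitat centre

        for _ in range(int(T / dt)):
            lap = np.zeros(N)
            lap[1:-1] = (b[2:] - 2 * b[1:-1] + b[:-2]) / dx**2
            b = np.clip(b + dt * (D * lap + a1 * m * (1 - b) - m1 * b), 0.0, 1.0)
            m = np.clip(m + dt * (a2 * b * (1 - m) - m2 * m), 0.0, 1.0)
            b[0] = b[-1] = 0.0                    # null Dirichlet boundary

        print("infected-bird front has reached", np.sum(b > 1e-3), "of", N, "cells")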

  5. OpenFLUX: efficient modelling software for 13C-based metabolic flux analysis

    Directory of Open Access Journals (Sweden)

    Nielsen Lars K

    2009-05-01

    Full Text Available Abstract Background The quantitative analysis of metabolic fluxes, i.e., in vivo activities of intracellular enzymes and pathways, provides key information on biological systems in systems biology and metabolic engineering. It is based on a comprehensive approach combining (i) tracer cultivation on 13C substrates, (ii) 13C labelling analysis by mass spectrometry and (iii) mathematical modelling for experimental design, data processing, flux calculation and statistics. Whereas the cultivation and the analytical parts are fairly advanced, a lack of appropriate modelling software solutions for all modelling aspects in flux studies is limiting the application of metabolic flux analysis. Results We have developed OpenFLUX as a user friendly, yet flexible software application for small and large scale 13C metabolic flux analysis. The application is based on the new Elementary Metabolite Unit (EMU) framework, significantly enhancing computation speed for flux calculation. From simple notation of metabolic reaction networks defined in a spreadsheet, the OpenFLUX parser automatically generates MATLAB-readable metabolite and isotopomer balances, thus strongly facilitating model creation. The model can be used to perform experimental design, parameter estimation and sensitivity analysis either using the built-in gradient-based search or Monte Carlo algorithms or in user-defined algorithms. Exemplified for a microbial flux study with 71 reactions, 8 free flux parameters and mass isotopomer distributions of 10 metabolites, OpenFLUX allowed the EMU-based model to be compiled automatically from an Excel file containing metabolic reactions and carbon transfer mechanisms, showing its user-friendliness. It reliably reproduced the published data, and optimum flux distributions for the network under study were found quickly. Conclusion We have developed a fast, accurate application to perform steady-state 13C metabolic flux analysis. OpenFLUX will strongly facilitate and

  6. OpenFLUX: efficient modelling software for 13C-based metabolic flux analysis.

    Science.gov (United States)

    Quek, Lake-Ee; Wittmann, Christoph; Nielsen, Lars K; Krömer, Jens O

    2009-05-01

    The quantitative analysis of metabolic fluxes, i.e., in vivo activities of intracellular enzymes and pathways, provides key information on biological systems in systems biology and metabolic engineering. It is based on a comprehensive approach combining (i) tracer cultivation on 13C substrates, (ii) 13C labelling analysis by mass spectrometry and (iii) mathematical modelling for experimental design, data processing, flux calculation and statistics. Whereas the cultivation and the analytical parts are fairly advanced, a lack of appropriate modelling software solutions for all modelling aspects in flux studies is limiting the application of metabolic flux analysis. We have developed OpenFLUX as a user friendly, yet flexible software application for small and large scale 13C metabolic flux analysis. The application is based on the new Elementary Metabolite Unit (EMU) framework, significantly enhancing computation speed for flux calculation. From simple notation of metabolic reaction networks defined in a spreadsheet, the OpenFLUX parser automatically generates MATLAB-readable metabolite and isotopomer balances, thus strongly facilitating model creation. The model can be used to perform experimental design, parameter estimation and sensitivity analysis either using the built-in gradient-based search or Monte Carlo algorithms or in user-defined algorithms. Exemplified for a microbial flux study with 71 reactions, 8 free flux parameters and mass isotopomer distributions of 10 metabolites, OpenFLUX allowed the EMU-based model to be compiled automatically from an Excel file containing metabolic reactions and carbon transfer mechanisms, showing its user-friendliness. It reliably reproduced the published data, and optimum flux distributions for the network under study were found quickly. By providing the software open source, we hope it will evolve with the rapidly growing field of fluxomics.
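
    The core fitting loop of 13C flux analysis — adjust free flux parameters until simulated labelling matches measured labelling — can be illustrated on a toy branch point with two pathways. This is not the EMU framework itself; the mass isotopomer distributions (MIDs) and noise below are invented purely for illustration.

        # Toy 13C-MFA fitting step: estimate one flux ratio by least squares.
        import numpy as np
        from scipy.optimize import least_squares

        MID_A = np.array([0.1, 0.6, 0.3])   # product MID via pathway A (assumed)
        MID_B = np.array([0.5, 0.3, 0.2])   # product MID via pathway B (assumed)
        measured = 0.7 * MID_A + 0.3 * MID_B + np.array([0.01, -0.01, 0.0])

        def residuals(f):
            simulated = f[0] * MID_A + (1.0 - f[0]) * MID_B
            return simulated - measured

        fit = least_squares(residuals, x0=[0.5], bounds=(0.0, 1.0))
        print("estimated fraction of flux through pathway A:", round(fit.x[0], 3))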

  7. Near Real-time GNSS-based Ionospheric Model using Expanded Kriging in the East Asia Region

    Science.gov (United States)

    Choi, P. H.; Bang, E.; Lee, J.

    2016-12-01

    Many applications which utilize radio waves (e.g. navigation, communications, and radio sciences) are influenced by the ionosphere. The technology to provide global ionospheric maps (GIM) which show ionospheric Total Electron Content (TEC) has progressed through the processing of GNSS data. However, the GIMs have limited spatial resolution (e.g. 2.5° in latitude and 5° in longitude), because they are generated using globally-distributed and thus relatively sparse GNSS reference station networks. This study presents a near real-time, high spatial resolution TEC model over East Asia built from ionospheric observables from both International GNSS Service (IGS) and local GNSS networks and an expanded kriging method. New signals from multi-constellation GNSS (e.g., GPS L5, Galileo E5) were also used to generate high-precision TEC estimates. The newly proposed estimation method is based on the universal kriging interpolation technique, but integrates TEC data from previous epochs with those from the current epoch to improve the TEC estimation performance by increasing ionospheric observability. To propagate previous measurements to the current epoch, we implemented a Kalman filter whose dynamic model was derived from a first-order Gauss-Markov process characterizing temporal ionospheric changes under nominal ionospheric conditions. Along with the TEC estimates at grid points, the method generates confidence bounds on the estimates using the resulting estimation covariance. We also suggest classifying the confidence bounds into several categories to allow users to recognize the quality levels of TEC estimates according to the requirements of their applications. This paper examines the performance of the proposed method by obtaining estimation results for both nominal and disturbed ionospheric conditions, and compares these results to those provided by the GIM of the NASA Jet Propulsion Laboratory. In addition, the estimation results based on the expanded kriging method are
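
    The filtering idea described — propagate earlier TEC estimates to the current epoch with first-order Gauss-Markov dynamics, then blend in new measurements — can be sketched for a single grid point. The correlation time, noise levels, and units below are illustrative assumptions, not the paper's tuned values.

        # One-grid-point sketch of a Kalman filter with Gauss-Markov dynamics.
        import numpy as np

        tau, sigma = 1800.0, 5.0        # assumed correlation time [s], process sigma [TECU]
        dt, R = 300.0, 4.0              # epoch spacing [s], measurement noise variance
        phi = np.exp(-dt / tau)         # first-order Gauss-Markov transition
        Q = sigma**2 * (1.0 - phi**2)   # discrete process noise (keeps steady variance)

        rng = np.random.default_rng(1)
        truth = rng.normal(0.0, sigma)  # TEC deviation from a background model
        x, P = 0.0, sigma**2            # filter state and covariance
        for _ in range(24):
            truth = phi * truth + rng.normal(0.0, np.sqrt(Q))  # ionosphere evolves
            x, P = phi * x, phi**2 * P + Q                     # propagate old data
            z = truth + rng.normal(0.0, np.sqrt(R))            # new TEC observable
            K = P / (P + R)                                    # Kalman gain
            x, P = x + K * (z - x), (1.0 - K) * P              # measurement update

        print(f"error {abs(x - truth):.2f} TECU, 1-sigma confidence {np.sqrt(P):.2f} TECU")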

  8. Likelihood Analysis of Multivariate Probit Models Using a Parameter Expanded MCEM Algorithm.

    Science.gov (United States)

    Xu, Huiping; Craig, Bruce A

    2010-08-01

    Multivariate binary data arise in a variety of settings. In this paper, we propose a practical and efficient computational framework for maximum likelihood estimation of multivariate probit regression models. This approach uses the Monte Carlo EM (MCEM) algorithm, with parameter expansion to complete the M-step, to avoid the direct evaluation of the intractable multivariate normal orthant probabilities. The parameter expansion not only enables a closed-form solution in the M-step but also improves efficiency. Using simulation studies, we compare the performance of our approach with the MCEM algorithms developed by Chib and Greenberg (1998) and Song and Lee (2005), as well as the iterative approach proposed by Li and Schafer (2008). Our approach is further illustrated using a real-world example.

  9. Likelihood Analysis of Multivariate Probit Models Using a Parameter Expanded MCEM Algorithm

    Science.gov (United States)

    Xu, Huiping; Craig, Bruce A.

    2010-01-01

    Multivariate binary data arise in a variety of settings. In this paper, we propose a practical and efficient computational framework for maximum likelihood estimation of multivariate probit regression models. This approach uses the Monte Carlo EM (MCEM) algorithm, with parameter expansion to complete the M-step, to avoid the direct evaluation of the intractable multivariate normal orthant probabilities. The parameter expansion not only enables a closed-form solution in the M-step but also improves efficiency. Using simulation studies, we compare the performance of our approach with the MCEM algorithms developed by Chib and Greenberg (1998) and Song and Lee (2005), as well as the iterative approach proposed by Li and Schafer (2008). Our approach is further illustrated using a real-world example. PMID:21042430
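
    The quantity whose direct evaluation the MCEM approach avoids is the multivariate normal orthant probability that appears in the probit likelihood. A brute-force Monte Carlo estimate of that probability, for an assumed correlation matrix, shows what is being sidestepped:

        # Monte Carlo estimate of P(Z1>0, Z2>0, Z3>0) for correlated normals,
        # the orthant probability that makes the probit likelihood intractable.
        import numpy as np

        Sigma = np.array([[1.0, 0.5, 0.3],
                          [0.5, 1.0, 0.4],
                          [0.3, 0.4, 1.0]])      # illustrative correlations
        rng = np.random.default_rng(0)
        Z = rng.multivariate_normal(np.zeros(3), Sigma, size=200_000)
        print("orthant probability ~", np.all(Z > 0, axis=1).mean().round(4))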

  10. Expanding uses of building information modeling in life-cycle construction projects.

    Science.gov (United States)

    Hannele, Kerosuo; Reijo, Miettinen; Tarja, Mäki; Sami, Paavola; Jenni, Korpela; Teija, Rantala

    2012-01-01

    BIM is targeted at providing information about the entire building and a complete set of design documents and data stored in an integrated database. In this paper, we study the use of BIM in two life-cycle construction projects in Kuopio, Finland, during 2011. The analysis of the uses of BIM and their main problems will constitute a foundation for an intervention. We focus on the following questions: (1) How do the different partners use the composite BIM model? (2) What are the major contradictions or problems in BIM use? The preliminary findings reported in this study show that BIM has been adopted quite generally in design work, but old ways of collaboration seem to prevail, especially between designers and between designers and building sites. BIM has provided new means and demands for collaboration, but expansive uses of BIM that provide new interactive processes across professional fields have largely not come true.

  11. Software-defined networking model for smart transformers with ISO/IEC/IEEE 21451 sensors

    Directory of Open Access Journals (Sweden)

    Longhua Guo

    2017-06-01

    Full Text Available The advanced IEC 61850 smart transformer has shown an improved performance in monitoring, controlling, and protecting the equipment in smart substations. However, heterogeneity, feasibility, and network control problems have limited the smart transformer's performance in networks. To address these issues, a software-defined networking model was proposed using ISO/IEC/IEEE 21451 networks. An IEC-61850-based network controller was designed as a new kind of intelligent electrical device (IED). The proposed data and information models enhanced the network awareness ability and facilitated the access of smart sensors in the transformer to communication networks. The performance evaluation results showed an improved efficiency.

  12. New software library of geometrical primitives for modelling of solids used in Monte Carlo detector simulations

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    We present our effort for the creation of a new software library of geometrical primitives, which are used for solid modelling in Monte Carlo detector simulations. We plan to replace and unify current geometrical primitive classes in the CERN software projects Geant4 and ROOT with this library. Each solid is represented by a C++ class with methods suited for measuring distances of particles from the surface of a solid and for determination as to whether the particles are located inside, outside or on the surface of the solid. We use numerical tolerance for determining whether the particles are located on the surface. The class methods also contain basic support for visualization. We use dedicated test suites for validation of the shape codes. These include also special performance and numerical value comparison tests for help with analysis of possible candidates of class methods as well as to verify that our new implementation proposals were designed and implemented properly. Currently, bridge classes are u...

  13. NASAL-Geom, a free upper respiratory tract 3D model reconstruction software

    Science.gov (United States)

    Cercos-Pita, J. L.; Cal, I. R.; Duque, D.; de Moreta, G. Sanjuán

    2018-02-01

    The tool NASAL-Geom, a free upper respiratory tract 3D model reconstruction software, is described here. As free software, it can be obtained, analyzed, improved and redistributed by researchers and professionals, potentially increasing the rate of development and at the same time reducing ethical conflicts regarding medical applications that cannot be analyzed. Additionally, the tool has been optimized for the specific task of reading upper respiratory tract Computerized Tomography scans and producing 3D geometries. The reconstruction process is divided into three stages: preprocessing (including Metal Artifact Reduction, noise removal, and feature enhancement), segmentation (where the nasal cavity is identified), and 3D geometry reconstruction. The tool has been automated (i.e., no human intervention is required), a critical feature for avoiding bias in the reconstructed geometries. The applied methodology is discussed, as well as the program's robustness and precision.

  14. Open Source Software for Mapping Human Impacts on Marine Ecosystems with an Additive Model

    Directory of Open Access Journals (Sweden)

    Andy Stock

    2016-06-01

    Full Text Available This paper describes an easy-to-use open source software tool implementing a commonly used additive model (Halpern et al., Science, 2008) for mapping human impacts on marine ecosystems. The tool has been used to map the potential for cumulative human impacts in Arctic marine waters and can support future human impact mapping projects by (1) making the model easier to use; (2) making updates of model results straightforward when better input data become available; (3) storing input data and information about processing steps in a defined format, thus facilitating data sharing and reproduction of modeling results; and (4) supporting basic visualization of model inputs and outputs without the need for advanced technical skills. The tool, called EcoImpactMapper, was implemented in Java and is thus platform-independent. A tutorial, example data, the tool and the source code are available online.
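
    In the additive model the tool implements, the cumulative impact in each map cell is the sum over stressors and habitats of stressor intensity x habitat presence x a stressor-by-habitat vulnerability weight. A minimal sketch with toy arrays and invented weights (the real tool operates on geospatial raster layers):

        # Toy Halpern-style cumulative impact map: impact = sum_ij D_i * E_j * mu_ij.
        import numpy as np

        rng = np.random.default_rng(2)
        D = rng.random((3, 20, 20))              # 3 normalized stressor intensity grids
        E = (rng.random((2, 20, 20)) > 0.5)      # 2 habitat presence/absence grids
        mu = np.array([[1.5, 0.4],               # vulnerability of habitat j
                       [0.8, 2.1],               # to stressor i (invented weights)
                       [0.2, 1.0]])

        impact = np.einsum('ixy,jxy,ij->xy', D, E.astype(float), mu)
        print("max cumulative impact score:", impact.max().round(2))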

  15. The software architecture of climate models: a graphical comparison of CMIP5 and EMICAR5 configurations

    Science.gov (United States)

    Alexander, K.; Easterbrook, S. M.

    2015-04-01

    We analyze the source code of eight coupled climate models, selected from those that participated in the CMIP5 (Taylor et al., 2012) or EMICAR5 (Eby et al., 2013; Zickfeld et al., 2013) intercomparison projects. For each model, we sort the preprocessed code into components and subcomponents based on dependency structure. We then create software architecture diagrams that show the relative sizes of these components/subcomponents and the flow of data between them. The diagrams also illustrate several major classes of climate model design; the distribution of complexity between components, which depends on historical development paths as well as the conscious goals of each institution; and the sharing of components between different modeling groups. These diagrams offer insights into the similarities and differences in structure between climate models, and have the potential to be useful tools for communication between scientists, scientific institutions, and the public.

  16. Experimental processing of a model data set using Geobit seismic software

    Energy Technology Data Exchange (ETDEWEB)

    Suh, Sang Yong [Korea Inst. of Geology Mining and Materials, Taejon (Korea, Republic of)

    1995-12-01

    A seismic data processing software package, Geobit, has been developed and is continuously updated to implement newer processing techniques and to support more hardware platforms. Geobit is intended to support all Unix platforms ranging from PC to CRAY. The current version supports two platforms, i.e., PC/Linux and Sun SPARC-based SunOS 4.1.x. PC/Linux has attracted geophysicists in some universities, who have tried to install Geobit in their laboratories as a research tool. However, one of the problems is the difficulty of obtaining seismic data. The primary reason is its huge volume: field data are too bulky to fit the relatively small storage media, such as PC disks. To solve the problem, KIGAM released a model seismic data set via ftp.kigam.re.kr. This study has two purposes. The first is to test the Geobit software for its suitability in seismic data processing. The test includes reproducing the model through seismic data processing. If it fails to reproduce the original model, the software is considered buggy and incomplete. However, if it can successfully reproduce the input model, I would be proud of what I have accomplished over the last few years in writing Geobit. The second purpose is to give a guide on Geobit usage by providing an example set of job files needed to process the given data. This example will help scientists lacking Geobit experience to concentrate on their study more easily. Once they know the Geobit processing technique, and later on Geobit programming, they can implement their own processing ideas, contributing newer technologies to Geobit. The complete Geobit job files needed to process the model data are written in the following job sequence: (1) data loading, (2) CDP sort, (3) decon analysis, (4) velocity analysis, (5) decon verification, (6) stack, (7) filter analysis, (8) filtered stack, (9) time migration, (10) depth migration. The control variables in the job files are discussed. (author). 10 figs., 1 tab.

  17. Structural informatics, modeling, and design with an open-source Molecular Software Library (MSL).

    Science.gov (United States)

    Kulp, Daniel W; Subramaniam, Sabareesh; Donald, Jason E; Hannigan, Brett T; Mueller, Benjamin K; Grigoryan, Gevorg; Senes, Alessandro

    2012-07-30

    We present the Molecular Software Library (MSL), a C++ library for molecular modeling. MSL is a set of tools that supports a large variety of algorithms for the design, modeling, and analysis of macromolecules. Among the main features supported by the library are methods for applying geometric transformations and alignments, the implementation of a rich set of energy functions, side chain optimization, backbone manipulation, calculation of solvent accessible surface area, and other tools. MSL has a number of unique features, such as the ability to store alternative atomic coordinates (for modeling) and multiple amino acid identities at the same backbone position (for design). It has a straightforward mechanism for extending its energy functions and can work with any type of molecule. Although the code base is large, MSL was created with ease of development in mind. It allows the rapid implementation of simple tasks while fully supporting the creation of complex applications. Some of the potentialities of the software are demonstrated here with examples that show how to program complex and essential modeling tasks with few lines of code. MSL is an ongoing and evolving project, with new features and improvements being introduced regularly, but it is mature and suitable for production and has been used in numerous protein modeling and design projects. MSL is open-source software, freely downloadable at http://msl-libraries.org. We propose it as a common platform for the development of new molecular algorithms and to promote the distribution, sharing, and reutilization of computational methods. Copyright © 2012 Wiley Periodicals, Inc.

  18. Managing Software Project Risks (Analysis Phase) with Proposed Fuzzy Regression Analysis Modelling Techniques with Fuzzy Concepts

    OpenAIRE

    Elzamly, Abdelrafe; Hussin, Burairah

    2014-01-01

    The aim of this paper is to propose new mining techniques by which we can study the impact of different risk management techniques and different software risk factors on software analysis development projects. The new mining technique uses fuzzy multiple regression analysis with fuzzy concepts to manage the risks in a software project and mitigate risk through software process improvement. Top ten software risk factors in the analysis phase and thirty risk management techni...

  19. Designing Spatial Visualisation Tasks for Middle School Students with a 3D Modelling Software: An Instrumental Approach

    Science.gov (United States)

    Turgut, Melih; Uygan, Candas

    2015-01-01

    In this work, certain task designs to enhance middle school students' spatial visualisation ability, in the context of an instrumental approach, have been developed. 3D modelling software, SketchUp®, was used. In the design process, software tools were focused on and, thereafter, the aim was to interpret the instrumental genesis and spatial…

  20. Experiences with Formal Engineering: Model-Based Specification, Implementation and Testing of a Software Bus at Neopost

    NARCIS (Netherlands)

    Sijtema, Marten; Stoelinga, Mariëlle Ida Antoinette; Belinfante, Axel; Marinelli, Lawrence; Salaün, Gwen; Schätz, Bernhard

    We report on the actual industrial use of formal methods during the development of a software bus. At Neopost Inc., we developed the server component of a software bus, called the XBus, using formal methods during the design, validation and testing phase: We modeled our design of the XBus in the

  1. The use of process simulation models in virtual commissioning of process automation software in drinking water treatment plants

    NARCIS (Netherlands)

    Worm, G.I.M.; Kelderman, J.P.; Lapikas, T.; Van der Helm, A.W.C.; Van Schagen, K.M.; Rietveld, L.C.

    2012-01-01

    This research deals with the contribution of process simulation models to the factory acceptance test (FAT) of process automation (PA) software of drinking water treatment plants. Two test teams tested the same piece of modified PA-software. One team used an advanced virtual commissioning (AVC)

  2. Experiences with formal engineering: model-based specification, implementation and testing of a software bus at Neopost

    NARCIS (Netherlands)

    Sijtema, M.; Salaün, G.; Schätz, B.; Belinfante, Axel; Stoelinga, Mariëlle Ida Antoinette; Marinelli, L.

    2014-01-01

    We report on the actual industrial use of formal methods during the development of a software bus. During an internship at Neopost Inc., of 14 weeks, we developed the server component of a software bus, called the XBus, using formal methods during the design, validation and testing phase: we modeled

  3. Time-constrained mother and expanding market: emerging model of under-nutrition in India.

    Science.gov (United States)

    Chaturvedi, S; Ramji, S; Arora, N K; Rewal, S; Dasgupta, R; Deshmukh, V

    2016-07-25

    Persistent high levels of under-nutrition in India despite economic growth continue to challenge political leadership and policy makers at the highest level. The present inductive enquiry was conducted to map the perceptions of mothers and other key stakeholders, to identify emerging drivers of childhood under-nutrition. We conducted a multi-centric qualitative investigation in six empowered action group states of India. The study sample included 509 in-depth interviews with mothers of undernourished and normally nourished children, policy makers, district level managers, implementers and facilitators. Sixty-six focus group discussions and 72 non-formal interactions were conducted in two rounds with primary caretakers of undernourished children, Anganwadi Workers and Auxiliary Nurse Midwives. Based on the perceptions of the mothers and other key stakeholders, a model evolved inductively showing core themes as drivers of under-nutrition. The most forceful emerging themes were: the multitasking, time-constrained mother with dwindling family support; fragile food security or seasonal food paucity; a child-targeted market with wide availability and consumption of ready-to-eat market food items; rising non-food expenditure, in the context of rising food prices; inadequate and inappropriate feeding; delayed recognition of under-nutrition and delayed care seeking; and inadequate responsiveness of the health care system and Integrated Child Development Services (ICDS). The study emphasized that the persistence of child malnutrition in India is also tied closely to the high workload and consequent time constraints of mothers, who are increasingly pursuing income-generating activities and enrolled in the paid labour force without robust institutional support for childcare. The emerging framework needs to be further tested through mixed and multiple method research approaches to quantify the contribution of the time limitation of the mother to the current burden of child under-nutrition.

  4. Time-constrained mother and expanding market: emerging model of under-nutrition in India

    Directory of Open Access Journals (Sweden)

    S. Chaturvedi

    2016-07-01

    Full Text Available Abstract Background Persistent high levels of under-nutrition in India despite economic growth continue to challenge political leadership and policy makers at the highest level. The present inductive enquiry was conducted to map the perceptions of mothers and other key stakeholders, to identify emerging drivers of childhood under-nutrition. Methods We conducted a multi-centric qualitative investigation in six empowered action group states of India. The study sample included 509 in-depth interviews with mothers of undernourished and normally nourished children, policy makers, district level managers, implementers and facilitators. Sixty-six focus group discussions and 72 non-formal interactions were conducted in two rounds with primary caretakers of undernourished children, Anganwadi Workers and Auxiliary Nurse Midwives. Results Based on the perceptions of the mothers and other key stakeholders, a model evolved inductively showing core themes as drivers of under-nutrition. The most forceful emerging themes were: the multitasking, time-constrained mother with dwindling family support; fragile food security or seasonal food paucity; a child-targeted market with wide availability and consumption of ready-to-eat market food items; rising non-food expenditure, in the context of rising food prices; inadequate and inappropriate feeding; delayed recognition of under-nutrition and delayed care seeking; and inadequate responsiveness of the health care system and Integrated Child Development Services (ICDS). The study emphasized that the persistence of child malnutrition in India is also tied closely to the high workload and consequent time constraints of mothers, who are increasingly pursuing income-generating activities and enrolled in the paid labour force without robust institutional support for childcare. Conclusion The emerging framework needs to be further tested through mixed and multiple method research approaches to quantify the contribution of time limitation of

  5. Application of the Software as a Service Model to the Control of Complex Building Systems

    Energy Technology Data Exchange (ETDEWEB)

    Stadler, Michael; Donadee, Jonathan; Marnay, Chris; Mendes, Goncalo; Appen, Jan von; Megel, Oliver; Bhattacharya, Prajesh; DeForest, Nicholas; Lai, Judy

    2011-03-17

    In an effort to create broad access to its optimization software, Lawrence Berkeley National Laboratory (LBNL), in collaboration with the University of California at Davis (UC Davis) and OSISoft, has recently developed a Software as a Service (SaaS) Model for reducing energy costs, cutting peak power demand, and reducing carbon emissions for multipurpose buildings. UC Davis currently collects and stores energy usage data from buildings on its campus. Researchers at LBNL sought to demonstrate that a SaaS application architecture could be built on top of this data system to optimize the scheduling of electricity and heat delivery in the building. The SaaS interface, known as WebOpt, consists of two major parts: a) the investment & planning module and b) the operations module, which builds on the investment & planning module. The operational scheduling and load shifting optimization models within the operations module use data from load prediction and electrical grid emissions models to create an optimal operating schedule for the next week, reducing peak electricity consumption while maintaining quality of energy services. LBNL's application also provides facility managers with suggested energy infrastructure investments for achieving their energy cost and emission goals based on historical data collected with OSISoft's system. This paper describes these models as well as the SaaS architecture employed by LBNL researchers to provide asset scheduling services to UC Davis. The peak demand, emissions, and cost implications of the asset operation schedule and investments suggested by this optimization model are analysed.

  6. Application of the Software as a Service Model to the Control of Complex Building Systems

    Energy Technology Data Exchange (ETDEWEB)

    Stadler, Michael; Donadee, Jon; Marnay, Chris; Mendes, Goncalo; Appen, Jan von; Mégel, Oliver; Bhattacharya, Prajesh; DeForest, Nicholas; Lai, Judy

    2011-03-18

    In an effort to create broad access to its optimization software, Lawrence Berkeley National Laboratory (LBNL), in collaboration with the University of California at Davis (UC Davis) and OSISoft, has recently developed a Software as a Service (SaaS) Model for reducing energy costs, cutting peak power demand, and reducing carbon emissions for multipurpose buildings. UC Davis currently collects and stores energy usage data from buildings on its campus. Researchers at LBNL sought to demonstrate that a SaaS application architecture could be built on top of this data system to optimize the scheduling of electricity and heat delivery in the building. The SaaS interface, known as WebOpt, consists of two major parts: a) the investment & planning module and b) the operations module, which builds on the investment & planning module. The operational scheduling and load shifting optimization models within the operations module use data from load prediction and electrical grid emissions models to create an optimal operating schedule for the next week, reducing peak electricity consumption while maintaining quality of energy services. LBNL's application also provides facility managers with suggested energy infrastructure investments for achieving their energy cost and emission goals based on historical data collected with OSISoft's system. This paper describes these models as well as the SaaS architecture employed by LBNL researchers to provide asset scheduling services to UC Davis. The peak demand, emissions, and cost implications of the asset operation schedule and investments suggested by this optimization model are analyzed.
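
    The operational load-shifting idea can be illustrated as a small linear program: choose a battery's hourly charge/discharge power to minimize the building's peak demand subject to a power limit and an energy-neutral day. The formulation and every number below are illustrative, not WebOpt's actual model.

        # Toy peak-shaving LP: minimize peak subject to load[t] - x[t] <= peak.
        import numpy as np
        from scipy.optimize import linprog

        load = np.array([40, 35, 30, 45, 70, 90, 85, 60.0])  # kW forecast, 8 hours
        H, rate = len(load), 20.0                            # battery power limit [kW]

        # variables: x[0..H-1] = battery power (+discharge, -charge), x[H] = peak
        c = np.zeros(H + 1); c[H] = 1.0                      # minimize the peak
        A_ub, b_ub = [], []
        for t in range(H):
            row = np.zeros(H + 1); row[t] = -1.0; row[H] = -1.0
            A_ub.append(row); b_ub.append(-load[t])          # load - x[t] - peak <= 0
        A_eq, b_eq = [np.append(np.ones(H), 0.0)], [0.0]     # energy-neutral schedule
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                      bounds=[(-rate, rate)] * H + [(0, None)])
        print(f"peak reduced from {load.max():.0f} kW to {res.x[-1]:.0f} kW")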

  7. The development and application of composite complexity models and a relative complexity metric in a software maintenance environment

    Science.gov (United States)

    Hops, J. M.; Sherif, J. S.

    1994-01-01

    A great deal of effort is now being devoted to the study, analysis, prediction, and minimization of expected software maintenance cost, long before software is delivered to users or customers. It has been estimated that, on average, the effort spent on software maintenance is as costly as the effort spent on all other software activities. Software design methods should be the starting point for alleviating the problems of software maintenance complexity and high costs. Two aspects of maintenance deserve attention: (1) protocols for locating and rectifying defects, and for ensuring that no new defects are introduced in the development phase of the software process; and (2) protocols for modification, enhancement, and upgrading. This article focuses primarily on the second aspect, the development of protocols to help increase the quality and reduce the costs associated with modifications, enhancements, and upgrades of existing software. This study developed parsimonious models and a relative complexity metric for complexity measurement of software that were used to rank the modules in the system relative to one another. Some success was achieved in using the models and the relative metric to identify maintenance-prone modules.
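
    The general shape of such a relative complexity metric — standardize several raw metrics per module, combine them with weights, and rank modules against one another — can be sketched as follows; the metric choices, weights, and measurements are illustrative, not the paper's calibrated models.

        # Sketch: rank modules by a weighted sum of standardized metrics.
        import numpy as np

        modules = ["parser", "scheduler", "io", "ui"]
        # columns: lines of code, cyclomatic complexity, fan-out (toy values)
        raw = np.array([[1200, 35, 12],
                        [ 800, 52,  7],
                        [ 300, 10,  3],
                        [1500, 28, 20]], dtype=float)

        z = (raw - raw.mean(axis=0)) / raw.std(axis=0)   # standardize each metric
        weights = np.array([0.3, 0.5, 0.2])              # assumed importance
        score = z @ weights                              # relative complexity

        for name, s in sorted(zip(modules, score), key=lambda p: -p[1]):
            print(f"{name:>9}: {s:+.2f}")                # likely maintenance-prone first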

  8. Guide to software export

    CERN Document Server

    Philips, Roger A

    2014-01-01

    An ideal reference source for CEOs, marketing and sales managers, sales consultants, and students of international marketing, Guide to Software Export provides a step-by-step approach to initiating or expanding international software sales. It teaches you how to examine critically your candidate product for exportability; how to find distributors, agents, and resellers abroad; how to identify the best distribution structure for export; and much, much more!Not content with providing just the guidelines for setting up, expanding, and managing your international sales channels, Guide to Software

  9. Integrating remote sensing with species distribution models; Mapping tamarisk invasions using the Software for Assisted Habitat Modeling (SAHM)

    Science.gov (United States)

    West, Amanda M.; Evangelista, Paul H.; Jarnevich, Catherine S.; Young, Nicholas E.; Stohlgren, Thomas J.; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan

    2016-01-01

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fit with satellite remote sensing data to map invasive tamarisk (Tamarix spp.) along the Arkansas River in Southeastern Colorado. The models tested included boosted regression trees (BRT), Random Forest (RF), multivariate adaptive regression splines (MARS), generalized linear model (GLM), and Maxent. These analyses were conducted using a newly developed software package called the Software for Assisted Habitat Modeling (SAHM). All models were trained with 499 presence points, 10,000 pseudo-absence points, and predictor variables acquired from the Landsat 5 Thematic Mapper (TM) sensor over an eight-month period to distinguish tamarisk from native riparian vegetation through detection of phenological differences. From the Landsat scenes, we used individual bands and calculated the Normalized Difference Vegetation Index (NDVI), Soil-Adjusted Vegetation Index (SAVI), and tasseled cap transformations. All five models successfully identified current tamarisk distribution on the landscape, based on threshold-independent and threshold-dependent evaluation metrics with independent location data. To account for model-specific differences, we produced an ensemble of all five models with map output highlighting areas of agreement and areas of uncertainty. Our results demonstrate the usefulness of species distribution models in analyzing remotely sensed data and the utility of ensemble mapping, and showcase the capability of SAHM in pre-processing and executing multiple complex models.
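
    The ensemble step can be sketched by fitting several presence/absence models to the same predictors and mapping where they agree. Here scikit-learn classifiers stand in for SAHM's BRT/RF/MARS/GLM/Maxent suite, and the data are synthetic.

        # Toy ensemble of species distribution models with an agreement count.
        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(3)
        X = rng.normal(size=(600, 4))        # e.g. NDVI, SAVI, two TM bands
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 600) > 0).astype(int)
        X_grid = rng.normal(size=(1000, 4))  # cells to map

        models = [GradientBoostingClassifier(), RandomForestClassifier(),
                  LogisticRegression()]
        votes = sum(m.fit(X, y).predict(X_grid) for m in models)

        print("cells where all models agree on presence:", int((votes == 3).sum()))
        print("uncertain cells (models disagree):",
              int(((votes > 0) & (votes < 3)).sum()))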

  10. Software engineering the current practice

    CERN Document Server

    Rajlich, Vaclav

    2011-01-01

    INTRODUCTION: History of Software Engineering; Software Properties; Origins of Software; Birth of Software Engineering; Third Paradigm: Iterative Approach; Software Life Span Models; Staged Model; Variants of Staged Model. Software Technologies: Programming Languages and Compilers; Object-Oriented Technology; Version Control System; Software Models; Class Diagrams; UML Activity Diagrams; Class Dependency Graphs and Contracts. SOFTWARE CHANGE: Introduction to Software Change; Characteristics of Software Change; Phases of Software Change; Requirements and Their Elicitation; Requirements Analysis and Change Initiation; Concepts and Concept...

  11. A novel opinion dynamics model based on expanded observation ranges and individuals’ social influences in social networks

    Science.gov (United States)

    Diao, Su-Meng; Liu, Yun; Zeng, Qing-An; Luo, Gui-Xun; Xiong, Fei

    2014-12-01

    In this paper, we propose an opinion dynamics model in order to investigate opinion evolution and interactions and the behavior of individuals. By introducing social influence and its feedback mechanism, the proposed model can highlight the heterogeneity of individuals and reproduce realistic online opinion interactions. It can also expand the observation range of affected individuals. Combining psychological studies on the social impact of majorities and minorities, affected individuals update their opinions by balancing social impact from both supporters and opponents. It can be seen that complete consensus is not always obtained. When the initial density of either side is greater than 0.8, the enormous imbalance leads to complete consensus. Otherwise, opinion clusters consisting of a set of tightly connected individuals who hold similar opinions appear. Moreover, a tradeoff is discovered between high interaction intensity and low stability with regard to observation ranges. The intensity of each interaction is negatively correlated with observation range, while the stability of each individual’s opinion positively affects the correlation. Furthermore, the proposed model presents the power-law properties in the distribution of individuals’ social influences, which is in agreement with people’s daily cognition. Additionally, it is proven that the initial distribution of individuals’ social influences has little effect on the evolution.
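
    A minimal sketch of an update rule in this spirit: each agent weighs the social influences of supporters against opponents within its observation range, flips when opponents dominate, and has its influence adjusted as feedback. The functional forms and constants are simplified assumptions, not the paper's exact model.

        # Toy opinion dynamics on a ring with heavy-tailed social influences.
        import numpy as np

        rng = np.random.default_rng(4)
        n, steps, radius = 200, 5000, 5
        opinion = rng.choice([-1.0, 1.0], size=n)
        influence = rng.pareto(2.0, size=n) + 1.0     # power-law-like influences

        for _ in range(steps):
            i = rng.integers(n)
            nbrs = [(i + d) % n for d in range(-radius, radius + 1) if d != 0]
            support = sum(influence[j] for j in nbrs if opinion[j] == opinion[i])
            oppose = sum(influence[j] for j in nbrs if opinion[j] != opinion[i])
            if oppose > support:
                opinion[i] = -opinion[i]              # impact balance flips opinion
                influence[i] *= 0.95                  # feedback: yielding costs influence
            else:
                influence[i] *= 1.01                  # holding ground builds influence

        frac = (opinion > 0).mean()
        print(f"share holding +1 opinion: {frac:.2f} (consensus only if 0.00 or 1.00)")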

  12. Common Practices from Two Decades of Water Resources Modelling Published in Environmental Modelling & Software: 1997 to 2016

    Science.gov (United States)

    Ames, D. P.; Peterson, M.; Larsen, J.

    2016-12-01

    A steady flow of manuscripts describing integrated water resources management (IWRM) modelling has been published in Environmental Modelling & Software since the journal's inaugural issue in 1997. These papers represent two decades of peer-reviewed scientific knowledge regarding methods, practices, and protocols for conducting IWRM. We have undertaken to explore this specific assemblage of literature with the intention of identifying commonly reported procedures in terms of data integration methods, modelling techniques, approaches to stakeholder participation, means of communication of model results, and other elements of the model development and application life cycle. Initial results from this effort will be presented including a summary of commonly used practices, and their evolution over the past two decades. We anticipate that results will show a pattern of movement toward greater use of both stakeholder/participatory modelling methods as well as increased use of automated methods for data integration and model preparation. Interestingly, such results could be interpreted to show that the availability of better, faster, and more integrated software tools and technologies free the modeler to take a less technocratic and more human approach to water resources modelling.

  13. Design and Implementation of Scientific Software Components to Enable Multiscale Modeling: The Effective Fragment Potential (QM/EFP) Method

    Energy Technology Data Exchange (ETDEWEB)

    Gaenko, Alexander [Ames Laboratory; Windus, Theresa L. [Ames Laboratory; Sosonkina, Masha [Ames Laboratory; Gordon, Mark S. [Ames Laboratory

    2012-10-19

    The design and development of scientific software components to provide an interface to the effective fragment potential (EFP) methods are reported. Multiscale modeling of physical and chemical phenomena demands the merging of software packages developed by research groups in significantly different fields. Componentization offers an efficient way to realize new high performance scientific methods by combining the best models available in different software packages without a need for package readaptation after the initial componentization is complete. The EFP method is an efficient electronic structure theory based model potential that is suitable for predictive modeling of intermolecular interactions in large molecular systems, such as liquids, proteins, atmospheric aerosols, and nanoparticles, with an accuracy that is comparable to that of correlated ab initio methods. The developed components make the EFP functionality accessible for any scientific component-aware software package. The performance of the component is demonstrated on a protein interaction model, and its accuracy is compared with results obtained with coupled cluster methods.

  14. E-COCOMO: The Extended COst Constructive MOdel for Cleanroom Software Engineering

    Directory of Open Access Journals (Sweden)

    Hitesh KUMAR SHARMA

    2014-02-01

    Full Text Available Mistakes create rework. Rework takes time and increases costs. The traditional software engineering methodology defines the ratio of Design:Code:Test as 40:20:40. Since 40% of the time and effort is spent in the testing phase of the traditional approach, rework must be performed after the design and code phases whenever bugs are found in testing, and this rework increases costs exponentially. The cleanroom software engineering methodology controls this exponential growth in cost by removing the rework: do the work correctly in the first attempt and move to the next phase only after obtaining a proof of correctness. This approach minimizes rework and reduces costs accordingly. Because the testing phase is removed, the COCOMO (COst COnstructive MOdel) used in traditional engineering is not directly applicable to cleanroom software engineering, and the traditional cost drivers need to be revised. We have proposed an extended version of COCOMO (E-COCOMO), in which we have incorporated some new cost drivers. This paper explains the proposed E-COCOMO and gives a detailed description of the proposed new cost drivers.
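
    For context, the COCOMO arithmetic being extended is effort = a · KLOC^b · EAF, where the effort adjustment factor (EAF) is the product of the cost-driver multipliers; E-COCOMO's proposed cleanroom-specific drivers would enter the same product. A minimal sketch with the classic semi-detached constants and illustrative driver ratings:

        # Intermediate-COCOMO effort estimate; any new E-COCOMO driver would
        # simply contribute one more multiplier to the EAF product.
        def cocomo_effort(kloc, drivers, a=3.0, b=1.12):
            eaf = 1.0
            for multiplier in drivers.values():
                eaf *= multiplier
            return a * kloc**b * eaf                 # person-months

        drivers = {"RELY": 1.15, "CPLX": 1.30, "ACAP": 0.86}  # illustrative ratings
        print(f"{cocomo_effort(32, drivers):.1f} person-months for 32 KLOC")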

  15. Surgical model-view-controller simulation software framework for local and collaborative applications.

    Science.gov (United States)

    Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2011-07-01

    Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures, maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with update rates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with results comparable to VEs for local users.
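
    The decoupling pattern itself — a haptics controller running near 1 kHz while the viewer samples shared model state at display rate — can be sketched with two loops. This toy Python version only illustrates the multi-rate MVC structure, not the actual C++ framework.

        # Two loops at different rates sharing one model, MVC-style.
        import threading, time

        state = {"force": 0.0}                   # the shared model
        lock = threading.Lock()
        stop = threading.Event()

        def haptics_controller():                # ~1000 Hz update target
            while not stop.is_set():
                with lock:
                    state["force"] = 0.5 * state["force"] + 1.0  # toy dynamics
                time.sleep(0.001)

        def render_viewer():                     # ~60 Hz, reads the model
            for _ in range(30):
                with lock:
                    f = state["force"]
                print(f"render frame, force={f:.2f}")
                time.sleep(1 / 60)

        t = threading.Thread(target=haptics_controller)
        t.start()
        render_viewer()
        stop.set()
        t.join()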

  16. Comparison of Antenna Array Systems Using OFDM for Software Radio via the SIBIC Model

    Directory of Open Access Journals (Sweden)

    Robert D. Palmer

    2005-09-01

    Full Text Available This paper investigates the performance of two candidates for software radio WLAN, reconfigurable OFDM modulation and antenna diversity, in an indoor environment. The scenario considered is a 20 m×10 m×3 m room with two base units and one mobile unit. The two base units use omnidirectional antennas to transmit and the mobile unit uses either a single antenna with equalizer, a fixed beamformer with equalizer, or an adaptive beamformer with equalizer to receive. The modulation constellation of the data is QPSK and 16-QAM. The response of the channel at the mobile unit is simulated using a three-dimensional indoor WLAN propagation model that generates multipath components with realistic spatial and temporal correlation. An underlying assumption of the scenario is that existing antenna hardware is available and could be exploited if software processing resources are allocated. The results of the simulations indicate that schemes using more resources outperform simpler schemes in most cases. This implies that desired user performance could be used to dynamically assign software processing resources to the demands of a particular indoor WLAN channel if such resources are available.

  17. An Efficient Technique for Bayesian Modelling of Family Data Using the BUGS software

    Directory of Open Access Journals (Sweden)

    Harold T Bae

    2014-11-01

    Full Text Available Linear mixed models have become a popular tool for analyzing continuous data from family-based designs, using random effects that model the correlation of subjects from the same family. However, mixed models for family data are challenging to implement with the BUGS (Bayesian inference Using Gibbs Sampling) software because of the high-dimensional covariance matrix of the random effects. This paper describes an efficient parameterization that utilizes the singular value decomposition of the covariance matrix of random effects, includes the BUGS code for such an implementation, and extends the parameterization to generalized linear mixed models. The implementation is evaluated using simulated data, and an example from a large family-based study is presented with a comparison to other existing methods.
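
    The reparameterization works because u ~ N(0, Σ) can be generated from independent standard normals z via the decomposition Σ = U S Uᵀ, setting u = U S^{1/2} z; the independent z are then easy to declare in BUGS. A numpy check of the identity on a toy two-sibling covariance (values illustrative):

        # Verify that u = U * sqrt(S) * z reproduces the target covariance.
        import numpy as np

        Sigma = 0.6 * np.array([[1.0, 0.5],   # additive-genetic covariance for
                                [0.5, 1.0]])  # two full sibs (kinship 0.5)
        U, S, _ = np.linalg.svd(Sigma)

        rng = np.random.default_rng(5)
        z = rng.standard_normal((2, 100_000)) # independent N(0,1) draws
        u = U @ (np.sqrt(S)[:, None] * z)     # correlated random effects

        print("empirical covariance:\n", np.cov(u).round(3))  # ~ Sigma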

  18. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

    Software quality is a specific property which tells what kind of standards software should have. In a software project, quality is the key factor in the success or decline of a software-related organization. Much research has been done regarding software quality. Software-related organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good quality software. Quality is divided into three main layers, which are Software Quality Assurance (SQA), Software Qu...

  19. Model-based software for simulating ultrasonic pulse/echo inspections of metal components

    Science.gov (United States)

    Chiou, Chien-Ping; Margetan, Frank J.; Taylor, Jared L.; McKillip, Matthew; Engle, Brady J.; Roberts, Ronald A.; Barnard, Daniel J.

    2017-02-01

    Under the sponsorship of the National Science Foundation's Industry/University Cooperative Research Center at Iowa State University, an effort was initiated in 2015 to repackage existing research-grade software into user-friendly tools for the rapid estimation of signal-to-noise ratio (S/N) for ultrasonic inspections of metals. The software combines: (1) a Python-based graphical user interface for specifying an inspection scenario and displaying results; and (2) a Fortran-based engine for computing defect signals and backscattered grain noise characteristics. The latter makes use of the Thompson-Gray Model for the response from an internal defect and the Independent Scatterer Model for backscattered grain noise. This paper provides an overview of the ongoing modeling effort with emphasis on recent developments. These include: treatment of angle-beam inspections; implementation of distance-amplitude corrections; changes in the generation of "invented" calibration signals; efforts to simulate ultrasonic C-scans; and experimental testing of model predictions. The simulation software can now treat both normal- and oblique-incidence immersion inspections of curved metal components having equiaxed microstructures in which the grain size varies with depth. Both longitudinal and shear-wave inspections are treated. The model transducer can be planar, spherically focused, or bi-cylindrically focused. A calibration (or reference) signal is required and is used to deduce the measurement system efficiency function. This can be "invented" by the software using center frequency and bandwidth information specified by the user, or, alternatively, a measured calibration signal can be used. Defect types include flat-bottomed-hole reference reflectors, and spherical pores and inclusions. Simulation outputs include estimated defect signal amplitudes, root-mean-squared grain noise amplitudes, and S/N as functions of the depth of the defect within the metal component. At any particular

  20. On the Characterization and Software Implementation of General Protein Lattice Models

    Science.gov (United States)

    Bechini, Alessio

    2013-01-01

    Abstract models of proteins have been widely used as a practical means to computationally investigate general properties of the system. In lattice models, any sterically feasible conformation is represented as a self-avoiding walk on a lattice, and residue types are limited in number. So far, only two- or three-dimensional lattices have been used. The inspection of the neighborhood of alpha carbons in the core of real proteins reveals that lattices with higher coordination numbers, possibly in higher-dimensional spaces, can also be adopted. In this paper, a new general parametric lattice model for simplified protein conformations is proposed and investigated. It is shown how the supporting software can be consistently designed to let algorithms that operate on protein structures be implemented in a lattice-agnostic way. The necessary theoretical foundations are developed and organically presented, pinpointing the role of the concept of main directions in lattice-agnostic model handling. Subsequently, the model features across dimensions and lattice types are explored in tests performed on benchmark protein sequences, using a Python implementation. Simulations give insights into the use of square and triangular lattices in a range of dimensions. The trend of the potential minimum for sequences of different lengths, varying the lattice dimension, is uncovered. Moreover, an extensive quantitative characterization of the usage of the so-called “move types” is reported for the first time. The proposed general framework for the development of lattice models is simple yet complete, and an object-oriented architecture can be proficiently employed for the supporting software, by designing ad-hoc classes. The proposed framework represents a new general viewpoint that potentially subsumes a number of solutions previously studied. The adoption of the described model pushes one to look at protein structure issues from a more general and essential perspective, making computational
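
    The lattice-agnostic idea can be sketched by writing a self-avoiding walk generator that takes the lattice's direction set ("main directions") as a parameter, so switching lattices or dimensions means swapping one list. A minimal illustration, not the paper's framework:

        # Self-avoiding walk parameterized by the lattice's direction set.
        import random

        SQUARE_2D = [(1, 0), (-1, 0), (0, 1), (0, -1)]   # coordination number 4
        # a triangular or higher-dimensional lattice is just another list

        def self_avoiding_walk(n, directions, tries=1000):
            for _ in range(tries):                       # restart on dead ends
                walk = [(0,) * len(directions[0])]
                occupied = {walk[0]}
                while len(walk) < n:
                    options = [tuple(a + b for a, b in zip(walk[-1], d))
                               for d in directions]
                    options = [p for p in options if p not in occupied]
                    if not options:
                        break                            # dead end
                    walk.append(random.choice(options))
                    occupied.add(walk[-1])
                if len(walk) == n:
                    return walk
            return None

        print(self_avoiding_walk(20, SQUARE_2D))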