WorldWideScience

Sample records for modeling cem software

  1. SpaCEM3: a software for biological module detection when data is incomplete, high dimensional and dependent.

    Science.gov (United States)

    Vignes, Matthieu; Blanchet, Juliette; Leroux, Damien; Forbes, Florence

    2011-03-15

    Among classical methods for module detection, SpaCEM3 provides ad hoc algorithms that were shown to be particularly well adapted to specific features of biological data: high dimensionality, interactions between components (genes) and integrated treatment of missingness in observations. The software, currently in its version 2.0, is developed in C++ and can be used either via the command line or with the GUI under Linux and Windows environments. The SpaCEM3 software, documentation and datasets are available from http://spacem3.gforge.inria.fr/.

  2. ROMANCE: A new software tool to improve data robustness and feature identification in CE-MS metabolomics.

    Science.gov (United States)

    González-Ruiz, Víctor; Gagnebin, Yoric; Drouin, Nicolas; Codesido, Santiago; Rudaz, Serge; Schappler, Julie

    2018-01-02

    The use of capillary electrophoresis coupled to mass spectrometry (CE-MS) in metabolomics remains an oddity compared to the widely adopted use of liquid chromatography. This technique is traditionally regarded as lacking the reproducibility to adequately identify metabolites by their migration times. The major reason is the variability of the velocity of the background electrolyte, mainly coming from shifts in the magnitude of the electroosmotic flow and from the suction caused by electrospray interfaces. The use of the effective electrophoretic mobility is one solution to overcome this issue as it is a characteristic feature of each compound. To date, such an approach has not been applied to metabolomics due to the complexity and size of CE-MS data obtained in such studies. In this paper, ROMANCE (RObust Metabolomic Analysis with Normalized CE) is introduced as a new software for CE-MS-based metabolomics. It allows the automated conversion of batches of CE-MS files with minimal user intervention. ROMANCE converts the x-axis of each MS file from the time into the effective mobility scale and the resulting files are already pseudo-aligned, present normalized peak areas and improved reproducibility, and can eventually follow existing metabolomic workflows. The software was developed in Scala, so it is multi-platform and computationally-efficient. It is available for download under a CC license. In this work, the versatility of ROMANCE was demonstrated by using data obtained in the same and in different laboratories, as well as its application to the analysis of human plasma samples. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
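
    As a point of reference, the time-to-mobility conversion that ROMANCE automates rests on the standard CE relation µ_eff = (L_d · L_t / V)(1/t_m − 1/t_EOF). The short Python sketch below illustrates that conversion only; it is not ROMANCE's Scala implementation, and the capillary dimensions, voltage and times are invented example values.

        # Minimal sketch of converting CE migration times to effective electrophoretic
        # mobilities, the transformation ROMANCE applies to the x-axis of CE-MS files.
        # Not ROMANCE's actual (Scala) code; values and names are illustrative.

        def effective_mobility(t_migration_s, t_eof_s, l_total_m, l_detector_m, voltage_v):
            """Effective mobility (m^2 V^-1 s^-1) from migration and EOF marker times."""
            return (l_detector_m * l_total_m / voltage_v) * (1.0 / t_migration_s - 1.0 / t_eof_s)

        # Example: 80 cm capillary, 71.5 cm to the detector, +30 kV, EOF marker at 120 s.
        for t in (150.0, 300.0, 600.0):  # analyte migration times in seconds
            mu = effective_mobility(t, 120.0, 0.80, 0.715, 30000.0)
            print(f"t = {t:5.0f} s  ->  mu_eff = {mu:.3e} m2/(V s)")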

  3. Transformation of standardized clinical models based on OWL technologies: from CEM to OpenEHR archetypes.

    Science.gov (United States)

    Legaz-García, María del Carmen; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás; Chute, Christopher G; Tao, Cui

    2015-05-01

    The semantic interoperability of electronic healthcare records (EHRs) systems is a major challenge in the medical informatics area. International initiatives pursue the use of semantically interoperable clinical models, and ontologies have frequently been used in semantic interoperability efforts. The objective of this paper is to propose a generic, ontology-based, flexible approach for supporting the automatic transformation of clinical models, which is illustrated for the transformation of Clinical Element Models (CEMs) into openEHR archetypes. Our transformation method exploits the fact that the information models of the most relevant EHR specifications are available in the Web Ontology Language (OWL). The transformation approach is based on defining mappings between those ontological structures. We propose a way in which CEM entities can be transformed into openEHR by using transformation templates and OWL as common representation formalism. The transformation architecture exploits the reasoning and inferencing capabilities of OWL technologies. We have devised a generic, flexible approach for the transformation of clinical models, implemented for the unidirectional transformation from CEM to openEHR, a series of reusable transformation templates, a proof-of-concept implementation, and a set of openEHR archetypes that validate the methodological approach. We have been able to transform CEM into archetypes in an automatic, flexible, reusable transformation approach that could be extended to other clinical model specifications. We exploit the potential of OWL technologies for supporting the transformation process. We believe that our approach could be useful for international efforts in the area of semantic interoperability of EHR systems. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
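
    To make the notion of a transformation template more concrete, the following deliberately simplified Python sketch maps a toy CEM-like element onto an openEHR-like archetype node through a declarative mapping table. The real approach operates on OWL representations with reasoning support; every field name and mapping entry below is an illustrative assumption, not part of the authors' implementation.

        # Toy sketch of the "transformation template" idea: a declarative mapping
        # applied recursively to a CEM-like element to produce an openEHR-like node.
        # All type names and mappings below are invented for illustration.
        import itertools

        _node_ids = itertools.count(1)

        CEM_TO_OPENEHR_TEMPLATE = {      # CEM model type -> openEHR class (assumed mapping)
            "statement": "EVALUATION",
            "data": "ELEMENT",
            "qualifier": "CLUSTER",
        }

        def transform(cem_element):
            """Apply the (toy) transformation template to one CEM element, recursively."""
            return {
                "rm_class": CEM_TO_OPENEHR_TEMPLATE.get(cem_element["type"], "ELEMENT"),
                "node_id": f"at{next(_node_ids):04d}",
                "name": cem_element["name"],
                "children": [transform(c) for c in cem_element.get("items", [])],
            }

        cem = {"type": "statement", "name": "BloodPressureMeas",
               "items": [{"type": "data", "name": "SystolicPressure"},
                         {"type": "data", "name": "DiastolicPressure"}]}
        print(transform(cem))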

  4. CEM reporting workstation

    Energy Technology Data Exchange (ETDEWEB)

    Caulfield, C. [Electric Software Products, Los Altos, CA (United States)]; Dene, C. [Electric Power Research Inst., Palo Alto, CA (United States)]

    1995-12-31

    The purpose of this report is to describe the Continuous Emissions Monitoring (CEM) Reporting Workstation. The authors explore the uses for the CEM Reporting Workstation, look at the technical challenges, summarize the CEM Reporting Workstation solutions and describe the CEM Reporting Workstation architecture and development aspects. The Electric Power Research Institute (EPRI) was formed to apply advanced science and technology to the benefit of member utilities and their customers. Funded through annual membership dues from some 660 member utilities, EPRI's work covers a wide range of technologies related to the generation, delivery, and use of electricity, with special attention paid to cost-effectiveness and environmental concerns. Texas Utilities and Pennsylvania Electric approached EPRI to look at their concerns about reporting emissions data to the EPA and making the emissions information available to many people within the utility. EPRI contracted Electric Software Products to research the utility market to find out specific needs of the utilities and to develop the CEM Reporting Workstation.

  5. Evolution of the argillite / CEM I interface at 70 °C: in situ tests and modelling results

    International Nuclear Information System (INIS)

    Lalan, P.; Dauzeres, A.; Barker, E.; De Windt, L.; Detilleux, V.; Desveaux, P.

    2015-01-01

    The French radioactive waste disposal concept involves cementitious materials in a clayey host rock. The presence of exothermic waste in the storage cells may induce a temperature of about 70 °C at the material interfaces. To date, most experiments have been carried out at about 20 °C, and studies at higher temperature are scarce, especially experiments considering diffusion through the cement / clay interface. The still ongoing study presented here focuses on the argillite / CEM I interface. A one-year experiment under in situ conditions at the Tournemire experimental station (IRSN) was carried out; in parallel, preliminary reactive transport modelling with HYTEC helped to understand the impact of a high temperature on the physico-chemical behaviour of the cement / clay interface. The first results showed decalcification of the cement and diffuse carbonation, as well as possible precipitation of illite-type clay phases. A C-S-H ribbon appeared at the interface between the two materials, and a layer grew between the C-S-H ribbon and the cementitious material. This layer contained zeolites and behaved as a diffusive barrier. After one year of in situ interaction, the disturbance thickness was about 350 microns in the CEM I cement paste and about 100 microns in the argillite. The modelling reproduced the experimentally observed processes relatively well, but the extent of the disturbance is too wide and the zeolite layer is misplaced compared with the experimental observations. This study highlights the lack of data at higher temperatures on reaction kinetics and diffusion coefficients, but also on porosity variations. (authors)

  6. 40 CFR Table 6 to Subpart Bbbb of... - Model Rule-Requirements for Validating Continuous Emission Monitoring Systems (CEMS)

    Science.gov (United States)

    2010-07-01

    ... 40 CFR Protection of Environment, Table 6 to Subpart BBBB of Part 60 (2010-07-01): Model Rule-Requirements for Validating Continuous Emission Monitoring Systems (CEMS) ... levels ... Use the following methods in appendix A of this part to measure oxygen (or carbon dioxide) ...

  7. Buttressing staples with cholecyst-derived extracellular matrix (CEM) reinforces staple lines in an ex vivo peristaltic inflation model.

    LENUS (Irish Health Repository)

    Burugapalli, Krishna

    2008-11-01

    Staple line leakage and bleeding are the most common problems associated with the use of surgical staplers for gastrointestinal resection and anastomotic procedures. These complications can be reduced by reinforcing the staple lines with buttressing materials. The current study reports the potential use of cholecyst-derived extracellular matrix (CEM) in non-crosslinked (NCEM) and crosslinked (XCEM) forms, and compares their mechanical performance with clinically available buttress materials [small intestinal submucosa (SIS) and bovine pericardium (BP)] in an ex vivo small intestine model.

  8. Mercury CEM Calibration

    Energy Technology Data Exchange (ETDEWEB)

    John Schabron; Joseph Rovani; Mark Sanderson

    2008-02-29

    Mercury continuous emissions monitoring systems (CEMS) are being implemented in over 800 coal-fired power plant stacks. The power industry desires to conduct at least a full year of monitoring before the formal monitoring and reporting requirement begins on January 1, 2009. It is important for the industry to have available reliable, turnkey equipment from CEM vendors. Western Research Institute (WRI) is working closely with the Electric Power Research Institute (EPRI), the National Institute of Standards and Technology (NIST), and the Environmental Protection Agency (EPA) to facilitate the development of the experimental criteria for a NIST traceability protocol for dynamic elemental mercury vapor generators. The generators are used to calibrate mercury CEMs at power plant sites. The Clean Air Mercury Rule (CAMR), which was published in the Federal Register on May 18, 2005, requires that calibration be performed with NIST-traceable standards (Federal Register 2007). Traceability procedures will be defined by EPA. An initial draft traceability protocol was issued by EPA in May 2007 for comment. In August 2007, EPA issued an interim traceability protocol for elemental mercury generators (EPA 2007). The protocol is based on the actual analysis of the output of each calibration unit at several concentration levels, ranging initially from about 2-40 µg/m³ elemental mercury and in the future down to 0.2 µg/m³, and this analysis will be directly traceable to analyses by NIST. The document is divided into two separate sections. The first deals with the qualification of generators by the vendors for use in mercury CEM calibration. The second describes the procedure that the vendors must use to certify the generator models that meet the qualification specifications. The NIST-traceable certification is performance based, traceable to analysis using isotope dilution inductively coupled plasma/mass spectrometry performed by NIST in Gaithersburg, MD.

  9. Using fuzzy models to migrate from customer relationship management (CRM) to customer experience management (CEM)

    OpenAIRE

    Dr. Anna Maria Gil-Lafuente; Carolina Luis-Bassa

    2011-01-01

    Relationship Marketing has made rapid progress during the last ten years. Since the development of the customer-centric model, reinforced by the emergence of CRM (Customer Relationship Management) strategies, companies have focused on finding models and tools that allow them to get to know their clients better. The management of the customer's relationship with the company has evolved from seeking customer satisfaction to seeking customer loyalty, and later on to creating a brand advocate consumer f...

  10. 40 CFR Table 7 to Subpart Bbbb of... - Model Rule-Requirements for Continuous Emission Monitoring Systems (CEMS)

    Science.gov (United States)

    2010-07-01

    ... 40 CFR Protection of Environment, Table 7 to Subpart BBBB of Part 60: Model Rule-Requirements for Continuous Emission Monitoring Systems (CEMS) ... sulfur dioxide emissions of the municipal waste combustion unit ... 4. Carbon Monoxide: 125 percent of the ... emissions of the municipal waste combustion unit, P.S. 2, Method 7E. 3. Sulfur Dioxide: Inlet to control device ...

  11. Metabolomic assessment with CE-MS of the nutraceutical effect of Cystoseira spp extracts in an animal model.

    Science.gov (United States)

    Moraes, Edgar P; Rupérez, Francisco Javier; Plaza, Merichel; Herrero, Miguel; Barbas, Coral

    2011-08-01

    There is a need for scientific evidence of claimed nutraceutical effects, but there is also a social movement towards the use of natural products, and among them algae are seen as rich resources. Within this scenario, the development of methodology for rapid and reliable assessment of markers of efficiency and security of these extracts is necessary. The rat treated with streptozotocin has been proposed as the most appropriate model of systemic oxidative stress for studying antioxidant therapies. Cystoseira is a brown alga containing fucoxanthin and other carotenes whose pressure-assisted extracts were assayed to discover a possible beneficial effect on complications related to diabetes evolution in an acute but short-term model. Urine was selected as the sample and CE-TOF-MS as the analytical technique to obtain the fingerprints in a non-target metabolomic approach. Multivariate data analysis revealed a good clustering of the groups and permitted the putative assignment of compounds statistically significant in the classification. Interestingly, a group of compounds associated with lysine glycation and cleavage from proteins was found to be increased in diabetic animals receiving vehicle as compared to control animals receiving vehicle (N6,N6,N6-trimethyl-L-lysine, N-methylnicotinamide, galactosylhydroxylysine, L-carnitine, N6-acetyl-N6-hydroxylysine, fructose-lysine, pipecolic acid, urocanic acid, amino-isobutanoate and formylisoglutamine). Fructoselysine decreased significantly after the treatment, changing from a 24% increase to a 19% decrease. CE-MS fingerprinting of urine has provided a group of compounds different from those detected with other techniques and therefore demonstrates the necessity of a cross-platform analysis to obtain a broad view of biological samples. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Model-based Software Engineering

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea already works today. But there are still difficulties when it comes to behaviour. Actually, there is no lack of models...

  13. Software Frameworks for Model Composition

    Directory of Open Access Journals (Sweden)

    Mikel D. Petty

    2014-01-01

    Full Text Available A software framework is an architecture or infrastructure intended to enable the integration and interoperation of software components. Specialized types of software frameworks are those specifically intended to support the composition of models or other components within a simulation system. Such frameworks are intended to simplify the process of assembling a complex model or simulation system from simpler component models as well as to promote the reuse of the component models. Several different types of software frameworks for model composition have been designed and implemented; those types include common library, product line architecture, interoperability protocol, object model, formal, and integrative environment. The various framework types have different components, processes for composing models, and intended applications. In this survey the fundamental terms and concepts of software frameworks for model composition are presented, the different types of such frameworks are explained and compared, and important examples of each type are described.

  14. Modeling software systems by domains

    Science.gov (United States)

    Dippolito, Richard; Lee, Kenneth

    1992-01-01

    The Software Architectures Engineering (SAE) Project at the Software Engineering Institute (SEI) has developed engineering modeling techniques that both reduce the complexity of software for domain-specific computer systems and result in systems that are easier to build and maintain. These techniques allow maximum freedom for system developers to apply their domain expertise to software. We have applied these techniques to several types of applications, including training simulators operating in real time, engineering simulators operating in non-real time, and real-time embedded computer systems. Our modeling techniques result in software that mirrors both the complexity of the application and the domain knowledge requirements. We submit that the proper measure of software complexity reflects neither the number of software component units nor the code count, but the locus of and amount of domain knowledge. As a result of using these techniques, domain knowledge is isolated by fields of engineering expertise and removed from the concern of the software engineer. In this paper, we will describe kinds of domain expertise, describe engineering by domains, and provide relevant examples of software developed for simulator applications using the techniques.

  15. Energy modelling software

    CSIR Research Space (South Africa)

    Osburn, L

    2010-01-01

    Full Text Available The construction industry has turned to energy modelling in order to assist it in reducing the amount of energy consumed by buildings. However, while the energy loads of buildings can be accurately modelled, energy models often under...

  16. The art of software modeling

    CERN Document Server

    Lieberman, Benjamin A

    2007-01-01

    Modeling complex systems is a difficult challenge and all too often one in which modelers are left to their own devices. Using a multidisciplinary approach, The Art of Software Modeling covers theory, practice, and presentation in detail. It focuses on the importance of model creation and demonstrates how to create meaningful models. Presenting three self-contained sections, the text examines the background of modeling and frameworks for organizing information. It identifies techniques for researching and capturing client and system information and addresses the challenges of presenting models to specific audiences. Using concepts from art theory and aesthetics, this broad-based approach encompasses software practices, cognitive science, and information presentation. The book also looks at perception and cognition of diagrams, view composition, color theory, and presentation techniques. Providing practical methods for investigating and organizing complex information, The Art of Software Modeling demonstrate...

  17. Mercury CEM Calibration

    Energy Technology Data Exchange (ETDEWEB)

    John F. Schabron; Joseph F. Rovani; Susan S. Sorini

    2007-03-31

    The Clean Air Mercury Rule (CAMR), which was published in the Federal Register on May 18, 2005, requires that calibration of mercury continuous emissions monitors (CEMs) be performed with NIST-traceable standards. Western Research Institute (WRI) is working closely with the Electric Power Research Institute (EPRI), the National Institute of Standards and Technology (NIST), and the Environmental Protection Agency (EPA) to facilitate the development of the experimental criteria for a NIST traceability protocol for dynamic elemental mercury vapor generators. The traceability protocol will be written by EPA. Traceability will be based on the actual analysis of the output of each calibration unit at several concentration levels ranging from about 2-40 µg/m³, and this analysis will be directly traceable to analyses by NIST using isotope dilution inductively coupled plasma/mass spectrometry (ID ICP/MS) through a chain of analyses linking the calibration unit in the power plant to the NIST ID ICP/MS. Prior to this project, NIST did not provide a recommended mercury vapor pressure equation or list mercury vapor pressure in its vapor pressure database. The NIST Physical and Chemical Properties Division in Boulder, Colorado was subcontracted under this project to study the issue in detail and to recommend a mercury vapor pressure equation that the vendors of mercury vapor pressure calibration units can use to calculate the elemental mercury vapor concentration in an equilibrium chamber at a particular temperature. As part of this study, a preliminary evaluation of calibration units from five vendors was made. The work was performed by NIST in Gaithersburg, MD and Joe Rovani from WRI, who traveled to NIST as a Visiting Scientist.
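
    For orientation, the concentration delivered by an equilibrium (saturation) chamber follows from the vapor pressure and the ideal gas law, C = p·M/(R·T); the calibration unit then dilutes this saturated vapor down to the 2-40 µg/m³ working range. The sketch below uses an approximate Clausius-Clapeyron fit for mercury vapor pressure with rough literature-style constants; it is not the NIST-recommended equation referred to in the report.

        # Sketch: elemental mercury vapor concentration in an equilibrium (saturation)
        # chamber from a vapor pressure estimate plus the ideal gas law.  The
        # Clausius-Clapeyron fit below (reference point ~0.171 Pa at 20 C, dHvap ~61.4
        # kJ/mol) uses approximate literature values, NOT the NIST-recommended equation.
        import math

        R = 8.314                      # J/(mol K)
        M_HG = 0.20059                 # kg/mol
        P_REF, T_REF = 0.171, 293.15   # Pa at 20 C (approximate)
        DH_VAP = 61.4e3                # J/mol (approximate)

        def hg_vapor_pressure(t_kelvin):
            return P_REF * math.exp(-DH_VAP / R * (1.0 / t_kelvin - 1.0 / T_REF))

        def hg_saturation_conc_ug_m3(t_kelvin):
            p = hg_vapor_pressure(t_kelvin)
            return p * M_HG / (R * t_kelvin) * 1e9   # kg/m3 -> ug/m3

        for t_c in (10, 20, 30):
            print(f"{t_c} C: ~{hg_saturation_conc_ug_m3(t_c + 273.15):,.0f} ug/m3 saturated Hg vapor")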

  18. Model-Based Software Testing for Object-Oriented Software

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Model-based testing is one of the best solutions for testing object-oriented software. It has a better test coverage than other testing styles. Model-based testing takes into consideration behavioural aspects of a class, which are usually unchecked in other testing methods. An increase in the complexity of software has forced the software industry…

  19. A Quantitative Software Risk Assessment Model

    Science.gov (United States)

    Lee, Alice

    2002-01-01

    This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability. It also discusses the application of the risk model to the software development life cycle.

  20. Model-integrating software components engineering flexible software systems

    CERN Document Server

    Derakhshanmanesh, Mahdi

    2015-01-01

    In his study, Mahdi Derakhshanmanesh builds on the state of the art in modeling by proposing to integrate models into running software at the component level without translating them to code. Such so-called model-integrating software exploits all the advantages of models: models implicitly support a good separation of concerns, they are self-documenting and thus improve understandability and maintainability, and in contrast to model-driven approaches there is no longer a synchronization problem between the models and the code generated from them. Using model-integrating components, software will be

  1. DFI Computer Modeling Software (CMS)

    Energy Technology Data Exchange (ETDEWEB)

    Cazalet, E.G.; Deziel, L.B. Jr.; Haas, S.M.; Martin, T.W.; Nesbitt, D.M.; Phillips, R.L.

    1979-10-01

    The data base management system used to create, edit and store model data and solutions for the LEAP system is described. The software is written entirely in FORTRAN-G for the IBM 370 series of computers and provides an interface to the commercial data base system SYSTEM-2000.

  2. Motorola Secure Software Development Model

    Directory of Open Access Journals (Sweden)

    Francis Mahendran

    2008-08-01

    Full Text Available In today's world, the key to meeting the demand for improved security is to implement repeatable processes that reliably deliver measurably improved security. While many organizations have announced efforts to institutionalize a secure software development process, there is little or no industry acceptance of a common process improvement framework for secure software development. Motorola has taken the initiative to develop such a framework, and plans to share this with the Software Engineering Institute for possible inclusion into its Capability Maturity Model Integration (CMMI®). This paper will go into the details of how Motorola is addressing this issue. The model that is being developed is designed as an extension of the existing CMMI structure. The assumption is that the audience will have a basic understanding of the SEI CMM® / CMMI® process framework. The paper will not describe implementation details of a security process model or improvement framework, but will address WHAT security practices are required for a company with many organizations operating at different maturity levels. It is left to the implementing organization to answer the HOW, WHEN, WHO and WHERE aspects. The paper will discuss how the model is being implemented in the Motorola Software Group.

  3. Experimental study and modeling of gas diffusion through partially water saturated porous media. Application to Vycor glasses, geo-polymers and CEM V cement pastes

    International Nuclear Information System (INIS)

    Boher, C.

    2012-01-01

    This work documents the relationship between the transfer properties of a material (pore size distribution, total porosity accessible to water, water saturation degree) and its diffusion coefficient. To this end, materials having a quasi-monomodal porosity are used: Vycor glasses and geo-polymers. We also use materials having a complex porosity: CEM V cement pastes. The use of Vycor glasses and geo-polymers allows the gas diffusion coefficient to be quantified through materials of known pore size, as a function of their water saturation degree. The use of cement pastes allows checking whether it is possible to decompose the diffusion coefficient of a complex-porosity material into an assembly of diffusion coefficients of quasi-monomodal-porosity materials. To this end, the impact of the pore network arrangement on the diffusion coefficient is studied in great detail. This study is divided into three parts: 1) Measurement of the geometric characteristics of the materials' porous networks by means of mercury intrusion porosimetry, water porosimetry, nitrogen sorption/desorption isotherms, and water desorption tests. 2) Measurement of the materials' diffusion coefficients as a function of their relative humidity storage and their water saturation degree. 3) Modelling the diffusion coefficient of the materials, and studying the impact of the pore network (tortuosity, pore connections). (author) [fr]
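
    A commonly used empirical relation of the kind studied here is the Millington-Quirk expression, which ties the effective gas diffusion coefficient to total porosity and water saturation through the gas-filled porosity. The sketch below evaluates that textbook relation for illustration; it is not the relation fitted in the thesis, and the porosity and saturation values are arbitrary examples.

        # Illustrative only: Millington-Quirk relation between the relative effective
        # gas diffusion coefficient D_eff/D_0, total porosity (phi) and water
        # saturation degree (Sw).  A textbook model, not the thesis's fitted relation.

        def d_eff_ratio(phi, sw):
            """D_eff / D_0 for gas diffusion in a partially water-saturated medium."""
            theta_g = phi * (1.0 - sw)               # gas-filled (free) porosity
            return theta_g ** (10.0 / 3.0) / phi ** 2

        phi = 0.35                                    # example total porosity
        for sw in (0.0, 0.2, 0.5, 0.8, 0.95):
            print(f"Sw = {sw:.2f}  ->  D_eff/D_0 = {d_eff_ratio(phi, sw):.2e}")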

  4. Generic domain models in software engineering

    Science.gov (United States)

    Maiden, Neil

    1992-01-01

    This paper outlines three research directions related to domain-specific software development: (1) reuse of generic models for domain-specific software development; (2) empirical evidence to determine these generic models, namely elicitation of mental knowledge schema possessed by expert software developers; and (3) exploitation of generic domain models to assist modelling of specific applications. It focuses on knowledge acquisition for domain-specific software development, with emphasis on tool support for the most important phases of software development.

  5. In vitro activity of solithromycin and its metabolites, CEM-214 and N-acetyl-CEM-101, against 100 clinical Ureaplasma spp. isolates compared with azithromycin.

    Science.gov (United States)

    Furfaro, Lucy L; Spiller, O Brad; Keelan, Jeffrey A; Payne, Matthew S

    2015-09-01

    There is a strong association between vaginal and/or amniotic fluid Ureaplasma spp. colonisation and risk of preterm birth. The novel fluoroketolide antibiotic solithromycin (CEM-101) is active against Ureaplasma spp. in vitro. Evidence from ex vivo and in vivo models suggests that, unlike most macrolide antibiotics, solithromycin readily crosses the placenta. Solithromycin metabolism varies according to species; in pregnant sheep, the bioactive metabolites CEM-214 and N-acetyl-CEM-101 (NAc-CEM-101) have been shown to accumulate in the amniotic cavity following maternal solithromycin administration, potentially contributing to its antimicrobial effects. To determine the antimicrobial activity of these metabolites against Ureaplasma spp., the effects of solithromycin, CEM-214, NAc-CEM-101 and the comparator azithromycin were tested on a collection of 100 clinical Ureaplasma spp. isolates from the UK and Australia using a modified 96-well broth microdilution method. MIC90 values observed for the combined cohort were: solithromycin, 0.125 mg/L; CEM-214, 0.5 mg/L; NAc-CEM-101, 0.5 mg/L; and azithromycin, 2 mg/L. Solithromycin showed 34-fold greater activity against Ureaplasma spp. isolates than azithromycin, whilst CEM-214 and NAc-CEM-101 possessed ca. 22% and 17% of the activity of solithromycin, respectively, significantly greater than that of azithromycin. One bacterial isolate showed resistance to azithromycin (MIC = 16 mg/L) but had a much lower MIC for solithromycin (MIC = 0.25 mg/L). In conclusion, the metabolites of solithromycin had reduced, but still potent, activity against 100 clinical Ureaplasma spp. isolates in vitro. This may be important in some instances such as pregnancy; however, studies to determine levels of the metabolites in these settings are required. Copyright © 2015 Elsevier B.V. and the International Society of Chemotherapy. All rights reserved.

  6. Software reliability models for critical applications

    Energy Technology Data Exchange (ETDEWEB)

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  7. Software reliability models for critical applications

    Energy Technology Data Exchange (ETDEWEB)

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  8. Understanding software faults and their role in software reliability modeling

    Science.gov (United States)

    Munson, John C.

    1994-01-01

    This study is a direct result of an on-going project to model the reliability of a large real-time control avionics system. In previous modeling efforts with this system, hardware reliability models were applied in modeling the reliability behavior of this system. In an attempt to enhance the performance of the adapted reliability models, certain software attributes were introduced in these models to control for differences between programs and also sequential executions of the same program. As the basic nature of the software attributes that affect software reliability becomes better understood in the modeling process, this information begins to have important implications on the software development process. A significant problem arises when raw attribute measures are to be used in statistical models as predictors, for example, of measures of software quality. This is because many of the metrics are highly correlated. Consider the two attributes: lines of code, LOC, and number of program statements, Stmts. In this case, it is quite obvious that a program with a high value of LOC probably will also have a relatively high value of Stmts. In the case of low level languages, such as assembly language programs, there might be a one-to-one relationship between the statement count and the lines of code. When there is a complete absence of linear relationship among the metrics, they are said to be orthogonal or uncorrelated. Usually the lack of orthogonality is not serious enough to affect a statistical analysis. However, for the purposes of some statistical analysis such as multiple regression, the software metrics are so strongly interrelated that the regression results may be ambiguous and possibly even misleading. Typically, it is difficult to estimate the unique effects of individual software metrics in the regression equation. The estimated values of the coefficients are very sensitive to slight changes in the data and to the addition or deletion of variables in the
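
    The collinearity problem described above is easy to make visible numerically: regress each metric on the others and compute its variance inflation factor (VIF). The sketch below does this on synthetic data in which statement count is deliberately generated as a near-linear function of LOC; the data and metric names are invented for illustration.

        # Sketch: detecting multicollinearity among software metrics (synthetic data).
        # LOC and statement count are generated to be nearly collinear, as the abstract
        # describes; the variance inflation factor (VIF) makes the problem visible.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 200
        loc = rng.integers(50, 2000, size=n).astype(float)
        stmts = 0.8 * loc + rng.normal(0, 20, size=n)        # almost a linear function of LOC
        cyclo = rng.integers(1, 60, size=n).astype(float)     # a less correlated metric
        X = np.column_stack([loc, stmts, cyclo])

        print("correlation matrix:\n", np.round(np.corrcoef(X, rowvar=False), 3))

        def vif(X, j):
            """VIF of column j = 1 / (1 - R^2) from regressing it on the other columns."""
            y = X[:, j]
            others = np.delete(X, j, axis=1)
            A = np.column_stack([np.ones(len(y)), others])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ beta
            r2 = 1.0 - resid.var() / y.var()
            return 1.0 / (1.0 - r2)

        for j, name in enumerate(["LOC", "Stmts", "Cyclomatic"]):
            print(f"VIF({name}) = {vif(X, j):.1f}")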

  9. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  10. Modeling software behavior a craftsman's approach

    CERN Document Server

    Jorgensen, Paul C

    2009-01-01

    A common problem with most texts on requirements specifications is that they emphasize structural models to the near exclusion of behavioral models-focusing on what the software is, rather than what it does. If they do cover behavioral models, the coverage is brief and usually focused on a single model. Modeling Software Behavior: A Craftsman's Approach provides detailed treatment of various models of software behavior that support early analysis, comprehension, and model-based testing. Based on the popular and continually evolving course on requirements specification models taught by the auth

  11. A SYSTEMATIC STUDY OF SOFTWARE QUALITY MODELS

    OpenAIRE

    Dr. Vilas M. Thakare; Ashwin B. Tomar

    2011-01-01

    This paper aims to provide a basis for software quality model research through a systematic study of papers. It identifies nearly seventy software quality research papers from journals and classifies the papers by research topic, estimation approach, study context and data set. The paper's results, combined with other knowledge, provide support for recommendations in future software quality model research: to increase the area of search for relevant studies, carefully select the papers within a set ...

  12. Evaluating software architecture using fuzzy formal models

    Directory of Open Access Journals (Sweden)

    Payman Behbahaninejad

    2012-04-01

    Full Text Available Unified Modeling Language (UML) has been recognized as one of the most popular techniques to describe the static and dynamic aspects of software systems. One of the primary issues in designing software packages is the uncertainty associated with such models. Fuzzy-UML descriptions of software architecture capture both the static and the dynamic perspective simultaneously. Evaluating the software architecture at the design phase always helps to find additional requirements, which helps reduce the cost of design. In this paper, we use a fuzzy data model to describe the static aspects of software architecture and a fuzzy sequence diagram to illustrate its dynamic aspects. We also transform these diagrams into Petri Nets and evaluate the reliability of the architecture. A web-based hotel reservation system is studied for further illustration.

  13. Model-driven software migration a methodology

    CERN Document Server

    Wagner, Christian

    2014-01-01

    Today, reliable software systems are the basis of any business or company. The continuous further development of those systems is the central component of software evolution. It requires a huge amount of time, manpower, and financial resources. The challenges are the size, seniority and heterogeneity of those software systems. Christian Wagner addresses software evolution: the inherent problems and uncertainties in the process. He presents a model-driven method which leads to a synchronization between source code and design. As a result the model layer will be the central part in further e

  14. EPA's Benchmark Dose Modeling Software

    Science.gov (United States)

    The EPA developed the Benchmark Dose Software (BMDS) as a tool to help Agency risk assessors apply benchmark dose (BMD) methods to EPA's human health risk assessment (HHRA) documents. The application of BMD methods overcomes many well-known limitations ...

  15. The Ragnarok Architectural Software Configuration Management Model

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    1999-01-01

    The architecture is the fundamental framework for designing and implementing large scale software, and the ability to trace and control its evolution is essential. However, many traditional software configuration management tools view 'software' merely as a set of files, not as an architecture. This introduces an unfortunate impedance mismatch between the design domain (architecture level) and the configuration management domain (file level). This paper presents a software configuration management model that allows tight version control and configuration management of the architecture of a software system. Essential features of the model have been implemented in a research prototype, Ragnarok. Two years of experience using Ragnarok in three real, small- to medium-sized projects is reported. The conclusion is that the presented model is viable, feels 'natural' for developers, and provides good support...

  16. A flexible modelling software for data acquisition

    International Nuclear Information System (INIS)

    Shu Yantai; Chen Yanhui; Yang Songqi; Liu Genchen

    1992-03-01

    A flexible modelling software for data acquisition is based on an event-driven simulator. It can be used to simulate a wide variety of systems which can be modelled as open queuing networks. The main feature of the software is its flexibility to evaluate the performance of various data acquisition systems, whether pulsed or not. The flexible features of this software are as follows: the user can choose the number of processors in the model and the route which every job takes to move through the model; the service rate of a processor is automatically adapted; the simulator has a pipe-line mechanism; a job can be divided into several segments; and a processor may be used as a compression component, etc. Some modelling techniques and applications of this software in plasma physics laboratories are also presented.
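
    The core mechanism of such an event-driven queueing simulator can be illustrated in a few lines: a priority queue of timestamped events (arrivals and departures) drives the state of a single processor with a FIFO queue. The Python sketch below is a minimal stand-in for that mechanism, not the original software; it models only one processor with made-up arrival and service rates.

        # Minimal event-driven simulation of one processor with a FIFO queue, in the
        # spirit of the open-queuing-network simulator described above (illustrative
        # only; the original software models routes over several processors).
        import heapq, random

        random.seed(1)
        ARRIVAL_RATE, SERVICE_RATE, T_END = 0.8, 1.0, 10_000.0

        events = [(random.expovariate(ARRIVAL_RATE), "arrival")]   # (time, kind) heap
        queue_len, busy, t_busy, served, last_t = 0, False, 0.0, 0, 0.0

        while events:
            t, kind = heapq.heappop(events)
            if t > T_END:
                break
            t_busy += (t - last_t) if busy else 0.0   # accumulate processor busy time
            last_t = t
            if kind == "arrival":
                heapq.heappush(events, (t + random.expovariate(ARRIVAL_RATE), "arrival"))
                if busy:
                    queue_len += 1                    # job waits in the FIFO queue
                else:
                    busy = True
                    heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), "departure"))
            else:  # departure
                served += 1
                if queue_len > 0:
                    queue_len -= 1
                    heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), "departure"))
                else:
                    busy = False

        print(f"jobs served: {served}, processor utilisation: {t_busy / last_t:.2f}")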

  17. Software cost/resource modeling: Software quality tradeoff measurement

    Science.gov (United States)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  18. Impact of Agile Software Development Model on Software Maintainability

    Science.gov (United States)

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burdens tightly budgeted information technology (IT) organizations. Agile software development approach delivers business value early, but implications on software maintainability are still unknown. The purpose of this quantitative study…

  19. Modeling and Selection of Software Service Variants

    OpenAIRE

    Wittern, John Erik

    2015-01-01

    Providers and consumers have to deal with variants, meaning alternative instances of a service's design, implementation, deployment, or operation, when developing or delivering software services. This work presents service feature modeling to deal with associated challenges, comprising a language to represent software service variants and a set of methods for modeling and subsequent variant selection. This work's evaluation includes a POC implementation and two real-life use cases.

  20. SOFTWARE SOLUTIONS FOR ARDL MODELS

    Directory of Open Access Journals (Sweden)

    Nicolae-Marius JULA

    2015-07-01

    Full Text Available VAR-type models can be used only for stationary time series. Causality analyses through econometric models require the series to have the same order of integration. Usually, when constraining the series to comply with these restrictions (e.g. by differencing), the economic interpretation of the outcomes may become difficult. A recent solution for mitigating these problems is the use of ARDL (autoregressive distributed lag) models. We present an implementation of these models in E-Views and test the impact of the exchange rate on the consumer price index.
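
    In its simplest form an ARDL(1,1) specification regresses the dependent variable on its own first lag and on the current and lagged regressor, y_t = c + a·y_{t-1} + b0·x_t + b1·x_{t-1} + e_t, with long-run effect (b0 + b1)/(1 − a). The sketch below estimates such a model by OLS on synthetic data in plain Python; it only illustrates the model class and is not the E-Views implementation used in the paper.

        # Sketch: estimating an ARDL(1,1) model
        #   y_t = c + a*y_{t-1} + b0*x_t + b1*x_{t-1} + e_t
        # by OLS on synthetic data (numpy only; the paper itself uses E-Views).
        import numpy as np

        rng = np.random.default_rng(42)
        n = 300
        x = np.cumsum(rng.normal(size=n)) * 0.1 + rng.normal(size=n)   # exchange-rate-like series
        y = np.empty(n)
        y[0] = 0.0
        for t in range(1, n):                                          # true ARDL(1,1) process
            y[t] = 0.5 + 0.6 * y[t - 1] + 0.3 * x[t] + 0.1 * x[t - 1] + rng.normal(scale=0.2)

        Y = y[1:]
        X = np.column_stack([np.ones(n - 1), y[:-1], x[1:], x[:-1]])
        coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
        for name, c in zip(["const", "y(-1)", "x", "x(-1)"], coef):
            print(f"{name:6s} {c: .3f}")
        # Long-run multiplier of x on y: (b0 + b1) / (1 - a)
        print("long-run effect:", (coef[2] + coef[3]) / (1 - coef[1]))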

  1. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  2. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

    Full Text Available The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
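
    The rule-matching idea behind identification and mitigation trees can be illustrated with a toy example: predicates evaluated over the elements of a data flow diagram, each paired with a suggested mitigation. The sketch below is an invented miniature, not AutSEC code; element attributes, rules and mitigations are all assumptions made for illustration.

        # Toy illustration (not AutSEC code): matching simple threat-identification
        # rules against elements of a data flow diagram.  Rules and names are invented.

        dfd = [
            {"type": "data_flow", "name": "login", "crosses_trust_boundary": True, "encrypted": False},
            {"type": "data_store", "name": "user_db", "contains_credentials": True, "hashed": False},
        ]

        rules = [
            ("Information disclosure: unencrypted flow across trust boundary",
             lambda e: e["type"] == "data_flow" and e.get("crosses_trust_boundary") and not e.get("encrypted"),
             "Use TLS on this flow"),
            ("Credential theft: plaintext credentials at rest",
             lambda e: e["type"] == "data_store" and e.get("contains_credentials") and not e.get("hashed"),
             "Store salted password hashes"),
        ]

        for element in dfd:
            for threat, matches, mitigation in rules:
                if matches(element):
                    print(f"[{element['name']}] {threat} -> mitigation: {mitigation}")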

  3. Management models in the NZ software industry

    Directory of Open Access Journals (Sweden)

    Holger Spill

    Full Text Available This research interviewed eight innovative New Zealand software companies to find out how they manage new product development. It looked at how management used standard techniques of software development to manage product uncertainty through the theoretical lens of the Cyclic Innovation Model. The study found that while there is considerable variation, the management of innovation was largely determined by the level of complexity. Organizations with complex innovative software products had a more iterative software development style, more flexible internal processes and swifter decision-making. Organizations with less complexity in their products tended to use more formal structured approaches. Overall complexity could be inferred with reference to four key factors within the development environment.

  4. inventory management, VMI, software agents, MDV model

    Directory of Open Access Journals (Sweden)

    Waldemar Wieczerzycki

    2012-03-01

    Full Text Available Background: As is well known, the implementation of logistics management instruments is only possible with the use of the latest information technology. So-called agent technology is one of the most promising solutions in this area. Its essence consists in an entirely new way of distributing software on the computer network platform, in which computers exchange among themselves not only data but also software modules, called agents. The first aim is to propose an alternative method for implementing the concept of inventory management by the supplier, with the use of intelligent software agents, which are able not only to transfer information but also to make autonomous decisions based on the privileges given to them. The second aim of this research was to propose a new model of a software agent that offers both high mobility and high intelligence. Methods: After a brief discussion of the nature of agent technology, the most important benefits of using it to build platforms to support business are given. Then the original model of a polymorphic software agent, called the Multi-Dimensionally Versioned Software Agent (MDV), is presented, which is oriented towards the specifics of IT applications in business. An MDV agent is polymorphic, which allows only the most relevant parts of its code to be transmitted through the network, and only when necessary. Consequently, the network nodes exchange small amounts of software code, which ensures high mobility of software agents, and thus highly efficient operation of IT platforms built on the proposed model. Next, the adaptation of MDV software agents to the implementation of a well-known logistics management instrument - VMI (Vendor Managed Inventory) - is illustrated. Results: The key benefits of this approach are identified, among which one can distinguish: reduced costs, higher flexibility and efficiency, new functionality - especially addressed to business negotiation, full automation

  5. Bioinactivation: Software for modelling dynamic microbial inactivation.

    Science.gov (United States)

    Garre, Alberto; Fernández, Pablo S; Lindqvist, Roland; Egea, Jose A

    2017-03-01

    This contribution presents the bioinactivation software, which implements functions for the modelling of isothermal and non-isothermal microbial inactivation. This software offers features such as user-friendliness, modelling of dynamic conditions, possibility to choose the fitting algorithm and generation of prediction intervals. The software is offered in two different formats: Bioinactivation core and Bioinactivation SE. Bioinactivation core is a package for the R programming language, which includes features for the generation of predictions and for the fitting of models to inactivation experiments using non-linear regression or a Markov Chain Monte Carlo algorithm (MCMC). The calculations are based on inactivation models common in academia and industry (Bigelow, Peleg, Mafart and Geeraerd). Bioinactivation SE supplies a user-friendly interface to selected functions of Bioinactivation core, namely the model fitting of non-isothermal experiments and the generation of prediction intervals. The capabilities of bioinactivation are presented in this paper through a case study, modelling the non-isothermal inactivation of Bacillus sporothermodurans. This study has provided a full characterization of the response of the bacteria to dynamic temperature conditions, including confidence intervals for the model parameters and a prediction interval of the survivor curve. We conclude that the MCMC algorithm produces a better characterization of the biological uncertainty and variability than non-linear regression. The bioinactivation software can be relevant to the food and pharmaceutical industry, as well as to regulatory agencies, as part of a (quantitative) microbial risk assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.
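
    As a minimal illustration of the kind of model bioinactivation fits, the sketch below fits the Bigelow (log-linear) model, log10 N(t) = log10 N0 − t/D, to synthetic isothermal survivor data by non-linear least squares. bioinactivation itself is an R package; this Python sketch merely shows the model on invented data and is not the package's API.

        # Sketch: fitting the Bigelow (log-linear) inactivation model
        #   log10 N(t) = log10 N0 - t / D
        # to synthetic isothermal survivor data with non-linear least squares.
        import numpy as np
        from scipy.optimize import curve_fit

        def bigelow(t, log_n0, d_value):
            return log_n0 - t / d_value

        t = np.array([0, 2, 4, 6, 8, 10, 12], dtype=float)          # minutes
        log_n = np.array([6.0, 5.1, 4.2, 3.2, 2.4, 1.5, 0.6])       # log10 CFU/mL (synthetic)
        popt, pcov = curve_fit(bigelow, t, log_n, p0=[6.0, 2.0])
        perr = np.sqrt(np.diag(pcov))

        print(f"log10 N0 = {popt[0]:.2f} +/- {perr[0]:.2f}")
        print(f"D-value  = {popt[1]:.2f} +/- {perr[1]:.2f} min")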

  6. Testing Software Development Project Productivity Model

    Science.gov (United States)

    Lipkin, Ilya

    Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available that can guide an organization in a quest for software development, with existing estimation models often underestimating software development efforts as much as 500 to 600 percent. To address this issue, existing models usually are calibrated using local data with a small sample size, with resulting estimates not offering improved cost analysis. This study presents a conceptual model for accurately estimating software development, based on an extensive literature review and theoretical analysis based on Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. Practical implications of this study allow for practitioners to concentrate on specific constructs of interest that provide the best value for the least amount of time. This study outlines key contributing constructs that are unique for Software Size E-SLOC, Man-hours Spent, and Quality of the Product, those constructs having the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for the customers and suppliers. Theoretical contributions of this study provide an initial theory-based hypothesized project productivity model that can be used as a generic overall model across several application domains such as IT, Command and Control

  7. Architecture design in global and model-centric software development

    NARCIS (Netherlands)

    Heijstek, Werner

    2012-01-01

    This doctoral dissertation describes a series of empirical investigations into representation, dissemination and coordination of software architecture design in the context of global software development. A particular focus is placed on model-centric and model-driven software development.

  8. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed format Fortran.

  9. Model Driven Software Development for Agricultural Robotics

    DEFF Research Database (Denmark)

    Larsen, Morten

    The design and development of agricultural robots consists of mechanical, electrical and software components. All these components must be designed and combined such that the overall goal of the robot is fulfilled. The design and development of these systems require collaboration between multiple engineering disciplines. To this end, architectural specifications can serve as means for communication between different engineering disciplines. Such specifications aid in establishing the interface between the different components, belonging to different domains such as image processing, control engineering, etc. This thesis proposes a Model-Driven Software Development based approach to model, analyse and partially generate the software implementation of an agricultural robot. Furthermore, guidelines for modelling the architecture of agricultural robots are provided.

  10. Software Engineering Laboratory (SEL) cleanroom process model

    Science.gov (United States)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where IC's are assembled in dust free 'clean rooms' to prevent the destructive effects of dust. When applying the clean room methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  11. A Model for Assessing the Liability of Seemingly Correct Software

    Science.gov (United States)

    Voas, Jeffrey M.; Voas, Larry K.; Miller, Keith W.

    1991-01-01

    Current research on software reliability does not lend itself to quantitatively assessing the risk posed by a piece of life-critical software. Black-box software reliability models are too general and make too many assumptions to be applied confidently to assessing the risk of life-critical software. We present a model for assessing the risk caused by a piece of software; this model combines software testing results and Hamlet's probable correctness model. We show how this model can assess software risk for those who insure against a loss that can occur if life-critical software fails.
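
    Hamlet's probable-correctness argument supplies the quantitative core of such a model: after N failure-free tests drawn from the operational profile, the confidence that the failure probability is below a bound theta is C = 1 − (1 − theta)^N. The sketch below evaluates this bound and its inverse (tests needed for a target confidence); it shows only the standard bound, not the paper's full risk model.

        # Sketch of Hamlet's probable-correctness bound, which the model above combines
        # with testing results: after N failure-free random tests, confidence that the
        # failure probability is below theta is  C = 1 - (1 - theta)^N.
        import math

        def confidence(n_tests, theta):
            return 1.0 - (1.0 - theta) ** n_tests

        def tests_needed(confidence_target, theta):
            return math.ceil(math.log(1.0 - confidence_target) / math.log(1.0 - theta))

        print(confidence(10_000, 1e-3))            # ~0.99995 confidence failure prob < 1e-3
        print(tests_needed(0.99, 1e-4))            # ~46,050 failure-free tests required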

  12. Memoised Garbage Collection for Software Model Checking

    NARCIS (Netherlands)

    Nguyen, V.Y.; Ruys, T.C.; Kowalewski, S.; Philippou, A.

    Virtual machine based software model checkers like JPF and MoonWalker spend up to half of their verification time on garbage collection. This is no surprise, as after nearly each transition the heap has to be cleaned of garbage. To improve this, this paper presents the Memoised Garbage Collection algorithm.

  13. Evaluating predictive models of software quality

    International Nuclear Information System (INIS)

    Ciaschini, V; Canaparo, M; Ronchieri, E; Salomoni, D

    2014-01-01

    Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to only deliver software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent to discover the risk factor of each product and compare it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and finally we concluded by suggesting directions for further studies.

  14. A Descriptive Evaluation of Software Sizing Models

    Science.gov (United States)

    1987-09-01

    ... compensate for a lack of understanding of a software job to be done. 1.3 REPORT OUTLINE: The guiding principle for model selection for this paper was ... Model size estimates for the CAiSS sensitivity model (SLOC): ESD 37,600+; SPQR 35,910; BYL 22,402; PRICE SZ 21,410; ASSET-R 11,943; SSM 11,700.

  15. Analysis of software for modeling atmospheric dispersion

    International Nuclear Information System (INIS)

    Grandamas, O.; Hubert, Ph.; Pages, P.

    1989-09-01

    During the last few years, a number of software packages for microcomputers have appeared with the aim of simulating the diffusion of atmospheric pollutants. These codes, which simplify the models used for safety analyses of industrial plants, are becoming more useful and are even used for post-accident conditions. The report presents, for the first time in a critical manner, the principal models available to date. The problem lies in adapting the models to the demands of post-accident interventions. In parallel with this, an analysis of performance was carried out, that is, identifying the need to forecast the most appropriate actions to be performed, bearing in mind the short available time and the lack of information. Because of these difficulties, it is possible to simplify the software so that it does not include all the options but can deal with a specific situation. This would minimise the data to be collected on the site. [fr]

  16. Herramientas libres para modelar software / Free tools to model software

    Directory of Open Access Journals (Sweden)

    Mauro Callejas Cuervo; Óscar Yovany Baquero Moreno

    2010-11-01

    Full Text Available An observation on free software and its implications for software development processes with 4G tools, carried out by entities or individuals without astronomical capital and without the monopolistic mentality of dominating the market with expensive products that make their vendors multimillionaires yet offer no real guarantee, nor even the possibility of knowing the software that has been paid for, much less of modifying it if it does not meet our expectations.

  17. Artificial Intelligence Software Engineering (AISE) model

    Science.gov (United States)

    Kiss, Peter A.

    1990-01-01

    The American Institute of Aeronautics and Astronautics has initiated a committee on standards for Artificial Intelligence. Presented are the initial efforts of one of the working groups of that committee. A candidate model is presented for the development life cycle of knowledge based systems (KBSs). The intent is for the model to be used by the aerospace community and eventually be evolved into a standard. The model is rooted in the evolutionary model, borrows from the spiral model, and is embedded in the standard Waterfall model for software development. Its intent is to satisfy the development of both stand-alone and embedded KBSs. The phases of the life cycle are shown and detailed as are the review points that constitute the key milestones throughout the development process. The applicability and strengths of the model are discussed along with areas needing further development and refinement by the aerospace community.

  18. SOFTWARE DEVELOPMENT MODEL FOR ETHNOBILINGUAL DICTIONARIES

    Directory of Open Access Journals (Sweden)

    Melchora Morales-Sánchez

    2010-09-01

    Full Text Available An integral software development model for a dictionary that stores and retrieves textual and visual content and, most importantly, incorporates audio of the oral language, taking into account both the characterization of the indigenous cultural reality and the technical aspects of software construction. The model consists of the following phases: context description, lexicographic design, computer and multimedia design, and construction and testing of the application. There is no doubt about the influence of the contact between the Spanish language and the variety of languages spoken throughout Latin America, which has produced the most diverse and extensive communications. As a result, communities are interested in preserving their mother tongue so that people identify with their own roots and transmit this legacy to the next generations. The model is designed to develop dictionary software under conditions typical of the indigenous reality: low budget, operation on computers with limited resources, and human resources with minimal training. It is exemplified with the development of a dictionary of Spanish and the Chatino spoken in the town of Santos Reyes Nopala, Oaxaca, in the coastal region of Mexico.

  19. Possibilities and Limitations of Applying Software Reliability Growth Models to Safety- Critical Software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Jang, Seung Cheol; Ha, Jae Joo

    2006-01-01

    As digital systems are gradually introduced to nuclear power plants (NPPs), the need to quantitatively analyze the reliability of the digital systems is also increasing. Kang and Sung identified (1) software reliability, (2) common-cause failures (CCFs), and (3) fault coverage as the three most critical factors in the reliability analysis of digital systems. For the reliability estimation of safety-critical software (the software that is used in safety-critical digital systems), Bayesian Belief Networks (BBNs) seem to be the most widely used approach. The use of BBNs in reliability estimation of safety-critical software is basically a process of indirectly assigning a reliability based on various observed information and experts' opinions. When software testing results or software failure histories are available, we can use a process of directly estimating the reliability of the software using various software reliability growth models such as the Jelinski-Moranda model and Goel-Okumoto's nonhomogeneous Poisson process (NHPP) model. Even though it is generally known that software reliability growth models cannot be applied to safety-critical software due to the small number of failures expected from the testing of safety-critical software, we try to find the possibilities and corresponding limitations of applying software reliability growth models to safety-critical software.
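    For reference, the two growth models named above have simple standard forms (textbook versions, quoted here only to illustrate what would be fitted to failure data, not taken from the record itself):

        Jelinski-Moranda hazard rate after (i-1) faults have been removed:
            z(t_i) = \phi \, (N - i + 1)
        Goel-Okumoto NHPP mean value function and failure intensity:
            m(t) = a \, (1 - e^{-b t}), \qquad \lambda(t) = a b \, e^{-b t}

    where N is the initial fault content, \phi the per-fault hazard contribution, a the expected total number of faults, and b the fault detection rate; the small number of failures expected from safety-critical software is precisely what makes estimating these parameters difficult, as the record discusses.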

  20. Saphire models and software for ASP evaluations

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    The Idaho National Engineering Laboratory (INEL) has, over the past three years, created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) a user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented.

  1. Integration of Simulink Models with Component-based Software Models

    Directory of Open Access Journals (Sweden)

    MARIAN, N.

    2008-06-01

    Full Text Available Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, plant and controller behavior is simulated using graphical blocks to represent mathematical and logical constructs and process flow, then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behavior as a means of computation, communication and constraints, using computational blocks and aggregates for both discrete and continuous behavior, different interconnection and execution disciplines for event-based and time-based controllers, and so on, to encompass the demands for more functionality, at even lower prices, and with opposing constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of the Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has to be analyzed. One way of doing that is to integrate in wrapper files the model back into Simulink S-functions, and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behavior, and the transformation of the software system into the S-functions.

  2. Agile Maturity Model (AMM): A Software Process Improvement framework for Agile Software Development Practices

    OpenAIRE

    Chetankumar Patel; Muthu Ramachandran

    2009-01-01

    Agile software development methodologies have introduced best practices into software development. However, we need to adopt and monitor those practices continuously to maximize their benefits. Our research has focused on adaptability, suitability and a software maturity model called the Agile Maturity Model (AMM) for agile software development environments. This paper introduces a process of adaptability assessment, suitability assessment, and an improvement framework for assessing and improving agile best practices.

  3. Reaction Wheel Disturbance Model Extraction Software - RWDMES

    Science.gov (United States)

    Blaurock, Carl

    2009-01-01

    The RWDMES is a tool for modeling the disturbances imparted on spacecraft by spinning reaction wheels. Reaction wheels are usually the largest disturbance source on a precision pointing spacecraft, and can be the dominating source of pointing error. Accurate knowledge of the disturbance environment is critical to accurate prediction of the pointing performance. In the past, it has been difficult to extract an accurate wheel disturbance model since the forcing mechanisms are difficult to model physically, and the forcing amplitudes are filtered by the dynamics of the reaction wheel. RWDMES captures the wheel-induced disturbances using a hybrid physical/empirical model that is extracted directly from measured forcing data. The empirical models capture the tonal forces that occur at harmonics of the spin rate, and the broadband forces that arise from random effects. The empirical forcing functions are filtered by a physical model of the wheel structure that includes spin-rate-dependent moments (gyroscopic terms). The resulting hybrid model creates a highly accurate prediction of wheel-induced forces. It accounts for variation in disturbance frequency, as well as the shifts in structural amplification by the whirl modes, as the spin rate changes. This software provides a point-and-click environment for producing accurate models with minimal user effort. Where conventional approaches may take weeks to produce a model of variable quality, RWDMES can create a demonstrably high accuracy model in two hours. The software consists of a graphical user interface (GUI) that enables the user to specify all analysis parameters, to evaluate analysis results and to iteratively refine the model. Underlying algorithms automatically extract disturbance harmonics, initialize and tune harmonic models, and initialize and tune broadband noise models. The component steps are described in the RWDMES user's guide and include: converting time domain data to waterfall PSDs (power spectral densities) ...
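    As a rough illustration of the tonal part of such a hybrid model, the sketch below sums harmonic forces whose amplitudes scale with the square of the wheel spin rate, a common empirical assumption; the harmonic numbers and coefficients are made-up placeholders, not RWDMES outputs or its API.

        import numpy as np

        def tonal_disturbance(t, omega_hz, harmonics, coeffs, phases):
            """Sum of tonal forces at harmonics of the wheel spin rate.

            t         : array of times [s]
            omega_hz  : wheel spin rate [rev/s]
            harmonics : harmonic numbers h_i (1.0 = once per revolution)
            coeffs    : amplitude coefficients C_i [N / (rev/s)^2]
            phases    : phase offsets [rad]
            """
            force = np.zeros_like(t)
            for h, c, p in zip(harmonics, coeffs, phases):
                # Amplitude grows with the square of spin rate; frequency is h * omega.
                force += c * omega_hz**2 * np.sin(2 * np.pi * h * omega_hz * t + p)
            return force

        # Example: fundamental (static imbalance) plus two higher harmonics.
        t = np.linspace(0.0, 1.0, 2000)
        f = tonal_disturbance(t, omega_hz=30.0, harmonics=[1.0, 2.0, 5.2],
                              coeffs=[1e-4, 2e-5, 5e-6], phases=[0.0, 0.3, 1.1])

    In the tool described above, such tonal terms would additionally be filtered through a structural model of the wheel (gyroscopic and whirl-mode effects), which this sketch omits.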

  4. Process model for building quality software on internet time ...

    African Journals Online (AJOL)

    The competitive nature of the software construction market and the inherently exhilarating nature of software itself have hinged the success of any software development project on four major pillars: time to market, product quality, innovation and documentation. Unfortunately, however, existing software development models ...

  5. Development of a software for the curimeter model cdn102

    International Nuclear Information System (INIS)

    Dotres Llera, Armando

    2001-01-01

    The characteristics of the software for the Curimeter Model CD-N102 developed at CEADEN are presented. The software consists of two main parts: basic software for the electrometer block and application software for a PC. The basic software is totally independent of the PC and performs all the basic functions of the measurement process. The application software is optional and offers a friendlier interface and additional options to the user. Among these is the possibility to keep a statistical record of the measurements in a database, to create labels, and to introduce new isotopes and calibrate them. A more detailed explanation of both software components is given.

  6. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    Software ecosystems can be characterized as the software development and distribution by a set of actors dependent on each other and the ecosystem. We commence on the hypothesis that the establishment of a software ecosystem on the telemedicine services of Denmark would address these issues, and investigate how a software ecosystem can foster the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) a strong social network of actors, and (iv) robust business ...

  7. Characterization of tin phosphate coatings by CEMS

    International Nuclear Information System (INIS)

    Nomura, Kiyoshi; Ujihira, Yusuke; Takai, Osamu; Kojima, Ryuji

    1992-01-01

    The structure and chemical state of tin in converted tin phosphate coatings, obtained by a treatment of Zn and Mn phosphate in SnCl2 solution, were characterized by CEMS. Converted Sn(II) phosphate and adsorbed SnO2 species were the main products in the top ≈1/3 of the Mn and Zn phosphate coatings, and metallic tin was occasionally recognized in deeper layers. Tin phosphate layers, coated directly on a steel substrate by RF sputtering of Ar ions, were composed of two kinds of Sn(IV) species. (orig.)

  8. Software Power Metric Model: An Implementation | Akwukwuma ...

    African Journals Online (AJOL)

    ... and the execution time (TIME) in each case was recorded. We then obtained the application function point count. Our results show that the proposed metric is computable, consistent in its use of units, and is programming language independent. Keywords: Software attributes, Software power, measurement, Software metric, ...

  9. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

    This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while enabling stakeholders to assess its performance.
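    Since the procedure judges prediction quality rather than algorithms, typical goodness-of-fit statistics for baseline models can be computed from measured and predicted energy use. The sketch below uses CV(RMSE) and NMBE, which are common in this context; it is a generic illustration, not the report's exact metric definitions or test protocol.

        import numpy as np

        def baseline_accuracy_metrics(measured, predicted):
            """Return CV(RMSE) and NMBE (both in percent) for a baseline model's predictions."""
            measured = np.asarray(measured, dtype=float)
            predicted = np.asarray(predicted, dtype=float)
            resid = measured - predicted
            mean = measured.mean()
            cv_rmse = 100.0 * np.sqrt(np.mean(resid**2)) / mean   # scatter of the errors
            nmbe = 100.0 * resid.sum() / (len(measured) * mean)   # overall bias of the model
            return cv_rmse, nmbe

        # Example with hypothetical monthly energy use (kWh):
        cv, bias = baseline_accuracy_metrics([120, 135, 150, 160], [118, 140, 149, 158])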

  10. Radiological modeling software for underground uranium mines

    International Nuclear Information System (INIS)

    Bjorndal, B.; Moridi, R.

    1999-01-01

    The Canadian Institute for Radiation Safety (CAIRS) has developed computer simulation software for modeling radiological parameters in underground uranium mines. The computer program, called 3d RAD, allows radiation protection professionals and mine ventilation engineers to quickly simulate radon and radon progeny activity concentrations and potential alpha energy concentrations in complex mine networks. The simulation component of 3d RAD, called RSOLVER, is an adaptation of an existing modeling program called VENTRAD, originally developed at Queen's University, Ontario. Based on user defined radiation source terms and network physical properties, radiological parameters in the network are calculated iteratively by solving Bateman's Equations in differential form. The 3d RAD user interface was designed in cooperation with the Canada Centre for Mineral and Energy Technology (CANMET) to improve program functionality and to make 3d RAD compatible with the CANMET ventilation simulation program, 3d CANVENT. The 3d RAD program was tested using physical data collected in Canadian uranium mines. 3d RAD predictions were found to agree well with theoretical calculations and simulation results obtained from other modeling programs such as VENTRAD. Agreement with measured radon and radon progeny levels was also observed. However, the level of agreement was found to depend heavily on the precision of source term data, and on the measurement protocol used to collect radon and radon progeny levels for comparison with the simulation results. The design and development of 3d RAD was carried out under contract with the Saskatchewan government
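    A minimal sketch of the underlying physics is a Bateman-type decay chain in a single ventilated airway, stepped forward in time; the ventilation removal rate, source term and time step below are illustrative placeholders and the decay constants are approximate, so this is not 3d RAD or its solver.

        import numpy as np

        # Approximate decay constants [1/s] for Rn-222 and its short-lived progeny.
        LAMBDA = {"Rn-222": 2.1e-6, "Po-218": 3.8e-3, "Pb-214": 4.3e-4, "Bi-214": 5.9e-4}

        def progeny_concentrations(source, ventilation_rate, dt=1.0, steps=3600):
            """Integrate dC_i/dt = S_i + lam_{i-1} C_{i-1} - (lam_i + v) C_i with explicit Euler steps."""
            names = list(LAMBDA)
            lam = np.array([LAMBDA[n] for n in names])
            c = np.zeros(len(names))
            for _ in range(steps):
                dc = -(lam + ventilation_rate) * c
                dc[0] += source                     # radon emanation into the airway
                dc[1:] += lam[:-1] * c[:-1]         # ingrowth from the parent nuclide
                c += dt * dc
            return dict(zip(names, c))

        conc = progeny_concentrations(source=1.0, ventilation_rate=1e-3)

    A network code like the one described above couples many such airways through the ventilation flows and solves the resulting system iteratively; this sketch treats a single, well-mixed branch only.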

  11. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    This thesis investigates how a software ecosystem can foster the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) a strong social network of actors, and (iv) robust business ... This thesis documents the groundwork towards addressing the challenges faced by telemedical technologies today and establishing telemedicine as a means of patient diagnosis and treatment. Furthermore, it serves as an empirical example of designing a software ecosystem.

  12. Model-driven dependability assessment of software systems

    CERN Document Server

    Bernardi, Simona; Petriu, Dorina C

    2013-01-01

    In this book, the authors present cutting-edge model-driven techniques for modeling and analysis of software dependability. Most of them are based on the use of UML as software specification language. From the software system specification point of view, such techniques exploit the standard extension mechanisms of UML (i.e., UML profiling). UML profiles enable software engineers to add non-functional properties to the software model, in addition to the functional ones. The authors detail the state of the art on UML profile proposals for dependability specification and rigorously describe the t

  13. Example of software configuration management model

    International Nuclear Information System (INIS)

    Roth, P.

    2006-01-01

    Software configuration management is the mechanism used to track and control software changes and may include the following actions: a tracking system should be established for any changes made to the existing software configuration. The requirements of the configuration management system are the following: - back up the different software configurations; - record the details (the date, the subject, the filenames, the supporting documents, the tests, ...) of the changes introduced in the new configuration; - document all the differences between the different versions. Configuration management allows simultaneous exploitation of one specific version and development of the next version. Minor corrections can be performed in the current exploitation version.

  14. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae; Top, Søren

    2008-01-01

    Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics of abstract system descriptions. Constructing a MATLAB/Simulink model, plant and controller behavior is simulated using graphical blocks to represent mathematical and logical constructs and process flow, then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behaviour as a means of computation, communication and constraints. Once specified, the software model has to be analyzed. One way of doing that is to integrate in wrapper files the model back into Simulink S-functions, and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set of MATLAB/Simulink blocks to COMDES software components.

  15. Traceability for Model Driven, Software Product Line Engineering

    NARCIS (Netherlands)

    Anquetil, N.; Grammel, B.; Galvao, I.; Noppen, J.A.R.; Shakil Khan, S.; Arboleda, H.; Rashid, A.; Garcia, A.

    Traceability is an important challenge for software organizations. This is true for traditional software development and even more so in new approaches that introduce more variety of artefacts such as Model Driven development or Software Product Lines. In this paper we look at some aspect of the

  16. An improved COCOMO software cost estimation model | Duke ...

    African Journals Online (AJOL)

    In this paper, we discuss the methodologies adopted previously in software cost estimation using the COnstructive COst MOdels (COCOMOs). From our analysis, COCOMOs produce very high software development efforts, which eventually produce high software development costs. Consequently, we propose its extension, ...

  17. A multi-layered software architecture model for building software solutions in an urbanized information system

    Directory of Open Access Journals (Sweden)

    Sana Guetat

    2013-01-01

    Full Text Available The concept of Information Systems urbanization has been proposed since the late 1990’s in order to help organizations building agile information systems. Nevertheless, despite the advantages of this concept, it remains too descriptive and presents many weaknesses. In particular, there is a lack of useful architecture models dedicated to defining software solutions compliant with information systems urbanization principles and rules. Moreover, well-known software architecture models do not provide sufficient resources to address the requirements and constraints of urbanized information systems. In this paper, we draw on the “information city” framework to propose a model of software architecture - called the 5+1 Software Architecture Model - which is compliant with information systems urbanization principles and helps organizations building urbanized software solutions. This framework improves the well-established software architecture models and allows the integration of new architectural paradigms. Furthermore, the proposed model contributes to the implementation of information systems urbanization in several ways. On the one hand, this model devotes a specific layer to applications integration and software reuse. On the other hand, it contributes to the information system agility and scalability due to its conformity to the separation of concerns principle.

  18. Model-driven and software product line engineering

    CERN Document Server

    Royer, Jean-Claude

    2013-01-01

    Many approaches to creating Software Product Lines have emerged that are based on Model-Driven Engineering. This book introduces both Software Product Lines and Model-Driven Engineering, which have separate success stories in industry, and focuses on the practical combination of them. It describes the challenges and benefits of merging these two software development trends and provides the reader with a novel approach and practical mechanisms to improve software development productivity.The book is aimed at engineers and students who wish to understand and apply software product lines

  19. Software reliability growth models with normal failure time distributions

    International Nuclear Information System (INIS)

    Okamura, Hiroyuki; Dohi, Tadashi; Osaki, Shunji

    2013-01-01

    This paper proposes software reliability growth models (SRGMs) where the software failure time follows a normal distribution. The proposed model is mathematically tractable and has sufficient ability to fit software failure data. In particular, we consider the parameter estimation algorithm for the SRGM with normal distribution. The developed algorithm is based on an EM (expectation-maximization) algorithm and is quite simple to implement as a software application. A numerical experiment is devoted to investigating the fitting ability of the SRGMs with normal distribution through 16 types of failure time data collected in real software projects.
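    One plausible form of such a model (an illustrative sketch; the paper's exact parameterization may differ) is an NHPP whose mean value function is a scaled normal distribution function:

        m(t) = \omega \, \Phi\!\left(\frac{t - \mu}{\sigma}\right),
        \qquad
        \lambda(t) = \frac{\omega}{\sigma} \, \phi\!\left(\frac{t - \mu}{\sigma}\right)

    where \omega is the expected total number of faults, \Phi and \phi are the standard normal CDF and PDF, and (\omega, \mu, \sigma) would be estimated from observed failure times, for example with an EM algorithm as the abstract indicates.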

  20. Models for composing software : an analysis of software composition and objects

    NARCIS (Netherlands)

    Bergmans, Lodewijk

    1999-01-01

    In this report, we investigate component-based software construction with a focus on composition. In particular we try to analyze the requirements and issues for components and software composition. As a means to understand this research area, we introduce a canonical model for representing

  1. dMODELS: A software package for modeling volcanic deformation

    Science.gov (United States)

    Battaglia, Maurizio

    2017-04-01

    dMODELS is a software package that includes the most common source models used to interpret deformation measurements near active volcanic centers. The emphasis is on estimating the parameters of analytical models of deformation by inverting data from the Global Positioning System (GPS), Interferometric Synthetic Aperture Radar (InSAR), tiltmeters and strainmeters. Source models include: (a) pressurized spherical, ellipsoidal and sill-like magma chambers in an elastic, homogeneous, flat half-space; (b) pressurized spherical magma chambers with topography corrections; and (c) the solutions for a dislocation (fracture) in an elastic, homogeneous, flat half-space. All of the equations have been extended to include deformation and strain within the Earth's crust (as opposed to only at the Earth's surface) and verified against finite element models. Although actual volcanic sources are not embedded cavities of simple shape, we assume that these models may reproduce the stress field created by the actual magma intrusion or hydrothermal fluid injection. The dMODELS software employs a nonlinear inversion algorithm to determine the best-fit parameters for the deformation source by searching for the minimum of the cost function χ²ν (chi-square per degree of freedom). The non-linear inversion algorithm is a combination of local optimization (interior-point method) and random search. This approach is more efficient for hyper-parameter optimization than trials on a grid. The software has been developed using MATLAB, but compiled versions that can be run using the free MATLAB Compiler Runtime (MCR) module are available for Windows 64-bit operating systems. The MATLAB scripts and compiled files are open source and intended for teaching and research. The software package includes both functions for forward modeling and scripts for data inversion. A software demonstration will be available during the meeting. You are welcome to contact the author at mbattaglia@usgs.gov for ...
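    A minimal sketch of the inversion strategy described above (a reduced chi-square cost minimized by random starting points plus local refinement) is given below; the forward model, parameter bounds and optimizer choice are placeholders, not dMODELS code, and the package itself uses an interior-point method in MATLAB rather than this Python routine.

        import numpy as np
        from scipy.optimize import minimize

        def chi2_nu(params, forward_model, data, sigma, n_free):
            """Reduced chi-square between observed and modeled deformation."""
            resid = (data - forward_model(params)) / sigma
            return np.sum(resid**2) / max(len(data) - n_free, 1)

        def invert(forward_model, data, sigma, bounds, n_random=200, seed=0):
            """Random search over the parameter bounds, each start refined by a local optimizer."""
            rng = np.random.default_rng(seed)
            lo, hi = np.array(bounds).T
            best = None
            for _ in range(n_random):
                start = rng.uniform(lo, hi)
                res = minimize(chi2_nu, start,
                               args=(forward_model, data, sigma, len(lo)),
                               bounds=list(zip(lo, hi)), method="L-BFGS-B")
                if best is None or res.fun < best.fun:
                    best = res
            return best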

  2. A software quality model and metrics for risk assessment

    Science.gov (United States)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the basis for a discussion on risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.

  3. PERFORMANCE EVALUATION OF 3D MODELING SOFTWARE FOR UAV PHOTOGRAMMETRY

    OpenAIRE

    H. Yanagi; H. Yanagi; H. Chikatsu

    2016-01-01

    UAV (Unmanned Aerial Vehicle) photogrammetry, which combines UAV and freely available internet-based 3D modeling software, is widely used as a low-cost and user-friendly photogrammetry technique in fields such as remote sensing and geosciences. In UAV photogrammetry, only the platform used in conventional aerial photogrammetry is changed. Consequently, 3D modeling software contributes significantly to its expansion. However, the algorithms of the 3D modelling software are black-box algorithms ...

  4. Generating Protocol Software from CPN Models Annotated with Pragmatics

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars M.; Kindler, Ekkart

    2013-01-01

    Model-driven software engineering (MDSE) provides a foundation for automatically generating software based on models that focus on the problem domain while abstracting from the details of underlying implementation platforms. Coloured Petri Nets (CPNs) have been widely used to formally model and verify ... a data framing protocol.

  5. Extracting software static defect models using data mining

    Directory of Open Access Journals (Sweden)

    Ahmed H. Yousef

    2015-03-01

    Full Text Available Large software projects are subject to quality risks of having defective modules that will cause failures during the software execution. Several software repositories contain source code of large projects that are composed of many modules. These software repositories include data for the software metrics of these modules and the defective state of each module. In this paper, a data mining approach is used to show the attributes that predict the defective state of software modules. Software solution architecture is proposed to convert the extracted knowledge into data mining models that can be integrated with the current software project metrics and bugs data in order to enhance the prediction. The results show better prediction capabilities when all the algorithms are combined using weighted votes. When only one individual algorithm is used, Naïve Bayes algorithm has the best results, then the Neural Network and the Decision Trees algorithms.
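    A minimal sketch of combining the classifiers by weighted votes, as the abstract describes, is shown below using scikit-learn; the estimator choices, weights and feature set are hypothetical illustrations, not the paper's actual configuration.

        from sklearn.ensemble import VotingClassifier
        from sklearn.naive_bayes import GaussianNB
        from sklearn.neural_network import MLPClassifier
        from sklearn.tree import DecisionTreeClassifier

        # X: per-module software metrics (e.g. size, complexity); y: 1 if the module was defective.
        def build_defect_model(X, y):
            ensemble = VotingClassifier(
                estimators=[("nb", GaussianNB()),
                            ("nn", MLPClassifier(max_iter=500)),
                            ("dt", DecisionTreeClassifier())],
                voting="soft",
                weights=[3, 2, 1])   # illustrative weights; tune them on validation data
            return ensemble.fit(X, y)

    Soft voting averages the class probabilities of the individual models with the given weights, which matches the idea of giving the best-performing single algorithm the largest say.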

  6. An algebraic approach to modeling in software engineering

    International Nuclear Information System (INIS)

    Loegel, C.J.; Ravishankar, C.V.

    1993-09-01

    Our work couples the formalism of universal algebras with the engineering techniques of mathematical modeling to develop a new approach to the software engineering process. Our purpose in using this combination is twofold. First, abstract data types and their specification using universal algebras can be considered a common point between the practical requirements of software engineering and the formal specification of software systems. Second, mathematical modeling principles provide us with a means for effectively analyzing real-world systems. We first use modeling techniques to analyze a system and then represent the analysis using universal algebras. The rest of the software engineering process exploits properties of universal algebras that preserve the structure of our original model. This paper describes our software engineering process and our experience using it on both research and commercial systems. We need a new approach because current software engineering practices often deliver software that is difficult to develop and maintain. Formal software engineering approaches use universal algebras to describe "computer science" objects like abstract data types, but in practice software errors are often caused because "real-world" objects are improperly modeled. There is a large semantic gap between the customer's objects and abstract data types. In contrast, mathematical modeling uses engineering techniques to construct valid models for real-world systems, but these models are often implemented in an ad hoc manner. A combination of the best features of both approaches would enable software engineering to formally specify and develop software systems that better model real systems. Software engineering, like mathematical modeling, should concern itself first and foremost with understanding a real system and its behavior under given circumstances, and then with expressing this knowledge in an executable form.

  7. A Model for Joint Software Reviews

    Science.gov (United States)

    1998-10-01

    ... or more specified systems. Figure 5: Quality Characteristics [ISO/IEC 9126-1, 1996]. Despite the lack of prior study and classification of IR issues ... Cited standards include ISO/IEC 9126-1, Information Technology - Software quality characteristics and metrics - Part 1: Quality characteristics and subcharacteristics, and ISO/IEC 12207, Information Technology - Software Life Cycle Processes.

  8. Beyond Reactive Planning: Self Adaptive Software and Self Modeling Software in Predictive Deliberation Management

    National Research Council Canada - National Science Library

    Lenahan, Jack; Nash, Michael P; Charles, Phil

    2008-01-01

    .... We present the following hypothesis: predictive deliberation management using self-adapting and self-modeling software will be required to provide mission planning adjustments after the start of a mission...

  9. Software Quality Assessment Tool Based on Meta-Models

    OpenAIRE

    Doneva Rositsa; Gaftandzhieva Silvia; Doneva Zhelyana; Staevsky Nevena

    2015-01-01

    In the software industry it is indisputably essential to control the quality of produced software systems in terms of capabilities for easy maintenance, reuse, portability and others, in order to ensure reliability in software development. But it is also clear that it is very difficult to achieve such control through a 'manual' management of quality. There are a number of approaches for software quality assurance based typically on software quality models (e.g. ISO 9126, McCall's, Boehm's...

  10. Linear mixed models a practical guide using statistical software

    CERN Document Server

    West, Brady T; Galecki, Andrzej T

    2006-01-01

    Simplifying the often confusing array of software programs for fitting linear mixed models (LMMs), Linear Mixed Models: A Practical Guide Using Statistical Software provides a basic introduction to primary concepts, notation, software implementation, model interpretation, and visualization of clustered and longitudinal data. This easy-to-navigate reference details the use of procedures for fitting LMMs in five popular statistical software packages: SAS, SPSS, Stata, R/S-plus, and HLM. The authors introduce basic theoretical concepts, present a heuristic approach to fitting LMMs based on bo
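    The book's worked examples target SAS, SPSS, Stata, R/S-plus and HLM; purely for illustration, an analogous random-intercept fit in Python's statsmodels (with hypothetical column names and data) looks like the sketch below.

        import pandas as pd
        import statsmodels.formula.api as smf

        # Longitudinal data: repeated 'score' measurements nested within 'subject'.
        data = pd.DataFrame({
            "subject": [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
            "time":    [0, 1, 2, 0, 1, 2, 0, 1, 2, 0, 1, 2],
            "score":   [5.0, 6.1, 6.8, 4.2, 5.0, 5.9, 6.3, 7.4, 8.0, 5.5, 6.0, 6.9],
        })

        # Random intercept for each subject, fixed effect of time.
        model = smf.mixedlm("score ~ time", data, groups=data["subject"])
        result = model.fit()
        print(result.summary())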

  11. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    OpenAIRE

    Jump, David

    2014-01-01

    This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing...

  12. Development of an Environment for Software Reliability Model Selection

    Science.gov (United States)

    1992-09-01

    ... now is directed to other related problems such as tools for model selection, multiversion programming, and software fault tolerance modeling ... Hardware can be repaired by spare modules, which is not the case for software ... Preventive maintenance is very important ...

  13. Presenting an Evaluation Model for the Cancer Registry Software.

    Science.gov (United States)

    Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh

    2017-12-01

    As cancer incidence continues to grow, the cancer registry is of great importance as the main core of cancer control programs, and much software has been designed for this purpose. Therefore, establishing a comprehensive evaluation model is essential to evaluate and compare a wide range of such software. In this study, the criteria for cancer registry software were determined by studying the documentation and two functional software packages in this field. The evaluation tool was a checklist, and in order to validate the model, this checklist was presented to experts in the form of a questionnaire. To analyze the results of the validation, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, when the model was approved, the final version of the evaluation model for cancer registry software was presented. The evaluation model of this study comprises an evaluation tool and an evaluation method. The evaluation tool is a checklist including the general and specific criteria of cancer registry software along with their sub-criteria. The evaluation method of this study was chosen as a criteria-based evaluation method based on the findings. The model of this study encompasses various dimensions of cancer registry software and a proper method for evaluating it. The strong point of this evaluation model is the separation between general criteria and specific ones, while trying to fulfill the comprehensiveness of the criteria. Since this model has been validated, it can be used as a standard to evaluate cancer registry software.

  14. Capability Maturity Model (CMM) for Software Process Improvements

    Science.gov (United States)

    Ling, Robert Y.

    2000-01-01

    This slide presentation reviews the Avionic Systems Division's implementation of the Capability Maturity Model (CMM) for improvements in the software development process. The presentation reviews the process involved in implementing the model and the benefits of using CMM to improve the software development process.

  15. Software reliability growth model for safety systems of nuclear reactor

    International Nuclear Information System (INIS)

    Thirugnana Murthy, D.; Murali, N.; Sridevi, T.; Satya Murty, S.A.V.; Velusamy, K.

    2014-01-01

    The demand for complex software systems has increased more rapidly than the ability to design, implement, test, and maintain them, and the reliability of software systems has become a major concern for our modern society. Software failures have impaired several high visibility programs in the space, telecommunications, defense and health industries. Besides the costs involved, they have set back projects. This paper discusses the need for systematic approaches to measuring and assuring software reliability, which consumes a major share of project development resources, as well as ways of quantifying it and using it for improvement and control of the software development and maintenance process. It covers reliability models with a focus on 'reliability growth'. It includes data collection on reliability, statistical estimation and prediction, metrics and attributes of product architecture, design, software development, and the operational environment. Besides its use for operational decisions like deployment, it includes guiding software architecture, development, testing and verification and validation. (author)

  16. Software for Mathematical Modeling of Plastic Deformation in FCC Metals

    Science.gov (United States)

    Petelin, A. E.; Eliseev, A. S.

    2017-08-01

    The question on the necessity of software implementation in the study of plastic deformation in FCC metals with the use of mathematical modeling methods is investigated. This article describes the implementation features and the possibility of using the software Dislocation Dynamics of Crystallographic Slip (DDCS). The software has an advanced user interface and is designed for users without an extensive experience in IT-technologies. Parameter values of the mathematical model, obtained from field experiments and accumulated in a special database, are used in DDCS to carry out computational experiments. Moreover, the software is capable of accumulating bibliographic information used in research.

  17. Modeling Software Evolution using Algebraic Graph Rewriting

    NARCIS (Netherlands)

    Ciraci, S.; van den Broek, P.M.; Avgeriou, P.; Zdun, U.; Borne, I.

    We show how evolution requests can be formalized using algebraic graph rewriting. In particular, we present a way to convert UML class diagrams to colored graphs. Since changes in software may affect the relation between the methods of classes, our colored graph representation also employs the

  18. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae

    2008-01-01

    A software component-based system aims to organize system architecture and behaviour as a means of computation, communication and constraints, using computational blocks and aggregates for both discrete and continuous behaviour, different interconnection and execution disciplines for event-based and time-based controllers, and so on, to encompass the demands for more functionality, at even lower prices, and with opposing constraints. Once specified, the software model has to be analyzed. One way of doing that is to integrate in wrapper files the model back into Simulink S-functions, and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behaviour, and the transformation of the software system into the S-functions. The general aim of this work is the improvement of multi-disciplinary development of embedded systems with the focus on the relation ...

  19. Software for medical image based phantom modelling

    International Nuclear Information System (INIS)

    Possani, R.G.; Massicano, F.; Coelho, T.S.; Yoriyaz, H.

    2011-01-01

    The latest treatment planning systems depend strongly on CT images, so the tendency is that dosimetry procedures in nuclear medicine therapy also be based on images, such as magnetic resonance imaging (MRI) or computed tomography (CT), to extract anatomical and histological information, as well as functional imaging or activity maps such as PET or SPECT. This information, associated with radiation transport simulation software, is used to estimate the internal dose in patients undergoing treatment in nuclear medicine. This work aims to re-engineer the software SCMS, which is interface software between the Monte Carlo code MCNP and the medical images that carry information from the patient under treatment. In other words, the necessary information contained in the images is interpreted and presented in a specific format to the Monte Carlo MCNP code to perform the simulation of radiation transport. Therefore, the user does not need to understand the complex process of inputting data to MCNP, as SCMS is responsible for automatically constructing the anatomical data of the patient, as well as the radioactive source data. SCMS was originally developed in Fortran-77. In this work it was rewritten in an object-oriented language (Java). New features and data options have also been incorporated into the software. Thus, the new software has a number of improvements, such as an intuitive GUI and a menu for the selection of the energy spectra corresponding to a specific radioisotope stored in an XML data bank. The new version also supports new materials, and the user can specify an image region of interest for the calculation of absorbed dose. (author)

  20. Software to Enable Modeling & Simulation as a Service

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop a Modeling and Simulation as a Service (M&SaaS) software service infrastructure to enable most modeling and simulation (M&S) activities to be...

  1. Development and application of new quality model for software projects.

    Science.gov (United States)

    Karnavel, K; Dillibabu, R

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects.

  2. SET-MM – A Software Evaluation Technology Maturity Model

    OpenAIRE

    García-Castro, Raúl

    2011-01-01

    The application of software evaluation technologies in different research fields to verify and validate research is a key factor in the progressive evolution of those fields. Nowadays, however, to have a clear picture of the maturity of the technologies used in evaluations or to know which steps to follow in order to improve the maturity of such technologies is not easy. This paper describes a Software Evaluation Technology Maturity Model that can be used to assess software evaluation tech...

  3. Model for Agile Software Development Performance Monitoring

    OpenAIRE

    Žabkar, Nataša

    2013-01-01

    Agile methodologies have been in use for more than ten years, and during this time they have proved to be efficient, even though the amount of empirical research is scarce, especially regarding agile software development performance monitoring. The most popular agile framework, Scrum, uses only one measure of performance: the amount of work remaining for the implementation of a User Story from the Product Backlog or for the implementation of a Task from the Sprint Backlog. In time the need for additional me...

  4. Multinomial-exponential reliability function: a software reliability model

    International Nuclear Information System (INIS)

    Saiz de Bustamante, Amalio; Saiz de Bustamante, Barbara

    2003-01-01

    The multinomial-exponential reliability function (MERF) was developed during a detailed study of the software failure/correction processes. Later on, MERF was approximated by a much simpler exponential reliability function (EARF), which keeps most of MERF's mathematical properties, so the two functions together make up a single reliability model. The reliability model MERF/EARF considers the software failure process as a non-homogeneous Poisson process (NHPP), and the repair (correction) process as a multinomial distribution. The model supposes that both processes are statistically independent. The paper discusses the model's theoretical basis, its mathematical properties and its application to software reliability. Nevertheless, applications of the model to the inspection and maintenance of physical systems are foreseen. The paper includes a complete numerical example of the model's application to a software reliability analysis.

  5. MATHEMATICAL MODEL FOR SOFTWARE USABILITY AUTOMATED EVALUATION AND ASSURANCE

    Directory of Open Access Journals (Sweden)

    І. Гученко

    2011-04-01

    Full Text Available The subject of the research is software usability, and the aim is the construction of a mathematical model for estimating and ensuring a set level of usability. The research uses the methodology of structural analysis, methods of multicriterion optimization and decision-making theory, the convolution method, and scientific methods of analysis and analogy. The result of the work is a model for automated evaluation and assurance of software usability that allows not only estimating the current level of usability during every iteration of agile development but also managing the usability of the created software products. The results can be used for the construction of automated support systems for managing software usability.

  6. Software engineering with process algebra: Modelling client / server architecures

    NARCIS (Netherlands)

    Diertens, B.

    2009-01-01

    In previous work we described how the process algebra based language PSF can be used in software engineering, using the ToolBus, a coordination architecture also based on process algebra, as implementation model. We also described this software development process more formally by presenting the

  7. The scientific modeling assistant: An advanced software tool for scientific model building

    Science.gov (United States)

    Keller, Richard M.; Sims, Michael H.

    1991-01-01

    Viewgraphs on the scientific modeling assistant: an advanced software tool for scientific model building are presented. The objective is to build a specialized software tool to assist in scientific model-building.

  8. A Study On Traditional And Evolutionary Software Development Models

    Directory of Open Access Journals (Sweden)

    Kamran Rasheed

    2017-07-01

    Full Text Available Today, computing technologies are becoming central to organizations and helpful for individual functionality; in addition to the computing device, we need software. A set of instructions, or a computer program, is known as software. Software is developed through traditional or newer, evolutionary models. Software development is becoming a key and successful business nowadays; without software, all hardware is useless. The collective steps performed in developing software are known as the software development life cycle (SDLC). There are adaptive and predictive models for developing software. Predictive models follow an already known plan, such as the Waterfall, Spiral, Prototype and V-shaped models, while adaptive models include agile Scrum. All methodologies, both adaptive and predictive, have their own procedures and steps. Predictive models are static and adaptive models are dynamic, meaning that changes cannot be made to the predictive ones while the adaptive ones have the capability of changing. The purpose of this study is to become familiar with all of these models and to discuss their uses and development steps. This discussion will be helpful in deciding which model should be used in which circumstance and what development steps each model includes.

  9. Transformation of UML Behavioral Diagrams to Support Software Model Checking

    Directory of Open Access Journals (Sweden)

    Luciana Brasil Rebelo dos Santos

    2014-04-01

    Full Text Available Unified Modeling Language (UML) is currently accepted as the standard for modeling (object-oriented) software, and its use is increasing in the aerospace industry. Verification and Validation of complex software developed according to UML is not trivial due to the complexity of the software itself and the several different UML models/diagrams that can be used to model the behavior and structure of the software. This paper presents an approach to transform up to three different UML behavioral diagrams (sequence, behavioral state machines, and activity) into a single Transition System to support Model Checking of software developed in accordance with UML. In our approach, properties are formalized based on use case descriptions. The transformation is done for the NuSMV model checker, but we see the possibility of using other model checkers, such as SPIN. The main contribution of our work is the transformation of a non-formal language (UML) to a formal language (the language of the NuSMV model checker) towards a greater adoption in practice of formal methods in software development.

  10. Study of the nonlinear imperfect software debugging model

    International Nuclear Information System (INIS)

    Wang, Jinyong; Wu, Zhibo

    2016-01-01

    In recent years there has been a dramatic proliferation of research on imperfect software debugging phenomena. Software debugging is a complex process and is affected by a variety of factors, including the environment, resources, personnel skills, and personnel psychologies. Therefore, the simple assumption that debugging is perfect is inconsistent with the actual software debugging process, wherein a new fault can be introduced when removing a fault. Furthermore, the fault introduction process is nonlinear, and the cumulative number of nonlinearly introduced faults increases over time. Thus, this paper proposes a nonlinear, NHPP imperfect software debugging model in consideration of the fact that fault introduction is a nonlinear process. The fitting and predictive power of the NHPP-based proposed model are validated through related experiments. Experimental results show that this model displays better fitting and predicting performance than the traditional NHPP-based perfect and imperfect software debugging models. S-confidence bounds are set to analyze the performance of the proposed model. This study also examines and discusses optimal software release-time policy comprehensively. In addition, this research on the nonlinear process of fault introduction is significant given the recent surge of studies on software-intensive products, such as cloud computing and big data. - Highlights: • Fault introduction is a nonlinear changing process during the debugging phase. • The assumption that the process of fault introduction is nonlinear is credible. • Our proposed model can better fit and accurately predict software failure behavior. • Research on fault introduction case is significant to software-intensive products.

  11. Models of the atomic nucleus. With interactive software

    International Nuclear Information System (INIS)

    Cook, N.D.

    2006-01-01

    This book-and-CD-software package supplies users with an interactive experience for nuclear visualization via a computer-graphical interface, similar in principle to the molecular visualizations already available in chemistry. Models of the Atomic Nucleus, a largely non-technical introduction to nuclear theory, explains the nucleus in a way that makes nuclear physics as comprehensible as chemistry or cell biology. The book/software supplements virtually any of the current textbooks in nuclear physics by providing a means for 3D visual display of the diverse models of nuclear structure. For the first time, easy-to-master software for scientific visualization of the nucleus makes this notoriously 'non-visual' field immediately visible. After a review of the basics, the book explores and compares the competing models, and addresses how the lattice model best resolves remaining controversies. The appendix explains how to obtain the most from the software provided on the accompanying CD. (orig.)

  12. Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model

    International Nuclear Information System (INIS)

    J. J. Jacobson; D. E. Shropshire; W. B. West

    2005-01-01

    The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a 'living document' that will be modified over the course of the execution of this work.

  13. A Reference Model for Mobile Social Software for Learning

    NARCIS (Netherlands)

    De Jong, Tim; Specht, Marcus; Koper, Rob

    2007-01-01

    De Jong, T., Specht, M., & Koper, R. (2008). A reference model for mobile social software for learning. International Journal of Continuing Engineering Education and Life-Long Learning, 18(1), 118-138.

  14. Dependability modeling and assessment in UML-based software development.

    Science.gov (United States)

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) extend the software models with annotations describing the NFP of interest; (b) automatically transform the annotated software model into the formalism chosen for NFP analysis; (c) analyze the formal model using existing solvers; (d) assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN (deterministic and stochastic Petri net) formal model, and the assessment of the system properties based on the DSPN results.

  15. Model-Driven Software Evolution : A Research Agenda

    NARCIS (Netherlands)

    Van Deursen, A.; Visser, E.; Warmer, J.

    2007-01-01

    Software systems need to evolve, and systems built using model-driven approaches are no exception. What complicates model-driven engineering is that it requires multiple dimensions of evolution. In regular evolution, the modeling language is used to make the changes. In meta-model evolution, changes

  16. Model-based engineering for medical-device software.

    Science.gov (United States)

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. By using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for (i) fast and efficient construction of executable device prototypes, (ii) creation of a standard, reusable baseline software architecture for a particular device family, (iii) formal verification of the design against safety requirements, and (iv) creation of a safety framework that reduces verification costs for future versions of the device software.

  17. Modeling software with finite state machines a practical approach

    CERN Document Server

    Wagner, Ferdinand; Wagner, Thomas; Wolstenholme, Peter

    2006-01-01

    Modeling Software with Finite State Machines: A Practical Approach explains how to apply finite state machines to software development. It provides a critical analysis of using finite state machines as a foundation for executable specifications to reduce software development effort and improve quality. This book discusses the design of a state machine and of a system of state machines. It also presents a detailed analysis of development issues relating to behavior modeling with design examples and design rules for using finite state machines. This volume describes a coherent and well-tested fr
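
    A minimal sketch of the basic idea: a table-driven finite state machine whose behavior is given entirely by its transition table. The states, events and actions model a hypothetical keyboard handler and are not taken from the book.

```python
# Minimal table-driven finite state machine. States, events and actions are
# invented for illustration; executable specifications in the book's sense
# would attach richer actions and guards to each transition.
class StateMachine:
    def __init__(self, initial, table):
        self.state = initial
        self.table = table          # {(state, event): (next_state, action)}

    def handle(self, event):
        key = (self.state, event)
        if key not in self.table:
            return                  # event ignored in this state
        next_state, action = self.table[key]
        if action:
            action()
        self.state = next_state

table = {
    ("idle", "key_down"):   ("typing", lambda: print("start of word")),
    ("typing", "key_down"): ("typing", None),
    ("typing", "timeout"):  ("idle", lambda: print("word finished")),
}

sm = StateMachine("idle", table)
for ev in ["key_down", "key_down", "timeout"]:
    sm.handle(ev)
print("final state:", sm.state)
```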

  18. A CRF-based system for recognizing chemical entity mentions (CEMs) in biomedical literature.

    Science.gov (United States)

    Xu, Shuo; An, Xin; Zhu, Lijun; Zhang, Yunliang; Zhang, Haodong

    2015-01-01

    In order to improve information access for chemical compounds and drugs (chemical entities) described in text repositories, it is crucial to be able to identify chemical entity mentions (CEMs) automatically within text. The CHEMDNER challenge in BioCreative IV was specially designed to promote the implementation of systems able to detect mentions of chemical compounds and drugs, and has two subtasks: CDI (Chemical Document Indexing) and CEM. Our system's processing pipeline consists of three major components: pre-processing (sentence detection, tokenization), recognition (a CRF-based approach), and post-processing (a rule-based approach and format conversion). In our post-challenge system, the cost parameter of the CRF model was optimized by 10-fold cross-validation with grid search, and a word-representation feature induced by the Brown clustering method was introduced. For the CEM subtask, our official runs were ranked in the top position, obtaining a maximum of 88.79% precision, 69.08% recall and 77.70% balanced F-measure, which were further improved to 88.43% precision, 76.48% recall and 82.02% balanced F-measure in our post-challenge system. In our system, instead of extracting a CEM as a whole, we treated recognition as a sequence labeling problem. Though our current system has much room for improvement, it is valuable in showing that the performance in terms of balanced F-measure can be improved considerably by utilizing large amounts of relatively inexpensive un-annotated PubMed abstracts and by optimizing the cost parameter of the CRF model. From our practice and lessons, if one directly uses open-source natural language processing (NLP) toolkits such as OpenNLP or Stanford CoreNLP, the false positive (FP) rate may be very high; it is better to develop some additional rules to minimize the FP rate if one does not want to re-train the related models. Our CEM recognition system is available at: http://www.SciTeMiner.org/XuShuo/Demo/CEM.
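
    The sketch below shows CEM recognition framed as sequence labeling with a linear-chain CRF; it uses the sklearn-crfsuite package rather than the authors' system, and the features, toy sentence and BIO labels are invented placeholders, not CHEMDNER data.

```python
# Minimal sketch of CRF-based chemical-entity tagging with BIO labels.
# Uses sklearn-crfsuite (not the authors' system); features, sentences and
# labels below are invented placeholders, not CHEMDNER data.
import sklearn_crfsuite

def word_features(sent, i):
    w = sent[i]
    return {
        "lower": w.lower(),
        "is_upper": w.isupper(),
        "has_digit": any(c.isdigit() for c in w),
        "suffix3": w[-3:],
        "prev": sent[i - 1].lower() if i > 0 else "<BOS>",
        "next": sent[i + 1].lower() if i < len(sent) - 1 else "<EOS>",
    }

def featurize(sent):
    return [word_features(sent, i) for i in range(len(sent))]

# Toy training data (one sentence), purely illustrative.
train_sents = [["Aspirin", "inhibits", "cyclooxygenase", "."]]
train_labels = [["B-CEM", "O", "O", "O"]]

X = [featurize(s) for s in train_sents]
y = train_labels

# c2 plays the role of the regularization ("cost") parameter that the authors
# tuned by 10-fold cross-validated grid search; here it is simply fixed.
crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1, max_iterations=50)
crf.fit(X, y)
print(crf.predict(X))
```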

  19. A CASE STUDY OF MODELING A TORUS IN DIFFERENT MODELING SOFTWARES

    OpenAIRE

    VELJKOVIĆ Milica; KRASIĆ Sonja; PEJIĆ Petar; TOŠIĆ Zlata

    2017-01-01

    Modeling of complex geometric shapes requires the use of appropriate software. This study analyzes the process of modeling with two different computer programs, AutoCAD and Rhinoceros. The aim is to demonstrate the similarities and differences between these programs when used for modeling a torus, a double-curved geometric surface. The two modeling processes are compared in order to investigate the potential of these programs in the modeling of an architectural structure comprising a s...

  20. A New Software Quality Model for Evaluating COTS Components

    OpenAIRE

    Adnan Rawashdeh; Bassem Matalkah

    2006-01-01

    Studies show that COTS-based (commercial off-the-shelf) systems built in recent years exceed 40% of all developed software systems. Therefore, a model that ensures the quality characteristics of such systems becomes a necessity. Among the most critical processes in COTS-based systems are the evaluation and selection of the COTS components. There are several existing quality models used to evaluate software systems in general; however, none of them is dedicated to COTS-based s...

  1. Staying in the Light: Evaluating Sustainability Models for Brokering Software

    Science.gov (United States)

    Powers, L. A.; Benedict, K. K.; Best, M.; Fyfe, S.; Jacobs, C. A.; Michener, W. K.; Pearlman, J.; Turner, A.; Nativi, S.

    2015-12-01

    The Business Models Team of the Research Data Alliance Brokering Governance Working Group examined several support models proposed to promote the long-term sustainability of brokering middleware. The business model analysis includes examination of funding sources, implementation frameworks and obstacles, and policy and legal considerations. The issue of sustainability is not unique to brokering software, and these models may be relevant to many applications. Results of this comprehensive analysis highlight advantages and disadvantages of the various models with respect to the specific requirements for brokering services. We offer recommendations based on the outcomes of this analysis while recognizing that all software is part of an evolutionary process and has a lifespan.

  2. Suitability of Modern Software Development Methodologies for Model Driven Development

    Directory of Open Access Journals (Sweden)

    Ruben Picek

    2009-12-01

    Full Text Available In answer to today's growing challenges in the software industry, a wide spectrum of new approaches to software development has emerged. One prominent direction is what is currently the most promising software development paradigm, Model Driven Development (MDD). Despite a lot of skepticism and open problems, the MDD paradigm is being used and improved to realize its many inherent potential benefits. A methodological approach to software development requires some kind of development process. Modern methodologies can be classified into two main categories: formal or heavyweight, and agile or lightweight. When it comes to MDD and a development process for MDD, however, currently known methodologies say little or, better said, offer no explanation of an MDD process. As a result of this research, the author examines in this paper the possibilities of using existing modern software methodologies in the context of the MDD paradigm.

  3. Creating a simulation model of software testing using Simulink package

    Directory of Open Access Journals (Sweden)

    V. M. Dubovoi

    2016-12-01

    Full Text Available Determining a model of the software testing process that allows prediction of both the whole process and its specific stages is a relevant problem for the IT industry, and this article focuses on solving it. The aim of the article is to predict the duration and improve the quality of software testing. Analysis of the software testing process shows that it can be classed among branched cyclic technological processes, because it is cyclical with decision-making on control operations. The investigation builds on the authors' previous work and on a software testing process method based on a Markov model. The proposed method enables prediction for each software module, which leads to better decision-making for each controlled suboperation of all processes. A Simulink simulation model demonstrates the implementation and verification of the proposed technique. The results of the research have been applied in practice in the IT industry.

  4. Performance Evaluation of 3d Modeling Software for Uav Photogrammetry

    Science.gov (United States)

    Yanagi, H.; Chikatsu, H.

    2016-06-01

    UAV (Unmanned Aerial Vehicle) photogrammetry, which combines UAV and freely available internet-based 3D modeling software, is widely used as a low-cost and user-friendly photogrammetry technique in fields such as remote sensing and geosciences. In UAV photogrammetry, only the platform used in conventional aerial photogrammetry is changed. Consequently, 3D modeling software contributes significantly to its expansion. However, the algorithms of the 3D modelling software are black box algorithms. As a result, only a few studies have been able to evaluate their accuracy using 3D coordinate check points. With this motive, Smart3DCapture and Pix4Dmapper were downloaded from the Internet and the commercial software PhotoScan was also employed; investigations were performed in this paper using check points and images obtained from a UAV.

  5. PERFORMANCE EVALUATION OF 3D MODELING SOFTWARE FOR UAV PHOTOGRAMMETRY

    Directory of Open Access Journals (Sweden)

    H. Yanagi

    2016-06-01

    Full Text Available UAV (Unmanned Aerial Vehicle) photogrammetry, which combines UAV and freely available internet-based 3D modeling software, is widely used as a low-cost and user-friendly photogrammetry technique in fields such as remote sensing and geosciences. In UAV photogrammetry, only the platform used in conventional aerial photogrammetry is changed. Consequently, 3D modeling software contributes significantly to its expansion. However, the algorithms of the 3D modelling software are black box algorithms. As a result, only a few studies have been able to evaluate their accuracy using 3D coordinate check points. With this motive, Smart3DCapture and Pix4Dmapper were downloaded from the Internet and the commercial software PhotoScan was also employed; investigations were performed in this paper using check points and images obtained from a UAV.
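
    A small sketch of the kind of accuracy check described here: comparing model-derived 3D coordinates of check points against surveyed reference coordinates and reporting the RMSE per axis. The coordinate values are invented placeholders, not data from the paper.

```python
# Hedged sketch: RMSE of 3D check-point coordinates (model vs. reference).
# The coordinates below are placeholders, not survey data from the paper.
import numpy as np

reference = np.array([   # surveyed check points (X, Y, Z) in metres
    [100.00, 200.00, 50.00],
    [105.00, 198.50, 50.20],
    [110.25, 202.75, 49.80],
])
modeled = np.array([     # same points measured in the 3D model
    [100.03, 199.96, 50.06],
    [104.95, 198.55, 50.27],
    [110.31, 202.70, 49.73],
])

diff = modeled - reference
rmse = np.sqrt((diff ** 2).mean(axis=0))
print(f"RMSE X={rmse[0]:.3f} m, Y={rmse[1]:.3f} m, Z={rmse[2]:.3f} m")
```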

  6. An architectural model for software reliability quantification: sources of data

    International Nuclear Information System (INIS)

    Smidts, C.; Sova, D.

    1999-01-01

    Software reliability assessment models in use today treat software as a monolithic block; an aversion towards 'atomic' models seems to exist, since such models appear to add complexity to the modeling and the data collection and seem intrinsically difficult to generalize. In 1997, we introduced an architecturally based software reliability model called FASRE. The model is based on an architecture derived from the requirements, which captures both functional and nonfunctional requirements, and on a generic classification of functions, attributes and failure modes. The model focuses on the evaluation of failure mode probabilities and uses a Bayesian quantification framework. Failure mode probabilities of functions and attributes are propagated to the system level using fault trees. It can incorporate any type of prior information, such as results of developers' testing or historical information on a specific functionality and its attributes, and is ideally suited for reusable software. By building an architecture and deriving its potential failure modes, the model forces early appraisal and understanding of the weaknesses of the software, allows reliability analysis of the structure of the system, and provides assessments at the functional level as well as at the system level. In order to quantify the probability of failure (or the probability of success) of a specific element of our architecture, data are needed. The term 'element of the architecture' is used here in its broadest sense to mean a single failure mode or a higher level of abstraction such as a function. The paper surveys the potential sources of software reliability data available during software development, and then identifies the mechanisms for incorporating these sources of relevant data into the FASRE model.
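
    The abstract describes propagating failure-mode probabilities to the system level through fault trees; the sketch below shows that propagation step for a tiny, invented fault tree with independent basic events, not the FASRE model itself.

```python
# Hedged sketch: propagate basic-event failure probabilities up a fault tree.
# Assumes independent basic events; the gates and probabilities are invented.
from functools import reduce

def or_gate(probs):
    # P(A or B or ...) = 1 - prod(1 - p_i) for independent events
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

def and_gate(probs):
    return reduce(lambda acc, p: acc * p, probs, 1.0)

# Basic-event probabilities, e.g. failure modes of a function and an attribute.
p_wrong_output = 1e-4
p_no_output = 5e-5
p_attribute_violation = 2e-4

# The function fails if any of its failure modes occurs.
p_function_fails = or_gate([p_wrong_output, p_no_output])

# System-level top event: function failure together with a violated attribute
# (purely illustrative tree structure).
p_top = and_gate([p_function_fails, p_attribute_violation])

print(f"P(function fails) = {p_function_fails:.2e}")
print(f"P(top event)      = {p_top:.2e}")
```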

  7. Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project

    Science.gov (United States)

    Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo

    2017-04-01

    The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in the 2.0 version (Endrizzi et al, 2014), which was released as a free, open-source software project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. Such a weakness hindered its scientific potential and its use both as a standalone package and, more importantly, in an integrated way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes as a starting point the 2.0 version, scientifically tested and published. This version, together with several test cases based on recently published or available GEOtop applications (Cordano and Rigon, 2013, WRR, Kollet et al, 2016, WRR), provides the baseline code and a certain number of referenced results as a benchmark. Comparison and scientific validation can then be performed for each software re-engineering activity performed on the package. To keep track of every single change, the package is published in its own github repository geotopmodel.github.io/geotop/ under the GPL v3.0 license. A Continuous Integration mechanism by means of Travis-CI has been enabled on the github repository on the master and main development branches. The usage of CMake configuration tool

  8. Aspect-Oriented Model-Driven Software Product Line Engineering

    Science.gov (United States)

    Groher, Iris; Voelter, Markus

    Software product line engineering aims to reduce development time, effort, cost, and complexity by taking advantage of the commonality within a portfolio of similar products. The effectiveness of a software product line approach directly depends on how well feature variability within the portfolio is implemented and managed throughout the development lifecycle, from early analysis through maintenance and evolution. This article presents an approach that facilitates variability implementation, management, and tracing by integrating model-driven and aspect-oriented software development. Features are separated in models and composed with aspect-oriented composition techniques at the model level. Model transformations support the transition from problem-space to solution-space models. Aspect-oriented techniques enable the explicit expression and modularization of variability at the model, template, and code levels. The presented concepts are illustrated with a case study of a home automation system.

  9. Business Model Exploration for Software Defined Networks

    NARCIS (Netherlands)

    Xu, Yudi; Jansen, Slinger; España, Sergio; Zhang, Dong; Gao, Xuesong

    2017-01-01

    Business modeling is becoming a foundational process in the information technology industry. Many ICT companies are constructing their business models to stay competitive on the cutting edge of the technology world. However, when it comes to new technologies or emerging markets, it remains difficult

  10. Capabilities and accuracy of energy modelling software

    CSIR Research Space (South Africa)

    Osburn, L

    2010-11-01

    Full Text Available Energy modelling can be used in a number of different ways to fulfill different needs, including certification within building regulations or green building rating tools. Energy modelling can also be used in order to try and predict what the energy...

  11. Model-based testing for software safety

    NARCIS (Netherlands)

    Gurbuz, Havva Gulay; Tekinerdogan, Bedir

    2017-01-01

    Testing safety-critical systems is crucial since a failure or malfunction may result in death or serious injuries to people, equipment, or environment. An important challenge in testing is the derivation of test cases that can identify the potential faults. Model-based testing adopts models of a

  12. The STAMP Software for State Space Models

    Directory of Open Access Journals (Sweden)

    Roy Mendelssohn

    2011-05-01

    Full Text Available This paper reviews the use of STAMP (Structural Time Series Analyser, Modeler and Predictor) for modeling time series data using state-space methods with unobserved components. STAMP is a commercial, GUI-based program that runs on Windows, Linux and Macintosh computers as part of the larger OxMetrics system. STAMP can estimate a wide variety of both univariate and multivariate state-space models, provides a wide array of diagnostics, and has a batch mode capability. The use of STAMP is illustrated for the Nile river data, which is analyzed throughout this issue, as well as by modeling a variety of oceanographic and climate-related data sets. The analyses of the oceanographic and climate data illustrate the breadth of models available in STAMP, and show that state-space methods produce results that provide new insights into important scientific problems.
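
    STAMP itself is commercial and GUI-based, but the same kind of local level (random walk plus noise) model for the Nile series can be sketched with the open-source statsmodels package in Python; the snippet below assumes that statsmodels ships the Nile dataset in its datasets module and is only an analogy to a basic STAMP analysis.

```python
# Hedged sketch: a local level unobserved-components model for the Nile flow
# series, analogous to a basic STAMP analysis. Assumes the statsmodels
# package and its bundled Nile dataset are available.
import statsmodels.api as sm

nile = sm.datasets.nile.load_pandas().data["volume"]

model = sm.tsa.UnobservedComponents(nile, level="local level")
result = model.fit(disp=False)

print(result.summary())
# Smoothed level estimate for the last few observations of the series.
print(result.smoothed_state[0][-5:])
```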

  13. Building a Flexible Software Factory Using Partial Domain Specific Models

    NARCIS (Netherlands)

    Warmer, J.B.; Kleppe, A.G.

    2006-01-01

    This paper describes some experiences in building a software factory by defining multiple small domain specific languages (DSLs) and having multiple small models per DSL. This is in high contrast with traditional approaches using monolithic models, e.g. written in UML. In our approach, models behave

  14. Introduction to Financial Projection Models. Business Management Instructional Software.

    Science.gov (United States)

    Pomeroy, Robert W., III

    This guidebook and teacher's guide accompany a personal computer software program and introduce the key elements of financial projection modeling to project the financial statements of an industrial enterprise. The student will then build a model on an electronic spreadsheet. The guidebook teaches the purpose of a financial model and the steps…

  15. Linear mixed models a practical guide using statistical software

    CERN Document Server

    West, Brady T; Galecki, Andrzej T

    2014-01-01

    Highly recommended by JASA, Technometrics, and other journals, the first edition of this bestseller showed how to easily perform complex linear mixed model (LMM) analyses via a variety of software programs. Linear Mixed Models: A Practical Guide Using Statistical Software, Second Edition continues to lead readers step by step through the process of fitting LMMs. This second edition covers additional topics on the application of LMMs that are valuable for data analysts in all fields. It also updates the case studies using the latest versions of the software procedures and provides up-to-date information on the options and features of the software procedures available for fitting LMMs in SAS, SPSS, Stata, R/S-plus, and HLM.New to the Second Edition A new chapter on models with crossed random effects that uses a case study to illustrate software procedures capable of fitting these models Power analysis methods for longitudinal and clustered study designs, including software options for power analyses and suggest...

  16. Stochastic Differential Equation-Based Flexible Software Reliability Growth Model

    Directory of Open Access Journals (Sweden)

    P. K. Kapur

    2009-01-01

    Full Text Available Several software reliability growth models (SRGMs) have been developed by software developers for tracking and measuring the growth of reliability. When a software system is large and the number of faults detected during the testing phase becomes large, the change in the number of faults that are detected and removed through each debugging becomes sufficiently small compared with the initial fault content at the beginning of the testing phase. In such a situation, we can model the software fault detection process as a stochastic process with a continuous state space. In this paper, we propose a new software reliability growth model based on an Itô-type stochastic differential equation. We consider an SDE-based generalized Erlang model with a logistic error detection function. The model is estimated and validated on real-life data sets cited in the literature to show its flexibility. The proposed model, integrated with the concept of a stochastic differential equation, performs comparatively better than existing NHPP-based models.
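
    To make the Itô-SDE idea concrete, the sketch below simulates a simple stochastic fault-detection process dN(t) = b(a − N(t))dt + σ(a − N(t))dW(t) with the Euler–Maruyama scheme; the drift and diffusion terms are generic illustrations, not the generalized Erlang model with logistic detection function proposed in the paper.

```python
# Hedged sketch: Euler-Maruyama simulation of a simple SDE-based software
# reliability growth process. The drift/diffusion below are generic choices,
# not the specific model proposed in the paper.
import numpy as np

a, b, sigma = 200.0, 0.08, 0.03   # total faults, detection rate, noise level
T, n_steps, n_paths = 60.0, 600, 5
dt = T / n_steps
rng = np.random.default_rng(1)

N = np.zeros((n_paths, n_steps + 1))   # cumulative detected faults per path

for k in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_paths)
    remaining = a - N[:, k]
    N[:, k + 1] = N[:, k] + b * remaining * dt + sigma * remaining * dW
    N[:, k + 1] = np.clip(N[:, k + 1], 0.0, a)   # keep counts physical

print("mean detected faults at T:", N[:, -1].mean().round(1))
```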

  17. Software Support of Modelling using Ergonomic Tools in Engineering

    Directory of Open Access Journals (Sweden)

    Darina Dupláková

    2017-08-01

    Full Text Available One of the preconditions for the correct development of industrial production is the continuous interconnection of the virtual and real worlds through computer software. Computer software is used for product modelling, creation of technical documentation, scheduling, management and optimization of manufacturing processes, and for increasing the efficiency of human work in manufacturing plants. This article describes frequently used ergonomic software that helps to improve human work by reducing error rates, risk factors in the working environment and workplace injuries, and by eliminating emerging occupational diseases. These tools fall in the field of micro-ergonomics and are applicable at the manufacturing level, with a flexible approach to solving the established problems.

  18. Graphical modelling software in R - status

    DEFF Research Database (Denmark)

    Dethlefsen, Claus; Højsgaard, Søren; Lauritzen, Steffen L.

    2007-01-01

    , and Kreiner 1995), MIM (Edwards 2000), and Tetrad (Glymour, Scheines, Spirtes, and Kelley 1987). The gR initiative (Lauritzen 2002) aims at making graphical models available in R (R Development Core Team 2006). A small grant from the Danish Science Foundation supported this initiative. We will summarize the results of the initiative so far. Specifically, we will illustrate some of the R packages for graphical modelling currently on CRAN and discuss their strengths and weaknesses.

  19. Integrating Design Decision Management with Model-based Software Development

    DEFF Research Database (Denmark)

    Könemann, Patrick

    Design decisions are continuously made during the development of software systems and are important artifacts for design documentation. Dedicated decision management systems are often used to capture such design knowledge. Most such systems are, however, separated from the design artifacts ... of the system. In model-based software development, where design models are used to develop a software system, the outcomes of many design decisions have a big impact on design models. The realization of design decisions is often manual and tedious work on design models. Moreover, keeping design models consistent ..., or by ignoring the causes. This substitutes manual reviews to some extent. The concepts, implemented in a tool, have been validated with design patterns, refactorings, and domain level tests that comprise a replay of a real project. This proves the applicability of the solution to realistic examples...

  20. Presenting results of software model checker via debugging interface

    OpenAIRE

    Kohan, Tomáš

    2012-01-01

    Title: Presenting results of software model checker via debugging interface Author: Tomáš Kohan Department: Department of Software Engineering Supervisor of the master thesis: RNDr. Ondřej Šerý, Ph.D., Department of Distributed and Dependable Systems Abstract: This thesis is devoted to the design and implementation of a new debugging interface for the Java PathFinder application. The Eclipse development environment was selected as a suitable interface container. The created interface should vis...

  1. CEMS and DCEMS investigations of Al-implanted iron

    International Nuclear Information System (INIS)

    Reuther, H.; Nikolov, O.; Kruijer, S.; Scherrer, S.

    1994-01-01

    α-Fe surfaces were implanted with a nominal dose of 5 x 10^17 Al ions/cm^2 at 50 keV and a current density of about 3.7 μA/cm^2. Samples of different shapes and thicknesses have been used in order to test the influence of heat flow from specimen to target holder during implantation. Integral and energy-differential ('depth-selective') 57Fe conversion electron Moessbauer spectroscopy (CEMS and DCEMS) were employed. The spectra indicated a magnetic phase characterised by a broad hyperfine field distribution P(B_hf), a non-magnetic phase, and α-Fe. The relative intensity of the non-magnetic phase was enhanced if the thermal contact during implantation became worse. An energy dependence of the DCEM spectra in the L-electron range was observed. Model calculations using L-electron weight functions and experimental concentration profiles obtained by secondary neutral mass spectroscopy (SNMS) yielded fair agreement between calculated and experimental phase signals. The results demonstrate that the non-magnetic Fe-Al alloy phase with high Al concentration is located closer to the surface than the magnetic alloy phase, which extends to a much larger depth than expected. (orig.)

  2. Metamodelling Approach and Software Tools for Physical Modelling and Simulation

    Directory of Open Access Journals (Sweden)

    Vitaliy Mezhuyev

    2015-02-01

    Full Text Available In computer science, the metamodelling approach is becoming more and more popular for software systems development. In this paper, we discuss the applicability of the metamodelling approach to the development of software tools for physical modelling and simulation. To define a metamodel for physical modelling, an analysis of physical models is carried out. The result of this analysis reveals the invariant physical structures that we propose to use as the basic abstractions of the physical metamodel: a system of geometrical objects that allows building the spatial structure of physical models and setting a distribution of physical properties over it. To such a geometry of distributed physical properties, different mathematical methods can be applied. To validate the proposed metamodelling approach, we consider the developed prototypes of software tools.

  3. CE-MS fingerprinting of Laurencia complex algae (Rhodophyta).

    Science.gov (United States)

    Machín-Sánchez, María; Asensio-Ramos, María; Hernández-Borges, Javier; Gil-Rodríguez, María Candelaria

    2014-03-01

    The use of CE-ESI-MS has been considered as a new chemical strategy for the possible discernment of genera and species of the Laurencia complex. After selection of the CE-MS and extraction conditions, a total of 28 specimens of the complex, including different species of four genera (Laurencia, Laurenciella, Palisada, and Osmundea) collected from five intertidal locations on the island of Tenerife (Canary Islands, Spain), were analyzed. The CE-MS fingerprints showed that the technique can be used as a useful tool in such studies to assess similarities and differences between these taxa, and that it constitutes an important starting point for further studies in the field. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. The laws of software process a new model for the production and management of software

    CERN Document Server

    Armour, Phillip G

    2003-01-01

    The Nature of Software and The Laws of Software Process; A Brief History of Knowledge; The Characteristics of Knowledge Storage Media; The Nature of Software Development; The Laws of Software Process and the Five Orders of Ignorance; The Laws of Software Process; The First Law of Software Process; The Corollary to the First Law of Software Process; The Reflexive Creation of Systems and Processes; The Lemma of Eternal Lateness; The Second Law of Software Process; The Rule of Process Bifurcation; The Dual Hypotheses of Knowledge Discovery; Armour's Observation on Software Process; The Third Law of Software Process (also kn

  5. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    International Nuclear Information System (INIS)

    D. E. Shropshire; W. H. West

    2005-01-01

    The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies.

  6. Coevolution of variability models and related software artifacts

    DEFF Research Database (Denmark)

    Passos, Leonardo; Teixeira, Leopoldo; Dinztner, Nicolas

    2015-01-01

    models coevolve with other artifact types, we study a large and complex real-world variant-rich software system: the Linux kernel. Specifically, we extract variability-coevolution patterns capturing changes in the variability model of the Linux kernel with subsequent changes in Makefiles and C source...

  7. A Team Building Model for Software Engineering Courses Term Projects

    Science.gov (United States)

    Sahin, Yasar Guneri

    2011-01-01

    This paper proposes a new model for team building, which enables teachers to build coherent teams rapidly and fairly for the term projects of software engineering courses. Moreover, the model can also be used to build teams for any type of project, if the team member candidates are students, or if they are inexperienced on a certain subject. The…

  8. Invention software support by integrating function and mathematical modeling

    NARCIS (Netherlands)

    Chechurin, L.S.; Wits, Wessel Willems; Bakker, H.M.

    2015-01-01

    New idea generation is imperative for successful product innovation and technology development. This paper presents the development of a novel type of invention support software. The support tool integrates both function modeling and mathematical modeling, thereby enabling quantitative analyses on a

  9. Application of Process Modeling in a Software- Engineering Course

    Directory of Open Access Journals (Sweden)

    Gabriel Alberto García Mireles

    2001-11-01

    Full Text Available Coordination in a software development project is a critical issue in delivering a successful software product within the constraints of time, functionality and budget agreed upon with the customer. One of the strategies for approaching this problem consists in the use of process modeling to document, evaluate, and redesign the software development process. The appraisal, from a process perspective, of the projects done in the Engineering and Methodology course of a program given at the Ensenada Center of Scientific Research and Higher Education (CICESE) facilitated the identification of strengths and weaknesses in the development process used. This paper presents the evaluation of the practical portion of the course, the improvements made, and the preliminary results of using the process approach in the analysis phase of a software-development project.

  10. Chinshan living PRA model using NUPRA software package

    International Nuclear Information System (INIS)

    Cheng, S.-K.; Lin, T.-J.

    2004-01-01

    A living probabilistic risk assessment (PRA) model has been established for the Chinshan Nuclear Power Station (BWR-4, MARK-I) using the NUPRA software package. The core damage frequencies due to internal events, seismic events and typhoons are evaluated in this model. The methodology and results, considering the recent implementation of the 5th emergency diesel generator and the automatic boron injection function, are presented. The dominant sequences of this PRA model are discussed, and some possible applications of this living model are proposed. (author)

  11. A process improvement model for software verification and validation

    Science.gov (United States)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  12. Aspects of system modelling in Hardware/Software partitioning

    DEFF Research Database (Denmark)

    Knudsen, Peter Voigt; Madsen, Jan

    1996-01-01

    This paper addresses fundamental aspects of system modelling and partitioning algorithms in the area of Hardware/Software Codesign. Three basic system models for partitioning are presented and the consequences of partitioning according to each of these are analyzed. The analysis shows the importance of making a clear distinction between the model used for partitioning and the model used for evaluation. It also illustrates the importance of having a realistic hardware model such that hardware sharing can be taken into account. Finally, the importance of integrating scheduling and allocation...

  13. Software Piracy Detection Model Using Ant Colony Optimization Algorithm

    Science.gov (United States)

    Astiqah Omar, Nor; Zakuan, Zeti Zuryani Mohd; Saian, Rizauddin

    2017-06-01

    The Internet enables information to be accessible anytime and anywhere, creating an environment in which information can be easily copied. Easy access to the Internet is one of the factors that contribute to piracy in Malaysia as well as in the rest of the world. The 2013 BSA Global Software Survey (Compliance Gap) on software piracy found that 43 percent of the software installed on PCs around the world was not properly licensed, and the commercial value of the unlicensed installations worldwide was reported to be 62.7 billion dollars. Piracy can happen anywhere, including universities. Malaysia, like other countries, faces piracy committed by university students. Piracy in universities concerns acts of stealing intellectual property; it can take the form of software piracy, music piracy, movie piracy, and piracy of intellectual materials such as books, articles and journals, which puts the property of intellectual-property owners in jeopardy. This study developed a classification model for detecting software piracy. The model was built using a swarm intelligence algorithm, the Ant Colony Optimization algorithm. The training data were collected in a study conducted at Universiti Teknologi MARA (Perlis). Experimental results show that the model's detection accuracy is better than that of the J48 algorithm.

  14. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  15. Orthographic Software Modelling: A Novel Approach to View-Based Software Engineering

    Science.gov (United States)

    Atkinson, Colin

    The need to support multiple views of complex software architectures, each capturing a different aspect of the system under development, has been recognized for a long time. Even the very first object-oriented analysis/design methods such as the Booch method and OMT supported a number of different diagram types (e.g. structural, behavioral, operational) and subsequent methods such as Fusion, Kruchten's 4+1 views and the Rational Unified Process (RUP) have added many more views over time. Today's leading modeling languages such as the UML and SysML are also oriented towards supporting different views (i.e. diagram types), each able to portray a different facet of a system's architecture. More recently, so-called enterprise architecture frameworks such as the Zachman Framework, TOGAF and RM-ODP have become popular. These add a whole set of new non-functional views to the views typically emphasized in traditional software engineering environments.

  16. Seamless Method- and Model-based Software and Systems Engineering

    Science.gov (United States)

    Broy, Manfred

    Today engineering software intensive systems is still more or less handicraft or at most at the level of manufacturing. Many steps are done ad-hoc and not in a fully systematic way. Applied methods, if any, are not scientifically justified, not justified by empirical data and as a result carrying out large software projects still is an adventure. However, there is no reason why the development of software intensive systems cannot be done in the future with the same precision and scientific rigor as in established engineering disciplines. To do that, however, a number of scientific and engineering challenges have to be mastered. The first one aims at a deep understanding of the essentials of carrying out such projects, which includes appropriate models and effective management methods. What is needed is a portfolio of models and methods coming together with a comprehensive support by tools as well as deep insights into the obstacles of developing software intensive systems and a portfolio of established and proven techniques and methods with clear profiles and rules that indicate when which method is ready for application. In the following we argue that there is scientific evidence and enough research results so far to be confident that solid engineering of software intensive systems can be achieved in the future. However, yet quite a number of scientific research problems have to be solved.

  17. Freshwater Snails Of Niger-Cem, Nkalagu Eastern Nigeria ...

    African Journals Online (AJOL)

    The results of snail collections carried out in the freshwater habitats of Niger-Cem in Nkalagu from August to November 2002 are reported. Also reported are findings on the abundance, diversity and age structure of the snails. A total of 3491 pulmonate snails were collected, belonging to 3 families: Planorbidae (3133); ...

  18. A Business Maturity Model of Software Product Line Engineering

    OpenAIRE

    Ahmed, Faheem; Capretz, Luiz Fernando

    2015-01-01

    In the recent past, software product line engineering has become one of the most promising practices in the software industry, with the potential to substantially increase software development productivity. The software product line engineering approach spans the dimensions of business, architecture, software engineering process and organization. The increasing popularity of software product line engineering in the software industry necessitates a process maturity evaluation methodology. According

  19. Advanced quality prediction model for software architectural knowledge sharing

    NARCIS (Netherlands)

    Liang, Peng; Jansen, Anton; Avgeriou, Paris; Tang, Antony; Xu, Lai

    In the field of software architecture, a paradigm shift is occurring from describing the outcome of architecting process to describing the Architectural Knowledge (AK) created and used during architecting. Many AK models have been defined to represent domain concepts and their relationships, and

  20. Advances in Games Technology: Software, Models, and Intelligence

    Science.gov (United States)

    Prakash, Edmond; Brindle, Geoff; Jones, Kevin; Zhou, Suiping; Chaudhari, Narendra S.; Wong, Kok-Wai

    2009-01-01

    Games technology has undergone tremendous development. In this article, the authors report the rapid advancement that has been observed in the way games software is being developed, as well as in the development of games content using game engines. One area that has gained special attention is modeling the game environment such as terrain and…

  1. Software-engineering-based model for mitigating Repetitive Strain ...

    African Journals Online (AJOL)

    The incorporation of Information and Communication Technology (ICT) into virtually all facets of human endeavour has fostered the use of computers. This has induced Repetitive Stress Injury (RSI) in continuous and persistent computer users. This work proposes a software engineering model capable of enacting RSI force break ...

  2. Software Design Modelling with Functional Petri Nets | Bakpo ...

    African Journals Online (AJOL)

    In this paper, an equivalent functional Petri Net (FPN) model is developed for each of the three constructs of structured programs, and an FPN software prototype is proposed for the conventional programming construct, the if-then-else statement. The motivating idea is essentially to show that FPNs could be used as an alternative ...

  3. An Evaluation of ADLs on Modeling Patterns for Software Architecture

    NARCIS (Netherlands)

    Waqas Kamal, Ahmad; Avgeriou, Paris

    2007-01-01

    Architecture patterns provide solutions to recurring design problems at the architecture level. In order to model patterns during software architecture design, one may use a number of existing Architecture Description Languages (ADLs), including the UML, a generic language but also a de facto

  4. A CASE STUDY OF MODELING A TORUS IN DIFFERENT MODELING SOFTWARES

    Directory of Open Access Journals (Sweden)

    VELJKOVIĆ Milica

    2017-05-01

    Full Text Available Modeling of complex geometric shapes requires the use of appropriate software. This study analyzes the process of modeling with two different computer programs, AutoCAD and Rhinoceros. The aim is to demonstrate the similarities and differences between these programs when used for modeling a torus, a double-curved geometric surface. The two modeling processes are compared in order to investigate the potential of these programs in the modeling of an architectural structure comprising a shell of the torus. After a detailed comparative analysis, the essential characteristics and shortcomings of these programs are highlighted and used to recommend the more appropriate one.

  5. Modeling the geographical studies with GeoGebra-software

    Directory of Open Access Journals (Sweden)

    Ionica Soare

    2010-01-01

    Full Text Available Mathematical modeling in geography is one of the most important strategies for establishing the evolution and forecasting of geographical phenomena. Models must have a simplified structure that reflects the essential components, and must be selective, structured, suggestive and approximate reality. Models can be static or dynamic and can be developed in a theoretical, symbolic, conceptual or mental way, and then modeled mathematically. The present paper focuses on a virtual model that uses the GeoGebra software, free and available at www.geogebra.org, in order to establish new methods of geographical analysis in a dynamic, didactic way.

  6. Modeling and managing risk early in software development

    Science.gov (United States)

    Briand, Lionel C.; Thomas, William M.; Hetmanski, Christopher J.

    1993-01-01

    In order to improve the quality of the software development process, we need to be able to build empirical multivariate models based on data collectable early in the software process. These models need to be both useful for prediction and easy to interpret, so that remedial actions may be taken in order to control and optimize the development process. We present an automated modeling technique which can be used as an alternative to regression techniques. We show how it can be used to facilitate the identification and aid the interpretation of the significant trends which characterize 'high risk' components in several Ada systems. Finally, we evaluate the effectiveness of our technique based on a comparison with logistic regression based models.
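
    The paper's technique is pattern-based rather than regression-based; as a stand-in illustration of classifying 'high risk' components from early metrics, the sketch below trains a small decision tree on invented data (scikit-learn, not the authors' tool).

```python
# Illustrative stand-in (not the authors' technique): classify components as
# high risk from early, collectable metrics using a shallow decision tree.
# The metric values and labels are invented.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Early metrics per component: [imported units, exported units, size in SLOC]
X = np.array([
    [2, 1, 120], [15, 3, 900], [4, 2, 200], [20, 6, 1500],
    [1, 1, 80],  [12, 5, 700], [3, 1, 150], [18, 4, 1200],
])
y = np.array([0, 1, 0, 1, 0, 1, 0, 1])   # 1 = high-risk component

clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(clf, feature_names=["imports", "exports", "sloc"]))
print("predicted risk:", clf.predict([[10, 2, 650]]))
```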

  7. A software for parameter estimation in dynamic models

    Directory of Open Access Journals (Sweden)

    M. Yuceer

    2008-12-01

    Full Text Available A common problem in dynamic systems is to determine parameters in an equation used to represent experimental data. The goal is to determine the values of model parameters that provide the best fit to measured data, generally based on some type of least squares or maximum likelihood criterion. In the most general case, this requires the solution of a nonlinear and frequently non-convex optimization problem. Some of the available software packages lack generality, while others do not provide ease of use. A user-interactive parameter estimation tool was needed for identifying kinetic parameters. In this work we developed an integration-based optimization approach to provide a solution to such problems. For easy implementation of the technique, a parameter estimation software package (PARES) has been developed in the MATLAB environment. When tested with extensive example problems from the literature, the suggested approach is shown to provide good agreement between predicted and observed data with relatively low computing time and few iterations.
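
    The integration-based idea can be sketched in a few lines outside MATLAB: integrate the candidate model inside a least-squares loop and let the optimizer adjust the kinetic parameters. The first-order reaction model and the data below are invented for illustration; this is not PARES itself.

```python
# Hedged sketch of integration-based parameter estimation: fit the rate
# constant of a first-order reaction A -> B to synthetic concentration data
# by embedding an ODE solve in a least-squares objective. Not PARES itself.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def model(t, c, k):
    return [-k * c[0]]                     # dCA/dt = -k*CA

t_data = np.linspace(0.0, 10.0, 11)
c_true = 1.0 * np.exp(-0.35 * t_data)      # synthetic "measurements"
rng = np.random.default_rng(2)
c_meas = c_true + rng.normal(0.0, 0.01, t_data.size)

def residuals(params):
    k = params[0]
    sol = solve_ivp(model, (t_data[0], t_data[-1]), [1.0],
                    t_eval=t_data, args=(k,))
    return sol.y[0] - c_meas

fit = least_squares(residuals, x0=[0.1])
print(f"estimated rate constant k = {fit.x[0]:.3f}")
```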

  8. Evolutionary neural network modeling for software cumulative failure time prediction

    International Nuclear Information System (INIS)

    Tian Liang; Noore, Afzel

    2005-01-01

    An evolutionary neural network modeling approach for software cumulative failure time prediction, based on a multiple-delayed-input single-output architecture, is proposed. A genetic algorithm is used to globally optimize the number of delayed input neurons and the number of neurons in the hidden layer of the neural network architecture. A modification of the Levenberg-Marquardt algorithm with Bayesian regularization is used to improve the ability to predict software cumulative failure time. The performance of our proposed approach has been compared using real-time control and flight dynamic application data sets. Numerical results show that both the goodness-of-fit and the next-step-predictability of our proposed approach have greater accuracy in predicting software cumulative failure time compared to existing approaches.
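
    A compact stand-in for the idea of letting an evolutionary search choose the number of delayed inputs and hidden neurons: the sketch below runs a tiny mutation-based search over those two integers with an off-the-shelf MLP (scikit-learn) on synthetic failure-time data. The original work uses a custom network trained with Levenberg-Marquardt and Bayesian regularization, which is not reproduced here.

```python
# Stand-in sketch (not the original method): evolutionary-style search over
# (number of delayed inputs, hidden neurons) for predicting the next
# cumulative failure time from previous ones. Data are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
failure_times = np.cumsum(rng.exponential(5.0, 120))   # synthetic cumulative times

def make_dataset(series, n_lags):
    X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    y = series[n_lags:]
    return X, y

def fitness(n_lags, n_hidden):
    X, y = make_dataset(failure_times, n_lags)
    split = int(0.8 * len(X))
    net = MLPRegressor(hidden_layer_sizes=(int(n_hidden),), max_iter=2000,
                       random_state=0)
    net.fit(X[:split], y[:split])
    err = np.mean((net.predict(X[split:]) - y[split:]) ** 2)
    return -err                                         # higher is better

# Tiny genetic-style loop: mutate the best candidate each generation.
best = (3, 8)
best_fit = fitness(*best)
for _ in range(10):
    cand = (max(1, best[0] + int(rng.integers(-1, 2))),
            max(2, best[1] + int(rng.integers(-2, 3))))
    f = fitness(*cand)
    if f > best_fit:
        best, best_fit = cand, f

print("selected architecture (lags, hidden neurons):", best)
```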

  9. Model for Simulating a Spiral Software-Development Process

    Science.gov (United States)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code
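
    A toy discrete-event sketch of a spiral process, using the open-source SimPy package rather than the PATT tool described here; the phase durations and the number of iterations are invented inputs.

```python
# Toy discrete-event sketch of a spiral process using SimPy (not PATT).
# Each iteration runs risk assessment, requirements, design, coding, testing
# and evaluation with invented, randomized durations (in days).
import random
import simpy

PHASES = ["risk assessment", "requirements", "design",
          "coding", "testing", "evaluation"]

def spiral(env, iterations, log):
    for i in range(1, iterations + 1):
        for phase in PHASES:
            duration = random.uniform(2, 10)       # invented phase durations
            yield env.timeout(duration)
            log.append((i, phase, env.now))

random.seed(0)
env = simpy.Environment()
log = []
env.process(spiral(env, iterations=3, log=log))
env.run()

for iteration, phase, finished_at in log:
    print(f"iteration {iteration}: {phase:>15s} finished at day {finished_at:6.1f}")
print(f"total elapsed: {env.now:.1f} days")
```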

  10. File list: ALL.Bld.05.AllAg.CCRF-CEM [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available ALL.Bld.05.AllAg.CCRF-CEM hg19 All antigens Blood CCRF-CEM SRX728788,SRX728787,SRX7...X107286,SRX107295,SRX107305,SRX107290 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/ALL.Bld.05.AllAg.CCRF-CEM.bed ...

  11. File list: ALL.Bld.50.AllAg.CCRF-CEM [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available ALL.Bld.50.AllAg.CCRF-CEM hg19 All antigens Blood CCRF-CEM SRX728787,SRX728788,SRX1...X107295,SRX728791,SRX728785,SRX107296 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/ALL.Bld.50.AllAg.CCRF-CEM.bed ...

  12. File list: InP.Bld.05.AllAg.CCRF-CEM [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Bld.05.AllAg.CCRF-CEM hg19 Input control Blood CCRF-CEM SRX728791,SRX107305,SRX...107290 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Bld.05.AllAg.CCRF-CEM.bed ...

  13. File list: InP.Bld.10.AllAg.CCRF-CEM [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Bld.10.AllAg.CCRF-CEM hg19 Input control Blood CCRF-CEM SRX728791,SRX107305,SRX...107290 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Bld.10.AllAg.CCRF-CEM.bed ...

  14. File list: InP.Bld.50.AllAg.CCRF-CEM [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Bld.50.AllAg.CCRF-CEM hg19 Input control Blood CCRF-CEM SRX107305,SRX107290,SRX...728791 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Bld.50.AllAg.CCRF-CEM.bed ...

  15. File list: ALL.Bld.20.AllAg.CCRF-CEM [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available ALL.Bld.20.AllAg.CCRF-CEM hg19 All antigens Blood CCRF-CEM SRX728787,SRX728788,SRX7...X107296,SRX728791,SRX107290,SRX107295 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/ALL.Bld.20.AllAg.CCRF-CEM.bed ...

  16. File list: ALL.Bld.10.AllAg.CCRF-CEM [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available ALL.Bld.10.AllAg.CCRF-CEM hg19 All antigens Blood CCRF-CEM SRX728787,SRX728788,SRX7...X107286,SRX107305,SRX107295,SRX107290 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/ALL.Bld.10.AllAg.CCRF-CEM.bed ...

  17. QUALITY SERVICES EVALUATION MODEL BASED ON DEDICATED SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    ANDREEA CRISTINA IONICĂ

    2012-10-01

    Full Text Available In this paper we introduce a new model, called Service Quality (SQ), which combines the QFD and SERVQUAL methods. This model takes from the SERVQUAL method the five dimensions of requirements and three of the characteristics, and from the QFD method the application methodology. The originality of the SQ model consists in computing a global index that reflects how well the quality characteristics fulfil the customers' requirements. In order to prove the viability of the SQ model, a software tool was developed and applied to the evaluation of a health care services provider.

  18. Effective Team Support: From Modeling to Software Agents

    Science.gov (United States)

    Remington, Roger W. (Technical Monitor); John, Bonnie; Sycara, Katia

    2003-01-01

    The purpose of this research contract was to perform multidisciplinary research between CMU psychologists, computer scientists and engineers and NASA researchers to design a next-generation collaborative system to support a team of human experts and intelligent agents. To achieve robust performance enhancement of such a system, we had proposed to perform task and cognitive modeling to thoroughly understand the impact technology makes on the organization and on key individual personnel. Guided by cognitively-inspired requirements, we would then develop software agents that support the human team in decision making, information filtering, information distribution and integration to enhance team situational awareness. During the period covered by this final report, we made substantial progress in modeling infrastructure and task infrastructure. Work is continuing under a different contract to complete empirical data collection, cognitive modeling, and the building of software agents to support the team's task.

  19. ESPC Common Model Architecture Earth System Modeling Framework (ESMF) Software and Application Development

    Science.gov (United States)

    2015-09-30

    ESPC Common Model Architecture Earth System Modeling...Capability (NUOPC) was established between NOAA and Navy to develop a common software architecture for easy and efficient interoperability. The...model architecture and other software-related standards in this project. OBJECTIVES NUOPC proposes to accelerate improvement of our national

  20. Osseous reaction to implantation of two endodontic cements: Mineral trioxide aggregate (MTA) and calcium enriched mixture (CEM)

    Science.gov (United States)

    Rahimi, Saeed; Shahi, Shahriar; Kazemi, Ali; Asgary, Saeed; Eghbal, Mohammad J.; Mesgariabbasi, Mehran; Mohajeri, Daryoush

    2012-01-01

    Aim: The aim of the present in vivo study was to determine bone tissue reaction to calcium enriched mixture (CEM) and mineral trioxide aggregate (MTA) using a rat femur model. Study Design: Sixty-three rats were selected and randomly divided into three groups of 21 each [experimental groups (n=15), control (n=6)]. Implantation cavities were prepared in each femoral bone and randomly filled with the biomaterials only in the experimental groups. The animals in the three groups were sacrificed 1, 4, and 8 weeks postoperatively. Histologic evaluations comprising inflammation severity and new bone formation were blindly made on H&E-stained decalcified 6-µm sections. Results: At 1, 4, and 8 weeks after implantation, the number of inflammatory cells had decreased in the CEM, MTA and control groups, respectively, with no statistically significant differences. Conversely, new bone formation had increased in all the experimental and control groups, without statistically significant differences. Conclusion: The results suggest that the biocompatibility of MTA, as the gold standard, and of CEM cement, as a new endodontic biomaterial, are comparable. Key words: Endodontics, MTA, CEM, osseous reaction. PMID:22549692

  1. The Relationship of Personality Models and Development Tasks in Software Engineering

    OpenAIRE

    Wiesche, Manuel; Krcmar, Helmut

    2015-01-01

    Understanding the personality of software developers has been an ongoing topic in software engineering research. Software engineering researchers applied different theoretical models to understand software developers' personalities to better predict software developers' performance, orchestrate more effective and motivated teams, and identify the person that fits a certain job best. However, empirical results were found as contradicting, challenging validity, and missing guidance for IT perso...

  2. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
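
    To make the contrast with coarse-grained regression testing concrete, here is a minimal example of the kind of fine-grained unit test such a methodology would rely on; the numerical routine (a simple Magnus-type saturation vapour pressure approximation) and its reference values are illustrative stand-ins, not code from any particular climate model.

```python
# Illustrative fine-grained unit test for a numerical routine (hypothetical
# function and tolerances; not taken from any specific climate model).
import math
import unittest

def saturation_vapor_pressure(t_kelvin):
    """Magnus-type approximation of saturation vapour pressure over water, in Pa."""
    t_c = t_kelvin - 273.15
    return 610.94 * math.exp(17.625 * t_c / (t_c + 243.04))

class TestSaturationVaporPressure(unittest.TestCase):
    def test_reference_point(self):
        # At 0 degrees C the formula should return roughly 611 Pa.
        self.assertAlmostEqual(saturation_vapor_pressure(273.15), 610.94, delta=0.5)

    def test_monotonically_increasing(self):
        temps = [250.0, 270.0, 290.0, 310.0]
        values = [saturation_vapor_pressure(t) for t in temps]
        self.assertEqual(values, sorted(values))

if __name__ == "__main__":
    unittest.main()
```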

  3. Usage models in reliability assessment of software-based systems

    International Nuclear Information System (INIS)

    Haapanen, P.; Pulkkinen, U.; Korhonen, J.

    1997-04-01

    This volume in the OHA-project report series deals with the statistical reliability assessment of software-based systems on the basis of dynamic test results and qualitative evidence from the system design process. Other reports to be published later on in the OHA-project report series will handle the diversity requirements in safety-critical software-based systems, generation of test data from operational profiles and handling of programmable automation in plant PSA studies. In this report, the issues related to statistical testing, and especially automated test case generation, are considered. The goal is to find an efficient method for building usage models for the generation of a statistically significant set of test cases, and to gather practical experiences from this method by applying it in a case study. The scope of the study also includes the tool support for the method, as the models may grow quite large and complex. (32 refs., 30 figs.)
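
    A common way to realize such usage models is as a Markov chain over user states from which statistically representative test cases are sampled; the sketch below illustrates that general idea only, with invented states and transition probabilities that have nothing to do with the OHA project's actual models or tools.

```python
# Generic sketch: sampling test cases from a Markov-chain usage model.
# States and transition probabilities are invented placeholders.
import random

USAGE_MODEL = {
    "start":  [("login", 1.0)],
    "login":  [("browse", 0.7), ("admin", 0.3)],
    "browse": [("browse", 0.5), ("update", 0.3), ("logout", 0.2)],
    "admin":  [("update", 0.6), ("logout", 0.4)],
    "update": [("browse", 0.5), ("logout", 0.5)],
    "logout": [],  # terminal state
}

def generate_test_case(model, rng, start="start", max_len=50):
    """Random walk over the usage model; returns one sequence of user actions."""
    state, path = start, []
    while model[state] and len(path) < max_len:
        targets, weights = zip(*model[state])
        state = rng.choices(targets, weights=weights, k=1)[0]
        path.append(state)
    return path

if __name__ == "__main__":
    rng = random.Random(42)
    for i in range(3):
        print(f"test case {i + 1}: {' -> '.join(generate_test_case(USAGE_MODEL, rng))}")
```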

  4. Duplication Detection When Evolving Feature Models of Software Product Lines

    Directory of Open Access Journals (Sweden)

    Amal Khtira

    2015-10-01

    Full Text Available After the derivation of specific applications from a software product line, the applications keep evolving with respect to new customers' requirements. In general, evolutions in most industrial projects are expressed using natural language, because it is the easiest and the most flexible way for customers to express their needs. However, the use of this means of communication has shown its limits in detecting defects, such as inconsistency and duplication, when evolving the existing models of the software product line. The aim of this paper is to transform the natural language specifications of new evolutions into a more formal representation using natural language processing. Then, an algorithm is proposed to automatically detect duplication between these specifications and the existing product line feature models. In order to instantiate the proposed solution, a tool is developed to automate the two operations.
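
    As a simple stand-in for the kind of check such an algorithm performs, the sketch below flags a new natural-language specification as a potential duplicate of an existing feature when their token overlap (Jaccard similarity) exceeds a threshold. This is only an illustrative baseline, not the authors' NLP pipeline; the feature texts and the threshold are invented.

```python
# Illustrative baseline for duplication detection between a new requirement and
# existing feature-model entries, using token-level Jaccard similarity.
# Not the authors' algorithm; feature texts and threshold are invented.
import re

def tokens(text):
    return set(re.findall(r"[a-z]+", text.lower()))

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

EXISTING_FEATURES = [
    "send payment confirmation email to customer",
    "export monthly sales report as PDF",
    "two-factor authentication at login",
]

def find_duplicates(new_requirement, features, threshold=0.5):
    req = tokens(new_requirement)
    return [(feature, round(jaccard(req, tokens(feature)), 2))
            for feature in features if jaccard(req, tokens(feature)) >= threshold]

if __name__ == "__main__":
    spec = "The system shall email a payment confirmation to the customer"
    print(find_duplicates(spec, EXISTING_FEATURES))
```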

  5. Software sensors based on the grey-box modelling approach

    DEFF Research Database (Denmark)

    Carstensen, J.; Harremoës, P.; Strube, Rune

    1996-01-01

    In recent years the grey-box modelling approach has been applied to wastewater transportation and treatment. Grey-box models are characterized by the combination of deterministic and stochastic terms to form a model where all the parameters are statistically identifiable from the on-line measurements. With respect to the development of software sensors, the grey-box models possess two important features. Firstly, the on-line measurements can be filtered according to the grey-box model in order to remove noise deriving from the measuring equipment and controlling devices. Secondly, the grey-box models may contain terms which can be estimated on-line by use of the models and measurements. In this paper, it is demonstrated that many storage basins in sewer systems can be used as an on-line flow measurement provided that the basin is monitored on-line with a level transmitter and that a grey...
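
    The idea of a level-monitored basin acting as a flow measurement can be pictured with the toy sketch below: a two-state filter combines a deterministic mass balance with noisy on-line level readings to estimate the inflow. It is a generic Kalman-filter flavoured illustration under invented assumptions about basin geometry and noise levels, not the authors' grey-box model.

```python
# Generic sketch of a "software sensor": estimating basin inflow from noisy
# on-line level measurements with a simple two-state Kalman filter. Not the
# authors' grey-box model; geometry and noise levels are invented assumptions.
import numpy as np

def run_software_sensor(levels, q_out, area=100.0, dt=60.0):
    """levels: measured water levels [m]; q_out: known pumped outflow [m3/s]."""
    # State: [volume (m3), inflow (m3/s)], inflow modelled as a random walk.
    F = np.array([[1.0, dt], [0.0, 1.0]])      # volume integrates inflow
    B = np.array([-dt, 0.0])                   # known outflow leaves the basin
    H = np.array([[1.0 / area, 0.0]])          # level = volume / surface area
    Q = np.diag([1e-2, 1e-6])                  # process noise (assumed)
    R = np.array([[1e-4]])                     # level measurement noise (assumed)

    x = np.array([levels[0] * area, 0.0])
    P = np.eye(2)
    inflow_estimates = []
    for z, qo in zip(levels, q_out):
        # Predict with the deterministic mass balance, then correct with data.
        x = F @ x + B * qo
        P = F @ P @ F.T + Q
        y = np.array([z]) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        inflow_estimates.append(x[1])
    return inflow_estimates

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_inflow = 0.05                                   # m3/s
    t = np.arange(200)
    level = 1.0 + true_inflow * 60.0 * t / 100.0         # toy run with no outflow
    noisy = level + rng.normal(0.0, 0.01, t.size)
    q_hat = run_software_sensor(noisy, np.zeros(t.size))
    print(f"estimated inflow after 200 steps: {q_hat[-1]:.3f} m3/s (true {true_inflow})")
```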

  6. Evaluating and Mitigating the Impact of Complexity in Software Models

    Science.gov (United States)

    2015-12-01

    (Fragmentary text extract from CMU/SEI-2015-TR-013.) The recoverable fragments include list items such as "Year 2000 Repairs (date format expansion or masking)" and "Euro - currency conversion (adding the new unified currency to financial applications)", a reference to models generated from SCADE systems [ANSYS 2014a], an introduction to key concepts related to model complexity, and cited defect-cost figures: defects found during design = $500; defects found during coding and testing = $1,250; defects found after release = $5,000.

  7. HardCem : an innovative product and partnership

    Energy Technology Data Exchange (ETDEWEB)

    Joudrie, C. [Teck Cominco, Vancouver, BC (Canada)

    2007-07-01

    This paper described the multiple uses of Hard-Cem™, a concrete hardener developed for ready-mix and pre-cast concrete applications. The product is engineered to improve the durability of concrete for air and non-air entrained construction projects including buildings, roads, bridges, dams and recreational facilities such as skate parks. The development history of Hard-Cem was reviewed along with its market introduction by Teck Cominco Limited. Technical and operating partnerships were also outlined along with future marketing opportunities. The concrete additive is engineered to increase abrasion resistance. It is added to the concrete during the batching and mixing operations where it is evenly dispersed through the concrete matrix with other proprietary ingredients. The recommended dosages were described along with performance data. The product was shown to save time and money while offering more resistance to mechanical and water borne abrasion forces in both interior and exterior concrete applications. tabs., figs.

  8. A Software Development Simulation Model of a Spiral Process

    Science.gov (United States)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    There is a need for simulation models of software development processes other than the waterfall because processes such as spiral development are becoming more and more popular. The use of a spiral process can make the inherently difficult job of cost and schedule estimation even more challenging due to its evolutionary nature, but this allows for a more flexible process that can better meet customers' needs. This paper will present a discrete event simulation model of spiral development that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process.

  9. Software for biokinetic modeling of the radiopharmaceuticals used in PET

    International Nuclear Information System (INIS)

    Cordeiro, Leanderson P.; Vieira, Igor F.; Lima, Fernando R.A. de; Vieira, Jose W.

    2013-01-01

    This work presents the current state of software under development to estimate dose from PET images. The main biokinetic models used in PET are described, as well as the general features of the tool under development, whose current features allow quantitative analysis of compartmental models. Further, the tool allows displaying 2D PET images (in DICOM format) and quantifying the intensity map of regions of interest in counts of coincidence events per second. The next step is to add to the same tool the estimation of activity concentration per ROI and the estimation of dose from static and/or dynamic PET images.
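
    As a hint of what the quantitative analysis of compartmental models involves, the sketch below integrates a standard one-tissue compartment model, dCt/dt = K1*Cp(t) - k2*Ct(t), for an invented plasma input function. The rate constants and input curve are placeholder assumptions, and this is not the tool described in the record.

```python
# Illustrative one-tissue compartment model, a common biokinetic model in PET.
# Rate constants and the plasma input function are invented placeholders.
import numpy as np
from scipy.integrate import solve_ivp

K1, k2 = 0.1, 0.05          # ml/cm3/min and 1/min (assumed values)

def plasma_input(t):
    """Toy arterial input function: fast rise followed by exponential washout."""
    return 10.0 * t * np.exp(-t / 2.0)

def one_tissue_model(t, ct):
    # dCt/dt = K1 * Cp(t) - k2 * Ct(t)
    return K1 * plasma_input(t) - k2 * ct

t_eval = np.linspace(0.0, 60.0, 121)            # minutes
sol = solve_ivp(one_tissue_model, (0.0, 60.0), [0.0], t_eval=t_eval)
print(f"tissue activity at 30 min: {sol.y[0][60]:.2f} (arbitrary units)")
```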

  10. Estimating the Parameters of Software Reliability Growth Models Using the Grey Wolf Optimization Algorithm

    OpenAIRE

    Alaa F. Sheta; Amal Abdel-Raouf

    2016-01-01

    In this age of technology, building quality software is essential to competing in the business market. One of the major principles required for any quality and business software product for value fulfillment is reliability. Estimating software reliability early during the software development life cycle saves time and money as it prevents spending larger sums fixing a defective software product after deployment. The Software Reliability Growth Model (SRGM) can be used to predict the number of...
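
    For readers unfamiliar with SRGMs, the sketch below fits the classic Goel-Okumoto mean-value function m(t) = a(1 - exp(-b t)) to an invented series of cumulative defect counts using ordinary non-linear least squares. The paper itself estimates such parameters with the Grey Wolf Optimization metaheuristic instead, and the data here are purely illustrative.

```python
# Illustrative SRGM parameter estimation: Goel-Okumoto model fitted with
# non-linear least squares. The defect data are invented; the paper estimates
# such parameters with the Grey Wolf Optimization metaheuristic instead.
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    """Expected cumulative number of defects detected by time t."""
    return a * (1.0 - np.exp(-b * t))

# Hypothetical weekly cumulative defect counts.
weeks = np.arange(1, 13, dtype=float)
defects = np.array([12, 23, 32, 39, 45, 50, 54, 57, 60, 62, 63, 64], dtype=float)

(a_hat, b_hat), _ = curve_fit(goel_okumoto, weeks, defects, p0=(80.0, 0.1))
print(f"estimated total defects a = {a_hat:.1f}, detection rate b = {b_hat:.3f}")
print(f"defects still expected in the field: {a_hat - defects[-1]:.1f}")
```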

  11. Development of virtual hands using animation software and graphical modelling

    International Nuclear Information System (INIS)

    Oliveira, Erick da S.; Junior, Alberico B. de C.

    2016-01-01

    Numerical dosimetry uses virtual anthropomorphic simulators to represent the human being in a computational framework and thus assess the risks associated with exposure to a radioactive source. With the development of computer animation software, the development of these simulators has been facilitated, using only knowledge of human anatomy to prepare various types of simulators (man, woman, child and baby) in various positions (sitting, standing, running), or parts thereof (head, trunk and limbs). These simulators are constructed by manipulation loops and, owing to the versatility of the method, one can create various irradiation geometries that were not possible before. In this work, we have built an exposure scenario involving the manipulation of a radiopharmaceutical (radioactive material), using animation software, graphical modelling and an anatomical database. (author)

  12. Opensource Software for MLR-Modelling of Solar Collectors

    DEFF Research Database (Denmark)

    Bacher, Peder; Perers, Bengt

    2011-01-01

    A first research version is now in operation of a software package for multiple linear regression (MLR) modeling and analysis of solar collectors according to ideas originating all the way from Walletun et al. (1986) and Perers (1987 and 1993). The tool has been implemented in the free and open source program R http://www.r-project.org/. Applications of the software package include: visual validation, resampling and conversion of data, collector performance testing analysis according to the European Standard EN 12975 (Fischer et al., 2004), statistical validation of results... area flat plate collector with selective absorber and teflon anti convection layer. The package is intended to enable fast and reliable validation of data, and provide a united implementation for MLR testing of solar collectors. This will furthermore make it simple to replicate the calculations...
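
    As a rough illustration of what MLR modelling of a collector involves, the sketch below fits the common steady-state efficiency form used in EN 12975 testing, eta = eta0 - a1*(Tm - Ta)/G - a2*(Tm - Ta)^2/G, to synthetic data with ordinary least squares. It is not the R package described in the record, and all numbers are invented placeholders.

```python
# Illustrative multiple linear regression of a steady-state collector model
# (eta = eta0 - a1*dT/G - a2*dT^2/G). Synthetic data; not the R package above.
import numpy as np

rng = np.random.default_rng(7)
G = rng.uniform(600.0, 1000.0, 50)        # irradiance, W/m2
dT = rng.uniform(0.0, 60.0, 50)           # Tm - Ta, K
eta_true = 0.78 - 3.5 * dT / G - 0.015 * dT**2 / G
eta_meas = eta_true + rng.normal(0.0, 0.005, 50)

# Design matrix for ordinary least squares: [1, dT/G, dT^2/G]
X = np.column_stack([np.ones_like(G), dT / G, dT**2 / G])
coef, *_ = np.linalg.lstsq(X, eta_meas, rcond=None)
eta0, a1, a2 = coef[0], -coef[1], -coef[2]
print(f"eta0 = {eta0:.3f}, a1 = {a1:.2f} W/(m2 K), a2 = {a2:.4f} W/(m2 K2)")
```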

  13. COMPUTATIONAL MODELLING OF BUFFETING EFFECTS USING OPENFOAM SOFTWARE PACKAGE

    Directory of Open Access Journals (Sweden)

    V. T. Kalugin

    2015-01-01

    Full Text Available In this paper, the preliminary results of computational modeling of an aircraft with the airbrake deployed are presented. The calculations were performed with the OpenFOAM software package. The results outlined are a part of a research project to optimise aircraft performance using a perforated airbrake. Within this stage of the project, the OpenFOAM software package with a hybrid RANS-LES approach was tested for a given configuration of the aircraft and airbrake, and the results were then compared with the test data. For the worst case the amplitude of the peak force acting on the tail fin can be up to 6 times higher than the average value without the airbrake deployed. To reduce unsteady loads acting on the tail fin, perforation of the airbrake was proposed.

  14. Evaluating Sustainability Models for Interoperability through Brokering Software

    Science.gov (United States)

    Pearlman, Jay; Benedict, Karl; Best, Mairi; Fyfe, Sue; Jacobs, Cliff; Michener, William; Nativi, Stefano; Powers, Lindsay; Turner, Andrew

    2016-04-01

    Sustainability of software and research support systems is an element of innovation that is not often discussed. Yet, sustainment is essential if we expect research communities to make the time investment to learn and adopt new technologies. As the Research Data Alliance (RDA) is developing new approaches to interoperability, the question of uptake and sustainability is important. Brokering software sustainability is one of the areas that is being addressed in RDA. The Business Models Team of the Research Data Alliance Brokering Governance Working Group examined several support models proposed to promote the long-term sustainability of brokering middleware. The business model analysis includes examination of funding source, implementation frameworks and challenges, and policy and legal considerations. Results of this comprehensive analysis highlight advantages and disadvantages of the various models with respect to the specific requirements for brokering services. We offer recommendations based on the outcomes of this analysis that suggest that hybrid funding models present the most likely avenue to long term sustainability.

  15. Assessing climate model software quality: a defect density analysis of three models

    Directory of Open Access Journals (Sweden)

    J. Pipitone

    2012-08-01

    Full Text Available A climate model is an executable theory of the climate; the model encapsulates climatological theories in software so that they can be simulated and their implications investigated. Thus, in order to trust a climate model, one must trust that the software it is built from is built correctly. Our study explores the nature of software quality in the context of climate modelling. We performed an analysis of defect reports and defect fixes in several versions of leading global climate models by collecting defect data from bug tracking systems and version control repository comments. We found that the climate models all have very low defect densities compared to well-known, similarly sized open-source projects. We discuss the implications of our findings for the assessment of climate model software trustworthiness.

  16. MEGACELL: A nanocrystal model construction software for HRTEM multislice simulation

    International Nuclear Information System (INIS)

    Stroppa, Daniel G.; Righetto, Ricardo D.; Montoro, Luciano A.; Ramirez, Antonio J.

    2011-01-01

    Image simulation is of invaluable importance for the accurate analysis of High Resolution Transmission Electron Microscope (HRTEM) results, especially due to its non-linear image formation mechanism. Because the as-obtained images cannot be interpreted in a straightforward fashion, the retrieval of both qualitative and quantitative information from HRTEM micrographs requires an iterative process including the simulation of a nanocrystal model and its comparison with experimental images. However, most of the available image simulation software requires atom-by-atom coordinates as input for the calculations, which can be prohibitive for large finite crystals and/or low-symmetry systems and zone axis orientations. This paper presents an open source citation-ware tool named MEGACELL, which was developed to assist in the construction of nanocrystal models. It allows the user to build nanocrystals with virtually any convex polyhedral geometry and to retrieve their atomic positions either as a plain text file or as an output compatible with the EMS (Electron Microscopy Software) input protocol. In addition to the description of this tool's features, some construction examples and its application to scientific studies are presented. These studies show MEGACELL to be a handy tool, which allows an easier construction of complex nanocrystal models and improves the quantitative information extraction from HRTEM images. -- Highlights: → A software tool to support the HRTEM image simulation of nanocrystals in actual size. → MEGACELL allows the construction of complex nanocrystal models for multislice image simulation. → Some examples of improved nanocrystalline system characterization are presented, including the analysis of 3D morphology and growth behavior.
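
    To illustrate the underlying idea of building a faceted nanocrystal model instead of typing atom-by-atom coordinates, the sketch below generates an FCC lattice and keeps only the atoms inside a convex polyhedron defined as an intersection of half-spaces. It is a simplified stand-in, not MEGACELL's actual algorithm, and the lattice parameter and facets are invented examples.

```python
# Simplified illustration of faceted-nanocrystal construction: generate an FCC
# lattice and keep only atoms inside a convex polyhedron (intersection of
# half-spaces n.x <= d). Not MEGACELL's algorithm; parameters are examples.
import numpy as np

A0 = 0.409  # lattice parameter in nm (roughly that of gold, used as an example)
FCC_BASIS = np.array([[0.0, 0.0, 0.0], [0.5, 0.5, 0.0],
                      [0.5, 0.0, 0.5], [0.0, 0.5, 0.5]])

def build_nanocrystal(facets, n_cells=12):
    """facets: list of (unit_normal, distance_nm); returns atom coordinates in nm."""
    cells = np.arange(-n_cells, n_cells + 1)
    i, j, k = np.meshgrid(cells, cells, cells, indexing="ij")
    origins = np.stack([i, j, k], axis=-1).reshape(-1, 3)
    atoms = (origins[:, None, :] + FCC_BASIS[None, :, :]).reshape(-1, 3) * A0
    keep = np.ones(len(atoms), dtype=bool)
    for normal, dist in facets:
        keep &= atoms @ np.asarray(normal, dtype=float) <= dist
    return atoms[keep]

if __name__ == "__main__":
    # A crude octahedron bounded by {111}-type planes 2.5 nm from the centre.
    normals = [np.array(s) / np.sqrt(3.0)
               for s in [(1, 1, 1), (-1, 1, 1), (1, -1, 1), (1, 1, -1),
                         (-1, -1, 1), (-1, 1, -1), (1, -1, -1), (-1, -1, -1)]]
    crystal = build_nanocrystal([(n, 2.5) for n in normals])
    print(f"model contains {len(crystal)} atoms")
    np.savetxt("nanocrystal_xyz.txt", crystal, fmt="%.4f")   # plain text output
```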

  17. Using software metrics and software reliability models to attain acceptable quality software for flight and ground support software for avionic systems

    Science.gov (United States)

    Lawrence, Stella

    1992-01-01

    This paper is concerned with methods of measuring and developing quality software. Reliable flight and ground support software is a highly important factor in the successful operation of the space shuttle program. Reliability is probably the most important of the characteristics inherent in the concept of 'software quality'. It is the probability of failure free operation of a computer program for a specified time and environment.

  18. Integrating Behaviour in Software Models: An Event Coordination Notation

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2011-01-01

    ...the Event Coordination Notation (ECNO) that deals with this problem. We present the main concepts and rationales behind this notation and discuss a prototype and run-time environment that executes these models, and provides an API so that other parts of the software can be easily integrated. The core concepts of the ECNO seem to be stabilizing now, and the prototypic implementation of ECNO and its runtime environment show that the concepts of ECNO work. Still, there are some design issues and open questions that we discuss in this paper...

  19. Analysis of Schedule Determination in Software Program Development and Software Development Estimation Models

    Science.gov (United States)

    1988-09-01

    (Fragmentary text extract: table-of-contents and data-table fragments only, including the headings "Software Development Standards", "Use of Management Principles", "Software Programmer Ability", "Program Flow and Test Case Analyzer", "File Manager", and MITRE project data describing factors for individual projects.)

  20. Model-based testing of software product lines

    OpenAIRE

    Metzger, Andreas; Pohl, Klaus; Reis, Sacha; Reuys, A

    2006-01-01

    Due to the rising demand for individualised software products and software-intensive systems (e.g., mobile phone or automotive software), organizations are faced with the challenge to provide a diversity of software systems at low costs, in short time, and with high quality. Software product line engineering is the approach for tackling this challenge and has proven its effectiveness in numerous industrial success stories, including Siemens, ABB, Boeing, Hewlett-Packard, Phili...

  1. An architectural model for software testing lesson learned systems

    OpenAIRE

    Pazos Sierra, Juan; Andrade, Javier; Ares Casal, Juan M.; Martínez Rey, María Aurora; Rodríguez, Santiago; Romera, Julio; Suárez, Sonia

    2013-01-01

    Software testing is a key aspect of software reliability and quality assurance in a context where software development constantly has to overcome mammoth challenges in a continuously changing environment. One of the characteristics of software testing is that it has a large intellectual capital component and can thus benefit from the use of the experience gained from past projects. Software testing can, then, potentially benefit from solutions provided by the knowledge management discipline. ...

  2. IMPROVED SOFTWARE QUALITY ASSURANCE TECHNIQUES USING SAFE GROWTH MODEL

    OpenAIRE

    M.Sangeetha; K.M.SenthilKumar; Dr.C.Arumugam; K. Akila

    2010-01-01

    Our lives are governed by large, complex systems with increasingly complex software, and the safety, security, and reliability of these systems has become a major concern. As the software in today's systems grows larger, it has more defects, and these defects adversely affect the safety, security, and reliability of the systems. Software engineering is the application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software. Software divi...

  3. Methods to model-check parallel systems software

    International Nuclear Information System (INIS)

    Matlin, O. S.; McCune, W.; Lusk, E.

    2003-01-01

    We report on an effort to develop methodologies for formal verification of parts of the Multi-Purpose Daemon (MPD) parallel process management system. MPD is a distributed collection of communicating processes. While the individual components of the collection execute simple algorithms, their interaction leads to unexpected errors that are difficult to uncover by conventional means. Two verification approaches are discussed here: the standard model checking approach using the software model checker SPIN and the nonstandard use of a general-purpose first-order resolution-style theorem prover OTTER to conduct the traditional state space exploration. We compare modeling methodology and analyze performance and scalability of the two methods with respect to verification of MPD
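
    At its core, the traditional state-space exploration mentioned here is a reachability search over the combined states of the communicating processes. The sketch below shows that idea generically for a toy two-process mutual-exclusion model; the model is invented and reflects none of the specifics of MPD, SPIN or OTTER.

```python
# Generic explicit-state exploration (breadth-first search over reachable
# states) checking a safety property for a toy two-process mutex model.
# The model is invented; it only illustrates the idea of state-space search.
from collections import deque

def with_pc(state, p, new_pc, holder):
    pcs = list(state[:2])
    pcs[p] = new_pc
    return (pcs[0], pcs[1], holder)

def successors(state):
    """state = (pc0, pc1, lock_holder); pc in {'idle', 'waiting', 'critical'}."""
    holder = state[2]
    for p in (0, 1):
        pc = state[p]
        if pc == "idle":
            yield with_pc(state, p, "waiting", holder)
        elif pc == "waiting" and holder is None:
            yield with_pc(state, p, "critical", p)
        elif pc == "critical":
            yield with_pc(state, p, "idle", None)

def check_mutual_exclusion(initial):
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if state[0] == "critical" and state[1] == "critical":
            return False, state                  # safety violation found
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True, len(seen)

if __name__ == "__main__":
    ok, info = check_mutual_exclusion(("idle", "idle", None))
    print(f"mutual exclusion holds; {info} reachable states explored" if ok
          else f"violation found in state {info}")
```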

  4. A novel breast software phantom for biomechanical modeling of elastography.

    Science.gov (United States)

    Bhatti, Syeda Naema; Sridhar-Keralapura, Mallika

    2012-04-01

    In developing breast imaging technologies, testing is done with phantoms. Physical phantoms are normally used but their size, shape, composition, and detail cannot be modified readily. These difficulties can be avoided by creating a software breast phantom. Researchers have created software breast phantoms using geometric and/or mathematical methods for applications like image fusion. The authors report a 3D software breast phantom that was built using a mechanical design tool, to investigate the biomechanics of elastography using finite element modeling (FEM). The authors propose this phantom as an intermediate assessment tool for elastography simulation; for use after testing with commonly used phantoms and before clinical testing. The authors design the phantom to be flexible in both, the breast geometry and biomechanical parameters, to make it a useful tool for elastography simulation. The authors develop the 3D software phantom using a mechanical design tool based on illustrations of normal breast anatomy. The software phantom does not use geometric primitives or imaging data. The authors discuss how to create this phantom and how to modify it. The authors demonstrate a typical elastography experiment of applying a static stress to the top surface of the breast just above a simulated tumor and calculate normal strains in 3D and in 2D with plane strain approximations with linear solvers. In particular, they investigate contrast transfer efficiency (CTE) by designing a parametric study based on location, shape, and stiffness of simulated tumors. The authors also compare their findings to a commonly used elastography phantom. The 3D breast software phantom is flexible in shape, size, and location of tumors, glandular to fatty content, and the ductal structure. Residual modulus, maps, and profiles, served as a guide to optimize meshing of this geometrically nonlinear phantom for biomechanical modeling of elastography. At best, low residues (around 1-5 KPa) were

  5. BioSTAR, a New Biomass and Yield Modeling Software

    Science.gov (United States)

    Kappas, M.; Degener, J.; Bauboeck, R.

    2013-12-01

    BioSTAR (Biomass Simulation Tool for Agricultural Recourses) is a new crop model which has been developed at the University of Göttingen for the assessment of agricultural biomass potentials in Lower Saxony, Germany. Lower Saxony is a major agricultural producer in Germany and in the EU, and biogas facilities which either use agricultural crops or manure or both have seen a strong boom in the last decade. To be able to model the potentials of these agricultural bioenergy crops was the objective of developing the BioSTAR model. BioSTAR is kept simple enough to be usable even for non-scientific users, e.g. staff in planning offices or farmers. The software of the model is written in Java and uses a Microsoft Access database connection to read its input data and write its output data. In this sense the software architecture is something entirely new as far as existing crop models are concerned. The database connection enables very fast editing of the various data sources which are needed to run a crop simulation and fosters the organization of this data. Due to the software setup, the amount of individual sites which can be processed with a few clicks is only limited by the maximum size of an Access database (2 GB) and thus allows datasets of 105 sites or more to be stored and processed. Data can easily be copied or imported from Excel. Capabilities of the crop model are: simulation of single or multiple year crop growth with total biomass production, evapotranspiration, soil water budget of a 16 layered soil profile and, nitrogen budget. The original growth engine of the model was carbon based (Azam-Ali, et al., 1994), but a radiation use efficiency and two transpiration based growth engines were added at a later point. Before each simulation run, the user can choose between these four growth engines and four different ET0-methods, or use an ensemble of them. Up to date (07/2013), the model has been calibrated for several winter and spring cereals, canola, maize

  6. Mathematical model and software for control of commissioning blast furnace

    Science.gov (United States)

    Spirin, N. A.; Onorin, O. P.; Shchipanov, K. A.; Lavrov, V. V.

    2016-09-01

    Blowing-in is a starting period of blast furnace operation after construction or major repair. The current approximation methods of blowing-in burden analysis are based on blowing-in practice of previously commissioned blast furnaces. This area is theoretically underexplored; there are no common scientifically based methods for selection of the burden composition and blast parameters. The purpose of this paper is the development and scientific substantiation of methods for selection of the burden composition and blast parameters in the blast furnace during the blowing-in period. Research methods are based on the physical regularities of the main processes running in the blast furnace, system analysis, and application of modern principles for development and construction of mathematical models, algorithms and software designed for automated control of complex production processes in metallurgy. As a consequence of the research made by the authors, the following results have been achieved: 1. A set of mathematical models for analysis of burden arrangement throughout the height of the blast furnace and for selection of optimal blast and gas dynamic parameters has been developed. 2. General principles for selection of the blowing-in burden composition and blast and gas dynamic parameters have been set up. 3. The software for the engineering and process staff of the blast furnace has been developed and introduced in the industry.

  7. Software enhancements to the IVSEM model of the CTBTO IMS.

    Energy Technology Data Exchange (ETDEWEB)

    Damico, Joseph P.

    2011-03-01

    Sandia National Laboratories (SNL) developed the Integrated Verification System Evaluation Model (IVSEM) to estimate the performance of the International Monitoring System (IMS) operated by the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO). IVSEM was developed in several phases between 1995 and 2000. The model was developed in FORTRAN with an IDL-based user interface and was compiled for Windows and UNIX operating systems. Continuing interest in this analysis capability, coupled with numerous advances in desktop computer hardware and software since IVSEM was written, enabled significant improvements to IVSEM run-time performance and data analysis capabilities. These improvements were implemented externally without modifying the FORTRAN executables, which had been previously verified. This paper describes the parallelization approach developed to significantly reduce IVSEM run-times and the new test setup and analysis tools developed to facilitate better IVSEM operation.

  8. A SUPPLIER SELECTION MODEL FOR SOFTWARE DEVELOPMENT OUTSOURCING

    Directory of Open Access Journals (Sweden)

    Hancu Lucian-Viorel

    2010-12-01

    Full Text Available This paper presents a multi-criteria decision making model used for supplier selection for software development outsourcing on e-marketplaces. This model can be used in auctions. The supplier selection process has become complex and difficult over the last twenty years, since the Internet plays an important role in business management. Companies have to concentrate their efforts on their core activities, and the other activities should be realized by outsourcing. They can achieve significant cost reduction by using e-marketplaces in their purchase process and by using decision support systems on supplier selection. Many approaches for the supplier evaluation and selection process have been proposed in the literature. The performance of potential suppliers is evaluated using multi-criteria decision making methods rather than considering a single factor, cost.
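
    The sketch below illustrates the simplest form of such a multi-criteria evaluation, a weighted sum over normalized criteria; the criteria, weights and supplier figures are invented, and the paper's actual model may well use a different MCDM method.

```python
# Minimal multi-criteria supplier scoring (weighted sum over normalized
# criteria). Criteria, weights and supplier figures are invented placeholders.

# criterion: (weight, higher_is_better)
CRITERIA = {"cost": (0.35, False), "quality": (0.30, True),
            "delivery_days": (0.20, False), "experience_years": (0.15, True)}

SUPPLIERS = {
    "Supplier A": {"cost": 120_000, "quality": 8.5, "delivery_days": 90, "experience_years": 12},
    "Supplier B": {"cost": 95_000, "quality": 7.0, "delivery_days": 120, "experience_years": 6},
    "Supplier C": {"cost": 110_000, "quality": 9.0, "delivery_days": 75, "experience_years": 9},
}

def rank_suppliers(suppliers, criteria):
    scores = {}
    for name, values in suppliers.items():
        total = 0.0
        for crit, (weight, higher_better) in criteria.items():
            col = [s[crit] for s in suppliers.values()]
            lo, hi = min(col), max(col)
            norm = (values[crit] - lo) / (hi - lo)      # scale to [0, 1]
            total += weight * (norm if higher_better else 1.0 - norm)
        scores[name] = total
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    for name, score in rank_suppliers(SUPPLIERS, CRITERIA):
        print(f"{name}: {score:.3f}")
```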

  9. Evolution of the 'Trick' Dynamic Software Executive and Model Libraries for Reusable Flight Software, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — In response to a need for cost-effective small satellite missions, Odyssey Space Research is proposing the development of a common flight software executive and a...

  10. Evolution of the 'Trick' Dynamic Software Executive and Model Libraries for Reusable Flight Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In response to a need for cost-effective small satellite missions, Odyssey Space Research is proposing the development of a common flight software executive and a...

  11. An investigation of modelling and design for software service applications.

    Science.gov (United States)

    Anjum, Maria; Budgen, David

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the 'design model'. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model.

  12. Increasing the reliability of ecological models using modern software engineering techniques

    Science.gov (United States)

    Robert M. Scheller; Brian R. Sturtevant; Eric J. Gustafson; Brendan C. Ward; David J. Mladenoff

    2009-01-01

    Modern software development techniques are largely unknown to ecologists. Typically, ecological models and other software tools are developed for limited research purposes, and additional capabilities are added later, usually in an ad hoc manner. Modern software engineering techniques can substantially increase scientific rigor and confidence in ecological models and...

  13. TriBITS lifecycle model. Version 1.0, a lean/agile software lifecycle model for research-based computational science and engineering and applied mathematical software.

    Energy Technology Data Exchange (ETDEWEB)

    Willenbring, James M.; Bartlett, Roscoe Ainsworth (Oak Ridge National Laboratory, Oak Ridge, TN); Heroux, Michael Allen

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  14. Procedure for Application of Software Reliability Growth Models to NPP PSA

    International Nuclear Information System (INIS)

    Son, Han Seong; Kang, Hyun Gook; Chang, Seung Cheol

    2009-01-01

    As the use of software increases at nuclear power plants (NPPs), the necessity for including software reliability and/or safety into the NPP Probabilistic Safety Assessment (PSA) rises. This work proposes an application procedure of software reliability growth models (RGMs), which are most widely used to quantify software reliability, to NPP PSA. Through the proposed procedure, it can be determined if a software reliability growth model can be applied to the NPP PSA before its real application. The procedure proposed in this work is expected to be very helpful for incorporating software into NPP PSA

  15. Software development infrastructure for the HYBRID modeling and simulation project

    Energy Technology Data Exchange (ETDEWEB)

    Epiney, Aaron S. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kinoshita, Robert A. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kim, Jong Suk [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Greenwood, M. Scott [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    One of the goals of the HYBRID modeling and simulation project is to assess the economic viability of hybrid systems in a market that contains renewable energy sources like wind. The idea is that it is possible for the nuclear plant to sell non-electric energy cushions, which absorb (at least partially) the volatility introduced by the renewable energy sources. This system is currently modeled in the Modelica programming language. To assess the economics of the system, an optimization procedure is trying to find the minimal cost of electricity production. The RAVEN code is used as a driver for the whole problem. It is assumed that at this stage, the HYBRID modeling and simulation framework can be classified as non-safety “research and development” software. The associated quality level is Quality Level 3 software. This imposes low requirements on quality control, testing and documentation. The quality level could change as the application development continues. Despite the low quality requirement level, a workflow for the HYBRID developers has been defined that includes a coding standard and some documentation and testing requirements. The repository performs automated unit testing of contributed models. The automated testing is achieved via an open-source Python script called BuildingsPy from Lawrence Berkeley National Lab. BuildingsPy runs Modelica simulation tests using Dymola in an automated manner and generates and runs unit tests from Modelica scripts written by developers. In order to assure effective communication between the different national laboratories, a biweekly videoconference has been set up, where developers can report their progress and issues. In addition, periodic face-to-face meetings are organized, intended to discuss high-level strategy decisions with management. A second means of communication is the developer email list. This is a list to which everybody can send emails that will be received by the collective of the developers and managers.

  16. Exponentiated Weibull distribution approach based inflection S-shaped software reliability growth model

    Directory of Open Access Journals (Sweden)

    B.B. Sagar

    2016-09-01

    Full Text Available The aim of this paper was to estimate the number of defects in software and remove them successfully. The paper incorporates a Weibull distribution approach along with an inflection S-shaped Software Reliability Growth Model (SRGM); in this combination a two-parameter Weibull distribution methodology is used. The Relative Prediction Error (RPE) is calculated as the validity criterion of the developed model. Experimental results on actual data from five data sets are compared with two other existing models and show that the proposed software reliability growth model gives better estimates for defect removal. The paper thus presents a software reliability growth model that includes features of both the Weibull distribution and the inflection S-shaped SRGM to estimate the defects of a software system, and provides help to researchers and the software industry in developing highly reliable software products.
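
    To make the two ingredients concrete, the sketch below writes down a commonly used inflection S-shaped mean-value function and a two-parameter Weibull-type one, and computes a relative prediction error against an assumed observed defect count. The parameter values are placeholders and this is not the paper's fitted model.

```python
# Illustrative mean-value functions for the two SRGM ingredients named above,
# plus a relative prediction error (RPE) check. Parameter values are invented;
# this is not the paper's fitted model.
import math

def inflection_s_shaped(t, a, b, beta):
    """Inflection S-shaped SRGM: expected defects detected by time t."""
    return a * (1.0 - math.exp(-b * t)) / (1.0 + beta * math.exp(-b * t))

def weibull_srgm(t, a, b, c):
    """Weibull-type SRGM (generalised exponential form)."""
    return a * (1.0 - math.exp(-b * t ** c))

# Invented example: predict cumulative defects at week 20 and compare with an
# assumed observed count to get the relative prediction error.
observed_at_20 = 118.0
predictions = {
    "inflection S-shaped": inflection_s_shaped(20.0, a=130.0, b=0.25, beta=2.0),
    "Weibull-type": weibull_srgm(20.0, a=130.0, b=0.02, c=1.5),
}
for name, pred in predictions.items():
    rpe = (pred - observed_at_20) / observed_at_20
    print(f"{name}: predicted {pred:.1f} defects, RPE = {rpe:+.2%}")
```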

  17. A Classification Methodology and Retrieval Model to Support Software Reuse

    Science.gov (United States)

    1988-01-01

    (Fragmentary text extract.) The recoverable portion illustrates a hierarchical classification whose subdivisions are marked by arabic numerals, e.g. "L Medicine, 2 Digestive system, 27 Large intestine, 27219 Vermiform appendix"; the remainder consists of table-of-contents residue (e.g. "6.3.2.5 Retrieval Based Upon Reuse Metrics", "Appendix 1 - Software Component Schedule 1", "Appendix 2 - Software Component Schedule 2") and list-of-figures fragments.

  18. Software development for modeling positrons emission tomograph scanners

    International Nuclear Information System (INIS)

    Vieira, Igor Fagner

    2013-01-01

    The Geant4 Application for Tomographic Emission (GATE) is an internationally recognized platform used to develop computational exposure models (CME) in the context of Nuclear Medicine, although there are currently also dedicated modules for applications in Radiotherapy and Computed Tomography (CT). GATE uses Monte Carlo (MC) methods and has a scripting language of its own. The writing of scripts for the simulation of a PET scanner in GATE involves a number of interrelated steps, and the accuracy of the simulation depends on the correct setup of the geometries involved, since the physical processes depend on them, as well as on the modeling of the electronic detectors in the Digitizer module, for example. The manual implementation of this setup can be a source of errors, especially for users without experience in the field of simulations or without any previous knowledge of a programming language, and also due to the fact that the modeling process in GATE still remains bound to LINUX/UNIX-based systems, an environment familiar to only a few. This becomes an obstacle for beginners and prevents the use of GATE by a larger number of users interested in optimizing their experiments and/or clinical protocols through a more accessible, fast and friendly application. The objective of this work is therefore to develop user-friendly software for the modeling of positron emission tomography, called GUIGATE (Graphical User Interface for GATE), with specific modules dedicated to quality control in PET scanners. The results exhibit the features available in this first version of GUIGATE, presented in a set of windows that allow users to create their input files, run and display the model in real time, and analyze its output file in a single environment, thus allowing intuitive access to the entire architecture of the GATE simulation and to CERN's data analyzer, ROOT. (author)

  19. Studies in Software Cost Model Behavior: Do We Really Understand Cost Model Performance?

    Science.gov (United States)

    Lum, Karen; Hihn, Jairus; Menzies, Tim

    2006-01-01

    While there exists extensive literature on software cost estimation techniques, industry practice continues to rely upon standard regression-based algorithms. These software effort models are typically calibrated or tuned to local conditions using local data. This paper cautions that current approaches to model calibration often produce sub-optimal models because of the large variance problem inherent in cost data and by including far more effort multipliers than the data supports. Building optimal models requires that a wider range of models be considered while correctly calibrating these models requires rejection rules that prune variables and records and use multiple criteria for evaluating model performance. The main contribution of this paper is to document a standard method that integrates formal model identification, estimation, and validation. It also documents what we call the large variance problem that is a leading cause of cost model brittleness or instability.

  20. 40 CFR 75.38 - Standard missing data procedures for Hg CEMS.

    Science.gov (United States)

    2010-07-01

    ... Hg CEMS. 75.38 Section 75.38 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... Standard missing data procedures for Hg CEMS. (a) Once 720 quality assured monitor operating hours of Hg... substitute data for Hg concentration in accordance with the procedures in § 75.33(b)(1) through (b)(4...

  1. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: Earth System Modeling Software Framework Survey

    Science.gov (United States)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn; Zukor, Dorothy (Technical Monitor)

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document surveys numerous software frameworks for potential use in Earth science modeling. Several frameworks are evaluated in depth, including Parallel Object-Oriented Methods and Applications (POOMA), Cactus (from the relativistic physics community), Overture, Goddard Earth Modeling System (GEMS), the National Center for Atmospheric Research Flux Coupler, and UCLA/UCB Distributed Data Broker (DDB). Frameworks evaluated in less detail include ROOT, Parallel Application Workspace (PAWS), and Advanced Large-Scale Integrated Computational Environment (ALICE). A host of other frameworks and related tools are referenced in this context. The frameworks are evaluated individually and also compared with each other.

  2. Model-driven design of simulation support for the TERRA robot software tool suite

    NARCIS (Netherlands)

    Lu, Zhou; Bezemer, M.M.; Broenink, Johannes F.

    2015-01-01

    Model-Driven Development (MDD) – based on the concepts of model, meta-model and model transformation – is an approach to develop predictable and re- liable software for Cyber-Physical Systems (CPS). The work presented here concerns a methodology to design simulation software based on MDD techniques,

  3. Way of Working for Embedded Control Software using Model-Driven Development Techniques

    NARCIS (Netherlands)

    Bezemer, M.M.; Groothuis, M.A.; Brugali, D.; Schlegel, C.; Broenink, Johannes F.

    2011-01-01

    Embedded targets normally do not have much resources to aid developing and debugging the software. So model-driven development (MDD) is used for designing embedded software with a `first time right' approach. For such an approach, a good way of working (WoW) is required for embedded software

  4. Using UML Modeling to Facilitate Three-Tier Architecture Projects in Software Engineering Courses

    Science.gov (United States)

    Mitra, Sandeep

    2014-01-01

    This article presents the use of a model-centric approach to facilitate software development projects conforming to the three-tier architecture in undergraduate software engineering courses. Many instructors intend that such projects create software applications for use by real-world customers. While it is important that the first version of these…

  5. Experimental Evaluation of a Serious Game for Teaching Software Process Modeling

    Science.gov (United States)

    Chaves, Rafael Oliveira; von Wangenheim, Christiane Gresse; Furtado, Julio Cezar Costa; Oliveira, Sandro Ronaldo Bezerra; Santos, Alex; Favero, Eloi Luiz

    2015-01-01

    Software process modeling (SPM) is an important area of software engineering because it provides a basis for managing, automating, and supporting software process improvement (SPI). Teaching SPM is a challenging task, mainly because it lays great emphasis on theory and offers few practical exercises. Furthermore, as yet few teaching approaches…

  6. Free Open Source Software: Social Phenomenon, New Management, New Business Models

    OpenAIRE

    Žilvinas Jančoras

    2011-01-01

    In the paper, assumptions of free open source software existence, development, financing and competition models are presented. The free software as a social phenomenon and the open source software as the technological and managerial innovation environment are revealed. The social and business interaction processes are analyzed. Article in Lithuanian.

  7. Supporting custom quality models to analyse and compare open-source software

    NARCIS (Netherlands)

    D. Di Ruscio (Davide); D.S. Kolovos (Dimitrios); I. Korkontzelos (Ioannis); N. Matragkas (Nicholas); J.J. Vinju (Jurgen)

    2017-01-01

    The analysis and comparison of open source software can be improved by means of quality models supporting the evaluation of the software systems being compared and the final decision about which of them has to be adopted. Since software quality can mean different things in different

  8. A Model for Quality Optimization in Software Design Processes

    NARCIS (Netherlands)

    Noppen, J.A.R.; van den Broek, P.M.; Aksit, Mehmet

    The main objective of software engineers is to design and implement systems that implement all functional and non-functional requirements. Unfortunately, it is very difficult or even generally impossible to deliver a software system that satisfies all the requirements. Even more seriously, failures

  9. A multiobjective module-order model for software quality enhancement

    NARCIS (Netherlands)

    Khoshgoftaar, TM; Liu, Y; Seliya, N

    2004-01-01

    The knowledge, prior to system operations, of which program modules are problematic is valuable to a software quality assurance team, especially when there is a constraint on software quality enhancement resources. A cost-effective approach for allocating such resources is to obtain a prediction in

  10. Pragmatic quality metrics for evolutionary software development models

    Science.gov (United States)

    Royce, Walker

    1990-01-01

    Due to the large number of product, project, and people parameters which impact large custom software development efforts, measurement of software product quality is a complex undertaking. Furthermore, the absolute perspective from which quality is measured (customer satisfaction) is intangible. While we probably can't say what the absolute quality of a software product is, we can determine the relative quality, the adequacy of this quality with respect to pragmatic considerations, and identify good and bad trends during development. While no two software engineers will ever agree on an optimum definition of software quality, they will agree that the most important perspective of software quality is its ease of change. We can call this flexibility, adaptability, or some other vague term, but the critical characteristic of software is that it is soft. The easier the product is to modify, the easier it is to achieve any other software quality perspective. This paper presents objective quality metrics derived from consistent lifecycle perspectives of rework which, when used in concert with an evolutionary development approach, can provide useful insight to produce better quality per unit cost/schedule or to achieve adequate quality more efficiently. The usefulness of these metrics is evaluated by applying them to a large, real world, Ada project.

  11. Software Security and the "Building Security in Maturity" Model

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    Using the framework described in my book "Software Security: Building Security In" I will discuss and describe the state of the practice in software security. This talk is peppered with real data from the field, based on my work with several large companies as a Cigital consultant. As a discipline, software security has made great progress over the last decade. Of the sixty large-scale software security initiatives we are aware of, thirty-two---all household names---are currently included in the BSIMM study. Those companies among the thirty-two who graciously agreed to be identified include: Adobe, Aon, Bank of America, Capital One, The Depository Trust & Clearing Corporation (DTCC), EMC, Google, Intel, Intuit, McKesson, Microsoft, Nokia, QUALCOMM, Sallie Mae, Standard Life, SWIFT, Symantec, Telecom Italia, Thomson Reuters, VMware, and Wells Fargo. The BSIMM was created by observing and analyzing real-world data from thirty-two leading software security initiatives. The BSIMM can...

  12. The self-aware diabetic patient software agent model.

    Science.gov (United States)

    Wang, Zhanle; Paranjape, Raman

    2013-11-01

    This work presents a self-aware diabetic patient software agent for representing a human diabetic patient. To develop a 24h, stochastic and self-aware patient agent, we extend the original seminal work of Ackerman et al. [1] in creating a mathematical model of human blood glucose levels in three aspects. (1) We incorporate the stochastic and unpredictable effects of daily living. (2) The Ackerman model is extended into the period of night-time. (3) Patients' awareness of their own conditions is incorporated. Simulation results are quantitatively assessed to demonstrate the effectiveness of lifestyle management, such as adjusting the amount of food consumed, meal schedule, intensity of exercise and level of medication. In this work we show through the simulation that the average blood glucose can be reduced by as much as 51% due to careful lifestyle management. Self monitoring blood glucose is also quantitatively evaluated. The simulation results show that the average blood glucose is further dropped by 25% with the assistance of blood glucose samples. In addition, the blood glucose is perfectly controlled in the target range during the simulation period as a result of joint efforts of lifestyle management and self monitoring blood glucose. This study focuses on demonstrating how human patients' behavior, specifically lifestyle and self monitoring of blood glucose, affects blood glucose controls on a daily basis. This work does not focus on the insulin-glucose interaction of an individual human patient. Our conclusion is that this self-aware patient agent model is capable of adequately representing diabetic patients and of evaluating their dynamic behaviors. It can also be incorporated into a multi-agent system by introducing other healthcare components so that more interesting insights such as the healthcare quality, cost and performance can be observed. © 2013 Published by Elsevier Ltd.
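
    The record above describes extending a linearized Ackerman-style glucose-insulin model with the effects of daily living. As a rough orientation only, the sketch below integrates a minimal linear glucose-insulin system with a meal disturbance; the rate constants, the meal term and the 24-hour horizon are illustrative assumptions, not the parameters of the paper's patient agent.

```python
# Minimal sketch of a linearized Ackerman-style glucose-insulin model.
# Parameter values and the meal input are illustrative assumptions only;
# they are not taken from the paper's patient agent.
import numpy as np
from scipy.integrate import solve_ivp

M1, M2, M3, M4 = 0.02, 0.025, 0.01, 0.005   # assumed rate constants (1/min)

def meal(t):
    # assumed carbohydrate appearance (mg/dL per min) peaking around t = 90 min
    return 2.0 * np.exp(-((t - 90.0) / 30.0) ** 2)

def ackerman(t, y):
    g, h = y                         # deviations of glucose and insulin from basal
    dg = -M1 * g - M2 * h + meal(t)  # glucose: decay, insulin action, meal input
    dh = -M3 * h + M4 * g            # insulin: decay, glucose-stimulated secretion
    return [dg, dh]

sol = solve_ivp(ackerman, (0.0, 24 * 60.0), [0.0, 0.0], max_step=1.0)
print("peak glucose deviation (mg/dL):", round(sol.y[0].max(), 1))
```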

  13. Methodologic model to scheduling on service systems: a software engineering approach

    Directory of Open Access Journals (Sweden)

    Eduyn Ramiro Lopez-Santana

    2016-06-01

    This paper presents a software engineering approach to a research proposal for building an expert system for scheduling in service systems, using software development methodologies and processes. We use adaptive software development as the methodology for the software architecture, based on its description as a software metaprocess that characterizes the research process. We build UML (Unified Modeling Language) diagrams to provide a visual model that describes the research methodology and identifies the actors, elements and interactions in the research process.

  14. Software for Estimating Costs of Testing Rocket Engines

    Science.gov (United States)

    Hines, Merlon M.

    2004-01-01

    A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.
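
    The abstract notes that labor time in the CEM is modeled as a third- or fourth-order polynomial in thrust level and that estimates are broken down by a work-breakdown structure. The sketch below shows the shape of such a parametric estimate; the polynomial coefficients, labor rates and WBS shares are placeholders, not the CEM's calibrated values.

```python
# Illustrative parametric test-cost estimate in the spirit of the CEM:
# labor hours modeled as a polynomial in thrust level, then costed per
# work-breakdown category.  All coefficients and rates are assumptions.
import numpy as np

LABOR_POLY = [1.0e-18, -2.0e-13, 1.5e-8, 2.0e-3, 400.0]  # assumed 4th-order fit (hours vs lbf)
RATES = {"test operations": 95.0, "engineering": 120.0, "facility support": 80.0}  # $/hour, assumed
SHARE = {"test operations": 0.55, "engineering": 0.30, "facility support": 0.15}   # WBS split, assumed

def estimate_cost(thrust_lbf: float, n_tests: int) -> dict:
    hours_per_test = np.polyval(LABOR_POLY, thrust_lbf)      # labor time driven by thrust level
    total_hours = hours_per_test * n_tests
    breakdown = {k: total_hours * SHARE[k] * RATES[k] for k in RATES}
    breakdown["total"] = sum(breakdown.values())
    return breakdown

print(estimate_cost(thrust_lbf=250_000, n_tests=8))
```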

  15. A dependability modeling of software under hardware faults digitized system in nuclear power plants

    International Nuclear Information System (INIS)

    Choi, Jong Gyun

    1996-02-01

    An analytic approach to the dependability evaluation of software in the operational phase is suggested in this work, with special attention to the effects of physical faults on software dependability: the physical faults considered are memory faults and the dependability measure in question is reliability. The model is based on simple reliability theory and graph theory with the path decomposition micro model. The model represents an application software with a graph consisting of nodes and arcs that probabilistically determine the flow from node to node. Through proper transformation of nodes and arcs, the graph can be reduced to a simple two-node graph and the software failure probability is derived from this graph. This model can be extended without modification to a software system which consists of several complete modules. The derived model is validated by computer simulation, where the software is transformed to a probabilistic control flow graph. Simulation also shows a different viewpoint of software failure behavior. Using this model, we predict the reliability of an application software and a software system in a digitized system (ILS system) in the nuclear power plant and show the sensitivity of the software reliability to the major physical parameters which affect software failure in the normal operation phase. This modeling method is particularly attractive for medium size programs such as software used in digitized systems of
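
    A minimal sketch of the core calculation implied by the record above: deriving a per-execution failure probability from a probabilistic control flow graph. The graph, per-node failure probabilities and transition probabilities are invented for illustration; the paper's path-decomposition micro model is more elaborate.

```python
# Sketch: failure probability of a program represented as a probabilistic
# control-flow graph.  Each node has a per-visit failure probability and
# probabilistic successors; f(n) = p_fail(n) + (1 - p_fail(n)) * sum P(n->m) f(m).
import numpy as np

nodes = ["entry", "a", "b", "exit"]
p_fail = {"entry": 0.0, "a": 1e-4, "b": 5e-4, "exit": 0.0}   # assumed per-visit failure probabilities
succ = {"entry": [("a", 1.0)],
        "a": [("b", 0.7), ("exit", 0.3)],
        "b": [("a", 0.4), ("exit", 0.6)],                     # loop back to "a"
        "exit": []}

idx = {n: i for i, n in enumerate(nodes)}
A = np.eye(len(nodes))
b = np.array([p_fail[n] for n in nodes])
for n in nodes:
    for m, p in succ[n]:
        A[idx[n], idx[m]] -= (1.0 - p_fail[n]) * p            # build (I - M) f = p_fail
f = np.linalg.solve(A, b)
print("probability of failure per execution:", f[idx["entry"]])
```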

  16. Possible Improvements to MCNP6 and its CEM/LAQGSM Event-Generators

    Energy Technology Data Exchange (ETDEWEB)

    Mashnik, Stepan Georgievich [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-08-04

    This report is intended for the MCNP6 developers and the sponsors of MCNP6. It presents a set of suggested possible future improvements to MCNP6 and to its CEM03.03 and LAQGSM03.03 event generators. A few suggested modifications of MCNP6 are quite simple, aimed at avoiding possible problems with running MCNP6 on various computers; these changes are not expected to change or improve any results, but should make the use of MCNP6 easier, and they are expected to require limited man-power resources. On the other hand, several other suggested improvements require serious further development of nuclear reaction models and are expected to improve significantly the predictive power of MCNP6 for a number of nuclear reactions; such developments, however, require several years of work by real experts on nuclear reactions.

  17. A communication channel model of the software process

    Science.gov (United States)

    Tausworthe, Robert C.

    1988-01-01

    Beginning research into a noisy communication channel analogy of software development process productivity, undertaken to establish quantifiable behavior and theoretical bounds, is discussed. The analogy leads to a fundamental mathematical relationship between human productivity and the amount of information supplied by the developers, the capacity of the human channel for processing and transmitting information, the software product yield (object size), the work effort, requirements efficiency, tool and process efficiency, and programming environment advantage. An upper bound to productivity is derived which shows that software reuse is the only means that can lead to unbounded productivity growth; practical considerations of size and cost of reusable components may reduce this to a finite bound.
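
    The exact bound derived in the paper is not reproduced here; the toy calculation below only illustrates the qualitative claim that reuse is the sole route to unbounded productivity growth, under the assumed simplification that effort is proportional only to the newly developed part of the product.

```python
# Illustration (not the paper's derivation): if effort is spent only on the
# newly created part of a product, productivity measured as delivered size
# per unit effort grows without bound as the reused fraction approaches one.
def productivity(delivered_size, reuse_fraction, effort_per_new_unit=1.0):
    new_size = delivered_size * (1.0 - reuse_fraction)
    effort = new_size * effort_per_new_unit          # assumed: reused components cost ~0 effort
    return delivered_size / effort if effort > 0 else float("inf")

for r in (0.0, 0.5, 0.9, 0.99):
    print(f"reuse fraction {r:4.2f} -> productivity {productivity(10_000, r):8.1f}")
```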

  18. Software Quality Evaluation Models Applicable in Health Information and Communications Technologies. A Review of the Literature.

    Science.gov (United States)

    Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa

    2016-01-01

    The use of Information and Communications Technologies in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation and health, we selected and analysed 20 original research papers published from 2005-2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software and improve usage outcomes. The ISO/IEC 25000 standard is shown to be the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition.

  19. Methods of Modelling Marketing Activity on Software Sales

    Directory of Open Access Journals (Sweden)

    Bashirov Islam H.

    2013-11-01

    The article studies the topical issue of developing methods for modelling marketing activity in software sales to achieve efficient functioning of an enterprise. On the basis of an analysis of the market type for the studied CloudLinux OS product, the article identifies the market structure type: monopolistic competition. To provide the information basis for marketing activity in the target market segment, the article proposes the survey method and provides a questionnaire, containing specific questions about the studied market segment of hosting services, for an online survey conducted with the Survio service. In accordance with the systems approach, CloudLinux OS has the properties of systems, namely diversity. Economic differences are non-price indicators that have no numeric expression and are qualitative descriptions; analysis of the market and the conducted survey make it possible to obtain them. The combination of price and non-price indicators provides a complete description of the product properties. To calculate an integral indicator of competitiveness, the article proposes a model based on the direct weighted addition of individual indicators, normalisation of formalised indicators and the use of fuzzy sets for non-formalised indicators. The calculated indicator allows not only assessment of the current level of competitiveness, but also identification of the influence of changes in various indicators, which improves the efficiency of marketing decisions. Also, having identified the target customers of the hosting OS and formalised the non-price parameters, it is possible to search for a set of optimal product characteristics. As a result, an optimal strategy for bringing the product to market is formed.
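
    As a hedged illustration of the integral competitiveness indicator described above, the sketch combines normalized price indicators and fuzzified qualitative indicators by weighted addition. The indicator names, weights, values and the crude membership table are assumptions, not the article's data.

```python
# Sketch of an integral competitiveness indicator as a weighted sum of
# normalized single indicators.  Weights, indicator values and the crude
# "fuzzification" of qualitative indicators are all assumptions.
def normalize(value, worst, best):
    # map a formalised (numeric) indicator onto [0, 1], 1 being best
    return (value - worst) / (best - worst)

def fuzzy_level(label):
    # stand-in for a fuzzy-set treatment of non-formalised (qualitative) indicators
    return {"low": 0.25, "medium": 0.5, "high": 0.75, "very high": 1.0}[label]

indicators = [
    ("price per licence", normalize(12.0, worst=30.0, best=5.0), 0.35),   # price indicator
    ("support quality",   fuzzy_level("high"),                   0.25),   # qualitative
    ("compatibility",     fuzzy_level("very high"),              0.20),
    ("vendor reputation", fuzzy_level("medium"),                 0.20),
]

integral = sum(score * weight for _, score, weight in indicators)
print("integral competitiveness indicator:", round(integral, 3))
```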

  20. Experiences in Teaching a Graduate Course on Model-Driven Software Development

    Science.gov (United States)

    Tekinerdogan, Bedir

    2011-01-01

    Model-driven software development (MDSD) aims to support the development and evolution of software intensive systems using the basic concepts of model, metamodel, and model transformation. In parallel with the ongoing academic research, MDSD is more and more applied in industrial practices. After being accepted both by a broad community of…

  1. System Engineering Software Assessment Model for Exploration (SESAME) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Concept phase space-systems architecture evaluations typically use mass estimates as the primary means of ranking potential mission architectures. Software does not...

  2. Construction of a recombinant plasmid as reaction control in routine PCR for detection of contagious equine metritis (CEM-PCR).

    Science.gov (United States)

    Niwa, Hidekazu; Anzai, Toru; Hobo, Seiji

    2007-11-01

    Contagious equine metritis (CEM) is a highly contagious bacterial venereal disease of horses caused by Taylorella equigenitalis. CEM-PCR is a semi-nested PCR method for detecting this bacterium. Although this technique is regarded as a sensitive diagnostic method for CEM, there are risks of it generating false positive and false negative results. In this study, we constructed a recombinant plasmid (CEM-POS) as a reaction control to assure an adequate PCR reaction and to prevent false positive results caused by contamination with the reaction control in routine CEM-PCR examinations. CEM-POS was constructed by insertion of rpoB fragments from Rhodococcus equi into CEM-1P, which is a recombinant plasmid that includes a T. equigenitalis-specific sequence region. In CEM-PCR, the size of the PCR product from CEM-POS was clearly different from that of the true positive PCR product. In addition, CEM-POS retained high stability under convenient storage conditions of 4 degrees C. These results suggest CEM-POS to be a useful tool as a reaction control in routine CEM-PCR examinations.

  3. Optimization of Component Based Software Engineering Model Using Neural Network

    OpenAIRE

    Gaurav Kumar; Pradeep Kumar Bhatia

    2014-01-01

    The goal of Component Based Software Engineering (CBSE) is to deliver high quality, more reliable and more maintainable software systems in a shorter time and within limited budget by reusing and combining existing quality components. A high quality system can be achieved by using quality components, framework and integration process that plays a significant role. So, techniques and methods used for quality assurance and assessment of a component based system is different from those of the tr...

  4. Bringing Model Checking Closer to Practical Software Engineering

    OpenAIRE

    Remenska, Daniela; Bal, H E; Templon, J A; Willemse, T.A.C.

    2016-01-01

    Software grows in size and complexity, making it increasingly challenging to ensure that it behaves correctly. This is especially true for distributed systems, where a multitude of components are running concurrently, making it difficult to anticipate all the possible behaviors emerging in the system as a whole. Certain design errors, such as deadlocks and race-conditions, can often go unnoticed when testing is the only form of verification employed in the software engineering life-cycle. Even whe...

  5. A PROPOSED MODEL OF AGILE METHODOLOGY IN SOFTWARE DEVELOPMENT

    OpenAIRE

    Anjali Sharma*, Karambir

    2016-01-01

    Agile software development has been increasing in popularity and replacing traditional methods of software development. This paper presents all the neural network techniques, including General Regression Neural Networks (GRNN), Probabilistic Neural Network (PNN), GMDH Polynomial Neural Network, Cascade Correlation Neural Network, and a machine learning technique, Random Forest. To achieve better prediction in the effort estimation of agile projects we will use Random Forest with the Story Points Approa...

  6. Man-Machine Interface Design for Modeling and Simulation Software

    Directory of Open Access Journals (Sweden)

    Arnstein J. Borstad

    1986-07-01

    Computer aided design (CAD) systems, or more generally interactive software, are today being developed for various application areas like VLSI design, mechanical structure design, avionics design, cartographic design, architectural design, office automation, publishing, etc. Such tools are becoming more and more important in order to be productive and to be able to design quality products. One important part of CAD-software development is the man-machine interface (MMI) design.

  7. Concepts Evaluation Model V (CEM V) Part I. Technical Description,

    Science.gov (United States)

    1980-01-01

    the combat casualties, disease and nonbattle injuries (DNBI) are calculated for the engagement by use of an input fraction applied against the... (AAR - AARO)[1 - (1 - PSH)(1 - FRSA)(EFS)(P9)/SHE], where: AAR = total aircraft at risk = (PIP - LAR - LCA)(FARP) + (IIP - 121)(FARI); FARP and FARI are at-risk fractions for TF and ADF aircraft respectively; LAR = air losses in AR/I role, this subperiod; LCA = air losses in CA role, this subperiod

  8. On integrating modeling software for application to total-system performance assessment

    International Nuclear Information System (INIS)

    Lewis, L.C.; Wilson, M.L.

    1994-05-01

    We examine the processes and methods used to facilitate collaboration in software development between two organizations at separate locations -- Lawrence Livermore National Laboratory (LLNL) in California and Sandia National Laboratories (SNL) in New Mexico. Our software development process integrated the efforts of these two laboratories. Software developed at LLNL to model corrosion and failure of waste packages and subsequent releases of radionuclides was incorporated as a source term into SNL's computer models for fluid flow and radionuclide transport through the geosphere

  9. State of the Art : Integrated Management of Requirements in Model-Based Software Engineering

    OpenAIRE

    Thörn, Christer

    2006-01-01

    This report describes the background and future of research concerning integrated management of requirements in model-based software engineering. The focus is on describing the relevant topics and existing theoretical backgrounds that form the basis for the research. The report describes the fundamental difficulties of requirements engineering for software projects, and proposes that the results and methods of models in software engineering can help alleviate those problems. Taking inspiration...

  10. REQUIREMENTS PARTICLE NETWORKS: AN APPROACH TO FORMAL SOFTWARE FUNCTIONAL REQUIREMENTS MODELLING

    OpenAIRE

    Wiwat Vatanawood; Wanchai Rivepiboon

    2001-01-01

    In this paper, an approach to software functional requirements modelling using requirements particle networks is presented. In our approach, a set of requirements particles is defined as an essential tool to construct a visual model of software functional requirements specification during the software analysis phase and the relevant formal specification is systematically generated without the experience of writing formal specification. A number of algorithms are presented to perform these for...

  11. Introduction to Lean Canvas Transformation Models and Metrics in Software Testing

    Directory of Open Access Journals (Sweden)

    Nidagundi Padmaraj

    2016-05-01

    Software nowadays plays a key role in all fields, from simple applications up to cutting-edge technologies, and most technological devices now run on software. Software development verification and validation have become very important for producing high-quality software according to business stakeholder requirements. Different software development methodologies have given a new dimension to software testing. In traditional waterfall software development, testing is approached at the end: it begins with resource planning, a test plan is designed and test criteria are defined for acceptance testing. In this process most of the test plan is well documented, which leads to time-consuming processes. For modern software development methodologies such as agile, where long test processes and heavy documentation are not followed strictly because of the small iterations of development and testing, lean canvas transformation models can be a solution. This paper provides a new dimension for exploring the possibilities of adopting lean transformation models and metrics in the software test plan, to simplify the test process and allow further use of these test metrics on the canvas.

  12. Quantitative Analysis of Probabilistic Models of Software Product Lines with Statistical Model Checking

    DEFF Research Database (Denmark)

    ter Beek, Maurice H.; Legay, Axel; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking techniques for analysing quantitative properties of software product line models with probabilistic aspects. For this purpose, we enrich the feature-oriented language FLAN with action rates, which specify the likelihood of exhibiting particular behaviour or of installing features at a specific moment or in a specific order. The enriched language (called PFLAN) allows us to specify models of software product lines with probabilistic configurations and behaviour, e.g. by considering a PFLAN semantics based on discrete-time Markov chains. The Maude implementation of PFLAN is combined with the distributed statistical model checker MultiVeStA to perform quantitative analyses of a simple product line case study. The presented analyses include the likelihood of certain behaviour of interest (e.g. product malfunctioning) and the expected average...

  13. Quantitative Analysis of Probabilistic Models of Software Product Lines with Statistical Model Checking

    Directory of Open Access Journals (Sweden)

    Maurice H. ter Beek

    2015-04-01

    We investigate the suitability of statistical model checking techniques for analysing quantitative properties of software product line models with probabilistic aspects. For this purpose, we enrich the feature-oriented language FLan with action rates, which specify the likelihood of exhibiting particular behaviour or of installing features at a specific moment or in a specific order. The enriched language (called PFLan) allows us to specify models of software product lines with probabilistic configurations and behaviour, e.g. by considering a PFLan semantics based on discrete-time Markov chains. The Maude implementation of PFLan is combined with the distributed statistical model checker MultiVeStA to perform quantitative analyses of a simple product line case study. The presented analyses include the likelihood of certain behaviour of interest (e.g. product malfunctioning) and the expected average cost of products.
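
    A Monte-Carlo sketch of the kind of quantitative query a statistical model checker answers: estimating the probability that a product reaches a malfunction state within a bounded number of steps of a discrete-time Markov chain. The chain below is invented for illustration and does not use PFLan, Maude or MultiVeStA.

```python
# Estimate P(reach "malfunction" within k steps) of an invented DTMC by simulation.
import random

DTMC = {
    "configuring": [("running", 0.9), ("malfunction", 0.1)],
    "running":     [("running", 0.95), ("malfunction", 0.01), ("done", 0.04)],
    "malfunction": [("malfunction", 1.0)],
    "done":        [("done", 1.0)],
}

def simulate(k, rng):
    state = "configuring"
    for _ in range(k):
        r, acc = rng.random(), 0.0
        for nxt, p in DTMC[state]:        # sample the next state from the transition distribution
            acc += p
            if r <= acc:
                state = nxt
                break
        if state == "malfunction":
            return True
    return False

rng = random.Random(42)
runs = 20_000
hits = sum(simulate(k=50, rng=rng) for _ in range(runs))
print("P(malfunction within 50 steps) ~", hits / runs)
```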

  14. Bringing Model Checking Closer to Practical Software Engineering

    CERN Document Server

    AUTHOR|(CDS)2079681; Templon, J A; Willemse, T.A.C.

    Software grows in size and complexity, making it increasingly challenging to ensure that it behaves correctly. This is especially true for distributed systems, where a multitude of components are running concurrently, making it difficult to anticipate all the possible behaviors emerging in the system as a whole. Certain design errors, such as deadlocks and race-conditions, can often go unnoticed when testing is the only form of verification employed in the software engineering life-cycle. Even when bugs are detected in a running software, revealing the root cause and reproducing the behavior can be time consuming (and even impossible), given the lack of control the engineer has over the execution of the concurrent components, as well as the number of possible scenarios that could have produced the problem. This is especially pronounced for large-scale distributed systems such as the Worldwide Large Hadron Collider Computing Grid. Formal verification methods offer more rigorous means of determining whether a system sat...

  15. Basic Modelling principles and Validation of Software for Prediction of Collision Damage

    DEFF Research Database (Denmark)

    Simonsen, Bo Cerup

    2000-01-01

    This report describes basic modelling principles, the theoretical background and validation examples for the collision damage prediction module in the ISESO stand-alone software.

  16. Customizing Standard Software as a Business Model in the IT Industry

    DEFF Research Database (Denmark)

    Kautz, Karlheinz; Rab, Sameen M.; Sinnet, Michael

    2011-01-01

    This research studies a new business model in the IT industry, the customization of standard software as the sole foundation for a software company's earnings. Based on a theoretical background which combines the concepts of inter-organizational networks and open innovation, we provide an interpretive case study of a small software company which customizes a standard product. We investigate the company's interactions with a large global software company which is the producer of the original software product and with other companies which are involved in the software customization process. We find that the customizing company and software customizations depend not only on initiatives which are set off internally in the company, but on how the customizing organization's inter-organizational network and interaction with other organizations is built up. The case company has built its network...

  17. DAE Tools: equation-based object-oriented modelling, simulation and optimisation software

    Directory of Open Access Journals (Sweden)

    Dragan D. Nikolić

    2016-04-01

    In this work, DAE Tools modelling, simulation and optimisation software, its programming paradigms and main features are presented. The current approaches to mathematical modelling such as the use of modelling languages and general-purpose programming languages are analysed. The common set of capabilities required by the typical simulation software are discussed, and the shortcomings of the current approaches recognised. A new hybrid approach is introduced, and the modelling languages and the hybrid approach are compared in terms of the grammar, compiler, parser and interpreter requirements, maintainability and portability. The most important characteristics of the new approach are discussed, such as: (1) support for runtime model generation; (2) support for runtime simulation set-up; (3) support for complex runtime operating procedures; (4) interoperability with third party software packages (i.e. NumPy/SciPy); (5) suitability for embedding and use as a web application or software as a service; and (6) code-generation, model exchange and co-simulation capabilities. The benefits of an equation-based approach to modelling, implemented in a fourth generation object-oriented general purpose programming language such as Python, are discussed. The architecture and the software implementation details as well as the type of problems that can be solved using DAE Tools software are described. Finally, some applications of the software at different levels of abstraction are presented, and its embedding capabilities and suitability for use as a software as a service is demonstrated.

  18. An improved COCOMO software cost estimation model | Duke ...

    African Journals Online (AJOL)

    adopting development efforts in three perspectives: optimistic, pessimistic and most-likely. We carry out the implementation with four different software sizes: 10,000, 15,000, 20,000 and 50,000 lines of code (LOC). When compared, the estimated values of the optimistic efforts (Eopt) estimated with COCOMO II using the ...
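
    For orientation, the sketch below computes effort with the basic COCOMO organic-mode formula (E = 2.4 * KLOC^1.05 person-months) for the four sizes mentioned in the record; the paper itself works with COCOMO II, and the optimistic/pessimistic adjustments shown are assumed, not taken from the article.

```python
# Hedged illustration using the basic COCOMO organic-mode constants;
# COCOMO II scale factors and effort multipliers are omitted here.
def cocomo_effort(kloc: float, a: float = 2.4, b: float = 1.05) -> float:
    return a * kloc ** b             # effort in person-months

sizes_loc = [10_000, 15_000, 20_000, 50_000]
for loc in sizes_loc:
    kloc = loc / 1000.0
    most_likely = cocomo_effort(kloc)
    optimistic, pessimistic = 0.8 * most_likely, 1.25 * most_likely   # assumed +/- adjustments
    print(f"{loc:>6} LOC: Eopt={optimistic:6.1f}  Em={most_likely:6.1f}  Epes={pessimistic:6.1f} person-months")
```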

  19. A multiphysics and multiscale software environment for modeling astrophysical systems

    NARCIS (Netherlands)

    Portegies Zwart, S.; McMillan, S.; Harfst, S.; Groen, D.; Fujii, M.; Ó Nualláin, B.; Glebbeek, E.; Heggie, D.; Lombardi, J.; Hut, P.; Angelou, V.; Banerjee, S.; Belkus, H.; Fragos, T.; Fregeau, J.; Gaburov, E.; Izzard, R.; Jurić, M.; Justham, S.; Sottoriva, A.; Teuben, P.; van Bever, J.; Yaron, O.; Zemp, M.

    2009-01-01

    We present MUSE, a software framework for combining existing computational tools for different astrophysical domains into a single multiphysics, multiscale application. MUSE facilitates the coupling of existing codes written in different languages by providing inter-language tools and by specifying

  20. Revenue Management and Demand Fulfillment: Matching Applications, Models, and Software

    NARCIS (Netherlands)

    R. Quante (Rainer); H. Meyr (Herbert); M. Fleischmann (Moritz)

    2007-01-01

    textabstractRecent years have seen great successes of revenue management, notably in the airline, hotel, and car rental business. Currently, an increasing number of industries, including manufacturers and retailers, are exploring ways to adopt similar concepts. Software companies are taking an

  1. Modeling Software Product Line Engineering with Essence Framework

    NARCIS (Netherlands)

    Tüzün, Eray; Giray, Görkem; Tekinerdogan, B.; Macit, Yagup

    2018-01-01

    Although several software product line engineering (SPLE) methods have been described in the literature, adopting these methods in practice is often not straightforward. Thorough understanding of the methods and their artefacts is necessary to apply the methods in a proper manner, and likewise

  2. Software Tools For Large Scale Interactive Hydrodynamic Modeling

    NARCIS (Netherlands)

    Donchyts, G.; Baart, F.; van Dam, A; Jagers, B; van der Pijl, S.; Piasecki, M.

    2014-01-01

    Developing easy-to-use software that combines components for simultaneous visualization, simulation and interaction is a great challenge, mainly because it involves a number of disciplines, such as computational fluid dynamics, computer graphics and high-performance computing. One of the main

  3. A theory and model for the evolution of software services

    NARCIS (Netherlands)

    Andrikopoulos, V.

    2010-01-01

    Software services are subject to constant change and variation. To control service development, a service developer needs to know why a change was made, what are its implications and whether the change is complete. Typically, service clients do not perceive the upgraded service immediately. As a

  4. A Survey of Software Reliability Modeling and Estimation

    Science.gov (United States)

    1983-09-01

    "... of Software Reliability and Its Exorcism," Proceedings of the Joint Automatic Control Conference, 1977, published by the IEEE, New York, 1977, pp. 225-231.

  5. A software complex intended for constructing applied models and meta-models on the basis of mathematical programming principles

    Directory of Open Access Journals (Sweden)

    Михаил Юрьевич Чернышов

    2013-12-01

    A software complex (SC) elaborated by the authors on the basis of the language LMPL, representing a software tool intended for the synthesis of applied software models and meta-models constructed on the basis of mathematical programming (MP) principles, is described. LMPL provides an explicit form of declarative representation of MP-models, presumes automatic construction and transformation of models, and supports the capability of adding external software packages. The following software versions of the SC have been implemented: (1) a SC intended for representing the process of choosing an optimal hydroelectric power plant model (on the principles of meta-modeling), and (2) a SC intended for representing the logic-sense relations between the models of a set of discourse formations in the discourse meta-model.

  6. Embedding the concept of service oriented architecture into software sustainability evaluation model

    Science.gov (United States)

    Ahmad, Ruzita; Hussain, Azham; Baharom, Fauziah

    2017-10-01

    Software sustainability evaluation is a measurement mechanism which involves several criteria of software development, through characteristics and sub-characteristics, with the requirement to meet the needs of the present through to future generations. The measurement mechanism can support developing software towards sustainability perspectives such as the environmental, economic and social. This paper embeds the concept of Service-Oriented Architecture into a sustainability evaluation model to support measurement criteria that help build software flexibility, reusability and agility. The objective is to propose several characteristics of software development that utilize the concept of sustainability embedded with the SOA concept. Mapping SOA criteria to software development characteristics significantly improves the measurement criteria that can be addressed in the measurement model.

  7. 2016 KIVA-hpFE Development: A Robust and Accurate Engine Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Carrington, David Bradley [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Waters, Jiajia [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-10-25

    Los Alamos National Laboratory and its collaborators are facilitating engine modeling by improving accuracy and robustness of the modeling, and improving the robustness of software. We also continue to improve the physical modeling methods. We are developing and implementing new mathematical algorithms, those that represent the physics within an engine. We provide software that others may use directly or that they may alter with various models e.g., sophisticated chemical kinetics, different turbulent closure methods or other fuel injection and spray systems.

  8. The SCEC Unified Community Velocity Model (UCVM) Software Framework for Distributing and Querying Seismic Velocity Models

    Science.gov (United States)

    Maechling, P. J.; Taborda, R.; Callaghan, S.; Shaw, J. H.; Plesch, A.; Olsen, K. B.; Jordan, T. H.; Goulet, C. A.

    2017-12-01

    Crustal seismic velocity models and datasets play a key role in regional three-dimensional numerical earthquake ground-motion simulation, full waveform tomography, modern physics-based probabilistic earthquake hazard analysis, as well as in other related fields including geophysics, seismology, and earthquake engineering. The standard material properties provided by a seismic velocity model are P- and S-wave velocities and density for any arbitrary point within the geographic volume for which the model is defined. Many seismic velocity models and datasets are constructed by synthesizing information from multiple sources and the resulting models are delivered to users in multiple file formats, such as text files, binary files, HDF-5 files, structured and unstructured grids, and through computer applications that allow for interactive querying of material properties. The Southern California Earthquake Center (SCEC) has developed the Unified Community Velocity Model (UCVM) software framework to facilitate the registration and distribution of existing and future seismic velocity models to the SCEC community. The UCVM software framework is designed to provide a standard query interface to multiple, alternative velocity models, even if the underlying velocity models are defined in different formats or use different geographic projections. The UCVM framework provides a comprehensive set of open-source tools for querying seismic velocity model properties, combining regional 3D models and 1D background models, visualizing 3D models, and generating computational models in the form of regular grids or unstructured meshes that can be used as inputs for ground-motion simulations. The UCVM framework helps researchers compare seismic velocity models and build equivalent simulation meshes from alternative velocity models. These capabilities enable researchers to evaluate the impact of alternative velocity models in ground-motion simulations and seismic hazard analysis applications
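
    The sketch below mimics the query pattern such a framework exposes, returning (Vp, Vs, density) at arbitrary points and falling back to a 1D background model outside a 3D regional volume. It is a generic illustration only; the functions, bounding box and gradients are assumptions, and this is not the UCVM API or any real velocity model.

```python
# Generic sketch of a velocity-model query interface with 3D-regional / 1D-background fallback.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Material:
    vp: float   # P-wave velocity (m/s)
    vs: float   # S-wave velocity (m/s)
    rho: float  # density (kg/m^3)

def background_1d(depth_m: float) -> Material:
    # assumed 1D background model: velocities increase linearly with depth
    vs = 1000.0 + 0.5 * depth_m
    return Material(vp=1.8 * vs, vs=vs, rho=2000.0 + 0.05 * depth_m)

def regional_3d(lon: float, lat: float, depth_m: float) -> Optional[Material]:
    # placeholder 3D regional model, defined only inside a bounding box
    if -119.0 <= lon <= -116.0 and 33.0 <= lat <= 35.0 and depth_m <= 50_000.0:
        vs = 900.0 + 0.6 * depth_m
        return Material(vp=1.9 * vs, vs=vs, rho=2100.0 + 0.05 * depth_m)
    return None

def query(lon: float, lat: float, depth_m: float) -> Material:
    # standard query pattern: prefer the regional model, fall back to the 1D background
    return regional_3d(lon, lat, depth_m) or background_1d(depth_m)

print(query(-118.2, 34.0, 1000.0))   # inside the regional volume
print(query(-100.0, 40.0, 1000.0))   # falls back to the 1D background
```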

  9. Modeling of a 3DTV service in the software-defined networking architecture

    Science.gov (United States)

    Wilczewski, Grzegorz

    2014-11-01

    In this article a newly developed concept for modeling a multimedia service offering stereoscopic motion imagery is presented. The proposed model is based on the Software-Defined Networking (SDN) architecture. The definition of a 3D television service built on the SDN concept is identified, exposing the basic characteristics of a 3DTV service in a modern networking layout. Furthermore, exemplary functionalities of the proposed 3DTV model are depicted. It is indicated that modeling a 3DTV service in the Software-Defined Networking architecture leads to a multiplicity of improvements, especially towards the flexibility of a service supporting heterogeneous end-user devices.

  10. A Software Technology Transition Entropy Based Engineering Model

    Science.gov (United States)

    2002-03-01

    D., The Double Helix, A Personal Account of the Discovery of the Structure of DNA, Simon and Schuster, New York, 1968. (Wehrl 1978) Wehrl, Alfred... Basili, Victor R., and Musa, John D., "The Future Engineering of Software: A Management Perspective," Computer, pp. 90-96, September 1991. (Basili... calculations required. The tables and data were extracted from a supporting document (Behnke 2001) which describes the calculations. This represents

  11. The Use of Modeling for Flight Software Engineering on SMAP

    Science.gov (United States)

    Murray, Alexander; Jones, Chris G.; Reder, Leonard; Cheng, Shang-Wen

    2011-01-01

    The Soil Moisture Active Passive (SMAP) mission proposes to deploy an Earth-orbiting satellite with the goal of obtaining global maps of soil moisture content at regular intervals. Launch is currently planned in 2014. The spacecraft bus would be built at the Jet Propulsion Laboratory (JPL), incorporating both new avionics as well as hardware and software heritage from other JPL projects. [4] provides a comprehensive overview of the proposed mission

  12. Software Development Cost and Time Forecasting Using a High Performance Artificial Neural Network Model

    Science.gov (United States)

    Attarzadeh, Iman; Ow, Siew Hock

    Nowadays, mature software companies are increasingly interested in having a precise estimate of software metrics such as project time, cost, quality and risk at the early stages of the software development process. The ability of project managers to precisely estimate project time and costs is one of the essential tasks in software development activities, and it is named software effort estimation. The effort estimated at the early stage of the project development process is uncertain, vague and often the least accurate, because very little information is available at the beginning of a project. Therefore, a reliable and precise effort estimation model is an ongoing challenge for project managers and software engineers. This research work proposes a novel soft computing model incorporating the Constructive Cost Model (COCOMO) to improve the precision of software time and cost estimation. The proposed artificial neural network model has good generalisation and adaptation capability, and it can be interpreted and validated by software engineers. The experimental results show that applying the desirable features of artificial neural networks to the algorithmic estimation model improves the accuracy of time and cost estimation, and the estimated effort can be very close to the actual effort.

  13. SOFTWARE FOR FAULT DIAGNOSIS USING KNOWLEDGE MODELS IN PETRI NETS

    Directory of Open Access Journals (Sweden)

    ADRIAN ARBOLEDA

    2012-01-01

    Fault diagnosis systems in companies associated with the electrical sector require precision and flexibility when failure events arise. Currently there are systems that aim to improve the diagnosis process through various computational methods and techniques, reducing the response time to disturbances. However, few proposals unify graphical knowledge models with the process signals that devices such as programmable logic controllers (PLCs) can provide. This article proposes a novel software tool guided by models based on Petri nets and integrated with process signals, for fault diagnosis in electrical power generation plants. A case study demonstrates the flexibility and adaptability of the software when new notions in the knowledge models change, without re-engineering the software.
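
    As a toy illustration of a Petri-net knowledge model driven by process signals, the sketch below marks places from observed PLC signals and fires diagnosis transitions whose input places are all marked. The signals, places and transitions are invented and are not the article's models.

```python
# Toy Petri-net knowledge model for fault diagnosis: places hold tokens for
# observed PLC signals; a transition fires when all of its input places are
# marked, producing a diagnosis token.
marking = {"breaker_open": 1, "overcurrent_alarm": 1, "turbine_trip": 0,
           "fault_line": 0, "fault_generator": 0}

transitions = [
    {"name": "diagnose_line_fault",      "inputs": ["breaker_open", "overcurrent_alarm"], "outputs": ["fault_line"]},
    {"name": "diagnose_generator_fault", "inputs": ["breaker_open", "turbine_trip"],      "outputs": ["fault_generator"]},
]

def fire_enabled(marking, transitions):
    fired = []
    for t in transitions:
        if all(marking[p] > 0 for p in t["inputs"]):   # transition enabled
            for p in t["inputs"]:
                marking[p] -= 1                         # consume input tokens
            for p in t["outputs"]:
                marking[p] += 1                         # produce diagnosis token
            fired.append(t["name"])
    return fired

print("fired:", fire_enabled(marking, transitions))
print("diagnoses:", [p for p in ("fault_line", "fault_generator") if marking[p]])
```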

  14. The Civitavecchia Coastal Environment Monitoring System (C-CEMS): an integrated approach to the study of coastal oceanographic processes

    Science.gov (United States)

    Marcelli, Marco; Piermattei, Viviana; Madonia, Alice; Bonamano, Simone; Martellucci, Riccardo; Pierattini, Alberto; Albani, Marta; Borsellino, Chiara; Zappalà, Giuseppe

    2015-04-01

    The study of the physical and biological processes of the coastal environment, characterized by high spatial and temporal variability, requires the adoption of multidisciplinary strategies of investigation that take into account not only the biotic and abiotic components of coastal marine ecosystems, but also the terrestrial, atmospheric and hydrological features linked to them. Understanding the coastal environment is fundamental to facing pollution phenomena efficiently and effectively, as required by the Marine Strategy (2008/56 EC) Directive, which is focused on the achievement of GES by 2020 in all Member States. Following these lines, the Laboratory of Experimental Oceanology and Marine Ecology (University of Tuscia) has developed a multi-platform observing network (the Civitavecchia Coastal Environment Monitoring System, C-CEMS) which has operated since 2005 in the coastal marine area of Civitavecchia (northern Tyrrhenian Sea, Italy), where multiple uses (industrial, commercial and tourist activities) and high ecological values (Posidonia oceanica meadows, hard-bottom benthic communities, priority species, etc.) closely coexist. Furthermore, in recent years the Civitavecchia harbour, which is one of the main ports of Europe, has been subjected to a series of expansion works that could significantly impact the coastal environment. The C-CEMS, implemented in the current configuration, is composed of five main modules (fixed stations, in-situ measurements and samplings, satellite observations, numerical models, GIS) which provide integrated information to be used in different fields of environmental research. The fixed-station system operates one weather station, two water-quality stations and two wave-buoy stations along the coast. In addition to the long-term observations acquired by the fixed stations (L-TER), in situ surveys are periodically carried out for the monitoring of the physical, chemical and biological characteristics of the water column and marine sediments

  15. An Approach to Modeling Software Safety in Safety-Critical Systems

    OpenAIRE

    Ben S. Medikonda; Seetha R. Panchumarthy

    2009-01-01

    Software for safety-critical systems has to deal with the hazards identified by safety analysis in order to make the system safe, risk-free and fail-safe. Software safety is a composite of many factors. Problem statement: Existing software quality models like McCall's, Boehm's and ISO 9126 are inadequate in addressing the software safety issues of real-time safety-critical embedded systems. At present no standard framework exists that comprehensively addresses the Factors, Cr...

  16. Specification and Generation of Environment for Model Checking of Software Components

    Czech Academy of Sciences Publication Activity Database

    Pařízek, P.; Plášil, František

    2007-01-01

    Vol. 176 (2007), pp. 143-154, ISSN 1571-0661. R&D Projects: GA AV ČR 1ET400300504. Institutional research plan: CEZ:AV0Z10300504. Keywords: software components * behavior protocols * model checking * automated generation of environment. Subject RIV: JC - Computer Hardware; Software

  17. 76 FR 28819 - NUREG/CR-XXXX, Development of Quantitative Software Reliability Models for Digital Protection...

    Science.gov (United States)

    2011-05-18

    ... COMMISSION NUREG/CR-XXXX, Development of Quantitative Software Reliability Models for Digital Protection... issued for public comment a document entitled: NUREG/CR-XXXX, ``Development of Quantitative Software... development of regulatory guidance for using risk information related to digital systems in the licensing...

  18. A Unified Component Modeling Approach for Performance Estimation in Hardware/Software Codesign

    DEFF Research Database (Denmark)

    Grode, Jesper Nicolai Riis; Madsen, Jan

    1998-01-01

    This paper presents an approach for abstract modeling of hardware/software architectures using Hierarchical Colored Petri Nets. The approach is able to capture complex behavioral characteristics often seen in software and hardware architectures, thus it is suitable for high level codesign issues ...... system [12]. Details and the basic characteristics of the approach can be found in [8]....

  19. Mechanisms for Leveraging Models at Runtime in Self-adaptive Software

    NARCIS (Netherlands)

    Bennaceur, Amel; France, Robert; Tamburelli, Giordano; Vogel, Thomas; Mosterman, Pieter J.; Cazzola, Walter; Costa, Fabio M.; Pierantonio, Alfonso; Tichy, Matthias; Aksit, Mehmet; Emanuelson, Pär; Gang, Huang; Georgantas, Nikolaos; Redlich, David

    2014-01-01

    Modern software systems are often required to adapt their behavior at runtime in order to maintain or enhance their utility in dynamic environments. Models at runtime research aims to provide suitable abstractions, techniques, and tools to manage the complexity of adapting software systems at

  20. On properties of modeling control software for embedded control applications with CSP/CT framework

    NARCIS (Netherlands)

    Jovanovic, D.S.; Hilderink, G.H.; Broenink, Johannes F.; Karelse, F.

    2003-01-01

    This PROGRESS project (TES.5224) traces a design framework for implementing embedded real-time software for control applications by exploiting its natural concurrency. The paper illustrates the degree of automation achieved in the process of structuring complex control software architectures, modeling

  1. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    NARCIS (Netherlands)

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is

  2. Whole earth modeling: developing and disseminating scientific software for computational geophysics.

    Science.gov (United States)

    Kellogg, L. H.

    2016-12-01

    Historically, a great deal of specialized scientific software for modeling and data analysis has been developed by individual researchers or small groups of scientists working on their own specific research problems. As the magnitude of available data and computer power has increased, so has the complexity of scientific problems addressed by computational methods, creating both a need to sustain existing scientific software, and expand its development to take advantage of new algorithms, new software approaches, and new computational hardware. To that end, communities like the Computational Infrastructure for Geodynamics (CIG) have been established to support the use of best practices in scientific computing for solid earth geophysics research and teaching. Working as a scientific community enables computational geophysicists to take advantage of technological developments, improve the accuracy and performance of software, build on prior software development, and collaborate more readily. The CIG community, and others, have adopted an open-source development model, in which code is developed and disseminated by the community in an open fashion, using version control and software repositories like Git. One emerging issue is how to adequately identify and credit the intellectual contributions involved in creating open source scientific software. The traditional method of disseminating scientific ideas, peer reviewed publication, was not designed for review or crediting scientific software, although emerging publication strategies such software journals are attempting to address the need. We are piloting an integrated approach in which authors are identified and credited as scientific software is developed and run. Successful software citation requires integration with the scholarly publication and indexing mechanisms as well, to assign credit, ensure discoverability, and provide provenance for software.

  3. A Comparison and Evaluation of Real-Time Software Systems Modeling Languages

    Science.gov (United States)

    Evensen, Kenneth D.; Weiss, Kathryn Anne

    2010-01-01

    A model-driven approach to real-time software systems development enables the conceptualization of software, fostering a more thorough understanding of its often complex architecture and behavior while promoting the documentation and analysis of concerns common to real-time embedded systems such as scheduling, resource allocation, and performance. Several modeling languages have been developed to assist in the model-driven software engineering effort for real-time systems, and these languages are beginning to gain traction with practitioners throughout the aerospace industry. This paper presents a survey of several real-time software system modeling languages, namely the Architectural Analysis and Design Language (AADL), the Unified Modeling Language (UML), Systems Modeling Language (SysML), the Modeling and Analysis of Real-Time Embedded Systems (MARTE) UML profile, and the AADL for UML profile. Each language has its advantages and disadvantages, and in order to adequately describe a real-time software system's architecture, a complementary use of multiple languages is almost certainly necessary. This paper aims to explore these languages in the context of understanding the value each brings to the model-driven software engineering effort and to determine if it is feasible and practical to combine aspects of the various modeling languages to achieve more complete coverage in architectural descriptions. To this end, each language is evaluated with respect to a set of criteria such as scope, formalisms, and architectural coverage. An example is used to help illustrate the capabilities of the various languages.

  4. Software safety analysis on the model specified by NuSCR and SMV input language at requirements phase of software development life cycle using SMV

    International Nuclear Information System (INIS)

    Koh, Kwang Yong; Seong, Poong Hyun

    2005-01-01

    The safety-critical software process is composed of a development process, a verification and validation (V and V) process and a safety analysis process. The safety analysis process has often been treated as an additional process and is not found in a conventional software process. But software safety analysis (SSA) is required if software is applied to a safety system, and the SSA shall be performed independently for the safety software throughout the software development life cycle (SDLC). Of all the phases in software development, requirements engineering is generally considered to play the most critical role in determining the overall software quality. NASA data demonstrate that nearly 75% of failures found in operational software were caused by errors in the requirements. The verification process in the requirements phase checks the correctness of the software requirements specification, and the safety analysis process analyzes the safety-related properties in detail. In this paper, a method for safety analysis at the requirements phase of the software development life cycle using the symbolic model verifier (SMV) is proposed. Hazards are discovered by hazard analysis, and in order to use SMV for the safety analysis, the safety-related properties are expressed in computation tree logic (CTL)
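
    The essence of verifying a CTL safety property such as AG(not unsafe) is checking that no reachable state violates the invariant; a model checker like SMV does this symbolically over the model. The explicit-state sketch below illustrates the idea on an invented transition system; it is not NuSCR or SMV input.

```python
# Check the safety property AG(not unsafe) by explicit reachability over a
# small, invented transition system.
from collections import deque

transitions = {
    "idle":        ["sensing"],
    "sensing":     ["idle", "trip_demand"],
    "trip_demand": ["tripped"],
    "tripped":     ["tripped"],
}
unsafe = {"trip_demand_missed"}   # assumed hazard state; unreachable here, so AG holds

def ag_holds(initial, transitions, unsafe):
    seen, frontier = {initial}, deque([initial])
    while frontier:
        s = frontier.popleft()
        if s in unsafe:
            return False              # a reachable state violates the invariant
        for nxt in transitions.get(s, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True

print("AG(not unsafe):", ag_holds("idle", transitions, unsafe))
```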

  5. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  6. Path generation algorithm for UML graphic modeling of aerospace test software

    Science.gov (United States)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Chen, Chao

    2018-03-01

    Traditional aerospace software testing engineers rely on their own work experience and on communication with software developers to describe the software under test and to write test cases manually, which is time-consuming, inefficient and prone to gaps. Using the high-reliability MBT tool developed by our company, a single modeling pass can automatically generate test case documents, efficiently and accurately. A UML model that accurately describes the process must express the paths by which requirements are reached. Existing path generation algorithms are either too simple, unable to combine paths containing branches and loops, or too cumbersome, generating path arrangements that are meaningless and superfluous for aerospace software testing. Drawing on our aerospace experience, we developed a tailored path generation algorithm for UML graphical descriptions of aerospace test software.
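
    A sketch of one way to enumerate paths through a UML-activity-like graph so that branch and loop combinations are covered without an explosion of meaningless paths: depth-first search with each edge taken at most a bounded number of times per path. The graph and the bound are illustrative assumptions, not the paper's algorithm.

```python
# Enumerate entry-to-exit paths of a small control-flow-like graph,
# taking each edge at most max_edge_visits times per path so loops are
# unrolled a bounded number of times.
graph = {
    "entry":   ["check"],
    "check":   ["process", "exit"],
    "process": ["check"],             # loop back edge
    "exit":    [],
}

def generate_paths(graph, start="entry", goal="exit", max_edge_visits=2):
    paths = []

    def dfs(node, path, edge_count):
        if node == goal:
            paths.append(path)
            return
        for nxt in graph[node]:
            edge = (node, nxt)
            if edge_count.get(edge, 0) < max_edge_visits:
                edge_count[edge] = edge_count.get(edge, 0) + 1
                dfs(nxt, path + [nxt], edge_count)
                edge_count[edge] -= 1   # backtrack

    dfs(start, [start], {})
    return paths

for p in generate_paths(graph):
    print(" -> ".join(p))
```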

  7. Model-Checking of Component-Based Event-Driven Real-Time Embedded Software

    National Research Council Canada - National Science Library

    Gu, Zonghua; Shin, Kang G

    2005-01-01

    .... We discuss application of model-checking to verify system-level concurrency properties of component-based real-time embedded software based on CORBA Event Service, using Avionics Mission Computing...

  8. A reference model and technical framework for mobile social software for learning

    NARCIS (Netherlands)

    De Jong, Tim; Specht, Marcus; Koper, Rob

    2008-01-01

    De Jong, T., Specht, M., & Koper, R. (2008). A reference model and technical framework for mobile social software for learning. Presented at the IADIS m-learning 2008 Conference. April, 11-13, 2008, Carvoeiro, Portugal.

  9. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    Science.gov (United States)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.

  10. Software Infrastructure to Enable Modeling & Simulation as a Service (M&SaaS), Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — This SBIR Phase 2 project will produce a software service infrastructure that enables most modeling and simulation (M&S) activities from code development and...

  11. Software Test Description (STD) for the Globally Relocatable Navy Tide/Atmospheric Modeling System (PCTides)

    National Research Council Canada - National Science Library

    Posey, Pamela

    2002-01-01

    The purpose of this Software Test Description (STD) is to establish formal test cases to be used by personnel tasked with the installation and verification of the Globally Relocatable Navy Tide/Atmospheric Modeling System (PCTides...

  12. An Approach for the Implementation of Software Quality Models Adopting CERTICS and CMMI-DEV

    Directory of Open Access Journals (Sweden)

    GARCIA, F.W.

    2015-12-01

    Full Text Available This paper proposes a mapping between two software product quality and process models used in industry, the CERTICS national model and the CMMI-DEV international model. The stages of the mapping are presented step by step, as well as the mapping review, which had the cooperation of a specialist in the CERTICS and CMMI-DEV models. The mapping aims to correlate the structures of the two models in order to facilitate implementation and reduce its time and costs, and to stimulate the execution of multi-model implementations in software development companies.

  13. Experimental software for modeling and interpreting educational data analysis processes

    Directory of Open Access Journals (Sweden)

    Natalya V. Zorina

    2017-12-01

    Full Text Available Problems, tasks and processes of educational data mining are considered in this article. The objective is to create a fundamentally new information system for the university using the results of educational data analysis. One of the functions of such a system is knowledge extraction from the data accumulated during operation. The creation of a national system of this type is an iterative and time-consuming process requiring preliminary studies and incremental prototyping of modules. Because few systems have been developed with this methodology, a number of experiments were carried out in order to collect data, choose appropriate methods for the study and interpret the results. As a result of the experiment, the authors identified the data sources available for analysis in the information environment of their home university. The data were taken from semester performance records obtained from the information system of the training department of the Institute of IT MTU MIREA, from the results of students' independent work, and from data collected using specially designed Google forms. To automate the collection and analysis of educational data, an experimental software package was created. The methodology for developing the experimental software complex was based on the methodologies of rational-empirical complexes (REX) and single-experimentation program technologies (TPEI). The details of the program implementation of the complex are described, conclusions are given about the availability of the data sources used, and conclusions are drawn about the prospects for further development.

  14. A Module-System Discipline for Model-Driven Software Development

    NARCIS (Netherlands)

    Erdweg, S.T.; Ostermann, Klaus

    2017-01-01

    Model-driven development is a pragmatic approach to software development that embraces domain-specific languages (DSLs), where models correspond to DSL programs. A distinguishing feature of model-driven development is that clients of a model can select from an open set of alternative semantics of

  15. A software development and evolution model based on decision-making

    Science.gov (United States)

    Wild, J. Christian; Dong, Jinghuan; Maly, Kurt

    1991-01-01

    Design is a complex activity whose purpose is to construct an artifact which satisfies a set of constraints and requirements. However the design process is not well understood. The software design and evolution process is the focus of interest, and a three dimensional software development space organized around a decision-making paradigm is presented. An initial instantiation of this model called 3DPM(sub p) which was partly implemented, is presented. Discussion of the use of this model in software reuse and process management is given.

  16. Web-EEDF: open source software for modeling the electron dynamics

    International Nuclear Information System (INIS)

    Janda, M.; Machala, Z.; Morvova, M.; Francek, V.; Lukac, P.

    2005-01-01

    We present free software, named Web-EEDF, for modeling electron dynamics in a uniform electric field. It uses a Monte Carlo algorithm to calculate electron energy distribution functions (EEDFs) and other plasma parameters in various mixtures. The obtained results are in good agreement with the literature. This software represents the first stage in a more complex modeling of plasma chemical processes leading to the decomposition of various air pollutants in electrical discharges at atmospheric pressure (Authors)

  17. Health Technology Assessment of CEM Pulpotomy in Permanent Molars with Irreversible Pulpitis.

    Science.gov (United States)

    Yazdani, Shahram; Jadidfard, Mohammad-Pooyan; Tahani, Bahareh; Kazemian, Ali; Dianat, Omid; Alim Marvasti, Laleh

    2014-01-01

    Teeth with irreversible pulpitis usually undergo root canal therapy (RCT). This treatment modality is often considered disadvantageous as it removes vital pulp tissue and weakens the tooth structure. A relatively new concept has risen which suggests vital pulp therapy (VPT) for irreversible pulpitis. VPT with calcium enriched mixture (VPT/CEM) has demonstrated favorable treatment outcomes when treating permanent molars with irreversible pulpitis. This study aims to compare patient related factors, safety and organizational consideration as parts of health technology assessment (HTA) of the new VPT/CEM biotechnology when compared with RCT. Patient related factors were assessed by looking at short- and long-term clinical success; safety related factors were evaluated by a specialist committee and discussion board involved in formulating healthcare policies. Organizational evaluation was performed and the social implications were assessed by estimating the costs, availability, accessibility and acceptability. The impact of VPT/CEM biotechnology was assessed by investigating the incidence of irreversible pulpitis and the effect of this treatment on reducing the burden of disease. VPT/CEM biotechnology was deemed feasible and acceptable like RCT; however, it was more successful, accessible, affordable, available and also safer than RCT. When considering socioeconomic implications on oral health status and oral health-related quality of life of VPT/CEM, the novel biotechnology can be more effective and more efficient than RCT in mature permanent molars with irreversible pulpitis.

  18. Cytotoxicity of arctigenin and matairesinol against the T-cell lymphoma cell line CCRF-CEM.

    Science.gov (United States)

    Su, Shan; Cheng, Xinlai; Wink, Michael

    2015-09-01

    Arctigenin and matairesinol possess a diversity of bioactivities. Here we investigated the cytotoxicity of arctigenin and matairesinol against the T-cell lymphoma cell line CCRF-CEM and the underlying mechanisms, which have not been explored before. The cytotoxic activity was investigated using the MTT assay. Cell cycle arrest and reactive oxygen species (ROS) accumulation were determined by flow cytometric analysis. Apoptosis induction was assessed using the Annexin V/Propidium Iodide assay. Gene expression was quantified by real-time polymerase chain reaction. Arctigenin and matairesinol exhibited significant antiproliferative activity against CCRF-CEM cells after 72 h treatment, with IC50 values of 1.21 ± 0.15 μM and 4.27 ± 0.41 μM, respectively. In addition, both lignans arrested CCRF-CEM cells in the S phase. Furthermore, they could induce apoptosis in CCRF-CEM cells in a concentration- and time-dependent manner. Interestingly, the lignans differentially regulated the expression of several key genes involved in apoptosis pathways, including Bax, Bad and caspase-9. Moreover, both lignans could increase ROS levels in CCRF-CEM cells. Our study provides an insight into the potential of arctigenin and matairesinol as good candidates for the development of novel agents against T-cell lymphoma. © 2015 Royal Pharmaceutical Society.

  19. Health Technology Assessment of CEM Pulpotomy in Permanent Molars with Irreversible Pulpitis

    Science.gov (United States)

    Yazdani, Shahram; Jadidfard, Mohammad-Pooyan; Tahani, Bahareh; Kazemian, Ali; Dianat, Omid; Alim Marvasti, Laleh

    2014-01-01

    Introduction: Teeth with irreversible pulpitis usually undergo root canal therapy (RCT). This treatment modality is often considered disadvantageous as it removes vital pulp tissue and weakens the tooth structure. A relatively new concept has risen which suggests vital pulp therapy (VPT) for irreversible pulpitis. VPT with calcium enriched mixture (VPT/CEM) has demonstrated favorable treatment outcomes when treating permanent molars with irreversible pulpitis. This study aims to compare patient related factors, safety and organizational consideration as parts of health technology assessment (HTA) of the new VPT/CEM biotechnology when compared with RCT. Materials and Methods: Patient related factors were assessed by looking at short- and long-term clinical success; safety related factors were evaluated by a specialist committee and discussion board involved in formulating healthcare policies. Organizational evaluation was performed and the social implications were assessed by estimating the costs, availability, accessibility and acceptability. The impact of VPT/CEM biotechnology was assessed by investigating the incidence of irreversible pulpitis and the effect of this treatment on reducing the burden of disease. Results: VPT/CEM biotechnology was deemed feasible and acceptable like RCT; however, it was more successful, accessible, affordable, available and also safer than RCT. Conclusion: When considering socioeconomic implications on oral health status and oral health-related quality of life of VPT/CEM, the novel biotechnology can be more effective and more efficient than RCT in mature permanent molars with irreversible pulpitis. PMID:24396372

  20. Conceptual Software Reliability Prediction Models for Nuclear Power Plant Safety Systems

    International Nuclear Information System (INIS)

    Johnson, G.; Lawrence, D.; Yu, H.

    2000-01-01

    The objective of this project is to develop a method to predict the potential reliability of software to be used in a digital system instrumentation and control system. The reliability prediction is to make use of existing measures of software reliability such as those described in IEEE Std 982 and 982.2. This prediction must be of sufficient accuracy to provide a value for uncertainty that could be used in a nuclear power plant probabilistic risk assessment (PRA). For the purposes of the project, reliability was defined to be the probability that the digital system will successfully perform its intended safety function (for the distribution of conditions under which it is expected to respond) upon demand with no unintended functions that might affect system safety. The ultimate objective is to use the identified measures to develop a method for predicting the potential quantitative reliability of a digital system. The reliability prediction models proposed in this report are conceptual in nature. That is, possible prediction techniques are proposed and trial models are built, but in order to become a useful tool for predicting reliability, the models must be tested, modified according to the results, and validated. Using methods outlined by this project, models could be constructed to develop reliability estimates for elements of software systems. This would require careful review and refinement of the models, development of model parameters from actual experience data or expert elicitation, and careful validation. By combining these reliability estimates (generated from the validated models for the constituent parts) in structural software models, the reliability of the software system could then be predicted. Modeling digital system reliability will also require that methods be developed for combining reliability estimates for hardware and software. System structural models must also be developed in order to predict system reliability based upon the reliability

  1. Transforming Cobol Legacy Software to a Generic Imperative Model

    National Research Council Canada - National Science Library

    Moraes, Dina L.

    1999-01-01

    .... This research develops a transformation system to convert COBOL code into a generic imperative model, recapturing the initial design and deciphering the requirements implemented by the legacy code...

  2. Building the Scientific Modeling Assistant: An interactive environment for specialized software design

    Science.gov (United States)

    Keller, Richard M.

    1991-01-01

    The construction of scientific software models is an integral part of doing science, both within NASA and within the scientific community at large. Typically, model-building is a time-intensive and painstaking process, involving the design of very large, complex computer programs. Despite the considerable expenditure of resources involved, completed scientific models cannot easily be distributed and shared with the larger scientific community due to the low-level, idiosyncratic nature of the implemented code. To address this problem, we have initiated a research project aimed at constructing a software tool called the Scientific Modeling Assistant. This tool provides automated assistance to the scientist in developing, using, and sharing software models. We describe the Scientific Modeling Assistant, and also touch on some human-machine interaction issues relevant to building a successful tool of this type.

  3. Software engineering the mixed model for genome-wide association studies on large samples.

    Science.gov (United States)

    Zhang, Zhiwu; Buckler, Edward S; Casstevens, Terry M; Bradbury, Peter J

    2009-11-01

    Mixed models improve the ability to detect phenotype-genotype associations in the presence of population stratification and multiple levels of relatedness in genome-wide association studies (GWAS), but for large data sets the resource consumption becomes impractical. At the same time, the sample size and number of markers used for GWAS is increasing dramatically, resulting in greater statistical power to detect those associations. The use of mixed models with increasingly large data sets depends on the availability of software for analyzing those models. While multiple software packages implement the mixed model method, no single package provides the best combination of fast computation, ability to handle large samples, flexible modeling and ease of use. Key elements of association analysis with mixed models are reviewed, including modeling phenotype-genotype associations using mixed models, population stratification, kinship and its estimation, variance component estimation, use of best linear unbiased predictors or residuals in place of raw phenotype, improving efficiency and software-user interaction. The available software packages are evaluated, and suggestions made for future software development.
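
    The core computation shared by such packages can be written compactly. The sketch below (assumed notation, not the interface of any package reviewed here) estimates the fixed effects of the mixed model y = X*beta + u + e, with u ~ N(0, sg2*K) for a kinship matrix K, by generalized least squares once the variance components are known:

        import numpy as np

        def gls_fixed_effects(y, X, K, sg2, se2):
            """Fixed-effect estimates and their covariance under V = sg2*K + se2*I."""
            n = len(y)
            V = sg2 * K + se2 * np.eye(n)          # phenotype covariance
            Vinv = np.linalg.inv(V)
            XtVinv = X.T @ Vinv
            beta = np.linalg.solve(XtVinv @ X, XtVinv @ y)
            cov_beta = np.linalg.inv(XtVinv @ X)   # sampling covariance of beta-hat
            return beta, cov_beta

        # Toy data: intercept plus one candidate marker, five individuals,
        # kinship set to the identity matrix for simplicity.
        y = np.array([1.0, 2.0, 1.5, 3.0, 2.5])
        X = np.column_stack([np.ones(5), np.array([0, 1, 0, 2, 1])])
        beta, cov = gls_fixed_effects(y, X, np.eye(5), sg2=0.5, se2=0.5)
        print(beta)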

  4. Integrating environmental component models. Development of a software framework

    NARCIS (Netherlands)

    Schmitz, O.

    2014-01-01

    Integrated models consist of interacting component models that represent various natural and social systems. They are important tools to improve our understanding of environmental systems, to evaluate cause–effect relationships of human–natural interactions, and to forecast the behaviour of

  5. Software Reuse of Mobile Systems based on Modelling

    Directory of Open Access Journals (Sweden)

    Guo Ping

    2016-01-01

    Full Text Available This paper presents an architectural-style-based modelling approach for the architectural design and analysis of mobile systems. The approach is developed based on UML-like meta models and graph transformation techniques to support sound methodological principles, formal analysis and refinement. The approach could support mobile system development.

  6. Harmonic Domain Modeling of a Distribution System Using the DIgSILENT PowerFactory Software

    DEFF Research Database (Denmark)

    Wasilewski, J.; Wiechowski, Wojciech Tomasz; Bak, Claus Leth

    The first part of this paper presents the comparison between two models of a distribution system created in the computer simulation software PowerFactory (PF). Model A is an existing simplified equivalent model of the distribution system used by the Transmission System Operator (TSO) Eltra for balanced load...

  7. Decision making model design for antivirus software selection using Factor Analysis and Analytical Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Nurhayati Ai

    2018-01-01

    Full Text Available Virus spread increased significantly through the internet in 2017. One of the protection methods is using antivirus software. The wide variety of antivirus software on the market tends to create confusion among consumers, and selecting the right antivirus for their needs has become difficult. This is the reason we conducted our research. We formulate a decision-making model for antivirus software consumers. The model is constructed using factor analysis and the AHP method. First we distributed questionnaires to consumers; from those questionnaires we identified 16 variables that need to be considered when selecting antivirus software. These 16 variables were then grouped into 5 factors using the factor analysis method in SPSS software. These five factors are security, performance, internal, time and capacity. To rank those factors we distributed questionnaires to 6 IT experts, and the data were analyzed using the AHP method. The result is that the performance factor ranked highest among all factors. Thus, consumers can select antivirus software by judging the variables in the performance factor: software loading speed, user friendliness, no excessive memory use, thorough scanning, and fast and accurate virus scanning.
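
    A minimal sketch of the AHP ranking step described above (the pairwise judgments are invented for illustration; only the five factor names come from the abstract):

        import numpy as np

        def ahp_weights(pairwise):
            """Priority weights from a pairwise comparison matrix (principal eigenvector)."""
            vals, vecs = np.linalg.eig(pairwise)
            k = np.argmax(vals.real)                       # principal eigenvalue
            w = np.abs(vecs[:, k].real)
            w /= w.sum()                                   # normalized priority vector
            n = pairwise.shape[0]
            ci = (vals.real[k] - n) / (n - 1)              # consistency index
            ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.12)  # Saaty's random index
            return w, ci / ri                              # weights, consistency ratio

        factors = ["security", "performance", "internal", "time", "capacity"]
        # Example judgments only; real values would come from the expert questionnaires.
        A = np.array([[1,   1/2, 3, 2, 2],
                      [2,   1,   4, 3, 3],
                      [1/3, 1/4, 1, 1, 1],
                      [1/2, 1/3, 1, 1, 1],
                      [1/2, 1/3, 1, 1, 1]], dtype=float)
        w, cr = ahp_weights(A)
        print(dict(zip(factors, np.round(w, 3))), "CR =", round(cr, 3))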

  8. MODELING WIND TURBINES IN THE GRIDLAB-D SOFTWARE ENVIRONMENT

    Energy Technology Data Exchange (ETDEWEB)

    Fuller, J.C.; Schneider, K.P.

    2009-01-01

    In recent years, the rapid expansion of wind power has resulted in a need to more accurately model the effects of wind penetration on the electricity infrastructure. GridLAB-D is a new simulation environment developed for the U.S. Department of Energy (DOE) by the Pacific Northwest National Laboratory (PNNL), in cooperation with academic and industrial partners. GridLAB-D was originally written and designed to help integrate end-use smart grid technologies, and it is currently being expanded to include a number of other technologies, including distributed energy resources (DER). The specific goal of this project is to create a preliminary wind turbine generator (WTG) model for integration into GridLAB-D. As wind power penetration increases, models are needed to accurately study the effects of increased penetration; this project is a beginning step at examining these effects within the GridLAB-D environment. Aerodynamic, mechanical and electrical power models were designed to simulate the process by which mechanical power is extracted by a wind turbine and converted into electrical energy. The process was modeled using historic atmospheric data, collected over a period of 30 years as the primary energy input. This input was then combined with preliminary models for synchronous and induction generators. Additionally, basic control methods were implemented, using either constant power factor or constant power modes. The model was then compiled into the GridLAB-D simulation environment, and the power outputs were compared against manufacturers’ data and then a variation of the IEEE 4 node test feeder was used to examine the model’s behavior. Results showed the designs were sufficient for a prototype model and provided output power similar to the available manufacturers’ data. The prototype model is designed as a template for the creation of new modules, with turbine-specific parameters to be added by the user.
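
    A highly simplified sketch of the aerodynamic part of such a model (this is not GridLAB-D code, and the turbine parameters are assumptions chosen only for illustration): extracted power grows with the cube of wind speed up to rated speed and is zero outside the cut-in/cut-out range.

        import math

        def turbine_power_kw(v, rho=1.225, radius=40.0, cp=0.40,
                             v_cut_in=3.0, v_rated=12.0, v_cut_out=25.0):
            """Mechanical power (kW) extracted from wind at speed v (m/s)."""
            if v < v_cut_in or v > v_cut_out:
                return 0.0
            v_eff = min(v, v_rated)                 # pitch control above rated speed
            area = math.pi * radius ** 2            # swept rotor area
            return 0.5 * rho * area * cp * v_eff ** 3 / 1000.0   # W -> kW

        for v in (5, 10, 15, 30):
            print(v, "m/s ->", round(turbine_power_kw(v), 1), "kW")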

  9. Do Over or Make Do? Climate Models as a Software Development Challenge (Invited)

    Science.gov (United States)

    Easterbrook, S. M.

    2010-12-01

    We present the results of a comparative study of the software engineering culture and practices at four different earth system modeling centers: the UK Met Office Hadley Centre, the National Center for Atmospheric Research (NCAR), The Max-Planck-Institut für Meteorologie (MPI-M), and the Institut Pierre Simon Laplace (IPSL). The study investigated the software tools and techniques used at each center to assess their effectiveness. We also investigated how differences in the organizational structures, collaborative relationships, and technical infrastructures constrain the software development and affect software quality. Specific questions for the study included 1) Verification and Validation - What techniques are used to ensure that the code matches the scientists’ understanding of what it should do? How effective are these are at eliminating errors of correctness and errors of understanding? 2) Coordination - How are the contributions from across the modeling community coordinated? For coupled models, how are the differences in the priorities of different, overlapping communities of users addressed? 3) Division of responsibility - How are the responsibilities for coding, verification, and coordination distributed between different roles (scientific, engineering, support) in the organization? 4) Planning and release processes - How do modelers decide on priorities for model development, how do they decide which changes to tackle in a particular release of the model? 5) Debugging - How do scientists debug the models, what types of bugs do they find in their code, and how they find them? The results show that each center has evolved a set of model development practices that are tailored to their needs and organizational constraints. These practices emphasize scientific validity, but tend to neglect other software qualities, and all the centers struggle frequently with software problems. The testing processes are effective at removing software errors prior to

  10. Modeling a distributed environment for a petroleum reservoir engineering application with software product line

    International Nuclear Information System (INIS)

    Scheidt, Rafael de Faria; Vilain, Patrícia; Dantas, M A R

    2014-01-01

    Petroleum reservoir engineering is a complex and interesting field that requires a large amount of computational facilities to achieve successful results. Usually, software environments for this field are developed without taking into account the possible interactions and extensibility required by reservoir engineers. In this paper, we present a research work characterized by the design and implementation of a software product line model for a real distributed reservoir engineering environment. Experimental results indicate the successful use of this approach for the design of a distributed software architecture. In addition, all components of the proposal provided the reservoir engineers with greater visibility of the organization and its processes

  11. Accuracy in Orbital Propagation: A Comparison of Predictive Software Models

    Science.gov (United States)

    2017-06-01

    modified atmospheric drag equations, provided by Richard H. Smith, due to the computer’s processing limitations with Brouwer and Hori’s atmospheric model...positions from PPT3 and SGP4, as well as differences in their accuracy. The atmospheric model within each propagator is determined to be the most...effective component of each propagator to test, as the theoretical atmospheric drag calculation methods of PPT3 and SGP4 differ greatly. PPT3 and SGP4

  12. The software-cycle model for re-engineering and reuse

    Science.gov (United States)

    Bailey, John W.; Basili, Victor R.

    1992-01-01

    This paper reports on the progress of a study which will contribute to our ability to perform high-level, component-based programming by describing means to obtain useful components, methods for the configuration and integration of those components, and an underlying economic model of the costs and benefits associated with this approach to reuse. One goal of the study is to develop and demonstrate methods to recover reusable components from domain-specific software through a combination of tools, to perform the identification, extraction, and re-engineering of components, and domain experts, to direct the applications of those tools. A second goal of the study is to enable the reuse of those components by identifying techniques for configuring and recombining the re-engineered software. This component-recovery or software-cycle model addresses not only the selection and re-engineering of components, but also their recombination into new programs. Once a model of reuse activities has been developed, the quantification of the costs and benefits of various reuse options will enable the development of an adaptable economic model of reuse, which is the principal goal of the overall study. This paper reports on the conception of the software-cycle model and on several supporting techniques of software recovery, measurement, and reuse which will lead to the development of the desired economic model.

  13. OASIS4 – a coupling software for next generation earth system modelling

    Directory of Open Access Journals (Sweden)

    R. Redler

    2010-01-01

    Full Text Available In this article we present a new version of the Ocean Atmosphere Sea Ice Soil coupling software (OASIS4). With this new fully parallel OASIS4 coupler we target the needs of Earth system modelling in its full complexity. The primary focus of this article is to describe the design of the OASIS4 software and how the coupling software drives the whole coupled model system, ensuring the synchronization of the different component models. The application programmer interface (API) manages the coupling exchanges between arbitrary climate component models, as well as the input and output from and to files of each individual component. The OASIS4 Transformer instance performs the parallel interpolation and transfer of the coupling data between source and target model components. As a new core technology for the software, the fully parallel search algorithm of OASIS4 is described in detail. First benchmark results are discussed with simple test configurations to demonstrate the efficiency and scalability of the software when applied to Earth system model components. Typically the compute time needed to perform the search is in the order of a few seconds and is only weakly dependent on the grid size.

  14. Computational Software for Fitting Seismic Data to Epidemic-Type Aftershock Sequence Models

    Science.gov (United States)

    Chu, A.

    2014-12-01

    Modern earthquake catalogs are often analyzed using spatial-temporal point process models such as the epidemic-type aftershock sequence (ETAS) models of Ogata (1998). My work introduces software to implement two of the ETAS models described in Ogata (1998). To find the Maximum-Likelihood Estimates (MLEs), my software provides estimates of the homogeneous background rate parameter and of the temporal and spatial parameters that govern triggering effects by applying the Expectation-Maximization (EM) algorithm introduced in Veen and Schoenberg (2008). Although other computer programs exist for a similar data modeling purpose, using the EM algorithm has the benefits of stability and robustness (Veen and Schoenberg, 2008). Spatial shapes that are very long and narrow cause difficulties in optimization convergence, and flat or multi-modal log-likelihood functions raise similar issues. My program uses a robust method of presetting a parameter to overcome this non-convergence issue. In addition to model fitting, the software is equipped with useful tools for examining model fitting results, for example, visualization of the estimated conditional intensity and estimation of the expected number of triggered aftershocks. A simulation generator is also provided, with flexible spatial shapes that may be defined by the user. This open-source software has a very simple user interface. The user may execute it on a local computer, and the program also has potential to be hosted online. The Java language is used for the software's core computing part, and an optional interface to the statistical package R is provided.
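
    For orientation, a purely temporal reduction of the ETAS conditional intensity can be sketched as below (parameter names are assumed; the software described here fits full spatial-temporal models):

        import numpy as np

        def etas_intensity(t, times, mags, mu, K, alpha, c, p, m0):
            """lambda(t) = mu + sum_{t_i < t} K * exp(alpha*(m_i - m0)) / (t - t_i + c)**p"""
            times, mags = np.asarray(times), np.asarray(mags)
            past = times < t
            trig = K * np.exp(alpha * (mags[past] - m0)) / (t - times[past] + c) ** p
            return mu + trig.sum()

        # Toy catalog: event times (days) and magnitudes.
        times = [1.0, 1.2, 3.5]
        mags = [4.0, 3.2, 5.1]
        print(etas_intensity(4.0, times, mags,
                             mu=0.1, K=0.05, alpha=1.0, c=0.01, p=1.1, m0=3.0))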

  15. Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing

    Energy Technology Data Exchange (ETDEWEB)

    Price, Phillip N.; Granderson, Jessica; Sohn, Michael; Addy, Nathan; Jump, David

    2013-09-01

    The overarching goal of this work is to advance the capabilities of technology evaluators in evaluating the building-level baseline modeling capabilities of Energy Management and Information System (EMIS) software. Through their customer engagement platforms and products, EMIS software products have the potential to produce whole-building energy savings through multiple strategies: building system operation improvements, equipment efficiency upgrades and replacements, and inducement of behavioral change among the occupants and operations personnel. Some offerings may also automate the quantification of whole-building energy savings, relative to a baseline period, using empirical models that relate energy consumption to key influencing parameters, such as ambient weather conditions and building operation schedule. These automated baseline models can be used to streamline the whole-building measurement and verification (M&V) process, and therefore are of critical importance in the context of multi-measure whole-building focused utility efficiency programs. This report documents the findings of a study that was conducted to begin answering critical questions regarding quantification of savings at the whole-building level, and the use of automated and commercial software tools. To evaluate the modeling capabilities of EMIS software particular to the use case of whole-building savings estimation, four research questions were addressed: 1. What is a general methodology that can be used to evaluate baseline model performance, both in terms of a) overall robustness, and b) relative to other models? 2. How can that general methodology be applied to evaluate proprietary models that are embedded in commercial EMIS tools? How might one handle practical issues associated with data security, intellectual property, appropriate testing ‘blinds’, and large data sets? 3. How can buildings be pre-screened to identify those that are the most model-predictable, and therefore those
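
    As an example of the kind of baseline model performance statistic such a methodology might compute, the sketch below implements two goodness-of-fit metrics commonly used in whole-building measurement and verification practice, CV(RMSE) and NMBE; the report's own metric set may differ.

        import numpy as np

        def baseline_metrics(measured, predicted, n_params=0):
            """Return CV(RMSE) and NMBE, both in percent."""
            measured, predicted = np.asarray(measured, float), np.asarray(predicted, float)
            n = len(measured)
            resid = measured - predicted
            mean = measured.mean()
            cv_rmse = np.sqrt((resid ** 2).sum() / (n - n_params)) / mean * 100.0
            nmbe = resid.sum() / ((n - n_params) * mean) * 100.0
            return cv_rmse, nmbe

        # Toy monthly energy use (measured vs. baseline-model prediction).
        measured = [100, 120, 90, 110, 105]
        predicted = [98, 118, 95, 108, 103]
        print(baseline_metrics(measured, predicted, n_params=2))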

  16. Structures and scan strategies of software net models

    International Nuclear Information System (INIS)

    Puhr-Westerheide, P.; Sandbaek, H.

    1984-01-01

    The present paper deals with some aspects of plant control and monitoring systems as used in nuclear power plants. These aspects concern executable net models to run on computers. A short survey on the nets' environment and on some net scan strategies is given. Among the strategies are the 'topologically ordered scan' and the 'signal propagation scan'. A combined method 'topologically ordered signal propagation (TOSIP) scan' will be outlined as well as a net model data structure that allows the definition of subsystems for the use of clear structuration and dischargement to distributed systems. (author)

  17. Shear beams in finite element modelling : Software implementation and validation

    NARCIS (Netherlands)

    Schreppers, G.J.; Hendriks, M.A.N.; Boer, A.; Ferreira, D.; Kikstra, W.P.

    2015-01-01

    Fiber models for beam and shell elements allow for relatively rapid finite element analysis of concrete structures and structural elements. This project aims at the development of the formulation of such elements and a pilot implementation. The reduction of calculation time and degrees of freedom

  18. Multi-physics fluid-structure interaction modelling software

    CSIR Research Space (South Africa)

    Malan, AG

    2008-11-01

    Full Text Available The CSIR researchers developed new ground-breaking software modelling technologies to be used in the design of safe and efficient next-generation aircraft. The field of fluid-structure interaction (FSI) covers a massive range of engineering problems...

  19. Development of a plug-in for Variability Modeling in Software Product Lines

    Directory of Open Access Journals (Sweden)

    María Lucía López-Araujo

    2012-03-01

    Full Text Available Software Product Lines (SPL) take economic advantage of the commonality and variability among a set of software systems that exist within a specific domain. Software Product Line Engineering therefore defines a series of processes for the development of an SPL that consider commonality and variability throughout the software life cycle. Variability modeling is consequently an essential activity in a Software Product Line Engineering approach. Several techniques for variability modeling exist. COVAMOF stands out among them since it allows the modeling of variation points, variants and dependencies as first-class entities, providing a uniform manner of representing such concepts at the different levels of abstraction of an SPL. In order to take advantage of the benefits of COVAMOF, it is necessary to have a computer-aided tool; otherwise, variability modeling and management can become an arduous task for the software engineer. This work presents the development of a COVAMOF plug-in for Eclipse.

  20. Empirical Study of Homogeneous and Heterogeneous Ensemble Models for Software Development Effort Estimation

    Directory of Open Access Journals (Sweden)

    Mahmoud O. Elish

    2013-01-01

    Full Text Available Accurate estimation of software development effort is essential for effective management and control of software development projects. Many software effort estimation methods have been proposed in the literature including computational intelligence models. However, none of the existing models proved to be suitable under all circumstances; that is, their performance varies from one dataset to another. The goal of an ensemble model is to manage each of its individual models’ strengths and weaknesses automatically, leading to the best possible decision being taken overall. In this paper, we have developed different homogeneous and heterogeneous ensembles of optimized hybrid computational intelligence models for software development effort estimation. Different linear and nonlinear combiners have been used to combine the base hybrid learners. We have conducted an empirical study to evaluate and compare the performance of these ensembles using five popular datasets. The results confirm that individual models are not reliable as their performance is inconsistent and unstable across different datasets. Although none of the ensemble models was consistently the best, many of them were frequently among the best models for each dataset. The homogeneous ensemble of support vector regression (SVR, with the nonlinear combiner adaptive neurofuzzy inference systems-subtractive clustering (ANFIS-SC, was the best model when considering the average rank of each model across the five datasets.

  1. A model of cloud application assignments in software-defined storages

    Science.gov (United States)

    Bolodurina, Irina P.; Parfenov, Denis I.; Polezhaev, Petr N.; E Shukhman, Alexander

    2017-01-01

    The aim of this study is to analyze the structure and mechanisms of interaction of typical cloud applications and to suggest approaches to optimize their placement in storage systems. In this paper, we describe a generalized model of cloud applications comprising three basic layers: a model of the application, a model of the service, and a model of the resource. The distinctive feature of the suggested model is that it analyzes cloud resources both from the user's point of view and from the point of view of a software-defined infrastructure of the virtual data center (DC). The innovative character of this model lies in describing the application data placements and the state of the virtual environment at the same time, taking into account the network topology. The model of software-defined storage has been developed as a submodel within the resource model. This model allows implementing an algorithm for controlling cloud application assignments in software-defined storages. Experimental research showed that this algorithm decreases cloud application response time and increases performance in processing user requests. The use of software-defined data storages allows a decrease in the number of physical storage devices, which demonstrates the efficiency of our algorithm.

  2. Development and implementation of pharmaceutical care planning software for nursing homes based on the Fleetwood Model.

    Science.gov (United States)

    Lapane, Kate L; Hiris, Jeffrey; Hughes, Carmel M; Feinberg, Janice

    2006-12-15

    The effectiveness of pharmaceutical care planning software for nursing homes and the extent to which the software assisted in the implementation of the Fleetwood Model are described. During the study, one long-term-care pharmacy identified 13 nursing homes to participate in the intervention group of a study evaluating the effectiveness of the Fleetwood Model. To successfully implement the Fleetwood Model, which demands prospective drug regimen review and collaborative practices between dispensing and consultant pharmacists, a software system that exchanged information between these pharmacists was deemed necessary. Pharmacists' self-reported assessments of the use of the software and the technical difficulties reported with its use were collected. The number of interventions performed by pharmacist type, the proportion of residents receiving interventions by multiple pharmacists, and the extent to which the interventions were prospective and performed before the mandated 30-day review were estimated from data documented in the software. The consistency of software use by the pharmacists was also estimated. Seventy-one percent of dispensing pharmacists and 40% of consultant pharmacists reported using the software most or all of the time. Fourteen percent of dispensing pharmacists and 40% of consultant pharmacists reported technical difficulties with the software. Over half of newly admitted or readmitted residents received a Fleetwood intervention within 3 days of admittance into the nursing home-71.2% occurred in less than 30 days of admission. The use of information technology to increase communication among health care professionals and assist in providing prospective drug regimen review in long-term-care facilities is feasible. Collaboration and extensive field testing with end users, realistic expectations, appropriate training, and technical support are necessary when implementing new technology.

  3. Development of a highly efficient conversion electron Moessbauer spectroscopy (CEMS) detector for low temperature measurements on Fe/(EuxPb1-x)Te bilayers

    International Nuclear Information System (INIS)

    Pombo, Carlos Jose da Silva Matos

    2006-01-01

    57Fe Moessbauer spectroscopy is a nuclear, non-destructive technique used for the investigation of structural, magnetic and hyperfine properties of many materials. It is a powerful tool for characterizing materials in physics, metallurgy, geology and biology, especially magnetic materials, alloys and minerals containing Fe. Lately, Conversion Electron Moessbauer Spectroscopy (CEMS) has been widely used in studies of ultra-thin magnetic films as well as other nanostructured materials. In the case of magnetic nanostructures, low temperature (LT) studies are especially important because of the possibility of dealing with superparamagnetic effects. In this work, a CEMS measurement system for low temperatures ( R ) and an optical cryostat (Model SVT-400, Janis Research Co, USA) was developed; the project was originally conceived at the Applied Physics / Moessbauer Spectroscopy Department of the University of Duisburg-Essen, Germany. The LT-CEMS system was fully built, tested and successfully applied in a preliminary characterization of Fe/(EuxPb1-x)Te(111) bilayers with the use of a 15 angstrom 57Fe probe layer, with reasonable results at sample temperatures as low as 8 K. (author)

  4. An Overview of Mesoscale Modeling Software for Energetic Materials Research

    Science.gov (United States)

    2010-03-01

    molecular dynamics program by NAgoya Cooperation ( COGNAC ) • Polymer rheology Analyzer with Slip-link model of entanglement (PASTA) • Simulation...FEM, and self-consistent field method. Detailed descriptions of the four simulation programs are below: • COGNAC ―A molecular dynamics program that...code 2. Available on Windows, Linux and MacOSX operating systems 3. Common GUI 4. COGNAC a. Density biased Monte Carlo and density biased

  5. A testing-coverage software reliability model considering fault removal efficiency and error generation.

    Science.gov (United States)

    Li, Qiuying; Pham, Hoang

    2017-01-01

    In this paper, we propose a software reliability model that considers not only error generation but also fault removal efficiency combined with testing coverage information based on a nonhomogeneous Poisson process (NHPP). During the past four decades, many software reliability growth models (SRGMs) based on NHPP have been proposed to estimate the software reliability measures, most of which have the same following agreements: 1) it is a common phenomenon that during the testing phase, the fault detection rate always changes; 2) as a result of imperfect debugging, fault removal has been related to a fault re-introduction rate. But there are few SRGMs in the literature that differentiate between fault detection and fault removal, i.e. they seldom consider the imperfect fault removal efficiency. But in practical software developing process, fault removal efficiency cannot always be perfect, i.e. the failures detected might not be removed completely and the original faults might still exist and new faults might be introduced meanwhile, which is referred to as imperfect debugging phenomenon. In this study, a model aiming to incorporate fault introduction rate, fault removal efficiency and testing coverage into software reliability evaluation is developed, using testing coverage to express the fault detection rate and using fault removal efficiency to consider the fault repair. We compare the performance of the proposed model with several existing NHPP SRGMs using three sets of real failure data based on five criteria. The results exhibit that the model can give a better fitting and predictive performance.
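
    For context, the simplest NHPP software reliability growth model, the Goel-Okumoto form, is sketched below; the model proposed in the paper additionally folds in testing coverage, fault removal efficiency and error generation, which are not reproduced here.

        import numpy as np

        def mean_value(t, a, b):
            """Expected cumulative number of faults detected by time t: m(t) = a(1 - e^{-bt})."""
            return a * (1.0 - np.exp(-b * t))

        def intensity(t, a, b):
            """Failure intensity lambda(t) = dm/dt = a*b*e^{-bt}."""
            return a * b * np.exp(-b * t)

        # a = total expected faults, b = per-fault detection rate (assumed values).
        for t in (10, 50, 100):
            print(t, round(mean_value(t, a=120, b=0.03), 1),
                  round(intensity(t, a=120, b=0.03), 3))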

  6. Model-Driven Robot-Software Design using integrated Models and Co-Simulation

    NARCIS (Netherlands)

    Broenink, Johannes F.; Ni, Yunyun; McAllister, J.; Bhattacharyya, S.

    2012-01-01

    The work presented here is on a methodology for design of hard real-time embedded control software for robots, i.e. mechatronic products. The behavior of the total robot system (machine, control, software and I/O) is relevant, because the dynamics of the machine influences the robot software.

  7. Algorithms and Software for Predictive and Perceptual Modeling of Speech

    CERN Document Server

    Atti, Venkatraman

    2010-01-01

    From the early pulse code modulation-based coders to some of the recent multi-rate wideband speech coding standards, the area of speech coding made several significant strides with an objective to attain high quality of speech at the lowest possible bit rate. This book presents some of the recent advances in linear prediction (LP)-based speech analysis that employ perceptual models for narrow- and wide-band speech coding. The LP analysis-synthesis framework has been successful for speech coding because it fits well the source-system paradigm for speech synthesis. Limitations associated with th

  8. Investigation of a model to verify software for 3-D static force calculation

    OpenAIRE

    Takahashi, Norio; Nakata, Takayoshi; Morishige, H.

    1994-01-01

    Requirements for a model to verify software for 3-D static force calculation are examined, and a 3-D model for static force calculation is proposed. Some factors affecting the analysis and experiments are investigated in order to obtain accurate and reproducible results

  9. A model independent S/W framework for search-based software testing.

    Science.gov (United States)

    Oh, Jungsup; Baik, Jongmoon; Lim, Sung-Hwa

    2014-01-01

    In Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models have been used in MBT. If the type of a model has changed from one to another, all functions of a search technique must be reimplemented because the types of models are different even if the same search technique has been applied. It requires too much time and effort to implement the same algorithm over and over again. We propose a model-independent software framework for SBST, which can reduce redundant works. The framework provides a reusable common software platform to reduce time and effort. The software framework not only presents design patterns to find test cases for a target model but also reduces development time by using common functions provided in the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves the productivity by about 50% when changing the type of a model.
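
    The decoupling idea can be illustrated in a few lines: the search algorithm only sees an abstract fitness function and candidate generator, so the underlying model type can change without reimplementing the search. The names below are hypothetical and do not reflect the framework's actual API.

        import random

        def random_search(fitness, sample_candidate, iterations=1000):
            """Model-independent search: lower fitness is better."""
            best, best_fit = None, float("inf")
            for _ in range(iterations):
                cand = sample_candidate()
                fit = fitness(cand)
                if fit < best_fit:
                    best, best_fit = cand, fit
            return best, best_fit

        # Model-specific parts are plugged in as plain callables.
        target = [3, 7, 2]   # e.g. an encoding of a path or branch to cover
        fitness = lambda c: sum(abs(a - b) for a, b in zip(c, target))
        sample = lambda: [random.randint(0, 9) for _ in range(3)]
        print(random_search(fitness, sample, iterations=2000))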

  10. APPLYING TEACHING-LEARNING TO ARTIFICIAL BEE COLONY FOR PARAMETER OPTIMIZATION OF SOFTWARE EFFORT ESTIMATION MODEL

    Directory of Open Access Journals (Sweden)

    THANH TUNG KHUAT

    2017-05-01

    Full Text Available Artificial Bee Colony inspired by the foraging behaviour of honey bees is a novel meta-heuristic optimization algorithm in the community of swarm intelligence algorithms. Nevertheless, it is still insufficient in the speed of convergence and the quality of solutions. This paper proposes an approach in order to tackle these downsides by combining the positive aspects of TeachingLearning based optimization and Artificial Bee Colony. The performance of the proposed method is assessed on the software effort estimation problem, which is the complex and important issue in the project management. Software developers often carry out the software estimation in the early stages of the software development life cycle to derive the required cost and schedule for a project. There are a large number of methods for effort estimation in which COCOMO II is one of the most widely used models. However, this model has some restricts because its parameters have not been optimized yet. In this work, therefore, we will present the approach to overcome this limitation of COCOMO II model. The experiments have been conducted on NASA software project dataset and the obtained results indicated that the improvement of parameters provided better estimation capabilities compared to the original COCOMO II model.
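
    For reference, the post-architecture COCOMO II effort equation whose parameters such approaches tune is sketched below with the published default coefficients A = 2.94 and B = 0.91; the optimized values obtained in the paper are not reproduced here.

        def cocomo2_effort(size_ksloc, scale_factors, effort_multipliers, A=2.94, B=0.91):
            """Effort in person-months: PM = A * Size^E * prod(EM_i), E = B + 0.01 * sum(SF_j)."""
            E = B + 0.01 * sum(scale_factors)
            effort = A * size_ksloc ** E
            for em in effort_multipliers:
                effort *= em
            return effort

        # Example: a 50 KSLOC project with illustrative driver ratings.
        pm = cocomo2_effort(50,
                            scale_factors=[3.72, 3.04, 4.24, 3.29, 4.68],
                            effort_multipliers=[1.0, 1.1, 0.9])
        print(round(pm, 1), "person-months")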

  11. Prediction Model for Object Oriented Software Development Effort Estimation Using One Hidden Layer Feed Forward Neural Network with Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Chandra Shekhar Yadav

    2014-01-01

    Full Text Available The budget computation for software development is affected by the prediction of software development effort and schedule. Software development effort and schedule can be predicted precisely on the basis of past software project data sets. In this paper, a model for object-oriented software development effort estimation using one hidden layer feed forward neural network (OHFNN has been developed. The model has been further optimized with the help of genetic algorithm by taking weight vector obtained from OHFNN as initial population for the genetic algorithm. Convergence has been obtained by minimizing the sum of squared errors of each input vector and optimal weight vector has been determined to predict the software development effort. The model has been empirically validated on the PROMISE software engineering repository dataset. Performance of the model is more accurate than the well-established constructive cost model (COCOMO.
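
    A minimal sketch of the idea (not the paper's exact network configuration): the weights of a one-hidden-layer feed-forward network are packed into a flat vector so that a genetic algorithm can evolve it, with the sum of squared prediction errors serving as the fitness to minimize.

        import numpy as np

        def unpack(chrom, n_in, n_hidden):
            """Split a flat chromosome into layer weights and biases."""
            w1 = chrom[: n_in * n_hidden].reshape(n_in, n_hidden)
            b1 = chrom[n_in * n_hidden : n_in * n_hidden + n_hidden]
            w2 = chrom[n_in * n_hidden + n_hidden : -1]
            b2 = chrom[-1]
            return w1, b1, w2, b2

        def predict_effort(chrom, X, n_hidden=4):
            w1, b1, w2, b2 = unpack(chrom, X.shape[1], n_hidden)
            hidden = np.tanh(X @ w1 + b1)
            return hidden @ w2 + b2

        def fitness(chrom, X, y, n_hidden=4):
            """Sum of squared errors; a GA would evolve chrom to minimize this."""
            return ((predict_effort(chrom, X, n_hidden) - y) ** 2).sum()

        # Toy project data: two input features (e.g. size metrics) and known efforts.
        X = np.array([[1.0, 0.5], [2.0, 1.0], [3.0, 0.7]])
        y = np.array([10.0, 18.0, 25.0])
        chrom = np.random.default_rng(0).normal(size=2 * 4 + 4 + 4 + 1)
        print(fitness(chrom, X, y))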

  12. Developing interpretable models with optimized set reduction for identifying high risk software components

    Science.gov (United States)

    Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.

    1993-01-01

    Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low/high fault frequency components so that testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents the Optimized Set Reduction approach for constructing such models, intended to fulfill specific software engineering needs. Our approach to classification is to measure the software system and build multivariate stochastic models for predicting high risk system components. We present experimental results obtained by classifying Ada components into two classes: is or is not likely to generate faults during system and acceptance test. Also, we evaluate the accuracy of the model and the insights it provides into the error making process.

  13. POWERLIB: SAS/IML Software for Computing Power in Multivariate Linear Models

    Directory of Open Access Journals (Sweden)

    Jacqueline L. Johnson

    2009-04-01

    Full Text Available The POWERLIB SAS/IML software provides convenient power calculations for a wide range of multivariate linear models with Gaussian errors. The software includes the Box, Geisser-Greenhouse, Huynh-Feldt, and uncorrected tests in the "univariate" approach to repeated measures (UNIREP), the Hotelling-Lawley Trace, Pillai-Bartlett Trace, and Wilks Lambda tests in the "multivariate" approach (MULTIREP), as well as a limited but useful range of mixed models. The familiar univariate linear model with Gaussian errors is an important special case. For estimated covariance, the software provides confidence limits for the resulting estimated power. All power and confidence limit values can be output to a SAS dataset, which can be used to easily produce plots and tables for manuscripts.
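
    As a generic illustration of the noncentral-F power computation that underlies such tools (this is not POWERLIB itself, which covers many more tests and the UNIREP corrections mentioned above):

        from scipy.stats import f, ncf

        def f_test_power(ndf, ddf, noncentrality, alpha=0.05):
            """Power of an F test: P(F > f_crit) under the noncentral alternative."""
            f_crit = f.ppf(1 - alpha, ndf, ddf)                   # critical value under H0
            return 1 - ncf.cdf(f_crit, ndf, ddf, noncentrality)   # reject probability under H1

        # Example: 3 numerator df, 36 denominator df, noncentrality 12.
        print(round(f_test_power(3, 36, 12.0), 3))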

  14. On Fundamental Evaluation Using Uav Imagery and 3d Modeling Software

    Science.gov (United States)

    Nakano, K.; Suzuki, H.; Tamino, T.; Chikatsu, H.

    2016-06-01

    Unmanned aerial vehicles (UAVs), which have been widely used in recent years, can acquire high-resolution images with resolutions in millimeters; such images cannot be acquired with manned aircrafts. Moreover, it has become possible to obtain a surface reconstruction of a realistic 3D model using high-overlap images and 3D modeling software such as Context capture, Pix4Dmapper, Photoscan based on computer vision technology such as structure from motion and multi-view stereo. 3D modeling software has many applications. However, most of them seem to not have obtained appropriate accuracy control in accordance with the knowledge of photogrammetry and/or computer vision. Therefore, we performed flight tests in a test field using an UAV equipped with a gimbal stabilizer and consumer grade digital camera. Our UAV is a hexacopter and can fly according to the waypoints for autonomous flight and can record flight logs. We acquired images from different altitudes such as 10 m, 20 m, and 30 m. We obtained 3D reconstruction results of orthoimages, point clouds, and textured TIN models for accuracy evaluation in some cases with different image scale conditions using 3D modeling software. Moreover, the accuracy aspect was evaluated for different units of input image—course unit and flight unit. This paper describes the fundamental accuracy evaluation for 3D modeling using UAV imagery and 3D modeling software from the viewpoint of close-range photogrammetry.

  15. ON FUNDAMENTAL EVALUATION USING UAV IMAGERY AND 3D MODELING SOFTWARE

    Directory of Open Access Journals (Sweden)

    K. Nakano

    2016-06-01

    Full Text Available Unmanned aerial vehicles (UAVs, which have been widely used in recent years, can acquire high-resolution images with resolutions in millimeters; such images cannot be acquired with manned aircrafts. Moreover, it has become possible to obtain a surface reconstruction of a realistic 3D model using high-overlap images and 3D modeling software such as Context capture, Pix4Dmapper, Photoscan based on computer vision technology such as structure from motion and multi-view stereo. 3D modeling software has many applications. However, most of them seem to not have obtained appropriate accuracy control in accordance with the knowledge of photogrammetry and/or computer vision. Therefore, we performed flight tests in a test field using an UAV equipped with a gimbal stabilizer and consumer grade digital camera. Our UAV is a hexacopter and can fly according to the waypoints for autonomous flight and can record flight logs. We acquired images from different altitudes such as 10 m, 20 m, and 30 m. We obtained 3D reconstruction results of orthoimages, point clouds, and textured TIN models for accuracy evaluation in some cases with different image scale conditions using 3D modeling software. Moreover, the accuracy aspect was evaluated for different units of input image—course unit and flight unit. This paper describes the fundamental accuracy evaluation for 3D modeling using UAV imagery and 3D modeling software from the viewpoint of close-range photogrammetry.

  16. Robust recurrent neural network modeling for software fault detection and correction prediction

    International Nuclear Information System (INIS)

    Hu, Q.P.; Xie, M.; Ng, S.H.; Levitin, G.

    2007-01-01

    Software fault detection and correction processes are related although different, and they should be studied together. A practical approach is to apply software reliability growth models to model fault detection, and fault correction process is assumed to be a delayed process. On the other hand, the artificial neural networks model, as a data-driven approach, tries to model these two processes together with no assumptions. Specifically, feedforward backpropagation networks have shown their advantages over analytical models in fault number predictions. In this paper, the following approach is explored. First, recurrent neural networks are applied to model these two processes together. Within this framework, a systematic networks configuration approach is developed with genetic algorithm according to the prediction performance. In order to provide robust predictions, an extra factor characterizing the dispersion of prediction repetitions is incorporated into the performance function. Comparisons with feedforward neural networks and analytical models are developed with respect to a real data set

  17. Flexible software process lines in practice: A metamodel-based approach to effectively construct and manage families of software process models

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Ternité, Thomas; Friedrich, Jan

    2017-01-01

    Process flexibility and adaptability is frequently discussed, and several proposals aim to improve software processes for a given organization-/project context. A software process line (SPrL) is an instrument to systematically construct and manage variable software processes, by combining pre-defined and standardized process assets that can be reused, modified, and extended using a well-defined customization approach. Hence, process engineers can ground context-specific process variants in a standardized or domain-specific reference model that can be adapted to the respective context. We present an approach...

  18. Comparison of Software Models for Energy Savings from Cool Roofs

    Energy Technology Data Exchange (ETDEWEB)

    New, Joshua Ryan [ORNL; Miller, William A [ORNL; Huang, Yu (Joe) [White Box Technologies; Levinson, Ronnen [Lawrence Berkeley National Laboratory (LBNL)

    2014-01-01

    A web-based Roof Savings Calculator (RSC) has been deployed for the United States Department of Energy as an industry-consensus tool to help building owners, manufacturers, distributors, contractors and researchers easily run complex roof and attic simulations. This tool employs modern web technologies, usability design, and national-average defaults as an interface to annual simulations of hour-by-hour, whole-building performance using the world-class simulation tools DOE-2.1E and AtticSim, in order to provide estimated annual energy and cost savings. In addition to cool reflective roofs, RSC simulates multiple roof and attic configurations including different roof slopes, above-sheathing ventilation, radiant barriers, low-emittance roof surfaces, duct location, duct leakage rates, multiple substrate types, and insulation levels. A base case and an energy-efficient alternative can be compared side-by-side to estimate monthly energy use. RSC was benchmarked against field data from demonstration homes in Ft. Irwin, California; while cooling savings were similar, the heating penalty varied significantly across different simulation engines. RSC results reduce cool-roofing cost-effectiveness, thus mitigating expected economic incentives for this countermeasure to the urban heat island effect. This paper consolidates comparisons of RSC's projected energy savings with other simulation engines, including DOE-2.1E, AtticSim, Micropas, and EnergyPlus, and presents preliminary analyses. RSC's algorithms for capturing radiant heat transfer and duct interaction in the attic assembly are considered major contributing factors to increased cooling savings and heating penalties. Comparison with previous simulation-based studies, analysis of the force multiplier of RSC cooling savings and heating penalties, the role of radiative heat exchange in an attic assembly, and changes made for increased accuracy of the duct model are included.

  19. Software module for geometric product modeling and NC tool path generation

    International Nuclear Information System (INIS)

    Sidorenko, Sofija; Dukovski, Vladimir

    2003-01-01

    The intelligent CAD/CAM system named VIRTUAL MANUFACTURE has been created. It consists of four intelligent software modules: the module for virtual NC machine creation, the module for geometric product modeling and automatic NC path generation, the module for virtual NC machining, and the module for virtual product evaluation. In this paper the second intelligent software module is presented. This module enables feature-based product modeling carried out via automatic saving of the designed product's geometric features as knowledge data. The knowledge data are afterwards applied for automatic generation of the NC program for machining the designed product. (Author)

  20. Towards a Complete Model for Software Component Deployment on Heterogeneous Platform

    Directory of Open Access Journals (Sweden)

    Švogor Ivan

    2014-12-01

    Full Text Available This report briefly describes ongoing research related to optimizing the allocation of software components to a heterogeneous computing platform (which includes CPU, GPU and FPGA). The research goal is also presented, along with current hot topics of the research area, related research teams, and finally the results and contribution of my research. It involves mathematical modelling which results in a goal function, an optimization method which finds a suboptimal solution to the goal function, and a software modeling tool which enables graphical representation of the problem at hand and helps developers determine component placement in the system design phase.

  1. 40 CFR Table 3 of Subpart Aaaa to... - Requirements for Validating Continuous Emission Monitoring Systems (CEMS)

    Science.gov (United States)

    2010-07-01

    Excerpt from 40 CFR Part 60, Subpart AAAA, Table 3 (Protection of Environment, revised 2010-07-01): requirements for validating continuous emission monitoring systems (CEMS), including the reference methods in appendix A of the part used to measure oxygen (or carbon dioxide) and nitrogen oxides.

  2. Critical Thinking Skills of Students through Mathematics Learning with ASSURE Model Assisted by Software Autograph

    Science.gov (United States)

    Kristianti, Y.; Prabawanto, S.; Suhendra, S.

    2017-09-01

    This study aims to examine the critical thinking ability of students who learn mathematics with the ASSURE learning model assisted by Autograph software. The design of this study was an experimental pre-test/post-test control group design. The experimental group received mathematics instruction with the ASSURE model assisted by Autograph software, and the control group received mathematics instruction with a conventional model. The data were obtained from critical thinking skills tests. The research was conducted at the junior high school level; the population consisted of students of a junior high school in Subang Regency in the 2016/2017 school year, and the sample consisted of two grade VIII classes from one junior high school in Subang Regency. The research data were analysed quantitatively. Quantitative analysis was performed on the normalized gain scores of the two sample groups using a one-way ANOVA test. The results show that mathematics learning with the ASSURE model assisted by Autograph software can improve the critical thinking ability of junior high school students, and that it is significantly better at improving critical thinking skills than conventional models.
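
    As an illustrative aside (not from the record above), the analysis the abstract describes, a one-way ANOVA on normalized gain scores from pre- and post-tests in two classes, can be sketched as follows; all scores are hypothetical.

    import numpy as np
    from scipy import stats

    def normalized_gain(pre, post, max_score=100.0):
        """Hake's normalized gain g = (post - pre) / (max - pre)."""
        pre, post = np.asarray(pre, float), np.asarray(post, float)
        return (post - pre) / (max_score - pre)

    # Hypothetical pre/post test scores for the two groups.
    pre_exp, post_exp = [40, 55, 35, 60, 45], [75, 85, 70, 88, 80]
    pre_ctrl, post_ctrl = [42, 50, 38, 58, 47], [60, 68, 55, 72, 64]

    g_exp = normalized_gain(pre_exp, post_exp)
    g_ctrl = normalized_gain(pre_ctrl, post_ctrl)

    f_stat, p_value = stats.f_oneway(g_exp, g_ctrl)
    print(f"mean gain (ASSURE + Autograph): {g_exp.mean():.2f}")
    print(f"mean gain (conventional):       {g_ctrl.mean():.2f}")
    print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")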

  3. Contribuição dos modelos de qualidade e maturidade na melhoria dos processos de software Contribution of quality and maturity models to software process improvement

    Directory of Open Access Journals (Sweden)

    Antonio Carlos Tonini

    2008-01-01

    Full Text Available Many software development companies have developed their own work methods. Due to the fast growth of the software market, competition focuses more on cost than on differentiation. To achieve competitive advantage, software development organizations must continually update their technology, reach high levels of process maturity and eliminate operational inefficiency. These efforts involve people, processes and the whole organization. The aim of the paper is to discuss software process improvement implementation according to the most important quality and maturity models. Based on a multiple case study, it is verified that software process improvement first requires improvement by each individual developer, then involves the development teams, and finally the organization as a whole. The research concludes that quality and maturity models serve as drivers of the improvement process.

  4. Dependencies among Architectural Views Got from Software Requirements Based on a Formal Model

    Directory of Open Access Journals (Sweden)

    Osis Janis

    2014-12-01

    Full Text Available A system architect has software requirements and some unspecified knowledge about a problem domain (e.g., an enterprise) as source information for assessing and evaluating possible solutions and reaching the target point, a preliminary software design. The deciding factor is the architect's experience and expertise in the problem domain (“AS-IS”). The proposed approach is dedicated to assisting a system architect in making an appropriate decision on the solution (“TO-BE”). It is based on a formal mathematical model, the Topological Functioning Model (TFM). Compliant TFMs can be transformed into software architectural views. The paper demonstrates and discusses tracing dependency links from the requirements to and between the architectural views.

  5. New Modelling Capabilities in Commercial Software for High-Gain Antennas

    DEFF Research Database (Denmark)

    Jørgensen, Erik; Lumholt, Michael; Meincke, Peter

    2012-01-01

    This paper presents an overview of selected new modelling algorithms and capabilities in commercial software tools developed by TICRA. A major new area is design and analysis of printed reflectarrays, where a fully integrated design environment is under development, allowing fast and accurate characterization of the reflectarray element, an initial phase-only synthesis, followed by a full optimization procedure taking into account the near-field from the feed and the finite extent of the array. Another interesting new modelling capability is made available through the DIATOOL software, which is a new type of EM software tool aimed at extending the ways engineers can use antenna measurements in the antenna design process. The tool allows reconstruction of currents and near fields on a 3D surface conformal to the antenna, by using the measured antenna field as input. The currents on the antenna...

  6. A MODEL FOR INTEGRATED SOFTWARE TO IMPROVE COMMUNICATION POLICY IN DENTAL TECHNICAL LABS

    Directory of Open Access Journals (Sweden)

    Minko M. Milev

    2017-06-01

    Full Text Available Introduction: Integrated marketing communications (IMC) encompass all kinds of communication between organisations and customers, partners, other organisations and society. Aim: To develop and present an integrated software model which can improve the effectiveness of communications in dental technical services. Material and Methods: The model of integrated software is based on recommendations from a total of 700 respondents (students of dental technology, dental physicians, dental technicians and patients of dental technical laboratories) in Northeastern Bulgaria. Results and Discussion: We present the benefits of future integrated software for improving the communication policy in the dental technical laboratory, meeting the need for fast cooperation and a well-built communication network between dental physicians, dental technicians, patients and students. Conclusion: The use of integrated communications could be a powerful unified approach to improving the communication policy between all players in the market of dental technical services.

  7. Adoption of open source software in business models: a Red Hat and IBM case study

    CSIR Research Space (South Africa)

    Munga, N

    2009-10-01

    Full Text Available beyond those of customer demand. Linux provided it with a common set of APIs across its entire product line, providing a unified architecture for software developers. IBM refocused on targeting its traditional large corporate customers, and the need... and its position in the market identified [15]. Various types of business models are discussed in the literature. Rappa [16], using the customer relationship as the primary dimension, defines the brokerage model, information intermediary model...

  8. CEM: Increasing productivity through the management and monitoring of experiences provided to customers

    Directory of Open Access Journals (Sweden)

    Adriana Arineli

    2015-12-01

    Full Text Available Dealing with intangible and subtle experiences is unusual and a huge challenge for managers who are not used to measuring what has no numbers, but perhaps they need to see beyond the obvious and readily accessible statistics. Recently, several studies have pointed to the importance of customer experience management (CEM). However, if CEM is a strategy that focuses the operations and processes of a business around the customers' experiences with the company, it is essential to seek support for structuring it and to determine its effectiveness. This study examines the issues involved in offering superior customer experiences in fashion retail stores in Brazil, identifying the relation between productivity and CEM. Through research with managers of three important Brazilian clothing retail chains, it was possible to analyse the aspects that impact the customer experience and their relevance. A questionnaire was applied to evaluate 23 variables that make up the customer experience and their impact on increasing productivity. Statistical techniques were used for data processing, and it was found that only 4 of the 23 items were not relevant to the customer experience. It can be concluded that CEM is effective in increasing productivity and can be used as a guiding matrix for management decision-making to promote superior customer experiences. Specific characteristics of each segment suggest different impacts for every aspect; therefore, it is crucial that each segment review the variables that will structure its CEM. Even assuming that it is challenging to see beyond the obvious, this may be the opportunity needed to create real competitive advantage and longevity for companies that want to stand out and be successful over time.

  9. The Model of Formation of Professional Competence of Future Software Engineers

    Directory of Open Access Journals (Sweden)

    Viktor Sedov

    2016-05-01

    Full Text Available The rapid technological development of modern society is fundamentally changing processes of production, communication and services. There is great demand for specialists who are competent in recently emerged industries. Moreover, the gap between a scientific invention and its wide distribution and consumption has been significantly reduced. Therefore, we face an urgent need to prepare specialists in higher education who meet the requirements of modern society and the labour market. Particularly relevant is the issue of training future software engineers at the master's degree level, which is the level of education that trains not only professionals but also scientists and university teachers. The article presents a developed model of formation of professional competence of future software engineers in the system of master's degree. The model comprises the units of training of future software engineers, identifies methodological approaches and a number of general didactic and methodological principles that underpin learning processes in higher education. It describes the methods, forms of organization and means that are used in the master's degree system, and also provides pedagogical conditions for effective implementation of the model. The developed model addresses the issues of individualization, intensification and optimization of studying. While developing the model, special attention was paid to updating the content of education and searching for new organizational forms of training of future software engineers.

  10. Implementation of Software Configuration Management Process by Models: Practical Experiments and Learned Lessons

    Directory of Open Access Journals (Sweden)

    Bartusevics Arturs

    2014-12-01

    Full Text Available Nowadays, the software configuration management process is not only a question of which system should be used for version control or how to merge changes from one source code branch to another. There are multiple tasks such as version control, build management, deployment management, status accounting, bug tracking and many others that should be addressed to support the full configuration management process according to the most popular quality standards. The main scope of the mentioned process is to include only valid and tested software items in the final version of the product and to prepare a new version as soon as possible. To implement the different tasks of the software configuration management process, a set of different tools, scripts and utilities must be used. The current paper provides a new model-based approach to the implementation of configuration management. Using different models, the new approach helps to organize existing solutions and develop new ones in a parameterized way, thus increasing the reuse of solutions. The study provides a general description of the new model-based conception and definitions of all models needed to implement the new approach. The second part of the paper contains an overview of criteria, practical experiments and lessons learned from using the new models in software configuration management. Finally, further work is defined based on the results of the practical experiments and lessons learned.

  11. THE EFFECTIVENESS OF GEOMETER’S SKETCHPAD SOFTWARE IN PASID MODEL LEARNING ON STUDENTS’ PROBLEM-SOLVING ABILITY

    Directory of Open Access Journals (Sweden)

    Khalid Zulfikar Dewantoro

    2014-08-01

    Full Text Available The aim of this study was to determine the effectiveness of Geometer’s Sketchpad (GSP) software in the PASID learning model (Analytic-Synthetic Learning with Divergent Intervention) on students' problem-solving ability. The population in this study was the grade VII students of SMP PGRI 01 Semarang in the 2012/2013 academic year. The study used an experimental design. From eight classes, two were selected: class VII F as experimental class I, which used GSP software within the PASID learning model, and class VII E as experimental class II, which used the PASID learning model alone. The results show that student learning outcomes on the problem-solving aspect using GSP software within the PASID model reached classical mastery, that the percentage of student learning outcomes on the problem-solving aspect in experimental class I was equal to that of experimental class II, and that the average learning outcomes on the problem-solving aspect in experimental class I were better than in experimental class II. Based on these results, it can be concluded that learning with GSP software within the PASID model is effective for students' problem-solving ability.

  12. Entrepreneurial model based technology creative industries sector software through the use of free open source software for Universitas Pendidikan Indonesia students

    Science.gov (United States)

    Hasan, B.; Hasbullah; Purnama, W.; Hery, A.

    2016-04-01

    Developing the software area of the creative industries by using Free Open Source Software (FOSS) is expected to be one of the solutions for fostering new entrepreneurs among students who can create job opportunities and contribute to economic development in Indonesia. This study aims to create an entrepreneurial coaching model based on the creative industries in the software field utilizing FOSS, as well as to provide understanding and foster creative-industry entrepreneurship in the software field for students of Universitas Pendidikan Indonesia. The activity phases begin with identifying the entrepreneurs or software technology businesses to be developed, followed by training and mentoring, an apprenticeship process at industrial partners, creation of business plans, and monitoring and evaluation. The activity involves 30 UPI students who are motivated toward self-employment and are competent in the field of information technology. The expected results and outcomes of these activities are the emergence of a number of new student entrepreneurs in the software industry, covering software for commerce (e-commerce) as well as education/learning (e-learning/LMS) and games.

  13. From Principles to Details: Integrated Framework for Architecture Modelling of Large Scale Software Systems

    Directory of Open Access Journals (Sweden)

    Andrzej Zalewski

    2013-06-01

    Full Text Available There exist numerous models of software architecture (box models, ADLs, UML, architectural decisions), architecture modelling frameworks (views, enterprise architecture frameworks) and even standards recommending practice for architectural description. We show in this paper that there is still a gap between these rather abstract frameworks/standards and existing architecture models. Frameworks and standards define what should be modelled rather than which models should be used and how these models are related to each other. We intend to prove that a less abstract modelling framework is needed for the effective modelling of large-scale software-intensive systems. It should provide more precise guidance on the kinds of models to be employed and how they should relate to each other. The paper defines principles that can serve as a base for an integrated model. Finally, the structure of such a model is proposed. It comprises three layers: the upper one, the architectural policy, reflects corporate policy and strategies in architectural terms; the middle one, the system organisation pattern, represents the core structural concepts and their rationale at a given level of scope; the lower one contains detailed architecture models. Architectural decisions play an important role here: they model the core architectural concepts explaining the detailed models, and they organise the entire integrated model and the relations between its submodels.

  14. Beyond the Business Model: Incentives for Organizations to Publish Software Source Code

    Science.gov (United States)

    Lindman, Juho; Juutilainen, Juha-Pekka; Rossi, Matti

    The software stack opened under Open Source Software (OSS) licenses is growing rapidly. Commercial actors have released considerable amounts of previously proprietary source code. These actions raise the question of why companies choose a strategy based on giving away software assets. Research on the outbound OSS approach has tried to answer this question with the concept of the “OSS business model”. When studying the reasons for code release, we have observed that the business model concept is too generic to capture the many incentives organizations have. Instead, in this paper we investigate empirically what the companies' incentives are by means of an exploratory case study of three organizations at different stages of their code release. Our results indicate that the companies aim to promote standardization, obtain development resources, gain cost savings, improve the quality of software, increase the trustworthiness of software, or steer OSS communities. We conclude that future research on outbound OSS could benefit from focusing on the heterogeneous incentives for code release rather than on revenue models.

  15. Built To Last: Using Iterative Development Models for Sustainable Scientific Software Development

    Science.gov (United States)

    Jasiak, M. E.; Truslove, I.; Savoie, M.

    2013-12-01

    In scientific research, software development exists fundamentally for the results it creates. The core research must take focus. It seems natural to researchers, driven by grant deadlines, that every dollar invested in software development should be used to push the boundaries of problem solving. This system of values is frequently misaligned with building the software in a sustainable fashion; short-term optimizations create longer-term sustainability issues. The National Snow and Ice Data Center (NSIDC) has taken bold cultural steps in using agile and lean development and management methodologies to help its researchers meet critical deadlines, while building in the necessary support structure for the code to live far beyond its original milestones. Agile and lean software development methodologies, including Scrum, Kanban, Continuous Delivery and Test-Driven Development, have seen widespread adoption within NSIDC. This focus on development methods is combined with an emphasis on explaining to researchers why these methods produce more desirable results for everyone, as well as on promoting interaction between developers and researchers. This presentation describes NSIDC's current scientific software development model, how it addresses the short-term versus sustainability dichotomy, the lessons learned and successes realized by transitioning to this agile- and lean-influenced model, and the current challenges faced by the organization.

  16. Complete modeling and software implementation of a virtual solar hydrogen hybrid system

    International Nuclear Information System (INIS)

    Pedrazzi, S.; Zini, G.; Tartarini, P.

    2010-01-01

    A complete mathematical model and software implementation of a solar hydrogen hybrid system has been developed and applied to real data. The mathematical model has been derived from sub-models taken from literature with appropriate modifications and improvements. The model has been implemented as a stand-alone virtual energy system in a model-based, multi-domain software environment. A test run has then been performed on typical residential user data-sets over a year-long period. Results show that the virtual hybrid system can bring about complete grid independence; in particular, hydrogen production balance is positive (+1.25 kg) after a year's operation with a system efficiency of 7%.

  17. Towards a framework for deriving platform-independent model-driven software product lines

    Directory of Open Access Journals (Sweden)

    Andrés Paz

    2013-05-01

    Full Text Available Model-driven software product lines (MD-SPLs) are created from domain models which are transformed, merged and composed with reusable core assets until software products are produced. Model transformation chains (MTCs) must be specified to generate such MD-SPLs. This paper presents a framework for creating platform-independent MD-SPLs; the framework includes a domain-specific language (DSL) for platform-independent MTC specification and facilitates platform-specific MTC generation for several of the most widely used model transformation frameworks. The DSL also allows product line architects to compose the generation, taking into account the need for interoperability of model transformation strategies and technologies and specifying several types of variability involved in such generation.

  18. On model-driven design of robot software using co-simulation

    NARCIS (Netherlands)

    Broenink, Johannes F.; Ni, Yunyun; Groothuis, M.A.; Menegatti, E.

    2010-01-01

    In this paper we show that using co-simulation for robot software design is more efficient than designing without it. We use a plotter example to show how co-simulation helps with the design process. We believe that a collaborative methodology based on model-driven design will

  19. Analysis of 3D Modeling Software Usage Patterns for K-12 Students

    Science.gov (United States)

    Wu, Yi-Chieh; Liao, Wen-Hung; Chi, Ming-Te; Li, Tsai-Yen

    2016-01-01

    In response to the recent trend of the maker movement, teachers are actively learning 3D techniques and bringing 3D printing into the classroom to enhance variety and creativity in lecture design. This study investigates the usage patterns of a 3D modeling software package, Qmodel Creator, which is targeted at K-12 students. User logs containing…

  20. Prowess–A Software Model for the Ooty Wide Field Array

    Indian Academy of Sciences (India)

    Although predictions spell out optimistic outcomes in reasonable integration times, these studies were based purely on analytical assumptions, without accounting for limiting systematics. A software model for OWFA has been developed with a view to understanding the instrument-induced systematics, by describing a ...

  1. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  2. Semantic Model of Variability and Capabilities of IoT Applications for Embedded Software Ecosystems

    DEFF Research Database (Denmark)

    Tomlein, Matus; Grønbæk, Kaj

    2016-01-01

    Applications in embedded open software ecosystems for Internet of Things devices pose new challenges regarding how their variability and capabilities should be modeled. In collaboration with an industrial partner, we have recognized that such applications have complex constraints on the context. We...

  3. Prowess – A Software Model for the Ooty Wide Field Array

    Indian Academy of Sciences (India)

    Although predictions spell out optimistic outcomes in reasonable integration times, these studies were based purely on analytical assumptions, without accounting for limiting systematics. A software model for OWFA has been developed with a view to understanding the instrument-induced systematics, by describing a ...

  4. A reference model and technical framework for mobile social software for learning

    NARCIS (Netherlands)

    De Jong, Tim; Specht, Marcus; Koper, Rob

    2008-01-01

    De Jong,T., Specht, M., & Koper, R. (2008). A reference model and technical framework for mobile social software for learning. In I. A. Sánchez & P. Isaías (Eds.), Proceedings of the IADIS Mobile Learning Conference 2008 (pp. 206-210). April, 11-13, 2008, Carvoeiro, Portugal.

  5. Charging Customers or Making Profit? Business Model Change in the Software Industry

    Directory of Open Access Journals (Sweden)

    Margit Malmmose Peyton

    2014-08-01

    Full Text Available Purpose: Advancements in technology, changing customer demands or new market entrants are often seen as a necessary condition to trigger the creation of new Business Models, or disruptive change in existing ones. Yet, the sufficient condition is often determined by pricing and how customers are willing to pay for the technology (Chesbrough and Rosenbloom, 2002). As a consequence, much research on Business Models has focused on innovation and technology management (Rajala et al., 2012; Zott et al., 2011), and software-specific frameworks for Business Models have emerged (Popp, 2011; Rajala et al., 2003; Rajala et al., 2004; Stahl, 2004). This paper attempts to illustrate Business Model change in the software industry. Design: Drawing on Rajala et al. (2003), this case study explores the (1) antecedents and (2) consequences of a Business Model change in a logistics software company. The company decided to abolish its profitable fee-based licensing for an internet-based version of its core product and to offer it as freeware including unlimited service. Findings: Firstly, we illustrate how external developments in technology and customer demands (pricing), as well as the desire for a sustainable Business Model, have led to this drastic change. Secondly, we initially find that much of the company's new Business Model is congruent with the company-focused framework of Rajala et al. (2003) [product strategy; distribution model; services and implementation; revenue logic]. Value: The existing frameworks for Business Models in the software industry cannot fully explain the disruptive change in the Business Model. Therefore, we suggest extending the framework by the element of 'innovation'.

  6. AN ENHANCED MODEL TO ESTIMATE EFFORT, PERFORMANCE AND COST OF THE SOFTWARE PROJECTS

    Directory of Open Access Journals (Sweden)

    M. Pauline

    2013-04-01

    Full Text Available The authors have proposed a model that first captures the fundamentals of software metrics in Phase 1, consisting of three primitive primary software engineering metrics: person-months (PM), function points (FP), and lines of code (LOC). Phase 2 consists of the proposed function point, which is obtained by grouping the adjustment factors to simplify the process of adjustment and to ensure more consistency in the adjustments. In the proposed method, fuzzy logic is used to quantify the quality of requirements, which is added as one of the adjustment factors; thus a fuzzy-based approach for the Enhanced General System Characteristics to estimate the effort of software projects using productivity has been obtained. Phase 3 takes the calculated function point and gives it as input to the static single-variable models (Intermediate COCOMO and COCOMO II) for cost estimation. The authors have tailored the cost factors in Intermediate COCOMO, and both cost and scale factors in COCOMO II, to suit the individual development environment, which is very important for the accuracy of the cost estimates. The software performance indicators (project duration, schedule predictability, requirements completion ratio and post-release defect density) are also measured for the software projects in this work. A comparative study of effort, performance measurement and cost estimation of software projects is made between the existing model and the authors' proposed work. Thus, our work analyzes the interactional process through which the estimation tasks were collectively accomplished.
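
    As an illustrative aside (not the authors' calibration), the cost estimation step the abstract describes, feeding an adjusted function-point count into Intermediate COCOMO, can be sketched with the standard effort equation Effort = a * KLOC^b * EAF. The function-point count, the LOC-per-FP expansion factor and the effort adjustment factor below are hypothetical.

    # Intermediate COCOMO nominal coefficients per development mode.
    MODES = {
        "organic":       (3.2, 1.05),
        "semi-detached": (3.0, 1.12),
        "embedded":      (2.8, 1.20),
    }

    def cocomo_effort(kloc, mode="organic", eaf=1.0):
        """Effort in person-months: a * KLOC**b * EAF."""
        a, b = MODES[mode]
        return a * kloc ** b * eaf

    # Hypothetical project: 320 adjusted function points, ~53 LOC per FP
    # (a commonly quoted figure for Java), combined EAF of 1.10.
    kloc = 320 * 53 / 1000.0
    effort_pm = cocomo_effort(kloc, mode="semi-detached", eaf=1.10)
    print(f"{kloc:.1f} KLOC -> estimated effort ~ {effort_pm:.1f} person-months")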

  7. Evolving a Simulation Model Product Line Software Architecture from Heterogeneous Model Representations

    Science.gov (United States)

    2003-09-01

    [HNP97] also cited similar software safety issues as the cause of the Therac-25 computer-controlled radiation therapy machine accidents, which... United States General Accounting Office, Washington, DC, May 2000. [Lev95] Leveson, N., Medical Devices: The Therac-25, Software Engineering Re

  8. A systematic literature review of open source software quality assessment models.

    Science.gov (United States)

    Adewumi, Adewole; Misra, Sanjay; Omoregbe, Nicholas; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    Many open source software (OSS) quality assessment models have been proposed and are available in the literature. However, there is little or no adoption of these models in practice. In order to guide the formulation of newer models that are acceptable to practitioners, there is a need to clearly discriminate between the existing models based on their specific properties. Accordingly, the aim of this study is to perform a systematic literature review investigating the properties of existing OSS quality assessment models, classifying them with respect to their quality characteristics, the methodology they use for assessment, and their domain of application, so as to guide the formulation and development of newer models. Searches in IEEE Xplore, ACM, Science Direct, Springer and Google Search were performed to retrieve all relevant primary studies. Journal and conference papers between 2003 and 2015 were considered, since the first known OSS quality model emerged in 2003. A total of 19 OSS quality assessment model papers were selected. To select these models we developed assessment criteria to evaluate the quality of the existing studies. Quality assessment models are classified into five categories based on the quality characteristics they possess, namely: single-attribute, rounded-category, community-only-attribute, non-community-attribute and non-quality-in-use models. Our study shows that software selection based on hierarchical structures is the most popular selection method in the existing OSS quality assessment models. Furthermore, we found that the majority (47%) of the existing models do not specify any domain of application. In conclusion, our study is a valuable contribution to the community; it helps quality assessment model developers formulate newer models and helps practitioners (software evaluators) select suitable OSS from among alternatives.

  9. How Modeling Standards, Software, and Initiatives Support Reproducibility in Systems Biology and Systems Medicine.

    Science.gov (United States)

    Waltemath, Dagmar; Wolkenhauer, Olaf

    2016-10-01

    Only reproducible results are of significance to science. The lack of suitable standards and appropriate support of standards in software tools has led to numerous publications with irreproducible results. Our objectives are to identify the key challenges of reproducible research and to highlight existing solutions. In this paper, we summarize problems concerning reproducibility in systems biology and systems medicine. We focus on initiatives, standards, and software tools that aim to improve the reproducibility of simulation studies. The long-term success of systems biology and systems medicine depends on trustworthy models and simulations. This requires openness to ensure reusability and transparency to enable reproducibility of results in these fields.

  10. Proposed Robot Scheme with 5 DoF and Dynamic Modelling Using Maple Software

    OpenAIRE

    Shala Ahmet; Bruçi Mirlind

    2017-01-01

    This paper presents the dynamical modelling of robots, which is commonly the first important step in the modelling, analysis and control of robotic systems. The paper focuses on using the Denavit-Hartenberg (DH) convention for the kinematics and the Newton-Euler formulation for the dynamic modelling of a 5-DoF (degree of freedom) 3D robot. The dynamical model is derived using the software Maple. The derived dynamical model of the 5-DoF robot is converted for Matlab use for future analysis, control ...

  11. Proposed Robot Scheme with 5 DoF and Dynamic Modelling Using Maple Software

    Directory of Open Access Journals (Sweden)

    Shala Ahmet

    2017-11-01

    Full Text Available This paper presents the dynamical modelling of robots, which is commonly the first important step in the modelling, analysis and control of robotic systems. The paper focuses on using the Denavit-Hartenberg (DH) convention for the kinematics and the Newton-Euler formulation for the dynamic modelling of a 5-DoF (degree of freedom) 3D robot. The dynamical model is derived using the software Maple. The derived dynamical model of the 5-DoF robot is converted for Matlab use for future analysis, control and simulations.
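
    As an illustrative aside (not from the record above), the Denavit-Hartenberg convention named in the abstract builds the kinematics from one homogeneous link transform per joint. The sketch below chains such transforms over a hypothetical table of 5-DoF link parameters; the actual parameters of the robot are not given in the record.

    import numpy as np

    def dh_transform(theta, d, a, alpha):
        """Standard DH link transform A_i = Rz(theta) Tz(d) Tx(a) Rx(alpha)."""
        ct, st = np.cos(theta), np.sin(theta)
        ca, sa = np.cos(alpha), np.sin(alpha)
        return np.array([
            [ct, -st * ca,  st * sa, a * ct],
            [st,  ct * ca, -ct * sa, a * st],
            [0.0,      sa,       ca,      d],
            [0.0,     0.0,      0.0,    1.0],
        ])

    def forward_kinematics(dh_table):
        """Multiply the link transforms base-to-tool for (theta, d, a, alpha) rows."""
        T = np.eye(4)
        for params in dh_table:
            T = T @ dh_transform(*params)
        return T

    # Hypothetical 5-DoF configuration (angles in radians, lengths in meters).
    dh_table = [
        (0.3, 0.10, 0.00, np.pi / 2),
        (0.5, 0.00, 0.25, 0.0),
        (-0.2, 0.00, 0.20, 0.0),
        (0.1, 0.00, 0.00, np.pi / 2),
        (0.0, 0.08, 0.00, 0.0),
    ]
    print("end-effector position:", forward_kinematics(dh_table)[:3, 3])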

  12. Functional modelling for integration of human-software-hardware in complex physical systems

    International Nuclear Information System (INIS)

    Modarres, M.

    1996-01-01

    A framework describing the properties of complex physical systems composed of human-software-hardware interactions in terms of their functions is described. It is argued that such a framework is domain-general, so that functional primitives provide a language that is more general than most other modeling methods such as mathematical simulation. The characteristics and types of functional models are described. Examples of using the framework to model physical systems composed of human-software-hardware (hereafter referred to simply as physical systems) are presented. It is concluded that a function-centered model of a physical system provides a capability for generating a high-level simulation of the system for intelligent diagnostic, control or other similar applications.

  13. Software phantom with realistic speckle modeling for validation of image analysis methods in echocardiography

    Science.gov (United States)

    Law, Yuen C.; Tenbrinck, Daniel; Jiang, Xiaoyi; Kuhlen, Torsten

    2014-03-01

    Computer-assisted processing and interpretation of medical ultrasound images is one of the most challenging tasks within image analysis. Physical phenomena in ultrasonographic images, e.g., the characteristic speckle noise and shadowing effects, make the majority of standard methods from image analysis non-optimal. Furthermore, validation of adapted computer vision methods proves to be difficult due to missing ground truth information. There is no widely accepted software phantom in the community and existing software phantoms are not flexible enough to support the use of specific speckle models for different tissue types, e.g., muscle and fat tissue. In this work we propose an anatomical software phantom with a realistic speckle pattern simulation to fill this gap and provide a flexible tool for validation purposes in medical ultrasound image analysis. We discuss the generation of speckle patterns and perform statistical analysis of the simulated textures to obtain quantitative measures of the realism and accuracy regarding the resulting textures.
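
    As an illustrative aside (not the authors' simulator), a common minimal model of fully developed speckle treats it as multiplicative Rayleigh-distributed noise on a noise-free echogenicity map; varying the Rayleigh scale per tissue label gives tissue-specific texture. The phantom geometry and parameters below are hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)

    def add_speckle(echogenicity, sigma):
        """Multiply the noise-free image by mean-normalized Rayleigh(sigma) speckle."""
        speckle = rng.rayleigh(scale=sigma, size=echogenicity.shape)
        return echogenicity * speckle / speckle.mean()

    # Hypothetical phantom: dark background with a brighter circular "tissue" region.
    img = np.full((128, 128), 0.3)
    yy, xx = np.mgrid[:128, :128]
    img[(yy - 64) ** 2 + (xx - 64) ** 2 < 30 ** 2] = 0.8

    noisy = add_speckle(img, sigma=1.0)
    print("speckle image mean/std:", noisy.mean(), noisy.std())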

  14. Modelling the critical success factors of agile software development projects in South Africa

    Directory of Open Access Journals (Sweden)

    Tawanda B. Chiyangwa

    2017-10-01

    Full Text Available Background: The continued failure of agile and traditional software development projects has led to increased consideration of, attention to and dispute over critical success factors, the aspects that are most vital to making a software engineering methodology succeed. Although there is an increasing variety of critical success factors and methodologies, conceptual frameworks that capture their causal relationships are limited. Objective: The objective of this study was to identify and provide insights into the critical success factors that influence the success of software development projects using agile methodologies in South Africa. Method: A quantitative data collection method was used. Data were collected in South Africa through a Web-based survey using structured questionnaires. Results: The results show that organisational factors have a great influence on performance expectancy characteristics. Conclusion: The results of this study yielded a comprehensive model that could provide guidelines to the agile community and to agile professionals.

  15. Software engineering and Ada (Trademark) training: An implementation model for NASA

    Science.gov (United States)

    Legrand, Sue; Freedman, Glenn

    1988-01-01

    The choice of Ada for software engineering for projects such as the Space Station has resulted in government and industrial groups considering training programs that help workers become familiar with both a software culture and the intricacies of a new computer language. The questions of how much time it takes to learn software engineering with Ada, how much an organization should invest in such training, and how the training should be structured are considered. Software engineering is an emerging, dynamic discipline. It is defined by the author as the establishment and application of sound engineering environments, tools, methods, models, principles, and concepts combined with appropriate standards, guidelines, and practices to support computing which is correct, modifiable, reliable and safe, efficient, and understandable throughout the life cycle of the application. Neither the training programs needed, nor the content of such programs, have been well established. This study addresses the requirements for training for NASA personnel and recommends an implementation plan. A curriculum and a means of delivery are recommended. It is further suggested that a knowledgeable programmer may be able to learn Ada in 5 days, but that it takes 6 to 9 months to evolve into a software engineer who uses the language correctly and effectively. The curriculum and implementation plan can be adapted for each NASA Center according to the needs dictated by each project.

  16. THE MODEL OF FORMATION OF RESEARCH COMPETENCE OF FUTURE SOFTWARE ENGINEERS

    Directory of Open Access Journals (Sweden)

    N. Osipova

    2014-06-01

    Full Text Available The article analyzes practical experience and the theoretical and methodological background of the formation of research competence of future software engineers. The article also defines the content, structure, criteria and indicators of research competence of future software engineers, characterizes the levels of formation of this competence and explains the main phases of its formation. In view of the specificity of forming the research competence of software engineers, job market requirements and the social order, much attention in the article is paid to student participation in research projects of the chair, particularly in international projects and projects commissioned by the Ministry of Education and Science of Ukraine. An important factor in the effective formation of research competence of future software engineers is students' work in the chair's scientific schools and their internships in IT companies and IT departments of higher education institutions and other educational establishments, including abroad. Attention is also paid to the need for group work by participants in the educational process, which can be provided through their participation in scientific problem groups, scientific schools and joint research projects. The conducted research confirms the effectiveness of implementing the proposed model of formation of research competence of future software engineers.

  17. Examples of testing global identifiability of biological and biomedical models with the DAISY software.

    Science.gov (United States)

    Saccomani, Maria Pia; Audoly, Stefania; Bellu, Giuseppina; D'Angiò, Leontina

    2010-04-01

    DAISY (Differential Algebra for Identifiability of SYstems) is a recently developed computer algebra software tool which can be used to automatically check global identifiability of (linear and) nonlinear dynamic models described by differential equations involving polynomial or rational functions. Global identifiability is a fundamental prerequisite for model identification which is important not only for biological or medical systems but also for many physical and engineering systems derived from first principles. Lack of identifiability implies that the parameter estimation techniques may not fail, but any obtained numerical estimates will be meaningless. The software does not require understanding of the underlying mathematical principles and can be used by researchers in applied fields with a minimum of mathematical background. We illustrate the DAISY software by checking the a priori global identifiability of two benchmark nonlinear models taken from the literature. The analysis of these two examples includes comparison with other methods and demonstrates how identifiability analysis is simplified by this tool. We then illustrate the identifiability analysis of two further examples, including discussion of some specific aspects related to the role of observability and knowledge of initial conditions in testing identifiability and to the computational complexity of the software. The main focus of this paper is not on the description of the mathematical background of the algorithm, which has been presented elsewhere, but on illustrating its use and some of its more interesting features. DAISY is available on the web site http://www.dei.unipd.it/~pia/. 2010 Elsevier Ltd. All rights reserved.

  18. Software for occupational health and safety risk analysis based on a fuzzy model.

    Science.gov (United States)

    Stefanovic, Miladin; Tadic, Danijela; Djapan, Marko; Macuzic, Ivan

    2012-01-01

    Risk and safety management are very important issues in healthcare systems. Those are complex systems with many entities, hazards and uncertainties. In such an environment, it is very hard to introduce a system for evaluating and simulating significant hazards. In this paper, we analyzed different types of hazards in healthcare systems and we introduced a new fuzzy model for evaluating and ranking hazards. Finally, we presented a developed software solution, based on the suggested fuzzy model for evaluating and monitoring risk.
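
    As a generic illustrative aside (not the paper's model), fuzzy hazard ranking is often sketched by scoring likelihood and severity as triangular fuzzy numbers, combining them component-wise and ranking by centroid defuzzification. All hazard names and ratings below are hypothetical.

    def fuzzy_multiply(a, b):
        """Approximate product of two triangular fuzzy numbers (low, modal, high)."""
        return tuple(x * y for x, y in zip(a, b))

    def centroid(tfn):
        """Defuzzify a triangular fuzzy number by its centroid (l + m + u) / 3."""
        return sum(tfn) / 3.0

    # Hypothetical hazards rated on a 1-10 scale: (likelihood, severity).
    hazards = {
        "needle-stick injury":        ((3, 4, 5), (6, 7, 8)),
        "patient mis-identification": ((1, 2, 3), (8, 9, 10)),
        "slippery floor":             ((4, 5, 6), (2, 3, 4)),
    }

    scores = {name: centroid(fuzzy_multiply(p, s)) for name, (p, s) in hazards.items()}
    for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(f"{name:28s} risk score {score:5.1f}")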

  19. Research on Generating Method of Embedded Software Test Document Based on Dynamic Model

    Science.gov (United States)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying

    2018-03-01

    This paper provides a dynamic model-based test document generation method for embedded software that automatically generates two documents: the test requirements specification and the configuration item test documentation. The method enables dynamic test requirements to be implemented in dynamic models, so that dynamic test requirement tracking can be generated easily; it can automatically produce standardized test requirements and test documentation, addresses inconsistency and incompleteness of document content, and improves efficiency.

  20. Software tools for object-based audio production using the Audio Definition Model

    OpenAIRE

    Geier, Matthias; Carpentier, Thibaut; Noisternig, Markus; Warusfel, Olivier

    2017-01-01

    International audience; We present a publicly available set of tools for the integration of the Audio Definition Model (ADM) in production workflows. ADM is an open metadata model for the description of channel-, scene-, and object-based media within a Broadcast Wave Format (BWF) container. The software tools were developed within the European research project ORPHEUS (https://orpheus-audio.eu/) that aims at developing new end-to-end object-based media chains for broadcast. These tools allow ...

  1. Hybrid Modeling for Testing Intelligent Software for Lunar-Mars Closed Life Support

    Science.gov (United States)

    Malin, Jane T.; Nicholson, Leonard S. (Technical Monitor)

    1999-01-01

    Intelligent software is being developed for closed life support systems with biological components, for human exploration of the Moon and Mars. The intelligent software functions include planning/scheduling, reactive discrete control and sequencing, management of continuous control, and fault detection, diagnosis, and management of failures and errors. Four types of modeling information have been essential to system modeling and simulation to develop and test the software and to provide operational model-based what-if analyses: discrete component operational and failure modes; continuous dynamic performance within component modes, modeled qualitatively or quantitatively; configuration of flows and power among components in the system; and operations activities and scenarios. CONFIG, a multi-purpose discrete event simulation tool that integrates all four types of models for use throughout the engineering and operations life cycle, has been used to model components and systems involved in the production and transfer of oxygen and carbon dioxide in a plant-growth chamber and between that chamber and a habitation chamber with physicochemical systems for gas processing.

  2. Estimating Model Parameters of Adaptive Software Systems in Real-Time

    Science.gov (United States)

    Kumar, Dinesh; Tantawi, Asser; Zhang, Li

    Adaptive software systems have the ability to adapt to changes in workload and execution environment. In order to perform resource management through model-based control in such systems, an accurate mechanism for estimating the software system's model parameters is required. This paper deals with real-time estimation of a performance model for adaptive software systems that process multiple classes of transactional workload. First, insights into the static performance model estimation problem are provided. Then an Extended Kalman Filter (EKF) design is combined with an open queueing network model to dynamically estimate the model parameters in real time. Specific problems that are encountered in the case of multiple classes of workload are analyzed. These problems arise mainly due to the under-determined nature of the estimation problem. This motivates us to propose a modified design of the filter. Insights for choosing the tuning parameters of the modified design, i.e., the number of constraints and the sampling intervals, are provided. The modified filter design is shown to effectively tackle problems with multiple classes of workload through experiments.
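
    As a toy illustrative aside (far simpler than the multi-class design in the record), an Extended Kalman Filter can track a single queueing parameter, here the service rate mu of an M/M/1 station, from noisy mean-response-time observations R = 1 / (mu - lambda). All rates and noise levels below are hypothetical.

    import numpy as np

    rng = np.random.default_rng(1)
    lam = 8.0                  # known arrival rate (requests/s)
    true_mu = 12.0             # service rate to be estimated
    Q, R_noise = 1e-4, 1e-4    # process / measurement noise variances

    x, P = 15.0, 1.0           # initial estimate of mu and its variance
    for _ in range(200):
        z = 1.0 / (true_mu - lam) + rng.normal(0.0, np.sqrt(R_noise))  # observed response time
        # Predict: random-walk model for mu.
        x_pred, P_pred = x, P + Q
        # Update with the linearised measurement h(mu) = 1 / (mu - lambda).
        H = -1.0 / (x_pred - lam) ** 2
        S = H * P_pred * H + R_noise
        K = P_pred * H / S
        x = x_pred + K * (z - 1.0 / (x_pred - lam))
        P = (1.0 - K * H) * P_pred

    print(f"estimated service rate mu ~ {x:.2f} (true value {true_mu})")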

  3. THE MODELS OF LEARNING SOFTWARE ACCESS IN THE CLOUD BASED EDUCATIONAL ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    M. Shyshkina

    2015-02-01

    Full Text Available In the modern information-educational environment there are new models of learning activity based on innovative technological solutions for organizing the environment infrastructure, which include cloud-oriented solutions. The problems of setting up an institution's information technology infrastructure for the needs of its users, and of organizing the tools and services of that environment so as to gain the maximum pedagogical advantage from modern ICT for improving learning outcomes and the organization of scientific and educational activities, require justification of the ways to provide access to software and electronic educational resources. The article outlines the conceptual framework of the study and reviews existing approaches, tools and models of a cloud-based environment, their advantages and disadvantages, and experience of their use. The models of environment architecture and the peculiarities of their pedagogical application are described. The hybrid service model of access to software for educational purposes is grounded. The aim of the research is an analysis of modern approaches to the formation of the cloud-based learning environment of an institution on the basis of different types of service models, and justification of the hybrid service model of access to learning software.

  4. GEMSiRV: a software platform for GEnome-scale metabolic model simulation, reconstruction and visualization.

    Science.gov (United States)

    Liao, Yu-Chieh; Tsai, Ming-Hsin; Chen, Feng-Chi; Hsiung, Chao A

    2012-07-01

    Genome-scale metabolic network models have become an indispensable part of the increasingly important field of systems biology. Metabolic systems biology studies usually include three major components: network model construction, objective- and experiment-guided model editing and visualization, and simulation studies based mainly on flux balance analyses. Bioinformatics tools are required to facilitate these complicated analyses. Although some of the required functions are served separately by existing tools, a free software resource that simultaneously serves the needs of the three major components has not been available. Here we present a software platform, GEMSiRV (GEnome-scale Metabolic model Simulation, Reconstruction and Visualization), which provides functionality for easy metabolic network drafting and editing, amenable network visualization for experimental data integration, and flux balance analysis tools for simulation studies. GEMSiRV comes with downloadable, ready-to-use public-domain metabolic models, reference metabolite/reaction databases and metabolic network maps, all of which can be input into GEMSiRV as starting materials for network construction or simulation analyses. Furthermore, all of the GEMSiRV-generated metabolic models and analysis results, including projects in progress, can be easily exchanged in the research community. GEMSiRV is a powerful integrative resource that may facilitate the development of systems biology studies. The software is freely available on the web at http://sb.nhri.org.tw/GEMSiRV.
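
    As an illustrative aside (a toy network, not a GEMSiRV model), the flux balance analysis step mentioned above amounts to a linear program: maximize a target flux subject to the steady-state constraint S v = 0 and flux bounds. A minimal Python sketch using scipy's linear programming solver follows.

    import numpy as np
    from scipy.optimize import linprog

    # Toy network. Reactions: R1 uptake (-> A), R2 conversion (A -> B), R3 biomass drain (B ->).
    S = np.array([[1.0, -1.0,  0.0],   # metabolite A balance
                  [0.0,  1.0, -1.0]])  # metabolite B balance
    bounds = [(0, 10), (0, 8), (0, 1000)]
    c = np.array([0.0, 0.0, -1.0])     # linprog minimizes, so negate the biomass flux

    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
    print("optimal flux distribution:", res.x)   # limited by the R2 upper bound of 8
    print("maximal biomass flux:", -res.fun)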

  5. Design of complete software GPS signal simulator with low complexity and precise multipath channel model

    Directory of Open Access Journals (Sweden)

    G. Arul Elango

    2016-09-01

    Full Text Available The need for GPS data simulators has become important due to the tremendous growth in the design of versatile GPS receivers. Commercial hardware- and software-based GPS simulators are expensive and time consuming. In this work, a low-cost, simple, novel GPS L1 signal simulator is designed for testing and evaluating the performance of a software GPS receiver in a laboratory environment. A typical real-time paradigm, similar to an actual satellite-derived GPS signal, is created as a computer-generated scenario. In this paper, a GPS software simulator is proposed that may offer a lot of analysis and testing flexibility to researchers and developers, as it is totally software based, running primarily on a laptop/personal computer without the requirement of any hardware. The proposed GPS simulator allows provision for re-configurability and test repeatability and is developed on the VC++ platform to minimize the simulation time. It also incorporates a Rayleigh multipath channel fading model under non-line-of-sight (NLOS) conditions. In this work, to efficiently design the simulator, several Rayleigh fading models, viz. Inverse Discrete Fourier Transform (IDFT), Filtering White Gaussian Noise (FWFN) and modified Sum of Sinusoids (SOS) simulators, are tested and compared in terms of the accuracy of their first- and second-order statistical metrics and execution time, and the latter is found to be the most appropriate Rayleigh multipath model for incorporation into the GPS simulator. The fading model, written in the 'MATLAB' engine, has been linked with the software GPS simulator module, enabling testing of a GPS receiver's functionality in different fading environments.
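
    As a generic illustration of the sum-of-sinusoids idea mentioned in this record, the sketch below generates a complex Rayleigh fading envelope; the number of sinusoids, sampling rate and Doppler frequency are assumed values, and the snippet does not reproduce the paper's modified SOS model.

import numpy as np

# Sum-of-sinusoids generator for a complex Rayleigh fading envelope.
# fd is the maximum Doppler shift, fs the sampling rate (both assumed values).
def sos_rayleigh(n_samples, fs, fd, n_sin=16, seed=0):
    rng = np.random.default_rng(seed)
    t = np.arange(n_samples) / fs
    i_part = np.zeros(n_samples)
    q_part = np.zeros(n_samples)
    for _ in range(n_sin):
        theta = rng.uniform(0.0, 2.0 * np.pi)   # random angle of arrival
        phi_i = rng.uniform(0.0, 2.0 * np.pi)   # random phase, in-phase branch
        phi_q = rng.uniform(0.0, 2.0 * np.pi)   # random phase, quadrature branch
        doppler = 2.0 * np.pi * fd * np.cos(theta)
        i_part += np.cos(doppler * t + phi_i)
        q_part += np.cos(doppler * t + phi_q)
    return (i_part + 1j * q_part) / np.sqrt(n_sin)   # unit mean power

h = sos_rayleigh(n_samples=100_000, fs=1000.0, fd=50.0)
print("mean envelope power ~", np.mean(np.abs(h) ** 2))  # close to 1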

  6. EMMC guidance on quality assurance for academic materials modelling software engineering

    OpenAIRE

    European Materials Modelling Council

    2015-01-01

    Proposed recommendations for software development in LEIT projects. This document presents the advice of software owners, commercial and academic, on what academic software developers could do to produce better quality software, ready to be used by third parties.

  7. Surface models of the male urogenital organs built from the Visible Korean using popular software

    Science.gov (United States)

    Shin, Dong Sun; Park, Jin Seo; Shin, Byeong-Seok

    2011-01-01

    Unlike volume models, surface models, which are empty three-dimensional images, have a small file size, so they can be displayed, rotated, and modified in real time. Thus, surface models of male urogenital organs can be effectively applied to an interactive computer simulation and contribute to the clinical practice of urologists. To create high-quality surface models, the urogenital organs and other neighboring structures were outlined in 464 sectioned images of the Visible Korean male using Adobe Photoshop; the outlines were interpolated on Discreet Combustion; then an almost automatic volume reconstruction followed by surface reconstruction was performed on 3D-DOCTOR. The surface models were refined and assembled in their proper positions on Maya, and a surface model was coated with actual surface texture acquired from the volume model of the structure on specially programmed software. In total, 95 surface models were prepared, particularly complete models of the urinary and genital tracts. These surface models will be distributed to encourage other investigators to develop various kinds of medical training simulations. Increasingly automated surface reconstruction technology using commercial software will enable other researchers to produce their own surface models more effectively. PMID:21829759

  8. Modeling density-driven flow in porous media principles, numerics, software

    CERN Document Server

    Holzbecher, Ekkehard O

    1998-01-01

    Modeling of flow and transport in groundwater has become an important focus of scientific research in recent years. Most contributions to this subject deal with flow situations where density and viscosity changes in the fluid are neglected. This restriction may not always be justified. The models presented in the book demonstrate impressively that the flow pattern may be completely different when density changes are taken into account. The main applications of the models are: thermal and saline convection, geothermal flow, saltwater intrusion, flow through salt formations, etc. This book not only presents basic theory; the reader can also test their knowledge by applying the included software and can set up their own models.

  9. Conceptual study of the application software manager using the Xlet model in the nuclear fields

    International Nuclear Information System (INIS)

    Lee, Joon Koo; Park, Heui Youn; Koo, In Soo; Park, Hee Seok; Kim, Jung Seon; Sohn, Chang Ho

    2003-01-01

    In order to reduce the cost of software maintenance, including software modification, we suggest an object-oriented program that checks the version of the application program, using the Java language, and a technique of executing the downloaded application program via the network using the application manager. In order to change the traditional scheduler into the application manager, we have adopted the Xlet concept in the nuclear field, using the network. Usually, an Xlet means a Java application that runs on a digital television receiver. The Java TV Application Program Interface (API) defines an application model called the Xlet application lifecycle. Java applications that use this lifecycle model are called Xlets. The Xlet application lifecycle is compatible with the existing application environment and virtual machine technology. The Xlet application lifecycle model defines the dialog (protocol) between an Xlet and its environment.

  10. Program SPACECAP: software for estimating animal density using spatially explicit capture-recapture models

    Science.gov (United States)

    Gopalaswamy, Arjun M.; Royle, J. Andrew; Hines, James E.; Singh, Pallavi; Jathanna, Devcharan; Kumar, N. Samba; Karanth, K. Ullas

    2012-01-01

    1. The advent of spatially explicit capture-recapture models is changing the way ecologists analyse capture-recapture data. However, the advantages offered by these new models are not fully exploited because they can be difficult to implement. 2. To address this need, we developed a user-friendly software package, created within the R programming environment, called SPACECAP. This package implements Bayesian spatially explicit hierarchical models to analyse spatial capture-recapture data. 3. Given that a large number of field biologists prefer software with graphical user interfaces for analysing their data, SPACECAP is particularly useful as a tool to increase the adoption of Bayesian spatially explicit capture-recapture methods in practice.

  11. Issues and Challenges of Business Rules Modeling in Software Systems for Business Management

    Directory of Open Access Journals (Sweden)

    Anca Ioana ANDREESCU

    2014-01-01

    Full Text Available Software systems for business management appeared as a result of the growing need to ensure consistent IT support for most of the business activities that organizations have to deal with. Moreover, organizations continue to struggle to obtain competitive advantages on the business market and to lower the cost of developing and maintaining the computer systems that support their operations. As business rules play an important role within any organization, they should be taken into consideration as distinct elements when developing a software system that will operate in a collaborative environment. The paper addresses the problem of business rules modeling, with special emphasis on incorporating business rules in Unified Modeling Language (UML) models.

  12. A new vector radiative transfer model as a part of SCIATRAN 3.0 software package.

    Science.gov (United States)

    Rozanov, Alexei; Rozanov, Vladimir; Burrows, John P.

    The SCIATRAN 3.0 package is a result of further development of the SCIATRAN 2.x software family which, similar to previous versions, comprises a radiative transfer model and a retrieval block. A major improvement was achieved in comparison to previous software versions by adding the vector mode to the radiative transfer model. Thus, the well-established Discrete Ordinate solver can now be run in the vector mode to calculate the scattered solar radiation including polarization, i.e., to simulate all four components of the Stokes vector. Similar to the scalar version, the simulations can be performed for any viewing geometry typical for atmospheric observations in the UV-Vis-NIR spectral range (nadir, limb, off-axis, etc.) as well as for any observer position within or outside the Earth's atmosphere. Similar to the precursor version, the new model is freely available for non-commercial use via the web page of the University of Bremen. In this presentation a short description of the software package, especially of the new vector radiative transfer model, will be given, including remarks on its availability for the scientific community. Furthermore, comparisons to other vector models will be shown and some example problems will be considered where the polarization of the observed radiation must be accounted for to obtain high-quality results.

  13. Evaluation of the finite element software ABAQUS for biomechanical modelling of biphasic tissues.

    Science.gov (United States)

    Wu, J Z; Herzog, W; Epstein, M

    1998-02-01

    The biphasic cartilage model proposed by Mow et al. (1980) has proven successful to capture the essential mechanical features of articular cartilage. In order to analyse the joint contact mechanics in real, anatomical joints, the cartilage model needs to be implemented into a suitable finite element code to approximate the irregular surface geometries of such joints. However, systematic and extensive evaluation of the capacity of commercial software for modelling the contact mechanics with biphasic cartilage layers has not been made. This research was aimed at evaluating the commercial finite element software ABAQUS for analysing biphasic soft tissues. The solutions obtained using ABAQUS were compared with those obtained using other finite element models and analytical solutions for three numerical tests: an unconfined indentation test, a test with the contact of a spherical cartilage surface with a rigid plate, and an axi-symmetric joint contact test. It was concluded that the biphasic cartilage model can be implemented into the commercial finite element software ABAQUS to analyse practical joint contact problems with biphasic articular cartilage layers.

  14. Evolving a Simulation Model Product Line Software Architecture from Heterogeneous Model Representations

    National Research Council Canada - National Science Library

    Greaney, Kevin

    2003-01-01

    .... Many of these large-scale, software-intensive simulation systems were autonomously developed over time, and subject to varying degrees of funding, maintenance, and life-cycle management practices...

  15. An automation of design and modelling tasks in NX Siemens environment with original software - generator module

    Science.gov (United States)

    Zbiciak, M.; Grabowik, C.; Janik, W.

    2015-11-01

    Nowadays the constructional design process is almost exclusively aided with CAD/CAE/CAM systems. It is estimated that nearly 80% of design activities have a routine nature. These routine design tasks are highly susceptible to automation. Design automation is usually done with API tools which allow building original software supporting different engineering activities. In this paper, original software developed in order to automate engineering tasks at the stage of a product's geometrical shape design is presented. The elaborated software works exclusively in the NX Siemens CAD/CAM/CAE environment and was prepared in Microsoft Visual Studio with application of the .NET technology and the NX SNAP library. The software functionality allows designing and modelling of spur and helicoidal involute gears. Moreover, it is possible to estimate relative manufacturing costs. With the Generator module it is possible to design and model both standard and non-standard gear wheels. The main advantage of a model generated in such a way is its better representation of the involute curve in comparison to those which are drawn with the standard tools of specialized CAD systems. This comes from the fact that usually in CAD systems an involute curve is drawn through 3 points that correspond to points located on the addendum circle, the reference diameter of the gear and the base circle, respectively. In the Generator module the involute curve is drawn through 11 involute points which are located on and above the base circle, up to the addendum circle; therefore the 3D gear wheel models are highly accurate. Application of the Generator module makes the modelling process very rapid, so that the gear wheel modelling time is reduced to several seconds. During the conducted research an analysis of the differences between the standard 3-point and the 11-point involutes was made. The results and conclusions drawn from the analysis are presented in detail.
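
    As a side note on the involute sampling discussed in this record, the sketch below computes points on an involute of a circle between the base circle and the addendum circle using the standard parametric equations; the radii and point counts are arbitrary illustrative values, and the code is unrelated to the Generator module itself.

import numpy as np

# Points on an involute of a circle between the base circle (radius r_base)
# and the addendum circle (radius r_max), using the parametric equations
# x = r_b (cos t + t sin t), y = r_b (sin t - t cos t).
def involute_points(r_base, r_max, n_points=11):
    t_max = np.sqrt((r_max / r_base) ** 2 - 1.0)   # roll angle at radius r_max
    t = np.linspace(0.0, t_max, n_points)
    x = r_base * (np.cos(t) + t * np.sin(t))
    y = r_base * (np.sin(t) - t * np.cos(t))
    return np.column_stack([x, y])

pts_11 = involute_points(40.0, 50.0, n_points=11)   # denser 11-point sampling
pts_3 = involute_points(40.0, 50.0, n_points=3)     # coarse 3-point sampling
print(pts_11.round(3))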

  16. A Review Of CEM: Customer Engagement as Innovation Co-Creator

    Directory of Open Access Journals (Sweden)

    Elidjen Elidjen

    2013-12-01

    Full Text Available Competition is very tight, causing companies to look for a competitive edge, both in product packaging and in maintaining good relations with their customers. The management of good relationships is commonly referred to as Customer Relationship Management (CRM). In general, CRM is focused on how to market something to customers and obtain value from them by using information technology. However, it ignores customers' insight that can provide added value to the company's profits. That is what causes the need for Customer Experience Management (CEM) to handle the experience of customers and improve value for customers so that customers become loyal. A more useful definition of CEM is the handling of customer interactions to build brand equity and increase long-term profitability. The five-element approach known as SMART (strategy, metrics, alignment, redesign and technology) has a positive impact on the company. In the end, customers can actualize themselves through a company's brand and products.

  17. Practical Findings from Applying the PSD Model for Evaluating Software Design Specifications

    Science.gov (United States)

    Räisänen, Teppo; Lehto, Tuomas; Oinas-Kukkonen, Harri

    This paper presents practical findings from applying the PSD model to evaluating the support for persuasive features in software design specifications for a mobile Internet device. On the one hand, our experiences suggest that the PSD model fits relatively well for evaluating design specifications. On the other hand, the model would benefit from more specific heuristics for evaluating each technique to avoid unnecessary subjectivity. Better distinction between the design principles in the social support category would also make the model easier to use. Practitioners who have no theoretical background can apply the PSD model to increase the persuasiveness of the systems they design. The greatest benefit of the PSD model for researchers designing new systems may be achieved when it is applied together with a sound theory, such as the Elaboration Likelihood Model. Using the ELM together with the PSD model, one may increase the chances for attitude change.

  18. Health Technology Assessment of CEM Pulpotomy in Permanent Molars with Irreversible Pulpitis

    OpenAIRE

    Yazdani, Shahram; Jadidfard, Mohammad-Pooyan; Tahani, Bahareh; Kazemian, Ali; Dianat, Omid; Alim Marvasti, Laleh

    2013-01-01

    Introduction: Teeth with irreversible pulpitis usually undergo root canal therapy (RCT). This treatment modality is often considered disadvantageous as it removes vital pulp tissue and weakens the tooth structure. A relatively new concept has risen which suggests vital pulp therapy (VPT) for irreversible pulpitis. VPT with calcium enriched mixture (VPT/CEM) has demonstrated favorable treatment outcomes when treating permanent molars with irreversible pulpitis. This study aims to compare patie...

  19. Cem Dias sem Bush: o Partido Republicano, o Governo Obama e o Futuro

    Directory of Open Access Journals (Sweden)

    Cristina Soreanu Pecequilo

    2009-05-01

    Full Text Available The article examines the current situation of the Republican Party in the first hundred days of the Obama administration and its prospects.

  20. An expandable software model for collaborative decision making during the whole building life cycle

    Energy Technology Data Exchange (ETDEWEB)

    Papamichael, K.; Pal, V.; Bourassa, N.; Loffeld, J.; Capeluto, G.

    2000-04-01

    Decisions throughout the life cycle of a building, from design through construction and commissioning to operation and demolition, require the involvement of multiple interested parties (e.g., architects, engineers, owners, occupants and facility managers). The performance of alternative designs and courses of action must be assessed with respect to multiple performance criteria, such as comfort, aesthetics, energy, cost and environmental impact. Several stand-alone computer tools are currently available that address specific performance issues during various stages of a building's life cycle. Some of these tools support collaboration by providing means for synchronous and asynchronous communications, performance simulations, and monitoring of a variety of performance parameters involved in decisions about a building during building operation. However, these tools are not linked in any way, so significant work is required to maintain and distribute information to all parties. In this paper we describe a software model that provides the data management and process control required for collaborative decision making throughout a building's life cycle. The requirements for the model are delineated addressing data and process needs for decision making at different stages of a building's life cycle. The software model meets these requirements and allows addition of any number of processes and support databases over time. What makes the model infinitely expandable is that it is a very generic conceptualization (or abstraction) of processes as relations among data. The software model supports multiple concurrent users, and facilitates discussion and debate leading to decision making. The software allows users to define rules and functions for automating tasks and alerting all participants to issues that need attention. It supports management of simulated as well as real data and continuously generates information useful for improving performance prediction

  2. Model-Based GN and C Simulation and Flight Software Development for Orion Missions beyond LEO

    Science.gov (United States)

    Odegard, Ryan; Milenkovic, Zoran; Henry, Joel; Buttacoli, Michael

    2014-01-01

    For Orion missions beyond low Earth orbit (LEO), the Guidance, Navigation, and Control (GN&C) system is being developed using a model-based approach for simulation and flight software. Lessons learned from the development of GN&C algorithms and flight software for the Orion Exploration Flight Test One (EFT-1) vehicle have been applied to the development of further capabilities for Orion GN&C beyond EFT-1. Continuing the use of a Model-Based Development (MBD) approach with the Matlab®/Simulink® tool suite, the process for GN&C development and analysis has been largely improved. Furthermore, a model-based simulation environment in Simulink, rather than an external C-based simulation, greatly eases the process for development of flight algorithms. The benefits seen by employing lessons learned from EFT-1 are described, as well as the approach for implementing additional MBD techniques. Also detailed are the key enablers for improvements to the MBD process, including enhanced configuration management techniques for model-based software systems, automated code and artifact generation, and automated testing and integration.

  3. A Comparison of Four Software Programs for Implementing Decision Analytic Cost-Effectiveness Models.

    Science.gov (United States)

    Hollman, Chase; Paulden, Mike; Pechlivanoglou, Petros; McCabe, Christopher

    2017-08-01

    The volume and technical complexity of both academic and commercial research using decision analytic modelling has increased rapidly over the last two decades. The range of software programs used for their implementation has also increased, but it remains true that a small number of programs account for the vast majority of cost-effectiveness modelling work. We report a comparison of four software programs: TreeAge Pro, Microsoft Excel, R and MATLAB. Our focus is on software commonly used for building Markov models and decision trees to conduct cohort simulations, given their predominance in the published literature around cost-effectiveness modelling. Our comparison uses three qualitative criteria as proposed by Eddy et al.: "transparency and validation", "learning curve" and "capability". In addition, we introduce the quantitative criterion of processing speed. We also consider the cost of each program to academic users and commercial users. We rank the programs based on each of these criteria. We find that, whilst Microsoft Excel and TreeAge Pro are good programs for educational purposes and for producing the types of analyses typically required by health technology assessment agencies, the efficiency and transparency advantages of programming languages such as MATLAB and R become increasingly valuable when more complex analyses are required.
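
    As context for the kind of cohort model the four programs are compared on, the sketch below runs a minimal three-state Markov cohort simulation in Python; the transition probabilities, costs, utilities and discount rate are invented for illustration and are not taken from the paper.

import numpy as np

# Three-state Markov cohort model: Well, Sick, Dead; annual cycles.
P = np.array([
    [0.85, 0.10, 0.05],   # transitions from Well
    [0.00, 0.70, 0.30],   # transitions from Sick
    [0.00, 0.00, 1.00],   # Dead is absorbing
])
cost = np.array([100.0, 2000.0, 0.0])   # cost per state per cycle (assumed)
qaly = np.array([0.95, 0.60, 0.0])      # utility per state per cycle (assumed)
disc = 0.035                            # annual discount rate (assumed)

cohort = np.array([1.0, 0.0, 0.0])      # whole cohort starts in Well
total_cost = total_qaly = 0.0
for t in range(20):
    df = 1.0 / (1.0 + disc) ** t        # discount factor for cycle t
    total_cost += df * cohort @ cost
    total_qaly += df * cohort @ qaly
    cohort = cohort @ P                 # advance the cohort one cycle

print(f"discounted cost {total_cost:.0f}, discounted QALYs {total_qaly:.2f}")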

  4. Recent advances in molecular epidemiology and detection of Taylorella equigenitalis associated with contagious equine metritis (CEM).

    Science.gov (United States)

    Matsuda, Motoo; Moore, John E

    2003-12-02

    In the present review article, recent molecular advances relating to studies of Taylorella equigenitalis, as well as the recently described second species of the genus Taylorella, namely Taylorella asinigenitalis, are described. Molecular genotyping of T. equigenitalis strains by pulsed-field gel electrophoresis (PFGE) after digestion with suitable restriction enzyme(s) enabled the effective discrimination of strains, thus allowing examination of the mechanism(s) of occurrence and transmission of contagious equine metritis (CEM). Alternatively, polymerase chain reaction (PCR) amplification and nucleotide sequencing of the 16S ribosomal DNA sequence and/or other species-specific sequence(s) as targets were confirmed to be effective for identification of T. equigenitalis. These new analytical methods at the genomic DNA level also enabled the discrimination of the newly discovered donkey-related T. asinigenitalis from T. equigenitalis and, moreover, phylogenetic analysis of genus Taylorella organisms together with other closely related genera. Furthermore, detailed analysis of the genes responsible for CEM within the T. equigenitalis genome would be useful to help elucidate the pathogenic virulence and transmission mechanisms of this important equine pathogen.

  5. Recent advances in CE-MS: Synergy of wet chemistry and instrumentation innovations.

    Science.gov (United States)

    Pantůčková, Pavla; Gebauer, Petr; Boček, Petr; Křivánková, Ludmila

    2011-01-01

    CE with MS detection is a hyphenated technique which greatly improves the ability of CE to deal with real samples, especially those coming from biology and medicine, where the target analytes are present as trace amounts in very complex matrices. CE-MS is now almost a routine technique performed on commercially available instruments. It is currently undergoing tremendous development, both of the technique itself and of its wide application area. Great interest in CE-MS is reflected in the scientific literature by many original research articles and also by numerous reviews. The review presented here has a general scope and belongs to a series of regularly published reviews on the topic. It covers the literature of the last 2 years, from January 2008 to June 2010. It brings a critical selection of related literature sorted into groups reflecting the main topics of current scientific interest: (i) innovations in CE-ESI-MS, (ii) use of alternative interfaces, and (iii) ways to enhance sensitivity. Special attention is paid to novel electrolyte systems amenable to CE-MS including nonvolatile BGEs, to advanced CE separation principles such as MEKC, MEEKC and chiral CE, and to the use of preconcentration techniques. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. CEMS Investigations of Fe-Silicide Phases Formed by the Method of Concentration Controlled Phase Selection

    International Nuclear Information System (INIS)

    Moodley, M. K.; Bharuth-Ram, K.; Waal, H. de; Pretorius, R.

    2002-01-01

    Conversion electron Moessbauer spectroscopy (CEMS) measurements have been made on Fe-silicide samples formed using the method of concentration controlled phase selection. To prepare the samples a 10 nm layer of Fe30M70 (M = Cr, Ni) was evaporated onto Si(100) surfaces, followed by evaporation of a 60 nm Fe layer. Diffusion of the Fe into the Si substrate and the formation of different Fe-Si phases was achieved by subjecting the evaporated samples to a series of heating stages, which consisted of (a) a 10 min anneal at 800 deg. C plus etch of the residual surface layer, (b) a further 3 hr anneal at 800 deg. C, (c) a 60 mJ excimer laser anneal to an energy density of 0.8 J/cm2, and (d) a final 3 hr anneal at 800 deg. C. CEMS measurements were used to track the Fe-silicide phases formed. The CEMS spectra consisted of doublets which, based on established hyperfine parameters, could be assigned to α- or β-FeSi2 or cubic FeSi. The spectra showed that β-FeSi2 had formed already at the first annealing stage. Excimer laser annealing resulted in the formation of a phase with hyperfine parameters consistent with those of α-FeSi2. A further 3 hr anneal at 800 deg. C resulted in complete reversal to the semiconducting β-FeSi2 phase.

  7. Comparative exploration of multidimensional flow cytometry software: a model approach evaluating T cell polyfunctional behavior.

    Science.gov (United States)

    Spear, Timothy T; Nishimura, Michael I; Simms, Patricia E

    2017-08-01

    Advancement in flow cytometry reagents and instrumentation has allowed for simultaneous analysis of large numbers of lineage/functional immune cell markers. Highly complex datasets generated by polychromatic flow cytometry require proper analytical software to answer investigators' questions. A problem among many investigators and flow cytometry Shared Resource Laboratories (SRLs), including our own, is a lack of access to a flow cytometry-knowledgeable bioinformatics team, making it difficult to learn and choose appropriate analysis tool(s). Here, we comparatively assess various multidimensional flow cytometry software packages for their ability to answer a specific biologic question and provide graphical representation output suitable for publication, as well as their ease of use and cost. We assessed polyfunctional potential of TCR-transduced T cells, serving as a model evaluation, using multidimensional flow cytometry to analyze 6 intracellular cytokines and degranulation on a per-cell basis. Analysis of 7 parameters resulted in 128 possible combinations of positivity/negativity, far too complex for basic flow cytometry software to analyze fully. Various software packages were used, analysis methods used in each described, and representative output displayed. Of the tools investigated, automated classification of cellular expression by nonlinear stochastic embedding (ACCENSE) and coupled analysis in Pestle/simplified presentation of incredibly complex evaluations (SPICE) provided the most user-friendly manipulations and readable output, evaluating effects of altered antigen-specific stimulation on T cell polyfunctionality. This detailed approach may serve as a model for other investigators/SRLs in selecting the most appropriate software to analyze complex flow cytometry datasets. Further development and awareness of available tools will help guide proper data analysis to answer difficult biologic questions arising from incredibly complex datasets. © Society

  8. ORTH: R and SAS software for regression models of correlated binary data based on orthogonalized residuals and alternating logistic regressions.

    Science.gov (United States)

    By, Kunthel; Qaqish, Bahjat F; Preisser, John S; Perin, Jamie; Zink, Richard C

    2014-02-01

    This article describes a new software for modeling correlated binary data based on orthogonalized residuals, a recently developed estimating equations approach that includes, as a special case, alternating logistic regressions. The software is flexible with respect to fitting in that the user can choose estimating equations for association models based on alternating logistic regressions or orthogonalized residuals, the latter choice providing a non-diagonal working covariance matrix for second moment parameters providing potentially greater efficiency. Regression diagnostics based on this method are also implemented in the software. The mathematical background is briefly reviewed and the software is applied to medical data sets. Published by Elsevier Ireland Ltd.

  9. On the Use of Variability Operations in the V-Modell XT Software Process Line

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel; Ternité, Thomas

    2016-01-01

    Software process lines provide a systematic approach to develop and manage software processes. It defines a reference process containing general process assets, whereas a well-defined customization approach allows process engineers to create new process variants, e.g., by extending or modifying process assets. Variability operations are an instrument to realize flexibility by explicitly declaring required modifications, which are applied to create a procedurally generated company-specific process. However, little is known about which variability operations are suitable in practice ... as an improvement proposal for other process models. Our findings show that 69 variability operation types are defined across several metamodel versions of which, however, 25 remain unused. The found variability operations allow for systematically modifying the content of process model elements and the process ...

  10. Recent developments on PLASMAKIN - a software package to model the kinetics in gas discharges

    International Nuclear Information System (INIS)

    Pinhao, N R

    2009-01-01

    PLASMAKIN is a user-friendly software package to handle the physical and chemical data used in plasma physics modeling and to compute the production and destruction terms in fluid model equations. These terms account for the particle or energy production and loss rates due to gas-phase and gas-surface reactions. The package has been restructured and expanded to (a) allow the simulation of atomic emission spectra taking into account line broadening processes and radiation trapping; (b) include a library to compute the electron kinetics; (c) include a database of species properties and reactions; and (d) include a Python interface to allow access from scripts and integration with other scientific software tools.
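
    To make the notion of production and destruction terms concrete, the sketch below assembles species source terms for a two-reaction toy chemistry as (stoichiometry matrix) x (reaction rates); the species, reactions and rate coefficients are invented for illustration, and PLASMAKIN's own data formats and interfaces are not reproduced.

import numpy as np

# Toy chemistry: e + A -> 2e + A+ (ionisation), e + A+ + A -> 2A (recombination).
species = ["e", "A", "A+"]
nu = np.array([                  # net stoichiometric coefficients per reaction
    [+1, -1, +1],                # ionisation
    [-1, +1, -1],                # three-body recombination
])
k = np.array([1e-16, 1e-39])     # rate coefficients (assumed values and units)

def source_terms(n):
    """Production minus destruction rate for each species at densities n."""
    n_e, n_A, n_Ap = n
    rates = np.array([
        k[0] * n_e * n_A,        # ionisation rate
        k[1] * n_e * n_Ap * n_A, # recombination rate
    ])
    return nu.T @ rates          # one source term per species

print(source_terms(np.array([1e16, 1e22, 1e16])))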

  11. Mobile Agent-Based Software Systems Modeling Approaches: A Comparative Study

    Directory of Open Access Journals (Sweden)

    Aissam Belghiat

    2016-06-01

    Full Text Available Mobile agent-based applications are a special type of software system which takes advantage of mobile agents in order to provide a new, beneficial paradigm for solving multiple complex problems in several fields and areas such as network management, e-commerce, e-learning, etc. At the same time, we notice a lack of real applications based on this paradigm and a lack of serious evaluations of their modeling approaches. Hence, this paper provides a comparative study of modeling approaches for mobile agent-based software systems. The objective is to give the reader an overview and a thorough understanding of the work that has been done and where the gaps in the research are.

  12. A Technology-Neutral Role-Based Collaboration Model for Software Ecosystems

    DEFF Research Database (Denmark)

    Stanciulescu, Stefan; Rabiser, Daniela; Seidl, Christoph

    2016-01-01

    by contributing a role-based collaboration model for software ecosystems to make such implicit similarities explicit and to raise awareness among developers during their ongoing efforts. We extract this model based on realization artifacts in a specific programming language located in a particular source code...... efforts and information of ongoing development efforts. Finally, using the collaborations defined in the formalism we model real artifacts from Marlin, a firmware for 3D printers, and we show that for the selected scenarios, the five collaborations were sufficient to raise awareness and make implicit...

  13. Use of the EMTP-ATP Software to Develop a Dynamic Model of the Technological Centre

    Directory of Open Access Journals (Sweden)

    Tomas Mozdren

    2016-01-01

    Full Text Available This paper deals with analysis of power generating units installed within the technological centre. To be able to analyse behaviour of such a complex system with accumulation, the dynamic model of the technology was created using the EMTP-ATP software. The current configuration of the dynamic model is based on the block diagram containing all the unconventional sources of electric power. The values produced by ATPDraw are shown in graphs for reference. The dynamic model will serve the purpose of research and observation of the entire technological centre with respect to transients at individual sources of power.

  14. Igpet software for modeling igneous processes: examples of application using the open educational version

    Science.gov (United States)

    Carr, Michael J.; Gazel, Esteban

    2017-04-01

    We provide here an open version of Igpet software, called t-Igpet to emphasize its application for teaching and research in forward modeling of igneous geochemistry. There are three programs, a norm utility, a petrologic mixing program using least squares and Igpet, a graphics program that includes many forms of numerical modeling. Igpet is a multifaceted tool that provides the following basic capabilities: igneous rock identification using the IUGS (International Union of Geological Sciences) classification and several supplementary diagrams; tectonic discrimination diagrams; pseudo-quaternary projections; least squares fitting of lines, polynomials and hyperbolae; magma mixing using two endmembers, histograms, x-y plots, ternary plots and spider-diagrams. The advanced capabilities of Igpet are multi-element mixing and magma evolution modeling. Mixing models are particularly useful for understanding the isotopic variations in rock suites that evolved by mixing different sources. The important melting models include, batch melting, fractional melting and aggregated fractional melting. Crystallization models include equilibrium and fractional crystallization and AFC (assimilation and fractional crystallization). Theses, reports and proposals concerning igneous petrology are improved by numerical modeling. For reviewed publications some elements of modeling are practically a requirement. Our intention in providing this software is to facilitate improved communication and lower entry barriers to research, especially for students.

  15. Meta-Model and UML Profile for Requirements Management of Software and Embedded Systems

    Directory of Open Access Journals (Sweden)

    Arpinen Tero

    2011-01-01

    Full Text Available Software and embedded system companies today encounter problems related to requirements management tool integration, incorrect tool usage, and lack of traceability. This is due to the utilized tools having no clear meta-model and semantics for communicating requirements between different stakeholders. This paper presents a comprehensive meta-model for requirements management. The focus is on the software and embedded system domains. The goal is to define generic requirements management domain concepts and abstract interfaces between requirements management and system development. This leads to a portable requirements management meta-model which can be adapted to various system modeling languages. The created meta-model is prototyped by translating it into a UML profile. The profile is imported into a UML tool which is used for rapid evaluation of the meta-model concepts in practice. The developed profile is associated with a proof-of-concept report generator tool that automatically produces up-to-date documentation from the models in the form of web pages. The profile is adopted to create an example model of an embedded system requirement specification which is built with the profile.

  16. Enhanced Bayesian modelling in BAPS software for learning genetic structures of populations

    Directory of Open Access Journals (Sweden)

    Sirén Jukka

    2008-12-01

    Full Text Available Abstract Background During the most recent decade many Bayesian statistical models and software for answering questions related to the genetic structure underlying population samples have appeared in the scientific literature. Most of these methods utilize molecular markers for the inferences, while some are also capable of handling DNA sequence data. In a number of earlier works, we have introduced an array of statistical methods for population genetic inference that are implemented in the software BAPS. However, the complexity of biological problems related to genetic structure analysis keeps increasing such that in many cases the current methods may provide either inappropriate or insufficient solutions. Results We discuss the necessity of enhancing the statistical approaches to face the challenges posed by the ever-increasing amounts of molecular data generated by scientists over a wide range of research areas and introduce an array of new statistical tools implemented in the most recent version of BAPS. With these methods it is possible, e.g., to fit genetic mixture models using user-specified numbers of clusters and to estimate levels of admixture under a genetic linkage model. Also, alleles representing a different ancestry compared to the average observed genomic positions can be tracked for the sampled individuals, and a priori specified hypotheses about genetic population structure can be directly compared using Bayes' theorem. In general, we have improved further the computational characteristics of the algorithms behind the methods implemented in BAPS facilitating the analyses of large and complex datasets. In particular, analysis of a single dataset can now be spread over multiple computers using a script interface to the software. Conclusion The Bayesian modelling methods introduced in this article represent an array of enhanced tools for learning the genetic structure of populations. Their implementations in the BAPS software are

  17. A Suitable Software Process Improvement Model for the UK Healthcare Industry

    Science.gov (United States)

    Nguyen, Tien D.; Guo, Hong; Naguib, Raouf N. G.

    Over recent years, the UK Healthcare sector has been the prime focus of many reports and industrial surveys, particularly in the field of software development and management issues. This signals the importance of growing concerns regarding quality issues in the Healthcare domain. In response to this, a new tailored Healthcare Software Process Improvement (SPI) model is proposed, which takes into consideration both signals from the industry and insights from the literature.

  18. Efficient and stable perfectly matched layer for CEM

    KAUST Repository

    Duru, Kenneth

    2014-02-01

    An efficient unsplit perfectly matched layer for numerical simulation of electromagnetic waves in unbounded domains is derived via a complex change of variables. In order to surround a Cartesian grid with the PML, the time-dependent PML requires only one (scalar) auxiliary variable in two space dimensions and six (scalar) auxiliary variables in three space dimensions. It is therefore cheap and straightforward to implement. We use Fourier and energy methods to prove the stability of the PML. We extend the stability result to a semi-discrete PML approximated by central finite differences of arbitrary order of accuracy and to a fully discrete problem for the 'Leap-Frog' schemes. This makes precise the usefulness of the derived PML model for long-time simulations. Numerical experiments are presented, illustrating the accuracy and stability of the PML. © 2013 IMACS.
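
    For orientation only, the sketch below evaluates a standard polynomial PML damping profile and the associated complex coordinate-stretching factor s(x, omega) = 1 + sigma(x)/(i*omega) often used in derivations of this kind; the profile shape and all parameter values are generic assumptions and do not represent the specific formulation derived in the paper.

import numpy as np

# Polynomial PML damping profile: zero in the interior, ramping up inside the layer.
def sigma_profile(x, x_start, d, sigma_max, p=2):
    xi = np.clip((x - x_start) / d, 0.0, 1.0)   # normalised depth into the layer
    return sigma_max * xi ** p

x = np.linspace(0.0, 1.2, 241)        # physical domain [0, 1], PML in [1, 1.2]
sigma = sigma_profile(x, x_start=1.0, d=0.2, sigma_max=80.0)
omega = 2.0 * np.pi * 5.0             # angular frequency of interest (assumed)
s = 1.0 + sigma / (1j * omega)        # complex stretching factor along x
print("max |s| inside the layer:", np.abs(s).max().round(3))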

  19. An Open Software Platform for Sharing Water Resource Models, Code and Data

    Science.gov (United States)

    Knox, Stephen; Meier, Philipp; Mohamed, Khaled; Korteling, Brett; Matrosov, Evgenii; Huskova, Ivana; Harou, Julien; Rosenberg, David; Tilmant, Amaury; Medellin-Azuara, Josue; Wicks, Jon

    2016-04-01

    The modelling of managed water resource systems requires new approaches in the face of increasing future uncertainty. Water resources management models, even if applied to diverse problem areas, use common approaches such as representing the problem as a network of nodes and links. We propose a data management software platform, called Hydra, that uses this commonality to allow multiple models using a node-link structure to be managed and run using a single software system. Hydra's user interface allows users to manage network topology and associated data. Hydra feeds this data directly into a model, importing from and exporting to different file formats using Apps. An App connects Hydra to a custom model, a modelling system such as GAMS or MATLAB or to different file formats such as MS Excel, CSV and ESRI Shapefiles. Hydra allows users to manage their data in a single, consistent place. Apps can be used to run domain-specific models and allow users to work with their own required file formats. The Hydra App Store offers a collaborative space where model developers can publish, review and comment on Apps, models and data. Example Apps and open-source libraries are available in a variety of languages (Python, Java and .NET). The App Store can act as a hub for water resource modellers to view and share Apps, models and data easily. This encourages an ecosystem of development using a shared platform, resulting in more model integration and potentially greater unity within resource modelling communities. www.hydraplatform.org www.hydraappstore.com

  20. Using Mathematica software for coal gasification simulations – Selected kinetic model application

    Directory of Open Access Journals (Sweden)

    Sebastian Iwaszenko

    2015-01-01

    Full Text Available Coal gasification is recognized as one of the promising Clean Coal Technologies. As the process itself is complicated and technologically demanding, it is the subject of much research. In the paper, the problem of using the volumetric, non-reactive core and Johnson models for coal gasification and underground coal gasification is considered. The usage of Mathematica software for solving and analysing the models' equations is presented. Coal parameters were estimated for five Polish mines: Piast, Ziemowit, Janina, Szczygłowice and Bobrek. For each coal the models' parameters were determined. The determination of the parameters was based on a reactivity assessment at 50% char conversion. The calculations show relatively small differences between the conversion predicted by the volumetric and the non-reactive core models. More significant differences were observed for the Johnson model, but they did not exceed 10% for the final char conversion. The conceptual model for underground coal gasification was also presented.
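
    As a quick illustration of the first two kinetic models named in this record, the sketch below evaluates the closed-form char conversion curves of the volumetric and the shrinking (non-reactive) core models; the rate constant is an assumed value, the Johnson model is omitted, and none of this reproduces the paper's Mathematica calculations or fitted coal parameters.

import numpy as np

def volumetric(t, k):
    # dX/dt = k (1 - X)  ->  X(t) = 1 - exp(-k t)
    return 1.0 - np.exp(-k * t)

def shrinking_core(t, k):
    # dX/dt = k (1 - X)^(2/3)  ->  X(t) = 1 - (1 - k t / 3)^3, until X reaches 1
    return 1.0 - np.clip(1.0 - k * t / 3.0, 0.0, None) ** 3

k = 0.05                          # 1/min, assumed rate constant
t = np.linspace(0.0, 60.0, 121)   # minutes
X_vol, X_core = volumetric(t, k), shrinking_core(t, k)
print("conversion at 30 min:", round(volumetric(30.0, k), 3),
      round(shrinking_core(30.0, k), 3))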

  1. V and V-based remaining fault estimation model for safety–critical software of a nuclear power plant

    International Nuclear Information System (INIS)

    Eom, Heung-seop; Park, Gee-yong; Jang, Seung-cheol; Son, Han Seong; Kang, Hyun Gook

    2013-01-01

    Highlights: ► A software fault estimation model based on Bayesian Nets and V and V. ► Use of quantified data derived from qualitative V and V results. ► Faults insertion and elimination process was modeled in the context of probability. ► Systematically estimates the expected number of remaining faults. -- Abstract: Quantitative software reliability measurement approaches have some limitations in demonstrating the proper level of reliability in cases of safety–critical software. One of the more promising alternatives is the use of software development quality information. Particularly in the nuclear industry, regulatory bodies in most countries use both probabilistic and deterministic measures for ensuring the reliability of safety-grade digital computers in NPPs. The point of deterministic criteria is to assess the whole development process and its related activities during the software development life cycle for the acceptance of safety–critical software. In addition software Verification and Validation (V and V) play an important role in this process. In this light, we propose a V and V-based fault estimation method using Bayesian Nets to estimate the remaining faults for safety–critical software after the software development life cycle is completed. By modeling the fault insertion and elimination processes during the whole development phases, the proposed method systematically estimates the expected number of remaining faults.

  2. OpenFLUX: efficient modelling software for 13C-based metabolic flux analysis

    Directory of Open Access Journals (Sweden)

    Nielsen Lars K

    2009-05-01

    Full Text Available Abstract Background The quantitative analysis of metabolic fluxes, i.e., in vivo activities of intracellular enzymes and pathways, provides key information on biological systems in systems biology and metabolic engineering. It is based on a comprehensive approach combining (i) tracer cultivation on 13C substrates, (ii) 13C labelling analysis by mass spectrometry and (iii) mathematical modelling for experimental design, data processing, flux calculation and statistics. Whereas the cultivation and the analytical parts are fairly advanced, a lack of appropriate modelling software solutions for all modelling aspects in flux studies is limiting the application of metabolic flux analysis. Results We have developed OpenFLUX as a user-friendly, yet flexible software application for small and large scale 13C metabolic flux analysis. The application is based on the new Elementary Metabolite Unit (EMU) framework, significantly enhancing computation speed for flux calculation. From simple notation of metabolic reaction networks defined in a spreadsheet, the OpenFLUX parser automatically generates MATLAB-readable metabolite and isotopomer balances, thus strongly facilitating model creation. The model can be used to perform experimental design, parameter estimation and sensitivity analysis either using the built-in gradient-based search or Monte Carlo algorithms or in user-defined algorithms. Exemplified for a microbial flux study with 71 reactions, 8 free flux parameters and mass isotopomer distributions of 10 metabolites, OpenFLUX allowed the EMU-based model to be compiled automatically from an Excel file containing metabolic reactions and carbon transfer mechanisms, showing its user-friendliness. It reliably reproduced the published data and optimum flux distributions for the network under study were found quickly ( Conclusion We have developed a fast, accurate application to perform steady-state 13C metabolic flux analysis. OpenFLUX will strongly facilitate and

  3. Using COMSOL Multiphysics Software to Analyze the Thin Film Resistance Model of a Conductor on PET

    Science.gov (United States)

    Carradero-Santiago, Carolyn; Merced-Sanabria, Milzaida; Vedrine-Pauléus, Josee

    2015-03-01

    In this research work, we will develop a virtual model to analyze the electrical conductivity of a thin film with three layers: one of graphene or a conducting metal film, polyethylene terephthalate (PET), and poly(3,4-ethylenedioxythiophene) polystyrene sulfonate (PEDOT:PSS). COMSOL Multiphysics will be the software used to develop the virtual model to analyze the thin-film layers. COMSOL software allows simulation and modelling of physical phenomena represented by differential equations such as those of heat transfer, fluid movement, electromagnetism and structural mechanics. In the work, we will define the geometry of the model; in this case we want three layers: PET, the conducting layer and PEDOT:PSS. We will then add the materials and assign PET as the lower layer, the conductor as the middle layer and PEDOT:PSS as the upper layer. We will analyze the model with varying thickness of the top conducting layer. This simulation will allow us to analyze the electrical conductivity and visualize the model with varying voltage potential, or bias, across the plates.
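
    As a back-of-the-envelope companion to the layered-film description in this record, the sketch below combines the sheet resistances of the conducting layers in parallel; the conductivities and thicknesses are invented illustrative values and the COMSOL model itself is not reproduced.

def sheet_resistance(sigma, thickness):
    # R_sheet = 1 / (sigma * t), in ohms per square
    return 1.0 / (sigma * thickness)

# Assumed conductivities (S/m) and thicknesses (m) for the conducting layers;
# PET is treated as insulating and left out of the parallel combination.
layers = {
    "conductor (thin metal/graphene film)": (1.0e6, 50e-9),
    "PEDOT:PSS": (1.0e3, 100e-9),
}

# Conducting layers in parallel: total sheet conductance is the sum of conductances.
g_total = sum(1.0 / sheet_resistance(s, t) for s, t in layers.values())
print(f"combined sheet resistance ~ {1.0 / g_total:.1f} ohm/sq")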

  4. OpenFLUID: an open-source software environment for modelling fluxes in landscapes

    Science.gov (United States)

    Fabre, Jean-Christophe; Rabotin, Michaël; Crevoisier, David; Libres, Aline; Dagès, Cécile; Moussa, Roger; Lagacherie, Philippe; Raclot, Damien; Voltz, Marc

    2013-04-01

    Integrative landscape functioning has become a common concept in environmental management. Landscapes are complex systems where many processes interact in time and space. In agro-ecosystems, these processes are mainly physical processes, including hydrological-processes, biological processes and human activities. Modelling such systems requires an interdisciplinary approach, coupling models coming from different disciplines, developed by different teams. In order to support collaborative works, involving many models coupled in time and space for integrative simulations, an open software modelling platform is a relevant answer. OpenFLUID is an open source software platform for modelling landscape functioning, mainly focused on spatial fluxes. It provides an advanced object-oriented architecture allowing to i) couple models developed de novo or from existing source code, and which are dynamically plugged to the platform, ii) represent landscapes as hierarchical graphs, taking into account multi-scale, spatial heterogeneities and landscape objects connectivity, iii) run and explore simulations in many ways : using the OpenFLUID software interfaces for users (command line interface, graphical user interface), or using external applications such as GNU R through the provided ROpenFLUID package. OpenFLUID is developed in C++ and relies on open source libraries only (Boost, libXML2, GLib/GTK, OGR/GDAL, …). For modelers and developers, OpenFLUID provides a dedicated environment for model development, which is based on an open source toolchain, including the Eclipse editor, the GCC compiler and the CMake build system. OpenFLUID is distributed under the GPLv3 open source license, with a special exception allowing to plug existing models licensed under any license. It is clearly in the spirit of sharing knowledge and favouring collaboration in a community of modelers. OpenFLUID has been involved in many research applications, such as modelling of hydrological network

  5. GSEVM v.2: MCMC software to analyse genetically structured environmental variance models

    DEFF Research Database (Denmark)

    Ibáñez-Escriche, N; Garcia, M; Sorensen, D

    2010-01-01

    This note provides a description of software that allows Bayesian genetically structured variance models to be fitted using Markov chain Monte Carlo (MCMC). The gsevm v.2 program was written in Fortran 90. The DOS and Unix executable programs, the user's guide, and some example files are freely available for research purposes at http://www.bdporc.irta.es/estudis.jsp. The main feature of the program is to compute Monte Carlo estimates of marginal posterior distributions of parameters of interest. The program is quite flexible, allowing the user to fit a variety of linear models at the level of the mean

  6. Software-defined networking model for smart transformers with ISO/IEC/IEEE 21451 sensors

    Directory of Open Access Journals (Sweden)

    Longhua Guo

    2017-06-01

    Full Text Available The advanced IEC 61850 smart transformer has shown an improved performance in monitoring, controlling, and protecting the equipment in smart substations. However, heterogeneity, feasibility, and network control problems have limited the smart transformer’s performance in networks. To address these issues, a software-defined networking model was proposed using ISO/IEC/IEEE 21451 networks. An IEC-61850-based network controller was designed as a new kind of intelligent electrical device (IED. The proposed data and information models enhanced the network awareness ability and facilitated the access of smart sensors in transformer to communication networks. The performance evaluation results showed an improved efficiency.

  7. Application of the AHP method in modeling the trust and reputation of software agents

    Science.gov (United States)

    Zytniewski, Mariusz; Klementa, Marek; Skorupka, Dariusz; Stanek, Stanislaw; Duchaczek, Artur

    2016-06-01

    Given the unique characteristics of cyberspace and, in particular, the number of inherent security threats, communication between software agents becomes a highly complex issue and a major challenge that, on the one hand, needs to be continuously monitored and, on the other, awaits new solutions addressing its vulnerabilities. An approach that has recently come into view mimics mechanisms typical of social systems and is based on trust and reputation that assist agents in deciding which other agents to interact with. The paper offers an enhancement to existing trust and reputation models, involving the application of the AHP method that is widely used for decision support in social systems, notably for risks analysis. To this end, it is proposed to expand the underlying conceptual basis by including such notions as self-trust and social trust, and to apply these to software agents. The discussion is concluded with an account of an experiment aimed at testing the effectiveness of the proposed solution.
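
    For readers unfamiliar with the AHP machinery mentioned in this record, the sketch below derives priority weights from a pairwise comparison matrix via its principal eigenvector and checks consistency; the criteria and judgements are invented for illustration and do not correspond to the paper's trust and reputation model.

import numpy as np

# Pairwise comparison matrix for three hypothetical trust criteria
# (e.g. own experience vs. peer reports vs. interaction history).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                # normalised priority vector

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)        # consistency index
ri = 0.58                                   # Saaty's random index for n = 3
print("weights:", np.round(w, 3), "consistency ratio:", round(ci / ri, 3))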

  8. New software library of geometrical primitives for modelling of solids used in Monte Carlo detector simulations

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    We present our effort to create a new software library of geometrical primitives, which are used for solid modelling in Monte Carlo detector simulations. We plan to replace and unify current geometrical primitive classes in the CERN software projects Geant4 and ROOT with this library. Each solid is represented by a C++ class with methods suited for measuring distances of particles from the surface of a solid and for determining whether the particles are located inside, outside or on the surface of the solid. We use a numerical tolerance for determining whether the particles are located on the surface. The class methods also contain basic support for visualization. We use dedicated test suites for validation of the shape codes. These also include special performance and numerical value comparison tests that help with the analysis of possible candidate class methods and verify that our new implementation proposals were designed and implemented properly. Currently, bridge classes are u...
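    The kind of interface the abstract describes (distance to the surface, plus inside/outside/surface classification within a numerical tolerance) can be illustrated with a minimal Python sketch; the actual library is C++, so this is only a conceptual analogue for a single sphere.

```python
# Minimal sketch of a solid primitive that reports distance to its surface and
# classifies a point as inside/outside/on the surface within a tolerance.
import math

class Sphere:
    def __init__(self, center, radius, tolerance=1e-9):
        self.center, self.radius, self.tol = center, radius, tolerance

    def distance_to_surface(self, p):
        return abs(math.dist(p, self.center) - self.radius)  # unsigned distance

    def locate(self, p):
        d = math.dist(p, self.center) - self.radius           # signed distance
        if abs(d) <= self.tol:
            return "surface"
        return "inside" if d < 0 else "outside"

s = Sphere((0.0, 0.0, 0.0), 1.0)
print(s.locate((0.5, 0, 0)), s.locate((1.0, 0, 0)), s.locate((2.0, 0, 0)))
```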

  9. Software package for modeling spin-orbit motion in storage rings

    Science.gov (United States)

    Zyuzin, D. V.

    2015-12-01

    A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10^6-10^9 particles in a beam during 10^9 turns in an accelerator (about 10^12-10^15 integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin-orbit dynamics.
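    As a toy illustration of what "tracking spin over many turns" means (and not of COSY Infinity or the JEDI lattice), the sketch below precesses the in-plane spin of one particle by a fixed, assumed spin tune per turn and records its projection on the initial direction.

```python
# Toy spin tracking: in an idealized ring the in-plane spin simply precesses
# by a fixed spin tune each turn; real simulations integrate full spin-orbit
# equations for millions of particles and turns.
import math

def track_spin(spin_tune, n_turns):
    """Return the horizontal spin projection after each turn."""
    phase, history = 0.0, []
    for _ in range(n_turns):
        phase += 2.0 * math.pi * spin_tune   # precession accumulated per turn
        history.append(math.cos(phase))      # projection on initial direction
    return history

proj = track_spin(spin_tune=0.001, n_turns=1000)   # illustrative values only
print(round(proj[0], 4), round(proj[499], 4), round(proj[-1], 4))
```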

  10. NASAL-Geom, a free upper respiratory tract 3D model reconstruction software

    Science.gov (United States)

    Cercos-Pita, J. L.; Cal, I. R.; Duque, D.; de Moreta, G. Sanjuán

    2018-02-01

    The tool NASAL-Geom, a free software package for 3D model reconstruction of the upper respiratory tract, is described here. As free software, researchers and professionals are welcome to obtain, analyze, improve and redistribute it, potentially increasing the rate of development and reducing, at the same time, ethical conflicts regarding medical applications that cannot be analyzed. Additionally, the tool has been optimized for the specific task of reading Computerized Tomography scans of the upper respiratory tract and producing 3D geometries. The reconstruction process is divided into three stages: preprocessing (including Metal Artifact Reduction, noise removal, and feature enhancement), segmentation (where the nasal cavity is identified), and 3D geometry reconstruction. The tool has been automated (i.e. no human intervention is required), a critical feature for avoiding bias in the reconstructed geometries. The applied methodology is discussed, as well as the program's robustness and precision.

  11. Cost-effective industrial software rejuvenation using domain-specific models

    NARCIS (Netherlands)

    Mooij, A.J.; Eggen, G.; Hooman, J.; Wezep, H. van

    2015-01-01

    Software maintenance consumes a significant and increasing proportion of industrial software engineering budgets, only to maintain the existing product functionality. This hinders the development of new innovative features with added value to customers. To make software development efforts more

  12. Laser scanner data processing and 3D modeling using a free and open source software

    International Nuclear Information System (INIS)

    Gabriele, Fatuzzo; Michele, Mangiameli; Giuseppe, Mussumeci; Salvatore, Zito

    2015-01-01

    Laser scanning is a technology that allows geometric objects to be surveyed quickly, with a high level of detail and completeness, based on the signal emitted by the laser and the corresponding return signal. When the incident laser radiation hits the object to be detected, the radiation is reflected. The purpose is to build a three-dimensional digital model that allows the reality of the object to be reconstructed and studies to be conducted regarding its design, restoration and/or conservation. When the laser scanner is equipped with a digital camera, the result of the measurement process is a set of points in XYZ coordinates with high density and accuracy, together with radiometric and RGB values. In this case, the set of measured points is called a “point cloud” and allows the reconstruction of the Digital Surface Model. Although post-processing is usually performed with closed source software, whose copyright restricts free use, free and open source software can increase the performance by far. Indeed, the latter can be freely used and provides the possibility to display and even customize the source code. The experience started at the Faculty of Engineering in Catania is aimed at finding a valuable free and open source tool, MeshLab (an Italian software package for data processing), to be compared with a reference closed source software package for data processing, i.e. RapidForm. In this work, we compare the results obtained with MeshLab and RapidForm through the planning of the survey and the acquisition of the point cloud of a morphologically complex statue.

  13. A software tool for modification of human voxel models used for application in radiation protection

    International Nuclear Information System (INIS)

    Becker, Janine; Zankl, Maria; Petoussi-Henss, Nina

    2007-01-01

    This note describes a new software tool called 'VolumeChange' that was developed to modify the masses and location of organs of virtual human voxel models. A voxel model is a three-dimensional representation of the human body in the form of an array of identification numbers that are arranged in slices, rows and columns. Each entry in this array represents a voxel; organs are represented by those voxels having the same identification number. With this tool, two human voxel models were adjusted to fit the reference organ masses of a male and a female adult, as defined by the International Commission on Radiological Protection (ICRP). The alteration of an already existing voxel model is a complicated process, leading to many problems that have to be solved. To solve those intricacies in an easy way, a new software tool was developed and is presented here. If the organs are modified, no bit of tissue, i.e. voxel, may vanish nor should an extra one appear. That means that organs cannot be modified without considering the neighbouring tissue. Thus, the principle of organ modification is based on the reassignment of voxels from one organ/tissue to another; actually deleting and adding voxels is only possible at the external surface, i.e. skin. In the software tool described here, the modifications are done by semi-automatic routines but including human control. Because of the complexity of the matter, a skilled person has to validate that the applied changes to organs are anatomically reasonable. A graphical user interface was designed to fulfil the purpose of a comfortable working process, and an adequate graphical display of the modified voxel model was developed. Single organs, organ complexes and even whole limbs can be edited with respect to volume, shape and location. (note)
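    The reassignment principle described above can be illustrated with a minimal numpy sketch; this is not the VolumeChange code, just the idea that an organ is grown by relabelling voxels of the neighbouring tissue, so the total number of voxels never changes.

```python
# Minimal sketch: grow one organ by reassigning adjacent voxels from another
# tissue; no voxel vanishes or appears, only labels change.
import numpy as np
from scipy.ndimage import binary_dilation

def grow_organ(labels, organ_id, tissue_id, n_voxels):
    """Reassign up to n_voxels of `tissue_id` that touch `organ_id`."""
    organ = labels == organ_id
    candidates = binary_dilation(organ) & (labels == tissue_id)  # touching voxels
    idx = np.argwhere(candidates)[:n_voxels]
    labels[tuple(idx.T)] = organ_id          # relabel; total voxel count unchanged
    return labels

labels = np.zeros((20, 20, 20), dtype=int)
labels[5:15, 5:15, 5:15] = 1                 # surrounding tissue
labels[9:11, 9:11, 9:11] = 2                 # small "organ"
before = (labels == 2).sum()
labels = grow_organ(labels, organ_id=2, tissue_id=1, n_voxels=10)
print(before, "->", (labels == 2).sum())     # the organ grew by 10 voxels
```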

  14. Managing Software Project Risks (Analysis Phase) with Proposed Fuzzy Regression Analysis Modelling Techniques with Fuzzy Concepts

    OpenAIRE

    Elzamly, Abdelrafe; Hussin, Burairah

    2014-01-01

    The aim of this paper is to propose new mining techniques by which we can study the impact of different risk management techniques and different software risk factors on software analysis development projects. The new mining technique uses fuzzy multiple regression analysis techniques with fuzzy concepts to manage the risks in a software project and to mitigate risk through software process improvement. The top ten software risk factors in the analysis phase and thirty risk management techni...

  15. Application of the Software as a Service Model to the Control of Complex Building Systems

    Energy Technology Data Exchange (ETDEWEB)

    Stadler, Michael; Donadee, Jon; Marnay, Chris; Lai, Judy; Mendes, Goncalo; Appen, Jan von; Mégel, Oliver; Bhattacharya, Prajesh; DeForest, Nicholas

    2011-03-18

    In an effort to create broad access to its optimization software, Lawrence Berkeley National Laboratory (LBNL), in collaboration with the University of California at Davis (UC Davis) and OSISoft, has recently developed a Software as a Service (SaaS) Model for reducing energy costs, cutting peak power demand, and reducing carbon emissions for multipurpose buildings. UC Davis currently collects and stores energy usage data from buildings on its campus. Researchers at LBNL sought to demonstrate that a SaaS application architecture could be built on top of this data system to optimize the scheduling of electricity and heat delivery in the building. The SaaS interface, known as WebOpt, consists of two major parts: a) the investment & planning module and b) the operations module, which builds on the investment & planning module. The operational scheduling and load shifting optimization models within the operations module use data from load prediction and electrical grid emissions models to create an optimal operating schedule for the next week, reducing peak electricity consumption while maintaining quality of energy services. LBNL's application also provides facility managers with suggested energy infrastructure investments for achieving their energy cost and emission goals based on historical data collected with OSISoft's system. This paper describes these models as well as the SaaS architecture employed by LBNL researchers to provide asset scheduling services to UC Davis. The peak demand, emissions, and cost implications of the asset operation schedule and investments suggested by this optimization model are analyzed.
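    The scheduling idea behind such operations modules can be sketched, in much simplified form, as a small linear program that shifts a flexible load toward cheap hours. This is not the WebOpt model itself; prices, limits and totals below are made-up illustration values.

```python
# Minimal sketch of load-shifting as a linear program: deliver a fixed total
# energy over six hours at minimum cost, subject to an hourly power limit.
import numpy as np
from scipy.optimize import linprog

prices = np.array([0.08, 0.07, 0.12, 0.20, 0.22, 0.10])  # $/kWh per hour (toy)
total_energy = 30.0                                        # kWh to deliver
max_per_hour = 10.0                                        # kWh/h limit

res = linprog(c=prices,                        # minimize total energy cost
              A_eq=[np.ones_like(prices)],     # deliver exactly total_energy
              b_eq=[total_energy],
              bounds=[(0, max_per_hour)] * len(prices))
print("schedule (kWh/h):", np.round(res.x, 1), "cost: $", round(res.fun, 2))
```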

  16. Application of the Software as a Service Model to the Control of Complex Building Systems

    Energy Technology Data Exchange (ETDEWEB)

    Stadler, Michael; Donadee, Jonathan; Marnay, Chris; Mendes, Goncalo; Appen, Jan von; Megel, Oliver; Bhattacharya, Prajesh; DeForest, Nicholas; Lai, Judy

    2011-03-17

    In an effort to create broad access to its optimization software, Lawrence Berkeley National Laboratory (LBNL), in collaboration with the University of California at Davis (UC Davis) and OSISoft, has recently developed a Software as a Service (SaaS) Model for reducing energy costs, cutting peak power demand, and reducing carbon emissions for multipurpose buildings. UC Davis currently collects and stores energy usage data from buildings on its campus. Researchers at LBNL sought to demonstrate that a SaaS application architecture could be built on top of this data system to optimize the scheduling of electricity and heat delivery in the building. The SaaS interface, known as WebOpt, consists of two major parts: a) the investment & planning module and b) the operations module, which builds on the investment & planning module. The operational scheduling and load shifting optimization models within the operations module use data from load prediction and electrical grid emissions models to create an optimal operating schedule for the next week, reducing peak electricity consumption while maintaining quality of energy services. LBNL's application also provides facility managers with suggested energy infrastructure investments for achieving their energy cost and emission goals based on historical data collected with OSISoft's system. This paper describes these models as well as the SaaS architecture employed by LBNL researchers to provide asset scheduling services to UC Davis. The peak demand, emissions, and cost implications of the asset operation schedule and investments suggested by this optimization model are analysed.

  17. Experiences with Formal Engineering : Model-Based Specification, Implementation and Testing of a Software Bus at Neopost

    NARCIS (Netherlands)

    Sijtema, Marten; Stoelinga, Mariëlle Ida Antoinette; Belinfante, Axel; Marinelli, Lawrence; Salaün, Gwen; Schätz, Bernhard

    We report on the actual industrial use of formal methods during the development of a software bus. At Neopost Inc., we developed the server component of a software bus, called the XBus, using formal methods during the design, validation and testing phase: We modeled our design of the XBus in the

  18. The use of process simulation models in virtual commissioning of process automation software in drinking water treatment plants

    NARCIS (Netherlands)

    Worm, G.I.M.; Kelderman, J.P.; Lapikas, T.; Van der Helm, A.W.C.; Van Schagen, K.M.; Rietveld, L.C.

    2012-01-01

    This research deals with the contribution of process simulation models to the factory acceptance test (FAT) of process automation (PA) software of drinking water treatment plants. Two test teams tested the same piece of modified PA-software. One team used an advanced virtual commissioning (AVC)

  19. Reliable software systems via chains of object models with provably correct behavior

    International Nuclear Information System (INIS)

    Yakhnis, A.; Yakhnis, V.

    1996-01-01

    This work addresses specification and design of reliable safety-critical systems, such as nuclear reactor control systems. Reliability concerns are addressed in complementary fashion by different fields. Reliability engineers build software reliability models, etc. Safety engineers focus on prevention of potential harmful effects of systems on the environment. Software/hardware correctness engineers focus on production of reliable systems on the basis of mathematical proofs. The authors think that correctness may be a crucial guiding issue in the development of reliable safety-critical systems. However, purely formal approaches are not adequate for the task, because they neglect the connection with the informal customer requirements. They address this as follows. First, on the basis of the requirements, they build a model of the system interactions with the environment, where the system is viewed as a black box. They will provide foundations for automated tools which will (a) demonstrate to the customer that all of the scenarios of system behavior are presented in the model, (b) uncover scenarios not present in the requirements, and (c) uncover inconsistent scenarios. The developers will work with the customer until the black box model no longer possesses scenarios (b) and (c) above. Second, the authors will build a chain of several increasingly detailed models, where the first model is the black box model and the last model is used to automatically generate proven executable code. The behavior of each model will be proved to conform to the behavior of the previous one. They build each model as a cluster of interactive concurrent objects, thus they allow both top-down and bottom-up development

  20. Integrating Remote Sensing with Species Distribution Models; Mapping Tamarisk Invasions Using the Software for Assisted Habitat Modeling (SAHM).

    Science.gov (United States)

    West, Amanda M; Evangelista, Paul H; Jarnevich, Catherine S; Young, Nicholas E; Stohlgren, Thomas J; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan

    2016-10-11

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fit with satellite remote sensing data to map invasive tamarisk (Tamarix spp.) along the Arkansas River in Southeastern Colorado. The models tested included boosted regression trees (BRT), Random Forest (RF), multivariate adaptive regression splines (MARS), generalized linear model (GLM), and Maxent. These analyses were conducted using a newly developed software package called the Software for Assisted Habitat Modeling (SAHM). All models were trained with 499 presence points, 10,000 pseudo-absence points, and predictor variables acquired from the Landsat 5 Thematic Mapper (TM) sensor over an eight-month period to distinguish tamarisk from native riparian vegetation using detection of phenological differences. From the Landsat scenes, we used individual bands and calculated Normalized Difference Vegetation Index (NDVI), Soil-Adjusted Vegetation Index (SAVI), and tasseled cap transformations. All five models identified current tamarisk distribution on the landscape successfully based on threshold independent and threshold dependent evaluation metrics with independent location data. To account for model specific differences, we produced an ensemble of all five models with map output highlighting areas of agreement and areas of uncertainty. Our results demonstrate the usefulness of species distribution models in analyzing remotely sensed data and the utility of ensemble mapping, and showcase the capability of SAHM in pre-processing and executing multiple complex models.
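    Two of the steps mentioned above are generic enough to sketch: computing NDVI from red/NIR reflectance and combining several binary model predictions into an ensemble agreement map. The sketch below is illustrative only (not the SAHM workflow); the reflectance values and thresholds are made up.

```python
# Minimal sketch: NDVI from toy red/NIR bands and an ensemble agreement map
# built from three hypothetical binary presence predictions.
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red + 1e-9)

red = np.array([[0.10, 0.12, 0.30], [0.08, 0.25, 0.11], [0.09, 0.10, 0.28]])
nir = np.array([[0.40, 0.42, 0.32], [0.45, 0.27, 0.41], [0.44, 0.43, 0.30]])
v = ndvi(nir, red)
print(np.round(v, 2))

# three hypothetical model predictions (1 = presence predicted)
preds = np.stack([v > 0.50, v > 0.45, v > 0.55]).astype(int)
agreement = preds.sum(axis=0)   # 0-3 models agreeing per pixel
print(agreement)
```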

  1. Integrating remote sensing with species distribution models; Mapping tamarisk invasions using the Software for Assisted Habitat Modeling (SAHM)

    Science.gov (United States)

    West, Amanda M.; Evangelista, Paul H.; Jarnevich, Catherine S.; Young, Nicholas E.; Stohlgren, Thomas J.; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan

    2016-01-01

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fit with satellite remote sensing data to map invasive tamarisk (Tamarix spp.) along the Arkansas River in Southeastern Colorado. The models tested included boosted regression trees (BRT), Random Forest (RF), multivariate adaptive regression splines (MARS), generalized linear model (GLM), and Maxent. These analyses were conducted using a newly developed software package called the Software for Assisted Habitat Modeling (SAHM). All models were trained with 499 presence points, 10,000 pseudo-absence points, and predictor variables acquired from the Landsat 5 Thematic Mapper (TM) sensor over an eight-month period to distinguish tamarisk from native riparian vegetation using detection of phenological differences. From the Landsat scenes, we used individual bands and calculated Normalized Difference Vegetation Index (NDVI), Soil-Adjusted Vegetation Index (SAVI), and tasseled cap transformations. All five models identified current tamarisk distribution on the landscape successfully based on threshold independent and threshold dependent evaluation metrics with independent location data. To account for model specific differences, we produced an ensemble of all five models with map output highlighting areas of agreement and areas of uncertainty. Our results demonstrate the usefulness of species distribution models in analyzing remotely sensed data and the utility of ensemble mapping, and showcase the capability of SAHM in pre-processing and executing multiple complex models.

  2. Common Practices from Two Decades of Water Resources Modelling Published in Environmental Modelling & Software: 1997 to 2016

    Science.gov (United States)

    Ames, D. P.; Peterson, M.; Larsen, J.

    2016-12-01

    A steady flow of manuscripts describing integrated water resources management (IWRM) modelling has been published in Environmental Modelling & Software since the journal's inaugural issue in 1997. These papers represent two decades of peer-reviewed scientific knowledge regarding methods, practices, and protocols for conducting IWRM. We have undertaken to explore this specific assemblage of literature with the intention of identifying commonly reported procedures in terms of data integration methods, modelling techniques, approaches to stakeholder participation, means of communication of model results, and other elements of the model development and application life cycle. Initial results from this effort will be presented including a summary of commonly used practices, and their evolution over the past two decades. We anticipate that results will show a pattern of movement toward greater use of both stakeholder/participatory modelling methods as well as increased use of automated methods for data integration and model preparation. Interestingly, such results could be interpreted to show that the availability of better, faster, and more integrated software tools and technologies free the modeler to take a less technocratic and more human approach to water resources modelling.

  3. Utilizing Visual Effects Software for Efficient and Flexible Isostatic Adjustment Modelling

    Science.gov (United States)

    Meldgaard, A.; Nielsen, L.; Iaffaldano, G.

    2017-12-01

    The isostatic adjustment signal generated by transient ice sheet loading is an important indicator of past ice sheet extent and the rheological constitution of the interior of the Earth. Finite element modelling has proved to be a very useful tool in these studies. We present a simple numerical model for 3D viscoelastic Earth deformation and a new approach to the design of such models utilizing visual effects software designed for the film and game industry. The software package Houdini offers an assortment of optimized tools and libraries which greatly facilitate the creation of efficient numerical algorithms. In particular, we make use of Houdini's procedural workflow, the SIMD programming language VEX, Houdini's sparse matrix creation and inversion libraries, an inbuilt tetrahedralizer for grid creation, and the user interface, which facilitates effortless manipulation of 3D geometry. We mitigate many of the time-consuming steps associated with the authoring of efficient algorithms from scratch while still keeping the flexibility that may be lost with the use of commercial dedicated finite element programs. We test the efficiency of the algorithm by comparing simulation times with off-the-shelf solutions from the Abaqus software package. The algorithm is tailored for the study of local isostatic adjustment patterns, in close vicinity to present ice sheet margins. In particular, we wish to examine possible causes for the considerable spatial differences in the uplift magnitude which are apparent from field observations in these areas. Such features, with spatial scales of tens of kilometres, are not resolvable with current global isostatic adjustment models, and may require the inclusion of local topographic features. We use the presented algorithm to study a near field area where field observations are abundant, namely, Disko Bay in West Greenland with the intention of constraining Earth parameters and ice thickness. In addition, we assess how local

  4. Software engineering the current practice

    CERN Document Server

    Rajlich, Vaclav

    2011-01-01

    INTRODUCTION: History of Software Engineering; Software Properties; Origins of Software; Birth of Software Engineering; Third Paradigm: Iterative Approach; Software Life Span Models; Staged Model; Variants of Staged Model; Software Technologies: Programming Languages and Compilers; Object-Oriented Technology; Version Control System; Software Models: Class Diagrams; UML Activity Diagrams; Class Dependency Graphs and Contracts. SOFTWARE CHANGE: Introduction to Software Change; Characteristics of Software Change; Phases of Software Change; Requirements and Their Elicitation; Requirements Analysis and Change Initiation; Concepts and Concept

  5. An Efficient Technique for Bayesian Modelling of Family Data Using the BUGS software

    Directory of Open Access Journals (Sweden)

    Harold T Bae

    2014-11-01

    Full Text Available Linear mixed models have become a popular tool to analyze continuous data from family-based designs by using random effects that model the correlation of subjects from the same family. However, mixed models for family data are challenging to implement with the BUGS (Bayesian inference Using Gibbs Sampling) software because of the high-dimensional covariance matrix of the random effects. This paper describes an efficient parameterization that utilizes the singular value decomposition of the covariance matrix of random effects, includes the BUGS code for such implementation, and extends the parameterization to generalized linear mixed models. The implementation is evaluated using simulated data and an example from a large family-based study is presented with a comparison to other existing methods.
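    The reparameterization idea can be illustrated outside BUGS with a few lines of numpy: correlated random effects u ~ N(0, K) can be written as u = U d^(1/2) z with K = U diag(d) U' and z ~ N(0, I), so a sampler only has to draw independent components. The sketch below uses a toy covariance matrix and is not the paper's BUGS code.

```python
# Minimal numpy sketch of the SVD-based reparameterization of correlated
# random effects: draw independent normals z and map them through U*sqrt(d).
import numpy as np

rng = np.random.default_rng(0)
K = np.array([[1.0, 0.5, 0.25],      # toy covariance (relationship) matrix
              [0.5, 1.0, 0.5 ],
              [0.25, 0.5, 1.0 ]])

U, d, _ = np.linalg.svd(K)           # K symmetric PSD, so K = U diag(d) U'
L = U * np.sqrt(d)                   # maps independent effects to correlated ones

z = rng.standard_normal((3, 100000)) # independent standard-normal effects
u = L @ z                            # correlated random effects
print(np.round(np.cov(u), 2))        # empirical covariance is close to K
```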

  6. Statistical analysis of probabilistic models of software product lines with quantitative constraints

    DEFF Research Database (Denmark)

    Beek, M.H. ter; Legay, A.; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking for the analysis of probabilistic models of software product lines with complex quantitative constraints and advanced feature installation options. Such models are specified in the feature-oriented language QFLan, a rich process algebra whose operational behaviour interacts with a store of constraints, neatly separating product configuration from product behaviour. The resulting probabilistic configurations and behaviour converge seamlessly in a semantics based on DTMCs, thus enabling quantitative analyses ranging from the likelihood of certain behaviour to the expected average cost of products. This is supported by a Maude implementation of QFLan, integrated with the SMT solver Z3 and the distributed statistical model checker MultiVeStA. Our approach is illustrated with a bikes product line case study.

  7. Comparison of Antenna Array Systems Using OFDM for Software Radio via the SIBIC Model

    Directory of Open Access Journals (Sweden)

    Robert D. Palmer

    2005-09-01

    Full Text Available This paper investigates the performance of two candidates for software radio WLAN, reconfigurable OFDM modulation and antenna diversity, in an indoor environment. The scenario considered is a 20 m×10 m×3 m room with two base units and one mobile unit. The two base units use omnidirectional antennas to transmit and the mobile unit uses either a single antenna with equalizer, a fixed beamformer with equalizer, or an adaptive beamformer with equalizer to receive. The modulation constellation of the data is QPSK and 16-QAM. The response of the channel at the mobile unit is simulated using a three-dimensional indoor WLAN propagation model that generates multipath components with realistic spatial and temporal correlation. An underlying assumption of the scenario is that existing antenna hardware is available and could be exploited if software processing resources are allocated. The results of the simulations indicate that schemes using more resources outperform simpler schemes in most cases. This implies that desired user performance could be used to dynamically assign software processing resources to the demands of a particular indoor WLAN channel if such resources are available.

  8. Surgical model-view-controller simulation software framework for local and collaborative applications.

    Science.gov (United States)

    Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2011-07-01

    Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.
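    The decoupled-simulation idea (fast haptic updates running many times per slow visual frame) can be sketched as two loops at different rates. This is only a toy illustration of the rate decoupling, not the authors' framework, and the rates are nominal values taken from the abstract.

```python
# Toy decoupled loop: haptic updates at ~1000 Hz, visual frames at ~60 Hz,
# using a fixed-step accumulator so each process keeps its own rate.
HAPTIC_DT = 1.0 / 1000.0    # haptic (force) update step
FRAME_DT = 1.0 / 60.0       # visual frame step

def run(total_time):
    t, next_frame, haptic_steps, frames = 0.0, 0.0, 0, 0
    while t < total_time:
        haptic_steps += 1            # fast loop: force/collision update
        t += HAPTIC_DT
        if t >= next_frame:          # slow loop: render a visual frame
            frames += 1
            next_frame += FRAME_DT
    return haptic_steps, frames

print(run(1.0))   # roughly (1000, 60) per simulated second
```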

  9. Formal model-based development for safety-critical embedded software

    International Nuclear Information System (INIS)

    Kim, Jin Hyun; Choi, Jin Young

    2005-01-01

    Safety-critical embedded software for nuclear I and C systems is developed under safety and reliability regulations. A programmable logic controller (PLC) is a computer system for the instrumentation and control (I and C) systems of nuclear power plants. A PLC consists of various I and C logics in software, including a real-time operating system (RTOS). Hence, errors related to the RTOS should be detected and eliminated during the development process. In practice, the verification and validation of the RTOS is performed in a test procedure, in which many testing tasks are embedded in the RTOS and run under a test environment. However, the test process alone cannot be enough to guarantee the safety and reliability of the RTOS. Therefore, in this paper, we introduce the application of formal methods to the development of software for the PLC. In particular, we apply formal methods to the development of an RTOS for the PLC, which is at a safety-critical level. In this development, we use I-Logix statecharts for specification, and model checking to verify the specification.

  10. On the Characterization and Software Implementation of General Protein Lattice Models

    Science.gov (United States)

    Bechini, Alessio

    2013-01-01

    Abstract models of proteins have been widely used as a practical means to computationally investigate general properties of the system. In lattice models any sterically feasible conformation is represented as a self-avoiding walk on a lattice, and residue types are limited in number. So far, only two- or three-dimensional lattices have been used. The inspection of the neighborhood of alpha carbons in the core of real proteins reveals that also lattices with higher coordination numbers, possibly in higher dimensional spaces, can be adopted. In this paper, a new general parametric lattice model for simplified protein conformations is proposed and investigated. It is shown how the supporting software can be consistently designed to let algorithms that operate on protein structures be implemented in a lattice-agnostic way. The necessary theoretical foundations are developed and organically presented, pinpointing the role of the concept of main directions in lattice-agnostic model handling. Subsequently, the model features across dimensions and lattice types are explored in tests performed on benchmark protein sequences, using a Python implementation. Simulations give insights on the use of square and triangular lattices in a range of dimensions. The trend of potential minimum for sequences of different lengths, varying the lattice dimension, is uncovered. Moreover, an extensive quantitative characterization of the usage of the so-called “move types” is reported for the first time. The proposed general framework for the development of lattice models is simple yet complete, and an object-oriented architecture can be proficiently employed for the supporting software, by designing ad-hoc classes. The proposed framework represents a new general viewpoint that potentially subsumes a number of solutions previously studied. The adoption of the described model pushes to look at protein structure issues from a more general and essential perspective, making computational
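    The basic object in such lattice models, a conformation as a self-avoiding walk on a lattice, can be illustrated in a few lines. The sketch below is a minimal 2D square-lattice example (the paper generalizes to other lattices and dimensions) and uses naive restart-on-dead-end growth, not the paper's algorithms.

```python
# Minimal sketch: grow a self-avoiding walk of n residues on a 2D square
# lattice, restarting whenever the walk traps itself in a dead end.
import random

MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]   # main directions of the square lattice

def self_avoiding_walk(n, seed=42):
    rng = random.Random(seed)
    while True:
        walk = [(0, 0)]
        occupied = {(0, 0)}
        while len(walk) < n:
            x, y = walk[-1]
            free = [(x + dx, y + dy) for dx, dy in MOVES
                    if (x + dx, y + dy) not in occupied]
            if not free:                     # dead end: start over
                break
            step = rng.choice(free)
            walk.append(step)
            occupied.add(step)
        if len(walk) == n:
            return walk

print(self_avoiding_walk(15))
```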

  11. Software Uncertainty in Integrated Environmental Modelling: the role of Semantics and Open Science

    Science.gov (United States)

    de Rigo, Daniele

    2013-04-01

    Computational aspects increasingly shape environmental sciences [1]. Actually, transdisciplinary modelling of complex and uncertain environmental systems is challenging computational science (CS) and also the science-policy interface [2-7]. Large spatial-scale problems falling within this category - i.e. wide-scale transdisciplinary modelling for environment (WSTMe) [8-10] - often deal with factors (a) for which deep-uncertainty [2,11-13] may prevent usual statistical analysis of modelled quantities and need different ways for providing policy-making with science-based support. Here, practical recommendations are proposed for tempering a peculiar - not infrequently underestimated - source of uncertainty. Software errors in complex WSTMe may subtly affect the outcomes with possible consequences even on collective environmental decision-making. Semantic transparency in CS [2,8,10,14,15] and free software [16,17] are discussed as possible mitigations (b). Software uncertainty, black-boxes and free software. Integrated natural resources modelling and management (INRMM) [29] frequently exploits chains of nontrivial data-transformation models (D-TM), each of them affected by uncertainties and errors. Those D-TM chains may be packaged as monolithic specialized models, maybe only accessible as black-box executables (if accessible at all) [50]. For end-users, black-boxes merely transform inputs into the final outputs, relying on classical peer-reviewed publications for describing the internal mechanism. While software tautologically plays a vital role in CS, it is often neglected in favour of more theoretical aspects. This paradox has been provocatively described as "the invisibility of software in published science. Almost all published papers required some coding, but almost none mention software, let alone include or link to source code" [51]. Recently, this primacy of theory over reality [52-54] has been challenged by new emerging hybrid approaches [55] and by the

  12. Integrating Remote Sensing with Species Distribution Models; Mapping Tamarisk Invasions Using the Software for Assisted Habitat Modeling (SAHM)

    OpenAIRE

    West, Amanda M.; Evangelista, Paul H.; Jarnevich, Catherine S.; Young, Nicholas E.; Stohlgren, Thomas J.; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan

    2016-01-01

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fit with satellite remote sensing data to map invasive tamarisk (Tama...

  13. Development of the quality assessment model of EHR software in family medicine practices: research based on user satisfaction.

    Science.gov (United States)

    Kralj, Damir; Kern, Josipa; Tonkovic, Stanko; Koncar, Miroslav

    2015-09-09

    Family medicine practices (FMPs) form the basis of the Croatian health care system. Use of electronic health record (EHR) software is mandatory and it plays an important role in running these practices, but important functional features still remain uneven and largely left to the will of the software developers. The objective of this study was to develop a novel and comprehensive model for functional evaluation of the EHR software in FMPs, based on current world standards, models and projects, as well as on actual user satisfaction and requirements. Based on previous theoretical and experimental research in this area, we built an initial framework model consisting of six basic categories as the basis for an online survey questionnaire. Family doctors assessed perceived software quality by using a five-point Likert-type scale. Using exploratory factor analysis and appropriate statistical methods over the collected data, the final optimal structure of the novel model was formed. Special attention was focused on the validity and quality of the novel model. The online survey collected a total of 384 cases. The obtained results indicate both the quality of the assessed software and the quality in use of the novel model. The strong ergonomic orientation of the novel measurement model was particularly emphasised. The resulting novel model has been validated multiple times and is comprehensive and universal. It could be used to assess the user-perceived quality of almost all forms of ambulatory EHR software and is therefore useful to all stakeholders in this area of health care informatisation.
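    A common companion step when validating a Likert-scale instrument like this one is checking the internal consistency of a group of items, for example with Cronbach's alpha. The sketch below is generic and not the authors' analysis; the survey scores are made up.

```python
# Minimal sketch: Cronbach's alpha for the internal consistency of a set of
# Likert items, alpha = k/(k-1) * (1 - sum(item variances) / total variance).
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items matrix of Likert scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# toy data: 6 respondents rating 4 items on a 1-5 scale
scores = [[4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4],
          [2, 3, 2, 3], [4, 4, 5, 4], [3, 4, 3, 3]]
print(round(cronbach_alpha(scores), 2))
```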

  14. Implementation of New Process Models for Tailored Polymer Composite Structures into Processing Software Packages

    International Nuclear Information System (INIS)

    Nguyen, Ba Nghiep; Jin, Xiaoshi; Wang, Jin; Phelps, Jay; Tucker, Charles L. III; Kunc, Vlastimil; Bapanapalli, Satish K.; Smith, Mark T.

    2010-01-01

    This report describes the work conducted under the Cooperative Research and Development Agreement (CRADA) (Nr. 260) between the Pacific Northwest National Laboratory (PNNL) and Autodesk, Inc. to develop and implement process models for injection-molded long-fiber thermoplastics (LFTs) in processing software packages. The structure of this report is organized as follows. After the Introduction Section (Section 1), Section 2 summarizes the current fiber orientation models developed for injection-molded short-fiber thermoplastics (SFTs). Section 3 provides an assessment of these models to determine their capabilities and limitations, and the developments needed for injection-molded LFTs. Section 4 then focuses on the development of a new fiber orientation model for LFTs. This model is termed the anisotropic rotary diffusion - reduced strain closure (ARD-RSC) model as it explores the concept of anisotropic rotary diffusion to capture the fiber-fiber interaction in long-fiber suspensions and uses the reduced strain closure method of Wang et al. to slow down the orientation kinetics in concentrated suspensions. In contrast to fiber orientation modeling, before this project, no standard model was developed to predict the fiber length distribution in molded fiber composites. Section 5 is therefore devoted to the development of a fiber length attrition model in the mold. Sections 6 and 7 address the implementations of the models in AMI, and the conclusions drawn from this work are presented in Section 8.

  15. SAHM:VisTrails (Software for Assisted Habitat Modeling for VisTrails): training course

    Science.gov (United States)

    Holcombe, Tracy

    2014-01-01

    VisTrails is an open-source management and scientific workflow system designed to integrate the best of both scientific workflow and scientific visualization systems. Developers can extend the functionality of the VisTrails system by creating custom modules for bundled VisTrails packages. The Invasive Species Science Branch of the U.S. Geological Survey (USGS) Fort Collins Science Center (FORT) and the U.S. Department of the Interior’s North Central Climate Science Center have teamed up to develop and implement such a module—the Software for Assisted Habitat Modeling (SAHM). SAHM expedites habitat modeling and helps maintain a record of the various input data, the steps before and after processing, and the modeling options incorporated in the construction of an ecological response model. There are four main advantages to using the SAHM:VisTrails combined package for species distribution modeling: (1) formalization and tractable recording of the entire modeling process; (2) easier collaboration through a common modeling framework; (3) a user-friendly graphical interface to manage file input, model runs, and output; and (4) extensibility to incorporate future and additional modeling routines and tools. In order to meet increased interest in the SAHM:VisTrails package, the FORT offers a training course twice a year. The course includes a combination of lecture, hands-on work, and discussion. Please join us and other ecological modelers to learn the capabilities of the SAHM:VisTrails package.

  16. Implementation of New Process Models for Tailored Polymer Composite Structures into Processing Software Packages

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ba Nghiep; Jin, Xiaoshi; Wang, Jin; Phelps, Jay; Tucker III, Charles L.; Kunc, Vlastimil; Bapanapalli, Satish K.; Smith, Mark T.

    2010-02-23

    This report describes the work conducted under the Cooperative Research and Development Agreement (CRADA) (Nr. 260) between the Pacific Northwest National Laboratory (PNNL) and Autodesk, Inc. to develop and implement process models for injection-molded long-fiber thermoplastics (LFTs) in processing software packages. The structure of this report is organized as follows. After the Introduction Section (Section 1), Section 2 summarizes the current fiber orientation models developed for injection-molded short-fiber thermoplastics (SFTs). Section 3 provides an assessment of these models to determine their capabilities and limitations, and the developments needed for injection-molded LFTs. Section 4 then focuses on the development of a new fiber orientation model for LFTs. This model is termed the anisotropic rotary diffusion - reduced strain closure (ARD-RSC) model as it explores the concept of anisotropic rotary diffusion to capture the fiber-fiber interaction in long-fiber suspensions and uses the reduced strain closure method of Wang et al. to slow down the orientation kinetics in concentrated suspensions. In contrast to fiber orientation modeling, before this project, no standard model was developed to predict the fiber length distribution in molded fiber composites. Section 5 is therefore devoted to the development of a fiber length attrition model in the mold. Sections 6 and 7 address the implementations of the models in AMI, and the conclusions drawn from this work are presented in Section 8.

  17. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

    Software quality is a specific property which tells what kind of standard a software product should have. In a software project, quality is the key factor in the success or decline of a software-related organization. Much research has been done regarding software quality. Software-related organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good quality software. Quality is divided into three main layers, which are Software Quality Assurance (SQA), Software Qu...

  18. Light measurement model to assess software development process improvement Modelo liviano de medidas para evaluar la mejora de procesos de desarrollo de software MLM-PDS

    Directory of Open Access Journals (Sweden)

    Diana Vásquez

    2010-12-01

    Full Text Available Software development companies in Colombia face a number of problems, such as the construction of software in an artisanal, empirical and disorganized way. It is therefore necessary for these companies to implement projects to improve their development processes, because ensuring product quality by improving their software processes is a step they must take in order to compete in the market. When implementing process improvement models, it is not enough to state whether a company is actually obtaining benefits; one of the first actions in an improvement project is to determine the current status of the process. Only by measuring is it possible to know the state of a process in an objective way, and only through this is it possible to plan strategies and solutions regarding the improvements to make, depending on the objectives of the organization. This paper proposes a light measurement model to assess the software development process, which seeks to help Colombian software development companies determine whether their improvement efforts are effective in achieving the objectives and goals set, through the use of measures that evaluate the improvement of their development processes. This allows the current practices of the company to be characterized, identifying weaknesses, strengths and capabilities of the processes carried out within it, and thus controlling or preventing the causes of low quality or deviations in costs or planning.

  19. Radiative transfer modeling through terrestrial atmosphere and ocean accounting for inelastic processes: Software package SCIATRAN

    International Nuclear Information System (INIS)

    Rozanov, V.V.; Dinter, T.; Rozanov, A.V.; Wolanin, A.; Bracher, A.; Burrows, J.P.

    2017-01-01

    SCIATRAN is a comprehensive software package which is designed to model radiative transfer processes in the terrestrial atmosphere and ocean in the spectral range from the ultraviolet to the thermal infrared (0.18–40 μm). It accounts for multiple scattering processes, polarization, thermal emission and ocean–atmosphere coupling. The main goal of this paper is to present a recently developed version of SCIATRAN which accurately takes into account inelastic radiative processes in both the atmosphere and the ocean. In the scalar version of the coupled ocean–atmosphere radiative transfer solver presented by Rozanov et al. we have implemented the simulation of rotational Raman scattering, vibrational Raman scattering, and chlorophyll and colored dissolved organic matter fluorescence. In this paper we discuss and explain the numerical methods used in SCIATRAN to solve the scalar radiative transfer equation including trans-spectral processes, and demonstrate how some selected radiative transfer problems are solved using the SCIATRAN package. In addition we present selected comparisons of SCIATRAN simulations with published benchmark results, independent radiative transfer models, and various measurements from satellite, ground-based, and ship-borne instruments. The extended SCIATRAN software package, along with a detailed User's Guide, is made available for scientists and students who are undertaking their own research, typically at universities, via the web page of the Institute of Environmental Physics (IUP), University of Bremen: (http://www.iup.physik.uni-bremen.de). - Highlights: • A new version of the software package SCIATRAN is presented. • Inelastic scattering in water and atmosphere is implemented in SCIATRAN. • Raman scattering and fluorescence can be included in radiative transfer calculations. • Comparisons to other radiative transfer models show excellent agreement. • Comparisons to observations show consistent results.

  20. Waste Management facilities cost information: System Cost Model Software Quality Assurance Plan. Revision 2

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, B.L.; Lundeen, A.S.

    1996-02-01

    In May of 1994, Lockheed Idaho Technologies Company (LITCO) in Idaho Falls, Idaho and subcontractors developed the System Cost Model (SCM) application. The SCM estimates life-cycle costs of the entire US Department of Energy (DOE) complex for designing; constructing; operating; and decommissioning treatment, storage, and disposal (TSD) facilities for mixed low-level, low-level, transuranic, and mixed transuranic waste. The SCM uses parametric cost functions to estimate life-cycle costs for various treatment, storage, and disposal modules which reflect planned and existing facilities at DOE installations. In addition, SCM can model new facilities based on capacity needs over the program life cycle. The SCM also provides transportation costs for truck and rail, which include transport of contact-handled, remote-handled, and alpha (transuranic) wastes. The user can provide input data (default data is included in the SCM) including the volume and nature of waste to be managed, the time period over which the waste is to be managed, and the configuration of the waste management complex (i.e., where each installation's generated waste will be treated, stored, and disposed). Then the SCM uses parametric cost equations to estimate the costs of pre-operations (designing), construction costs, operation management, and decommissioning these waste management facilities. For the product to be effective and useful the SCM users must have a high level of confidence in the data generated by the software model. The SCM Software Quality Assurance Plan is part of the overall SCM project management effort to ensure that the SCM is maintained as a quality product and can be relied on to produce viable planning data. This document defines tasks and deliverables to ensure continued product integrity, provide increased confidence in the accuracy of the data generated, and meet the LITCO's quality standards during the software maintenance phase. 8 refs., 1 tab.

  1. Waste Management facilities cost information: System Cost Model Software Quality Assurance Plan. Revision 2

    International Nuclear Information System (INIS)

    Peterson, B.L.; Lundeen, A.S.

    1996-02-01

    In May of 1994, Lockheed Idaho Technologies Company (LITCO) in Idaho Falls, Idaho and subcontractors developed the System Cost Model (SCM) application. The SCM estimates life-cycle costs of the entire US Department of Energy (DOE) complex for designing; constructing; operating; and decommissioning treatment, storage, and disposal (TSD) facilities for mixed low-level, low-level, transuranic, and mixed transuranic waste. The SCM uses parametric cost functions to estimate life-cycle costs for various treatment, storage, and disposal modules which reflect planned and existing facilities at DOE installations. In addition, SCM can model new facilities based on capacity needs over the program life cycle. The SCM also provides transportation costs for truck and rail, which include transport of contact-handled, remote-handled, and alpha (transuranic) wastes. The user can provide input data (default data is included in the SCM) including the volume and nature of waste to be managed, the time period over which the waste is to be managed, and the configuration of the waste management complex (i.e., where each installation's generated waste will be treated, stored, and disposed). Then the SCM uses parametric cost equations to estimate the costs of pre-operations (designing), construction costs, operation management, and decommissioning these waste management facilities. For the product to be effective and useful the SCM users must have a high level of confidence in the data generated by the software model. The SCM Software Quality Assurance Plan is part of the overall SCM project management effort to ensure that the SCM is maintained as a quality product and can be relied on to produce viable planning data. This document defines tasks and deliverables to ensure continued product integrity, provide increased confidence in the accuracy of the data generated, and meet the LITCO's quality standards during the software maintenance phase. 8 refs., 1 tab

  2. A software engineering perspective on environmental modeling framework design: The object modeling system

    Science.gov (United States)

    The environmental modeling community has historically been concerned with the proliferation of models and the effort associated with collective model development tasks (e.g., code generation, data provisioning and transformation, etc.). Environmental modeling frameworks (EMFs) have been developed to...

  3. ITS Version 3.0: Powerful, user-friendly software for radiation modelling

    International Nuclear Information System (INIS)

    Kensek, R.P.; Halbleib, J.A.; Valdez, G.D.

    1993-01-01

    ITS (the Integrated Tiger Series) is a powerful, but user-friendly, software package permitting state-of-the-art modelling of electron and/or photon radiation effects. The programs provide Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields. The ITS system combines operational simplicity and physical accuracy in order to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems.

  4. Control software architecture and operating modes of the Model M-2 maintenance system

    International Nuclear Information System (INIS)

    Satterlee, P.E. Jr.; Martin, H.L.; Herndon, J.N.

    1984-04-01

    The Model M-2 maintenance system is the first completely digitally controlled servomanipulator. The M-2 system allows dexterous operations to be performed remotely using bilateral force-reflecting master/slave techniques, and its integrated operator interface takes advantage of touch-screen-driven menus to allow selection of all possible operating modes. The control system hardware for this system has been described previously. This paper describes the architecture of the overall control system. The system's various modes of operation are identified, the software implementation of each is described, system diagnostic routines are described, and highlights of the computer-augmented operator interface are discussed. 3 references, 5 figures

  5. Control software architecture and operating modes of the Model M-2 maintenance system

    Energy Technology Data Exchange (ETDEWEB)

    Satterlee, P.E. Jr.; Martin, H.L.; Herndon, J.N.

    1984-04-01

    The Model M-2 maintenance system is the first completely digitally controlled servomanipulator. The M-2 system allows dexterous operations to be performed remotely using bilateral force-reflecting master/slave techniques, and its integrated operator interface takes advantage of touch-screen-driven menus to allow selection of all possible operating modes. The control system hardware for this system has been described previously. This paper describes the architecture of the overall control system. The system's various modes of operation are identified, the software implementation of each is described, system diagnostic routines are described, and highlights of the computer-augmented operator interface are discussed. 3 references, 5 figures.

  6. Web Platform for Sharing Modeling Software in the Field of Nonlinear Optics

    Directory of Open Access Journals (Sweden)

    Dubenskaya Julia

    2018-01-01

    Full Text Available We describe the prototype of a Web platform intended for sharing software programs for computer modeling in the rapidly developing field of nonlinear optics phenomena. The suggested platform is built on top of the HUBZero open-source middleware. In addition to the basic HUBZero installation, we added to our platform the capability to run Docker containers via an external application server and to send calculation programs to those containers for execution. The presented web platform provides a wide range of features and might be of benefit to nonlinear optics researchers.

  7. Effect of anti-virus software on infectious nodes in computer network: A mathematical model

    Science.gov (United States)

    Mishra, Bimal Kumar; Pandey, Samir Kumar

    2012-07-01

    An e-epidemic model of malicious codes in a computer network through vertical transmission is formulated. We have observed that if the basic reproduction number is less than unity, the infected proportion of computer nodes disappears, the malicious codes die out, and the malicious-codes-free equilibrium is globally asymptotically stable, which leads to their eradication. The effect of anti-virus software on the removal of malicious codes from the computer network is critically analyzed. Analysis and simulation results provide some managerial insights that are helpful for the practice of anti-virus protection in information-sharing networks.
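
    As a rough illustration of the threshold behaviour described above, the sketch below integrates a deliberately simplified SIS-type model in which anti-virus action appears as a clean-up rate; the equation and all parameter values are assumptions for illustration and are not the model formulated in the paper.

    ```python
    # Minimal sketch (not the paper's model): SIS-style spread of malicious code,
    # with anti-virus clean-up at rate delta. The basic reproduction number
    # R0 = beta/delta decides whether the infection persists or dies out.
    def simulate(beta, delta, i0=0.01, t_end=500.0, dt=0.01):
        """Euler integration of dI/dt = beta*(1 - I)*I - delta*I (I = infected fraction)."""
        i = i0
        for _ in range(int(t_end / dt)):
            i += dt * (beta * (1.0 - i) * i - delta * i)
        return i

    for beta, delta in [(0.2, 0.3), (0.3, 0.1)]:   # R0 < 1 vs R0 > 1
        print(f"R0 = {beta / delta:.2f} -> infected fraction at t = 500: {simulate(beta, delta):.4f}")
    ```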

  8. PREDICTIVE ANALYSIS SOFTWARE FOR MODELING THE ALTMAN Z-SCORE FINANCIAL DISTRESS STATUS OF COMPANIES

    Directory of Open Access Journals (Sweden)

    ILIE RĂSCOLEAN

    2012-10-01

    Full Text Available The literature offers several bankruptcy methods for determining the financial distress status of companies; from among these we chose the Altman statistical model because it has been widely used in the past and has therefore become a benchmark for other methods. Based on this financial analysis flowchart, software was developed that calculates the bankruptcy probability for a given failure-rate Z-score interval, where that rate is the ratio of the number of bankrupt companies to the total number of companies (bankrupt and healthy) in the interval.
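
    For reference, the classic Altman (1968) Z-score for publicly traded manufacturing firms combines five financial ratios; the sketch below computes it for a hypothetical balance sheet (all input figures are made up), with the commonly quoted cut-off zones noted in a comment.

    ```python
    # Classic Altman Z-score for public manufacturing companies; inputs are hypothetical.
    def altman_z(working_capital, retained_earnings, ebit,
                 market_value_equity, sales, total_assets, total_liabilities):
        x1 = working_capital / total_assets
        x2 = retained_earnings / total_assets
        x3 = ebit / total_assets
        x4 = market_value_equity / total_liabilities
        x5 = sales / total_assets
        return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

    z = altman_z(working_capital=40_000, retained_earnings=120_000, ebit=60_000,
                 market_value_equity=300_000, sales=500_000,
                 total_assets=400_000, total_liabilities=200_000)
    # Common interpretation: Z > 2.99 "safe", 1.81-2.99 "grey zone", Z < 1.81 "distress".
    print(f"Z-score = {z:.2f}")
    ```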

  9. Nuclear model codes and related software distributed by the OECD/NEA Data Bank

    International Nuclear Information System (INIS)

    Sartori, E.

    1993-01-01

    Software and data for nuclear energy applications are acquired, tested and distributed by several information centres; in particular, relevant computer codes are distributed internationally by the OECD/NEA Data Bank (France) and by ESTSC and EPIC/RSIC (United States). This activity is coordinated among the centres and is extended outside the OECD area through an arrangement with the IAEA. This article covers more specifically the availability of nuclear model codes and of those codes which further process their results into data sets needed for specific nuclear application projects. (author). 2 figs

  10. Solithromycin (CEM-101): A New Fluoroketolide Antibiotic and Its Role in the Treatment of Gonorrhea.

    Science.gov (United States)

    Mancuso, Alexandra M; Gandhi, Mona A; Slish, Judianne

    2018-04-01

    Solithromycin is a macrolide antibiotic that has undergone review for the treatment of community-acquired bacterial pneumonia. Solithromycin is also being investigated and has shown promise for the treatment of gonorrhea. With increasing antibiotic resistance, the development of novel antibiotics to combat infections is essential. The unique ribosome-binding stability of solithromycin and its mild side effect profile make it a promising new antibiotic. This article provides an overview of the mechanism of action, clinical efficacy, and safety of this drug for the treatment of gonorrhea. Relevant data were identified through a comprehensive literature search of multiple databases using the keywords solithromycin, CEM-101, and gonorrhea.

  11. Comparison of CE-MS and LC-MS Analyses of Avian Eggshell Matrix Proteins

    Czech Academy of Sciences Publication Activity Database

    Mikšík, Ivan; Sedláková, Pavla; Mikulíková, Kateřina; Eckhardt, Adam; Kašička, Václav

    2008-01-01

    Vol. 67, Suppl. 1 (2008), pp. 89-96, ISSN 0009-5893. R&D Projects: GA MŠk(CZ) 1M0510; GA ČR(CZ) GA203/06/1044; GA ČR(CZ) GA203/05/2539. Institutional research plan: CEZ:AV0Z50110509; CEZ:AV0Z40550506. Keywords: eggshell proteins * CE-MS * HPLC-MS. Subject RIV: CB - Analytical Chemistry, Separation. Impact factor: 1.312, year: 2008

  12. On the optimization of free resources using non-homogeneous Markov chain software rejuvenation model

    International Nuclear Information System (INIS)

    Koutras, Vasilis P.; Platis, Agapios N.; Gravvanis, George A.

    2007-01-01

    Software rejuvenation is an important way to counteract the phenomenon of software aging and system failures. It is a preventive and proactive technique, which consists of periodically restarting an application at a clean internal state. In general, when an application is initiated an amount of memory is captured, and when it is terminated that memory is released. In this paper a model describing the amount of free memory on a system is presented. The modelling is formulated under a continuous-time Markov chain framework. Additionally, the cost of performing rejuvenation is taken into consideration, a cost function for the model is produced, and a rejuvenation policy is proposed. The contribution of this paper consists of using a cyclic non-homogeneous Markov chain in order to study the overall behaviour of the system, capturing the time dependence of the rejuvenation rates and deriving an optimal rejuvenation policy. Finally, a case study is presented in order to illustrate the results of the cost analysis.
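
    The trade-off that a rejuvenation policy optimizes can be illustrated with a much simpler stand-in than the paper's non-homogeneous Markov chain: a renewal-style cost rate that balances the cost of a planned restart against ageing-related failures. The linear ageing intensity, the costs and the rates below are all illustrative assumptions.

    ```python
    # Sketch: expected cost per unit time when the system is rejuvenated every tau
    # time units, with an ageing failure intensity lambda(t) = lam0 + k*t.
    import numpy as np

    def expected_cost_rate(tau, c_rejuv=1.0, c_fail=50.0, lam0=0.001, k=0.00002):
        expected_failures = lam0 * tau + 0.5 * k * tau ** 2   # integral of lambda over [0, tau]
        return (c_rejuv + c_fail * expected_failures) / tau

    taus = np.linspace(10, 2000, 500)
    costs = [expected_cost_rate(t) for t in taus]
    best = taus[int(np.argmin(costs))]
    print(f"cost-minimising rejuvenation interval ~ {best:.0f} time units")
    ```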

  13. SiGN-SSM: open source parallel software for estimating gene networks with state space models.

    Science.gov (United States)

    Tamada, Yoshinori; Yamaguchi, Rui; Imoto, Seiya; Hirose, Osamu; Yoshida, Ryo; Nagasaki, Masao; Miyano, Satoru

    2011-04-15

    SiGN-SSM is an open-source gene network estimation software able to run in parallel on PCs and massively parallel supercomputers. The software estimates a state space model (SSM), which is a statistical dynamic model suitable for analyzing short and/or replicated time series gene expression profiles. SiGN-SSM implements a novel parameter constraint effective in stabilizing the estimated models. Also, by using a supercomputer, it is able to determine the gene network structure by a statistical permutation test in a practical time. SiGN-SSM is applicable not only to analyzing temporal regulatory dependencies between genes, but also to extracting the differentially regulated genes from time series expression profiles. SiGN-SSM is distributed under the GNU Affero General Public License (GNU AGPL) version 3 and can be downloaded at http://sign.hgc.jp/signssm/. Pre-compiled binaries for some architectures are available in addition to the source code. Pre-installed binaries are also available on the Human Genome Center supercomputer system. The online manual and the supplementary information for SiGN-SSM are available on our web site. tamada@ims.u-tokyo.ac.jp.
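
    As background on what a state space model is in this setting, the sketch below simulates a toy two-gene linear-Gaussian SSM and recovers the hidden states with a textbook Kalman filter. The matrices and noise levels are illustrative, and this is not SiGN-SSM's estimation procedure, which also fits the model parameters and applies the permutation test.

    ```python
    # Toy linear-Gaussian state space model: hidden regulatory state x_t, observed
    # expression y_t = H x_t + noise. Filtered states are recovered with a Kalman filter.
    import numpy as np

    rng = np.random.default_rng(0)
    F = np.array([[0.9, 0.1], [-0.2, 0.8]])    # state transition (hidden dynamics)
    H = np.eye(2)                               # observation matrix
    Q, R = 0.01 * np.eye(2), 0.1 * np.eye(2)    # process / observation noise covariances

    x, ys = np.zeros(2), []
    for _ in range(50):                         # simulate a short expression time series
        x = F @ x + rng.multivariate_normal(np.zeros(2), Q)
        ys.append(H @ x + rng.multivariate_normal(np.zeros(2), R))

    m, P = np.zeros(2), np.eye(2)
    for y in ys:
        m, P = F @ m, F @ P @ F.T + Q                          # predict
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)           # Kalman gain
        m, P = m + K @ (y - H @ m), (np.eye(2) - K @ H) @ P    # update
    print("final filtered state estimate:", np.round(m, 3))
    ```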

  14. Quantitative Microbial Risk Assessment Tutorial Installation of Software for Watershed Modeling in Support of QMRA - Updated 2017

    Science.gov (United States)

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • QMRA Installation • SDMProjectBuilder (which includes the Mi...

  15. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  16. 3-d finite element model development for biomechanics: a software demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Hollerbach, K.; Hollister, A.M.; Ashby, E.

    1997-03-01

    Finite element analysis is becoming an increasingly important part of biomechanics and orthopedic research, as computational resources become more powerful, and data handling algorithms become more sophisticated. Until recently, tools with sufficient power did not exist or were not accessible to adequately model complicated, three-dimensional, nonlinear biomechanical systems. In the past, finite element analyses in biomechanics have often been limited to two-dimensional approaches, linear analyses, or simulations of single tissue types. Today, we have the resources to model fully three-dimensional, nonlinear, multi-tissue, and even multi-joint systems. The authors will present the process of developing these kinds of finite element models, using human hand and knee examples, and will demonstrate their software tools.

  17. Mechanistic modelling of cancer: some reflections from software engineering and philosophy of science.

    Science.gov (United States)

    Cañete-Valdeón, José M; Wieringa, Roel; Smallbone, Kieran

    2012-12-01

    There is a growing interest in mathematical mechanistic modelling as a promising strategy for understanding tumour progression. This approach is accompanied by a methodological change in the way research is conducted, in which models help to actively generate hypotheses instead of waiting for general principles to become apparent once sufficient data are accumulated. This paper applies recent research from the philosophy of science to uncover three important problems of mechanistic modelling which may compromise its mainstream application, namely: the dilemma of formal and informal descriptions, the need to express degrees of confidence and the need for an argumentation framework. We report experience and research on similar problems from software engineering and provide evidence that the solutions adopted there can be transferred to the biological domain. We hope this paper can provoke new opportunities for further and profitable interdisciplinary research in the field.

  18. Modeling of the competition life cycle using the software complex of cellular automata PyCAlab

    Science.gov (United States)

    Berg, D. B.; Beklemishev, K. A.; Medvedev, A. N.; Medvedeva, M. A.

    2015-11-01

    The aim of the work is to develop a numerical model of the life cycle of competition on the basis of the cellular automata software complex PyCAlab. The model is based on the general patterns of growth of various systems in resource-limited settings. Examples show that the period of transition from the unlimited growth of market agents to the stage of competitive growth takes quite a long time and may be characterized as monotonic. During this period two main strategies of competitive selection coexist: 1) capture of maximum market space at any reasonable cost; 2) saving by reducing costs. The obtained results lead to the conclusion that the competitive strategies of companies must combine the two mentioned types of behavior, and that this issue needs to be given adequate attention in the academic literature on management. The created numerical model may be used for market research when developing strategies for the promotion of new goods and services.
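
    A toy cellular-automaton market in the same spirit fits in a few lines: agents first expand into free cells (unlimited growth) and later can only grow by displacing weaker neighbours (competitive selection). The rules and parameters below are illustrative assumptions, not PyCAlab's implementation.

    ```python
    # Toy CA: 0 = free cell, 1..k = market agent id; agents grow into free cells or,
    # with probability given by their "competitiveness", displace other agents.
    import numpy as np

    rng = np.random.default_rng(1)
    N = 40
    grid = np.zeros((N, N), dtype=int)
    strength = {1: 0.6, 2: 0.4}                 # agent competitiveness (illustrative)
    grid[10, 10], grid[30, 30] = 1, 2           # two seed firms

    def step(grid):
        new = grid.copy()
        for i in range(N):
            for j in range(N):
                a = grid[i, j]
                if a == 0:
                    continue
                ni = (i + rng.integers(-1, 2)) % N
                nj = (j + rng.integers(-1, 2)) % N
                if grid[ni, nj] == 0:                                   # free resource: grow
                    new[ni, nj] = a
                elif grid[ni, nj] != a and rng.random() < strength[a]:  # occupied: compete
                    new[ni, nj] = a
        return new

    for _ in range(150):
        grid = step(grid)
    print("market shares:", {a: round(float((grid == a).mean()), 2) for a in strength})
    ```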

  19. The EQ3/6 software package for geochemical modeling: Current status

    International Nuclear Information System (INIS)

    Wolery, T.J.; Jackson, K.J.; Bourcier, W.L.; Bruton, C.J.; Viani, B.E.; Knauss, K.G.; Delany, J.M.

    1988-07-01

    EQ3/6 is a software package for modeling chemical and mineralogic interactions in aqueous geochemical systems. The major components of the package are EQ3NR (a speciation-solubility code), EQ6 (a reaction path code), EQLIB (a supporting library), and a supporting thermodynamic data base. EQ3NR calculates aqueous speciation and saturation indices from analytical data. It can also be used to calculate compositions of buffer solutions for use in laboratory experiments. EQ6 computes reaction path models of both equilibrium step processes and kinetic reaction processes. These models can be computed for closed systems and relatively simple open systems. EQ3/6 is useful in making purely theoretical calculations, in designing, interpreting, and extrapolating laboratory experiments, and in testing and developing submodels and supporting data used in these codes. The thermodynamic data base supports calculations over the range 0-300°C. 60 refs., 2 figs
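
    The saturation index that a speciation-solubility code such as EQ3NR reports for each mineral is simply the gap, on a log scale, between the ion activity product and the equilibrium constant. The sketch below shows the calculation for a generic mineral AB <-> A+ + B-, with hypothetical activities and log K.

    ```python
    # SI = log10(IAP) - log10(K): SI < 0 undersaturated, SI = 0 at equilibrium, SI > 0 supersaturated.
    import math

    def saturation_index(ion_activity_product, log_k):
        return math.log10(ion_activity_product) - log_k

    a_A, a_B = 1.0e-3, 2.0e-3   # hypothetical ion activities from a speciation step
    log_k = -5.0                # hypothetical solubility product (log10 scale)
    si = saturation_index(a_A * a_B, log_k)
    print(f"SI = {si:.2f} ({'supersaturated' if si > 0 else 'undersaturated'})")
    ```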

  20. Predicting Defects Using Information Intelligence Process Models in the Software Technology Project.

    Science.gov (United States)

    Selvaraj, Manjula Gandhi; Jayabal, Devi Shree; Srinivasan, Thenmozhi; Balasubramanie, Palanisamy

    2015-01-01

    A key differentiator in a competitive marketplace is customer satisfaction. As per a 2012 Gartner report, only 75%-80% of IT projects are successful. Customer satisfaction should be considered as a part of business strategy. The associated project parameters should be proactively managed and the project outcome needs to be predicted by a technical manager. There is a lot of focus on the end state and on minimizing defect leakage as much as possible. The focus should instead be on proactively managing and shifting left in the software life cycle engineering model: identify the problem upfront in the project cycle, rather than waiting for lessons to be learnt and taking reactive steps. This paper demonstrates the practical applicability of predictive models and illustrates their use in a project to predict system testing defects, thus helping to reduce residual defects.

  1. Design, implementation and verification of software code for radiation dose assessment based on simple generic environmental model

    International Nuclear Information System (INIS)

    I Putu Susila; Arif Yuniarto

    2017-01-01

    Radiation dose assessment to determine the potential radiological impacts of various installations within a nuclear facility complex is necessary to ensure environmental and public safety. A simple generic model-based method for calculating radiation doses caused by the release of radioactive substances into the environment has been published by the International Atomic Energy Agency (IAEA) as Safety Report Series No. 19 (SRS-19). To assist the application of the assessment method and to provide a basis for the development of more complex assessment methods, an open-source software code has been designed and implemented. The software comes with maps and is very easy to use because assessment scenarios can be set up through diagrams. Software verification was performed by comparing its results to SRS-19 and CROM software calculation results. Doses estimated by SRS-19 are higher than those of the developed software; however, this is still acceptable since dose estimation in SRS-19 is based on a conservative approach. On the other hand, compared to the CROM software, the same results were obtained for three scenarios and a non-significant difference of 2.25 % was found in another scenario. These results indicate the correctness of our implementation and imply that the developed software is ready for use in real scenarios. In the future, various features and new models need to be added to improve the capability of the software. (author)
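
    The overall shape of such a generic-model screening calculation is short enough to show in full: a release rate and a long-term dilution factor give the air concentration at the receptor, and an inhalation dose follows from the breathing rate and a dose coefficient. All numerical values below are hypothetical placeholders rather than SRS-19 defaults.

    ```python
    # Screening-level inhalation dose from a continuous atmospheric discharge (illustrative values only).
    release_rate = 1.0          # Bq/s, continuous discharge of one radionuclide
    dilution_factor = 1.0e-6    # s/m^3, long-term sector-averaged dispersion factor at the receptor
    breathing_rate = 8400.0     # m^3/y, adult inhalation rate
    dose_coefficient = 1.0e-8   # Sv/Bq, inhalation dose coefficient for the nuclide

    air_concentration = release_rate * dilution_factor                    # Bq/m^3
    annual_dose = air_concentration * breathing_rate * dose_coefficient   # Sv/y
    print(f"screening inhalation dose ~ {annual_dose:.2e} Sv/y")
    ```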

  2. Adding Value to the Network: Exploring the Software as a Service and Platform as a Service Models for Mobile Operators

    Science.gov (United States)

    Gonçalves, Vânia

    The environments of software development and software provision are shifting to Web-based platforms supported by Platform/Software as a Service (PaaS/SaaS) models. This paper will make the case that there is equally an opportunity for mobile operators to identify additional sources of revenue by exposing network functionalities through Web-based service platforms. By elaborating on the concepts, benefits and risks of SaaS and PaaS, several factors that should be taken into consideration in applying these models to the telecom world are delineated.

  3. Development of virtual hands using animation software and graphical modelling; Elaboracao de maos virtuais usando software de animacao e modelagem grafica

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Erick da S.; Junior, Alberico B. de C. [Universidade Federal de Sergipe (UFSE), Sao Cristovao, SE (Brazil)

    2016-07-01

    Numerical dosimetry uses virtual anthropomorphic simulators to represent the human body in a computational framework and thus assess the risks associated with exposure to a radioactive source. The development of computer animation software has facilitated the construction of these simulators, requiring only knowledge of human anatomy to prepare various types of simulators (man, woman, child and baby) in various positions (sitting, standing, running) or parts thereof (head, trunk and limbs). These simulators are built through successive manipulation steps and, owing to the versatility of the method, irradiation geometries that were not possible before can now be created. In this work we have built an exposure scenario of radiopharmaceutical handling of radioactive material, using animation and graphical modelling software and an anatomical database. (author)

  4. Analysis of mice tumor models using dynamic MRI data and a dedicated software platform

    Energy Technology Data Exchange (ETDEWEB)

    Alfke, H.; Maurer, E.; Klose, K.J. [Philipps Univ. Marburg (Germany). Dept. of Radiology; Kohle, S.; Rascher-Friesenhausen, R.; Behrens, S.; Peitgen, H.O. [MeVis - Center for Medical Diagnostic Systems and Visualization, Bremen (Germany); Celik, I. [Philipps Univ. Marburg (Germany). Inst. for Theoretical Surgery; Heverhagen, J.T. [Philipps Univ. Marburg (Germany). Dept. of Radiology; Ohio State Univ., Columbus (United States). Dept. of Radiology

    2004-09-01

    Purpose: To implement a software platform (DynaVision) dedicated to analyzing data from functional imaging of tumors with different mathematical approaches, and to test the software platform in pancreatic carcinoma xenografts in mice with severe combined immunodeficiency disease (SCID). Materials and Methods: A software program was developed for the extraction and visualization of tissue perfusion parameters from dynamic contrast-enhanced images. This includes regional parameter calculation from enhancement curves, parametric images (e.g., blood flow), animation, 3D visualization, two-compartment modeling, a mode for comparing different datasets (e.g., therapy monitoring), and motion correction. We analyzed xenograft tumors from two pancreatic carcinoma cell lines (BxPC3 and ASPC1) implanted in 14 SCID mice after injection of Gd-DTPA into the tail vein. These data were correlated with histopathological findings. Results: Image analysis was completed in approximately 15 minutes per data set. The possibility of drawing and editing ROIs within the whole data set makes it easy to obtain quantitative data from the intensity-time curves. In one animal, motion artifacts markedly reduced the image quality, but data analysis was still possible after motion correction. Dynamic MRI of mice tumor models revealed a highly heterogeneous distribution of the contrast-enhancement curves and derived parameters, which correlated with differences in histopathology. ASPC1 tumors showed a more hypervascular type of curve with a faster and higher signal enhancement rate (wash-in) and a faster signal decrease (wash-out). BxPC3 tumors showed a more hypovascular type with slower wash-in and wash-out. This correlated with the biological properties of the tumors. (orig.)

  5. Analysis of mice tumor models using dynamic MRI data and a dedicated software platform

    International Nuclear Information System (INIS)

    Alfke, H.; Maurer, E.; Klose, K.J.; Celik, I.; Heverhagen, J.T.; Ohio State Univ., Columbus

    2004-01-01

    Purpose: To implement a software platform (DynaVision) dedicated to analyzing data from functional imaging of tumors with different mathematical approaches, and to test the software platform in pancreatic carcinoma xenografts in mice with severe combined immunodeficiency disease (SCID). Materials and Methods: A software program was developed for the extraction and visualization of tissue perfusion parameters from dynamic contrast-enhanced images. This includes regional parameter calculation from enhancement curves, parametric images (e.g., blood flow), animation, 3D visualization, two-compartment modeling, a mode for comparing different datasets (e.g., therapy monitoring), and motion correction. We analyzed xenograft tumors from two pancreatic carcinoma cell lines (BxPC3 and ASPC1) implanted in 14 SCID mice after injection of Gd-DTPA into the tail vein. These data were correlated with histopathological findings. Results: Image analysis was completed in approximately 15 minutes per data set. The possibility of drawing and editing ROIs within the whole data set makes it easy to obtain quantitative data from the intensity-time curves. In one animal, motion artifacts markedly reduced the image quality, but data analysis was still possible after motion correction. Dynamic MRI of mice tumor models revealed a highly heterogeneous distribution of the contrast-enhancement curves and derived parameters, which correlated with differences in histopathology. ASPC1 tumors showed a more hypervascular type of curve with a faster and higher signal enhancement rate (wash-in) and a faster signal decrease (wash-out). BxPC3 tumors showed a more hypovascular type with slower wash-in and wash-out. This correlated with the biological properties of the tumors. (orig.)

  6. CLIMLAB: a Python-based software toolkit for interactive, process-oriented climate modeling

    Science.gov (United States)

    Rose, B. E. J.

    2015-12-01

    Global climate is a complex emergent property of the rich interactions between simpler components of the climate system. We build scientific understanding of this system by breaking it down into component process models (e.g. radiation, large-scale dynamics, boundary layer turbulence), understanding each component, and putting them back together. Hands-on experience and freedom to tinker with climate models (whether simple or complex) is invaluable for building physical understanding. CLIMLAB is an open-ended software engine for interactive, process-oriented climate modeling. With CLIMLAB you can interactively mix and match model components, or combine simpler process models into a more comprehensive model. It was created primarily to support classroom activities, using hands-on modeling to teach fundamentals of climate science at both undergraduate and graduate levels. CLIMLAB is written in Python and ties in with the rich ecosystem of open-source scientific Python tools for numerics and graphics. The IPython notebook format provides an elegant medium for distributing interactive example code. I will give an overview of the current capabilities of CLIMLAB, the curriculum we have developed thus far, and plans for the future. Using CLIMLAB requires some basic Python coding skills. We consider this an educational asset, as we are targeting upper-level undergraduates and Python is an increasingly important language in STEM fields. However, CLIMLAB is well suited to be deployed as a computational back-end for a graphical gaming environment based on earth-system modeling.
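
    The process-oriented idea is easiest to see in a zero-dimensional energy balance model, where a shortwave (absorption) process and a linearised longwave (emission) process are stepped forward together. The sketch below uses generic textbook-style parameter values and plain Python rather than the CLIMLAB API.

    ```python
    # Zero-dimensional energy balance model: C dT/dt = (1 - albedo) * Q - (A + B * T).
    Q = 341.3          # global-mean insolation, W m^-2
    albedo = 0.3       # planetary albedo
    A, B = 210.0, 2.0  # linearised OLR = A + B*T (T in deg C), W m^-2 and W m^-2 K^-1
    C = 4.0e8          # effective heat capacity, J m^-2 K^-1

    T = 10.0           # initial surface temperature, deg C
    dt = 86400.0 * 30  # one-month time step, s
    for _ in range(1200):                # integrate ~100 years towards equilibrium
        absorbed = (1 - albedo) * Q      # shortwave process
        emitted = A + B * T              # longwave process
        T += dt * (absorbed - emitted) / C
    print(f"equilibrium temperature ~ {T:.1f} deg C")   # analytic: ((1 - albedo)*Q - A)/B ~ 14.5
    ```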

  7. Open Software Tools Applied to Jordan's National Multi-Agent Water Management Model

    Science.gov (United States)

    Knox, Stephen; Meier, Philipp; Harou, Julien; Yoon, Jim; Selby, Philip; Lachaut, Thibaut; Klassert, Christian; Avisse, Nicolas; Khadem, Majed; Tilmant, Amaury; Gorelick, Steven

    2016-04-01

    Jordan is the fourth most water scarce country in the world, where demand exceeds supply in a politically and demographically unstable context. The Jordan Water Project (JWP) aims to perform policy evaluation by modelling the hydrology, economics, and governance of Jordan's water resource system. The multidisciplinary nature of the project requires a modelling software system capable of integrating submodels from multiple disciplines into a single decision making process and communicating results to stakeholders. This requires a tool for building an integrated model and a system where diverse data sets can be managed and visualised. The integrated Jordan model is built using Pynsim, an open-source multi-agent simulation framework implemented in Python. Pynsim operates on network structures of nodes and links and supports institutional hierarchies, where an institution represents a grouping of nodes, links or other institutions. At each time step, code within each node, link and institution can be executed independently, allowing for their fully autonomous behaviour. Additionally, engines (sub-models) perform actions over the entire network or on a subset of the network, such as taking a decision on a set of nodes. Pynsim is modular in design, allowing distinct modules to be modified easily without affecting others. Data management and visualisation is performed using Hydra (www.hydraplatform.org), an open software platform allowing users to manage network structure and data. The Hydra data manager connects to Pynsim, providing necessary input parameters for the integrated model. By providing a high-level portal to the model, Hydra removes a barrier between the users of the model (researchers, stakeholders, planners etc) and the model itself, allowing them to manage data, run the model and visualise results all through a single user interface. Pynsim's ability to represent institutional hierarchies, inter-network communication and the separation of node, link and
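
    The node/link/institution pattern described above can be illustrated in a few lines of plain Python: nodes act autonomously each time step, while an institution takes a decision over its whole group. The class and method names below are illustrative assumptions and are not the Pynsim API.

    ```python
    # Toy multi-agent network: nodes step independently, an institution allocates over its members.
    class Node:
        def __init__(self, name, demand):
            self.name, self.demand, self.received = name, demand, 0.0
        def step(self, timestep):
            self.received = 0.0                      # reset before this step's allocation

    class Institution:
        """A grouping of nodes that takes a decision over all of them at once."""
        def __init__(self, name, nodes):
            self.name, self.nodes = name, nodes
        def step(self, timestep, supply):
            total = sum(n.demand for n in self.nodes)
            for n in self.nodes:                     # simple proportional allocation rule
                n.received = supply * n.demand / total

    farms = [Node("farm_a", 30.0), Node("farm_b", 70.0)]
    ministry = Institution("water_ministry", farms)
    for timestep, supply in enumerate([80.0, 60.0, 40.0]):   # declining supply scenario
        for n in farms:
            n.step(timestep)
        ministry.step(timestep, supply)
        print(timestep, {n.name: round(n.received, 1) for n in farms})
    ```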

  8. SIMPLIFIED CHARGED PARTICLE BEAM TRANSPORT MODELING USING COMMONLY AVAILABLE COMMERCIAL SOFTWARE

    Energy Technology Data Exchange (ETDEWEB)

    D. Douglas; K. Beard; J. Eldred; P. Evtushenko; A. Jenkins; W. Moore; L. Osborne; D. Sexton; C. Tennant

    2007-06-18

    Particle beam modeling in accelerators has been the focus of considerable effort since the 1950s. Many generations of tools have resulted from this process, each leveraging both prior experience and increases in computer power. However, continuing innovation in accelerator technology results in systems that are not well described by existing tools, so the software development process is on-going. We discuss a novel response to this situation, which was encountered when Jefferson Lab began operation of its energy-recovering linacs. These machines were not readily described with legacy software; therefore a model was built using Microsoft Excel. This interactive simulation can query data from the accelerator, use it to compute machine parameters, analyze difference orbit data, and evaluate beam properties. It can also derive new accelerator tunings and rapidly evaluate the impact of changes in machine configuration. As it is spreadsheet-based, it can be easily user-modified in response to changing requirements. Examples for the JLab IR Upgrade FEL are presented.

  9. Radiative transfer modeling through terrestrial atmosphere and ocean accounting for inelastic processes: Software package SCIATRAN

    Science.gov (United States)

    Rozanov, V. V.; Dinter, T.; Rozanov, A. V.; Wolanin, A.; Bracher, A.; Burrows, J. P.

    2017-06-01

    SCIATRAN is a comprehensive software package which is designed to model radiative transfer processes in the terrestrial atmosphere and ocean in the spectral range from the ultraviolet to the thermal infrared (0.18-40 μm). It accounts for multiple scattering processes, polarization, thermal emission and ocean-atmosphere coupling. The main goal of this paper is to present a recently developed version of SCIATRAN which accurately takes into account inelastic radiative processes in both the atmosphere and the ocean. In the scalar version of the coupled ocean-atmosphere radiative transfer solver presented by Rozanov et al. [61] we have implemented the simulation of rotational Raman scattering, vibrational Raman scattering, and chlorophyll and colored dissolved organic matter fluorescence. In this paper we discuss and explain the numerical methods used in SCIATRAN to solve the scalar radiative transfer equation including trans-spectral processes, and demonstrate how some selected radiative transfer problems are solved using the SCIATRAN package. In addition we present selected comparisons of SCIATRAN simulations with published benchmark results, independent radiative transfer models, and various measurements from satellite, ground-based, and ship-borne instruments. The extended SCIATRAN software package along with a detailed User's Guide is made available for scientists and students, who are undertaking their own research typically at universities, via the web page of the Institute of Environmental Physics (IUP), University of Bremen: http://www.iup.physik.uni-bremen.de.

  10. FACET: A simulation software framework for modeling complex societal processes and interactions

    Energy Technology Data Exchange (ETDEWEB)

    Christiansen, J. H.

    2000-06-02

    FACET, the Framework for Addressing Cooperative Extended Transactions, was developed at Argonne National Laboratory to address the need for a simulation software architecture in the style of an agent-based approach, but with sufficient robustness, expressiveness, and flexibility to be able to deal with the levels of complexity seen in real-world social situations. FACET is an object-oriented software framework for building models of complex, cooperative behaviors of agents. It can be used to implement simulation models of societal processes such as the complex interplay of participating individuals and organizations engaged in multiple concurrent transactions in pursuit of their various goals. These transactions can be patterned on, for example, clinical guidelines and procedures, business practices, government and corporate policies, etc. FACET can also address other complex behaviors such as biological life cycles or manufacturing processes. To date, for example, FACET has been applied to such areas as land management, health care delivery, avian social behavior, and interactions between natural and social processes in ancient Mesopotamia.

  11. Biochemical Network Stochastic Simulator (BioNetS): software for stochastic modeling of biochemical networks

    Directory of Open Access Journals (Sweden)

    Elston Timothy C

    2004-03-01

    Full Text Available Abstract Background Intrinsic fluctuations due to the stochastic nature of biochemical reactions can have large effects on the response of biochemical networks. This is particularly true for pathways that involve transcriptional regulation, where generally there are two copies of each gene and the number of messenger RNA (mRNA) molecules can be small. Therefore, there is a need for computational tools for developing and investigating stochastic models of biochemical networks. Results We have developed the software package Biochemical Network Stochastic Simulator (BioNetS) for efficiently and accurately simulating stochastic models of biochemical networks. BioNetS has a graphical user interface that allows models to be entered in a straightforward manner, and allows the user to specify the type of random variable (discrete or continuous) for each chemical species in the network. The discrete variables are simulated using an efficient implementation of the Gillespie algorithm. For the continuous random variables, BioNetS constructs and numerically solves the appropriate chemical Langevin equations. The software package has been developed to scale efficiently with network size, thereby allowing large systems to be studied. BioNetS runs as a BioSpice agent and can be downloaded from http://www.biospice.org. BioNetS also can be run as a stand alone package. All the required files are accessible from http://x.amath.unc.edu/BioNetS. Conclusions We have developed BioNetS to be a reliable tool for studying the stochastic dynamics of large biochemical networks. Important features of BioNetS are its ability to handle hybrid models that consist of both continuous and discrete random variables and its ability to model cell growth and division. We have verified the accuracy and efficiency of the numerical methods by considering several test systems.
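
    For the discrete species, the heart of the approach is the Gillespie stochastic simulation algorithm. The sketch below shows its core loop for a minimal birth-death model of mRNA copy number; the rates are illustrative and this is a generic implementation rather than BioNetS code.

    ```python
    # Gillespie SSA for production (rate k) and degradation (rate g*x) of one species.
    import numpy as np

    def gillespie_birth_death(k=2.0, g=0.1, x0=0, t_end=100.0, seed=0):
        rng = np.random.default_rng(seed)
        t, x = 0.0, x0
        times, counts = [t], [x]
        while t < t_end:
            a1, a2 = k, g * x                 # propensities: production, degradation
            a0 = a1 + a2
            t += rng.exponential(1.0 / a0)    # exponential waiting time to next reaction
            x += 1 if rng.random() < a1 / a0 else -1   # choose which reaction fires
            times.append(t)
            counts.append(x)
        return np.array(times), np.array(counts)

    times, counts = gillespie_birth_death()
    print(f"mean copy number over the run: {counts.mean():.1f} (steady-state mean k/g = 20)")
    ```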

  12. Modelling software failures of digital I and C in probabilistic safety analyses based on the TELEPERM XS operating experience

    International Nuclear Information System (INIS)

    Jockenhoevel-Barttfeld, Mariana; Taurines Andre; Baeckstroem, Ola; Holmberg, Jan-Erik; Porthin, Markus; Tyrvaeinen, Tero

    2015-01-01

    Digital instrumentation and control (I and C) systems appear as upgrades in existing nuclear power plants (NPPs) and in new plant designs. In order to assess the impact of digital system failures, quantifiable reliability models are needed along with data for digital systems that are compatible with existing probabilistic safety assessments (PSA). The paper focuses on the modelling of software failures of digital I and C systems in probabilistic assessments. An analysis of software faults, failures and effects is presented to derive relevant failure modes of system and application software for the PSA. The estimations of software failure probabilities are based on an analysis of the operating experience of TELEPERM XS (TXS). For the assessment of application software failures, the analysis combines the use of TXS operating experience at the application function level with conservative engineering judgments. Probabilities of failure to actuate on demand and of spurious actuation of a typical reactor protection application are estimated. Moreover, the paper gives guidelines for the modelling of software failures in the PSA. The strategy presented in this paper is generic and can be applied to different software platforms and their applications.

  13. IGMAS+ a new 3D Gravity, FTG and Magnetic Modeling Software

    Science.gov (United States)

    Götze, Hans-Jürgen; Schmidt, Sabine; Fichler, Christine; Planka, Christian

    2010-05-01

    Modern geophysical interpretation requires an interdisciplinary approach, particularly when considering the available amount of 'state of the art' information contained in comprehensive data bases. A combination of different geophysical surveys employing seismics, gravity and geoelectrics, together with geological and petrological studies, can provide new insights into the structures and tectonic evolution of the lithosphere and natural deposits. Interdisciplinary interpretation is essential for any numerical modelling of these structures and the processes acting on them. Three-dimensional (3D) interactive modeling with the IGMAS+ software provides means for the integrated processing and interpretation of geoid, gravity and magnetic fields and their gradients (full tensor), yielding improved geological interpretation. IGMAS+ is an acronym standing for "Interactive Geophysical Modelling Application System". It is based on the existing software IGMAS (http://www.gravity.uni-kiel.de/igmas), a tool developed during the past twenty years for potential field modelling. The new IGMAS+, however, combines the advantages of the "old" IGMAS (e.g. its flexible geometry concept and a fast and stable algorithm) with automated interpretation tools and a modern graphical GUI based on leading-edge insights from psychological computer graphics research, and thus provides optimal man-machine communication. Fully three-dimensional IGMAS+ models are constructed using triangulated polyhedra and/or triangulated grids, to which constant density and/or induced and remanent susceptibility are assigned. Interactive modifications of model parameters (geometry, density, susceptibility, magnetization), access to the numerical modeling process, and direct visualization of both calculated and measured gravity and magnetic fields enable the interpreter to design the model as realistically as possible. IGMAS+ allows easy integration of constraining data into interactive modeling processes

  14. A flexible, interactive software tool for fitting the parameters of neuronal models

    Directory of Open Access Journals (Sweden)

    Péter eFriedrich

    2014-07-01

    Full Text Available The construction of biologically relevant neuronal models as well as model-based analysis of experimental data often requires the simultaneous fitting of multiple model parameters, so that the behavior of the model in a certain paradigm matches (as closely as possible) the corresponding output of a real neuron according to some predefined criterion. Although the task of model optimization is often computationally hard, and the quality of the results depends heavily on technical issues such as the appropriate choice (and implementation) of cost functions and optimization algorithms, no existing program provides access to the best available methods while also guiding the user through the process effectively. Our software, called Optimizer, implements a modular and extensible framework for the optimization of neuronal models, and also features a graphical interface which makes it easy for even non-expert users to handle many commonly occurring scenarios. Meanwhile, educated users can extend the capabilities of the program and customize it according to their needs with relatively little effort. Optimizer has been developed in Python, takes advantage of open-source Python modules for nonlinear optimization, and interfaces directly with the NEURON simulator to run the models. Other simulators are supported through an external interface. We have tested the program on several different types of problem of varying complexity, using different model classes. As targets, we used simulated traces from the same or a more complex model class, as well as experimental data. We successfully used Optimizer to determine passive parameters and conductance densities in compartmental models, and to fit simple (adaptive exponential integrate-and-fire) neuronal models to complex biological data. Our detailed comparisons show that Optimizer can handle a wider range of problems, and delivers equally good or better performance than any other existing neuronal model fitting
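
    The kind of fitting task being automated can be reduced to a tiny example: recovering the two passive parameters of a leaky integrator from a noisy voltage step response by minimising a mean-squared-error cost with scipy.optimize. The synthetic target and all values below are assumptions for illustration; this is not Optimizer's own interface.

    ```python
    # Fit membrane time constant tau and input resistance R of V(t) = R*I*(1 - exp(-t/tau)).
    import numpy as np
    from scipy.optimize import minimize

    t = np.linspace(0.0, 0.2, 400)        # 200 ms of "recording", s
    I = 0.1e-9                            # 0.1 nA current step

    def step_response(tau, R):
        return R * I * (1.0 - np.exp(-t / tau))

    rng = np.random.default_rng(2)
    target = step_response(0.02, 100e6) + rng.normal(0.0, 2e-4, t.size)   # synthetic noisy trace

    def cost(params):
        tau, R = params
        return float(np.mean((step_response(tau, R) - target) ** 2))      # mean squared error

    fit = minimize(cost, x0=[0.01, 50e6], method="Nelder-Mead")
    tau_fit, R_fit = fit.x
    print(f"fitted tau = {tau_fit * 1e3:.1f} ms, R = {R_fit / 1e6:.0f} MOhm")
    ```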

  15. Glucocorticoids and Polyamine Inhibitors Synergize to Kill Human Leukemic CEM Cells

    Directory of Open Access Journals (Sweden)

    Aaron L. Miller

    2002-01-01

    Full Text Available Glucocorticoids are well-known apoptotic agents in certain classes of lymphoid cell malignancies. Reduction of intracellular polyamine levels by use of inhibitors that block polyamine synthesis slows or inhibits growth of many cells in vitro. Several such inhibitors have shown efficacy in clinical trials, though the toxicity of some compounds has limited their usefulness. We have tested the effects of combinations of the glucocorticoid dexamethasone (Dex) and two polyamine inhibitors, difluoromethylornithine (DFMO) and methyl glyoxal bis guanylhydrazone (MGBG), on the clonal line of human acute lymphoblastic leukemia cells, CEM-C7-14. Dex alone kills these cells, though only after a delay of at least 24 hours. We also evaluated a partially glucocorticoid-resistant c-Myc-expressing CEM-C7-14 clone. We show that Dex downregulates ornithine decarboxylase (ODC), the rate-limiting enzyme in polyamine synthesis. Pretreatment with the ODC inhibitor DFMO, followed by addition of Dex, enhances steroid-evoked kill slightly. The combination of pretreatment with sublethal concentrations of both DFMO and the inhibitor of S-adenosylmethionine decarboxylase, MGBG, followed by addition of Dex, results in strong synergistic cell kill. Both the rapidity and extent of cell kill are enhanced compared to the effects of Dex alone. These results suggest that use of such combinations in vivo may result in apoptosis of malignant cells with lower overall toxicity.

  16. Glucocorticoids and Polyamine Inhibitors Synergize to Kill Human Leukemic CEM Cells1

    Science.gov (United States)

    Miller, Aaron L; Johnson, Betty H; Medh, Rheem D; Townsend, Courtney M; Thompson, E Brad

    2002-01-01

    Abstract Glucocorticoids are well-known apoptotic agents in certain classes of lymphoid cell malignancies. Reduction of intracellular polyamine levels by use of inhibitors that block polyamine synthesis slows or inhibits growth of many cells in vitro. Several such inhibitors have shown efficacy in clinical trials, though the toxicity of some compounds has limited their usefulness. We have tested the effects of combinations of the glucocorticoid dexamethasone (Dex) and two polyamine inhibitors, difluoromethylornithine (DFMO) and methyl glyoxal bis guanylhydrazone (MGBG), on the clonal line of human acute lymphoblastic leukemia cells, CEM-C7-14. Dex alone kills these cells, though only after a delay of at least 24 hours. We also evaluated a partially glucocorticoid-resistant c-Myc-expressing CEM-C7-14 clone. We show that Dex downregulates ornithine decarboxylase (ODC), the rate-limiting enzyme in polyamine synthesis. Pretreatment with the ODC inhibitor DFMO, followed by addition of Dex, enhances steroid-evoked kill slightly. The combination of pretreatment with sublethal concentrations of both DFMO and the inhibitor of S-adenosylmethionine decarboxylase, MGBG, followed by addition of Dex, results in strong synergistic cell kill. Both the rapidity and extent of cell kill are enhanced compared to the effects of Dex alone. These results suggest that use of such combinations in vivo may result in apoptosis of malignant cells with lower overall toxicity. PMID:11922393

  17. An industrial FT-IR process gas analyzer for stack gas cems analysis

    Energy Technology Data Exchange (ETDEWEB)

    Welch, G.M. [American instruments, Anacortes, WA (United States); Herman, B.E. [Applied Automation/Hartmann & Braun, Bartlesville, OK (United States)

    1995-12-31

    This paper describes utilizing Fourier Transform Infrared (FT-IR) technology to meet and exceed EPA requirements to continuously monitor carbon monoxide (CO) and sulfur dioxide (SO2) in an oil refinery. The application consists of Continuous Emission Monitoring (CEMS) of two stacks from a Fluid Catalytic Cracking Unit (FCCU). The discussion follows the project from initial specifications, installation, start-up, certification results (RATA, 7-day drift), Cylinder Gas Audit (CGA) and the required maintenance. FT-IR is a powerful analytical tool suitable for the measurement of the stack component gases required to meet CEMS regulations, and allows simultaneous multi-component analysis of complex stack gas streams with a continuous sample stream flowing through the measurement cell. The Michelson interferometer, in a unique "Wishbone" design and with a special alignment control, enables a standardized configuration of the analyzer for flue gas analysis. Normal stack gas pollutants (NOx, SO2 and CO) as well as water-soluble pollutants such as NH3 and HCl may be accurately determined and reported even in the presence of 0-31 vol % water vapor concentrations (hot and wet). This FT-IR analyzer has been operating with EPA certification in an oil refinery environment since September 1994.

  18. Software model of a machine vision system based on the common house fly.

    Science.gov (United States)

    Madsen, Robert; Barrett, Steven; Wilcox, Michael

    2005-01-01

    The vision system of the common house fly has many properties, such as hyperacuity and parallel structure, which would be advantageous in a machine vision system. A software model has been developed which is ultimately intended to be a tool to guide the design of an analog real-time vision system. The model starts by laying out cartridges over an image. The cartridges are analogous to the ommatidia of the fly's eye and contain seven photoreceptors, each with a Gaussian profile. The spacing between photoreceptors is variable, providing for more or less detail as needed. The cartridges provide information on what types of features they see, and neighboring cartridges share information to construct a feature map.
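
    A minimal sketch of the cartridge idea, assuming a Gaussian acceptance profile for each of the seven photoreceptors and a hexagonal layout; the spacing and Gaussian width are illustrative choices rather than the parameters of the model described above.

    ```python
    # One cartridge: a centre photoreceptor plus six on a hexagon, each taking a
    # Gaussian-weighted average of image intensity around its centre.
    import numpy as np

    def photoreceptor_response(image, cx, cy, sigma=2.0):
        h, w = image.shape
        ys, xs = np.mgrid[0:h, 0:w]
        weights = np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma ** 2))
        return float((weights * image).sum() / weights.sum())

    def cartridge(image, cx, cy, spacing=3.0):
        angles = np.deg2rad(np.arange(0, 360, 60))
        centres = [(cx, cy)] + [(cx + spacing * np.cos(a), cy + spacing * np.sin(a)) for a in angles]
        return [photoreceptor_response(image, x, y) for x, y in centres]

    image = np.zeros((32, 32))
    image[:, 16:] = 1.0                   # a simple vertical edge as test input
    print([round(r, 2) for r in cartridge(image, 15.5, 16.0)])
    ```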

  19. Virtual Power Electronics: Novel Software Tools for Design, Modeling and Education

    Science.gov (United States)

    Hamar, Janos; Nagy, István; Funato, Hirohito; Ogasawara, Satoshi; Dranga, Octavian; Nishida, Yasuyuki

    This paper presents browser-based, multimedia-rich software tools and an e-learning curriculum to support the design and modeling of power electronics circuits and to explain sometimes rather sophisticated phenomena. Two projects are discussed. The so-called Inetele project is financed by the Leonardo da Vinci programme of the European Union (EU). It is a collaborative project between numerous EU universities and institutes to develop a state-of-the-art curriculum in Electrical Engineering. Another cooperative project, with the participation of Japanese, European and Australian institutes, focuses especially on developing e-learning curriculum and interactive design and modeling tools, and furthermore on the development of a virtual laboratory. Snapshots from these two projects are presented.

  20. Validation of X1 motorcycle model in industrial plant layout by using WITNESSTM simulation software

    Science.gov (United States)

    Hamzas, M. F. M. A.; Bareduan, S. A.; Zakaria, M. Z.; Tan, W. J.; Zairi, S.

    2017-09-01

    This paper demonstrates a case study on simulation, modelling and analysis for the X1 motorcycle model. In this research, a motorcycle assembly plant was selected as the main site of the study. Simulation techniques using the WITNESS software were applied to evaluate the performance of the existing manufacturing system. The main objective is to validate the data and identify their significant impact on the overall performance of the system for future improvement. The process of validation starts when the layout of the assembly line is identified. All components are evaluated to establish whether the data are significant for future improvement. Machine and labor statistics are among the parameters that were evaluated for process improvement. The average total cycle time for the given workstations is used as the criterion for comparison of possible variants. From the simulation process, the data used are appropriate and meet the criteria for two-sided assembly line problems.