WorldWideScience

Sample records for software components retrieval

  1. Minimization of Retrieval Time During Software Reuse | Salami ...

    African Journals Online (AJOL)

    Minimization of Retrieval Time During Software Reuse. ... Retrieval of relevant software from the repository during software reuse can be time consuming if the repository contains many ...

  2. Software Component Clustering and Retrieval: An Entropy-based Fuzzy k-Modes Methodology

    OpenAIRE

    Stylianou, Constantinos; Andreou, Andreas S.

    2008-01-01

    The number of software houses attempting to adopt a component-based development approach is rapidly increasing. However, many organisations still find it difficult to complete the shift as it requires them to alter their entire software development process and philosophy. Furthermore, to promote component-based software engineering, organisations must be ready to promote reusability, and this can only be attained if the proper framework exists from which a developer can access, search and retri...
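
    The record above names a concrete algorithm, so a compact illustration may help. Below is a minimal fuzzy k-modes sketch in Python over categorical component descriptors; the entropy-based weighting of the paper is omitted, and the toy descriptors, parameter values, and function names are all invented for illustration.

    ```python
    # Plain fuzzy k-modes over categorical component descriptors (toy data).
    # The paper's entropy-based variant adds attribute weighting on top.
    import random

    def dissim(a, b):
        """Simple matching dissimilarity: number of attributes that differ."""
        return sum(x != y for x, y in zip(a, b))

    def fuzzy_kmodes(items, k, m=1.5, iters=20):
        random.seed(0)                       # deterministic toy run
        modes = random.sample(items, k)
        w = []
        for _ in range(iters):
            # Fuzzy membership w[i][l] of item i in cluster l.
            w = []
            for x in items:
                d = [dissim(x, z) for z in modes]
                if 0 in d:                   # item coincides with a mode
                    w.append([1.0 if dl == 0 else 0.0 for dl in d])
                else:
                    w.append([1.0 / sum((d[l] / d[j]) ** (1 / (m - 1))
                                        for j in range(k)) for l in range(k)])
            # Move each mode attribute to the membership-weighted most
            # frequent category.
            for l in range(k):
                mode = []
                for a in range(len(items[0])):
                    votes = {}
                    for i, x in enumerate(items):
                        votes[x[a]] = votes.get(x[a], 0) + w[i][l] ** m
                    mode.append(max(votes, key=votes.get))
                modes[l] = tuple(mode)
        return modes, w

    components = [("gui", "java", "widget"), ("gui", "java", "layout"),
                  ("db", "sql", "driver"), ("db", "sql", "pool")]
    modes, w = fuzzy_kmodes(components, k=2)
    print(modes)
    ```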

  3. SOFTWARE FOR REGIONS OF INTEREST RETRIEVAL ON MEDICAL 3D IMAGES

    Directory of Open Access Journals (Sweden)

    G. G. Stromov

    2014-01-01

    Background. This article describes the implementation of software for retrieving regions of interest (ROIs) in 3D medical images; it has been tested against a large volume of model MRIs. Material and methods. We tested the software against normal and pathological (severe multiple sclerosis) model MRIs from the BrainWeb resource. The technology stack is based on open-source, cross-platform solutions. The storage system is implemented on MariaDB (an open-source fork of MySQL with P/SQL extensions). Python 2.7 scripting was used to automate extract-transform-load operations. The computational core is written in Java 7 with Spring Framework 3. MongoDB is used as a cache in the cluster of workstations. Maven 3 was chosen as the dependency manager and build system, and the project is hosted on GitHub. Results. As testing on SSMU's LAN has shown, the software quite efficiently retrieves ROIs matching the morphological substratum on pathological MRIs. Conclusion. Automating the diagnostic process in medical imaging reduces the subjective component in decision making and increases the availability of high-tech medicine. The software presented in this article is a complete solution for fully automated ROI retrieval and segmentation on model medical images. We would like to thank Robert Vincent for his great help and advice on using the BrainWeb resource.

  4. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  5. Development of the software for the component reliability database system of Korean nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Han, Sang Hoon; Kim, Seung Hwan; Choi, Sun Young [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-03-01

    A study was performed to develop a system for the component reliability database, consisting of a database to store the reliability data and software to analyze it. This system is part of KIND (Korea Information System for Nuclear Reliability Database). An MS-SQL database is used to store the component population data, component maintenance history, and the results of reliability analysis. Two software tools were developed for the component reliability system: KIND-InfoView, for data storage, retrieval and searching, and KIND-CompRel, for the statistical analysis of component reliability. 4 refs., 13 figs., 7 tabs. (Author)
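
    To make the analysis side concrete, here is a small sketch of the kind of statistic such a database can derive from population and maintenance-history records: a pooled failure rate under an exponential model. The record layout and figures are hypothetical, not taken from KIND.

    ```python
    # Toy component reliability estimate from population/maintenance records.
    # (component_id, cumulative operating hours, observed failures)
    records = [
        ("PUMP-01", 8760.0, 2),
        ("PUMP-02", 8760.0, 0),
        ("VALVE-07", 4380.0, 1),
    ]

    total_hours = sum(h for _, h, _ in records)
    total_failures = sum(f for _, _, f in records)

    # Maximum-likelihood failure rate for an exponential model (per hour).
    lam = total_failures / total_hours
    print(f"pooled failure rate: {lam:.2e} /h")

    # Mean time between failures, guarding against zero observed failures.
    mtbf = total_hours / total_failures if total_failures else float("inf")
    print(f"MTBF: {mtbf:.0f} h")
    ```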

  6. Adaptation of Black-Box Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2008-01-01

    The globalization of the software market leads to crucial problems for software companies. Competition between software companies increases, forcing them to develop ever newer software products in ever shorter time intervals. The time to market for software systems therefore shrinks, and with it the product life cycle, so software companies shorten the time allotted to research and development. Because of this competition, software products have to be developed at low cost, which leads to a smaller return on investment. A big challenge for software companies is to keep these problems under control with an effective research and development process. One way to control them is to reuse existing software components, adapting them to new functionality or accommodating mismatched interfaces. Complete redevelopment of a software product is more expensive and time consuming than developing software components. The approach introduced here presents a novel technique, together with a supportive environment, that enables developers to cope with the adaptability of black-box software components. The supportive environment checks the compatibility of black-box software components with the assistance of their specifications. Generated adapter components can take over the adaptation and extend the functionality. Besides, a pool of software components can be used to compose an application that satisfies customer needs; this pool consists of black-box software components and adapter components, which can be connected on demand.
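
    The adapter idea sketches naturally in code. Below, a consumer expecting a log(msg) operation is wired to a provider exposing write_entry(level, text) through a small adapter; all class and method names are hypothetical, and the paper's environment would generate such adapters from component specifications rather than writing them by hand.

    ```python
    # Hand-written stand-in for a generated adapter between two black-box
    # components with mismatched interfaces.

    class LegacyLogger:                      # provided (black-box) component
        def write_entry(self, level, text):
            print(f"[{level}] {text}")

    class NeedsLogging:                      # consumer expecting log(msg)
        def __init__(self, logger):
            self.logger = logger
        def run(self):
            self.logger.log("component started")

    class LoggerAdapter:
        """Maps the required log(msg) onto the provided write_entry()."""
        def __init__(self, adaptee):
            self.adaptee = adaptee
        def log(self, msg):
            self.adaptee.write_entry("INFO", msg)

    NeedsLogging(LoggerAdapter(LegacyLogger())).run()
    ```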

  7. Analyser Framework to Verify Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2009-01-01

    Today, it is important for software companies to build software systems in a short time interval, to reduce costs and to maintain a good market position. Therefore, well-organized and systematic development approaches are required. Reusing software components that are well tested can be a good way to develop software applications effectively. The reuse of software components is less expensive and less time consuming than development from scratch. But it is dangerous to assume that software components can be matched together without any problems. The components themselves are well tested, of course, but when they are composed, problems occur, most of which concern interaction and communication. To avoid such errors, a framework has to be developed for analysing software components; that framework determines the compatibility of corresponding software components. The promising approach discussed here presents a novel technique for analysing software components by applying an Abstract Syntax Language Tree (ASLT). A supportive environment will be designed that checks the compatibility of black-box software components. This article is concerned with the question of how coupled software components can be verified using an analyser framework, and determines the usage of the ASLT. Black-box software components and the Abstract Syntax Language Tree are the basis for developing the proposed framework and are discussed here to provide the background knowledge. The practical implementation of this framework is discussed, and results are shown using a test environment.
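
    As a toy analogue of the syntax-tree comparison described above, the sketch below uses Python's standard ast module as a stand-in for the paper's ASLT and checks that a provided operation matches a required signature. The snippets and the equality-based notion of compatibility are simplifications for illustration.

    ```python
    # Signature compatibility via syntax trees (Python ast as ASLT stand-in).
    import ast

    provided = ast.parse("def render(data, width, height): pass")
    required = ast.parse("def render(data, width, height): pass")

    def signature(tree):
        """Extract (name, parameter list) of the first function definition."""
        fn = next(n for n in ast.walk(tree) if isinstance(n, ast.FunctionDef))
        return fn.name, [a.arg for a in fn.args.args]

    print("compatible:", signature(provided) == signature(required))
    ```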

  8. Schematic memory components converge within angular gyrus during retrieval

    Science.gov (United States)

    Wagner, Isabella C; van Buuren, Mariët; Kroes, Marijn CW; Gutteling, Tjerk P; van der Linden, Marieke; Morris, Richard G; Fernández, Guillén

    2015-01-01

    Mental schemas form associative knowledge structures that can promote the encoding and consolidation of new and related information. Schemas are facilitated by a distributed system that stores components separately, presumably in the form of inter-connected neocortical representations. During retrieval, these components need to be recombined into one representation, but where exactly such recombination takes place is unclear. Thus, we asked where different schema components are neuronally represented and converge during retrieval. Subjects acquired and retrieved two well-controlled, rule-based schema structures during fMRI on consecutive days. Schema retrieval was associated with midline, medial-temporal, and parietal processing. We identified the multi-voxel representations of different schema components, which converged within the angular gyrus during retrieval. Critically, convergence only happened after 24-hour-consolidation and during a transfer test where schema material was applied to novel but related trials. Therefore, the angular gyrus appears to recombine consolidated schema components into one memory representation. DOI: http://dx.doi.org/10.7554/eLife.09668.001 PMID:26575291

  9. Model-integrating software components engineering flexible software systems

    CERN Document Server

    Derakhshanmanesh, Mahdi

    2015-01-01

    In his study, Mahdi Derakhshanmanesh builds on the state of the art in modeling by proposing to integrate models into running software on the component-level without translating them to code. Such so-called model-integrating software exploits all advantages of models: models implicitly support a good separation of concerns, they are self-documenting and thus improve understandability and maintainability and in contrast to model-driven approaches there is no synchronization problem anymore between the models and the code generated from them. Using model-integrating components, software will be

  10. Project W-211, initial tank retrieval systems, retrieval control system software configuration management plan

    International Nuclear Information System (INIS)

    RIECK, C.A.

    1999-01-01

    This Software Configuration Management Plan (SCMP) provides the instructions for change control of the W-211 Project, Retrieval Control System (RCS) software after initial approval/release but prior to the transfer of custody to the waste tank operations contractor. This plan applies to the W-211 system software developed by the project, consisting of the computer human-machine interface (HMI) and programmable logic controller (PLC) software source and executable code, for production use by the waste tank operations contractor. The plan encompasses that portion of the W-211 RCS software represented on project-specific AUTOCAD drawings that are released as part of the C1 definitive design package (these drawings are identified on the drawing list associated with each C-1 package), and the associated software code. Implementation of the plan is required for formal acceptance testing and production release. The software configuration management plan does not apply to reports and data generated by the software except where specifically identified. Control of information produced by the software once it has been transferred for operation is the responsibility of the receiving organization

  11. A systematic approach for component-based software development

    NARCIS (Netherlands)

    Guareis de farias, Cléver; van Sinderen, Marten J.; Ferreira Pires, Luis

    2000-01-01

    Component-based software development enables the construction of software artefacts by assembling prefabricated, configurable and independently evolving building blocks, called software components. This paper presents an approach for the development of component-based software artefacts. This

  12. Software Atom: An approach towards software components structuring to improve reusability

    Directory of Open Access Journals (Sweden)

    Muhammad Hussain Mughal

    2017-12-01

    The diversity of application domains compels the design of a sustainable classification scheme for a steadily growing software repository. Atomic reusable software components are articulated to improve software component reusability in a volatile industry. Numerous approaches to software classification have been proposed over the past decades, each with limitations related to coupling and cohesion. In this paper, we propose a novel approach that constitutes software from radical functionalities to improve reusability. We analyse the semantics of elements in the periodic table used in chemistry to design our classification approach, present it as a tree-based classification to curtail the search-space complexity of the software repository, and refine it further with semantic search techniques. We developed a globally unique identifier (GUID) for indexing the functions and related components, exploiting the correlation between chemical elements and software elements to simulate a one-to-one mapping between them. Our approach is inspired by the chemical periodic table: we propose a software periodic table (SPT) representing atomic software components extracted from real application software. Parsing and extraction over the SPT-classified repository tree enable users to program their software by customizing the ingredients of their requirements. The classified repository of software ingredients helps users convey their requirements to software engineers and enables requirements engineers to develop rapid large-scale prototypes. Furthermore, we predict the usability of the categorized repository based on user feedback. The proposed repository will be fine-tuned through continued use, and the SPT will be gradually optimized with ant colony optimization techniques, ultimately helping to automate the software development process.
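
    A minimal sketch of the repository mechanics described above (tree-based classification plus GUID indexing) might look as follows in Python; the taxonomy, field names, and components are invented, and the semantic-search and optimization layers of the paper are not modelled.

    ```python
    # Toy tree-classified component repository with GUID indexing.
    import uuid

    index = {}   # GUID -> component record
    tree = {}    # category -> subcategory -> [GUIDs]

    def register(category, subcategory, name):
        guid = str(uuid.uuid4())
        index[guid] = {"name": name, "path": (category, subcategory)}
        tree.setdefault(category, {}).setdefault(subcategory, []).append(guid)
        return guid

    g = register("io", "serialization", "json_codec")
    register("io", "compression", "zlib_wrapper")
    register("math", "linear_algebra", "matrix_ops")

    # Tree parsing narrows the search space before any semantic matching.
    print([index[x]["name"] for x in tree["io"]["serialization"]])
    print(index[g]["path"])
    ```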

  13. Neural network-based retrieval from software reuse repositories

    Science.gov (United States)

    Eichmann, David A.; Srinivas, Kankanahalli

    1992-01-01

    A significant hurdle confronts the software reuser attempting to select candidate components from a software repository - discriminating between those components without resorting to inspection of the implementation(s). We outline an approach to this problem based upon neural networks which avoids requiring the repository administrators to define a conceptual closeness graph for the classification vocabulary.
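
    The paper trains a network over the classification vocabulary; as a highly simplified stand-in for the retrieval step only, the sketch below embeds component descriptions and a query as term vectors and ranks them by cosine similarity. The repository contents and descriptions are invented.

    ```python
    # Simplified retrieval stand-in: rank components by query similarity.
    import math
    from collections import Counter

    repo = {
        "stack_impl": "lifo push pop collection",
        "queue_impl": "fifo enqueue dequeue collection",
        "btree_impl": "ordered search tree index",
    }

    def vec(text):
        return Counter(text.split())

    def cosine(a, b):
        dot = sum(a[t] * b[t] for t in a if t in b)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    query = vec("push pop stack")
    ranked = sorted(repo, key=lambda k: cosine(query, vec(repo[k])),
                    reverse=True)
    print(ranked)   # candidate components, most relevant first
    ```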

  14. A Combined Approach for Component-based Software Design

    NARCIS (Netherlands)

    Guareis de farias, Cléver; van Sinderen, Marten J.; Ferreira Pires, Luis; Quartel, Dick; Baldoni, R.

    2001-01-01

    Component-based software development enables the construction of software artefacts by assembling binary units of production, distribution and deployment, the so-called software components. Several approaches addressing component-based development have been proposed recently. Most of these

  15. Software Helps Retrieve Information Relevant to the User

    Science.gov (United States)

    Mathe, Natalie; Chen, James

    2003-01-01

    The Adaptive Indexing and Retrieval Agent (ARNIE) is a code library, designed to be used by an application program, that assists human users in retrieving desired information in a hypertext setting. Using ARNIE, the program implements a computational model for interactively learning what information each human user considers relevant in context. The model, called a "relevance network," incrementally adapts retrieved information to users' individual profiles on the basis of feedback from the users regarding specific queries. The model also generalizes such knowledge for subsequent derivation of relevant references for similar queries and profiles, thereby assisting users in filtering information by relevance. ARNIE thus enables users to categorize and share information of interest in various contexts. ARNIE encodes the relevance and structure of information in a neural network dynamically configured with a genetic algorithm. ARNIE maintains an internal database, wherein it saves associations, and from which it returns associated items in response to a query. A C++ compiler for a platform on which ARNIE will be utilized is necessary for creating the ARNIE library but is not necessary for the execution of the software.
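
    The incremental-adaptation loop ARNIE describes can be caricatured in a few lines: user feedback nudges per-user relevance weights, which then re-rank results for similar queries. The data structures, learning rate, and update rule below are illustrative only, not ARNIE's neural/genetic implementation.

    ```python
    # Toy per-user relevance feedback, not ARNIE's actual relevance network.
    weights = {}   # (user, item) -> learned relevance in [0, 1]

    def feedback(user, item, relevant, lr=0.3):
        """Nudge the stored relevance toward the user's judgement."""
        w = weights.get((user, item), 0.5)
        target = 1.0 if relevant else 0.0
        weights[(user, item)] = w + lr * (target - w)

    def rank(user, items):
        return sorted(items, key=lambda i: weights.get((user, i), 0.5),
                      reverse=True)

    items = ["doc_a", "doc_b", "doc_c"]
    feedback("alice", "doc_c", relevant=True)
    feedback("alice", "doc_a", relevant=False)
    print(rank("alice", items))   # doc_c first, doc_a last
    ```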

  16. Composing simulations using persistent software components

    Energy Technology Data Exchange (ETDEWEB)

    Holland, J.V.; Michelsen, R.E.; Powell, D.R.; Upton, S.C.; Thompson, D.R.

    1999-03-01

    The traditional process for developing large-scale simulations is cumbersome, time consuming, costly, and in some cases, inadequate. The topics of software components and component-based software engineering are being explored by software professionals in academic and industrial settings. A component is a well-delineated, relatively independent, and replaceable part of a software system that performs a specific function. Many researchers have addressed the potential to derive a component-based approach to simulations in general, and a few have focused on military simulations in particular. In a component-based approach, functional or logical blocks of the simulation entities are represented as coherent collections of components satisfying explicitly defined interface requirements. A simulation is a top-level aggregate comprised of a collection of components that interact with each other in the context of a simulated environment. A component may represent a simulation artifact, an agent, or any entity that can generate events affecting itself, other simulated entities, or the state of the system. The component-based approach promotes code reuse, contributes to reducing time spent validating or verifying models, and promises to reduce the cost of development while still delivering tailored simulations specific to analysis questions. The Integrated Virtual Environment for Simulation (IVES) is a composition-centered framework to achieve this potential. IVES is a Java implementation of simulation composition concepts developed at Los Alamos National Laboratory for use in several application domains. In this paper, its use in the military domain is demonstrated via the simulation of dismounted infantry in an urban environment.

  17. Software Components and Formal Methods from a Computational Viewpoint

    OpenAIRE

    Lambertz, Christian

    2012-01-01

    Software components and the methodology of component-based development offer a promising approach to master the design complexity of huge software products because they separate the concerns of software architecture from individual component behavior and allow for reusability of components. In combination with formal methods, the specification of a formal component model of the later software product or system allows for establishing and verifying important system properties in an automatic a...

  18. Unblockable Compositions of Software Components

    DEFF Research Database (Denmark)

    Dong, Ruzhen; Faber, Johannes; Liu, Zhiming

    2012-01-01

    We present a new automata-based interface model describing the interaction behavior of software components. Contrary to earlier component- or interface-based approaches, the interface model we propose specifies all the non-blockable interaction behaviors of a component with any environment...... composition of interface models preserves unblockable sequences of provided services....

  19. Component-based development of software language engineering tools

    NARCIS (Netherlands)

    Ssanyu, J.; Hemerik, C.

    2011-01-01

    In this paper we outline how Software Language Engineering (SLE) could benefit from Component-based Software Development (CBSD) techniques and present an architecture aimed at developing a coherent set of lightweight SLE components, fitting into a general-purpose component framework. In order to

  20. Software Engineering Environment for Component-based Design of Embedded Software

    DEFF Research Database (Denmark)

    Guo, Yu

    2010-01-01

    as well as application models in a computer-aided software engineering environment. Furthermore, component models have been realized following carefully developed design patterns, which provide for an efficient and reusable implementation. The components have been ultimately implemented as prefabricated...... executable objects that can be linked together into an executable application. The development of embedded software using the COMDES framework is supported by the associated integrated engineering environment consisting of a number of tools, which support basic functionalities, such as system modelling......, validation, and executable code generation for specific hardware platforms. Developing such an environment and the associated tools is a highly complex engineering task. Therefore, this thesis has investigated key design issues and analysed existing platforms supporting model-driven software development...

  1. Behavior Protocols for Software Components

    Czech Academy of Sciences Publication Activity Database

    Plášil, František; Višňovský, Stanislav

    2002-01-01

    Vol. 28, No. 11 (2002), pp. 1056-1076, ISSN 0098-5589 R&D Projects: GA AV ČR IAA2030902; GA ČR GA201/99/0244 Grant - others:Eureka(XE) Pepita project no.2033 Institutional research plan: AV0Z1030915 Keywords: behavior protocols * component-based programming * software architecture Subject RIV: JC - Computer Hardware; Software Impact factor: 1.170, year: 2002

  2. Eight-component retrievals from ground-based MAX-DOAS observations

    Directory of Open Access Journals (Sweden)

    H. Irie

    2011-06-01

    We attempt for the first time to retrieve lower-tropospheric vertical profile information for 8 quantities from ground-based Multi-Axis Differential Optical Absorption Spectroscopy (MAX-DOAS) observations. The components retrieved are the aerosol extinction coefficients at two wavelengths, 357 and 476 nm, and NO2, HCHO, CHOCHO, H2O, SO2, and O3 volume mixing ratios. A Japanese MAX-DOAS profile retrieval algorithm, version 1 (JM1), is applied to observations performed at Cabauw, the Netherlands (51.97° N, 4.93° E), in June–July 2009 during the Cabauw Intercomparison campaign of Nitrogen Dioxide measuring Instruments (CINDI). Of the retrieved profiles, we focus here on the lowest-layer data (mean values at altitudes 0–1 km), where the sensitivity is usually highest owing to the longest light path. In support of the capability of the multi-component retrievals, we find reasonable overall agreement with independent data sets, including a regional chemical transport model (CHIMERE) and in situ observations performed near the surface (2–3 m) and at the 200-m height level of the tall tower in Cabauw. Plumes of enhanced HCHO and SO2 were likely affected by biogenic and ship emissions, respectively, and an improvement in their emission strengths is suggested for better agreement between CHIMERE simulations and MAX-DOAS observations. Analysis of air mass factors indicates that the horizontal spatial representativeness of MAX-DOAS observations is about 3–15 km (depending mainly on aerosol extinction), comparable to or better than the spatial resolution of current UV-visible satellite observations and model calculations. These results demonstrate that MAX-DOAS provides multi-component data useful for the evaluation of satellite observations and model calculations and can play an important role in bridging different data sets having different spatial resolutions.

  3. A Component-Oriented Programming for Embedded Mobile Robot Software

    Directory of Open Access Journals (Sweden)

    Safaai Deris

    2008-11-01

    Applying software reuse to many Embedded Real-Time (ERT) systems poses significant challenges to industrial software processes due to the resource-constrained and real-time requirements of the systems. An Autonomous Mobile Robot (AMR) system is a class of ERT systems and hence inherits the challenges of applying software reuse in general ERT systems. Furthermore, software reuse in AMR systems is challenged by diversity in terms of robot physical size and shape, environmental interaction and implementation platform. Thus, it is envisioned that component-based software engineering will be a suitable way to promote software reuse in AMR systems, with consideration of the general requirements to be self-contained, platform-independent and real-time predictable. A framework for component-oriented programming for AMR software development using the PECOS component model is proposed in this paper. The main features of this framework are: (1) a graphical representation for component definition and composition; (2) targeting the C language for optimal code generation with resource-constrained micro-controllers; and (3) minimal requirements for run-time support. Real-time implementation indicates that the PECOS component model, together with the proposed framework, is suitable for software development of resource-constrained embedded AMR systems.

  4. SBA Network Components & Software Inventory

    Data.gov (United States)

    Small Business Administration — SBA’s Network Components & Software Inventory contains a complete inventory of all devices connected to SBA’s network including workstations, servers, routers,...

  5. A Longitudinal Study of the e-Market for Software Components

    NARCIS (Netherlands)

    van Hillegersberg, Jos; Traas, Vincent; Dragt, Roland

    2001-01-01

    Component Based Software Development (CBD) holds high promises, but develops its full potential only when software components are traded in a component market. The Internet seems ideal for this purpose and various sources have predicted a bright future for the Internet Software Component Market

  6. Runtime Concepts of Hierarchical Software Components

    Czech Academy of Sciences Publication Activity Database

    Bureš, Tomáš; Hnětynka, P.; Plášil, František

    2007-01-01

    Vol. 8, special issue (2007), pp. 454-463, ISSN 1525-9293 R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords: component-based development * hierarchical components * connectors * controllers * runtime environment Subject RIV: JC - Computer Hardware; Software

  7. Decision criteria for software component sourcing: steps towards a framework

    NARCIS (Netherlands)

    Kusters, R.J.; Pouwelse, L.; Martin, H.; Trienekens, J.J.M.; Hammoudi, Sl.; Maciaszek, L.; Missikoff, M.M.; Camp, O.; Cordeiro, J.

    2016-01-01

    Software developing organizations nowadays have a wide choice when it comes to sourcing software components. This choice ranges from developing or adapting in-house developed components via buying closed source components to utilizing open source components. This study seeks to determine criteria

  8. Verification of Software Components: Addressing Unbounded Paralelism

    Czech Academy of Sciences Publication Activity Database

    Adámek, Jiří

    2007-01-01

    Vol. 8, No. 2 (2007), pp. 300-309, ISSN 1525-9293 R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords: software components * formal verification * unbounded parallelism Subject RIV: JC - Computer Hardware; Software

  9. Application of fuzzy-MOORA method: Ranking of components for reliability estimation of component-based software systems

    Directory of Open Access Journals (Sweden)

    Zeeshan Ali Siddiqui

    2016-01-01

    Component-based software system (CBSS) development is an emerging discipline that promises to take software development into a new era. Just as hardware systems are presently constructed from kits of parts, software systems may also be assembled from components, and it is more reliable to reuse software than to create it. The reliability of the glue code and of the individual components contributes to the reliability of the overall system. Every component contributes to overall system reliability according to the number of times it is used; some components are of critical usage, captured by the usage frequency of the component. The usage frequency decides the weight of each component, and according to these weights each component contributes to the overall reliability of the system. A ranking of components may therefore be obtained by analyzing their reliability impact on the overall application. In this paper, we propose the application of fuzzy multi-objective optimization on the basis of ratio analysis (Fuzzy-MOORA). The method helps us find the most suitable alternative (software component) from a set of available feasible alternatives. It is an accurate and easy-to-understand tool for solving multi-criteria decision making problems that have imprecise and vague evaluation data. By the use of ratio analysis, the proposed method determines the most suitable alternative among all possible alternatives, and the dimensionless measurements realize the ranking of components for estimating CBSS reliability in a non-subjective way. Finally, three case studies illustrate the use of the proposed technique.
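
    The crisp MOORA ratio system underlying the fuzzy variant is easy to state: normalize each criterion column by its Euclidean norm, then score each alternative as the weighted sum of benefit criteria minus cost criteria. The sketch below implements that crisp core; the decision matrix, weights, and criteria are invented, and the paper's triangular fuzzy numbers are not modelled.

    ```python
    # Crisp MOORA ranking of components (fuzzy extension omitted).
    import math

    alternatives = ["comp_A", "comp_B", "comp_C"]
    # criteria: usage frequency, test coverage (benefit); defect density (cost)
    X = [[0.9, 0.75, 0.02],
         [0.6, 0.90, 0.01],
         [0.3, 0.60, 0.05]]
    weights = [0.5, 0.3, 0.2]
    benefit = [True, True, False]

    # Vector normalization per criterion column.
    norms = [math.sqrt(sum(row[j] ** 2 for row in X)) for j in range(3)]
    scores = [sum((1 if benefit[j] else -1) * weights[j] * row[j] / norms[j]
                  for j in range(3)) for row in X]

    for name, s in sorted(zip(alternatives, scores), key=lambda t: -t[1]):
        print(f"{name}: {s:.3f}")
    ```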

  10. FY1995 study of very flexible software structures based on soft-software components; 1995 nendo yawarankana software buhin ni motozuku software no choju kozo ni kansuru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    The purpose of this study is to develop methods and tools for changing software structure flexibly along with the continuous change of its environment and conditions of use. The goal is software of very high adaptability, achieved using soft-software components and flexible assembly. The CASE tool platform Sapid, based on a fine-grained repository, was developed and applied to raise the abstraction level of program code and to mine potential flexible components. To reconstruct software adapted to a required environment, the SQM (Software Quark Model) was used to manage interconnectivity and other semantic relationships among components. On these two basic systems, we developed various methods and tools, such as those for static and dynamic analysis of very flexible software structures, program transformation description, program pattern extraction and composition, component optimization by partial evaluation, component extraction by function slicing, code encapsulation, and component navigation and application. (NEDO)

  11. Software Component Certification: 10 Useful Distinctions

    National Research Council Canada - National Science Library

    Wallnau, Kurt C

    2004-01-01

    .... One persistent and largely unaddressed challenge is how the consumers of software components-that is, the developers of mission-critical systems-can obtain a meaningful level of trust in the runtime...

  12. A Hybrid Hardware and Software Component Architecture for Embedded System Design

    Science.gov (United States)

    Marcondes, Hugo; Fröhlich, Antônio Augusto

    Embedded systems are increasing in complexity, while several metrics such as time-to-market, reliability, safety and performance should be considered during the design of such systems. A component-based design that enables the migration of components between hardware and software can help achieve these metrics. To enable this, we define hybrid hardware and software components as development artifacts that can be deployed by different combinations of hardware and software elements. In this paper, we present an architecture for developing such components in order to construct a repository of components that can migrate between the hardware and software domains to meet the design system requirements.

  13. Transactions in Software Components: Container-Interposed Transactions

    Czech Academy of Sciences Publication Activity Database

    Procházka, M.; Plášil, František

    2002-01-01

    Vol. 3, No. 2 (2002), ISSN 1525-9293 R&D Projects: GA ČR GA201/99/0244; GA AV ČR IAA2030902 Institutional research plan: AV0Z1030915 Keywords: transactions * component-based software architectures * transaction propagation policy * transaction attributes * container-interposed transactions Subject RIV: JC - Computer Hardware; Software

  14. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    Science.gov (United States)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operation components. We have also had to determine how to

  15. Software, component, and service deployment in computational Grids

    International Nuclear Information System (INIS)

    von Laszewski, G.; Blau, E.; Bletzinger, M.; Gawor, J.; Lane, P.; Martin, S.; Russell, M.

    2002-01-01

    Grids comprise an infrastructure that enables scientists to use a diverse set of distributed remote services and resources as part of complex scientific problem-solving processes. We analyze some of the challenges involved in deploying software and components transparently in Grids. We report on three practical solutions used by the Globus Project. Lessons learned from this experience lead us to believe that it is necessary to support a variety of software and component deployment strategies. These strategies are based on the hosting environment

  16. Performance Engineering Technology for Scientific Component Software

    Energy Technology Data Exchange (ETDEWEB)

    Malony, Allen D.

    2007-05-08

    Large-scale, complex scientific applications are beginning to benefit from the use of component software design methodology and technology for software development. Integral to the success of component-based applications is the ability to achieve high-performing code solutions through the use of performance engineering tools for both intra-component and inter-component analysis and optimization. Our work on this project aimed to develop performance engineering technology for scientific component software in association with the DOE CCTTSS SciDAC project (active during the contract period) and the broader Common Component Architecture (CCA) community. Our specific implementation objectives were to extend the TAU performance system and Program Database Toolkit (PDT) to support performance instrumentation, measurement, and analysis of CCA components and frameworks, and to develop performance measurement and monitoring infrastructure that could be integrated in CCA applications. These objectives have been met in the completion of all project milestones and in the transfer of the technology into the continuing CCA activities as part of the DOE TASCS SciDAC2 effort. In addition to these achievements, over the past three years, we have been an active member of the CCA Forum, attending all meetings and serving in several working groups, such as the CCA Toolkit working group, the CQoS working group, and the Tutorial working group. We have contributed significantly to CCA tutorials since SC'04, hosted two CCA meetings, participated in the annual ACTS workshops, and were co-authors on the recent CCA journal paper [24]. There are four main areas where our project has delivered results: component performance instrumentation and measurement, component performance modeling and optimization, performance database and data mining, and online performance monitoring. This final report outlines the achievements in these areas for the entire project period. The submitted progress

  17. Component-based software for high-performance scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  18. Component-based software for high-performance scientific computing

    International Nuclear Information System (INIS)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly

  19. Integration of Simulink Models with Component-based Software Models

    Directory of Open Access Journals (Sweden)

    MARIAN, N.

    2008-06-01

    Model-based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model-based design involves the use of multiple models that represent different views of a system, having different semantics of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. In a MATLAB/Simulink model, plant and controller behavior is simulated using graphical blocks to represent mathematical and logical constructs and process flow, after which software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behavior as a means of computation, communication and constraints, using computational blocks and aggregates for both discrete and continuous behavior, different interconnection and execution disciplines for event-based and time-based controllers, and so on, to meet the demands for more functionality at even lower prices, under opposing constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework, developed by the software engineering group of the Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has to be analyzed. One way of doing that is to integrate, via wrapper files, the model back into Simulink S-functions and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behavior, and the transformation of the software system into the S

  20. A new software suite for NO2 vertical profile retrieval from ground-based zenith-sky spectrometers

    International Nuclear Information System (INIS)

    Denis, L.; Roscoe, H.K.; Chipperfield, M.P.; Roozendael, M. van; Goutail, F.

    2005-01-01

    Here we present an operational method to improve accuracy and information content of ground-based measurements of stratospheric NO2. The motive is to improve the investigation of trends in NO2, and is important because the current trend in NO2 appears to contradict the trend in its source, suggesting that the stratospheric circulation has changed. To do so, a new software package for retrieving NO2 vertical profiles from slant columns measured by zenith-sky spectrometers has been created. It uses a Rodgers optimal linear inverse method coupled with a radiative transfer model for calculations of transfer functions between profiles and columns, and a chemical box model for taking into account the NO2 variations during twilight and during the day. Each model has parameters that vary according to season and location. Forerunners of each model have been previously validated. The scheme maps random errors in the measurements and systematic errors in the models and their parameters on to the retrieved profiles. Initialisation for models is derived from well-established climatologies. The software has been tested by comparing retrieved profiles to simultaneous balloon-borne profiles at mid-latitudes in spring
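
    The Rodgers optimal linear inverse step named above has a standard closed form, x_hat = x_a + S_a K^T (K S_a K^T + S_e)^{-1} (y - K x_a), sketched below with NumPy on toy dimensions; the transfer functions, covariances, and a priori profile are synthetic values for illustration, not the package's.

    ```python
    # Toy optimal linear inverse (Rodgers) retrieval of a vertical profile
    # x from slant columns y, given a linear forward model y = K x.
    import numpy as np

    n_layers, n_obs = 5, 8
    rng = np.random.default_rng(0)
    K = rng.uniform(0.5, 2.0, (n_obs, n_layers))    # transfer functions
    x_true = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
    y = K @ x_true + rng.normal(0, 0.1, n_obs)      # measured slant columns

    x_a = np.full(n_layers, 1.5)        # a priori profile (climatology)
    S_a = np.eye(n_layers) * 1.0        # a priori covariance
    S_e = np.eye(n_obs) * 0.01          # measurement error covariance

    G = S_a @ K.T @ np.linalg.inv(K @ S_a @ K.T + S_e)   # gain matrix
    x_hat = x_a + G @ (y - K @ x_a)     # retrieved profile
    A = G @ K                           # averaging kernel
    print(np.round(x_hat, 2), "dofs:", round(float(np.trace(A)), 2))
    ```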

  1. A combined Component-Based Approach for the Design of Distributed Software Systems

    NARCIS (Netherlands)

    Guareis de farias, Cléver; Ferreira Pires, Luis; van Sinderen, Marten J.; Quartel, Dick; Yang, H.; Gupta, S.

    2001-01-01

    Component-based software development enables the construction of software artefacts by assembling binary units of production, distribution and deployment, the so-called components. Several approaches to component-based development have been proposed recently. Most of these approaches are based on

  2. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae

    2008-01-01

    Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics...... of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical...... constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has...

  3. Developing Reusable and Reconfigurable Real-Time Software using Aspects and Components

    OpenAIRE

    Tešanović, Aleksandra

    2006-01-01

    Our main focus in this thesis is on providing guidelines, methods, and tools for design, configuration, and analysis of configurable and reusable real-time software, developed using a combination of aspect-oriented and component-based software development. Specifically, we define a reconfigurable real-time component model (RTCOM) that describes how a real-time component, supporting aspects and enforcing information hiding, could efficiently be designed and implemented. In this context, we out...

  4. Final Report for 'Center for Technology for Advanced Scientific Component Software'

    International Nuclear Information System (INIS)

    Shasharina, Svetlana

    2010-01-01

    The goal of the Center for Technology for Advanced Scientific Component Software is to fundamentally change the way scientific software is developed and used, by bringing component-based software development technologies to high-performance scientific and engineering computing. The role of Tech-X's work in the TASCS project is to provide outreach to accelerator physics and fusion applications by introducing TASCS tools into applications, testing the tools in those applications, and modifying the tools to be more usable.

  5. Monitoring extensions for component-based distributed software

    NARCIS (Netherlands)

    Diakov, N.K.; Papir, Z.; van Sinderen, Marten J.; Quartel, Dick

    2000-01-01

    This paper defines a generic class of monitoring extensions to component-based distributed enterprise software. Introducing a monitoring extension to a legacy application system can be very costly. In this paper, we identify the minimum support for application monitoring within the generic

  6. Two Independent Mushroom Body Output Circuits Retrieve the Six Discrete Components of Drosophila Aversive Memory

    Directory of Open Access Journals (Sweden)

    Emna Bouzaiane

    2015-05-01

    Understanding how the various memory components are encoded and how they interact to guide behavior requires knowledge of the underlying neural circuits. Currently, aversive olfactory memory in Drosophila is behaviorally subdivided into four discrete phases. Among these, short- and long-term memories rely, respectively, on the γ and α/β Kenyon cells (KCs), two distinct subsets of the ∼2,000 neurons in the mushroom body (MB). Whereas V2 efferent neurons retrieve memory from α/β KCs, the neurons that retrieve short-term memory are unknown. We identified a specific pair of MB efferent neurons, named M6, that retrieve memory from γ KCs. Moreover, our network analysis revealed that six discrete memory phases actually exist, three of which have been conflated in the past. At each time point, two distinct memory components separately recruit either the V2 or M6 output pathways. Memory retrieval thus features a dramatic convergence from KCs to MB efferent neurons.

  7. A new software suite for NO{sub 2} vertical profile retrieval from ground-based zenith-sky spectrometers

    Energy Technology Data Exchange (ETDEWEB)

    Denis, L. [British Antarctic Survey/NERC, Madingley Road, Cambridge CB3 0ET (United Kingdom); Roscoe, H.K. [British Antarctic Survey/NERC, Madingley Road, Cambridge CB3 0ET (United Kingdom)]. E-mail: h.roscoe@bas.ac.uk; Chipperfield, M.P. [Environment Centre, University of Leeds, Leeds LS2 9JT (United Kingdom); Roozendael, M. van [Belgian Institute for Space Aeronomy (BIRA/IASB), 1180 Brussels (Belgium); Goutail, F. [Service d' Aeronomie du CNRS, BP3, 91271 Verrieres le Buisson (France)

    2005-05-15

    Here we present an operational method to improve accuracy and information content of ground-based measurements of stratospheric NO2. The motive is to improve the investigation of trends in NO2, and is important because the current trend in NO2 appears to contradict the trend in its source, suggesting that the stratospheric circulation has changed. To do so, a new software package for retrieving NO2 vertical profiles from slant columns measured by zenith-sky spectrometers has been created. It uses a Rodgers optimal linear inverse method coupled with a radiative transfer model for calculations of transfer functions between profiles and columns, and a chemical box model for taking into account the NO2 variations during twilight and during the day. Each model has parameters that vary according to season and location. Forerunners of each model have been previously validated. The scheme maps random errors in the measurements and systematic errors in the models and their parameters on to the retrieved profiles. Initialisation for models is derived from well-established climatologies. The software has been tested by comparing retrieved profiles to simultaneous balloon-borne profiles at mid-latitudes in spring.

  8. Software components for medical image visualization and surgical planning

    Science.gov (United States)

    Starreveld, Yves P.; Gobbi, David G.; Finnis, Kirk; Peters, Terence M.

    2001-05-01

    Purpose: The development of new applications in medical image visualization and surgical planning requires the completion of many common tasks such as image reading and re-sampling, segmentation, volume rendering, and surface display. Intra-operative use requires an interface to a tracking system and image registration, and the application requires basic, easy-to-understand user interface components. Rapid changes in computer and end-application hardware, as well as in operating systems and network environments, make it desirable to have a hardware- and operating-system-independent collection of reusable software components that can be assembled rapidly to prototype new applications. Methods: Using the OpenGL-based Visualization Toolkit as a base, we have developed a set of components that implement the above-mentioned tasks. The components are written in both C++ and Python, but all are accessible from Python, a byte-compiled scripting language. The components have been used on the Red Hat Linux, Silicon Graphics Iris, Microsoft Windows, and Apple OS X platforms. Rigorous object-oriented software design methods have been applied to ensure hardware independence and a standard application programming interface (API). There are components to acquire, display, and register images from MRI, MRA, CT, Computed Rotational Angiography (CRA), Digital Subtraction Angiography (DSA), 2D and 3D ultrasound, video and physiological recordings. Interfaces to various tracking systems for intra-operative use have also been implemented. Results: The described components have been implemented and tested. To date they have been used to create image manipulation and viewing tools, a deep brain functional atlas, a 3D ultrasound acquisition and display platform, a prototype minimally invasive robotic coronary artery bypass graft planning system, a tracked neuro-endoscope guidance system and a frame-based stereotaxy neurosurgery planning tool. The frame-based stereotaxy module has been
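
    Since the components described are VTK-based and scriptable from Python, a minimal pipeline conveys the building-block style: source, mapper, actor, renderer, window. The sketch below uses a synthetic sphere source as a stand-in for the medical image readers the paper wraps; it assumes the vtk Python package is installed.

    ```python
    # Minimal VTK pipeline in Python (synthetic source instead of MRI/CT).
    import vtk

    source = vtk.vtkSphereSource()          # stand-in for an image reader
    source.SetThetaResolution(32)

    mapper = vtk.vtkPolyDataMapper()
    mapper.SetInputConnection(source.GetOutputPort())

    actor = vtk.vtkActor()
    actor.SetMapper(mapper)

    renderer = vtk.vtkRenderer()
    renderer.AddActor(actor)

    window = vtk.vtkRenderWindow()
    window.AddRenderer(renderer)

    interactor = vtk.vtkRenderWindowInteractor()
    interactor.SetRenderWindow(window)
    window.Render()
    interactor.Start()                      # interactive 3D view
    ```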

  9. An Intuitionistic Fuzzy Methodology for Component-Based Software Reliability Optimization

    DEFF Research Database (Denmark)

    Madsen, Henrik; Grigore, Albeanu; Popenţiuvlǎdicescu, Florin

    2012-01-01

    Component-based software development is the current methodology facilitating agility in project management, software reuse in design and implementation, promoting quality and productivity, and increasing the reliability and performability. This paper illustrates the usage of intuitionistic fuzzy...... degree approach in modelling the quality of entities in imprecise software reliability computing in order to optimize management results. Intuitionistic fuzzy optimization algorithms are proposed to be used for complex software systems reliability optimization under various constraints....

  10. COTS-based OO-component approach for software inter-operability and reuse (software systems engineering methodology)

    Science.gov (United States)

    Yin, J.; Oyaki, A.; Hwang, C.; Hung, C.

    2000-01-01

    The purpose of this paper is to provide a summary description and results of rapid development accomplishments at NASA/JPL in the area of advanced distributed computing technology, using a Commercial-Off-The-Shelf (COTS)-based, object-oriented component approach to open, inter-operable software development and software reuse.

  11. Component-Based Software Engineering and Runtime Type Definition

    OpenAIRE

    A. R. Shakurov

    2011-01-01

    The component-based approach to software engineering, its current implementations and their limitations are discussed. A new extended architecture for such systems is presented. Its main architectural concepts and principles are considered.

  12. A Code Generator for Software Component Services in Smart Devices

    OpenAIRE

    Ahmad, Manzoor

    2010-01-01

    A component is built to be reused and reusability has significant impact on component generality and flexibility requirement. A component model plays a critical role in reusability of software component and defines a set of standards for component implementation, evolution, composition, deployment and standardization of the run-time environment for execution of component. In component based development (CBD), standardization of the runtime environment includes specification of component’s int...

  13. A CORBA BASED ARCHITECTURE FOR ACCESSING REUSABLE SOFTWARE COMPONENTS ON THE WEB.

    Directory of Open Access Journals (Sweden)

    R. Cenk ERDUR

    2003-01-01

    In the very near future, as a result of the continuous growth of the Internet and advances in networking technologies, the Internet will become the common software repository for people and organizations who employ a component-based reuse approach in their software development life cycles. In order to use reusable components such as source code, analyses, designs, and design patterns during new software development processes, environments that support the identification of components over the Internet are needed. The basic elements of such an environment are the coordinator programs, which deliver user requests to appropriate component libraries, user interfaces for querying, and programs that wrap the component libraries. First, a CORBA-based architecture is proposed for such an environment. Then, an alternative architecture based on the Java 2 platform technologies is given for the same environment. Finally, the two architectures are compared.

  14. The Node Monitoring Component of a Scalable Systems Software Environment

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Samuel James [Iowa State Univ., Ames, IA (United States)

    2006-01-01

    This research describes Fountain, a suite of programs used to monitor the resources of a cluster. A cluster is a collection of individual computers that are connected via a high speed communication network. They are traditionally used by users who desire more resources, such as processing power and memory, than any single computer can provide. A common drawback to effectively utilizing such a large-scale system is the management infrastructure, which often does not scale well as the system grows. Large-scale parallel systems provide new research challenges in the area of systems software, the programs or tools that manage the system from boot-up to running a parallel job. The approach presented in this thesis utilizes a collection of separate components that communicate with each other to achieve a common goal. While systems software comprises a broad array of components, this thesis focuses on the design choices for a node monitoring component. We will describe Fountain, an implementation of the Scalable Systems Software (SSS) node monitor specification. It is targeted at aggregate node monitoring for clusters, focusing on both scalability and fault tolerance as its design goals. It leverages widely used technologies such as XML and HTTP to present an interface to other components in the SSS environment.
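
    As a rough sketch of the XML-over-HTTP reporting style the thesis mentions, the snippet below serves a node's load averages as a small XML document; the metrics, schema, and port are invented and bear no relation to Fountain's actual SSS interface.

    ```python
    # Toy node-monitor HTTP endpoint reporting resource data as XML.
    import os
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class MonitorHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            load1, load5, load15 = os.getloadavg()   # Unix-only
            body = (f"<node><load1>{load1:.2f}</load1>"
                    f"<load5>{load5:.2f}</load5>"
                    f"<load15>{load15:.2f}</load15></node>").encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/xml")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("", 8080), MonitorHandler).serve_forever()
    ```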

  15. A Component-based Software Development and Execution Framework for CAx Applications

    Directory of Open Access Journals (Sweden)

    N. Matsuki

    2004-01-01

    Full Text Available Digitalization of the manufacturing process and technologies is regarded as the key to increased competitive ability. The MZ-Platform infrastructure is a component-based software development framework, designed for supporting enterprises to enhance digitalized technologies using software tools and CAx components in a self-innovative way. In the paper we show the algorithm, system architecture, and a CAx application example on MZ-Platform. We also propose a new parametric data structure based on MZ-Platform.

  16. Center for Technology for Advanced Scientific Component Software (TASCS)

    Energy Technology Data Exchange (ETDEWEB)

    Damevski, Kostadin [Virginia State Univ., Petersburg, VA (United States)

    2009-03-30

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit the unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  17. Generation of components for software renovation factories from context-free grammars

    NARCIS (Netherlands)

    Brand, van den M.G.J.; Sellink, M.P.A.; Verhoef, C.

    2000-01-01

    We present an approach for the generation of components for a software renovation factory. These components are generated from a context-free grammar definition that recognizes the code that has to be renovated. We generate analysis and transformation components that can be instantiated with a

  18. Assume-Guarantee Verification of Software Components in SOFA 2 Framework

    Czech Academy of Sciences Publication Activity Database

    Parízek, P.; Plášil, František

    2010-01-01

    Roč. 4, č. 3 (2010), s. 210-221 ISSN 1751-8806 R&D Projects: GA AV ČR 1ET400300504 Grant - others:GA MŠk(CZ) 7E08004 Institutional research plan: CEZ:AV0Z10300504 Keywords : components * software verification * model checking Subject RIV: JC - Computer Hardware ; Software Impact factor: 0.671, year: 2010

  19. A model-based software development methodology for high-end automotive components

    NARCIS (Netherlands)

    Ravanan, Mahmoud

    2014-01-01

    This report provides a model-based software development methodology for high-end automotive components. The V-model is used as a process model throughout the development of the software platform. It offers a framework that simplifies the relation between requirements, design, implementation,

  20. Specification and Generation of Environment for Model Checking of Software Components

    Czech Academy of Sciences Publication Activity Database

    Pařízek, P.; Plášil, František

    2007-01-01

    Roč. 176, - (2007), s. 143-154 ISSN 1571-0661 R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords : software components * behavior protocols * model checking * automated generation of environment Subject RIV: JC - Computer Hardware ; Software

  1. NASA JPL Distributed Systems Technology (DST) Object-Oriented Component Approach for Software Inter-Operability and Reuse

    Science.gov (United States)

    Hall, Laverne; Hung, Chaw-Kwei; Lin, Imin

    2000-01-01

    The purpose of this paper is to provide a description of the NASA JPL Distributed Systems Technology (DST) Section's object-oriented component approach to open inter-operable systems software development and software reuse. It will address what is meant by the term object component software, give an overview of the component-based development approach and how it relates to infrastructure support of software architectures and promotes reuse, enumerate the benefits of this approach, and give examples of application prototypes demonstrating its usage and advantages. Utilization of the object-oriented component technology approach for system development and software reuse will apply to several areas within JPL, and possibly across other NASA Centers.

  2. 77 FR 35427 - Certain Mobile Devices, Associated Software, and Components Thereof Final Determination of...

    Science.gov (United States)

    2012-06-13

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-744] Certain Mobile Devices, Associated... importation of certain mobile devices, associated software, and components thereof by reason of infringement... importation of certain mobile devices, associated software, and components thereof containing same by reason...

  3. 77 FR 16860 - Certain GPS Navigation Products, Components Thereof, and Related Software; Termination of...

    Science.gov (United States)

    2012-03-22

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-783] Certain GPS Navigation Products, Components Thereof, and Related Software; Termination of Investigation on the Basis of Settlement AGENCY: U.S... GPS navigation products, components thereof, and related software, by reason of the infringement of...

  4. Aquarius' Object-Oriented, Plug and Play Component-Based Flight Software

    Science.gov (United States)

    Murray, Alexander; Shahabuddin, Mohammad

    2013-01-01

    The Aquarius mission involves a combined radiometer and radar instrument in low-Earth orbit, providing monthly global maps of Sea Surface Salinity. Operating successfully in orbit since June 2011, the spacecraft bus was furnished by the Argentine space agency, Comision Nacional de Actividades Espaciales (CONAE). The instrument, built jointly by NASA's Caltech/JPL and Goddard Space Flight Center, has been successfully producing expectation-exceeding data since it was powered on in August of 2011. In addition to the radiometer and scatterometer, the instrument contains a command & data-handling subsystem with a computer and flight software (FSW) that is responsible for managing the instrument, its operation, and its data. Aquarius' FSW is conceived and architected as a Component-based system, in which the running software consists of a set of Components, each playing a distinctive role in the subsystem, instantiated and connected together at runtime. Component architectures feature a well-defined set of interfaces between the Components, visible and analyzable at the architectural level (see [1]). As we will describe, this kind of architecture offers significant advantages over more traditional FSW architectures, which often feature a monolithic runtime structure. Component-based software is enabled by Object-Oriented (OO) techniques and languages, the use of which again is not typical in space mission FSW. We will argue in this paper that the use of OO design methods and tools (especially the Unified Modeling Language), as well as the judicious usage of C++, are very well suited to FSW applications, and we will present Aquarius FSW, describing our methods, processes, and design, as a successful case in point.
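    The runtime-assembled component idea can be sketched as follows. This is a hedged illustration in Python rather than the mission's C++ code: components expose named ports and are instantiated and wired together at startup; the component and port names are invented.

```python
# Illustrative runtime component wiring: components expose named output
# ports, and the system is assembled by connecting ports at startup.

class Component:
    def __init__(self, name):
        self.name = name
        self._outputs = {}        # output port -> list of (component, port)

    def connect(self, out_port, target, in_port):
        self._outputs.setdefault(out_port, []).append((target, in_port))

    def emit(self, out_port, data):
        for target, in_port in self._outputs.get(out_port, []):
            target.receive(in_port, data)

    def receive(self, in_port, data):
        raise NotImplementedError

class Radiometer(Component):
    def tick(self):
        self.emit("science", {"brightness_K": 112.7})  # fake measurement

class Telemetry(Component):
    def receive(self, in_port, data):
        print(f"{self.name} <- {in_port}: {data}")

radiometer = Radiometer("radiometer")
telemetry = Telemetry("telemetry")
radiometer.connect("science", telemetry, "downlink")   # wiring at runtime
radiometer.tick()
```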

  5. Towards a Complete Model for Software Component Deployment on Heterogeneous Platform

    Directory of Open Access Journals (Sweden)

    Švogor Ivan

    2014-12-01

    Full Text Available This report briefly describes ongoing research related to the optimization of allocating software components to a heterogeneous computing platform (which includes CPU, GPU and FPGA). The research goal is also presented, along with current hot topics of the research area, related research teams, and finally the results and contribution of my research. It involves mathematical modelling which results in a goal function, an optimization method which finds a suboptimal solution to the goal function, and a software modeling tool which enables graphical representation of the problem at hand and helps developers determine component placement in the system design phase.
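    The goal-function formulation can be illustrated with a toy allocation problem: assign each software component to one processing unit so that a cost-based goal function is minimized. The costs, component names, and brute-force solver below are stand-ins; the report's actual goal function and (sub)optimal search method are not shown.

```python
# Toy component-to-unit allocation: minimize total execution cost over
# all assignments of components to CPU/GPU/FPGA. All numbers invented.

from itertools import product

units = ["CPU", "GPU", "FPGA"]
cost = {
    "decoder": {"CPU": 9.0, "GPU": 3.0, "FPGA": 2.5},
    "planner": {"CPU": 2.0, "GPU": 6.0, "FPGA": 7.0},
    "filter":  {"CPU": 5.0, "GPU": 1.5, "FPGA": 2.0},
}

def goal(assignment):
    # Simple additive goal function over per-unit execution costs.
    return sum(cost[c][u] for c, u in assignment.items())

best = min(
    (dict(zip(cost, combo)) for combo in product(units, repeat=len(cost))),
    key=goal,
)
print(best, goal(best))
```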

  6. Using MDA for integration of heterogeneous components in software supply chains

    NARCIS (Netherlands)

    Hartmann, Johan Herman; Keren, Mila; Matsinger, Aart; Rubin, Julia; Trew, Tim; Yatzkar-Haham, Tali

    2013-01-01

    Software product lines are increasingly built using components from specialized suppliers. A company that is in the middle of a supply chain has to integrate components from its suppliers and offer (partially configured) products to its customers. To satisfy both the variability required by each

  7. CARDS: A blueprint and environment for domain-specific software reuse

    Science.gov (United States)

    Wallnau, Kurt C.; Solderitsch, Anne Costa; Smotherman, Catherine

    1992-01-01

    CARDS (Central Archive for Reusable Defense Software) exploits advances in domain analysis and domain modeling to identify, specify, develop, archive, retrieve, understand, and reuse domain-specific software components. An important element of CARDS is to provide visibility into the domain model artifacts produced by, and services provided by, commercial computer-aided software engineering (CASE) technology. The use of commercial CASE technology is important to provide rich, robust support for the varied roles involved in a reuse process. We refer to this kind of use of knowledge representation systems as supporting 'knowledge-based integration.'

  8. Management of Globally Distributed Component-Based Software Development Projects

    NARCIS (Netherlands)

    J. Kotlarsky (Julia)

    2005-01-01

    textabstractGlobally Distributed Component-Based Development (GD CBD) is expected to become a promising area, as increasing numbers of companies are setting up software development in a globally distributed environment and at the same time are adopting CBD methodologies. Being an emerging area, the

  9. RAGE Reusable Game Software Components and Their Integration into Serious Game Engines

    NARCIS (Netherlands)

    Van der Vegt, Wim; Nyamsuren, Enkhbold; Westera, Wim

    2016-01-01

    This paper presents and validates a methodology for integrating reusable software components in diverse game engines. While conforming to the RAGE component-based architecture described elsewhere, the paper explains how the interactions and data exchange processes between a reusable software

  10. LEGOS: Object-based software components for mission-critical systems. Final report, June 1, 1995--December 31, 1997

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-08-01

    An estimated 85% of the installed base of software is a custom application with a production quantity of one. In practice, almost 100% of military software systems are custom software. Paradoxically, the marginal costs of producing additional units are near zero. So why hasn't the software market, a market with high design costs and low production costs, evolved like other similar custom widget industries, such as automobiles and hardware chips? The military software industry seems immune to the market pressures that have motivated a multilevel supply chain structure in other widget industries: design cost recovery, improved quality through specialization, and rapid assembly from purchased components. The primary goal of the ComponentWare Consortium (CWC) technology plan was to overcome barriers to building and deploying mission-critical information systems by using verified, reusable software components (ComponentWare). The adoption of the ComponentWare infrastructure is predicated upon a critical mass of the leading platform vendors' inevitable adoption of emerging, object-based, distributed computing frameworks--initially CORBA and COM/OLE. The long-range goal of this work is to build and deploy military systems from verified reusable architectures. The promise of component-based applications is to enable developers to snap together new applications by mixing and matching prefabricated software components. A key result of this effort is the concept of reusable software architectures. A second important contribution is the notion that a software architecture is something that can be captured in a formal language and reused across multiple applications. The formalization and reuse of software architectures provide major cost and schedule improvements. The Unified Modeling Language (UML) is fast becoming the industry standard for object-oriented analysis and design notation for object-based systems. However, the lack of a standard real-time distributed

  11. Experiments with a novel content-based image retrieval software: can we eliminate classification systems in adolescent idiopathic scoliosis?

    Science.gov (United States)

    Menon, K Venugopal; Kumar, Dinesh; Thomas, Tessamma

    2014-02-01

    Study Design Preliminary evaluation of new tool. Objective To ascertain whether the newly developed content-based image retrieval (CBIR) software can be used successfully to retrieve images of similar cases of adolescent idiopathic scoliosis (AIS) from a database to help plan treatment without adhering to a classification scheme. Methods Sixty-two operated cases of AIS were entered into the newly developed CBIR database. Five new cases of different curve patterns were used as query images. The images were fed into the CBIR database that retrieved similar images from the existing cases. These were analyzed by a senior surgeon for conformity to the query image. Results Within the limits of variability set for the query system, all the resultant images conformed to the query image. One case had no similar match in the series. The other four retrieved several images that were matching with the query. No matching case was left out in the series. The postoperative images were then analyzed to check for surgical strategies. Broad guidelines for treatment could be derived from the results. More precise query settings, inclusion of bending films, and a larger database will enhance accurate retrieval and better decision making. Conclusion The CBIR system is an effective tool for accurate documentation and retrieval of scoliosis images. Broad guidelines for surgical strategies can be made from the postoperative images of the existing cases without adhering to any classification scheme.

  12. Safety prediction for basic components of safety-critical software based on static testing

    International Nuclear Information System (INIS)

    Son, H.S.; Seong, P.H.

    2000-01-01

    The purpose of this work is to develop a safety prediction method, with which we can predict the risk of software components based on static testing results at the early development stage. The predictive model combines the major factor with the quality factor for the components, which are calculated based on the measures proposed in this work. The application to a safety-critical software system demonstrates the feasibility of the safety prediction method. (authors)

  13. Effect of component design in retrieved bipolar hip hemiarthroplasty systems.

    Science.gov (United States)

    Hess, Matthew D; Baker, Erin A; Salisbury, Meagan R; Kaplan, Lige M; Greene, Ryan T; Greene, Perry W

    2013-09-01

    Primary articulation of bipolar hemiarthroplasty systems is at the femoral head-liner interface. The purpose of this study was to compare observed damage modes on 36 retrieved bipolar systems with implant, demographic, intraoperative, and radiographic data to elucidate the effects of component design, specifically locking mechanism, on clinical performance. Retrieved bipolar hip hemiarthroplasty systems of 3 different design types were obtained, disassembled, and evaluated macro- and microscopically for varying modes of wear, including abrasion, burnishing, embedding, scratching, and pitting. Clinical record review and radiographic analysis were performed by a senior orthopedic surgery resident. Average bipolar hip hemiarthroplasty system term of service was 46 months (range, 0.27-187 months). All devices contained wear debris captured within the articulating space between the femoral head and liner. In 31% of patients without infection, lucency was observed on immediate prerevision radiographs. The system with a leaf locking mechanism showed significantly increased radiographically observed osteolysis (P=.03) compared with a system with a stopper ring locking mechanism. In addition, implant design and observed damage modes, including pitting and third-body particle embedding, were significantly associated with radiographically observed osteolysis. Copyright 2013, SLACK Incorporated.

  14. Mobile Application Development: Component Retrieval System

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of this project was to investigate requirements to develop an innovative mobile application to retrieve components’ detailed information from the Stennis...

  15. Safety prediction for basic components of safety critical software based on static testing

    International Nuclear Information System (INIS)

    Son, H.S.; Seong, P.H.

    2001-01-01

    The purpose of this work is to develop a safety prediction method, with which we can predict the risk of software components based on static testing results at the early development stage. The predictive model combines the major factor with the quality factor for the components, both of which are calculated based on the measures proposed in this work. The application to a safety-critical software system demonstrates the feasibility of the safety prediction method. (authors)

  16. Proceedings International Workshop on Formal Engineering approaches to Software Components and Architectures

    OpenAIRE

    Kofroň, Jan; Tumova, Jana

    2017-01-01

    These are the proceedings of the 14th International Workshop on Formal Engineering approaches to Software Components and Architectures (FESCA). The workshop was held on April 22, 2017 in Uppsala (Sweden) as a satellite event to the European Joint Conference on Theory and Practice of Software (ETAPS'17). The aim of the FESCA workshop is to bring together junior researchers from formal methods, software engineering, and industry interested in the development and application of formal modelling ...

  17. Operating Experience of Digital, Software-based Components Used in I and C and Electrical Systems in German NPPs

    International Nuclear Information System (INIS)

    Blum, Stefanie; Lochthofen, Andre; Quester, Claudia; Arians, Robert

    2015-01-01

    In recent years, many components in instrumentation and control (I and C) and electrical systems of nuclear power plants (NPPs) were replaced by digital, software-based components. Due to the more complex structure, software-based I and C and electrical components show the potential for new failure mechanisms and an increasing number of failure possibilities, including the potential for common cause failures. An evaluation of the operating experience of digital, software-based components may help to determine new failure modes of these components. In this paper, we give an overview over the results of the evaluation of the operating experience of digital, software-based components used in I and C and electrical systems in NPPs in Germany. (authors)

  18. The RAGE Game Software Components Repository for Supporting Applied Game Development

    Directory of Open Access Journals (Sweden)

    Krassen Stefanov

    2017-09-01

    Full Text Available This paper presents the architecture of the RAGE repository, which is a unique and dedicated infrastructure that provides access to a wide variety of advanced technology components for applied game development. The RAGE project, which is the principal Horizon2020 research and innovation project on applied gaming, develops up to three dozen software components (RAGE software assets) that are reusable across a wide diversity of game engines, game platforms and programming languages. The RAGE repository provides storage space for assets and their artefacts and is designed as an asset life-cycle management system for defining, publishing, updating, searching and packaging for distribution of these assets. It will be embedded in a social platform for asset developers and other users. A dedicated Asset Repository Manager provides the main functionality of the repository and its integration with other systems. Tools supporting the Asset Manager are presented and discussed. When the RAGE repository is in full operation, applied game developers will be able to easily enhance the quality of their games by including selected advanced game software assets. Making available the RAGE repository system and its variety of software assets aims to enhance the coherence and decisiveness of the applied game industry.

  19. Retrieval of Aerosol Components Using Multi-Wavelength Mie-Raman Lidar and Comparison with Ground Aerosol Sampling

    Directory of Open Access Journals (Sweden)

    Yukari Hara

    2018-06-01

    Full Text Available We verified an algorithm using multi-wavelength Mie-Raman lidar (MMRL) observations to retrieve four aerosol components (black carbon (BC), sea salt (SS), air pollution (AP), and mineral dust (DS)) with in-situ aerosol measurements, and determined the seasonal variation of aerosol components in Fukuoka, in the western region of Japan. PM2.5, PM10, and mass concentrations of BC and SS components are derived from in-situ measurements. MMRL provides the aerosol extinction coefficient (α), particle linear depolarization ratio (δ), backscatter coefficient (β), and lidar ratio (S) at 355 and 532 nm, and the attenuated backscatter coefficient (βatt) at 1064 nm. We retrieved vertical distributions of extinction coefficients at 532 nm for four aerosol components (BC, SS, AP, and DS) using 1α532 + 1β532 + 1βatt,1064 + 1δ532 data of MMRL. The retrieved extinction coefficients of the four aerosol components at 532 nm were converted to mass concentrations using the theoretically computed conversion factor, assuming the prescribed size distribution, particle shape, and refractive index for each aerosol component. MMRL and in-situ measurements confirmed that seasonal variation of aerosol optical properties was affected by internal/external mixing of various aerosol components, in addition to hygroscopic growth of water-soluble aerosols. MMRL overestimates BC mass concentration compared to in-situ observation using the pure BC model. This overestimation was reduced drastically by introducing the internal mixture model of BC and water-soluble substances (Core-Gray Shell (CGS) model). This result suggests that considering the internal mixture of BC and water-soluble substances is essential for evaluating BC mass concentration in this area. Systematic overestimation of BC mass concentration was found during summer, even when we applied the CGS model. The observational facts based on in-situ and MMRL measurements suggested that misclassification of AP as CGS particles was
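    The final conversion step described above, extinction to mass via precomputed conversion factors, can be sketched as follows. All numbers are placeholders; in the paper the factors follow from the assumed size distribution, particle shape, and refractive index of each component.

```python
# Toy conversion of retrieved component extinction coefficients at 532 nm
# into mass concentrations. Conversion factors are hypothetical stand-ins.

EXT_TO_MASS = {          # (ug/m^3) per (1/Mm), invented values
    "BC": 0.13, "SS": 1.9, "AP": 0.55, "DS": 1.2,
}

extinction_532 = {"BC": 5.0, "SS": 20.0, "AP": 40.0, "DS": 15.0}  # 1/Mm

mass = {k: extinction_532[k] * EXT_TO_MASS[k] for k in EXT_TO_MASS}
print(mass)   # mass concentration per aerosol component, ug/m^3
```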

  20. DOOCS patterns, reusable software components for FPGA based RF GUN field controller

    Energy Technology Data Exchange (ETDEWEB)

    Pucyk, P. [Institute of Electronic Systems, Warsaw (Poland)

    2006-07-01

    Modern accelerator technology combines software and hardware solutions to provide distributed, high efficiency digital systems for High Energy Physics experiments. Providing flexible, maintainable software is crucial for ensuring high availability of the whole system. In order to fulfil all these requirements, appropriate design and development techniques have to be used. Software patterns are a well-known solution for common programming issues, providing proven development paradigms which can help to avoid many design issues. DOOCS patterns introduce new concepts of reusable software components for control system algorithm development and implementation in the DOOCS framework. Chosen patterns are described and usage examples presented in this paper. (orig.)

  1. DOOCS patterns, reusable software components for FPGA based RF GUN field controller

    International Nuclear Information System (INIS)

    Pucyk, P.

    2006-01-01

    Modern accelerator technology combines software and hardware solutions to provide distributed, high efficiency digital systems for High Energy Physics experiments. Providing flexible, maintainable software is crucial for ensuring high availability of the whole system. In order to fulfil all these requirements, appropriate design and development techniques have to be used. Software patterns are a well-known solution for common programming issues, providing proven development paradigms which can help to avoid many design issues. DOOCS patterns introduce new concepts of reusable software components for control system algorithm development and implementation in the DOOCS framework. Chosen patterns are described and usage examples presented in this paper. (orig.)

  2. Development of geophysical and geochemical data processing software based on component GIS

    International Nuclear Information System (INIS)

    Ke Dan; Yu Xiang; Wu Qubo; Han Shaoyang; Li Xi

    2013-01-01

    Based on component GIS and mixed programming techniques, software is designed and developed which combines basic GIS functions with conventional and unconventional data-processing methods for regional geophysical and geochemical data. The software has many advantages, such as a friendly interface, ease of use, and utility functions, and provides a useful platform for regional geophysical and geochemical data processing. (authors)

  3. Center for Technology for Advanced Scientific Component Software (TASCS)

    Energy Technology Data Exchange (ETDEWEB)

    Kostadin, Damevski [Virginia State Univ., Petersburg, VA (United States)

    2015-01-25

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  4. A configurable component-based software system for magnetic field measurements

    Energy Technology Data Exchange (ETDEWEB)

    Nogiec, J.M.; DiMarco, J.; Kotelnikov, S.; Trombly-Freytag, K.; Walbridge, D.; Tartaglia, M.; /Fermilab

    2005-09-01

    A new software system to test accelerator magnets has been developed at Fermilab. The magnetic measurement technique involved employs a single stretched wire to measure alignment parameters and magnetic field strength. The software for the system is built on top of a flexible component-based framework, which allows for easy reconfiguration and runtime modification. Various user interface, data acquisition, analysis, and data persistence components can be configured to form different measurement systems that are tailored to specific requirements (e.g., involving magnet type or test stand). The system can also be configured with various measurement sequences or tests, each of them controlled by a dedicated script. It is capable of working interactively as well as executing a preselected sequence of tests. Each test can be parameterized to fit the specific magnet type or test stand requirements. The system has been designed with portability in mind and is capable of working on various platforms, such as Linux, Solaris, and Windows. It can be configured to use a local data acquisition subsystem or a remote data acquisition computer, such as a VME processor running VxWorks. All hardware-oriented components have been developed with a simulation option that allows for running and testing measurements in the absence of data acquisition hardware.
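    Two ideas highlighted in this record, configuration-driven assembly and a simulation option for hardware-oriented components, can be sketched as below. This is not Fermilab's code; the class names, configuration keys, and simulated field values are invented.

```python
# Illustrative configurable measurement component with a simulation mode,
# so measurements can be exercised without data acquisition hardware.

import random

class StretchedWireDAQ:
    def __init__(self, simulate=False):
        self.simulate = simulate

    def read_field(self):
        if self.simulate:
            return random.gauss(1.0, 0.01)   # fake field strength (tesla)
        raise RuntimeError("real DAQ hardware not available in this sketch")

class Analysis:
    def run(self, samples):
        return sum(samples) / len(samples)

# A "measurement system" assembled from a configuration description.
CONFIG = {"daq": {"simulate": True}, "samples": 100}

daq = StretchedWireDAQ(**CONFIG["daq"])
mean_field = Analysis().run([daq.read_field() for _ in range(CONFIG["samples"])])
print(f"mean field: {mean_field:.4f} T")
```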

  5. Surgical planning of total hip arthroplasty: accuracy of computer-assisted EndoMap software in predicting component size

    International Nuclear Information System (INIS)

    Davila, Jesse A.; Kransdorf, Mark J.; Duffy, Gavan P.

    2006-01-01

    The purpose of our study was to assess the accuracy of computer-assisted templating in the surgical planning of patients undergoing total hip arthroplasty utilizing EndoMap software (Siemens AG, Medical Solutions, Erlangen, Germany). EndoMap software is an electronic program that uses DICOM images to analyze standard anteroposterior radiographs for determination of optimal prosthesis component size. We retrospectively reviewed the preoperative radiographs of 36 patients undergoing uncomplicated primary total hip arthroplasty, utilizing EndoMap software, Version VA20. DICOM anteroposterior radiographs were analyzed using standard manufacturer-supplied electronic templates to determine acetabular and femoral component sizes. No additional clinical information was reviewed. Acetabular and femoral component sizes were assessed by an orthopedic surgeon and two radiologists. The mean estimated component size was compared with the component size documented in operative reports. The mean estimated acetabular component size was 53 mm (range 48-60 mm), 1 mm larger than the mean implanted size of 52 mm (range 48-62 mm). Thirty-one of 36 acetabular component sizes (86%) were accurate within one size. The mean calculated femoral component size was 4 (range 2-7), one size smaller than the actual mean component size of 5 (range 2-9). Twenty-six of 36 femoral component sizes (72%) were accurate within one size, and accurate within two sizes in all but four cases (94%). EndoMap software predicted femoral component size well, with 72% within one component size of that used, and 94% within two sizes. Acetabular component size was predicted slightly better, with 86% within one component size and 94% within two component sizes. (orig.)

  6. A Sociotechnical Negotiation Mechanism to Support Component Markets in Software Ecosystems

    Directory of Open Access Journals (Sweden)

    Rodrigo Santos

    2017-12-01

    Full Text Available Organizations have opened up their software platforms and reusable assets to others, including partners and third-party developers around the world, creating software ecosystems (SECOs). This perspective can contribute to minimizing nontechnical barriers to software reuse in industry because it explores potential benefits from the relations among companies and stakeholders. An inhibitor is the complexity of defining value for reusable assets in a scenario where producers try to meet customers' expectations, and vice versa. In this paper, we present a value-based mechanism to support component negotiation and socialization processes in a reuse repository in the SECO context as an extension of the Brechó-EcoSys environment. Social resources were integrated into the mechanism in order to aid component negotiation. An evaluation of the negotiation mechanism was initially performed based on an analysis of its elements and functions against critical factors in negotiation within a SECO, identified in a previous systematic literature review. In addition, an analysis of the social resources supporting the negotiation mechanism was performed against popular sociotechnical elements for SECOs, identified in a previous survey with experts in the field. Finally, the negotiation process and the potential support provided by sociotechnical resources were investigated through an observational study where participants were engaged in tasks playing as consumers and producers using the sociotechnical negotiation mechanism in the Brechó-EcoSys environment. We concluded that sociotechnical resources (e.g., forum and tag cloud) support component producers and consumers with useful information from the SECO community.

  7. Linearization of the Principal Component Analysis method for radiative transfer acceleration: Application to retrieval algorithms and sensitivity studies

    International Nuclear Information System (INIS)

    Spurr, R.; Natraj, V.; Lerot, C.; Van Roozendael, M.; Loyola, D.

    2013-01-01

    Principal Component Analysis (PCA) is a promising tool for enhancing radiative transfer (RT) performance. When applied to binned optical property data sets, PCA exploits redundancy in the optical data, and restricts the number of full multiple-scatter calculations to those optical states corresponding to the most important principal components, while still maintaining high accuracy in the radiance approximations. We show that the entire PCA RT enhancement process is analytically differentiable with respect to any atmospheric or surface parameter, thus allowing for accurate and fast approximations of Jacobian matrices, in addition to radiances. This linearization greatly extends the power and scope of the PCA method to many remote sensing retrieval applications and sensitivity studies. In the first example, we examine accuracy for PCA-derived UV-backscatter radiance and Jacobian fields over a 290–340 nm window. In a second application, we show that performance for UV-based total ozone column retrieval is considerably improved without compromising the accuracy. -- Highlights: •Principal Component Analysis (PCA) of spectrally-binned atmospheric optical properties. •PCA-based accelerated radiative transfer with 2-stream model for fast multiple-scatter. •Atmospheric and surface property linearization of this PCA performance enhancement. •Accuracy of PCA enhancement for radiances and bulk-property Jacobians, 290–340 nm. •Application of PCA speed enhancement to UV backscatter total ozone retrievals
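    A rough numerical sketch of the core PCA idea: optical profiles in a spectral bin are decomposed into principal components, the expensive multiple-scatter model is evaluated only for the mean state and perturbations along the leading components, and radiances elsewhere are approximated from the PC scores. The data and the "expensive model" below are stand-ins, not the authors' radiative transfer code, and the simple first-order expansion here is only an analogue of their correction scheme.

```python
# PCA-accelerated evaluation of an expensive function over many optical
# states: full calculations only at the mean state and leading EOFs.

import numpy as np

rng = np.random.default_rng(0)
profiles = rng.normal(size=(200, 30))      # 200 spectral points x 30 layers

mean = profiles.mean(axis=0)
anomalies = profiles - mean
_, s, vt = np.linalg.svd(anomalies, full_matrices=False)

k = 4                                       # leading components kept
scores = anomalies @ vt[:k].T               # PC scores per spectral point

def expensive_model(x):
    # Placeholder for a full multiple-scatter radiance computation.
    return np.tanh(x).sum()

# Full model evaluated only at the mean state and +/- each leading EOF...
base = expensive_model(mean)
grad = np.array([(expensive_model(mean + vt[i]) -
                  expensive_model(mean - vt[i])) / 2.0 for i in range(k)])

# ...then radiances at all 200 points are approximated from the PC scores.
radiance_approx = base + scores @ grad
print(radiance_approx.shape)                # one approximation per point
```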

  8. Recover the story of a component or the determination of the welding residual stresses

    International Nuclear Information System (INIS)

    Genette, P.; Dupas, Ph.; Waeckel, F.

    1998-01-01

    Mechanical components in nuclear power plants keep track of the welding processes they underwent before entering service. The memory of these past events can postpone or enhance possible damage phenomena in these components. Nowadays, numerical simulation software such as Code ASTER makes it possible to reproduce these welding processes numerically so that their mechanical consequences can be retrieved. (authors)

  9. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae; Top, Søren

    2008-01-01

    , communication and constraints, using computational blocks and aggregates for both discrete and continuous behaviour, different interconnection and execution disciplines for event-based and time-based controllers, and so on, to encompass the demands to more functionality, at even lower prices, and with opposite...... to be analyzed. One way of doing that is to integrate in wrapper files the model back into Simulink S-functions, and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set...... of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behaviour, and the transformation of the software system into the S-functions. The general aim of this work is the improvement of multi-disciplinary development of embedded systems with the focus on the relation...

  10. The GRAPE aerosol retrieval algorithm

    Directory of Open Access Journals (Sweden)

    G. E. Thomas

    2009-11-01

    Full Text Available The aerosol component of the Oxford-Rutherford Aerosol and Cloud (ORAC) combined cloud and aerosol retrieval scheme is described and the theoretical performance of the algorithm is analysed. ORAC is an optimal estimation retrieval scheme for deriving cloud and aerosol properties from measurements made by imaging satellite radiometers and, when applied to cloud-free radiances, provides estimates of aerosol optical depth at a wavelength of 550 nm, aerosol effective radius, and surface reflectance at 550 nm. The aerosol retrieval component of ORAC has several incarnations – this paper addresses the version which operates in conjunction with the cloud retrieval component of ORAC (described by Watts et al., 1998), as applied in producing the Global Retrieval of ATSR Cloud Parameters and Evaluation (GRAPE) data-set.

    The algorithm is described in detail and its performance examined. This includes a discussion of errors resulting from the formulation of the forward model, sensitivity of the retrieval to the measurements and a priori constraints, and errors resulting from assumptions made about the atmospheric/surface state.
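    The optimal estimation machinery underlying ORAC can be illustrated with a toy Gauss-Newton iteration that balances the measurement fit against the a priori constraint. The linear forward model, covariances, and state values below are invented; real retrievals use a nonlinear radiative transfer model and recompute the Jacobian at each step.

```python
# Toy optimal-estimation (Gauss-Newton) update: minimize
# (y - F(x))' Se^-1 (y - F(x)) + (x - xa)' Sa^-1 (x - xa).

import numpy as np

y = np.array([0.31, 0.22, 0.15])            # measured radiances (toy)
x_a = np.array([0.2, 1.0, 0.1])             # a priori state
S_a = np.diag([0.1, 0.5, 0.05]) ** 2        # a priori covariance
S_e = np.diag([0.01, 0.01, 0.01]) ** 2      # measurement-noise covariance
K = np.array([[0.8, 0.05, 0.3],             # Jacobian dF/dx (assumed fixed)
              [0.6, 0.10, 0.2],
              [0.4, 0.15, 0.1]])

def forward(x):
    return K @ x                             # stand-in forward model

x = x_a.copy()
for _ in range(10):                          # Gauss-Newton iterations
    Se_inv = np.linalg.inv(S_e)
    Sa_inv = np.linalg.inv(S_a)
    lhs = K.T @ Se_inv @ K + Sa_inv
    rhs = K.T @ Se_inv @ (y - forward(x)) + Sa_inv @ (x_a - x)
    x = x + np.linalg.solve(lhs, rhs)

print("retrieved state:", x)
```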

  11. Proceedings 10th International Workshop on Formal Engineering Approaches to Software Components and Architectures

    OpenAIRE

    Buhnova, Barbora; Happe, Lucia; Kofroň, Jan

    2013-01-01

    These are the proceedings of the 10th International Workshop on Formal Engineering approaches to Software Components and Architectures (FESCA). The workshop was held on March 23, 2013 in Rome (Italy) as a satellite event to the European Joint Conference on Theory and Practice of Software (ETAPS'13). The aim of the FESCA workshop is to bring together both young and senior researchers from formal methods, software engineering, and industry interested in the development and application of formal...

  12. Uniframe: A Unified Framework for Developing Service-Oriented, Component-Based Distributed Software Systems

    National Research Council Canada - National Science Library

    Raje, Rajeev R; Olson, Andrew M; Bryant, Barrett R; Burt, Carol C; Auguston, Makhail

    2005-01-01

    .... It describes how this approach employs a unifying framework for specifying such systems to unite the concepts of service-oriented architectures, a component-based software engineering methodology...

  13. In Search of the Philosopher's Stone: Simulation Composability Versus Component-Based Software Design

    National Research Council Canada - National Science Library

    Bartholet, Robert G; Brogan, David C; Reynolds, Jr., Paul F; Carnahan, Joseph C

    2004-01-01

    The simulation community and the software engineering community are actively conducting research on technology that will make it possible to easily build complex systems by combining existing components...

  14. A Bayesian Retrieval of Greenland Ice Sheet Internal Temperature from Ultra-wideband Software-defined Microwave Radiometer (UWBRAD) Measurements

    Science.gov (United States)

    Duan, Y.; Durand, M. T.; Jezek, K. C.; Yardim, C.; Bringer, A.; Aksoy, M.; Johnson, J. T.

    2017-12-01

    The ultra-wideband software-defined microwave radiometer (UWBRAD) is designed to provide an ice sheet internal temperature product by measuring low-frequency microwave emission. Twelve channels ranging from 0.5 to 2.0 GHz are covered by the instrument. A Greenland airborne campaign in September 2016 provided the first demonstration of ultra-wideband radiometer observations of geophysical scenes, including ice sheets. Another flight is planned for September 2017 to acquire measurements over the central ice sheet. A Bayesian framework is designed to retrieve the ice sheet internal temperature from simulated UWBRAD brightness temperature (Tb) measurements over the Greenland flight path with limited prior information about the ground. A 1-D heat-flow model, the Robin model, was used to model the ice sheet internal temperature profile from ground information. Synthetic UWBRAD Tb observations were generated via the partially coherent radiation transfer model, which utilizes the Robin model temperature profile and an exponential fit of ice density from borehole measurements as input, and were corrupted with noise. The effective surface temperature, the geothermal heat flux, the variance of upper-layer ice density, and the variance of fine-scale density variation in the deeper ice sheet were treated as unknown variables within the retrieval framework. Each parameter is defined over its possible range and set to be uniformly distributed. The Markov Chain Monte Carlo (MCMC) approach is applied to make the unknown parameters randomly walk in the parameter space. We investigate whether the variables can be improved over their priors using the MCMC approach and contribute to the temperature retrieval theoretically. UWBRAD measurements acquired near Camp Century in 2016 were also processed with the MCMC approach to examine the framework in the presence of scattering effects. The fine-scale density fluctuation is an important parameter: it is the most sensitive yet most poorly known parameter in the estimation framework.
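    The Bayesian machinery the abstract describes can be sketched with a Metropolis random walk over parameters with uniform priors, accepting states that better explain the observed brightness temperatures. The forward model below is a toy stand-in for the ice sheet emission model, and all bounds and noise levels are illustrative.

```python
# Metropolis random-walk MCMC over bounded (uniform-prior) parameters,
# fit to noisy 12-channel "brightness temperatures". All values are toys.

import numpy as np

rng = np.random.default_rng(1)

def forward_model(params):
    surface_temp, heat_flux = params
    freqs = np.linspace(0.5, 2.0, 12)       # 0.5-2.0 GHz channels
    return surface_temp - 20.0 * heat_flux * freqs

bounds = np.array([[230.0, 270.0],          # surface temperature (K)
                   [0.3, 1.5]])             # heat flux (scaled, invented)

truth = np.array([252.0, 0.8])
observed = forward_model(truth) + rng.normal(0.0, 1.0, 12)

def log_posterior(params):
    if np.any(params < bounds[:, 0]) or np.any(params > bounds[:, 1]):
        return -np.inf                       # uniform prior: zero outside
    resid = observed - forward_model(params)
    return -0.5 * np.sum(resid ** 2)         # Gaussian noise, sigma = 1 K

current = bounds.mean(axis=1)
samples = []
for _ in range(20000):
    proposal = current + rng.normal(0.0, [0.5, 0.02])
    if np.log(rng.random()) < log_posterior(proposal) - log_posterior(current):
        current = proposal                   # Metropolis accept step
    samples.append(current)

print("posterior mean:", np.mean(samples[5000:], axis=0))
```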

  15. Analysis of Carbon Fiber Reinforced PEEK Hinge Mechanism Articulation Components in a Rotating Hinge Knee Design: A Comparison of In Vitro and Retrieval Findings.

    Science.gov (United States)

    Schierjott, Ronja A; Giurea, Alexander; Neuhaus, Hans-Joachim; Schwiesau, Jens; Pfaff, Andreas M; Utzschneider, Sandra; Tozzi, Gianluca; Grupp, Thomas M

    2016-01-01

    Carbon fiber reinforced poly-ether-ether-ketone (CFR-PEEK) represents a promising alternative material for bushings in total knee replacements, after early clinical failures of polyethylene in this application. The objective of the present study was to evaluate the damage modes and the extent of damage observed on CFR-PEEK hinge mechanism articulation components after in vivo service in a rotating hinge knee (RHK) system and to compare the results with corresponding components subjected to in vitro wear tests. The key question was whether there were any similarities or differences between in vivo and in vitro damage characteristics. Twelve retrieved RHK systems after an average of 34.9 months in vivo underwent wear damage analysis with a focus on the four integrated CFR-PEEK components and distinction between different damage modes and classification with a scoring system. The analysis included visual examination, scanning electron microscopy, and energy dispersive X-ray spectroscopy, as well as surface roughness and profile measurements. The main wear damage modes were comparable between retrieved and in vitro specimens (n = 3), whereby the size of the affected area on the retrieved components showed a higher variation. Overall, the retrieved specimens appeared to be slightly more heavily damaged, which was probably attributable to the more complex loading and kinematic conditions in vivo.

  16. Analysis of Carbon Fiber Reinforced PEEK Hinge Mechanism Articulation Components in a Rotating Hinge Knee Design: A Comparison of In Vitro and Retrieval Findings

    Directory of Open Access Journals (Sweden)

    Ronja A. Schierjott

    2016-01-01

    Full Text Available Carbon fiber reinforced poly-ether-ether-ketone (CFR-PEEK) represents a promising alternative material for bushings in total knee replacements, after early clinical failures of polyethylene in this application. The objective of the present study was to evaluate the damage modes and the extent of damage observed on CFR-PEEK hinge mechanism articulation components after in vivo service in a rotating hinge knee (RHK) system and to compare the results with corresponding components subjected to in vitro wear tests. The key question was whether there were any similarities or differences between in vivo and in vitro damage characteristics. Twelve retrieved RHK systems after an average of 34.9 months in vivo underwent wear damage analysis with a focus on the four integrated CFR-PEEK components and distinction between different damage modes and classification with a scoring system. The analysis included visual examination, scanning electron microscopy, and energy dispersive X-ray spectroscopy, as well as surface roughness and profile measurements. The main wear damage modes were comparable between retrieved and in vitro specimens (n = 3), whereby the size of the affected area on the retrieved components showed a higher variation. Overall, the retrieved specimens appeared to be slightly more heavily damaged, which was probably attributable to the more complex loading and kinematic conditions in vivo.

  17. Characterizing and Modeling the Cost of Rework in a Library of Reusable Software Components

    Science.gov (United States)

    Basili, Victor R.; Condon, Steven E.; ElEmam, Khaled; Hendrick, Robert B.; Melo, Walcelio

    1997-01-01

    In this paper we characterize and model the cost of rework in a Component Factory (CF) organization. A CF is responsible for developing and packaging reusable software components. Data was collected on corrective maintenance activities for the Generalized Support Software reuse asset library located at the Flight Dynamics Division of NASA's GSFC. We then constructed a predictive model of the cost of rework using the C4.5 system for generating a logical classification model. The predictor variables for the model are measures of internal software product attributes. The model demonstrates good prediction accuracy, and can be used by managers to allocate resources for corrective maintenance activities. Furthermore, we used the model to generate proscriptive coding guidelines to improve programming practices so that the cost of rework can be reduced in the future. The general approach we have used is applicable to other environments.
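    An illustrative stand-in for this modeling step: train a decision-tree classifier on internal product metrics to predict whether a component's rework will be costly. scikit-learn's CART tree substitutes here for the C4.5 system named in the abstract, and the metric values are fabricated for demonstration.

```python
# Decision-tree classification of rework cost from product metrics.
# CART (scikit-learn) stands in for C4.5; all data is fabricated.

from sklearn.tree import DecisionTreeClassifier, export_text

# Features per component: [lines of code, cyclomatic complexity, fan-out]
X = [[120, 4, 2], [950, 18, 9], [300, 7, 3], [1400, 25, 12],
     [220, 5, 2], [800, 15, 8], [90, 3, 1], [1100, 21, 10]]
y = ["cheap", "costly", "cheap", "costly",
     "cheap", "costly", "cheap", "costly"]

model = DecisionTreeClassifier(max_depth=2).fit(X, y)
print(export_text(model, feature_names=["loc", "complexity", "fan_out"]))
print(model.predict([[500, 12, 6]]))   # predicted rework class
```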

  18. The ATLAS online High Level Trigger framework experience reusing offline software components in the ATLAS trigger

    CERN Document Server

    Wiedenmann, W

    2009-01-01

    Event selection in the ATLAS High Level Trigger is accomplished to a large extent by reusing software components and event selection algorithms developed and tested in an offline environment. Many of these offline software modules are not specifically designed to run in a heavily multi-threaded online data flow environment. The ATLAS High Level Trigger (HLT) framework, based on the Gaudi and ATLAS Athena frameworks, forms the interface layer which allows the execution of the HLT selection and monitoring code within the online run control and data flow software. While such an approach provides a unified environment for trigger event selection across all of ATLAS, it also poses strict requirements on the reused software components in terms of performance, memory usage and stability. Experience of running the HLT selection software in the different environments and especially on large multi-node trigger farms has been gained in several commissioning periods using preloaded Monte Carlo events, in data taking peri...

  19. An application of machine learning to the organization of institutional software repositories

    Science.gov (United States)

    Bailin, Sidney; Henderson, Scott; Truszkowski, Walt

    1993-01-01

    Software reuse has become a major goal in the development of space systems, as a recent NASA-wide workshop on the subject made clear. The Data Systems Technology Division of Goddard Space Flight Center has been working on tools and techniques for promoting reuse, in particular in the development of satellite ground support software. One of these tools is the Experiment in Libraries via Incremental Schemata and Cobweb (ElvisC). ElvisC applies machine learning to the problem of organizing a reusable software component library for efficient and reliable retrieval. In this paper we describe the background factors that have motivated this work, present the design of the system, and evaluate the results of its application.
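    A greatly simplified sketch of incremental library organization in the spirit of COBWEB-style conceptual clustering: each new component either joins the most similar existing cluster or starts a new one. The real COBWEB algorithm builds a concept hierarchy using category utility; this flat, threshold-based version is only illustrative.

```python
# Incremental clustering of components by feature similarity; a crude
# analogue of conceptual clustering for organizing a reuse library.

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 1.0

def add_component(clusters, features, threshold=0.4):
    best = max(clusters, key=lambda c: jaccard(c["profile"], features),
               default=None)
    if best and jaccard(best["profile"], features) >= threshold:
        best["members"].append(features)
        best["profile"] |= features          # broaden the cluster concept
    else:
        clusters.append({"profile": set(features), "members": [features]})

clusters = []
for comp in [{"telemetry", "parser"}, {"telemetry", "decoder"},
             {"orbit", "propagation"}]:
    add_component(clusters, set(comp))
print(len(clusters), "clusters")
```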

  20. Positive facial expressions during retrieval of self-defining memories.

    Science.gov (United States)

    Gandolphe, Marie Charlotte; Nandrino, Jean Louis; Delelis, Gérald; Ducro, Claire; Lavallee, Audrey; Saloppe, Xavier; Moustafa, Ahmed A; El Haj, Mohamad

    2017-11-14

    In this study, we investigated, for the first time, facial expressions during the retrieval of Self-defining memories (i.e., those vivid and emotionally intense memories of enduring concerns or unresolved conflicts). Participants self-rated the emotional valence of their Self-defining memories, and autobiographical retrieval was analyzed with facial analysis software. This software (FaceReader) synthesizes facial expression information (i.e., cheek, lip, and eyebrow muscles) to describe and categorize facial expressions (i.e., neutral, happy, sad, surprised, angry, scared, and disgusted facial expressions). We found that participants showed more emotional than neutral facial expressions during the retrieval of Self-defining memories. We also found that participants showed more positive than negative facial expressions during the retrieval of Self-defining memories. Interestingly, participants attributed positive valence to the retrieved memories. These findings are the first to demonstrate the consistency between facial expressions and the emotional subjective experience of Self-defining memories. These findings provide valuable physiological information about the emotional experience of the past.

  1. Development of software for computing forming information using a component based approach

    Directory of Open Access Journals (Sweden)

    Kwang Hee Ko

    2009-12-01

    Full Text Available In the shipbuilding industry, manufacturing technology has advanced at an unprecedented pace over the last decade. As a result, many automatic systems for cutting, welding, etc. have been developed and employed in the manufacturing process, and accordingly productivity has increased drastically. Despite such improvements in manufacturing technology, however, development of an automatic system for fabricating curved hull plates remains at an early stage, since the hardware and software for automating the curved hull fabrication process must be developed differently depending on the dimensions of the plates and the forming methods and manufacturing processes of each shipyard. To deal with this problem, it is necessary to create a “plug-in” framework which can adopt various kinds of hardware and software to construct a fully automatic fabrication system. In this paper, a framework for automatic fabrication of curved hull plates is proposed, which consists of four components and related software. In particular, the software module for computing fabrication information is developed using the ooCBD development methodology, which can interface with other hardware and software with minimum effort. Examples of the proposed framework applied to medium and large shipyards are presented.

  2. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR): Part 1, Overview of NUCLARR data retrieval: User's guide

    International Nuclear Information System (INIS)

    Gilmore, W.E.; Gentillon, C.D.; Gertman, D.I.; Beers, G.H.; Galyean, W.J.; Gilbert, B.G.

    1988-06-01

    The Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) is an automated database management system for processing and storing human error probability and hardware component failure data. The NUCLARR system software resides on an IBM (or compatible) personal microcomputer. NUCLARR can be used by the end user to furnish data inputs for both human and hardware reliability analysis in support of a variety of risk assessment activities. The NUCLARR system is documented in a five-volume series of reports. Volume IV of this series is the User's Guide for operating the NUCLARR software and is presented in three parts. This document, Part 1: Overview of NUCLARR Data Retrieval, provides an introductory overview of the system's capabilities and procedures for data retrieval. The methods and criteria for selecting data sources and entering them into the NUCLARR system are also described in this document.

  3. Computer software program for monitoring the availability of systems and components of electric power generating systems

    International Nuclear Information System (INIS)

    Petersen, T.A.; Hilsmeier, T.A.; Kapinus, D.M.

    1994-01-01

    As the availabilities of electric power generating station systems and components become more and more important from financial, personnel safety, and regulatory standpoints, it is evident that a comprehensive yet simple and user-friendly program for system and component tracking and monitoring is needed to assist in effectively managing the large volume of systems and components with their large numbers of associated maintenance/availability records. A user-friendly computer software program for system and component availability monitoring has been developed that calculates, displays, and monitors selected component and system availabilities. This is a Windows™-based program with a graphical user interface that utilizes a system flow diagram as the data input screen, which also provides a visual representation of availability values and limits for the individual components and associated systems. This program can be customized to the user's plant-specific system and component selections and configurations. As will be discussed herein, this software program is well suited for availability monitoring and ultimately provides valuable information for improving plant performance and reducing operating costs.
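    The core bookkeeping such a monitor performs can be sketched under the usual definition A = uptime / (uptime + downtime). The component names, hours, and availability limit below are invented for illustration.

```python
# Availability tracking sketch: compute per-component availability and
# flag anything that falls below a configured limit. All data invented.

def availability(uptime_hours, downtime_hours):
    total = uptime_hours + downtime_hours
    return uptime_hours / total if total else 1.0

components = {
    "feedwater_pump_A": (8400.0, 120.0),   # (uptime, downtime) in hours
    "feedwater_pump_B": (8200.0, 320.0),
}
LIMIT = 0.97

for name, (up, down) in components.items():
    a = availability(up, down)
    flag = "" if a >= LIMIT else "  <-- below limit"
    print(f"{name}: {a:.4f}{flag}")
```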

  4. The ATLAS online High Level Trigger framework: Experience reusing offline software components in the ATLAS trigger

    International Nuclear Information System (INIS)

    Wiedenmann, Werner

    2010-01-01

    Event selection in the ATLAS High Level Trigger is accomplished to a large extent by reusing software components and event selection algorithms developed and tested in an offline environment. Many of these offline software modules are not specifically designed to run in a heavily multi-threaded online data flow environment. The ATLAS High Level Trigger (HLT) framework, based on the GAUDI and ATLAS ATHENA frameworks, forms the interface layer which allows the execution of the HLT selection and monitoring code within the online run control and data flow software. While such an approach provides a unified environment for trigger event selection across all of ATLAS, it also poses strict requirements on the reused software components in terms of performance, memory usage and stability. Experience of running the HLT selection software in the different environments, and especially on large multi-node trigger farms, has been gained in several commissioning periods using preloaded Monte Carlo events, in data-taking periods with cosmic events, and in a short period with proton beams from the LHC. The contribution discusses the architectural aspects of the HLT framework, its performance, and its software environment within the ATLAS computing, trigger and data flow projects. Emphasis is also put on the architectural implications for the software of the use of multi-core processors in the computing farms and the experience gained with multi-threading and multi-process technologies.

  5. 77 FR 40082 - Certain Gaming and Entertainment Consoles, Related Software, and Components Thereof...

    Science.gov (United States)

    2012-07-06

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-745] Certain Gaming and Entertainment... gaming and entertainment consoles, related software, and components thereof by reason of infringement of... finally concluded that an industry exists within the United States that practices the '896, '094, '571...

  6. Evolutionary Computing Methods for Spectral Retrieval

    Science.gov (United States)

    Terrile, Richard; Fink, Wolfgang; Huntsberger, Terrance; Lee, Seugwon; Tisdale, Edwin; VonAllmen, Paul; Tinetti, Geivanna

    2009-01-01

    A methodology for processing spectral images to retrieve information on underlying physical, chemical, and/or biological phenomena is based on evolutionary and related computational methods implemented in software. In a typical case, the solution (the information that one seeks to retrieve) consists of parameters of a mathematical model that represents one or more of the phenomena of interest. The methodology was developed for the initial purpose of retrieving the desired information from spectral image data acquired by remote-sensing instruments aimed at planets (including the Earth). Examples of information desired in such applications include trace gas concentrations, temperature profiles, surface types, day/night fractions, cloud/aerosol fractions, seasons, and viewing angles. The methodology is also potentially useful for retrieving information on chemical and/or biological hazards in terrestrial settings. In this methodology, one utilizes an iterative process that minimizes a fitness function indicative of the degree of dissimilarity between observed and synthetic spectral and angular data. The evolutionary computing methods that lie at the heart of this process yield a population of solutions (sets of the desired parameters) within an accuracy represented by a fitness-function value specified by the user. The evolutionary computing methods (ECM) used in this methodology are Genetic Algorithms and Simulated Annealing, both of which are well-established optimization techniques and have also been described in previous NASA Tech Briefs articles. These are embedded in a conceptual framework, represented in the architecture of the implementing software, that enables automatic retrieval of spectral and angular data and analysis of the retrieved solutions for uniqueness.
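    The retrieval loop outlined here can be sketched with simulated annealing, one of the two stochastic methods named: iteratively minimize a fitness function measuring the dissimilarity between an observed spectrum and a synthetic spectrum generated from candidate model parameters. The Gaussian absorption-line forward model and all parameter values are toy stand-ins.

```python
# Simulated-annealing retrieval: minimize the misfit between an observed
# spectrum and a synthetic spectrum from candidate parameters. Toy model.

import numpy as np

rng = np.random.default_rng(2)
wavelengths = np.linspace(1.0, 2.0, 100)

def synthetic_spectrum(depth, center):
    # Toy forward model: continuum with one Gaussian absorption line.
    return 1.0 - depth * np.exp(-0.5 * ((wavelengths - center) / 0.05) ** 2)

observed = synthetic_spectrum(0.3, 1.4) + rng.normal(0.0, 0.005, 100)

def fitness(params):
    # Dissimilarity between observed and synthetic spectra.
    return np.sum((observed - synthetic_spectrum(*params)) ** 2)

params = np.array([0.5, 1.5])             # initial guess: depth, line center
temperature = 1.0
for step in range(5000):
    candidate = params + rng.normal(0.0, [0.02, 0.01])
    delta = fitness(candidate) - fitness(params)
    if delta < 0 or rng.random() < np.exp(-delta / temperature):
        params = candidate                # accept better, sometimes worse
    temperature *= 0.999                  # cooling schedule

print("retrieved depth and center:", params)
```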

  7. EM-21 Retrieval Knowledge Center: Waste Retrieval Challenges

    Energy Technology Data Exchange (ETDEWEB)

    Fellinger, Andrew P.; Rinker, Michael W.; Berglin, Eric J.; Minichan, Richard L.; Poirier, Micheal R.; Gauglitz, Phillip A.; Martin, Bruce A.; Hatchell, Brian K.; Saldivar, Eloy; Mullen, O Dennis; Chapman, Noel F.; Wells, Beric E.; Gibbons, Peter W.

    2009-04-10

    EM-21 is the Waste Processing Division of the Office of Engineering and Technology, within the U.S. Department of Energy's (DOE) Office of Environmental Management (EM). In August 2008, EM-21 began an initiative to develop a Retrieval Knowledge Center (RKC) to provide the DOE, high-level waste retrieval operators, and technology developers with a centralized and focused location to share knowledge and expertise that will be used to address retrieval challenges across the DOE complex. The RKC is also designed to facilitate information sharing across the DOE waste site complex through workshops and a searchable database of waste retrieval technology information. The database may be used to research effective technology approaches for specific retrieval tasks and to take advantage of the lessons learned from previous operations. It is also expected to be effective for remaining current with the state of the art of retrieval technologies and ongoing development within the DOE complex. To encourage collaboration among DOE sites with waste retrieval issues, the RKC team is co-led by the Savannah River National Laboratory (SRNL) and the Pacific Northwest National Laboratory (PNNL). Two RKC workshops were held in the fall of 2008. The purpose of these workshops was to define top-level waste retrieval functional areas, exchange lessons learned, and develop a path forward to support a strategic business plan focused on technology needs for retrieval. The primary participants in these workshops were retrieval personnel and laboratory staff associated with the Hanford and Savannah River Sites, since the majority of remaining DOE waste tanks are located at these sites. This report summarizes and documents the results of the initial RKC workshops. Technology challenges identified in these workshops and presented here are expected to be a key component in defining future RKC-directed tasks designed to facilitate tank waste retrieval solutions.

  8. EM-21 Retrieval Knowledge Center: Waste Retrieval Challenges

    International Nuclear Information System (INIS)

    Fellinger, Andrew P.; Rinker, Michael W.; Berglin, Eric J.; Minichan, Richard L.; Poirier, Micheal R.; Gauglitz, Phillip A.; Martin, Bruce A.; Hatchell, Brian K.; Saldivar, Eloy; Mullen, O Dennis; Chapman, Noel F.; Wells, Beric E.; Gibbons, Peter W.

    2009-01-01

    EM-21 is the Waste Processing Division of the Office of Engineering and Technology, within the U.S. Department of Energy's (DOE) Office of Environmental Management (EM). In August 2008, EM-21 began an initiative to develop a Retrieval Knowledge Center (RKC) to provide the DOE, high-level waste retrieval operators, and technology developers with a centralized and focused location to share knowledge and expertise that will be used to address retrieval challenges across the DOE complex. The RKC is also designed to facilitate information sharing across the DOE waste site complex through workshops and a searchable database of waste retrieval technology information. The database may be used to research effective technology approaches for specific retrieval tasks and to take advantage of the lessons learned from previous operations. It is also expected to be effective for remaining current with the state of the art of retrieval technologies and ongoing development within the DOE complex. To encourage collaboration among DOE sites with waste retrieval issues, the RKC team is co-led by the Savannah River National Laboratory (SRNL) and the Pacific Northwest National Laboratory (PNNL). Two RKC workshops were held in the fall of 2008. The purpose of these workshops was to define top-level waste retrieval functional areas, exchange lessons learned, and develop a path forward to support a strategic business plan focused on technology needs for retrieval. The primary participants in these workshops were retrieval personnel and laboratory staff associated with the Hanford and Savannah River Sites, since the majority of remaining DOE waste tanks are located at these sites. This report summarizes and documents the results of the initial RKC workshops. Technology challenges identified in these workshops and presented here are expected to be a key component in defining future RKC-directed tasks designed to facilitate tank waste retrieval solutions.

  9. The verification methodologies for a software modeling of Engineered Safety Features- Component Control System (ESF-CCS)

    International Nuclear Information System (INIS)

    Lee, Young-Jun; Cheon, Se-Woo; Cha, Kyung-Ho; Park, Gee-Yong; Kwon, Kee-Choon

    2007-01-01

    The safety of software is not guaranteed by simple testing, which reviews only the static functions of the software; the dynamic behavior of the software is not examined by testing alone. The Ariane 5 rocket accident and the failure of the Virtual Case File project were caused by software faults: although the software was tested thoroughly, potential errors remained inside it. There are many methods for addressing this problem. One of them is formal methodology, which describes the software requirements as a formal specification during the software life cycle and verifies the specified design. This paper suggests methods for verifying a design described as a formal specification. We apply these methods to the software of an ESF-CCS (Engineered Safety Features-Component Control System) and use the SCADE (Safety Critical Application Development Environment) tool to carry out the suggested verification methods.

  10. Functional-anatomic study of episodic retrieval using fMRI. I. Retrieval effort versus retrieval success.

    Science.gov (United States)

    Buckner, R L; Koutstaal, W; Schacter, D L; Wagner, A D; Rosen, B R

    1998-04-01

    A number of recent functional imaging studies have identified brain areas activated during tasks involving episodic memory retrieval. The identification of such areas provides a foundation for targeted hypotheses regarding the more specific contributions that these areas make to episodic retrieval. As a beginning effort toward such an endeavor, whole-brain functional magnetic resonance imaging (fMRI) was used to examine 14 subjects during episodic word recognition in a block-designed fMRI experiment. Study conditions were manipulated by presenting either shallow or deep encoding tasks. This manipulation yielded two recognition conditions that differed with regard to retrieval effort and retrieval success: shallow encoding yielded low levels of recognition success with high levels of retrieval effort, and deep encoding yielded high levels of recognition success with low levels of effort. Many brain areas were activated in common by these two recognition conditions compared to a low-level fixation condition, including left and right prefrontal regions often detected during PET episodic retrieval paradigms (e.g., R. L. Buckner et al., 1996, J. Neurosci. 16, 6219-6235) thereby generalizing these findings to fMRI. Characterization of the activated regions in relation to the separate recognition conditions showed (1) bilateral anterior insular regions and a left dorsal prefrontal region were more active after shallow encoding, when retrieval demanded greatest effort, and (2) right anterior prefrontal cortex, which has been implicated in episodic retrieval, was most active during successful retrieval after deep encoding. We discuss these findings in relation to component processes involved in episodic retrieval and in the context of a companion study using event-related fMRI.

  11. Software for Managing Personal Files.

    Science.gov (United States)

    Lundeen, Gerald

    1989-01-01

    Discusses the special characteristics of personal file management software and compares four microcomputer software packages: Notebook II with Bibliography and Convert, Pro-Cite with Biblio-Links, askSam, and Reference Manager. Each package is evaluated in terms of the user interface, file maintenance, retrieval capabilities, output, and…

  12. Information retrieval implementing and evaluating search engines

    CERN Document Server

    Büttcher, Stefan; Cormack, Gordon V

    2016-01-01

    Information retrieval is the foundation for modern search engines. This textbook offers an introduction to the core topics underlying modern search technologies, including algorithms, data structures, indexing, retrieval, and evaluation. The emphasis is on implementation and experimentation; each chapter includes exercises and suggestions for student projects. Wumpus -- a multiuser open-source information retrieval system developed by one of the authors and available online -- provides model implementations and a basis for student work. The modular structure of the book allows instructors to use it in a variety of graduate-level courses, including courses taught from a database systems perspective, traditional information retrieval courses with a focus on IR theory, and courses covering the basics of Web retrieval. In addition to its classroom use, Information Retrieval will be a valuable reference for professionals in computer science, computer engineering, and software engineering.
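    As a flavor of the implementation-first approach the book takes, the sketch below builds a minimal inverted index with Boolean AND retrieval. It is a generic illustration of the core data structure, not code from Wumpus:

```python
from collections import defaultdict

def build_index(docs):
    """Map each term to the set of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in enumerate(docs):
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def boolean_and(index, query):
    """Intersect the posting sets for all query terms."""
    terms = query.lower().split()
    if not terms:
        return set()
    postings = [index.get(t, set()) for t in terms]
    return set.intersection(*postings)

docs = ["information retrieval with inverted indexes",
        "modern search engines rely on indexing",
        "retrieval evaluation measures effectiveness"]
index = build_index(docs)
print(boolean_and(index, "retrieval indexes"))  # -> {0}
```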

  13. 78 FR 32690 - Certain Gaming and Entertainment Consoles, Related Software, and Components Thereof; Notice of...

    Science.gov (United States)

    2013-05-31

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-752] Certain Gaming and Entertainment... importation of certain gaming and entertainment consoles, related software, and components thereof by reason... violation of...

  14. Windows-based high-speed liquid-sodium measuring system software development

    International Nuclear Information System (INIS)

    Kolokol'tsev, M.V.

    2005-01-01

    This work describes the creation of software that captures data from the liquid-sodium parameter measuring system, processes and displays the information in real time, and also provides retrieval, visualization and documentation of the information in the post-startup period. A nonstandard solution is described: a high-speed data-capture system built on Windows and relatively inexpensive hardware components. Technical descriptions (classes, interface elements) of the developed and deployed modules realizing data capture and post-startup information visualization are given. (author)

  15. Applying Hypertext Structures to Software Documentation.

    Science.gov (United States)

    French, James C.; And Others

    1997-01-01

    Describes a prototype system for software documentation management called SLEUTH (Software Literacy Enhancing Usefulness to Humans) being developed at the University of Virginia. Highlights include information retrieval techniques, hypertext links that are installed automatically, a WAIS (Wide Area Information Server) search engine, user…

  16. [Identification of Dendrobium varieties by Fourier transform infrared spectroscopy combined with spectral retrieval].

    Science.gov (United States)

    Liu, Fei; Wang, Yuan-zhong; Deng, Xing-yan; Jin, Hang; Yang, Chun-yan

    2014-06-01

    The infrared spectra of stems from 165 trees of 23 Dendrobium varieties were obtained by means of Fourier transform infrared spectroscopy. The spectra of all the samples are similar, and the main component of the Dendrobium stem is cellulose. Using the professional spectral software Omnic 8.0, three spectral databases were constructed: Lib01 contains the average spectrum of the first four trees of every variety, while Lib02 and Lib03 were constructed from the first-derivative and second-derivative spectra of those average spectra, respectively. Correlation search, squared-difference search and squared-differential-difference search were performed against Lib01 in the specified range of 1800-500 cm(-1), yielding correct rates of 92.7%, 74.5% and 92.7%, respectively. The squared-differential-difference search of the first-derivative and second-derivative spectra was carried out with Lib02 and Lib03 in the same specified range of 1800-500 cm(-1), with a correct rate of 93.9% for the former and 90.3% for the latter. The results show that the squared-differential-difference search on first-derivative spectra is more suitable for discerning Dendrobium varieties, that FTIR combined with spectral retrieval can identify different varieties of Dendrobium, and that correlation search, squared-difference search, and first- and second-derivative spectral retrieval in a specified spectral range are effective and simple ways of distinguishing different varieties of Dendrobium.
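    The library-search metrics named above (correlation search, squared-difference search, and searches on derivative spectra) can be sketched as follows. The synthetic two-variety library and the exact normalizations are illustrative assumptions, not Omnic's actual algorithms:

```python
import numpy as np

def correlation_match(query, reference):
    """Pearson correlation between two spectra (higher = better match)."""
    q, r = query - query.mean(), reference - reference.mean()
    return float(q @ r / (np.linalg.norm(q) * np.linalg.norm(r)))

def squared_difference(query, reference):
    """Sum of squared differences after normalization (lower = better)."""
    q = query / np.sum(np.abs(query))
    r = reference / np.sum(np.abs(reference))
    return float(np.sum((q - r) ** 2))

def first_derivative(spectrum):
    """First-derivative spectrum; suppresses baseline offsets."""
    return np.gradient(spectrum)

def search(query, library, metric, higher_is_better=True):
    """Rank library entries against the query spectrum with the given metric."""
    scores = {name: metric(query, ref) for name, ref in library.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=higher_is_better)

# Toy library of two synthetic "varieties" on a 1800-500 cm^-1 style grid.
x = np.linspace(1800, 500, 400)
lib = {"variety_A": np.exp(-((x - 1050) / 40.0) ** 2),
       "variety_B": np.exp(-((x - 1600) / 60.0) ** 2)}
query = lib["variety_A"] + 0.02 * np.random.default_rng(1).standard_normal(x.size)
ranked = search(first_derivative(query),
                {k: first_derivative(v) for k, v in lib.items()},
                correlation_match)
print(ranked[0][0])  # expected: variety_A
```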

  17. Software for validating parameters retrieved from satellite

    Digital Repository Service at National Institute of Oceanography (India)

    Muraleedharan, P.M.; Sathe, P.V.; Pankajakshan, T.

    Parameters retrieved from the Multi-channel Scanning Microwave Radiometer (MSMR) onboard the Indian satellite Oceansat-1 during 1999-2001 were validated using this software as a case study. The program has several added advantages over the conventional method of validation, which involves strenuous...

  18. A Component Approach to Collaborative Scientific Software Development: Tools and Techniques Utilized by the Quantum Chemistry Science Application Partnership

    Directory of Open Access Journals (Sweden)

    Joseph P. Kenny

    2008-01-01

    Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.

  19. Automated information retrieval system for radioactivation analysis

    International Nuclear Information System (INIS)

    Lambrev, V.G.; Bochkov, P.E.; Gorokhov, S.A.; Nekrasov, V.V.; Tolstikova, L.I.

    1981-01-01

    An automated information retrieval system for radioactivation analysis has been developed. An ES-1022 computer and the problem-oriented software ''The description information search system'' were used for the purpose. The main aspects and sources forming the system's information fund and the characteristics of the system's information retrieval language are reported, and examples of question-answer dialogue are given. Two modes can be used: selective information distribution and retrospective search.

  20. IAEA/NDS requirements related to database software

    International Nuclear Information System (INIS)

    Pronyaev, V.; Zerkin, V.

    2001-01-01

    The Nuclear Data Section of the IAEA disseminates data to NDS users through the Internet or on CD-ROMs and diskettes. An OSU Web server on a DEC Alpha with OpenVMS and an Oracle/DEC DBMS provides, via CGI scripts and FORTRAN retrieval programs, access to the main nuclear databases supported by the networks of Nuclear Reaction Data Centres and Nuclear Structure and Decay Data Centres (CINDA, EXFOR, ENDF, NSR, ENSDF). For Web access to data from other libraries and files, hyperlinks to the files stored in ASCII text or other formats are used. Databases on CD-ROM are usually provided with a retrieval system. They are distributed in run-time mode and comply with all license requirements for software used in their development. Although major development work is now done on PCs with MS Windows and Linux, NDS may not at present, due to institutional conditions, use these platforms to organize Web access to the data. Starting at the end of 1999, the NDS, in co-operation with other data centers, began to work out a strategy for migrating the main network nuclear databases onto platforms other than DEC Alpha/OpenVMS/DBMS. Because the co-operating centers have their own preferences for hardware and software, the requirement to provide maximum platform independence for the nuclear databases is the most important and desirable feature. This requirement determined some standards for nuclear database software development. Taking into account the present state and future development, these standards can be formulated as follows: 1. All numerical data (experimental, evaluated, recommended values and their uncertainties) prepared for inclusion in an IAEA/NDS nuclear database should be submitted in the form of ASCII text files and will be kept at NDS as a master file. 2. Databases with complex structure should be submitted in the form of files with standard SQL statements describing all their components. All extensions of standard SQL

  1. Constructing a working taxonomy of functional Ada software components for real-time embedded system applications

    Science.gov (United States)

    Wallace, Robert

    1986-01-01

    A major impediment to a systematic attack on Ada software reusability is the lack of an effective taxonomy for software component functions. The scope of all possible applications of Ada software is considered too great to allow the practical development of a working taxonomy. Instead, for the purposes herein, the scope of Ada software application is limited to device and subsystem control in real-time embedded systems. A functional approach is taken in constructing the taxonomy tree for the identified Ada domain. The use of modular software functions as a starting point fits well with the object-oriented programming philosophy of Ada. Examples of the types of functions represented within the working taxonomy are real-time kernels, interrupt service routines, synchronization and message passing, data conversion, digital filtering and signal conditioning, and device control. The constructed taxonomy is proposed as a framework from which a needs analysis can be performed to reveal voids in current Ada real-time embedded programming efforts for the Space Station.

  2. Development of E-learning Software Based Multiplatform Components

    OpenAIRE

    Salamah, Irma; Ganiardi, M. Aris

    2017-01-01

    E-learning software is a product of information and communication technology used to support a dynamic and flexible learning process between teacher and student. The technology was first used to develop e-learning software in the form of web applications, whose advantages lie in the ease of development, installation, and distribution of data. Along with advances in mobile/wireless electronics technology, e-learning software has been adapted to this technology...

  3. Compounds in dictionary-based cross-language information retrieval

    Directory of Open Access Journals (Sweden)

    2002-01-01

    Compound words form an important part of natural language. From the cross-lingual information retrieval (CLIR) point of view it is important that many natural languages are highly productive with compounds, and translation resources cannot include entries for all compounds. Also, compounds are often the content-bearing words in a sentence. In Swedish, German and Finnish, roughly one tenth of the words in a text prepared for information retrieval purposes are compounds. Important research questions concerning compound handling in dictionary-based cross-language information retrieval are (1) compound splitting into components, (2) normalisation of components, (3) translation of components, and (4) query structuring for compounds and their components in the target language. The impact of compound processing on the performance of the cross-language information retrieval process is evaluated in this study, and the results indicate that the effect is clearly positive.
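    The first research question, compound splitting into components, is often attacked with a greedy dictionary lookup. A minimal recursive splitter is sketched below; the toy lexicon entries are invented for illustration and the study's actual splitting rules may differ:

```python
def split_compound(word, lexicon, min_len=3):
    """Recursively split a compound into known lexicon components.

    Returns a list of components, or None if no full decomposition exists.
    Prefers the longest-matching head (greedy), a common CLIR heuristic.
    """
    word = word.lower()
    if word in lexicon:
        return [word]
    for i in range(len(word) - min_len, min_len - 1, -1):
        head, tail = word[:i], word[i:]
        if head in lexicon:
            rest = split_compound(tail, lexicon, min_len)
            if rest is not None:
                return [head] + rest
    return None

# Toy Swedish-like lexicon (illustrative entries only).
lexicon = {"jord", "gubbe", "jordgubbe", "glass"}
print(split_compound("jordgubbeglass", lexicon))  # -> ['jordgubbe', 'glass']
```

    The remaining steps would then normalise each component, translate it with the dictionary, and structure the target-language query from the translated components.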

  4. High performance gamma measurements of equipment retrieved from Hanford high-level nuclear waste tanks

    International Nuclear Information System (INIS)

    Troyer, G.L.

    1997-01-01

    The cleanup of high-level defense nuclear waste at the Hanford site presents several progressive challenges. Among these is the removal and disposal of various components from buried active waste tanks to allow new equipment insertion or hazards mitigation. A unique automated retrieval system at the tank provides for retrieval, high-pressure washing, inventory measurement, and containment for disposal. Key to the inventory measurement is a three-detector HPGe high-performance gamma spectroscopy system capable of recovering data at up to 90% saturation (200,000 counts per second). Data recovery is based on a unique embedded electronic pulser and specialized software to report the inventory. Each of the detectors has different shielding, specified through Monte Carlo simulation with the MCNP program. This shielding provides performance over a dynamic range of eight orders of magnitude. The system description, calibration issues and operational experiences are discussed.

  5. High performance gamma measurements of equipment retrieved from Hanford high-level nuclear waste tanks

    Energy Technology Data Exchange (ETDEWEB)

    Troyer, G.L.

    1997-03-17

    The cleanup of high-level defense nuclear waste at the Hanford site presents several progressive challenges. Among these is the removal and disposal of various components from buried active waste tanks to allow new equipment insertion or hazards mitigation. A unique automated retrieval system at the tank provides for retrieval, high-pressure washing, inventory measurement, and containment for disposal. Key to the inventory measurement is a three-detector HPGe high-performance gamma spectroscopy system capable of recovering data at up to 90% saturation (200,000 counts per second). Data recovery is based on a unique embedded electronic pulser and specialized software to report the inventory. Each of the detectors has different shielding, specified through Monte Carlo simulation with the MCNP program. This shielding provides performance over a dynamic range of eight orders of magnitude. The system description, calibration issues and operational experiences are discussed.

  6. Center for Technology for Advanced Scientific Component Software (TASCS) Consolidated Progress Report July 2006 - March 2009

    Energy Technology Data Exchange (ETDEWEB)

    Bernholdt, D E; McInnes, L C; Govindaraju, M; Bramley, R; Epperly, T; Kohl, J A; Nieplocha, J; Armstrong, R; Shasharina, S; Sussman, A L; Sottile, M; Damevski, K

    2009-04-14

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  7. Package-based software development

    NARCIS (Netherlands)

    Jonge, de M.; Chroust, G.; Hofer, C.

    2003-01-01

    The main goal of component-based software engineering is to decrease development time and development costs of software systems, by reusing prefabricated building blocks. Here we focus on software reuse within the implementation of such component-based applications, and on the corresponding software

  8. International Inventory of Software Packages in the Information Field.

    Science.gov (United States)

    Keren, Carl, Ed.; Sered, Irina, Ed.

    Designed to provide guidance in selecting appropriate software for library automation, information storage and retrieval, or management of bibliographic databases, this inventory describes 188 computer software packages. The information was obtained through a questionnaire survey of 600 software suppliers and developers who were asked to describe…

  9. EOS MLS Level 2 Data Processing Software Version 3

    Science.gov (United States)

    Livesey, Nathaniel J.; VanSnyder, Livesey W.; Read, William G.; Schwartz, Michael J.; Lambert, Alyn; Santee, Michelle L.; Nguyen, Honghanh T.; Froidevaux, Lucien; Wang, Shuhui; Manney, Gloria L.

    2011-01-01

    This software accepts the EOS MLS calibrated measurements of microwave radiances and operational meteorological data, and produces a set of estimates of atmospheric temperature and composition. This version has been designed to be as flexible as possible. The software is controlled by a Level 2 Configuration File that controls all aspects of the software: defining the contents of state and measurement vectors, defining the configurations of the various forward models available, reading appropriate a priori, spectroscopic and calibration data, performing retrievals, post-processing results, computing diagnostics, and outputting results in appropriate files. In production mode, the software operates in parallel, with one instance of the program acting as a master coordinating the work of multiple slave instances on a cluster of computers, each computing the results for individual chunks of data. In addition to performing conventional retrieval calculations and producing geophysical products, the Level 2 Configuration File can instruct the software to produce files of simulated radiances based on a state vector formed from a set of geophysical product files taken as input. Combining the retrieval and simulation tasks in a single piece of software makes it far easier to ensure that identical forward model algorithms and parameters are used in both tasks. This also dramatically reduces the complexity of the code maintenance effort.
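    The production master/slave arrangement described above can be caricatured with a small master/worker sketch using Python multiprocessing in place of a cluster; the chunk contents and the per-chunk function are placeholders, not the actual MLS retrieval:

```python
from multiprocessing import Pool

def retrieve_chunk(chunk):
    """Placeholder for a per-chunk retrieval.

    In the real system each chunk would hold radiances and the worker would
    run the configured forward model and retrieval; here we just average.
    """
    chunk_id, radiances = chunk
    return chunk_id, sum(radiances) / len(radiances)

if __name__ == "__main__":
    # The "master" splits the day's data into chunks and farms them out.
    chunks = [(i, [float(i + j) for j in range(10)]) for i in range(8)]
    with Pool(processes=4) as pool:          # stand-in for the slave cluster
        results = dict(pool.map(retrieve_chunk, chunks))
    print(results)
```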

  10. Using neural networks in software repositories

    Science.gov (United States)

    Eichmann, David (Editor); Srinivas, Kankanahalli; Boetticher, G.

    1992-01-01

    The first topic is an exploration of the use of neural network techniques to improve the effectiveness of retrieval in software repositories. The second topic relates to a series of experiments conducted to evaluate the feasibility of using adaptive neural networks as a means of deriving (or more specifically, learning) measures on software. Taken together, these two efforts illuminate a very promising mechanism supporting software infrastructures - one based upon a flexible and responsive technology.

  11. Servicing HEP experiments with a complete set of ready-integrated and configured common software components

    International Nuclear Information System (INIS)

    Roiser, Stefan; Gaspar, Ana; Perrin, Yves; Kruzelecki, Karol

    2010-01-01

    The LCG Applications Area at CERN provides basic software components for the LHC experiments such as ROOT, POOL, COOL, which are developed in house, and also a set of 'external' software packages (70) which are needed in addition, such as Python, Boost, Qt, CLHEP, etc. These packages target many different areas of HEP computing such as data persistency, math, simulation, grid computing, databases, graphics, etc. Other packages provide tools for documentation, debugging, scripting languages and compilers. All these packages are provided in a consistent manner on different compilers, architectures and operating systems. The Software Process and Infrastructure project (SPI) [1] is responsible for the continuous testing, coordination, release and deployment of these software packages. The main driving force for the actions carried out by SPI is the needs of the LHC experiments, but other HEP experiments could also profit from the set of consistent libraries provided and receive a stable and well-tested foundation on which to build their experiment software frameworks. This presentation will first provide a brief description of the tools and services provided for the coordination, testing, release, deployment and presentation of LCG/AA software packages, and then focus on a second set of tools provided for outside LHC experiments to deploy a stable set of HEP-related software packages, both as a binary distribution and from source.

  12. Servicing HEP experiments with a complete set of ready-integrated and configured common software components

    Energy Technology Data Exchange (ETDEWEB)

    Roiser, Stefan; Gaspar, Ana; Perrin, Yves [CERN, CH-1211 Geneva 23, PH Department, SFT Group (Switzerland); Kruzelecki, Karol, E-mail: stefan.roiser@cern.c, E-mail: ana.gaspar@cern.c, E-mail: yves.perrin@cern.c, E-mail: karol.kruzelecki@cern.c [CERN, CH-1211 Geneva 23, PH Department, LBC Group (Switzerland)

    2010-04-01

    The LCG Applications Area at CERN provides basic software components for the LHC experiments such as ROOT, POOL, COOL, which are developed in house, and also a set of 'external' software packages (70) which are needed in addition, such as Python, Boost, Qt, CLHEP, etc. These packages target many different areas of HEP computing such as data persistency, math, simulation, grid computing, databases, graphics, etc. Other packages provide tools for documentation, debugging, scripting languages and compilers. All these packages are provided in a consistent manner on different compilers, architectures and operating systems. The Software Process and Infrastructure project (SPI) [1] is responsible for the continuous testing, coordination, release and deployment of these software packages. The main driving force for the actions carried out by SPI is the needs of the LHC experiments, but other HEP experiments could also profit from the set of consistent libraries provided and receive a stable and well-tested foundation on which to build their experiment software frameworks. This presentation will first provide a brief description of the tools and services provided for the coordination, testing, release, deployment and presentation of LCG/AA software packages, and then focus on a second set of tools provided for outside LHC experiments to deploy a stable set of HEP-related software packages, both as a binary distribution and from source.

  13. The JPL Library information retrieval system

    Science.gov (United States)

    Walsh, J.

    1975-01-01

    The development, capabilities, and products of the computer-based retrieval system of the Jet Propulsion Laboratory Library are described. The system handles books and documents, produces a book catalog, and provides a machine search capability. Programs and documentation are available to the public through NASA's computer software dissemination program.

  14. Best Entry Points for Structured Document Retrieval - Part I: Characteristics

    DEFF Research Database (Denmark)

    Reid, Jane; Lalmas, Mounia; Finesilver, Karen

    2006-01-01

    Structured document retrieval makes use of document components as the basis of the retrieval process, rather than complete documents. The inherent relationships between these components make it vital to support users' natural browsing behaviour in order to offer effective and efficient access...

  15. Use of Data Base Microcomputer Software in Descriptive Nursing Research

    OpenAIRE

    Chapman, Judy Jean

    1985-01-01

    Data base microcomputer software was used to design a file for data storage and retrieval in a qualitative nursing research project. The needs of 50 breast feeding mothers from birth to four months were studied. One thousand records with descriptive nursing data were entered into the file. The search and retrieval capability of data base software facilitated this qualitative research. The findings will be discussed in three areas: (1) infant concerns, (2) postpartum concerns, and (3) breast c...

  16. Incorporating a Human-Computer Interaction Course into Software Development Curriculums

    Science.gov (United States)

    Janicki, Thomas N.; Cummings, Jeffrey; Healy, R. Joseph

    2015-01-01

    Individuals have increasing options on retrieving information related to hardware and software. Specific hardware devices include desktops, tablets and smart devices. Also, the number of software applications has significantly increased the user's capability to access data. Software applications include the traditional web site, smart device…

  17. Use of NMR and NMR Prediction Software to Identify Components in Red Bull Energy Drinks

    Science.gov (United States)

    Simpson, Andre J.; Shirzadi, Azadeh; Burrow, Timothy E.; Dicks, Andrew P.; Lefebvre, Brent; Corrin, Tricia

    2009-01-01

    A laboratory experiment designed as part of an upper-level undergraduate analytical chemistry course is described. Students investigate two popular soft drinks (Red Bull Energy Drink and sugar-free Red Bull Energy Drink) by NMR spectroscopy. With assistance of modern NMR prediction software they identify and quantify major components in each…

  18. Recover the story of a component or the determination of the welding residual stresses; Parcourir l'histoire d'un composant ou la determination des contraintes residuelles de soudage

    Energy Technology Data Exchange (ETDEWEB)

    Genette, P. [Electricite de France (EDF), 69 - Villeurbanne (France). Service Etudes et Projets Thermiques et Nucleaires; Dupas, Ph. [Electricite de France, 77 - Moret sur Loing (France). Dept. Mecanique et Technologie des Composants; Waeckel, F. [Electricite de France (EDF), 92 - Clamart (France). Dept. Mecanique et Modeles Numerique

    1998-10-01

    Mechanical components in nuclear power plants keep track of the welding processes they underwent before entering service. The memory of these past events can postpone or enhance possible damage phenomena in these components. Nowadays, numerical simulation software such as the Code ASTER makes it possible to reproduce these welding processes numerically so that their mechanical consequences can be retrieved. (authors)

  19. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  20. PCI bus content-addressable-memory (CAM) implementation on FPGA for pattern recognition/image retrieval in a distributed environment

    Science.gov (United States)

    Megherbi, Dalila B.; Yan, Yin; Tanmay, Parikh; Khoury, Jed; Woods, C. L.

    2004-11-01

    Recently, surveillance and Automatic Target Recognition (ATR) applications are increasing as the cost of the computing power needed to process massive amounts of information continues to fall. This computing power has been made possible partly by the latest advances in FPGAs and SOPCs. In particular, to design and implement state-of-the-art electro-optical imaging systems that provide advanced surveillance capabilities, there is a need to integrate several technologies (e.g., telescopes, precise optics, cameras, and image/computer vision algorithms, which can be geographically distributed or share distributed resources) into programmable and DSP systems. Additionally, pattern recognition techniques and fast information retrieval are often important components of intelligent systems. The aim of this work is to use an embedded FPGA as a fast, configurable and synthesizable search engine for fast image pattern recognition/retrieval in a distributed hardware/software co-design environment. In particular, we propose and demonstrate a low-cost Content Addressable Memory (CAM)-based distributed embedded FPGA hardware architecture with real-time recognition and computing capabilities for pattern look-up, pattern recognition, and image retrieval. We show how the distributed CAM-based architecture offers an order-of-magnitude performance advantage over RAM-based (Random Access Memory) search for implementing high-speed pattern recognition for image retrieval. The methods of designing, implementing, and analyzing the proposed CAM-based embedded architecture are described here. Other SOPC solutions/design issues are covered. Finally, experimental results, hardware verification, and performance evaluations using both the Xilinx Virtex-II and the Altera Apex20k are provided to show the potential and power of the proposed method for low-cost reconfigurable fast image pattern recognition/retrieval at the hardware/software co-design level.
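    The order-of-magnitude advantage of CAM over RAM-based search comes from matching all stored patterns in parallel rather than scanning them sequentially. A software analogue of the two access patterns, using a hash set as the stand-in for content addressing, is sketched below; this is illustrative only, since the paper's implementation is in FPGA hardware:

```python
import random
import timeit

# Pattern store: a RAM-style list (sequential search) vs a CAM-style
# hash set (one-shot, content-addressed match). In hardware the CAM
# compares all entries in parallel; a hash set is the software analogue.
patterns = [random.getrandbits(64) for _ in range(100_000)]
ram_store = list(patterns)
cam_store = set(patterns)
probe = patterns[-1]  # worst case for the sequential scan

t_ram = timeit.timeit(lambda: probe in ram_store, number=100)
t_cam = timeit.timeit(lambda: probe in cam_store, number=100)
print(f"sequential: {t_ram:.4f}s  content-addressed: {t_cam:.6f}s")
```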

  1. Methods and Software for Building Bibliographic Data Bases.

    Science.gov (United States)

    Daehn, Ralph M.

    1985-01-01

    This in-depth look at database management systems (DBMS) for microcomputers covers data entry, information retrieval, security, DBMS software and design, and downloading of literature search results. The advantages of in-house systems versus online search vendors are discussed, and specifications of three software packages and 14 sources are…

  2. 76 FR 52970 - In the Matter of Certain Biometric Scanning Devices, Components Thereof, Associated Software, and...

    Science.gov (United States)

    2011-08-24

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-720] In the Matter of Certain Biometric... 17, 2010 based on a complaint filed on May 11, 2010, by Cross Match Technologies, Inc. (``Cross Match... certain biometric scanning devices, components thereof, associated software, and products containing the...

  3. From Software Development to Software Assembly

    NARCIS (Netherlands)

    Sneed, Harry M.; Verhoef, Chris

    2016-01-01

    The lack of skilled programming personnel and the growing burden of maintaining customized software are forcing organizations to quit producing their own software. It's high time they turned to ready-made, standard components to fulfill their business requirements. Cloud services might be one way to

  4. Core component integration tests for the back-end software sub-system in the ATLAS data acquisition and event filter prototype -1 project

    International Nuclear Information System (INIS)

    Badescu, E.; Caprini, M.; Niculescu, M.; Radu, A.

    2000-01-01

    The ATLAS data acquisition (DAQ) and Event Filter (EF) prototype-1 project was intended to produce a prototype system for evaluating candidate technologies and architectures for the final ATLAS DAQ system at the LHC accelerator at CERN. Within the prototype project, the back-end sub-system encompasses the software for configuring, controlling and monitoring the DAQ. The back-end sub-system includes core components and detector integration components. The core components provide the basic functionality and had priority, in terms of development time-scale, in order to provide a baseline sub-system that could be used for integration with the data-flow sub-system and event filter. The following components are considered the core of the back-end sub-system: - Configuration databases: describe a large number of parameters of the DAQ system architecture, hardware and software components, running modes and status; - Message reporting system (MRS): allows all software components to report messages to other components in the distributed environment; - Information service (IS): allows information exchange between software components; - Process manager (PMG): performs basic job control of software components (start, stop, monitoring of status); - Run control (RC): controls the data-taking activities by coordinating the operations of the DAQ sub-systems, back-end software and external systems; a minimal sketch of this coordination pattern appears below. Performance and scalability tests have been made for individual components. The back-end sub-system integration tests bring together all the core components and several trigger/DAQ/detector integration components to simulate the control and configuration of data-taking sessions. A test plan was provided for the back-end integration tests. The tests have been done using a shell script that goes through different phases as follows: - starting the back-end server processes to initialize communication services and PMG; - launching configuration-specific processes via the DAQ supervisor as
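    The run control component coordinates the others through state transitions. A minimal sketch of that pattern is shown below; the class names and the three-state lifecycle are invented for illustration and do not reflect the actual back-end API:

```python
class Component:
    """A controllable DAQ component with a trivial lifecycle (illustrative)."""
    def __init__(self, name):
        self.name, self.state = name, "initial"
    def configure(self): self.state = "configured"
    def start(self):     self.state = "running"
    def stop(self):      self.state = "configured"

class RunControl:
    """Drives all components through configure/start/stop transitions."""
    TRANSITIONS = {"configure": "configured", "start": "running",
                   "stop": "configured"}
    def __init__(self, components):
        self.components = components
    def command(self, cmd):
        for c in self.components:
            getattr(c, cmd)()            # e.g. c.configure()
            assert c.state == self.TRANSITIONS[cmd], f"{c.name} failed {cmd}"
        print(f"{cmd}: all {len(self.components)} components ok")

rc = RunControl([Component("MRS"), Component("IS"), Component("PMG")])
rc.command("configure")
rc.command("start")
rc.command("stop")
```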

  5. Advances in model-based software for simulating ultrasonic immersion inspections of metal components

    Science.gov (United States)

    Chiou, Chien-Ping; Margetan, Frank J.; Taylor, Jared L.; Engle, Brady J.; Roberts, Ronald A.

    2018-04-01

    Under the sponsorship of the National Science Foundation's Industry/University Cooperative Research Center at ISU, an effort was initiated in 2015 to repackage existing research-grade software into user-friendly tools for the rapid estimation of signal-to-noise ratio (SNR) for ultrasonic inspections of metals. The software combines: (1) a Python-based graphical user interface for specifying an inspection scenario and displaying results; and (2) a Fortran-based engine for computing defect signals and backscattered grain noise characteristics. The latter makes use of the Thompson-Gray measurement model for the response from an internal defect, and the Thompson-Margetan independent scatterer model for backscattered grain noise. This paper, the third in the series [1-2], provides an overview of the ongoing modeling effort with emphasis on recent developments. These include the ability to: (1) treat microstructures where grain size, shape and tilt relative to the incident sound direction can all vary with depth; and (2) simulate C-scans of defect signals in the presence of backscattered grain noise. The simulation software can now treat both normal- and oblique-incidence immersion inspections of curved metal components. Both longitudinal- and shear-wave inspections are treated. The model transducer can be planar, spherically focused, or bi-cylindrically focused. A calibration (or reference) signal is required and is used to deduce the measurement system efficiency function. This can be "invented" by the software using center frequency and bandwidth information specified by the user, or, alternatively, a measured calibration signal can be used. Defect types include flat-bottomed-hole reference reflectors, and spherical pores and inclusions. Simulation outputs include estimated defect signal amplitudes, root-mean-square values of grain noise amplitudes, and SNR as functions of the depth of the defect within the metal component. At any particular depth, the user can view
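    The software's central output, SNR as a function of defect depth, is the ratio of the modeled defect signal amplitude to the root-mean-square grain-noise amplitude. The sketch below uses toy attenuation and beam-focusing terms in place of the Thompson-Gray and Thompson-Margetan models, purely to illustrate the SNR-versus-depth calculation:

```python
import math

def defect_amplitude(depth_mm, atten_np_per_mm=0.02, focal_mm=25.0):
    """Toy defect response: a crude beam-focusing gain times two-way attenuation."""
    focus = math.exp(-((depth_mm - focal_mm) / 15.0) ** 2)
    return focus * math.exp(-2 * atten_np_per_mm * depth_mm)

def grain_noise_rms(depth_mm, fom=0.05, atten_np_per_mm=0.02):
    """Toy backscattered grain-noise rms via a figure-of-merit (FOM) scaling."""
    return fom * math.exp(-2 * atten_np_per_mm * depth_mm) * (1 + depth_mm / 50.0)

for depth in range(5, 55, 10):
    snr = defect_amplitude(depth) / grain_noise_rms(depth)
    print(f"depth {depth:2d} mm  SNR {snr:5.1f}")
```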

  6. Distribution and communication in software engineering environments. Application to the HELIOS Software Bus.

    OpenAIRE

    Jean, F. C.; Jaulent, M. C.; Coignard, J.; Degoulet, P.

    1991-01-01

    Modularity, distribution and integration are current trends in Software Engineering. To reach these goals, HELIOS, a distributed Software Engineering Environment dedicated to the medical field, has been conceived and a prototype implemented. This environment is made up of several well-encapsulated Software Components. This paper presents the architecture retained to allow communication between the different components and focuses on the implementation details of the Software ...

  7. Vertical separation of the atmospheric aerosol components by using poliphon retrieval in polarized micro pulse lidar (P-MPL) measurements: case studies of specific climate-relevant aerosol types

    Science.gov (United States)

    Córdoba-Jabonero, Carmen; Sicard, Michaël; Ansmann, Albert; Águila, Ana del; Baars, Holger

    2018-04-01

    POLIPHON (POlarization-LIdar PHOtometer Networking) retrieval consists of the vertical separation of two or three particle components in aerosol mixtures, highlighting their relative contributions in terms of optical properties and mass concentrations. The method is based on the specific particle linear depolarization ratios given for different types of aerosols, and is applied here to the new polarized Micro-Pulse Lidar (P-MPL). Case studies of specific climate-relevant aerosols (dust particles, fire smoke, and pollen, including a clean case as reference) observed over Barcelona (Spain) are presented in order to evaluate the potential of P-MPL measurements, combined with POLIPHON, for retrieving the vertical separation of the particle components forming aerosol mixtures and their properties.
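    The two-component separation at the heart of this kind of retrieval follows a standard depolarization mixing rule (in the style of Tesche et al. 2009): given the measured particle backscatter and particle linear depolarization ratio, plus assumed reference depolarization ratios for the pure components, the dust fraction follows in closed form. The reference values below are typical literature numbers, not necessarily the ones used in this study:

```python
import numpy as np

def separate_dust(beta_p, delta_p, delta_dust=0.31, delta_nondust=0.05):
    """Split particle backscatter into dust / non-dust components.

    Uses the standard two-component depolarization mixing rule; the
    reference depolarization ratios are typical literature values and
    would be tuned per aerosol type in a real analysis.
    """
    beta_dust = beta_p * ((delta_p - delta_nondust) * (1 + delta_dust)
                          / ((delta_dust - delta_nondust) * (1 + delta_p)))
    beta_dust = np.clip(beta_dust, 0.0, beta_p)
    return beta_dust, beta_p - beta_dust

# One synthetic profile: backscatter (Mm^-1 sr^-1) and depolarization ratio.
beta = np.array([2.0, 1.5, 1.0, 0.5])
delta = np.array([0.28, 0.20, 0.10, 0.05])
dust, nondust = separate_dust(beta, delta)
print(dust.round(3), nondust.round(3))
```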

  8. The prototype readout chain for CBM using the AFCK board and its software components

    Science.gov (United States)

    Loizeau, Pierre-Alain; Emscherman, David; Lehnert, Jörg; Müller, Walter F. J.; Yang, Junfeng

    2015-09-01

    This paper presents a prototype for the readout chain of the Compressed Baryonic Matter (CBM) experiment using the AFCK FPGA board as Data Processing Board (DPB). The components of the readout chain are described, followed by some test setups, all based on different flavors of AFCK-DPB. Details about the functional blocks in the different versions of the DPB firmware are given, followed by a description of the corresponding software elements.

  9. The Systems Biology Research Tool: evolvable open-source software

    OpenAIRE

    Wright, J; Wagner, A

    2008-01-01

    Background: Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system level. Results: We introduce a free, easy-to-use, open-source, integrated software platform calle...

  10. Models for composing software : an analysis of software composition and objects

    NARCIS (Netherlands)

    Bergmans, Lodewijk

    1999-01-01

    In this report, we investigate component-based software construction with a focus on composition. In particular we try to analyze the requirements and issues for components and software composition. As a means to understand this research area, we introduce a canonical model for representing

  11. More emotional facial expressions during episodic than during semantic autobiographical retrieval.

    Science.gov (United States)

    El Haj, Mohamad; Antoine, Pascal; Nandrino, Jean Louis

    2016-04-01

    There is a substantial body of research on the relationship between emotion and autobiographical memory. Using facial analysis software, our study addressed this relationship by investigating the basic emotional facial expressions that may be detected during autobiographical recall. Participants were asked to retrieve three autobiographical memories, each of which was triggered by one of the following cue words: happy, sad, and city. The autobiographical recall was analyzed by facial analysis software that detects and classifies basic emotional expressions. Analyses showed that emotional cues triggered the corresponding basic facial expressions (i.e., a happy facial expression for memories cued by happy). Furthermore, we dissociated episodic and semantic retrieval, observing more emotional facial expressions during episodic than during semantic retrieval, regardless of the emotional valence of the cues. Our study provides insight into the facial expressions that are associated with emotional autobiographical memory. It also highlights an ecological tool to reveal physiological changes that are associated with emotion and memory.

  12. BARTTest: Community-Standard Atmospheric Radiative-Transfer and Retrieval Tests

    Science.gov (United States)

    Harrington, Joseph; Himes, Michael D.; Cubillos, Patricio E.; Blecic, Jasmina; Challener, Ryan C.

    2018-01-01

    Atmospheric radiative transfer (RT) codes are used both to predict planetary and brown-dwarf spectra and in retrieval algorithms to infer atmospheric chemistry, clouds, and thermal structure from observations. Observational plans, theoretical models, and scientific results depend on the correctness of these calculations. Yet, the calculations are complex and the codes implementing them are often written without modern software-verification techniques. The community needs a suite of test calculations with analytically, numerically, or at least community-verified results. We therefore present the Bayesian Atmospheric Radiative Transfer Test Suite, or BARTTest. BARTTest has four categories of tests: analytically verified RT tests of simple atmospheres (single line in single layer, line blends, saturation, isothermal, multiple line-list combination, etc.), community-verified RT tests of complex atmospheres, synthetic retrieval tests on simulated data with known answers, and community-verified real-data retrieval tests.BARTTest is open-source software intended for community use and further development. It is available at https://github.com/ExOSPORTS/BARTTest. We propose this test suite as a standard for verifying atmospheric RT and retrieval codes, analogous to the Held-Suarez test for general circulation models. This work was supported by NASA Planetary Atmospheres grant NX12AI69G, NASA Astrophysics Data Analysis Program grant NNX13AF38G, and NASA Exoplanets Research Program grant NNX17AB62G.
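    An example of the analytically verified category: for an isothermal, non-scattering slab with no incident radiation, the emergent intensity is I = B(T)(1 - e^(-tau)), which a layered numerical integration must reproduce. The sketch below is a generic illustration of such a test, not BARTTest's actual code:

```python
import math

def planck(nu_hz, t_k):
    """Planck spectral radiance B_nu(T) in SI units."""
    h, c, k = 6.62607015e-34, 2.99792458e8, 1.380649e-23
    return 2 * h * nu_hz**3 / c**2 / math.expm1(h * nu_hz / (k * t_k))

def emergent_intensity_numeric(nu_hz, t_k, tau_total, n_layers=1000):
    """Sum I = B(T) e^{-tau_above} (1 - e^{-dtau}) over discrete layers."""
    d_tau = tau_total / n_layers
    intensity, tau_above = 0.0, 0.0
    for _ in range(n_layers):
        intensity += planck(nu_hz, t_k) * math.exp(-tau_above) * (1 - math.exp(-d_tau))
        tau_above += d_tau
    return intensity

nu, temp, tau = 3e13, 1000.0, 2.0           # ~10-micron frequency, 1000 K, tau = 2
analytic = planck(nu, temp) * (1 - math.exp(-tau))
numeric = emergent_intensity_numeric(nu, temp, tau)
assert abs(numeric - analytic) / analytic < 1e-3   # the "test" in the test suite
print(analytic, numeric)
```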

  13. Software architecture for the ORNL large-coil test facility data system

    International Nuclear Information System (INIS)

    Blair, E.T.; Baylor, L.R.

    1986-01-01

    The VAX-based data-acquisition system for the International Fusion Superconducting Magnet Test Facility (IFSMTF) at Oak Ridge National Laboratory (ORNL) is a second-generation system that evolved from a PDP-11/60-based system used during the initial phase of facility testing. The VAX-based software represents a layered implementation that provides integrated access to all of the data sources within the system, decoupling end-user data retrieval from various front-end data sources through a combination of software architecture and instrumentation data bases. Independent VAX processes manage the various front-end data sources, each being responsible for controlling, monitoring, acquiring, and disposing data and control parameters for access from the data retrieval software. This paper describes the software architecture and the functionality incorporated into the various layers of the data system

  14. Software architecture for the ORNL large coil test facility data system

    International Nuclear Information System (INIS)

    Blair, E.T.; Baylor, L.R.

    1986-01-01

    The VAX-based data acquisition system for the International Fusion Superconducting Magnet Test Facility (IFSMTF) at Oak Ridge National Laboratory (ORNL) is a second-generation system that evolved from a PDP-11/60-based system used during the initial phase of facility testing. The VAX-based software represents a layered implementation that provides integrated access to all of the data sources within the system, decoupling end-user data retrieval from various front-end data sources through a combination of software architecture and instrumentation data bases. Independent VAX processes manage the various front-end data sources, each being responsible for controlling, monitoring, acquiring and disposing data and control parameters for access from the data retrieval software. This paper describes the software architecture and the functionality incorporated into the various layers of the data system

  15. Design and Implementation of a Media Access Component at Picsearch Using a Rigorous Software Engineering Approach

    OpenAIRE

    Silva, Diego Núñez

    2011-01-01

    With the arrival of a new generation of sophisticated smartphones, possibilities for mobile video usage are presenting exciting new opportunities. This Master's thesis is based on a collaboration with Picsearch to design and implement a software component that enables the company's video services in Android smartphones.

  16. Reliability Analysis and Optimal Release Problem Considering Maintenance Time of Software Components for an Embedded OSS Porting Phase

    Science.gov (United States)

    Tamura, Yoshinobu; Yamada, Shigeru

    OSS (open source software) systems, which serve as key components of critical infrastructure in our social life, are still ever-expanding. Embedded OSS systems in particular have been gaining a lot of attention in the embedded-system area, e.g., Android, BusyBox, TRON. However, poor handling of quality problems and customer support inhibits the progress of embedded OSS. Also, it is difficult for developers to assess the reliability and portability of embedded OSS on a single-board computer. In this paper, we propose a method of software reliability assessment based on flexible hazard rates for embedded OSS. We also analyze actual data of software failure-occurrence time-intervals to show numerical examples of software reliability assessment for embedded OSS. Moreover, we compare the proposed hazard rate model for embedded OSS with typical conventional hazard rate models by using goodness-of-fit criteria. Furthermore, we discuss the optimal software release problem for the porting phase based on the total expected software maintenance cost.
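    The paper's flexible hazard-rate model is not reproduced here, but the general workflow of fitting a software reliability growth model to failure-occurrence times, as a prelude to goodness-of-fit comparison and release planning, can be illustrated with the classical Goel-Okumoto NHPP, m(t) = a(1 - e^(-bt)). The failure-time data below are invented:

```python
import math

def fit_goel_okumoto(times):
    """MLE fit of the Goel-Okumoto NHPP m(t) = a(1 - e^{-bt}) to failure times.

    `times` are cumulative failure-occurrence times; observation ends at the
    last failure. Returns (a, b), solved by bisection on the b-equation.
    """
    n, big_t, s = len(times), times[-1], sum(times)

    def g(b):  # derivative of the log-likelihood with respect to b
        e = math.exp(-b * big_t)
        return n / b - s - n * big_t * e / (1 - e)

    lo, hi = 1e-8, 10.0
    while g(hi) > 0:      # widen until the root is bracketed
        hi *= 2
    for _ in range(200):  # bisection
        mid = 0.5 * (lo + hi)
        (lo, hi) = (mid, hi) if g(mid) > 0 else (lo, mid)
    b = 0.5 * (lo + hi)
    a = n / (1 - math.exp(-b * big_t))
    return a, b

# Toy failure-time data (hours); a real study would compare several
# fitted models by goodness-of-fit criteria before choosing one.
failures = [8, 21, 40, 65, 98, 140, 195, 265, 355, 470]
a, b = fit_goel_okumoto(failures)
print(f"expected total faults a={a:.1f}, detection rate b={b:.4f}")
```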

  17. EXAFS Phase Retrieval Solution Tracking for Complex Multi-Component System: Synthesized Topological Inverse Computation

    International Nuclear Information System (INIS)

    Lee, Jay Min; Yang, Dong-Seok; Bunker, Grant B

    2013-01-01

    Using the FEFF kernel A(k,r), we describe the inverse computation from χ(k)-data to g(r)-solution in terms of a singularity regularization method based on a complete Bayesian statistics process. In this work, we topologically decompose the system-matched invariant projection operators into two distinct types, (A⁺AA⁺A) and (AA⁺AA⁺), and achieve Synthesized Topological Inversion Computation (STIC) by employing a 12-operator closed-loop emulator of the symplectic transformation. This leads to a numerically self-consistent solution as the optimal near-singular regularization parameters are sought, dramatically suppressing the instability problems connected with finite-precision arithmetic in ill-posed systems. By statistically correlating a pair of measured data, it was feasible to compute an optimal EXAFS phase retrieval solution expressed in terms of the complex-valued χ(k), and this approach was successfully used to determine the optimal g(r) for a complex multi-component system.
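    The STIC machinery itself is beyond a short sketch, but the underlying problem it addresses, stabilizing an ill-posed inversion with regularization, can be illustrated with classical Tikhonov regularization; this is the textbook technique, not the authors' method:

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Regularized least squares: minimize ||Ax - b||^2 + lam * ||x||^2.

    The closed form (A^T A + lam I)^{-1} A^T b damps the small singular
    values that make an ill-posed inversion amplify noise.
    """
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

rng = np.random.default_rng(0)
# Ill-conditioned kernel: smooth Gaussian blur rows (tiny singular values).
grid = np.linspace(0, 1, 60)
A = np.exp(-((grid[:, None] - grid[None, :]) / 0.05) ** 2)
x_true = np.exp(-((grid - 0.5) / 0.1) ** 2)        # the "g(r)-like" target
b = A @ x_true + 1e-4 * rng.standard_normal(60)    # noisy "chi(k)-like" data

naive = np.linalg.solve(A, b)                       # unregularized: noise-dominated
reg = tikhonov_solve(A, b, lam=1e-6)
print("naive error:      ", np.linalg.norm(naive - x_true))
print("regularized error:", np.linalg.norm(reg - x_true))
```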

  18. Techniques for Soundscape Retrieval and Synthesis

    Science.gov (United States)

    Mechtley, Brandon Michael

    The study of acoustic ecology is concerned with the manner in which life interacts with its environment as mediated through sound. As such, a central focus is that of the soundscape: the acoustic environment as perceived by a listener. This dissertation examines the application of several computational tools in the realms of digital signal processing, multimedia information retrieval, and computer music synthesis to the analysis of the soundscape. Namely, these tools include a) an open source software library, Sirens, which can be used to segment long environmental field recordings into individual sonic events and to compare these events in terms of acoustic content, b) a graph-based retrieval system that can use these measures of acoustic similarity and measures of semantic similarity using the lexical database WordNet to perform both text-based retrieval and automatic annotation of environmental sounds, and c) new techniques for the dynamic, realtime parametric morphing of multiple field recordings, informed by the geographic paths along which they were recorded.

  19. Software architecture for the ORNL large coil test facility data system

    International Nuclear Information System (INIS)

    Blair, E.T.; Baylor, L.R.

    1986-01-01

    The VAX-based data acquisition system for the International Fusion Superconducting Magnet Test Facility (IFSMTF) at Oak Ridge National Laboratory (ORNL) is a second-generation system that evolved from a PDP-11/60-based system used during the initial phase of facility testing. The VAX-based software represents a layered implementation that provides integrated access to all of the data sources within the system, decoupling end-user data retrieval from various front-end data sources through a combination of software architecture and instrumentation data bases. Independent VAX processes manage the various front-end data sources, each being responsible for controlling, monitoring, acquiring, and disposing data and control parameters for access from the data retrieval software.

  20. Utah Text Retrieval Project

    Energy Technology Data Exchange (ETDEWEB)

    Hollaar, L A

    1983-10-01

    The Utah Text Retrieval project seeks well-engineered solutions to the implementation of large, inexpensive, rapid text information retrieval systems. The project has three major components. Perhaps the best known is the work on the specialized processors, particularly search engines, necessary to achieve the desired performance and cost. The other two concern the user interface to the system and the system's internal structure. The work on user interface development is not only concentrating on the syntax and semantics of the query language, but also on the overall environment the system presents to the user. Environmental enhancements include convenient ways to browse through retrieved documents, access to other information retrieval systems through gateways supporting a common command interface, and interfaces to word processing systems. The system's internal structure is based on a high-level data communications protocol linking the user interface, index processor, search processor, and other system modules. This allows them to be easily distributed in a multi- or specialized-processor configuration. It also allows new modules, such as a knowledge-based query reformulator, to be added. 15 references.

  1. How organisation of architecture documentation affects architectural knowledge retrieval

    NARCIS (Netherlands)

    de Graaf, K.A.; Liang, P.; Tang, A.; Vliet, J.C.

    A common approach to software architecture documentation in industry projects is the use of file-based documents. This approach offers a single-dimensional arrangement of the architectural knowledge. Knowledge retrieval from file-based architecture documentation is efficient if the organisation of

  2. Pybus - A Python Software Bus

    International Nuclear Information System (INIS)

    Lavrijsen, Wim T.L.P.

    2004-01-01

    A software bus, just like its hardware equivalent, allows for the discovery, installation, configuration, loading, unloading, and run-time replacement of software components, as well as channeling of inter-component communication. Python, a popular open-source programming language, encourages a modular design in software written in it, but it offers little or no component functionality. However, the language and its interpreter provide sufficient hooks to implement a thin, integral layer of component support. This functionality can be presented to the developer in the form of a module, making it very easy to use. This paper describes a Python module, PyBus, with which the concept of a "software bus" can be realized in Python. It demonstrates, within the context of the ATLAS software framework Athena, how PyBus can be used for the installation and (run-time) configuration of software, not necessarily Python modules, from a Python application in a way that is transparent to the end-user.

  3. Simulation of the removal of NET internal components with dynamic modeling software

    International Nuclear Information System (INIS)

    Becquet, M.; Crutzen, Y.R.; Farfaletti-Casali, F.

    1989-01-01

    The replacement of the internal plasma-facing components (first-wall and blanket segments) for maintenance or at the end of their lifetime is an important aspect of the design of the Next European Torus (NET) and of the remote handling procedures. The first phase of development of the design software tool INVDYN (inverse dynamics) is presented, which will allow optimization of the movements of the internal segments during replacement, taking into account inertial effects and structural deformations. A first analysis of the removal of one NET internal segment provides, for a defined trajectory, the required generalized forces that must be applied on the crane system.

  4. Exploiting IoT Technologies and Open Source Components for Smart Seismic Network Instrumentation

    Science.gov (United States)

    Germenis, N. G.; Koulamas, C. A.; Foundas, P. N.

    2017-12-01

    The data collection infrastructure of any seismic network poses a number of requirements and trade-offs related to accuracy, reliability, power autonomy and installation & operational costs. Given the right hardware design at the edge of this infrastructure, the embedded software running inside the instruments is the heart of the pre-processing and communication services and of their integration with the central storage and processing facilities of the seismic network. This work demonstrates the feasibility and benefits of exploiting software components from heterogeneous sources in order to realize a smart seismic data logger, achieving higher reliability, faster integration and lower development and testing costs for critical functionality that is in turn responsible for the cost- and power-efficient operation of the device. The instrument's software builds on top of widely used open source components around the Linux kernel with real-time extensions, the core Debian Linux distribution, the earthworm and seiscomp tooling frameworks, as well as components from the Internet of Things (IoT) world, such as the CoAP and MQTT protocols for the signaling plane, besides the widely used de-facto standards of the application domain at the data plane, such as the SeedLink protocol. By using an innovative integration of features based on lower level GPL components of the seiscomp suite with higher level processing earthworm components, coupled with IoT protocol extensions to the latter, the instrument can implement smart functionality such as network-controlled, event-triggered data transmission in parallel with edge archiving and on-demand, short-term historical data retrieval.
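
    A flavour of the event-triggered transmission described above can be given in a few lines of Python. The sketch below publishes a trigger notification over MQTT when a short-term/long-term average ratio exceeds a threshold; the broker address, topic name, trigger ratio and the digitizer stub are all illustrative assumptions, not details of the actual instrument.

        # Event-triggered notification over MQTT (paho-mqtt 1.x style client).
        import json
        import time
        import numpy as np
        import paho.mqtt.client as mqtt

        client = mqtt.Client()
        client.connect("broker.example.org", 1883)      # hypothetical broker

        def read_adc():
            """Stand-in for the real digitizer driver call."""
            return float(np.random.normal())

        def sta_lta(samples, short=50, long_=500):
            """Short-term / long-term average ratio, a classic seismic event trigger."""
            sta = np.mean(np.abs(samples[-short:]))
            lta = np.mean(np.abs(samples[-long_:]))
            return sta / lta if lta > 0 else 0.0

        buffer = np.zeros(500)
        while True:
            buffer = np.roll(buffer, -1)
            buffer[-1] = read_adc()
            ratio = sta_lta(buffer)
            if ratio > 3.0:                             # assumed trigger threshold
                payload = json.dumps({"t": time.time(), "ratio": ratio})
                client.publish("station/STA01/event", payload, qos=1)
            time.sleep(0.01)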

  5. Status of sorption information retrieval system

    International Nuclear Information System (INIS)

    Hostetler, D.D.; Serne, R.J.; Brandstetter, A.

    1979-09-01

    A Sorption Information Retrieval System (SIRS) is being designed to provide an efficient, computerized data base for information on radionuclide sorption in geologic media. The data bank will include Kd values for a large number of radionuclides occurring in radioactive wastes originating from the commercial nuclear power industry. Kd values determined to date span several groundwater compositions and a wide variety of rock types and minerals. The data system will not only include Kd values, but also background information on the experiments themselves. This will allow the potential user to retrieve not only the Kd values of interest but also sufficient information to evaluate the accuracy and usefulness of the data. During FY-1979, the logic structure of the system was designed, the software programmed, the data categories selected, and the data format specified. About 40% of the approximately 5000 Kd experiments performed by the Waste Isolation Safety Assessment Program (WISAP) and its subcontractors during FY-1977 and FY-1978 have been evaluated, coded and keypunched. Additional software improvements and system testing are needed before the system will be fully operational. A workshop requested by the NEA was held to discuss potential international participation in the data system.

  6. Reliability parameters of distribution networks components

    Energy Technology Data Exchange (ETDEWEB)

    Gono, R.; Kratky, M.; Rusek, S.; Kral, V. [Technical Univ. of Ostrava (Czech Republic)

    2009-03-11

    This paper presented a framework for the retrieval of parameters from various heterogeneous power system databases. The framework was designed to transform the heterogeneous outage data into a common relational scheme. The framework was used to retrieve outage data parameters from the Czech and Slovak republics in order to demonstrate the scalability of the framework. The reliability computation was performed in 2 phases, representing the retrieval of component reliability parameters and the reliability computation itself. Reliability rates were determined using component reliability and global reliability indices. Input data for the reliability computation were retrieved from data on equipment operating under similar conditions, while the probability of failure-free operation was evaluated by determining component status. Anomalies in distribution outage data were described as scheme, attribute, and term differences. Input types consisted of input relations; transformation programs; codebooks; and translation tables. The system was used to successfully retrieve data from 7 distributors in the Czech Republic and Slovak Republic between 2000 and 2007. The database included 301,555 records. Data were queried using the SQL language. 29 refs., 2 tabs., 2 figs.
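
    The two-phase idea above (first retrieve component reliability parameters from pooled outage records, then compute reliability indices from them) can be sketched in a few lines. In the Python sketch below, the record layout and the numbers are illustrative assumptions, not the paper's schema or data.

        # Phase 1: per-component-type failure rates from pooled outage records.
        # Phase 2: probability of failure-free operation from those rates.
        import math

        # (component_type, failures_observed, unit-years_in_service) -- made-up data
        records = [
            ("overhead_line_22kV", 124, 5100.0),
            ("cable_22kV",          31, 2400.0),
            ("transformer_22kV",     9, 3800.0),
        ]

        for ctype, failures, unit_years in records:
            lam = failures / unit_years              # failures per unit-year
            p_ok = math.exp(-lam * 1.0)              # P(no failure over one year)
            print(f"{ctype}: lambda={lam:.4f} /yr, P(failure-free year)={p_ok:.3f}")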

  7. Age-related alterations of brain network underlying the retrieval of emotional autobiographical memories: an fMRI study using independent component analysis.

    Science.gov (United States)

    Ge, Ruiyang; Fu, Yan; Wang, Dahua; Yao, Li; Long, Zhiying

    2014-01-01

    Normal aging has been shown to modulate the neural underpinnings of autobiographical memory and emotion processing. Moreover, previous research has suggested that aging produces a "positivity effect" in autobiographical memory. Although a few imaging studies have investigated the neural mechanism of the positivity effect, the neural substrates underlying the positivity effect in emotional autobiographical memory are unclear. To understand the age-related neural changes in emotional autobiographical memory that underlie the positivity effect, the present functional magnetic resonance imaging (fMRI) study used the independent component analysis (ICA) method to compare brain networks in younger and older adults as they retrieved positive and negative autobiographical events. Compared to their younger counterparts, older adults reported relatively higher positive feelings when retrieving emotional autobiographical events. Imaging data indicated an age-related reversal within the ventromedial prefrontal/anterior cingulate cortex (VMPFC/ACC) and the left amygdala of the brain networks that were engaged in the retrieval of autobiographical events with different valence. The retrieval of negative events compared to positive events induced stronger activity in the VMPFC/ACC and weaker activity in the amygdala for the older adults, whereas the younger adults showed a reversed pattern. Moreover, activity in the VMPFC/ACC within the task-related networks showed a negative correlation with the emotional valence intensity. These results may suggest that the positivity effect in older adults' autobiographical memories is potentially due to age-related changes in controlled emotional processing implemented by the VMPFC/ACC-amygdala circuit.
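
    For readers unfamiliar with the analysis technique, the following Python sketch shows a generic spatial ICA decomposition of fMRI-like data using scikit-learn's FastICA; it is a stand-in for the study's ICA pipeline, with synthetic numbers in place of real BOLD time series.

        # Decompose fMRI-like data into independent components (time courses x spatial maps).
        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(1)
        n_timepoints, n_voxels = 200, 5000
        X = rng.normal(size=(n_timepoints, n_voxels))   # stand-in for one subject's flattened 4D scan

        ica = FastICA(n_components=20, random_state=0, max_iter=500)
        time_courses = ica.fit_transform(X)             # shape (n_timepoints, n_components)
        spatial_maps = ica.components_                  # shape (n_components, n_voxels)

        # Task-related networks would then be identified by correlating each
        # component's time course with the retrieval-task regressor.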

  8. Age-related alterations of brain network underlying the retrieval of emotional autobiographical memories: An fMRI study using independent component analysis

    Directory of Open Access Journals (Sweden)

    Ruiyang eGe

    2014-08-01

    Normal aging has been shown to modulate the neural underpinnings of autobiographical memory and emotion processing. Moreover, previous research has suggested that aging produces a positivity effect in autobiographical memory. Although a few imaging studies have investigated the neural mechanism of the positivity effect, the neural substrates underlying the positivity effect in emotional autobiographical memory are unclear. To understand the age-related neural changes in emotional autobiographical memory that underlie the positivity effect, the present functional magnetic resonance imaging (fMRI) study used the independent component analysis (ICA) method to compare brain networks in younger and older adults as they retrieved positive and negative autobiographical events. Compared to their younger counterparts, older adults reported relatively higher positive feelings when retrieving emotional autobiographical events. Imaging data indicated an age-related reversal within the ventromedial prefrontal/anterior cingulate cortex (VMPFC/ACC) and the left amygdala of the brain networks that were engaged in the retrieval of autobiographical events with different valence. The retrieval of negative events compared to positive events induced stronger activity in the VMPFC/ACC and weaker activity in the amygdala for the older adults, whereas the younger adults showed a reversed pattern. Moreover, activity in the VMPFC/ACC within the task-related networks showed a negative correlation with the emotional valence intensity. These results may suggest that the positivity effect in older adults’ autobiographical memories is potentially due to age-related changes in controlled emotional processing implemented by the VMPFC/ACC-amygdala circuit.

  9. Development and application of analytical methods for the assessment of software-based IC components in German nuclear power plants; Entwicklung und Einsatz von Analysemethoden zur Beurteilung softwarebasierter leittechnischer Einrichtungen in deutschen Kernkraftwerken

    Energy Technology Data Exchange (ETDEWEB)

    Arians, Robert; Arnold, Simone; Blum, Stefanie; Buchholz, Marcel; Lochthofen, Andre; Quester, Claudia; Sommer, Dagmar

    2015-03-15

    In this report, results and data from examinations concerning software-based I and C components are evaluated. As the failure modes and failure causes of software-based components differ fundamentally from those of non-software-based components, an evaluation of the operating experience of such components was carried out. This evaluation should show whether or not existing approaches for non-software-based components can be directly transferred to software-based components, or if a different approach has to be developed. The national as well as the international state of the art and science was gathered and is described. To include failures in non-safety systems, events not fulfilling the incident reporting criteria of German authorities were also included in this evaluation. The data provided by licensees of six German NPPs (different Boiling Water Reactors and Pressurized Water Reactors) was recorded for at least 8 years. The software-based components used in the NPPs are identified and their operating experience is analyzed in order to identify relevant failure modes and to establish a knowledge base for future failure rating.

  10. Best Entry Points for Structured Document Retrieval - Part II: Types, Usage and Effectiveness

    DEFF Research Database (Denmark)

    Reid, Jane; Lalmas, Mounia; Finesilver, Karen

    2006-01-01

    Structured document retrieval makes use of document components as the basis of the retrieval process, rather than complete documents. The inherent relationships between these components make it vital to support users' natural browsing behaviour in order to offer effective and efficient access...

  11. Algorithms to retrieve optical properties of three component aerosols from two-wavelength backscatter and one-wavelength polarization lidar measurements considering nonsphericity of dust

    International Nuclear Information System (INIS)

    Nishizawa, Tomoaki; Sugimoto, Nobuo; Matsui, Ichiro; Shimizu, Atsushi; Okamoto, Hajime

    2011-01-01

    We developed backward and forward types of algorithms for estimating the vertical profiles of extinction coefficients at 532 nm for three component aerosols (water-soluble, dust, and sea salt) using three-channel Mie-scattering lidar data of the backscatter (β) at 532 and 1064 nm and the depolarization ratio (δ) at 532 nm. While the water-soluble and sea-salt particles were reasonably assumed to be spherical, the dust particles were treated as randomly oriented spheroids to account for their nonsphericity. The introduction of spheroid models enabled us to more effectively use the three-channel data (i.e., 2β+1δ data) and to reduce the uncertainties caused by the assumption of spherical dust particles in our previously developed algorithms. We also performed an extensive sensitivity study to estimate retrieval errors, which showed that the errors in the extinction coefficient for each aerosol component were smaller than 30% (60%) for the backward (forward) algorithm when the measurement errors were ±5%. We demonstrated the ability of the algorithms to partition aerosol layers consisting of three aerosol components by applying them to shipborne lidar data. Comparisons with sky radiometer measurements revealed that the retrieved optical thickness and angstrom exponent of aerosols using the algorithms developed in this paper agreed well with the sky radiometer measurements (within 6%).

  12. Comparative study of material loss at the taper interface in retrieved metal-on-polyethylene and metal-on-metal femoral components from a single manufacturer.

    Science.gov (United States)

    Bills, Paul; Racasan, Radu; Bhattacharya, Saugatta; Blunt, Liam; Isaac, Graham

    2017-08-01

    There have been a number of reports on the occurrence of taper corrosion and/or fretting, and some have speculated on a link to the occurrence of adverse local tissue reaction, specifically in relation to total hip replacements which have a metal-on-metal bearing. As such, a study was carried out to compare the magnitude of material loss at the taper in a series of retrieved femoral heads used in metal-on-polyethylene bearings with that in a series of retrieved heads used in metal-on-metal bearings. A total of 36 metal-on-polyethylene and 21 metal-on-metal femoral components were included in the study, all of which were received from a customer complaint database. Furthermore, a total of nine as-manufactured femoral components were included to provide a baseline for characterisation. All taper surfaces were assessed using an established corrosion scoring method, and measurements were taken of the female taper surface using contact profilometry. In the case of metal-on-metal components, the bearing wear was also assessed using coordinate metrology to determine whether or not there was a relationship between bearing and taper material loss in these cases. The study found that in this cohort the median value of metal-on-polyethylene taper loss was 1.25 mm³, with the consequent median value for metal-on-metal taper loss being 1.75 mm³. This study also suggests that manufacturing form can result in an apparent loss of material from the taper surface, determined to have a median value of 0.59 mm³. Therefore, it is clear that form variability is a significant confounding factor in the measurement of material loss from the tapers of femoral heads retrieved following revision surgery.

  13. Finding upper bounds for software failure probabilities - experiments and results

    International Nuclear Information System (INIS)

    Kristiansen, Monica; Winther, Rune

    2005-09-01

    This report looks into some aspects of using Bayesian hypothesis testing to find upper bounds for software failure probabilities. In the first part, the report evaluates the Bayesian hypothesis testing approach for finding upper bounds for failure probabilities of single software components. The report shows how different choices of prior probability distributions for a software component's failure probability influence the number of tests required to obtain adequate confidence in a software component. In the evaluation, both the effect of the shape of the prior distribution as well as one's prior confidence in the software component were investigated. In addition, different choices of prior probability distributions are discussed based on their relevance in a software context. In the second part, ideas on how the Bayesian hypothesis testing approach can be extended to assess systems consisting of multiple software components are given. One of the main challenges when assessing systems consisting of multiple software components is to include dependency aspects in the software reliability models. However, different types of failure dependencies between software components must be modelled differently. Identifying different types of failure dependencies is therefore an important condition for choosing a prior probability distribution that correctly reflects one's prior belief in the probability of software components failing dependently. In this report, software components include both general in-house software components, as well as pre-developed software components (e.g. COTS, SOUP, etc). (Author)
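
    The single-component argument can be made concrete with a small calculation. Under a Beta(a, b) prior on a component's failure probability p, n consecutive failure-free tests yield a Beta(a, b + n) posterior, so one can ask how many tests are needed before P(p < bound) reaches a target confidence. The Python sketch below uses an illustrative uniform prior and bound, not values from the report.

        # How many failure-free tests until we are 99% confident that p < 1e-3?
        from scipy.stats import beta

        a, b = 1.0, 1.0              # Beta(1, 1) = uniform prior on the failure probability
        p_bound, target = 1e-3, 0.99

        n = 0
        while beta.cdf(p_bound, a, b + n) < target:
            n += 1
        print(f"{n} failure-free tests needed")   # roughly 4600 for the uniform prior

        # A sharper prior (e.g. Beta(1, 1000), reflecting prior confidence in the
        # component) reduces the required number of tests substantially.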

  14. Geospatial metadata retrieval from web services

    Directory of Open Access Journals (Sweden)

    Ivanildo Barbosa

    Nowadays, producers of geospatial data in either raster or vector formats are able to make them available on the World Wide Web by deploying web services that enable users to access and query those contents even without specific software for geoprocessing. Several providers around the world have deployed instances of WMS (Web Map Service), WFS (Web Feature Service) and WCS (Web Coverage Service), all of them specified by the Open Geospatial Consortium (OGC). In consequence, metadata about the available contents can be retrieved to be compared with similar offline datasets from other sources. This paper presents a brief summary and describes the matching process between the specifications for OGC web services (WMS, WFS and WCS) and the specifications for metadata required by ISO 19115 - adopted as reference for several national metadata profiles, including the Brazilian one. This process focuses on retrieving metadata about the identification and data quality packages, as well as indicating the directions to retrieve metadata related to other packages. Therefore, users are able to assess whether the provided contents fit their purposes.

  15. Design implications for task-specific search utilities for retrieval and re-engineering of code

    Science.gov (United States)

    Iqbal, Rahat; Grzywaczewski, Adam; Halloran, John; Doctor, Faiyaz; Iqbal, Kashif

    2017-05-01

    The importance of information retrieval systems is unquestionable in the modern society and both individuals as well as enterprises recognise the benefits of being able to find information effectively. Current code-focused information retrieval systems such as Google Code Search, Codeplex or Koders produce results based on specific keywords. However, these systems do not take into account developers' context such as development language, technology framework, goal of the project, project complexity and developer's domain expertise. They also impose additional cognitive burden on users in switching between different interfaces and clicking through to find the relevant code. Hence, they are not used by software developers. In this paper, we discuss how software engineers interact with information and general-purpose information retrieval systems (e.g. Google, Yahoo!) and investigate to what extent domain-specific search and recommendation utilities can be developed in order to support their work-related activities. In order to investigate this, we conducted a user study and found that software engineers followed many identifiable and repeatable work tasks and behaviours. These behaviours can be used to develop implicit relevance feedback-based systems based on the observed retention actions. Moreover, we discuss the implications for the development of task-specific search and collaborative recommendation utilities embedded with the Google standard search engine and Microsoft IntelliSense for retrieval and re-engineering of code. Based on implicit relevance feedback, we have implemented a prototype of the proposed collaborative recommendation system, which was evaluated in a controlled environment simulating the real-world situation of professional software engineers. The evaluation has achieved promising initial results on the precision and recall performance of the system.
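
    As an illustration of the implicit relevance feedback idea discussed above, the Python sketch below boosts code-search results using logged retention actions (e.g. copying a snippet or dwelling on a result). The scoring weights, log format and snippet names are illustrative assumptions, not the prototype's actual design.

        # Re-rank code-search results with implicit feedback from retention actions.
        from collections import Counter

        def rerank(results, feedback_log, boost=0.5):
            """results: list of (doc_id, base_score);
            feedback_log: doc_ids of past retention actions in a similar task context."""
            retention = Counter(feedback_log)
            total = sum(retention.values()) or 1
            rescored = [(doc_id, score + boost * retention[doc_id] / total)
                        for doc_id, score in results]
            return sorted(rescored, key=lambda pair: pair[1], reverse=True)

        results = [("snippet_17", 0.82), ("snippet_03", 0.80), ("snippet_44", 0.75)]
        log = ["snippet_03", "snippet_03", "snippet_44"]   # copy/dwell events
        print(rerank(results, log))                        # snippet_03 moves to the top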

  16. Understanding software faults and their role in software reliability modeling

    Science.gov (United States)

    Munson, John C.

    1994-01-01

    regression equation. Since most of the existing metrics have common elements and are linear combinations of these common elements, it seems reasonable to investigate the structure of the underlying common factors or components that make up the raw metrics. The technique we have chosen to use to explore this structure is a procedure called principal components analysis. Principal components analysis is a decomposition technique that may be used to detect and analyze collinearity in software metrics. When confronted with a large number of metrics measuring a single construct, it may be desirable to represent the set by some smaller number of variables that convey all, or most, of the information in the original set. Principal components are linear transformations of a set of random variables that summarize the information contained in the variables. The transformations are chosen so that the first component accounts for the maximal amount of variation of the measures of any possible linear transform; the second component accounts for the maximal amount of residual variation; and so on. The principal components are constructed so that they represent transformed scores on dimensions that are orthogonal. Through the use of principal components analysis, it is possible to have a set of highly related software attributes mapped into a small number of uncorrelated attribute domains. This definitively solves the problem of multi-collinearity in subsequent regression analysis. There are many software metrics in the literature, but principal component analysis reveals that there are few distinct sources of variation, i.e. dimensions, in this set of metrics. It would appear perfectly reasonable to characterize the measurable attributes of a program with a simple function of a small number of orthogonal metrics each of which represents a distinct software attribute domain.
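
    The collinearity argument above is easy to demonstrate. In the Python sketch below, several made-up metrics that share a common "size" dimension are decomposed with principal components analysis; nearly all of the variance concentrates in one or two orthogonal components. The metric names are generic stand-ins, not the study's metric suite.

        # PCA on correlated software metrics: few orthogonal dimensions remain.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(2)
        size = rng.normal(100, 30, 200)                  # latent "size" construct
        metrics = np.column_stack([
            size + rng.normal(0, 5, 200),                # lines of code
            0.3 * size + rng.normal(0, 3, 200),          # cyclomatic complexity
            0.8 * size + rng.normal(0, 8, 200),          # Halstead volume
            rng.normal(10, 2, 200),                      # an unrelated metric
        ])

        pca = PCA()
        pca.fit(StandardScaler().fit_transform(metrics))
        print(pca.explained_variance_ratio_)             # dominated by the first 1-2 components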

  17. Research of component-based software development approach in RFID field

    Institute of Scientific and Technical Information of China (English)

    徐孟娟; 杨威

    2012-01-01

    In this paper, component-based software development techniques are applied to the RFID field. Based on domain engineering analysis methods, the variable requirements within the RFID domain are encapsulated, isolated, and abstracted; the RFID architecture is analysed and an RFID software component model is extracted. For component management, classification methods for RFID components are studied and a faceted classification scheme is proposed, with a detailed description of the facets of the RFID software component classification and the term space of each facet.
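
    Faceted classification of components can be illustrated with a small lookup structure. In the Python sketch below, the facets, terms and component names are illustrative assumptions echoing the idea of a term space per facet, not the paper's actual scheme.

        # Faceted classification: each component is described by a term per facet.
        components = {
            "TagReaderDriver":   {"function": "data-acquisition", "layer": "device",
                                  "protocol": "ISO-18000-6C"},
            "EventFilter":       {"function": "filtering", "layer": "middleware",
                                  "protocol": "any"},
            "InventoryReporter": {"function": "reporting", "layer": "application",
                                  "protocol": "any"},
        }

        def retrieve(query):
            """Return components whose terms match every facet named in the query."""
            return [name for name, facets in components.items()
                    if all(facets.get(f) == t for f, t in query.items())]

        print(retrieve({"layer": "middleware"}))          # -> ['EventFilter']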

  18. Software for radioactive wastes database

    International Nuclear Information System (INIS)

    Souza, Eros Viggiano de; Reis, Luiz Carlos Alves

    1996-01-01

    A radioactive waste database was implemented at CDTN in 1991. The objectives are to register and retrieve information about wastes generated and received at the Centre in order to improve the waste management. Since 1995, the database has been reviewed and software has been developed aiming at processing information in a graphical environment (Windows 95 and Windows NT), minimising the possibility of errors and making the users' access more friendly. It was also envisaged to ease graphics and report editing and to make this database available to other CNEN institutes and even to external organizations. (author)

  19. A component-based groupware development methodology

    NARCIS (Netherlands)

    Guareis de farias, Cléver; Ferreira Pires, Luis; van Sinderen, Marten J.

    2000-01-01

    Software development in general and groupware applications in particular can greatly benefit from the reusability and interoperability aspects associated with software components. Component-based software development enables the construction of software artefacts by assembling prefabricated,

  20. PyBus -- A Python Software Bus

    OpenAIRE

    Lavrijsen, W

    2005-01-01

    A software bus, just like its hardware equivalent, allows for the discovery, installation, configuration, loading, unloading, and run-time replacement of software components, as well as channeling of inter-component communication. Python, a popular open-source programming language, encourages a modular design on software written in it, but it offers little or no component functionality. However, the language and its interpreter provide sufficient hooks to implement a thin, integral layer...

  1. Retrieval and validation of MetOp/IASI methane

    Directory of Open Access Journals (Sweden)

    E. De Wachter

    2017-12-01

    A new IASI methane product developed at the Royal Belgian Institute for Space Aeronomy (BIRA-IASB) is presented. The retrievals are performed with the ASIMUT-ALVL software based on the optimal estimation method (OEM). This paper gives an overview of the forward model and retrieval concept. The usefulness of reconstructed principal component compressed (PCC) radiances is highlighted. The information content study carried out in this paper shows that most IASI pixels contain between 0.9 and 1.6 independent pieces of information about the vertical distribution of CH4, with a good sensitivity in the mid- to upper troposphere. A detailed error analysis was performed. The total uncertainty is estimated to be 3.73 % for a CH4 partial column between 4 and 17 km. An extended validation with ground-based CH4 observations at 10 locations was carried out. IASI CH4 partial columns are found to correlate well with the ground-based data for 6 out of the 10 Fourier transform infrared (FTIR) stations, with correlation coefficients between 0.60 and 0.84. Relative mean differences between IASI and FTIR CH4 range between −2.31 and 4.04 % and are within the systematic uncertainty. For 6 out of the 10 stations the relative mean differences are smaller than ±1 %.
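
    The machinery behind an OEM retrieval step can be sketched for the linear case. In the numpy example below, the Jacobian, covariances and state dimensions are synthetic stand-ins rather than IASI values; the trace of the averaging kernel gives the degrees of freedom for signal, the quantity behind the 0.9-1.6 figure quoted above.

        # One linear optimal-estimation (MAP) retrieval step.
        import numpy as np

        rng = np.random.default_rng(3)
        m, n = 30, 10                      # radiance channels, state (profile) levels
        K = rng.normal(size=(m, n))        # Jacobian of the forward model at x_a
        S_e = 0.01 * np.eye(m)             # measurement-noise covariance
        S_a = np.eye(n)                    # a priori covariance
        x_a = np.zeros(n)                  # a priori state

        x_true = rng.normal(size=n)
        y = K @ x_true + rng.multivariate_normal(np.zeros(m), S_e)

        # x_hat = x_a + (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 (y - K x_a)
        Se_inv, Sa_inv = np.linalg.inv(S_e), np.linalg.inv(S_a)
        G = np.linalg.solve(K.T @ Se_inv @ K + Sa_inv, K.T @ Se_inv)   # gain matrix
        x_hat = x_a + G @ (y - K @ x_a)

        A = G @ K                          # averaging kernel
        print("degrees of freedom for signal:", np.trace(A))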

  2. Determinants to trigger memory reconsolidation: The role of retrieval and updating information.

    Science.gov (United States)

    Rodriguez-Ortiz, Carlos J; Bermúdez-Rattoni, Federico

    2017-07-01

    Long-term memories can undergo destabilization/restabilization processes, collectively called reconsolidation. However, the parameters that trigger memory reconsolidation are poorly understood and are a matter of intense investigation. In particular, memory retrieval is widely held to be a requisite for initiating reconsolidation. This assumption makes sense, since only relevant cues will induce reconsolidation of a specific memory. However, recent studies show that pharmacological inhibition of retrieval does not prevent memory from undergoing reconsolidation, indicating that memory reconsolidation occurs through a process that can be dissociated from retrieval. We propose that retrieval is not a unitary process but has two dissociable components, one leading to the expression of memory and the other to reconsolidation, referred to herein as the executer and the integrator respectively. The executer would lead to the behavioral expression of the memory. This component would be the one disrupted in the studies that show reconsolidation's independence from retrieval. The integrator would deal with reconsolidation. This component of retrieval would lead to long-term memory destabilization when specific conditions are met. We think that a substantial number of reports are consistent with the hypothesis that reconsolidation is only initiated when updating information is acquired. We suggest that the integrator would initiate reconsolidation to integrate updating information into long-term memory. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Age-related differences in working memory updating components.

    Science.gov (United States)

    Linares, Rocío; Bajo, M Teresa; Pelegrina, Santiago

    2016-07-01

    The aim of this study was to investigate possible age-related changes throughout childhood and adolescence in different component processes of working memory updating (WMU): retrieval, transformation, and substitution. A set of numerical WMU tasks was administered to four age groups (8-, 11-, 14-, and 21-year-olds). To isolate the effect of each of the WMU components, participants performed different versions of a task that included different combinations of the WMU components. The results showed an expected overall decrease in response times and an increase in accuracy performance with age. Most important, specific age-related changes in the retrieval component were found, demonstrating that the effect of retrieval on accuracy was larger in children than in adolescents or young adults. These findings indicate that the availability of representations from outside the focus of attention may change with age. Thus, the retrieval component of updating could contribute to the age-related changes observed in the performance of many updating tasks. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. An interactive end-user software application for a deep-sea photographic database

    Digital Repository Service at National Institute of Oceanography (India)

    Jaisankar, S.; Sharma, R.

    . The software is the first of its kind in deep-sea applications and it also attempts to educate the user about deep-sea photography. The application software is developed by modifying established routines and by creating new routines to save the retrieved...

  5. DOE Integrated Safeguards and Security (DISS) historical document archival and retrieval analysis, requirements and recommendations

    Energy Technology Data Exchange (ETDEWEB)

    Guyer, H.B.; McChesney, C.A.

    1994-10-07

    The overall primary objective of HDAR is to create a repository of historical personnel security documents and provide the functionality needed for archival and retrieval use by other software modules and application users of the DISS/ET system. The software product to be produced from this specification is the Historical Document Archival and Retrieval Subsystem. The product will provide the functionality to capture, retrieve and manage documents currently contained in the personnel security folders in DOE Operations Offices vaults at various locations across the United States. The long-term plan for DISS/ET includes the requirement to allow for capture and storage of arbitrary, currently undefined, clearance-related documents that fall outside the scope of the "cradle-to-grave" electronic processing provided by DISS/ET. However, this requirement is not within the scope of the requirements specified in this document.

  6. Development of nuclear reaction data retrieval system on Meme media

    International Nuclear Information System (INIS)

    Ohbayasi, Yosihide; Masui, Hiroshi; Aoyama, Shigeyoshi; Kato, Kiyoshi; Chiba, Masaki

    2000-01-01

    A newly designed retrieval system of charged particle nuclear reaction data is developed on Meme media architecture. We designed the network-based (client-server) retrieval system. The server system is constructed on a UNIX workstation with a relational database, and the client system is constructed on Microsoft Windows PC using an IntelligentPad software package. The IntelligentPad is currently available as developing Meme media. We will develop the system to realize effective utilization of nuclear reaction data: I. 'Re-production, Re-edit, Re-use', II. 'Circulation, Coordination and Evolution', III. 'Knowledge discovery'. (author)

  7. The astronaut and the banana peel: An EVA retriever scenario

    Science.gov (United States)

    Shapiro, Daniel G.

    1989-01-01

    To prepare for the problem of accidents in Space Station activities, the Extravehicular Activity Retriever (EVAR) robot is being constructed, whose purpose is to retrieve astronauts and tools that float free of the Space Station. Advanced Decision Systems is at the beginning of a project to develop research software capable of guiding EVAR through the retrieval process. This involves addressing problems in machine vision, dexterous manipulation, real time construction of programs via speech input, and reactive execution of plans despite the mishaps and unexpected conditions that arise in uncontrolled domains. The problem analysis phase of this work is presented. An EVAR scenario is used to elucidate major domain and technical problems. An overview of the technical approach to prototyping an EVAR system is also presented.

  8. Software Reuse Within the Earth Science Community

    Science.gov (United States)

    Marshall, James J.; Olding, Steve; Wolfe, Robert E.; Delnore, Victor E.

    2006-01-01

    Scientific missions in the Earth sciences frequently require cost-effective, highly reliable, and easy-to-use software, which can be a challenge for software developers to provide. The NASA Earth Science Enterprise (ESE) spends a significant amount of resources developing software components and other software development artifacts that may also be of value if reused in other projects requiring similar functionality. In general, software reuse is often defined as utilizing existing software artifacts. Software reuse can improve productivity and quality while decreasing the cost of software development, as documented by case studies in the literature. Since large software systems are often the results of the integration of many smaller and sometimes reusable components, ensuring reusability of such software components becomes a necessity. Indeed, designing software components with reusability as a requirement can increase the software reuse potential within a community such as the NASA ESE community. The NASA Earth Science Data Systems (ESDS) Software Reuse Working Group is chartered to oversee the development of a process that will maximize the reuse potential of existing software components while recommending strategies for maximizing the reusability potential of yet-to-be-designed components. As part of this work, two surveys of the Earth science community were conducted. The first was performed in 2004 and distributed among government employees and contractors. A follow-up survey was performed in 2005 and distributed among a wider community, to include members of industry and academia. The surveys were designed to collect information on subjects such as the current software reuse practices of Earth science software developers, why they choose to reuse software, and what perceived barriers prevent them from reusing software. In this paper, we compare the results of these surveys, summarize the observed trends, and discuss the findings. The results are very

  9. A passage retrieval method based on probabilistic information retrieval model and UMLS concepts in biomedical question answering.

    Science.gov (United States)

    Sarrouti, Mourad; Ouatik El Alaoui, Said

    2017-04-01

    Passage retrieval, the identification of top-ranked passages that may contain the answer for a given biomedical question, is a crucial component for any biomedical question answering (QA) system. Passage retrieval in open-domain QA is a longstanding challenge widely studied over the last decades. However, it still requires further efforts in biomedical QA. In this paper, we present a new biomedical passage retrieval method based on Stanford CoreNLP sentence/passage length, a probabilistic information retrieval (IR) model and UMLS concepts. In the proposed method, we first use our document retrieval system, based on the PubMed search engine and UMLS similarity, to retrieve documents relevant to a given biomedical question. We then take the abstracts from the retrieved documents and use the Stanford CoreNLP sentence splitter to produce a set of sentences, i.e., candidate passages. Using stemmed words and UMLS concepts as features for the BM25 model, we finally compute the similarity scores between the biomedical question and each of the candidate passages and keep the N top-ranked ones. Experimental evaluations performed on large standard datasets, provided by the BioASQ challenge, show that the proposed method achieves good performances compared with the current state-of-the-art methods. The proposed method significantly outperforms the current state-of-the-art methods by an average of 6.84% in terms of mean average precision (MAP). We have proposed an efficient passage retrieval method which can be used to retrieve relevant passages in biomedical QA systems with high mean average precision. Copyright © 2017 Elsevier Inc. All rights reserved.
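
    The scoring step named above can be illustrated with a compact BM25 implementation. In the Python sketch below, tokenisation is a whitespace split standing in for the Stanford CoreNLP preprocessing, and the passages are toy strings; UMLS concept identifiers would simply be appended to the token lists as extra features.

        # Score candidate passages against a question with the BM25 model.
        import math
        from collections import Counter

        def bm25_scores(query, passages, k1=1.2, b=0.75):
            docs = [p.lower().split() for p in passages]
            N = len(docs)
            avgdl = sum(len(d) for d in docs) / N
            df = Counter(term for d in docs for term in set(d))   # document frequencies
            scores = []
            for d in docs:
                tf = Counter(d)
                s = 0.0
                for term in query.lower().split():
                    if term not in tf:
                        continue
                    idf = math.log((N - df[term] + 0.5) / (df[term] + 0.5) + 1)
                    s += idf * tf[term] * (k1 + 1) / (
                        tf[term] + k1 * (1 - b + b * len(d) / avgdl))
                scores.append(s)
            return scores

        passages = ["metformin lowers blood glucose", "insulin regulates glucose uptake"]
        print(bm25_scores("glucose metformin", passages))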

  10. The migration of femoral components after total hip replacement surgery: accuracy and precision of software-aided measurements

    International Nuclear Information System (INIS)

    Decking, J.; Schuetz, U.; Decking, R.; Puhl, W.

    2003-01-01

    Objective: To assess the accuracy and precision of a software-aided system for measuring migration of femoral components after total hip replacement (THR) on digitised radiographs. Design and patients: Subsidence and varus-valgus tilt of THR stems within the femur were measured on digitised anteroposterior pelvic radiographs. The measuring software (UMA, GEMED, Germany) relies on bony landmarks and comparability parameters of two consecutive radiographs. Its accuracy and precision were calculated by comparing it with the gold standard in migration measurements, radiostereometric analysis (RSA). Radiographs and corresponding RSA measurements were performed in 60 patients (38-69 years) following cementless THR surgery. Results and conclusions: The UMA software measured the subsidence of the stems with an accuracy of ±2.5 mm and varus-valgus tilt with an accuracy of ±1.8° (95% confidence interval). A good interobserver and intraobserver reliability was calculated, with Cronbach's alpha ranging from 0.86 to 0.97. Measuring the subsidence of THR stems within the femur is an important parameter in the diagnosis of implant loosening. Software systems such as UMA improve the accuracy of migration measurements and are easy to use on routinely performed radiographs of operated hip joints. (orig.)

  11. Experimental evaluation of ontology-based HIV/AIDS frequently asked question retrieval system.

    Science.gov (United States)

    Ayalew, Yirsaw; Moeng, Barbara; Mosweunyane, Gontlafetse

    2018-05-01

    This study presents the results of experimental evaluations of an ontology-based frequently asked question retrieval system in the domain of HIV and AIDS. The main purpose of the system is to provide answers to questions on HIV/AIDS using an ontology. To evaluate the effectiveness of the frequently asked question retrieval system, we conducted two experiments. The first experiment focused on the evaluation of the quality of the ontology we developed using the OQuaRE evaluation framework, which is based on software quality metrics and metrics designed for ontology quality evaluation. The second experiment focused on evaluating the effectiveness of the ontology in retrieving relevant answers. For this we used an open-source information retrieval platform, Terrier, with the retrieval models BM25 and PL2. For the measurement of performance, we used the measures mean average precision, mean reciprocal rank, and precision at 5. The results suggest that frequently asked question retrieval with an ontology is more effective than frequently asked question retrieval without one in the domain of HIV/AIDS.
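
    The three effectiveness measures named above are simple to compute from ranked result lists and relevance judgements. The Python sketch below uses toy data, not the study's HIV/AIDS question set.

        # Mean average precision, mean reciprocal rank, and precision at 5.
        def precision_at_k(ranked, relevant, k=5):
            return sum(1 for d in ranked[:k] if d in relevant) / k

        def reciprocal_rank(ranked, relevant):
            return next((1 / (i + 1) for i, d in enumerate(ranked) if d in relevant), 0.0)

        def average_precision(ranked, relevant):
            hits, total = 0, 0.0
            for i, d in enumerate(ranked):
                if d in relevant:
                    hits += 1
                    total += hits / (i + 1)
            return total / max(len(relevant), 1)

        # Each run is (ranked answer ids, set of relevant answer ids) -- toy data.
        runs = [(["a2", "a7", "a1"], {"a7"}), (["a5", "a3", "a9"], {"a3", "a9"})]
        print("MAP:", sum(average_precision(r, rel) for r, rel in runs) / len(runs))
        print("MRR:", sum(reciprocal_rank(r, rel) for r, rel in runs) / len(runs))
        print("P@5:", sum(precision_at_k(r, rel) for r, rel in runs) / len(runs))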

  12. Optimization of Calcine Blending During Retrieval from Binsets

    International Nuclear Information System (INIS)

    Nelson, Lee Orville; Mohr, Charles Milton; Taylor, Dean Dalton

    2000-01-01

    This report documents a study performed during advanced feasibility studies for the INTEC Technology Development Facility (ITDF). The study was commissioned to provide information about functional requirements for the ITDF related to development of equipment and procedures for retrieving radioactive calcine from binset storage at the Idaho Nuclear Technology and Engineering Center (INTEC) at the Idaho National Engineering and Environmental Laboratory (INEEL). Calcine will be retrieved prior to treating it for permanent disposal in a national repository for high level waste. The objective of this study was to estimate the degree of homogenization of the calcine that might be achieved through optimized retrieval and subsequent blending. Such homogenization has the potential to reduce the costs of treating the calcine and of qualifying the final waste forms for acceptance at the repository. Results from the study indicate that optimized retrieval and blending can reduce the peak concentration variations of key components (Al, Zr, F) in blended batches of retrieved calcine. During un-optimized retrieval these variations are likely to be 81-138%, while optimized retrieval can reduce them to the 5-10% range.

  13. Component-based development process and component lifecycle

    NARCIS (Netherlands)

    Crnkovic, I.; Chaudron, M.R.V.; Larsson, S.

    2006-01-01

    The process of component- and component-based system development differs in many significant ways from the "classical" development process of software systems. The main difference is in the separation of the development process of components from the development process of systems. This fact has a

  14. A Preliminary ZEUS Lightning Location Error Analysis Using a Modified Retrieval Theory

    Science.gov (United States)

    Elander, Valjean; Koshak, William; Phanord, Dieudonne

    2004-01-01

    The ZEUS long-range VLF arrival time difference lightning detection network now covers both Europe and Africa, and there are plans for further expansion into the western hemisphere. In order to fully optimize and assess ZEUS lightning location retrieval errors and to determine the best placement of future receivers expected to be added to the network, a software package is being developed jointly between the NASA Marshall Space Flight Center (MSFC) and the University of Nevada Las Vegas (UNLV). The software package, called the ZEUS Error Analysis for Lightning (ZEAL), will be used to obtain global scale lightning location retrieval error maps using both a Monte Carlo approach and chi-squared curvature matrix theory. At the core of ZEAL will be an implementation of an Iterative Oblate (IO) lightning location retrieval method recently developed at MSFC. The IO method will be appropriately modified to account for variable wave propagation speed, and the new retrieval results will be compared with the current ZEUS retrieval algorithm to assess potential improvements. In this preliminary ZEAL work effort, we defined 5000 source locations evenly distributed across the Earth. We then used the existing (as well as potential future ZEUS sites) to simulate arrival time data between source and ZEUS site. A total of 100 sources were considered at each of the 5000 locations, and timing errors were selected from a normal distribution having a mean of 0 seconds and a standard deviation of 20 microseconds. This simulated "noisy" dataset was analyzed using the IO algorithm to estimate source locations. The exact locations were compared with the retrieved locations, and the results are summarized via several color-coded "error maps."
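
    The Monte Carlo half of the analysis can be sketched compactly: perturb simulated arrival times with N(0, 20 µs) noise and measure the scatter of a least-squares source fix. The Python sketch below uses a flat 2D geometry and a generic solver in place of the oblate-Earth IO method; the sites, source position and trial counts are toy values.

        # Monte Carlo location-error estimate for an arrival-time-difference network.
        import numpy as np
        from scipy.optimize import least_squares

        C = 3.0e8                                             # propagation speed (m/s)
        sites = np.array([[0, 0], [4e5, 0], [0, 4e5], [4e5, 4e5]], float)

        def arrival_times(src):
            return np.linalg.norm(sites - src, axis=1) / C

        def locate(t_obs):
            # Fit source position to arrival-time differences relative to site 0.
            def resid(src):
                t = arrival_times(src)
                return (t[1:] - t[0]) - (t_obs[1:] - t_obs[0])
            return least_squares(resid, x0=np.array([1e5, 1e5])).x

        rng = np.random.default_rng(4)
        true_src = np.array([2.5e5, 1.2e5])
        errors = []
        for _ in range(100):                                  # 100 noisy trials per source
            t_noisy = arrival_times(true_src) + rng.normal(0, 20e-6, len(sites))
            errors.append(np.linalg.norm(locate(t_noisy) - true_src))
        print(f"median location error: {np.median(errors) / 1e3:.1f} km")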

  15. A Prototype of an Intelligent System for Information Retrieval: IOTA.

    Science.gov (United States)

    Chiaramella, Y.; Defude, B.

    1987-01-01

    Discusses expert systems and their value as components of information retrieval systems related to semantic inference, and describes IOTA, a model of an intelligent information retrieval system which emphasizes natural language query processing. Experimental results are discussed and current and future developments are highlighted. (Author/LRW)

  16. Asteroid retrieval missions enabled by invariant manifold dynamics

    Science.gov (United States)

    Sánchez, Joan Pau; García Yárnoz, Daniel

    2016-10-01

    Near Earth Asteroids are attractive targets for new space missions; firstly, because of their scientific importance, but also because of their impact threat and prospective resources. The asteroid retrieval mission concept has thus arisen as a synergistic approach to tackle these three facets of interest in one single mission. This paper reviews the methodology used by the authors (2013) in a previous search for objects that could be transported from accessible heliocentric orbits into the Earth's neighbourhood at affordable costs (or Easily Retrievable Objects, a.k.a. EROs). This methodology consisted of a heuristic pruning and an impulsive manoeuvre trajectory optimisation. Low thrust propulsion on the other hand clearly enables the transportation of much larger objects due to its higher specific impulse. Hence, in this paper, low thrust retrieval transfers are sought using impulsive trajectories as first guesses to solve the optimal control problem. GPOPS-II is used to transcribe the continuous-time optimal control problem to a nonlinear programming problem (NLP). The latter is solved by IPOPT, an open source software package for large-scale NLPs. Finally, a natural continuation procedure that increases the asteroid mass allows to find out the largest objects that could be retrieved from a given asteroid orbit. If this retrievable mass is larger than the actual mass of the asteroid, the asteroid retrieval mission for this particular object is said to be feasible. The paper concludes with an updated list of 17 EROs, as of April 2016, with their maximum retrievable masses by means of low thrust propulsion. This ranges from 2000 tons for the easiest object to be retrieved to 300 tons for the least accessible of them.

  17. Service oriented architecture assessment based on software components

    Directory of Open Access Journals (Sweden)

    Mahnaz Amirpour

    2016-01-01

    Enterprise architecture, with detailed descriptions of the functions of information technology in the organization, tries to reduce the complexity of technology applications, resulting in tools with greater efficiency in achieving the objectives of the organization. Enterprise architecture consists of a set of models describing the performance of this technology in different components, as well as various aspects of the applications in any organization. Therefore, information technology development and maintenance management can perform well within organizations. This study aims to suggest a method for identifying the different types of services in the service-oriented architecture analysis step that applies some previous approaches in an integrated form and, based on the principles of software engineering, to provide a simpler and more transparent approach through the expression of analysis details. Advantages and disadvantages of proposals should be evaluated before implementation and cost allocation. Evaluation methods can better identify the strengths and weaknesses of the current situation, apart from selecting the appropriate model out of several suggestions, and clarify this technology development solution for organizations in the future. We will be able to simulate data and process flow within the organization by converting the output of the model to coloured Petri nets, and to evaluate and test it in terms of reliability and response time by examining various inputs to the enterprise architecture before implementation. A model application has been studied for the proposed model, and the results can describe and design an architecture for data.

  18. Analysis of Dual Mobility Liner Rim Damage Using Retrieved Components and Cadaver Models.

    Science.gov (United States)

    Nebergall, Audrey K; Freiberg, Andrew A; Greene, Meridith E; Malchau, Henrik; Muratoglu, Orhun; Rowell, Shannon; Zumbrunn, Thomas; Varadarajan, Kartik M

    2016-07-01

    The objective of this study was to assess the retentive rim of retrieved dual mobility liners for visible evidence of deformation from femoral neck contact and to use cadaver models to determine if anterior soft tissue impingement could contribute to such deformation. Fifteen surgically retrieved polyethylene liners were assessed for evidence of rim deformation. The average time in vivo was 31.4 months, and all patients were revised for reasons other than intraprosthetic dislocation. Liner interaction with the iliopsoas was studied visually and with fluoroscopy in cadaver specimens using a dual mobility system different than the retrieval study. For fluoroscopic visualization, a metal wire was sutured to the iliopsoas and wires were also embedded into grooves on the outer surface of the liner and the inner head. All retrievals showed evidence of femoral neck contact. The cadaver experiments showed that liner motion was impeded by impingement with the iliopsoas tendon in low flexion angles. When observing the hip during maximum hyperextension, 0°, 15°, and 30° of flexion, there was noticeable tenting of the iliopsoas caused by impingement with the liner. Liner rim deformation resulting from contact with the femoral neck likely begins during early in vivo function. The presence of deformation is indicative of a mechanism inhibiting mobility of the liner. The cadaver studies showed that liner motion could be impeded because of its impingement with the iliopsoas. Such soft tissue impingement may be one mechanism by which liner motion is routinely inhibited, which can result in load transfer from the neck to the rim. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. The effects of gender on the retrieval of episodic and semantic components of autobiographical memory.

    Science.gov (United States)

    Fuentes, Amanda; Desrocher, Mary

    2013-01-01

    Despite consistent evidence that women exhibit greater episodic memory specificity than men, little attention has been paid to gender differences in the production of episodic details during autobiographical recall under conditions of high and low retrieval support. Similarly the role of gender on the production of semantic details used to support autobiographical memory recollections of specific events has been largely unexplored. In the present study an undergraduate sample of 50 men and 50 women were assessed using the Autobiographical Interview (Levine, Svoboda, Hay, Winocur, & Moscovitch, 2002). Women recalled more episodic information compared to men in the high retrieval support condition, whereas no gender differences were found in the low retrieval support condition. In addition, women produced more repetitions compared to men in the high retrieval support condition. No gender differences were found in the production of semantic details. These results are interpreted in terms of gender differences in encoding and reminiscence practices. This research adds to the literature on gender differences in memory recall and suggests that gender is an important variable in explaining individual differences in AM recall.

  20. Learning to rank for information retrieval

    CERN Document Server

    Liu, Tie-Yan

    2011-01-01

    Due to the fast growth of the Web and the difficulties in finding desired information, efficient and effective information retrieval systems have become more important than ever, and the search engine has become an essential tool for many people. The ranker, a central component in every search engine, is responsible for the matching between processed queries and indexed documents. Because of its central role, great attention has been paid to the research and development of ranking technologies. In addition, ranking is also pivotal for many other information retrieval applications, such as coll

  1. InterFace: A software package for face image warping, averaging, and principal components analysis.

    Science.gov (United States)

    Kramer, Robin S S; Jenkins, Rob; Burton, A Mike

    2017-12-01

    We describe InterFace, a software package for research in face recognition. The package supports image warping, reshaping, averaging of multiple face images, and morphing between faces. It also supports principal components analysis (PCA) of face images, along with tools for exploring the "face space" produced by PCA. The package uses a simple graphical user interface, allowing users to perform these sophisticated image manipulations without any need for programming knowledge. The program is available for download in the form of an app, which requires that users also have access to the (freely available) MATLAB Runtime environment.
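
    A minimal sketch of the PCA "face space" idea that the package implements (illustrative Python with stand-in data, not InterFace's own MATLAB code): flatten aligned face images, fit PCA, and project each face to a low-dimensional coordinate vector that can also be mapped back to image space.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(4)
    faces = rng.random((40, 64 * 64))  # stand-in for 40 aligned 64x64 grayscale faces

    pca = PCA(n_components=10)
    coords = pca.fit_transform(faces)              # each face as 10 face-space coordinates
    reconstructed = pca.inverse_transform(coords)  # back-projection into image space
    print(coords.shape, reconstructed.shape)       # (40, 10) (40, 4096)
    ```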

  2. Evaluation of a Linear Mixing Model to Retrieve Soil and Vegetation Temperatures of Land Targets

    International Nuclear Information System (INIS)

    Yang, Jinxin; Jia, Li; Cui, Yaokui; Zhou, Jie; Menenti, Massimo

    2014-01-01

    A simple linear mixing model of a heterogeneous soil-vegetation system, and the retrieval of component temperatures from directional remote sensing measurements by inverting this model, are evaluated in this paper using observations by a thermal camera. The thermal camera was used to obtain multi-angular TIR (Thermal Infra-Red) images over vegetable and orchard canopies. A whole thermal camera image was treated as a pixel of a satellite image to evaluate the model with the two-component system, i.e. soil and vegetation. The evaluation included two parts: evaluation of the linear mixing model and evaluation of the inversion of the model to retrieve component temperatures. For the evaluation of the linear mixing model, the RMSE between the observed and modelled brightness temperatures is 0.2 K, which indicates that the linear mixing model works well under most conditions. For the evaluation of the model inversion, the RMSE between the model-retrieved and the observed vegetation temperatures is 1.6 K; correspondingly, the RMSE between the observed and retrieved soil temperatures is 2.0 K. According to the evaluation of the sensitivity of retrieved component temperatures to fractional cover, the linear mixing model gives more accurate retrievals of both soil and vegetation temperatures under intermediate fractional cover conditions
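
    As a rough illustration of the inversion step, the sketch below solves the linearized two-component form for the soil and vegetation temperatures by least squares (the view angles, fractions, and brightness temperatures are invented, and a full treatment would mix emitted radiances rather than brightness temperatures):

    ```python
    import numpy as np

    # Model per view angle theta:
    #   T_b(theta) = f_v(theta) * T_veg + (1 - f_v(theta)) * T_soil
    f_v = np.array([0.75, 0.60, 0.45, 0.30])      # vegetation fraction seen per angle
    t_b = np.array([301.2, 302.5, 303.9, 305.1])  # observed brightness temperatures [K]

    A = np.column_stack([f_v, 1.0 - f_v])         # weights for T_veg and T_soil
    (t_veg, t_soil), *_ = np.linalg.lstsq(A, t_b, rcond=None)
    print(f"T_veg = {t_veg:.1f} K, T_soil = {t_soil:.1f} K")
    ```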

  3. The artifacts of component-based development

    International Nuclear Information System (INIS)

    Rizwan, M.; Qureshi, J.; Hayat, S.A.

    2007-01-01

    The idea of component-based development was floated at a 1968 conference under the name Mass Produced Software Components (1). Since then, engineering and scientific libraries have been developed to reuse previously developed functions. This concept is now widely used in software development as component-based development (CBD). Component-based software engineering (CBSE) is used to develop/assemble software from existing components (2). Software developed using components is called component ware (3). This paper presents different architectures of CBD such as ActiveX, the Common Object Request Broker Architecture (CORBA), Remote Method Invocation (RMI) and the Simple Object Access Protocol (SOAP). The overall objective of this paper is to support the practice of CBD by comparing its advantages and disadvantages. This paper also evaluates the object-oriented process model to adapt it for CBD. (author)

  4. System Risk Balancing Profiles: Software Component

    Science.gov (United States)

    Kelly, John C.; Sigal, Burton C.; Gindorf, Tom

    2000-01-01

    The Software QA / V&V guide will be reviewed and updated based on feedback from NASA organizations and others with a vested interest in this area. Hardware, EEE Parts, Reliability, and Systems Safety are a sample of the future guides that will be developed. Cost Estimates, Lessons Learned, Probability of Failure and PACTS (Prevention, Avoidance, Control or Test) are needed to provide a more complete risk management strategy. This approach to risk management is designed to help balance the resources and program content for risk reduction for NASA's changing environment.

  5. An adaptive neuro fuzzy model for estimating the reliability of component-based software systems

    Directory of Open Access Journals (Sweden)

    Kirti Tyagi

    2014-01-01

    Full Text Available Although many algorithms and techniques have been developed for estimating the reliability of component-based software systems (CBSSs), much more research is needed. Accurate estimation of the reliability of a CBSS is difficult because it depends on two factors: component reliability and glue code reliability. Moreover, reliability is a real-world phenomenon with many associated real-time problems. Soft computing techniques can help to solve problems whose solutions are uncertain or unpredictable. A number of soft computing approaches for estimating CBSS reliability have been proposed. These techniques learn from the past and capture existing patterns in data. The two basic elements of soft computing are neural networks and fuzzy logic. In this paper, we propose a model for estimating CBSS reliability, known as an adaptive neuro fuzzy inference system (ANFIS), that is based on these two basic elements of soft computing, and we compare its performance with that of a plain FIS (fuzzy inference system) for different data sets.
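
    To make the fuzzy-inference half of the idea concrete, here is a zero-order Sugeno-style sketch with the paper's two inputs; the memberships and rule consequents are illustrative assumptions, not the paper's trained ANFIS parameters (ANFIS would learn such parameters from data):

    ```python
    import numpy as np

    def cbss_reliability(comp_rel: float, glue_rel: float) -> float:
        """Toy Sugeno FIS: linear memberships mu_low(x) = 1 - x, mu_high(x) = x."""
        rules = [
            (comp_rel * glue_rel, 0.95),                 # high comp AND high glue
            (comp_rel * (1.0 - glue_rel), 0.55),         # high comp AND low glue
            ((1.0 - comp_rel) * glue_rel, 0.45),         # low comp AND high glue
            ((1.0 - comp_rel) * (1.0 - glue_rel), 0.10)  # low comp AND low glue
        ]
        w = np.array([fire for fire, _ in rules])        # rule firing strengths
        z = np.array([out for _, out in rules])          # constant rule outputs
        return float((w * z).sum() / w.sum())            # weighted-average defuzzification

    print(cbss_reliability(0.9, 0.8))  # -> about 0.82
    ```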

  6. Machine Learning or Information Retrieval Techniques for Bug Triaging: Which is better?

    Directory of Open Access Journals (Sweden)

    Anjali Goyal

    2017-07-01

    Full Text Available Bugs are an inevitable part of a software system. Nowadays, large software development projects even release beta versions of their products to gather bug reports from users. The collected bug reports are then worked on by various developers in order to resolve the defects and make the final software product more reliable. The high frequency of incoming bugs makes bug handling a difficult and time-consuming task. Bug assignment is an integral part of bug triaging that aims at assigning a suitable developer to the reported bug, who corrects the source code in order to resolve the bug. There are various semi- and fully automated techniques to ease the task of bug assignment. This paper presents the current state of the art of the techniques used for bug report assignment. Through exhaustive research, the authors have observed that machine learning and information retrieval based bug assignment approaches are the most popular in the literature. A deeper investigation has shown that the trend is shifting from machine learning based approaches towards information retrieval based approaches. Therefore, the focus of this work is to find the reason behind the observed drift, and a comparative analysis is conducted on the bug reports of the Mozilla, Eclipse, Gnome and Open Office projects in the Bugzilla repository. The results of the study show that the information retrieval based technique yields better efficiency in recommending developers for bug reports.
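
    As a flavor of the information-retrieval style of assignment the study examines, here is a toy sketch (bug texts and developer names are invented): index previously resolved reports as TF-IDF vectors and recommend the developer whose past reports are most similar to the new one.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    resolved = ["crash when opening attachment in mail view",
                "rendering glitch in svg canvas layer",
                "mail folder sync loses unread flags"]
    developers = ["alice", "bob", "alice"]  # who fixed each resolved report

    vectorizer = TfidfVectorizer()
    index = vectorizer.fit_transform(resolved)  # TF-IDF index of past reports

    new_bug = ["attachment preview crashes the mail client"]
    scores = cosine_similarity(vectorizer.transform(new_bug), index).ravel()
    print(developers[scores.argmax()])  # -> "alice"
    ```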

  7. Optimization of Calcine Blending During Retrieval From Binsets

    International Nuclear Information System (INIS)

    Taylor, D.D.; Mohr, C.M.; Nelson, L.O.

    2000-01-01

    This report documents a study performed during advanced feasibility studies for the INTEC Technology Development Facility (ITDF). The study was commissioned to provide information about functional requirements for the ITDF related to the development of equipment and procedures for retrieving radioactive calcine from binset storage at the Idaho Nuclear Technology and Engineering Center (INTEC) at the Idaho National Engineering and Environmental Laboratory (INEEL). Calcine will be retrieved prior to treating it for permanent disposal in a national repository for high level waste. The objective of this study was to estimate the degree of homogenization of the calcine that might be achieved through optimized retrieval and subsequent blending. Such homogenization has the potential of reducing the costs for treatment of the calcine and for qualifying the final waste forms for acceptance at the repository. Results from the study indicate that optimized retrieval and blending can reduce the peak concentration variations of key components (Al, Zr, F) in blended batches of retrieved calcine. During un-optimized retrieval these variations are likely to be 81-138%, while optimized retrieval can reduce them to the 5-10% range

  8. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabanne

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templa

  9. Software architecture 1

    CERN Document Server

    Oussalah , Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural template

  10. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  11. Evidence synthesis software.

    Science.gov (United States)

    Park, Sophie Elizabeth; Thomas, James

    2018-06-07

    It can be challenging to decide which evidence synthesis software to choose when doing a systematic review. This article discusses some of the important questions to consider in relation to the chosen method and synthesis approach. Software can support researchers in a range of ways; here, a range of review conditions and software solutions is considered: for example, facilitating contemporaneous collaboration across time and geographical space; in-built bias assessment tools; and line-by-line coding for qualitative textual analysis. EPPI-Reviewer is review software for research synthesis managed by the EPPI-Centre, UCL Institute of Education. EPPI-Reviewer has text mining automation technologies. Version 5 supports data sharing and re-use across the systematic review community. Open source software will soon be released. The EPPI-Centre will continue to offer the software as a cloud-based service. The software is offered via a subscription, with a one-month (extendible) trial available and volume discounts for 'site licences'. It is free to use for Cochrane and Campbell reviews. The next EPPI-Reviewer version is being built in collaboration with the National Institute for Health and Care Excellence, using 'surveillance' of newly published research to support 'living' iterative reviews. This is achieved using a combination of machine learning and traditional information retrieval technologies to identify the type of research each new publication describes and determine its relevance for a particular review, domain or guideline. While the amount of available knowledge and research is constantly increasing, the ways in which software can support the focus and relevance of data identification are also developing fast. Software advances are maximising the opportunities for the production of relevant and timely reviews. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise

  12. Enhancing requirements engineering for patient registry software systems with evidence-based components.

    Science.gov (United States)

    Lindoerfer, Doris; Mansmann, Ulrich

    2017-07-01

    Patient registries are instrumental for medical research. Often their structures are complex and their implementations use composite software systems to meet the wide spectrum of challenges. Commercial and open-source systems are available for registry implementation, but many research groups develop their own systems. Methodological approaches to the selection of software, as well as to the construction of proprietary systems, are needed. We propose an evidence-based checklist summarizing essential items for patient registry software systems (CIPROS) to accelerate the requirements engineering process. Requirements engineering activities for software systems follow traditional software requirements elicitation methods, general software requirements specification (SRS) templates, and standards. We performed a multistep procedure to develop a specific evidence-based CIPROS checklist: (1) a systematic literature review to build a comprehensive collection of technical concepts, (2) a qualitative content analysis to define a catalogue of relevant criteria, and (3) a checklist to construct a minimal appraisal standard. CIPROS is based on 64 publications and covers twelve sections with a total of 72 items. CIPROS also defines software requirements. Comparing CIPROS with traditional software requirements elicitation methods, SRS templates, and standards shows a broad consensus, but differences on registry-specific aspects. Using an evidence-based approach to requirements engineering for registry software adds aspects to the traditional methods and accelerates the software engineering process for registry software. The method we used to construct CIPROS serves as a potential template for creating evidence-based checklists in other fields. The CIPROS list supports developers in assessing requirements for existing systems and formulating requirements for their own systems, while strengthening the reporting of patient registry software system descriptions. It may be

  13. Information retrieval models foundations and relationships

    CERN Document Server

    Roelleke, Thomas

    2013-01-01

    Information Retrieval (IR) models are a core component of IR research and IR systems. The past decade brought a consolidation of the family of IR models, which by 2000 consisted of relatively isolated views on TF-IDF (Term-Frequency times Inverse-Document-Frequency) as the weighting scheme in the vector-space model (VSM), the probabilistic relevance framework (PRF), the binary independence retrieval (BIR) model, BM25 (Best-Match Version 25, the main instantiation of the PRF/BIR), and language modelling (LM). Also, the early 2000s saw the arrival of divergence from randomness (DFR).Regarding in
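
    For concreteness, a textbook implementation of one of the consolidated models, Okapi BM25 (k1 and b are set to common default values; the toy corpus is invented):

    ```python
    import math
    from collections import Counter

    def bm25_score(query_terms, doc_terms, docs, k1=1.2, b=0.75):
        """Okapi BM25 score of one document (doc_terms) for a query."""
        N = len(docs)
        avgdl = sum(len(d) for d in docs) / N
        tf = Counter(doc_terms)
        score = 0.0
        for term in query_terms:
            df = sum(1 for d in docs if term in d)  # document frequency
            if df == 0:
                continue
            idf = math.log((N - df + 0.5) / (df + 0.5) + 1.0)
            f = tf[term]
            score += idf * f * (k1 + 1) / (f + k1 * (1 - b + b * len(doc_terms) / avgdl))
        return score

    docs = [["software", "component", "retrieval"],
            ["component", "reuse", "repository"],
            ["memory", "retrieval", "stress"]]
    print(bm25_score(["component", "retrieval"], docs[0], docs))
    ```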

  14. Memory retrieval of everyday information under stress.

    Science.gov (United States)

    Stock, Lisa-Marie; Merz, Christian J

    2018-07-01

    Psychosocial stress is known to crucially influence learning and memory processes. Several studies have already shown an impairing effect of elevated cortisol concentrations on memory retrieval. These studies mainly used learning material consisting of stimuli with a limited ecological validity. When using material with a social contextual component or with educational relevant material both impairing and enhancing stress effects on memory retrieval could be observed. In line with these latter studies, the present experiment also used material with a higher ecological validity (a coherent text consisting of daily relevant numeric, figural and verbal information). After encoding, retrieval took place 24 h later after exposure to psychosocial stress or a control procedure (20 healthy men per group). The stress group was further subdivided into cortisol responders and non-responders. Results showed a significantly impaired retrieval of everyday information in non-responders compared to responders and controls. Altogether, the present findings indicate the need of an appropriate cortisol response for the successful memory retrieval of everyday information. Thus, the present findings suggest that cortisol increases - contrary to a stressful experience per se - seem to play a protective role for retrieving everyday information. Additionally, it could be speculated that the previously reported impairing stress effects on memory retrieval might depend on the used learning material. Copyright © 2018 Elsevier Inc. All rights reserved.

  15. A Constructive Approach To Software Evolution

    NARCIS (Netherlands)

    Ciraci, S.; van den Broek, P.M.; Aksit, Mehmet

    2007-01-01

    In many software design and evaluation techniques, either the software evolution problem is not systematically elaborated, or only the impact of evolution is considered. Thus, most of the time software is changed by editing the components of the software system, i.e. breaking down the software

  16. Protein Annotators' Assistant: A Novel Application of Information Retrieval Techniques.

    Science.gov (United States)

    Wise, Michael J.

    2000-01-01

    Protein Annotators' Assistant (PAA) is a software system which assists protein annotators in assigning functions to newly sequenced proteins. PAA employs a number of information retrieval techniques in a novel setting and is thus related to text categorization, where multiple categories may be suggested, except that in this case none of the…

  17. OCRWM procedure for reporting software baseline change information

    International Nuclear Information System (INIS)

    1994-07-01

    The purpose of this procedure is to establish a requirement and method for participant organizations to report software baseline change information to the M&O Configuration Management (CM) organization for inclusion in the OCRWM Configuration Information System (CIS). (The requirements for performing software configuration management (SCM) are found in the OCRWM Quality Assurance Requirements and Description (QARD) document and in applicable DOE orders, and not in this procedure.) This procedure provides a linkage between each participant's SCM system and the CIS, which may be accessed for identification, descriptive, and contact information pertaining to software released by a participant. Such information from the CIS will enable retrieval of details and copies of software code and documentation from the participant SCM system

  18. Dependability Analysis Methods For Configurable Software

    International Nuclear Information System (INIS)

    Dahll, Gustav; Pulkkinen, Urho

    1996-01-01

    Configurable software systems are systems which are built up by standard software components in the same way as a hardware system is built up by standard hardware components. Such systems are often used in the control of NPPs, also in safety related applications. A reliability analysis of such systems is therefore necessary. This report discusses what configurable software is, and what is particular with respect to reliability assessment of such software. Two very commonly used techniques in traditional reliability analysis, viz. failure mode, effect and criticality analysis (FMECA) and fault tree analysis are investigated. A real example is used to illustrate the discussed methods. Various aspects relevant to the assessment of the software reliability in such systems are discussed. Finally some models for quantitative software reliability assessment applicable on configurable software systems are described. (author)
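
    The quantitative side of fault tree analysis reduces, for independent basic events, to simple gate arithmetic; a generic sketch follows (the probabilities and tree are invented, not the report's worked example):

    ```python
    from functools import reduce

    def gate_and(*probs):
        """AND gate: all inputs must fail (independent events)."""
        return reduce(lambda acc, p: acc * p, probs, 1.0)

    def gate_or(*probs):
        """OR gate: at least one input fails, via the complement rule."""
        return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

    # Top event: both redundant software components fail, OR a common-cause fault.
    p_component = 1e-3
    p_common = 1e-5
    p_top = gate_or(gate_and(p_component, p_component), p_common)
    print(f"P(top event) = {p_top:.2e}")
    ```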

  19. Comparative Performance Analysis of Machine Learning Techniques for Software Bug Detection

    OpenAIRE

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning techniques can be used to analyse data from different perspectives and enable developers to retrieve useful information. Machine learning techniques are proven to be useful in terms of software bug prediction. In this paper, a comparative performance analysis of different machine learning techniques is explored for software bug prediction on publicly available data sets. Results showed most of the mac ...

  1. Imagery and retrieval of auditory and visual information: Neural correlates of successful and unsuccessful performance

    NARCIS (Netherlands)

    Huijbers, W.; Pennartz, C.M.A.; Rubin, D.C.; Daselaar, S.M.

    2011-01-01

    Remembering past events - or episodic retrieval - consists of several components. There is evidence that mental imagery plays an important role in retrieval and that the brain regions supporting imagery overlap with those supporting retrieval. An open issue is to what extent these regions support

  2. Software Engineering Issues for Cyber-Physical Systems

    DEFF Research Database (Denmark)

    Al-Jaroodi, Jameela; Mohamed, Nader; Jawhar, Imad

    2016-01-01

    Cyber-Physical Systems (CPS) provide many smart features for enhancing physical processes. These systems are designed with a set of distributed hardware, software, and network components that are embedded in physical systems and environments or attached to humans. Together they function seamlessly to offer specific functionalities or features that help enhance human lives, operations or environments. While different CPS components play important roles in a successful CPS development, the software plays the most important role among them. Acquiring and using high quality CPS components is the first step; however, designing and implementing the right software to integrate and use them effectively is essential. The software facilitates better interfaces, more control and adds smart services, high flexibility and many other added values and features to the CPS. However, software development for CPS...

  3. When your face describes your memories: facial expressions during retrieval of autobiographical memories.

    Science.gov (United States)

    El Haj, Mohamad; Daoudi, Mohamed; Gallouj, Karim; Moustafa, Ahmed A; Nandrino, Jean-Louis

    2018-05-11

    Thanks to the current advances in the software analysis of facial expressions, there is a burgeoning interest in understanding the emotional facial expressions observed during the retrieval of autobiographical memories. This review describes the research on facial expressions during autobiographical retrieval, showing distinct emotional facial expressions according to the characteristics of the retrieved memories. More specifically, this research demonstrates that the retrieval of emotional memories can trigger corresponding emotional facial expressions (e.g. positive memories may trigger positive facial expressions). This research also demonstrates variations of facial expressions according to the specificity, self-relevance, or past versus future direction of memory construction. Besides linking research on facial expressions during autobiographical retrieval to the cognitive and affective characteristics of autobiographical memory in general, this review positions this research within the broader context of research on the physiological characteristics of autobiographical retrieval. We also provide several perspectives for clinical studies to investigate facial expressions in populations with deficits in autobiographical memory (e.g. whether autobiographical overgenerality in neurologic and psychiatric populations may trigger fewer emotional facial expressions). In sum, this review demonstrates how the evaluation of facial expressions during autobiographical retrieval may help in understanding the functioning and dysfunction of autobiographical memory.

  4. Video Retrieval Berdasarkan Teks dan Gambar

    Directory of Open Access Journals (Sweden)

    Rahmi Hidayati

    2013-01-01

    Abstract Video retrieval is used to search for a video based on a query entered by the user, which may be text, an image, or both. Such a system can improve searching during video browsing and is expected to reduce video retrieval time. The purpose of this research was to design and build a software application for video retrieval based on the text and images in a video. The indexing process for text consists of tokenizing and filtering (stopword removal and stemming); the stemming results are saved in a text index table. The indexing process for images builds a color histogram of each image and computes the mean and standard deviation of each primary color channel (red, green and blue, RGB) of the image; the resulting feature extraction values are stored in an image table. Video retrieval can then use a text query, an image query, or both. For a text query, the system looks up the query terms in the text index table; if the terms are present, the system displays the information of the matching videos. For an image query, the system extracts the red, green and blue means and standard deviations of the query image; if these six feature values are found in the image index table, the system displays the corresponding video information. For a combined text and image query, the system displays video information only if the text query and the image query are related, i.e. they share the same film title. Keywords — video, index, retrieval, text, image
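
    A sketch of the six-value image feature the abstract describes (per-channel RGB mean and standard deviation; the frame below is synthetic, and the histogram step is omitted):

    ```python
    import numpy as np

    def rgb_features(image):
        """Per-channel mean and std of an RGB image -> 6-dimensional feature."""
        pixels = image.reshape(-1, 3).astype(float)
        return np.concatenate([pixels.mean(axis=0), pixels.std(axis=0)])

    def feature_distance(f1, f2):
        """Euclidean distance between stored and query image features."""
        return float(np.linalg.norm(f1 - f2))

    # Stand-in for a decoded video keyframe (H x W x 3, 8-bit).
    frame = np.random.default_rng(0).integers(0, 256, size=(120, 160, 3))
    print(rgb_features(frame))
    ```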

  5. Web components and the semantic web

    OpenAIRE

    Casey, Maire; Pahl, Claus

    2003-01-01

    Component-based software engineering on the Web differs from traditional component and software engineering. We investigate Web component engineering activities that are crucial for the development, composition, and deployment of components on the Web. The current Web Services and Semantic Web initiatives strongly influence our work. Focussing on Web component composition, we develop description and reasoning techniques that support a component developer in the composition activities, focussing...

  7. The Oklahoma Geographic Information Retrieval System

    Science.gov (United States)

    Blanchard, W. A.

    1982-01-01

    The Oklahoma Geographic Information Retrieval System (OGIRS) is a highly interactive data entry, storage, manipulation, and display software system for use with geographically referenced data. Although originally developed for a project concerned with coal strip mine reclamation, OGIRS is capable of handling any geographically referenced data for a variety of natural resource management applications. A special effort has been made to integrate remotely sensed data into the information system. The timeliness and synoptic coverage of satellite data are particularly useful attributes for inclusion into the geographic information system.

  8. PRISM, Processing and Review Interface for Strong Motion Data Software

    Science.gov (United States)

    Kalkan, E.; Jones, J. M.; Stephens, C. D.; Ng, P.

    2016-12-01

    A continually increasing number of high-quality digital strong-motion records from stations of the National Strong Motion Project (NSMP) of the U.S. Geological Survey (USGS), as well as data from regional seismic networks within the U.S., calls for automated processing of strong-motion records with human review limited to selected significant or flagged records. The NSMP has developed the Processing and Review Interface for Strong Motion data (PRISM) software to meet this need. PRISM automates the processing of strong-motion records by providing batch-processing capabilities. The PRISM software is platform-independent (coded in Java), open-source, and does not depend on any closed-source or proprietary software. The software consists of two major components: a record processing engine composed of modules for each processing step, and a graphical user interface (GUI) for manual review and processing. To facilitate the use by non-NSMP earthquake engineers and scientists, PRISM (both its processing engine and GUI components) is easy to install and run as a stand-alone system on common operating systems such as Linux, OS X and Windows. PRISM was designed to be flexible and extensible in order to accommodate implementation of new processing techniques. Input to PRISM currently is limited to data files in the Consortium of Organizations for Strong-Motion Observation Systems (COSMOS) V0 format, so that all retrieved acceleration time series need to be converted to this format. Output products include COSMOS V1, V2 and V3 files as: (i) raw acceleration time series in physical units with mean removed (V1), (ii) baseline-corrected and filtered acceleration, velocity, and displacement time series (V2), and (iii) response spectra, Fourier amplitude spectra and common earthquake-engineering intensity measures (V3). A thorough description of the record processing features supported by PRISM is presented with examples and validation results. All computing features have been
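
    As a generic illustration of the V1-to-V2 style steps mentioned above (demean, band-pass filter, integrate), here is a simplified Python stand-in; it is not PRISM's Java implementation, and the corner frequencies and toy record are assumptions:

    ```python
    import numpy as np
    from scipy import signal

    def v1_to_v2(accel, dt, flo=0.1, fhi=25.0):
        """Demean, zero-phase band-pass, and integrate acceleration to velocity."""
        a = accel - accel.mean()  # mean removal, V1-style
        sos = signal.butter(4, [flo, fhi], btype="bandpass", fs=1.0 / dt, output="sos")
        a_filt = signal.sosfiltfilt(sos, a)  # filtered acceleration, V2-style
        vel = np.cumsum(a_filt) * dt         # velocity by rectangular integration
        return a_filt, vel

    dt = 0.01  # 100 samples per second
    t = np.arange(0.0, 20.0, dt)
    accel = 0.02 * np.sin(2 * np.pi * 1.5 * t) + 0.001  # toy record with a small offset
    a2, v2 = v1_to_v2(accel, dt)
    ```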

  9. A software package for acquisition, accounting and statistical evaluation of on-line retrieval

    International Nuclear Information System (INIS)

    Helmreich, F.; Nevyjel, A.

    1981-03-01

    The described program system is used to automate the administration of an information retrieval department. The data on the users and on every online session are stored in two files and can be evaluated in different statistics. Data acquisition is done interactively; the statistics programs run both in dialog and in batch. (author)

  10. Teaching Software Componentization: A Bar Chart Java Bean

    Science.gov (United States)

    Mitri, Michel

    2010-01-01

    In the current object-oriented paradigm, software construction increasingly involves creating and utilizing "software components". These components can serve a variety of functions, from common algorithmic processes to database connectivity to graphical interfaces. The advantage of component architectures is that programmers can use pre-existing…

  11. A simple procedure for retrieval of a cement-retained implant-supported crown: a case report.

    Science.gov (United States)

    Buzayan, Muaiyed Mahmoud; Mahmood, Wan Adida; Yunus, Norsiah Binti

    2014-02-01

    Retrieval of cement-retained implant prostheses can be more demanding than retrieval of screw-retained prostheses. This case report describes a simple and predictable procedure to locate the abutment screw access openings of cement-retained implant-supported crowns in cases of fractured ceramic veneer. A conventional periapical radiography image was captured using a digital camera, transferred to a computer, and manipulated using Microsoft Word document software to estimate the location of the abutment screw access.

  12. A Software Development Platform for Mechatronic Systems

    DEFF Research Database (Denmark)

    Guan, Wei

    Software has become increasingly determinative for development of mechatronic systems, which underscores the importance of demands for shortened time-to-market, increased productivity, higher quality, and improved dependability. As the complexity of systems is dramatically increasing, these demands present a challenge to the practitioners who adopt conventional software development approaches. An effective approach towards industrial production of software for mechatronic systems is needed. This approach requires a disciplined engineering process that encompasses model-driven engineering and component-based software engineering, whereby we enable incremental software development using component models to address the essential design issues of real-time embedded systems. To this end, this dissertation presents a software development platform that provides an incremental model-driven development process based...

  13. GENII Version 2 Software Design Document

    Energy Technology Data Exchange (ETDEWEB)

    Napier, Bruce A.; Strenge, Dennis L.; Ramsdell, James V.; Eslinger, Paul W.; Fosmire, Christian J.

    2004-03-08

    This document describes the architectural design for the GENII-V2 software package. This document defines details of the overall structure of the software, the major software components, their data file interfaces, and specific mathematical models to be used. The design represents a translation of the requirements into a description of the software structure, software components, interfaces, and necessary data. The design focuses on the major components and data communication links that are key to the implementation of the software within the operating framework. The purpose of the GENII-V2 software package is to provide the capability to perform dose and risk assessments of environmental releases of radionuclides. The software also has the capability of calculating environmental accumulation and radiation doses from surface water, groundwater, and soil (buried waste) media when an input concentration of radionuclide in these media is provided. This report represents a detailed description of the capabilities of the software product with exact specifications of mathematical models that form the basis for the software implementation and testing efforts. This report also presents a detailed description of the overall structure of the software package, details of main components (implemented in the current phase of work), details of data communication files, and content of basic output reports. The GENII system includes the capabilities for calculating radiation doses following chronic and acute releases. Radionuclide transport via air, water, or biological activity may be considered. Air transport options include both puff and plume models, each allow use of an effective stack height or calculation of plume rise from buoyant or momentum effects (or both). Building wake effects can be included in acute atmospheric release scenarios. The code provides risk estimates for health effects to individuals or populations; these can be obtained using the code by applying

  14. Independent component analysis for understanding multimedia content

    DEFF Research Database (Denmark)

    Kolenda, Thomas; Hansen, Lars Kai; Larsen, Jan

    2002-01-01

    Independent component analysis of combined text and image data from Web pages has potential for search and retrieval applications by providing more meaningful and context dependent content. It is demonstrated that ICA of combined text and image features has a synergistic effect, i.e., the retrieval...
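
    A minimal sketch of the combined-feature idea (the features are random stand-ins, not the paper's actual text and image representations): stack text and image feature vectors per page and let ICA extract components that mix both modalities.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(3)
    n_pages = 100
    text_feats = rng.random((n_pages, 50))   # e.g. term-frequency features
    image_feats = rng.random((n_pages, 20))  # e.g. color/texture features

    X = np.hstack([text_feats, image_feats])  # joint text+image representation
    ica = FastICA(n_components=5, random_state=0)
    S = ica.fit_transform(X)                  # per-page activations of 5 components
    print(S.shape)                            # (100, 5)
    ```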

  15. GCS component development cycle

    Science.gov (United States)

    Rodríguez, Jose A.; Macias, Rosa; Molgo, Jordi; Guerra, Dailos; Pi, Marti

    2012-09-01

    The GTC is an optical-infrared 10-meter segmented mirror telescope at the ORM observatory in the Canary Islands (Spain). First light was on 13/07/2007, and since then it has been in the operation phase. The GTC control system (GCS) is a distributed object- and component-oriented system based on RT-CORBA, and it is responsible for the management and operation of the telescope, including its instrumentation. GCS has used the Rational Unified Process (RUP) in its development. RUP is an iterative software development process framework. After analysing (use cases) and designing (UML) any of the GCS subsystems, an initial component description of its interface is obtained, and from that information a component specification is written. In order to improve code productivity, GCS has adopted code generation to transform this component specification into the skeleton of component classes based on a software framework, called the Device Component Framework. Using the GCS development tools, based on javadoc and gcc, in only one step the component is generated, compiled and deployed to be tested for the first time through our GUI inspector. The main advantages of this approach are the following: it reduces the learning curve of new developers and the development error rate, allows a systematic use of design patterns in the development and software reuse, speeds up the deliverables of the software product and massively increases the timescale, design consistency and design quality, and eliminates the future refactoring process required for the code.

  16. Applying of component system development in object methodology

    Directory of Open Access Journals (Sweden)

    Milan Mišovič

    2013-01-01

    Full Text Available In the last three decades, the concept and implementation of component-based architectures have been promoted in the creation of software systems. Increasingly complex demands are placed on component software systems, in particular relating to their dynamic properties. The emergence of such requirements has gradually been enforced by the practice of developing and implementing these systems, especially information systems software. It is precisely information systems (robust IS of different types) that require target software meeting their requirements. Among other things, we mean primarily adaptive processes of different domains, high distributivity due to the possibilities of the Internet 2.0, acceptance of high integrity of life domains (process, data and communications integrity), scalability, flexible adaptation to process changes, a good context for external devices, and a transparent structure of the sub-process modules and architectural units. Of course, target software of the required quality and robustness cannot be a monolith. As is commonly known, the development of design for information systems software has clearly led to the need to compose software from completely autonomous but cooperating architectural units that communicate with each other using messages of prescribed formats. Although the so-called subsystems and modules were often used for such units, see (Jac, Boo, Rumbo, 1998) and (Arlo, Neus, 2007), their abstraction has gradually been enacted as the term component. In other words, subsystems and modules are specific types of components. In (Král, Žeml, 2000) and (Král, Žeml, 2003), two types of target software for information systems are considered. The first type is SWC (Software Components), composed of permanently available components that are thought of as services – confederate software. The second type is SWA (Software Alliance), called semi-confederate, formed during the run-time of the

  17. Guenter Tulip Filter Retrieval Experience: Predictors of Successful Retrieval

    International Nuclear Information System (INIS)

    Turba, Ulku Cenk; Arslan, Bulent; Meuse, Michael; Sabri, Saher; Macik, Barbara Gail; Hagspiel, Klaus D.; Matsumoto, Alan H.; Angle, John F.

    2010-01-01

    We report our experience with Guenter Tulip filter placement indications, retrievals, and procedural problems, with emphasis on alternative retrieval techniques. We identified 92 consecutive patients in whom a Guenter Tulip filter was placed and filter removal was attempted. We recorded patient demographic information, filter placement and retrieval indications, procedures, standard and nonstandard filter retrieval techniques, complications, and clinical outcomes. The mean time to retrieval differed significantly for those who experienced filter strut penetration [F(1,90) = 8.55, p = 0.004]. Filter strut penetration of the IVC and successful retrieval were significantly associated (p = 0.043). The filter hook-IVC relationship correlated with successful retrieval. A modified guidewire loop technique was applied in 8 of 10 cases where the hook appeared to penetrate the IVC wall and could not be engaged with a loop snare catheter, providing additional technical success in 6 of 8 (75%). The total filter retrieval success rate therefore increased from 88 to 95%. In conclusion, the Guenter Tulip filter has high successful retrieval rates with low rates of complication. Additional maneuvers such as a guidewire loop method can be used to improve retrieval success rates when the filter hook is endothelialized.

  18. INTEGRATION OF SPATIAL INFORMATION WITH COLOR FOR CONTENT RETRIEVAL OF REMOTE SENSING IMAGES

    Directory of Open Access Journals (Sweden)

    Bikesh Kumar Singh

    2010-08-01

    Full Text Available Image databases of remote sensing images have grown rapidly in the last few years, owing to high-resolution imaging satellites, commercial applications of remote sensing, and high available bandwidth. The problem of content-based image retrieval (CBIR) of remotely sensed images presents a major challenge, not only because of the ever-increasing volume of images acquired from a wide range of sensors but also because of the complexity of the images themselves. In this paper, a software system for content-based retrieval of remote sensing images using the RGB and HSV color spaces is presented. Further, we compare our results with spatiogram-based content retrieval, which integrates spatial information with the color histogram. Experimental results show that integrating spatial information with color improves the image analysis of remote sensing data. In general, retrieval in the HSV color space performed better than in the RGB color space.
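
    A sketch of the HSV color-histogram feature used in such systems (the image is synthetic; a spatiogram would additionally record where each color occurs, e.g. per-bin spatial statistics):

    ```python
    import numpy as np
    from matplotlib.colors import rgb_to_hsv

    def hsv_histogram(image_rgb8, bins=(8, 4, 4)):
        """Normalized, quantized HSV color histogram of an 8-bit RGB image."""
        hsv = rgb_to_hsv(image_rgb8.astype(float) / 255.0).reshape(-1, 3)
        hist, _ = np.histogramdd(hsv, bins=bins, range=[(0.0, 1.0)] * 3)
        return hist.ravel() / hist.sum()  # normalize for image-size invariance

    img = np.random.default_rng(2).integers(0, 256, size=(64, 64, 3))
    print(hsv_histogram(img)[:10])
    ```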

  19. The domain theory: patterns for knowledge and software reuse

    National Research Council Canada - National Science Library

    Sutcliffe, Alistair

    2002-01-01

  20. The architecture of a reliable software monitoring system for embedded software systems

    International Nuclear Information System (INIS)

    Munson, J.; Krings, A.; Hiromoto, R.

    2006-01-01

    We develop the notion of a measurement-based methodology for embedded software systems to ensure properties of reliability, survivability and security, not only under benign faults but under malicious and hazardous conditions as well. The driving force is the need to develop a dynamic run-time monitoring system for use in these embedded mission critical systems. These systems must run reliably, must be secure and they must fail gracefully. That is, they must continue operating in the face of the departures from their nominal operating scenarios, the failure of one or more system components due to normal hardware and software faults, as well as malicious acts. To insure the integrity of embedded software systems, the activity of these systems must be monitored as they operate. For each of these systems, it is possible to establish a very succinct representation of nominal system activity. Furthermore, it is possible to detect departures from the nominal operating scenario in a timely fashion. Such departure may be due to various circumstances, e.g., an assault from an outside agent, thus forcing the system to operate in an off-nominal environment for which it was neither tested nor certified, or a hardware/software component that has ceased to operate in a nominal fashion. A well-designed system will have the property of graceful degradation. It must continue to run even though some of the functionality may have been lost. This involves the intelligent re-mapping of system functions. Those functions that are impacted by the failure of a system component must be identified and isolated. Thus, a system must be designed so that its basic operations may be re-mapped onto system components still operational. That is, the mission objectives of the software must be reassessed in terms of the current operational capabilities of the software system. By integrating the mechanisms to support observation and detection directly into the design methodology, we propose to shift
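
    A toy illustration of the departure-detection idea (not the authors' actual method): summarize a window of activity as a distribution of module-execution frequencies and flag windows whose distance from the nominal profile exceeds a threshold.

    ```python
    import numpy as np

    def departure_from_nominal(nominal_counts, observed_counts, threshold=0.05):
        """Total variation distance between nominal and observed activity profiles."""
        p = np.asarray(nominal_counts, dtype=float)
        q = np.asarray(observed_counts, dtype=float)
        p /= p.sum()
        q /= q.sum()
        distance = 0.5 * np.abs(p - q).sum()  # in [0, 1]; 0 means identical profiles
        return distance, bool(distance > threshold)

    nominal = [120, 80, 40, 10]   # executions per module in a nominal run
    observed = [115, 85, 38, 12]  # similar profile -> no alarm
    print(departure_from_nominal(nominal, observed))
    ```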

  1. On the choice of an optimal value-set of qualitative attributes for information retrieval in databases

    International Nuclear Information System (INIS)

    Ryjov, A.; Loginov, D.

    1994-01-01

    The problem of choosing an optimal set of significances of qualitative attributes for information retrieval in databases is addressed. Given a particular database, a set of significances is called optimal if it minimizes the losses of information and the information noise for information retrieval in the database. Obviously, such a set of significances depends on the statistical parameters of the database. The software described here makes it possible to calculate, on the basis of the statistical parameters of the given database, the losses of information and the information noise for arbitrary sets of significances of qualitative attributes. The software also permits comparison of various sets of significances of qualitative attributes and selection of the optimal set of significances

  2. A Fast and Sensitive New Satellite SO2 Retrieval Algorithm based on Principal Component Analysis: Application to the Ozone Monitoring Instrument

    Science.gov (United States)

    Li, Can; Joiner, Joanna; Krotkov, A.; Bhartia, Pawan K.

    2013-01-01

    We describe a new algorithm to retrieve SO2 from satellite-measured hyperspectral radiances. We employ the principal component analysis technique in regions with no significant SO2 to capture radiance variability caused by both physical processes (e.g., Rayleigh and Raman scattering and ozone absorption) and measurement artifacts. We use the resulting principal components and SO2 Jacobians calculated with a radiative transfer model to directly estimate SO2 vertical column density in one step. Application to the Ozone Monitoring Instrument (OMI) radiance spectra in 310.5-340 nm demonstrates that this approach can greatly reduce biases in the operational OMI product and decrease the noise by a factor of 2, providing greater sensitivity to anthropogenic emissions. The new algorithm is fast, eliminates the need for instrument-specific radiance correction schemes, and can be easily adapted to other sensors. These attributes make it a promising technique for producing longterm, consistent SO2 records for air quality and climate research.
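
    The one-step estimate can be pictured as an ordinary least-squares fit of the measured spectrum to the principal components plus the SO2 Jacobian; the sketch below uses synthetic stand-in spectra (dimensions, noise level, and values are invented, and this is not the OMI operational code):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_wavelengths, n_pcs = 200, 5

    V = rng.normal(size=(n_wavelengths, n_pcs))  # stand-in principal components
    j = rng.normal(size=n_wavelengths)           # stand-in SO2 Jacobian (dy/dVCD)
    true_vcd = 0.8
    y = V @ rng.normal(size=n_pcs) + j * true_vcd + 0.01 * rng.normal(size=n_wavelengths)

    A = np.column_stack([V, j])                  # design matrix: [PCs | Jacobian]
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    print(f"retrieved VCD = {coeffs[-1]:.3f} (true {true_vcd})")
    ```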

  3. Foundations of Large-Scale Multimedia Information Management and Retrieval

    CERN Document Server

    Chang, Edward Y

    2011-01-01

    "Foundations of Large-Scale Multimedia Information Management and Retrieval - Mathematics of Perception" covers knowledge representation and semantic analysis of multimedia data and scalability in signal extraction, data mining, and indexing. The book is divided into two parts: Part I - Knowledge Representation and Semantic Analysis focuses on the key components of mathematics of perception as it applies to data management and retrieval. These include feature selection/reduction, knowledge representation, semantic analysis, distance function formulation for measuring similarity, and

  4. Die-cavity Design for High-Precision Nett-Forming of Engineering Component (PRECI4M)

    DEFF Research Database (Denmark)

    Ravn, Bjarne Gottlieb

    This report covers the 6-month period from month 37 to month 42 of the project. The following tasks are reported: • Task 14: Auto-interface between Data Retrieval and FE Analysis Software • Task 21: Tool Design Principles • Task 23: Specification of Tool Design System • Task 54: Development of Software Codes • Task 61: Design of Experiments for Validation • Task 62: Validation Experiments

  5. Applied software risk management a guide for software project managers

    CERN Document Server

    Pandian, C Ravindranath

    2006-01-01

    Few software projects are completed on time, on budget, and to their original specifications. Focusing on what practitioners need to know about risk in the pursuit of delivering software projects, Applied Software Risk Management: A Guide for Software Project Managers covers key components of the risk management process and the software development process, as well as best practices for software risk identification, risk planning, and risk analysis. Written in a clear and concise manner, this resource presents concepts and practical insight into managing risk. It first covers risk-driven project management, risk management processes, risk attributes, risk identification, and risk analysis. The book continues by examining responses to risk, the tracking and modeling of risks, intelligence gathering, and integrated risk management. It concludes with details on drafting and implementing procedures. A diary of a risk manager provides insight in implementing risk management processes.Bringing together concepts ...

  6. Natural language processing-based COTS software and related technologies survey.

    Energy Technology Data Exchange (ETDEWEB)

    Stickland, Michael G.; Conrad, Gregory N.; Eaton, Shelley M.

    2003-09-01

    Natural language processing-based knowledge management software, traditionally developed for security organizations, is now becoming commercially available. An informal survey was conducted to discover and examine current NLP and related technologies and potential applications for information retrieval, information extraction, summarization, categorization, terminology management, link analysis, and visualization for possible implementation at Sandia National Laboratories. This report documents our current understanding of the technologies, lists software vendors and their products, and identifies potential applications of these technologies.

  7. TRADEMARK IMAGE RETRIEVAL USING LOW LEVEL FEATURE EXTRACTION IN CBIR

    OpenAIRE

    Latika Pinjarkar*, Manisha Sharma, Smita Selot

    2016-01-01

    Trademarks play a significant role in industry and commerce. A trademark is an important component of a company's industrial property, and its violation can carry severe penalties. Designing an efficient trademark retrieval system, and assessing trademarks for uniqueness, is thus becoming a very important task nowadays. In a trademark image retrieval system, a new candidate trademark is compared with already registered trademarks to check that there is no possibility of resembl...

  8. Sensitivity Analysis for Atmospheric Infrared Sounder (AIRS) CO2 Retrieval

    Science.gov (United States)

    Gat, Ilana

    2012-01-01

    The Atmospheric Infrared Sounder (AIRS) is a thermal infrared sensor able to retrieve the daily atmospheric state globally for clear as well as partially cloudy fields-of-view. The AIRS spectrometer has 2378 channels sensing from 15.4 micrometers to 3.7 micrometers, of which a small subset in the 15 micrometers region has been selected, to date, for CO2 retrieval. To improve upon the current retrieval method, we extended the retrieval calculations to include a prior-estimate component and developed a channel ranking system to optimize the channels and the number of channels used. The channel ranking system uses a mathematical formalism to rapidly process and assess the retrieval potential of large numbers of channels. Implementing this system, we identified a larger optimized subset of AIRS channels that can decrease retrieval errors and minimize the overall sensitivity to other interfering contributors, such as water vapor, ozone, and atmospheric temperature. This methodology selects channels globally by accounting for the latitudinal, longitudinal, and seasonal dependencies of the subset. The new methodology increases accuracy in AIRS CO2 as well as other retrievals and enables the extension of retrieved CO2 vertical profiles to altitudes ranging from the lower troposphere to the upper stratosphere. The extended retrieval method estimates CO2 vertical profiles using a maximum-likelihood estimation method. We use model data to demonstrate the beneficial impact of the extended retrieval method, with the new channel ranking system, on CO2 retrieval.

  9. Software Radar Technology

    Directory of Open Access Journals (Sweden)

    Tang Jun

    2015-08-01

    Full Text Available In this paper, the definition and the key features of Software Radar, which is a new concept, are proposed and discussed. We consider the development of modern radar system technology to be divided into three stages: Digital Radar, Software Radar and Intelligent Radar, and the second stage is just commencing now. A Software Radar system should be a combination of various modern digital modular components conforming to certain software and hardware standards. Moreover, a Software Radar system with an open system architecture that decouples application software from low-level hardware would find it easy to adopt a "user requirements-oriented" development methodology instead of the traditional "specific function-oriented" development methodology. Compared with traditional Digital Radar, a Software Radar system can be easily reconfigured and scaled up or down to adapt to changes in requirements and technologies. A demonstration Software Radar signal processing system, RadarLab 2.0, which has been developed by Tsinghua University, is introduced in this paper, and suggestions for the future development of Software Radar in China are also given in the conclusion.

  10. Peeling the Onion: Okapi System Architecture and Software Design Issues.

    Science.gov (United States)

    Jones, S.; And Others

    1997-01-01

    Discusses software design issues for Okapi, an information retrieval system that incorporates both search engine and user interface and supports weighted searching, relevance feedback, and query expansion. The basic search system, adjacency searching, and moving toward a distributed system are discussed. (Author/LRW)
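
    Okapi is the system from which the BM25 weighting scheme emerged; as a rough illustration of the weighted searching it supports, here is a minimal BM25 scorer in Python. The parameter defaults k1 = 1.2 and b = 0.75 are conventional choices, not values taken from the record.

```python
import math
from collections import Counter

def bm25_score(query_terms, doc_terms, doc_freq, n_docs, avg_len, k1=1.2, b=0.75):
    """Okapi BM25 weight of one document for a bag-of-words query.

    doc_freq : mapping term -> number of documents containing the term
    """
    tf = Counter(doc_terms)
    score = 0.0
    for term in query_terms:
        df = doc_freq.get(term, 0)
        if tf[term] == 0 or df == 0:
            continue
        idf = math.log(1 + (n_docs - df + 0.5) / (df + 0.5))
        norm = tf[term] * (k1 + 1) / (tf[term] + k1 * (1 - b + b * len(doc_terms) / avg_len))
        score += idf * norm
    return score

# Toy corpus of three "documents".
docs = [["software", "component", "retrieval"],
        ["component", "reuse"],
        ["memory", "retrieval"]]
doc_freq = Counter(t for d in docs for t in set(d))
avg_len = sum(map(len, docs)) / len(docs)
for d in docs:
    print(bm25_score(["component", "retrieval"], d, doc_freq, len(docs), avg_len))
```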

  11. Digital Preservation in Open-Source Digital Library Software

    Science.gov (United States)

    Madalli, Devika P.; Barve, Sunita; Amin, Saiful

    2012-01-01

    Digital archives and digital library projects are being initiated all over the world for materials of different formats and domains. To organize, store, and retrieve digital content, many libraries as well as archiving centers are using either proprietary or open-source software. While it is accepted that print media can survive for centuries with…

  12. Testing, installation and development of hardware and software components for the forward pixel detector of CMS

    CERN Document Server

    Florez Bustos, Carlos Andres

    2007-01-01

    The LHC (Large Hadron Collider) will be the particle accelerator with the highest collision energy ever. CMS (Compact Muon Solenoid) is one of the two largest experiments at the LHC. A main goal of CMS is to elucidate the electroweak symmetry breaking and determine if the Higgs mechanism is responsible for it. The pixel detector in CMS is the closest detector to the interaction point and is part of the tracker system. This thesis presents four different projects related to the forward pixel detector, performed as part of the testing and development of its hardware and software components. It presents the methods, implementation and results for the data acquisition and installation of the detector control system at the Meson Test Beam Facility of Fermilab for the beam test of the detector; the study of the C.A.E.N power supply and the multi service cable; the layout of the test stands for the assembly of the half-disk and half-service cylinder and the development of a software interface to the data acquisition...

  13. Database Software Selection for the Egyptian National STI Network.

    Science.gov (United States)

    Slamecka, Vladimir

    The evaluation and selection of information/data management system software for the Egyptian National Scientific and Technical (STI) Network are described. An overview of the state-of-the-art of database technology elaborates on the differences between information retrieval and database management systems (DBMS). The desirable characteristics of…

  14. Software Maintenance and Evolution: The Implication for Software ...

    African Journals Online (AJOL)

    PROF. O. E. OSUAGWU

    2013-06-01

Jun 1, 2013 ... component of software development process. [25]. It is the ... technologies and stand the test of time. 2.0 Background of ... costing and time span, and optimization of resource allocation have made long term estimation of ...

  15. Auditing the Functional Part of the CAS Software

    Directory of Open Access Journals (Sweden)

    Adamyk Oksana V.

    2017-11-01

Full Text Available The article aims to determine the procedure and methodology for auditing the functional component of the software of a computer accounting system (CAS). It has been found that software auditing should be performed separately for each component. The components of the functional part of the CAS software are the database management system (DBMS) and the application software supporting accounting automation. The first component is audited with techniques such as general evaluation and subject checks of the embedded information-processing algorithms. The client software algorithms are audited by means of the control-data method, which reduces to procedures such as creating a separate database of test data with imaginary objects and processing it with the client program, as well as introducing imaginary objects (employees, creditors, material values) into a copy of the real database and generating reports. Not only the current methods of calculating or evaluating accounting objects, but all of the software, are subject to mandatory verification. This avoids errors if the enterprise accounting policy changes.

  16. Effective Results Analysis for the Similar Software Products’ Orthogonality

    Directory of Open Access Journals (Sweden)

    Ion Ivan

    2009-10-01

Full Text Available The concept of similar software is defined, and conditions for archiving software components are established. The orthogonality of homogeneous software components is evaluated, and the correlation between orthogonality and complexity is analyzed. Groups of similar software products are then built, corresponding to orthogonality intervals. The results of the analysis are presented in graphical form, and aspects of the functioning of the software products allocated to each orthogonality interval are detailed.
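
    The record does not define the orthogonality measure. A plausible minimal sketch treats each component as a feature vector, takes orthogonality as one minus the absolute cosine similarity, and buckets the pairwise values into intervals; all names here are hypothetical.

```python
import numpy as np

def orthogonality(u, v):
    """1 - |cosine similarity|: 1.0 for fully orthogonal components, 0.0 for collinear."""
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return 1.0 - abs(cos)

def group_by_interval(vectors, edges=(0.0, 0.25, 0.5, 0.75, 1.0)):
    """Bucket every pair of component vectors into orthogonality intervals."""
    groups = {k: [] for k in range(len(edges) - 1)}
    for i in range(len(vectors)):
        for j in range(i + 1, len(vectors)):
            o = orthogonality(vectors[i], vectors[j])
            k = min(int(np.searchsorted(edges, o, side="right")) - 1, len(edges) - 2)
            groups[k].append((i, j, round(o, 3)))
    return groups

# Toy run: three components described by call-frequency feature vectors.
print(group_by_interval([np.array([1.0, 0.0]), np.array([0.0, 2.0]), np.array([1.0, 1.0])]))
```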

  17. Software quality metrics aggregation in industry

    NARCIS (Netherlands)

    Mordal, K.; Anquetil, N.; Laval, J.; Serebrenik, A.; Vasilescu, B.N.; Ducasse, S.

    2013-01-01

    With the growing need for quality assessment of entire software systems in the industry, new issues are emerging. First, because most software quality metrics are defined at the level of individual software components, there is a need for aggregation methods to summarize the results at the system

  18. Retrievable Inferior Vena Cava Filters: Factors that Affect Retrieval Success

    Energy Technology Data Exchange (ETDEWEB)

    Geisbuesch, Philipp, E-mail: philippgeisbuesch@gmx.de; Benenati, James F.; Pena, Constantino S.; Couvillon, Joseph; Powell, Alex; Gandhi, Ripal; Samuels, Shaun; Uthoff, Heiko [Baptist Cardiac and Vascular Institute, Division of Vascular and Interventional Radiology (United States)

    2012-10-15

Purpose: To report and analyze the indications, procedural success, and complications of retrievable inferior vena cava filters (rIVCF) placement and to identify parameters that influence retrieval attempt and failure. Methods: Between January 2005 and December 2010, a total of 200 patients (80 men, median age 67 years, range 11-95 years) received a rIVCF with the clinical possibility that it could be removed. All patients with rIVCF were prospectively entered into a database and followed until retrieval or a decision not to retrieve the filter was made. A retrospective analysis of this database was performed. Results: Sixty-one percent of patients had an accepted indication for filter placement; 39% of patients had a relative indication. There was a tendency toward a higher retrieval rate in patients with relative indications (40% vs. 55%, P = 0.076). Filter placement was technically successful in all patients, with no procedure-related mortality. The retrieval rate was 53%. Patient age of >80 years (odds ratio [OR] 0.056, P < 0.0001) and presence of malignancy (OR 0.303, P = 0.003) were associated with a significantly reduced probability for attempted retrieval. Retrieval failure occurred in 7% (6 of 91) of all retrieval attempts. A time interval of >90 days between implantation and attempted retrieval was associated with retrieval failure (OR 19.8, P = 0.009). Conclusions: Patient age >80 years and a history of malignancy are predictors of a reduced probability for retrieval attempt. The rate of retrieval failure is low and seems to be associated with a time interval of >90 days between filter placement and retrieval.

  19. Retrievable Inferior Vena Cava Filters: Factors that Affect Retrieval Success

    International Nuclear Information System (INIS)

    Geisbüsch, Philipp; Benenati, James F.; Peña, Constantino S.; Couvillon, Joseph; Powell, Alex; Gandhi, Ripal; Samuels, Shaun; Uthoff, Heiko

    2012-01-01

Purpose: To report and analyze the indications, procedural success, and complications of retrievable inferior vena cava filters (rIVCF) placement and to identify parameters that influence retrieval attempt and failure. Methods: Between January 2005 and December 2010, a total of 200 patients (80 men, median age 67 years, range 11–95 years) received a rIVCF with the clinical possibility that it could be removed. All patients with rIVCF were prospectively entered into a database and followed until retrieval or a decision not to retrieve the filter was made. A retrospective analysis of this database was performed. Results: Sixty-one percent of patients had an accepted indication for filter placement; 39% of patients had a relative indication. There was a tendency toward a higher retrieval rate in patients with relative indications (40% vs. 55%, P = 0.076). Filter placement was technically successful in all patients, with no procedure-related mortality. The retrieval rate was 53%. Patient age of >80 years (odds ratio [OR] 0.056, P < 0.0001) and presence of malignancy (OR 0.303, P = 0.003) were associated with a significantly reduced probability for attempted retrieval. Retrieval failure occurred in 7% (6 of 91) of all retrieval attempts. A time interval of >90 days between implantation and attempted retrieval was associated with retrieval failure (OR 19.8, P = 0.009). Conclusions: Patient age >80 years and a history of malignancy are predictors of a reduced probability for retrieval attempt. The rate of retrieval failure is low and seems to be associated with a time interval of >90 days between filter placement and retrieval.

  20. Managing Scientific Software Complexity with Bocca and CCA

    Directory of Open Access Journals (Sweden)

    Benjamin A. Allan

    2008-01-01

Full Text Available In high-performance scientific software development, the emphasis is often on short time to first solution. Even when the development of new components mostly reuses existing components or libraries and only small amounts of new code must be created, dealing with the component glue code and software build processes to obtain complete applications is still tedious and error-prone. Component-based software meant to reduce complexity at the application level increases complexity to the extent that the user must learn and remember the interfaces and conventions of the component model itself. To address these needs, we introduce Bocca, the first tool to enable application developers to perform rapid component prototyping while maintaining robust software-engineering practices suitable to HPC environments. Bocca provides project management and a comprehensive build environment for creating and managing applications composed of Common Component Architecture components. Of critical importance for high-performance computing (HPC) applications, Bocca is designed to operate in a language-agnostic way, simultaneously handling components written in any of the languages commonly used in scientific applications: C, C++, Fortran, Python and Java. Bocca automates the tasks related to the component glue code, freeing the user to focus on the scientific aspects of the application. Bocca embraces the philosophy pioneered by Ruby on Rails for web applications: start with something that works, and evolve it to the user's purpose.

  1. An empirical study of software architectures' effect on product quality

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Jonasson, Kristjan; Neukirchen, Helmut

    2011-01-01

Software architectures shift the focus of developers from lines-of-code to coarser-grained components and their interconnection structure. Unlike fine-grained objects, these components typically encompass business functionality and need to be aware of the underlying business processes. Hence, the interface of a component should reflect relevant parts of the business process and the software architecture should emphasize the coordination among components. To shed light on these issues, we provide a framework for component-based software architectures focusing on the process perspective. The interface...

  2. MERIS Retrieval of Water Quality Components in the Turbid Albemarle-Pamlico Sound Estuary, USA

    Directory of Open Access Journals (Sweden)

    Hans W. Paerl

    2011-04-01

Full Text Available Two remote-sensing optical algorithms for the retrieval of the water quality components (WQCs) in the Albemarle-Pamlico Estuarine System (APES) were developed and validated for chlorophyll a (Chl). Both algorithms were semi-empirical because they incorporated some elements of optical processes in the atmosphere, water, and air/water interface. One incorporated a very simple atmospheric correction and a modified quasi-single-scattering approximation (QSSA) for estimating the spectral Gordon’s parameter, and the second estimated WQCs directly from the top-of-atmosphere satellite radiance without atmospheric corrections. A modified version of the Global Meteorological Database for Solar Energy and Applied Meteorology (METEONORM) was used to estimate directional atmospheric transmittances. The study incorporated in situ Chl data from the Ferry-Based Monitoring (FerryMon) program collected in the Neuse River Estuary (n = 633) and Pamlico Sound (n = 362), along with Medium Resolution Imaging Spectrometer (MERIS) satellite imagery collected (2006–2009) across the APES, providing quasi-coinciding samples for Chl algorithm development and validation. Results indicated a coefficient of determination (R2) of 0.70 and mean-normalized root-mean-square error (NRMSE) of 52% in the Neuse River Estuary, and R2 = 0.44 (NRMSE = 75%) in the Pamlico Sound—without atmospheric corrections. The simple atmospheric correction tested provided no performance improvements. Algorithm performance demonstrated the potential for supporting long-term operational WQC satellite monitoring in the APES.
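
    As a small worked example of the validation statistics quoted above, the sketch below computes R2 and a mean-normalized RMSE for retrieved versus in situ Chl. It assumes "mean-normalized" means RMSE divided by the in situ mean, which the abstract suggests but does not state.

```python
import numpy as np

def r2_nrmse(chl_insitu, chl_retrieved):
    """Coefficient of determination and mean-normalized RMSE (percent)."""
    insitu = np.asarray(chl_insitu, dtype=float)
    retrieved = np.asarray(chl_retrieved, dtype=float)
    resid = insitu - retrieved
    r2 = 1.0 - np.sum(resid ** 2) / np.sum((insitu - insitu.mean()) ** 2)
    nrmse = 100.0 * np.sqrt(np.mean(resid ** 2)) / insitu.mean()
    return r2, nrmse

# Toy run with synthetic in situ / retrieved chlorophyll pairs (ug/L).
rng = np.random.default_rng(0)
truth = rng.uniform(1, 40, 300)
print(r2_nrmse(truth, truth + rng.normal(0, 5, 300)))
```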

  3. A software engineering process for safety-critical software application

    International Nuclear Information System (INIS)

    Kang, Byung Heon; Kim, Hang Bae; Chang, Hoon Seon; Jeon, Jong Sun

    1995-01-01

Application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature, describing a viable approach to high-quality safety-critical software development. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process is significantly different from a conventional process in terms of rigorous software development phases and software design techniques. The process covers documentation, design, verification, and testing, using mathematically precise notations and highly reviewable tabular formats to specify software requirements and software design, and verifying requirements and code against the software design using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design using a modular design technique, so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also facilitates a high degree of confidence in the 'correctness' of the software production, and provides a relatively simple and straightforward code implementation effort. 1 figs., 10 refs. (Author)

  4. Impact of Base Functional Component Types on Software Functional Size based Effort Estimation

    OpenAIRE

    Gencel, Cigdem; Buglione, Luigi

    2008-01-01

    Software effort estimation is still a significant challenge for software management. Although Functional Size Measurement (FSM) methods have been standardized and have become widely used by the software organizations, the relationship between functional size and development effort still needs further investigation. Most of the studies focus on the project cost drivers and consider total software functional size as the primary input to estimation models. In this study, we investigate whether u...

  5. Interface-based software testing

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-10-01

    Full Text Available Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of software architecture is to design quality of software through modeling and visualization. There are many methods and standards that define how to control and manage quality. However, many IT software development projects still fail due to the difficulties involved in measuring, controlling, and managing software quality. Software quality failure factors are numerous. Examples include beginning to test software too late in the development process, or failing properly to understand, or design, the software architecture and the software component structure. The goal of this article is to provide an interface-based software testing technique that better measures software quality, automates software quality testing, encourages early testing, and increases the software’s overall testability
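
    A minimal illustration of the idea in Python: tests are written against an abstract interface (the contract) rather than a concrete implementation, so any implementation can be verified by the same suite. The Repository interface is a hypothetical example, not taken from the article.

```python
import abc
import unittest

class Repository(abc.ABC):
    """Interface under test: implementations are verified against this contract."""
    @abc.abstractmethod
    def put(self, key: str, value: str) -> None: ...
    @abc.abstractmethod
    def get(self, key: str) -> str: ...

class InMemoryRepository(Repository):
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data[key]

class RepositoryContractTest(unittest.TestCase):
    """Interface-based tests: exercise the contract, not the implementation."""
    def test_roundtrip(self):
        repo: Repository = InMemoryRepository()
        repo.put("id", "component")
        self.assertEqual(repo.get("id"), "component")

if __name__ == "__main__":
    unittest.main()
```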

  6. Surveillance Analysis Computer System (SACS): Software requirements specification (SRS). Revision 2

    International Nuclear Information System (INIS)

    Glasscock, J.A.

    1995-01-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) database, an Impact Level 3Q system. SACS stores information on tank temperatures, surface levels, and interstitial liquid levels. This information is retrieved by the customer through a PC-based interface and is then available to a number of other software tools. The software requirements specification (SRS) describes the system requirements for the SACS Project, and follows the Standard Engineering Practices (WHC-CM-6-1), Software Practices (WHC-CM-3-10) and Quality Assurance (WHC-CM-4-2, QR 19.0) policies

  7. Software criticality analysis of COTS/SOUP

    Energy Technology Data Exchange (ETDEWEB)

    Bishop, Peter; Bloomfield, Robin; Clement, Tim; Guerra, Sofia

    2003-09-01

    This paper describes the Software Criticality Analysis (SCA) approach that was developed to support the justification of using commercial off-the-shelf software (COTS) in a safety-related system. The primary objective of SCA is to assess the importance to safety of the software components within the COTS and to show there is segregation between software components with different safety importance. The approach taken was a combination of Hazops based on design documents and on a detailed analysis of the actual code (100 kloc). Considerable effort was spent on validation and ensuring the conservative nature of the results. The results from reverse engineering from the code showed that results based only on architecture and design documents would have been misleading.

  8. Software criticality analysis of COTS/SOUP

    International Nuclear Information System (INIS)

    Bishop, Peter; Bloomfield, Robin; Clement, Tim; Guerra, Sofia

    2003-01-01

    This paper describes the Software Criticality Analysis (SCA) approach that was developed to support the justification of using commercial off-the-shelf software (COTS) in a safety-related system. The primary objective of SCA is to assess the importance to safety of the software components within the COTS and to show there is segregation between software components with different safety importance. The approach taken was a combination of Hazops based on design documents and on a detailed analysis of the actual code (100 kloc). Considerable effort was spent on validation and ensuring the conservative nature of the results. The results from reverse engineering from the code showed that results based only on architecture and design documents would have been misleading

  9. Innovative Research on the Development of Game-based Tourism Information Services Using Component-based Software Engineering

    Directory of Open Access Journals (Sweden)

    Wei-Hsin Huang

    2018-02-01

Full Text Available In recent years, a number of studies have been conducted exploring the potential of digital tour guides, that is, multimedia components (e.g., 2D graphics, 3D models, and sound effects) that can be integrated into digital storytelling with location-based services. This study uses component-based software engineering to develop the content of game-based tourism information services. The results of this study are combined with 3D VR/AR technology to implement the digital 2D/3D interactive tour guide and show all the attractions’ information on a service platform for the gamification of cultural tourism. Nine kinds of game templates have been built in the component module. Five locations have completed indoor or external 3D VR real scenes and provide online visitors with a virtual tour of the indoor or outdoor attractions. The AR interactive work has three logos. The interactive digital guide includes animated tour guides, interactive guided tours, directions and interactive guides. Based on the usage analysis of the component databases built by this study, VR game types are suited to object-oriented game templates, such as the puzzle game template and the treasure hunt game template. Background music is the database component required for each game. The icons and cue tones are the most commonly used components in 2D graphics and sound effects, but the icons are gathered in different directions to approximate the shape of the component to be consistent. This study built a vivid story of a scene tour for online visitors to enhance the interactive digital guide. However, the developer can rapidly build new digital guides by rearranging the components of the modules to shorten the development time by taking advantage of the usage frequency of various databases that have been built by this study to effectively continue to build and expand the database components. Therefore, more game-based digital tour guides can be created to make better defined high

  10. The software-cycle model for re-engineering and reuse

    Science.gov (United States)

    Bailey, John W.; Basili, Victor R.

    1992-01-01

    This paper reports on the progress of a study which will contribute to our ability to perform high-level, component-based programming by describing means to obtain useful components, methods for the configuration and integration of those components, and an underlying economic model of the costs and benefits associated with this approach to reuse. One goal of the study is to develop and demonstrate methods to recover reusable components from domain-specific software through a combination of tools, to perform the identification, extraction, and re-engineering of components, and domain experts, to direct the applications of those tools. A second goal of the study is to enable the reuse of those components by identifying techniques for configuring and recombining the re-engineered software. This component-recovery or software-cycle model addresses not only the selection and re-engineering of components, but also their recombination into new programs. Once a model of reuse activities has been developed, the quantification of the costs and benefits of various reuse options will enable the development of an adaptable economic model of reuse, which is the principal goal of the overall study. This paper reports on the conception of the software-cycle model and on several supporting techniques of software recovery, measurement, and reuse which will lead to the development of the desired economic model.

  11. Effective Results Analysis for the Similar Software Products’ Orthogonality

    OpenAIRE

    Ion Ivan; Daniel Milodin

    2009-01-01

The concept of similar software is defined, and conditions for archiving software components are established. The orthogonality of homogeneous software components is evaluated, and the correlation between orthogonality and complexity is analyzed. Groups of similar software products are then built, corresponding to orthogonality intervals. The results of the analysis are presented in graphical form. There are detailed aspects of the functioning o...

  12. Retrievability, ethics and democracy

    International Nuclear Information System (INIS)

    Jensen, M.; Westerlind, M.

    2000-01-01

    Ethics is always a social concern, an integrated part of laws and regulations. Treatment of ethics as a separate part in the decision making process is therefore always debatable. It cannot be introduced as an extraneous component to compensate for, or to improve, a morally flawed practice, and the margin for unethical practices is strongly circumscribed by regulation in the nuclear field, internationally. However, a discussion on different stakeholders and their different ethical concerns should always be welcome. One example is the implementer's views on ethics. Even if they are in complete parity with existing legal and regulatory goals, the goals may still represent the implementer's own motives and choices. Also, stakeholders may view the laws or regulations as unfair. In making the critique, the stakeholder simply formulates a separate political standpoint. Finally, an alternative discussion is to place existing regulations into an ethical perspective - adding a new dimension to the issues. Retrievability for high level waste repositories is often in focus in ethical discussions. Unfortunately, it is used in many ways and has become an unclear term. It may cover anything from planned recuperation to the property of waste being retrievable in years or tens of years, or in the distant time range of hundreds or thousands of years. The term retrievability is often proposed to cover mainly positive qualities such as the option of later changes to the repository or a new disposal concept. However, as ICRP and others have pointed out, it also implies the possibility of: i) operational exposures, ii) continuing risks of accidental releases, iii) financial provisions to cover operating costs and iv) continuing reliance on institutional control, thus imposing some burdens to future generations. In a certain sense, anything can be retrieved from any repository. There is therefore a need for a clear and operable definition of retrievability requirements, including the

  13. A Pattern Language for the Evolution of Component-based Software Architectures

    DEFF Research Database (Denmark)

    Ahmad, Aakash; Jamshidi, Pooyan; Pahl, Claus

    2013-01-01

Modern software systems are prone to a continuous evolution under frequently varying requirements. Architecture-centric software evolution enables change in system’s structure and behavior while maintaining a global view of the software to address evolution-centric tradeoffs. Lehman’s law… evolution problems. We propose that architectural evolution process requires an explicit evolution-centric knowledge – that can be discovered, shared, and reused – to anticipate and guide change management. Therefore, we present a pattern language as a collection of interconnected change patterns…) as a complementary and integrated phase to facilitate reuse-driven architecture change execution (pattern language application). Reuse-knowledge in the proposed pattern language is expressed as a formalised collection of interconnected-patterns. Individual patterns in the language build on each other to facilitate...

  14. The Software Management Environment (SME)

    Science.gov (United States)

    Valett, Jon D.; Decker, William; Buell, John

    1988-01-01

    The Software Management Environment (SME) is a research effort designed to utilize the past experiences and results of the Software Engineering Laboratory (SEL) and to incorporate this knowledge into a tool for managing projects. SME provides the software development manager with the ability to observe, compare, predict, analyze, and control key software development parameters such as effort, reliability, and resource utilization. The major components of the SME, the architecture of the system, and examples of the functionality of the tool are discussed.

  15. cPath: open source software for collecting, storing, and querying biological pathways

    Directory of Open Access Journals (Sweden)

    Gross Benjamin E

    2006-11-01

    Full Text Available Abstract Background Biological pathways, including metabolic pathways, protein interaction networks, signal transduction pathways, and gene regulatory networks, are currently represented in over 220 diverse databases. These data are crucial for the study of specific biological processes, including human diseases. Standard exchange formats for pathway information, such as BioPAX, CellML, SBML and PSI-MI, enable convenient collection of this data for biological research, but mechanisms for common storage and communication are required. Results We have developed cPath, an open source database and web application for collecting, storing, and querying biological pathway data. cPath makes it easy to aggregate custom pathway data sets available in standard exchange formats from multiple databases, present pathway data to biologists via a customizable web interface, and export pathway data via a web service to third-party software, such as Cytoscape, for visualization and analysis. cPath is software only, and does not include new pathway information. Key features include: a built-in identifier mapping service for linking identical interactors and linking to external resources; built-in support for PSI-MI and BioPAX standard pathway exchange formats; a web service interface for searching and retrieving pathway data sets; and thorough documentation. The cPath software is freely available under the LGPL open source license for academic and commercial use. Conclusion cPath is a robust, scalable, modular, professional-grade software platform for collecting, storing, and querying biological pathways. It can serve as the core data handling component in information systems for pathway visualization, analysis and modeling.
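
    As a hedged sketch of how third-party software might call such a web service, the snippet below issues an HTTP query against a hypothetical cPath instance. The base URL, command name, and parameter names are assumptions for illustration and should be checked against the cPath documentation.

```python
import requests

# Hypothetical cPath instance URL; not taken from the abstract.
BASE = "http://example.org/cpath/webservice.do"

def get_pathways_by_keyword(keyword, output="biopax"):
    """Query a cPath-style web service for pathway records matching a keyword."""
    params = {
        "version": "2.0",
        "cmd": "get_by_keyword",   # assumed command name
        "q": keyword,
        "format": output,
    }
    resp = requests.get(BASE, params=params, timeout=30)
    resp.raise_for_status()
    return resp.text               # BioPAX XML for downstream tools (e.g., Cytoscape)

if __name__ == "__main__":
    print(get_pathways_by_keyword("glycolysis")[:200])
```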

  16. Managing Written Directives: A Software Solution to Streamline Workflow.

    Science.gov (United States)

    Wagner, Robert H; Savir-Baruch, Bital; Gabriel, Medhat S; Halama, James R; Bova, Davide

    2017-06-01

A written directive is required by the U.S. Nuclear Regulatory Commission for any use of 131I above 1.11 MBq (30 μCi) and for patients receiving radiopharmaceutical therapy. This requirement has also been adopted and must be enforced by the agreement states. As the introduction of new radiopharmaceuticals increases therapeutic options in nuclear medicine, time spent on regulatory paperwork also increases. The pressure of managing these time-consuming regulatory requirements may heighten the potential for inaccurate or incomplete directive data and subsequent regulatory violations. To improve on the paper-trail method of directive management, we created a software tool using a Health Insurance Portability and Accountability Act (HIPAA)-compliant database. This software allows for secure data-sharing among physicians, technologists, and managers while saving time, reducing errors, and eliminating the possibility of loss and duplication. Methods: The software tool was developed using Visual Basic, which is part of the Visual Studio development environment for the Windows platform. Patient data are deposited in an Access database on a local HIPAA-compliant secure server or hard disk. Once a working version had been developed, it was installed at our institution and used to manage directives. Updates and modifications of the software were released regularly until no more significant problems were found with its operation. Results: The software has been used at our institution for over 2 y and has reliably kept track of all directives. All physicians and technologists use the software daily and find it superior to paper directives. They can retrieve active directives at any stage of completion, as well as completed directives. Conclusion: We have developed a software solution for the management of written directives that streamlines and structures the departmental workflow. This solution saves time, centralizes the information for all staff to share, and decreases

  17. Frontiers in NDE research nearing maturity for exploitation to ensure structural integrity of pressure retaining components

    International Nuclear Information System (INIS)

    Raj, Baldev; Mukhopadhyay, C.K.; Jayakumar, T.

    2006-01-01

In this paper, research and developmental efforts that demonstrate high sensitivity detection and characterization of defects and assessment of microstructural degradation, residual stresses and fatigue damage in materials using different non-destructive evaluation (NDE) techniques, have been discussed. Applications of eddy current techniques for quantitative defect characterization and for generalized applications, and remote field eddy current technique for inspection of steam generator and heat exchanger tubes have been discussed. Advanced ultrasonic methods such as time of flight diffraction, synthetic aperture focusing technique, phased array and signal processing for detection, characterization and imaging of defects have been discussed. Applications of ultrasonics and magnetic Barkhausen emission techniques for characterization of microstructures and residual stresses have been discussed. Applications of acoustic emission and infrared thermography techniques for weld quality evaluation of critical nuclear components as part of intelligent processing of materials (IPM) work have been discussed. Application of acoustic emission technique for integrity assessment of pressurized components has been discussed. Development of software called the assets and infrastructure management system (AIMS), for storing and retrieving information for various materials, components and systems, has also been highlighted. The techniques and applications discussed are the result of systematic and innovative R and D efforts in the multidisciplinary areas of physics, materials, instrumentation, sensors and software for providing solutions to various challenging problems

  18. Two retrievals from a single cue: A bottleneck persists across episodic and semantic memory.

    Science.gov (United States)

    Orscheschek, Franziska; Strobach, Tilo; Schubert, Torsten; Rickard, Timothy

    2018-05-01

There is evidence in the literature that two retrievals from long-term memory cannot occur in parallel. To date, however, that work has explored only the case of two retrievals from newly acquired episodic memory. These studies demonstrated a retrieval bottleneck even after dual-retrieval practice. That retrieval bottleneck may be a global property of long-term memory retrieval, or it may apply only to the case of two retrievals from episodic memory. In the current experiments, we explored whether that apparent dual-retrieval bottleneck applies to the case of one retrieval from episodic memory and one retrieval from highly overlearned semantic memory. Across three experiments, subjects learned to retrieve a left or right keypress response from a set of 14 unique word cues (e.g., black-right keypress). In addition, they learned a verbal response which involved retrieving the antonym of the presented cue (e.g., black-"white"). In the dual-retrieval condition, subjects had to retrieve both the keypress response and the antonym word. The results suggest that the retrieval bottleneck is superordinate to specific long-term memory systems and holds across different memory components. In addition, the results support the assumption of a cue-level response chunking account of learned retrieval parallelism.

  19. Selective memory retrieval can impair and improve retrieval of other memories.

    Science.gov (United States)

    Bäuml, Karl-Heinz T; Samenieh, Anuscheh

    2012-03-01

Research from the past decades has shown that retrieval of a specific memory (e.g., retrieving part of a previous vacation) typically attenuates retrieval of other memories (e.g., memories for other details of the event), causing retrieval-induced forgetting. More recently, however, it has been shown that retrieval can both attenuate and aid recall of other memories (K.-H. T. Bäuml & A. Samenieh, 2010). To identify the circumstances under which retrieval aids recall, the authors examined retrieval dynamics in listwise directed forgetting, context-dependent forgetting, proactive interference, and in the absence of any induced memory impairment. They found beneficial effects of selective retrieval in listwise directed forgetting and context-dependent forgetting but detrimental effects in all the other conditions. Because context-dependent forgetting and listwise directed forgetting arguably reflect impaired context access, the results suggest that memory retrieval aids recall of memories that are subject to impaired context access but attenuates recall in the absence of such circumstances. The findings are consistent with a 2-factor account of memory retrieval and suggest the existence of 2 faces of memory retrieval.

  20. Qualification of a Null Lens Using Image-Based Phase Retrieval

    Science.gov (United States)

    Bolcar, Matthew R.; Aronstein, David L.; Hill, Peter C.; Smith, J. Scott; Zielinski, Thomas P.

    2012-01-01

In measuring the figure error of an aspheric optic using a null lens, the wavefront contribution from the null lens must be independently and accurately characterized in order to isolate the optical performance of the aspheric optic alone. Various techniques can be used to characterize such a null lens, including interferometry, profilometry and image-based methods. Only image-based methods, such as phase retrieval, can measure the null-lens wavefront in situ - in single-pass, and at the same conjugates and in the same alignment state in which the null lens will ultimately be used - with no additional optical components. Due to the intended purpose of a null lens (e.g., to null a large aspheric wavefront with a near-equal-but-opposite spherical wavefront), characterizing a null-lens wavefront presents several challenges to image-based phase retrieval: Large wavefront slopes and high-dynamic-range data decrease the capture range of phase-retrieval algorithms, increase the requirements on the fidelity of the forward model of the optical system, and make it difficult to extract diagnostic information (e.g., the system F/#) from the image data. In this paper, we present a study of these effects on phase-retrieval algorithms in the context of a null lens used in component development for the Climate Absolute Radiance and Refractivity Observatory (CLARREO) mission. Approaches for mitigation are also discussed.
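
    For orientation, a minimal sketch of classic iterative (Gerchberg-Saxton) phase retrieval appears below; the null-lens work described here uses image-based phase retrieval under far more demanding conditions, so this is only the textbook baseline, not the authors' algorithm.

```python
import numpy as np

def gerchberg_saxton(pupil_amp, psf_intensity, n_iter=200, seed=0):
    """Classic Gerchberg-Saxton phase retrieval: alternate between pupil-plane
    amplitude constraints and measured focal-plane intensity constraints."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(-np.pi, np.pi, pupil_amp.shape)
    focal_amp = np.sqrt(psf_intensity)
    for _ in range(n_iter):
        pupil = pupil_amp * np.exp(1j * phase)
        field = np.fft.fft2(pupil)                        # propagate to focus
        field = focal_amp * np.exp(1j * np.angle(field))  # impose measured intensity
        back = np.fft.ifft2(field)                        # propagate back
        phase = np.angle(back)                            # keep phase, reapply amplitude
    return phase

# Toy run: circular pupil, simulated PSF from a known phase map.
n = 64
y, x = np.mgrid[-n//2:n//2, -n//2:n//2]
pupil_amp = (x**2 + y**2 < (n//4)**2).astype(float)
psf = np.abs(np.fft.fft2(pupil_amp * np.exp(1j * 0.5 * x / n))) ** 2
estimated_phase = gerchberg_saxton(pupil_amp, psf)
```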

  1. Spectrum-based Fault Localization in Embedded Software

    NARCIS (Netherlands)

    Abreu, R.

    2009-01-01

Locating software components that are responsible for observed failures is a time-intensive and expensive phase in the software development cycle. Automatic fault localization techniques aid developers/testers in pinpointing the root cause of software failures, thereby reducing the debugging effort.
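
    A common instantiation of spectrum-based fault localization ranks components by the Ochiai similarity coefficient computed from program spectra and test outcomes. The sketch below is a generic version of that idea, not code from the thesis.

```python
import numpy as np

def ochiai(spectra, errors):
    """Spectrum-based fault localization with the Ochiai coefficient.

    spectra : (n_tests, n_components) binary matrix; 1 = component touched by test
    errors  : (n_tests,) binary vector; 1 = test failed
    Returns per-component suspiciousness, higher = more likely faulty.
    """
    spectra = np.asarray(spectra, dtype=float)
    errors = np.asarray(errors, dtype=float)
    n_fail = errors.sum()
    a_ef = errors @ spectra          # failing tests that execute the component
    a_ep = (1 - errors) @ spectra    # passing tests that execute the component
    denom = np.sqrt(n_fail * (a_ef + a_ep))
    return np.divide(a_ef, denom, out=np.zeros_like(a_ef), where=denom > 0)

# Toy run: component 0 (executed only by the two failing tests) scores highest.
spectra = [[1, 1, 0], [0, 1, 1], [1, 1, 0]]
errors = [1, 0, 1]
print(ochiai(spectra, errors))
```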

  2. Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator

    Science.gov (United States)

    Bolen, Kenny; Greenlaw, Ronald

    2010-01-01

    A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA s Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/ Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.
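
    The real tool is a K-shell script; purely to illustrate the generation pattern it describes (scan the CD's CSCI subdirectories and emit one transfer script per CSCI), here is a hypothetical Python analogue with made-up paths.

```python
import stat
from pathlib import Path

def generate_transfer_scripts(cd_root, out_dir, pcs_scratch="/pcs/scratch"):
    """For each CSCI subdirectory on the flight-software CD, emit a small
    shell script that copies that subdirectory's files to the PCS scratch
    directory (paths here are illustrative, not the real ISS layout)."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for csci in sorted(Path(cd_root).iterdir()):
        if not csci.is_dir():
            continue
        script = out / f"transfer_{csci.name}.sh"
        lines = ["#!/bin/sh"] + [
            f"cp '{f}' '{pcs_scratch}/'" for f in sorted(csci.rglob("*")) if f.is_file()
        ]
        script.write_text("\n".join(lines) + "\n")
        script.chmod(script.stat().st_mode | stat.S_IXUSR)   # make executable
```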

  3. Foreign Body Retrieval

    Medline Plus

Full Text Available Foreign body retrieval involves the removal of ...

  4. The role of retrieval mode and retrieval orientation in retrieval practice: insights from comparing recognition memory testing formats and restudying.

    Science.gov (United States)

    Gao, Chuanji; Rosburg, Timm; Hou, Mingzhu; Li, Bingbing; Xiao, Xin; Guo, Chunyan

    2016-12-01

    The effectiveness of retrieval practice for aiding long-term memory, referred to as the testing effect, has been widely demonstrated. However, the specific neurocognitive mechanisms underlying this phenomenon remain unclear. In the present study, we sought to explore the role of pre-retrieval processes at initial testing on later recognition performance by using event-related potentials (ERPs). Subjects studied two lists of words (Chinese characters) and then performed a recognition task or a source memory task, or restudied the word lists. At the end of the experiment, subjects received a final recognition test based on the remember-know paradigm. Behaviorally, initial testing (active retrieval) enhanced memory retention relative to restudying (passive retrieval). The retrieval mode at initial testing was indexed by more positive-going ERPs for unstudied items in the active-retrieval tasks than in passive retrieval from 300 to 900 ms. Follow-up analyses showed that the magnitude of the early ERP retrieval mode effect (300-500 ms) was predictive of the behavioral testing effect later on. In addition, the ERPs for correctly rejected new items during initial testing differed between the two active-retrieval tasks from 500 to 900 ms, and this ERP retrieval orientation effect predicted differential behavioral testing gains between the two active-retrieval conditions. Our findings confirm that initial testing promotes later retrieval relative to restudying, and they further suggest that adopting pre-retrieval processing in the forms of retrieval mode and retrieval orientation might contribute to these memory enhancements.

  5. The Qualification Experiences for Safety-critical Software of POSAFE-Q

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jang Yeol; Son, Kwang Seop; Cheon, Se Woo; Lee, Jang Soo; Kwon, Kee Choon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2009-05-15

Programmable Logic Controllers (PLC) have been applied to the Reactor Protection System (RPS) and the Engineered Safety Feature (ESF)-Component Control System (CCS) as the major safety system components of nuclear power plants. This paper describes experiences in the qualification of the safety-critical software, including the pCOS kernel and system tasks, related to a safety-grade PLC, i.e., the work done for Software Verification and Validation, Software Safety Analysis, Software Quality Assurance, and Software Configuration Management, etc.

  6. Encoding and Retrieval Interference in Sentence Comprehension: Evidence from Agreement

    Directory of Open Access Journals (Sweden)

    Sandra Villata

    2018-01-01

Full Text Available Long-distance verb-argument dependencies generally require the integration of a fronted argument when the verb is encountered for sentence interpretation. Under a parsing model that handles long-distance dependencies through a cue-based retrieval mechanism, retrieval is hampered when retrieval cues also resonate with non-target elements (retrieval interference). However, similarity-based interference may also stem from interference arising during the encoding of elements in memory (encoding interference), an effect that is not directly accountable for by a cue-based retrieval mechanism. Although encoding and retrieval interference are clearly distinct at the theoretical level, it is difficult to disentangle the two on empirical grounds, since encoding interference may also manifest at the retrieval region. We report two self-paced reading experiments aimed at teasing apart the role of each component in gender and number subject-verb agreement in Italian and English object relative clauses. In Italian, the verb does not agree in gender with the subject, thus providing no cue for retrieval. In English, although present tense verbs agree in number with the subject, past tense verbs do not, allowing us to test the role of number as a retrieval cue within the same language. Results from both experiments converge, showing similarity-based interference at encoding, and some evidence for an effect at retrieval. After having pointed out the non-negligible role of encoding in sentence comprehension, and noting that Lewis and Vasishth’s (2005) ACT-R model of sentence processing, the most fully developed cue-based retrieval approach to sentence processing, does not predict encoding effects, we propose an augmentation of this model that predicts these effects. We then also propose a self-organizing sentence processing model (SOSP), which has the advantage of accounting for retrieval and encoding interference with a single mechanism.

  7. Encoding and Retrieval Interference in Sentence Comprehension: Evidence from Agreement

    Science.gov (United States)

    Villata, Sandra; Tabor, Whitney; Franck, Julie

    2018-01-01

    Long-distance verb-argument dependencies generally require the integration of a fronted argument when the verb is encountered for sentence interpretation. Under a parsing model that handles long-distance dependencies through a cue-based retrieval mechanism, retrieval is hampered when retrieval cues also resonate with non-target elements (retrieval interference). However, similarity-based interference may also stem from interference arising during the encoding of elements in memory (encoding interference), an effect that is not directly accountable for by a cue-based retrieval mechanism. Although encoding and retrieval interference are clearly distinct at the theoretical level, it is difficult to disentangle the two on empirical grounds, since encoding interference may also manifest at the retrieval region. We report two self-paced reading experiments aimed at teasing apart the role of each component in gender and number subject-verb agreement in Italian and English object relative clauses. In Italian, the verb does not agree in gender with the subject, thus providing no cue for retrieval. In English, although present tense verbs agree in number with the subject, past tense verbs do not, allowing us to test the role of number as a retrieval cue within the same language. Results from both experiments converge, showing similarity-based interference at encoding, and some evidence for an effect at retrieval. After having pointed out the non-negligible role of encoding in sentence comprehension, and noting that Lewis and Vasishth’s (2005) ACT-R model of sentence processing, the most fully developed cue-based retrieval approach to sentence processing does not predict encoding effects, we propose an augmentation of this model that predicts these effects. We then also propose a self-organizing sentence processing model (SOSP), which has the advantage of accounting for retrieval and encoding interference with a single mechanism. PMID:29403414

  8. Effective Software Engineering Leadership for Development Programs

    Science.gov (United States)

    Cagle West, Marsha

    2010-01-01

    Software is a critical component of systems ranging from simple consumer appliances to complex health, nuclear, and flight control systems. The development of quality, reliable, and effective software solutions requires the incorporation of effective software engineering processes and leadership. Processes, approaches, and methodologies for…

  9. Physics Validation of the LHC Software

    CERN Multimedia

    CERN. Geneva

    2004-01-01

The LHC Software will be confronted with unprecedented challenges as soon as the LHC turns on. We summarize the main Software requirements coming from the LHC detectors, triggers and physics, and we discuss several examples of Software components developed by the experiments and the LCG project (simulation, reconstruction, etc.), their validation, and their adequacy for LHC physics.

  10. Asset management: integrated software optimizes production performance

    Energy Technology Data Exchange (ETDEWEB)

    Polczer, S.

    1998-06-01

Two new multi-dimensional databases, which expand the 'row and column' concept of spreadsheets into multiple categories of data called dimensions, are described. These integrated software packages provide the foundation for industry players such as Poco Petroleum Ltd and Numac Energy Inc to gain a competitive advantage by overhauling their respective data collection and retrieval systems to allow for timely cost analysis and financial reporting. Energy Warehouse, an on-line analytical processing product marketed by SysGold Ltd, is one of the software products described. It gathers various sources of information, allows advanced searches and generates reports previously unavailable in other conventional financial accounting systems. The second product discussed - the Canadian Upstream Energy System (CUES) - is an on-line analytical processing system developed by Oracle Corporation and Calgary-based Applied Terravision Systems (ATS) Inc. CUES combines Oracle's universal data server and software development tools with ATS's upstream financial, land, geotechnical and production applications. The software also allows for optimization of facilities, analysis of production efficiencies and comparison of performance against industry standards.

  11. Effective material parameter retrieval of anisotropic elastic metamaterials with inherent nonlocality

    Science.gov (United States)

    Lee, Hyung Jin; Lee, Heung Son; Ma, Pyung Sik; Kim, Yoon Young

    2016-09-01

In this paper, the scattering (S-) parameter retrieval method is presented specifically for anisotropic elastic metamaterials; so far, no retrieval has been accomplished when elastic metamaterials exhibit fully anisotropic behavior. The complex constitutive properties and intrinsic scattering behavior of elastic metamaterials make their characterization far more complicated than that for acoustic and electromagnetic metamaterials. In particular, elastic metamaterials generally exhibit anisotropic scattering behavior due to higher scattering modes associated with shear deformation. They also exhibit nonlocal responses to some degree, which originate from strong multiple scattering interactions even in the long wavelength limit. Accordingly, the conventional S-parameter retrieval methods cannot be directly used for elastic metamaterials, because they determine only the diagonal components in the effective tensor property. Also, the conventional methods simply use the analytic inversion formulae for the material characterization, so that inherent nonlocality cannot be taken into account. To establish a retrieval method applicable to anisotropic elastic metamaterials, we propose an alternative S-parameter method to deal with full anisotropy of elastic metamaterials. To retrieve the whole effective anisotropic parameter set, we utilize not only normal but also oblique wave incidences. For the retrieval, we first retrieve the ratio of the effective stiffness tensor to effective density and then determine the effective density. The proposed retrieval method is validated by characterizing the effective material parameters of various types of non-resonant anisotropic metamaterials. It is found that the whole effective parameters are retrieved consistently regardless of the retrieval conditions used, in spite of inherent nonlocality.
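
    For contrast, the "conventional" analytic inversion the abstract refers to can be sketched for a homogeneous slab at normal incidence (Smith-style formulas from electromagnetic metamaterials). The paper's point is precisely that such formulas recover only diagonal effective parameters and ignore nonlocality, hence the oblique-incidence extension.

```python
import numpy as np

def retrieve_normal_incidence(s11, s21, k0, d, branch=0):
    """Conventional S-parameter inversion for a homogeneous slab at normal
    incidence: effective impedance z and refractive index n.
    Branch selection (sign of z, integer branch of arccos) must be resolved
    by passivity/continuity arguments and is left explicit here."""
    z = np.sqrt((((1 + s11) ** 2 - s21 ** 2) / ((1 - s11) ** 2 - s21 ** 2)) + 0j)
    # cos(n k0 d) follows from the reflection/transmission pair.
    cos_nkd = (1 - s11 ** 2 + s21 ** 2) / (2 * s21)
    n = (np.arccos(cos_nkd + 0j) + 2 * np.pi * branch) / (k0 * d)
    return z, n
```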

  12. Performance evaluation software moving object detection and tracking in videos

    CERN Document Server

    Karasulu, Bahadir

    2013-01-01

    Performance Evaluation Software: Moving Object Detection and Tracking in Videos introduces a software approach for the real-time evaluation and performance comparison of the methods specializing in moving object detection and/or tracking (D&T) in video processing. Digital video content analysis is an important item for multimedia content-based indexing (MCBI), content-based video retrieval (CBVR) and visual surveillance systems. There are some frequently-used generic algorithms for video object D&T in the literature, such as Background Subtraction (BS), Continuously Adaptive Mean-shift (CMS),
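
    As a minimal sketch of the Background Subtraction (BS) family the book covers, the following running-average detector flags pixels that deviate from a slowly updated background model; the threshold and learning rate are illustrative defaults, not values from the book.

```python
import numpy as np

def detect_moving(frames, alpha=0.05, thresh=25):
    """Background subtraction with a running-average background model.

    frames : iterable of 2-D uint8 grayscale frames
    Yields a boolean foreground mask per frame.
    """
    background = None
    for frame in frames:
        f = frame.astype(np.float32)
        if background is None:
            background = f.copy()
        mask = np.abs(f - background) > thresh
        # Update the model only where no motion was detected.
        background = np.where(mask, background, (1 - alpha) * background + alpha * f)
        yield mask

# Toy run: a moving bright square over a static background.
frames = np.zeros((10, 32, 32), dtype=np.uint8)
for t in range(10):
    frames[t, 5:10, 3 + 2 * t: 8 + 2 * t] = 255
masks = list(detect_moving(frames))
print(masks[-1].sum())
```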

  13. SOFA 2 Component Framework and Its Ecosystem

    Czech Academy of Sciences Publication Activity Database

    Malohlava, M.; Hnětynka, P.; Bureš, Tomáš

    2013-01-01

    Roč. 295, 9 May (2013), s. 101-106 ISSN 1571-0661. [FESCA 2012. International Workshop on Formal Engineering approaches to Software Components and Architectures /9./. Tallinn, 31.03.2012] R&D Projects: GA ČR GD201/09/H057 Grant - others:GA AV ČR(CZ) GAP202/11/0312; UK(CZ) SVV-2012-265312 Keywords : CBSE * component system * component model * component * sofa * ecosystem * development tool Subject RIV: JC - Computer Hardware ; Software

  14. Software reliability for safety-critical applications

    International Nuclear Information System (INIS)

    Everett, B.; Musa, J.

    1994-01-01

In this talk, the authors address the question "Can Software Reliability Engineering measurement and modeling techniques be applied to safety-critical applications?" Quantitative techniques have long been applied in engineering hardware components of safety-critical applications. The authors have seen a growing acceptance and use of quantitative techniques in engineering software systems but a continuing reluctance in using such techniques in safety-critical applications. The general case posed against using quantitative techniques for software components runs along the following lines: safety-critical applications should be engineered such that catastrophic failures occur less frequently than one in a billion hours of operation; current software measurement/modeling techniques rely on using failure history data collected during testing; one would have to accumulate over a billion operational hours to verify failure rate objectives of about one per billion hours

  15. Open software architecture for east articulated maintenance arm

    International Nuclear Information System (INIS)

    Wu, Jing; Wu, Huapeng; Song, Yuntao; Li, Ming; Yang, Yang; Alcina, Daniel A.M.

    2016-01-01

Highlights: • The software requirements of a serial-articulated robot for EAST assembly and maintenance are presented. • An open software architecture for the robot is developed. • A component-based model distribution system with real-time communication for the robot is constructed. - Abstract: For the inside inspection and maintenance of the vacuum vessel in the EAST, an articulated maintenance arm has been developed. In this article, an open software architecture developed for the EAST articulated maintenance arm (EAMA) is described, which offers robust, proper performance and an easy-going user experience based on the standard open robotics platform OROCOS. The paper presents a component-based software architecture using a multi-layer structure: end layer, up layer, middle layer, and down layer. In the end layer, the components are defined off-line in the task-planner manner. The components in the up layer carry out trajectory planning. CORBA, as a communication framework, is adopted to exchange data between the distributed components. The contributors use Real-Time Workshop from MATLAB/Simulink to generate the components in the middle layer. The Real-Time Toolkit guarantees that control applications run in hard real-time mode. Ethernet and the CAN bus are used for data transfer in the down layer, where the components implement the hardware functions. The distributed architecture of the control system associates each processing node with a joint, which is mapped to a component with all the functioning features of the framework.

  16. Open software architecture for east articulated maintenance arm

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Jing, E-mail: wujing@ipp.ac.cn [Institute of Plasma Physics Chinese Academy of Sciences, 350 Shushanhu Rd Hefei Anhui (China); Lappeenranta University of Technology, Skinnarilankatu 34 Lappeenranta (Finland); Wu, Huapeng [Lappeenranta University of Technology, Skinnarilankatu 34 Lappeenranta (Finland); Song, Yuntao [Institute of Plasma Physics Chinese Academy of Sciences, 350 Shushanhu Rd Hefei Anhui (China); Li, Ming [Lappeenranta University of Technology, Skinnarilankatu 34 Lappeenranta (Finland); Yang, Yang [Institute of Plasma Physics Chinese Academy of Sciences, 350 Shushanhu Rd Hefei Anhui (China); Alcina, Daniel A.M. [Lappeenranta University of Technology, Skinnarilankatu 34 Lappeenranta (Finland)

    2016-11-01

    Highlights: • The software requirements of a serial-articulated robot for EAST assembly and maintenance are presented. • An open software architecture for the robot is developed. • A component-based model distribution system with real-time communication is constructed. - Abstract: For the inside inspection and maintenance of the vacuum vessel in the EAST, an articulated maintenance arm has been developed. This article describes an open software architecture developed for the EAST articulated maintenance arm (EAMA), which offers robust, predictable performance and an accessible development experience based on the standard open robotics platform OROCOS. The paper presents a component-based software architecture using a multi-layer structure: end layer, up layer, middle layer, and down layer. In the end layer the components are defined off-line in the task planner. The components in the up layer perform trajectory planning. CORBA is adopted as the communication framework to exchange data between the distributed components. The contributors use Real-Time Workshop from MATLAB/Simulink to generate the components in the middle layer. The Real-Time Toolkit guarantees that control applications run in hard real-time mode. Ethernet and the CAN bus are used for data transfer in the down layer, where the components implement the hardware functions. The distributed architecture of the control system associates each processing node with a joint, which is mapped to a component with all functioning features of the framework.

  17. Retrieving background surface reflectance of Himawari-8/AHI using BRDF modeling

    Science.gov (United States)

    Choi, Sungwon; Seo, Minji; Lee, Kyeong-sang; Han, Kyung-soo

    2017-04-01

    Remote sensing is more important today than ever, and retrieving surface reflectance is a key task within it; many groups retrieve surface reflectance with polar-orbiting and geostationary satellites. We studied the Bidirectional Reflectance Distribution Function (BRDF), which is used to retrieve surface reflectance. In the BRDF equation, surface reflectance is calculated from the BRDF components and angular data. The BRDF components describe three scattering terms: isotropic, geometric, and volumetric scattering. To produce the Background Surface Reflectance (BSR) of Himawari-8/AHI, we applied the BRDF model to five bands (bands 1-5) and generated a BSR product for each channel. For validation, we compared the BSR with the top-of-canopy (TOC) reflectance of AHI. As a result, biases range from -0.00223 to 0.008328 and Root Mean Square Error (RMSE) values range from 0.045 to 0.049. We conclude that the BSR can replace TOC reflectance in remote sensing applications to mitigate the weaknesses of TOC reflectance.
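
    A minimal sketch of the kernel-driven retrieval idea described above, assuming a linear BRDF model of the form R = f_iso + f_vol*K_vol + f_geo*K_geo (Ross-Li style); the kernel values, geometry, and reflectances below are placeholders, and NumPy is assumed:

        import numpy as np

        # Hypothetical multi-angle observations for one pixel: precomputed
        # volumetric and geometric kernel values plus observed reflectance.
        k_vol = np.array([0.02, -0.05, 0.11, 0.07, -0.01])
        k_geo = np.array([-1.21, -0.98, -1.45, -1.10, -1.33])
        toc_reflectance = np.array([0.121, 0.108, 0.135, 0.126, 0.114])

        # Fit the three kernel weights (isotropic, volumetric, geometric)
        # by linear least squares.
        A = np.column_stack([np.ones_like(k_vol), k_vol, k_geo])
        (f_iso, f_vol, f_geo), *_ = np.linalg.lstsq(A, toc_reflectance, rcond=None)

        # Background surface reflectance for a new sun/view geometry whose
        # kernel values (placeholders) are k_vol_new and k_geo_new.
        k_vol_new, k_geo_new = 0.04, -1.25
        bsr = f_iso + f_vol * k_vol_new + f_geo * k_geo_new
        print(f"BSR estimate: {bsr:.4f}")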

  18. Retrieval of fluidizable radioactive wastes from storage facilities

    International Nuclear Information System (INIS)

    2006-08-01

    This report provides guidance for the strategic planning and implementation of the resuspension and retrieval of stored fluid or fluidizable radioactive wastes. The potential risks associated with the preparation and realization of these processes are included in the report, and lessons learned from previous applications are highlighted. Technological procedures and equipment used in various countries for the resuspension and remobilization of stored fluidizable radioactive wastes are described in the attached annexes as potential options. Waste retrieval is a maturing technology of major importance now that Member States are moving forward in the responsible management of wastes by removal to safe interim storage or disposal. Retrieval of fluidizable wastes is a four-phase operation: (1) accessing the waste, (2) mobilizing the waste, (3) removing the waste, and (4) transferring the waste. This report divides successful retrieval of radioactive waste into two areas. The first area applies the concept of waste retrieval as the final component of a systematic process of old waste management. It also encompasses characterization as it applies to waste retrieval and downstream processes, including the acceptance of wastes for treatment, conditioning, storage or disposal. Retrieval should be in conformity with national policy, as well as complying with international safety standards and environmental agreements. The second area of the report focuses on the implementation of waste retrieval in a wide range of scenarios and using a wide range of retrieval approaches, equipment and technologies. Technical processes are further explained as part of the experience gained in advanced countries on the subject. A set of detailed retrieval technology descriptions by country is included as annexes to this report. Thirteen experts from seven Member States that previously implemented, or have planned for the near future, significant resuspension and remobilization operations were involved in the preparation of

  19. View subspaces for indexing and retrieval of 3D models

    Science.gov (United States)

    Dutagaci, Helin; Godil, Afzal; Sankur, Bülent; Yemez, Yücel

    2010-02-01

    View-based indexing schemes for 3D object retrieval are gaining popularity since they provide good retrieval results. These schemes are coherent with the theory that humans recognize objects based on their 2D appearances. The view-based techniques also allow users to search with various queries such as binary images, range images and even 2D sketches. Previous view-based techniques use classical 2D shape descriptors such as Fourier invariants, Zernike moments, Scale Invariant Feature Transform-based local features and 2D Digital Fourier Transform coefficients. These methods describe each object independently of the others. In this work, we explore data-driven subspace models, such as Principal Component Analysis, Independent Component Analysis and Nonnegative Matrix Factorization, to describe the shape information of the views. We treat the depth images obtained from various points of the view sphere as 2D intensity images and train a subspace to extract the inherent structure of the views within a database. We also show the benefit of categorizing shapes according to their eigenvalue spread. Both the shape categorization and data-driven feature set conjectures are tested on the PSB database and compared with competitor view-based 3D shape retrieval algorithms.
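
    The subspace idea can be sketched with ordinary PCA, the simplest of the three decompositions the record mentions; the depth images here are random placeholders, and the 32x32 resolution, 20 components, and NumPy dependency are assumptions:

        import numpy as np

        rng = np.random.default_rng(0)

        # Placeholder database: 100 depth-image views, each 32x32, flattened.
        views = rng.random((100, 32 * 32))

        # Train a PCA subspace on the views (mean-centered SVD).
        mean = views.mean(axis=0)
        _, _, vt = np.linalg.svd(views - mean, full_matrices=False)
        basis = vt[:20]                      # keep 20 principal components

        db_codes = (views - mean) @ basis.T  # project database views

        def retrieve(query_view, k=5):
            # Project a query depth image into the subspace, rank by distance.
            code = (query_view.ravel() - mean) @ basis.T
            dists = np.linalg.norm(db_codes - code, axis=1)
            return np.argsort(dists)[:k]

        print(retrieve(rng.random((32, 32))))

    Replacing the SVD step with an ICA or NMF decomposition changes the basis but leaves the project-and-rank retrieval loop unchanged.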

  20. The Five 'R's' for Developing Trusted Software Frameworks to increase confidence in, and maximise reuse of, Open Source Software.

    Science.gov (United States)

    Fraser, Ryan; Gross, Lutz; Wyborn, Lesley; Evans, Ben; Klump, Jens

    2015-04-01

    Recent investments in HPC, cloud and petascale data stores have dramatically increased the scale and resolution at which earth science challenges can now be tackled. These new infrastructures are highly parallelised, and to fully utilise them and access the large volumes of earth science data now available, a new approach to software stack engineering needs to be developed. The size, complexity and cost of the new infrastructures mean any software deployed has to be reliable, trusted and reusable. Increasingly, software is available via open source repositories, but these usually only enable code to be discovered and downloaded. It is hard for a scientist to judge the suitability and quality of individual codes: rarely is there information on how and where codes can be run, what the critical dependencies are, and in particular, on the version requirements and licensing of the underlying software stack. A trusted software framework is proposed to enable reliable software to be discovered, accessed and then deployed on multiple hardware environments. More specifically, this framework will enable those who generate the software, and those who fund its development, to gain credit for the effort, IP, time and dollars spent, and facilitate quantification of the impact of individual codes. For scientific users, the framework delivers reviewed and benchmarked scientific software with mechanisms to reproduce results. The trusted framework will have five separate, but connected components: Register, Review, Reference, Run, and Repeat. 1) The Register component will facilitate discovery of relevant software from multiple open source code repositories. Registration of a code should include information about licensing and the hardware environments it can be run on, define appropriate validation (testing) procedures, and list the critical dependencies. 2) The Review component targets the verification of the software, typically against a set of

  1. Satellite retrieval of actual evapotranspiration in the Tibetan Plateau: Components partitioning, multidecadal trends and dominated factors identifying

    Science.gov (United States)

    Wang, Weiguang; Li, Jinxing; Yu, Zhongbo; Ding, Yimin; Xing, Wanqiu; Lu, Wenjun

    2018-04-01

    As the only connecting term between the water balance and the energy budget in the earth-atmosphere system, evapotranspiration (ET) is considered an excellent indicator of water- and energy-cycle activity. Against the background of global change, regional ET estimates, the partitioning of ET components, and the recognition of their spatial and temporal patterns are of great importance in understanding hydrological processes and improving water management practices. This is particularly true for the Tibetan Plateau (TP), one of the regions most sensitive and vulnerable to environmental change on Earth. In this study, with flux site observation data and monthly ET data from the monthly water balance method, incorporating terrestrial water storage changes from the Gravity Recovery and Climate Experiment (GRACE) satellite product as multiple validations, the long-term daily ET in the TP was retrieved by a modified Penman-Monteith-Leuning (PML) model, accounting for evapotranspiration over snow-covered areas, during 1982-2012. The spatial and temporal changes of the three partitioned components of ET, i.e., soil evaporation (Es), transpiration through plant stomata (Ec) and canopy interception (Ei), were investigated in the TP. Meanwhile, how the ET components contribute to ET changes and respond to changes in environmental factors in the TP was revealed and discussed. The results indicate that Es dominates ET in most areas of the TP, with a mean annual ratio of 65.7%, except in southeastern regions where vegetation coverage is high. Although the regional average ET and the three main components all present obvious increasing trends during the past decades, high spatial heterogeneity in their trends is identified in the TP. Moreover, a mixed changing pattern can be found for Es in the southeastern area, and for Ec and Ei in the northwestern and southeastern areas. Spatially, the ET variation is mainly attributed to changes in Es, followed by Ec and Ei

  2. Hanford tanks initiative - test implementation plan for demonstration of in-tank retrieval technology

    International Nuclear Information System (INIS)

    Schaus, P.S.

    1997-01-01

    This document presents a Systems Engineering approach for performing the series of tests associated with demonstrating in-tank retrieval technologies. The testing ranges from cold testing of individual components at the vendor's facility to the final fully integrated demonstration of the retrieval system's ability to remove hard heel high-level waste from the bottom of a Hanford single-shell tank

  3. HASILT: An intelligent software platform for HAZOP, LOPA, SRS and SIL verification

    International Nuclear Information System (INIS)

    Cui, Lin; Shu, Yidan; Wang, Zhaohui; Zhao, Jinsong; Qiu, Tong; Sun, Wenyong; Wei, Zhenqiang

    2012-01-01

    Incomplete process hazard analysis (PHA) and poor knowledge management have been two major reasons that have caused numerous lamentable disasters in the chemical process industry (CPI). To improve PHA quality, a new integration framework that combines HAZOP, layer of protection analysis (LOPA), safety requirements specification (SRS) and safety integrity level (SIL) validation is proposed in this paper. To facilitate the integrated work flow and improve the relevant knowledge management, an intelligent software platform named HASILT has been developed by our research team. Its key components and functions are described in this paper. Furthermore, since the platform keeps all history data in a central case base and case-based reasoning is used to automatically retrieve similar old cases for helping resolve new problems, a recall opportunity is created to reduce information loss which has been cited many times as a common root cause in investigations of accidents.
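
    The case-based reasoning step can be sketched as a nearest-neighbour search over encoded cases; the bag-of-terms encoding, vocabulary, and similarity cut-off below are illustrative assumptions rather than HASILT's actual case representation:

        import numpy as np

        # Hypothetical case base: each HAZOP/LOPA case encoded as a vector
        # over deviations, causes and safeguards.
        vocab = ["high pressure", "reverse flow", "relief valve", "overfill", "alarm"]
        case_base = np.array([
            [1, 0, 1, 0, 1],   # case 0
            [0, 1, 0, 0, 1],   # case 1
            [1, 0, 0, 1, 0],   # case 2
        ], dtype=float)

        def retrieve_similar(new_case, top_k=2):
            # Rank stored cases by cosine similarity to the new problem.
            norms = np.linalg.norm(case_base, axis=1) * np.linalg.norm(new_case)
            sims = case_base @ new_case / np.where(norms == 0, 1, norms)
            return np.argsort(sims)[::-1][:top_k]

        new_problem = np.array([1, 0, 1, 1, 0], dtype=float)  # placeholder query
        print(retrieve_similar(new_problem))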

  4. Software Quality Assurance for Nuclear Safety Systems

    International Nuclear Information System (INIS)

    Sparkman, D R; Lagdon, R

    2004-01-01

    The US Department of Energy has undertaken an initiative to improve the quality of software used to design and operate their nuclear facilities across the United States. One aspect of this initiative is to revise or create new directives and guides associated with quality practices for the safety software in its nuclear facilities. Safety software includes the safety structures, systems, and components software and firmware, support software, and design and analysis software used to ensure the safety of the facility. DOE nuclear facilities are unique when compared to commercial nuclear or other industrial activities in terms of the types and quantities of hazards that must be controlled to protect workers, the public and the environment. Because of these differences, DOE must develop an approach to software quality assurance that ensures appropriate risk mitigation by developing a framework of requirements that accomplishes the following goals: • ensures the software processes developed to address nuclear safety in the design, operation, construction and maintenance of its facilities are safe; • considers the larger system that uses the software and its impacts; • ensures that software failures do not create unsafe conditions. Software designers for nuclear systems and processes must reduce risks in software applications by incorporating processes that recognize, detect, and mitigate software failure in safety related systems. They must also ensure that fail-safe modes and component testing are incorporated into software design. For nuclear facilities, the consideration of risk is not necessarily sufficient to ensure safety. Systematic evaluation, independent verification and system safety analysis must be considered for software design, implementation, and operation. The software industry primarily uses risk analysis to determine the appropriate level of rigor applied to software practices. This risk-based approach distinguishes safety

  5. Unified approach for retrieval of effective parameters of metamaterials

    DEFF Research Database (Denmark)

    Andryieuski, Andrei; Ha, Sangwoo; Sukhorukov, Andrey A.

    2011-01-01

    that our method is able to retrieve both material and wave EPs for a wide range of materials, which can be lossy or lossless, dispersive, and possess negative permittivity, permeability and refractive index values. It is simple and unambiguous, free of the "branch" problem, which is an issue for the reflection/transmission based method, and has no limitations on the metamaterial slab thickness. The method does not require averaging different field components at various surfaces or contours. The retrieval of both wave and material EPs is performed within a single computational cycle, after exporting fields on the unit cells

  6. Brain mechanisms of successful recognition through retrieval of semantic context.

    Science.gov (United States)

    Flegal, Kristin E; Marín-Gutiérrez, Alejandro; Ragland, J Daniel; Ranganath, Charan

    2014-08-01

    Episodic memory is associated with the encoding and retrieval of context information and with a subjective sense of reexperiencing past events. The neural correlates of episodic retrieval have been extensively studied using fMRI, leading to the identification of a "general recollection network" including medial temporal, parietal, and prefrontal regions. However, in these studies, it is difficult to disentangle the effects of context retrieval from recollection. In this study, we used fMRI to determine the extent to which the recruitment of regions in the recollection network is contingent on context reinstatement. Participants were scanned during a cued recognition test for target words from encoded sentences. Studied target words were preceded by either a cue word studied in the same sentence (thus congruent with encoding context) or a cue word studied in a different sentence (thus incongruent with encoding context). Converging fMRI results from independently defined ROIs and whole-brain analysis showed regional specificity in the recollection network. Activity in hippocampus and parahippocampal cortex was specifically increased during successful retrieval following congruent context cues, whereas parietal and prefrontal components of the general recollection network were associated with confident retrieval irrespective of contextual congruency. Our findings implicate medial temporal regions in the retrieval of semantic context, contributing to, but dissociable from, recollective experience.

  7. Educational Technologies Based on Software Components

    Directory of Open Access Journals (Sweden)

    Marian DARDALA

    2006-01-01

    Full Text Available Informatics technologies allow e-learning systems to be easily developed and adapted. To be usable by any user, such a system must be developed to permit lessons to be constructed dynamically. Component technology is a solution to this problem and offers the possibility to define basic objects that are connected at run time to build personalized e-lessons.

  8. Phase retrieval from a single fringe pattern by using empirical wavelet transform

    International Nuclear Information System (INIS)

    Guo, Xiaopeng; Zhao, Hong; Wang, Xin

    2015-01-01

    Phase retrieval from a single fringe pattern is one of the key tasks in optical metrology. In this paper, we present a new method for phase retrieval from a single fringe pattern based on the empirical wavelet transform. In the proposed method, a fringe pattern is effectively divided into three components: nonuniform background, fringes and random noise, each captured in a different sub-band. The phase distribution information can thus be robustly extracted from the fringes, which represent the fundamental frequency component. In a simulation and a practical projection-fringe test, the performance of the presented method is successfully verified by comparison with the conventional wavelet transform method in terms of both image quality and phase estimation errors. (paper)
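
    A rough sketch of the decompose-then-extract-phase idea on a 1-D fringe profile; a fixed Fourier band-pass stands in for the adaptive sub-bands of the empirical wavelet transform, and all signal parameters are synthetic assumptions (NumPy and SciPy assumed):

        import numpy as np
        from scipy.signal import hilbert

        # Synthetic 1-D fringe profile: background + fringes + noise.
        x = np.linspace(0, 1, 512)
        true_phase = 40 * np.pi * x + 3 * np.sin(2 * np.pi * x)
        signal = (0.3 + 0.2 * x + 0.5 * np.cos(true_phase)
                  + 0.02 * np.random.randn(512))

        # Crude sub-band separation with an FFT band-pass around the fringe
        # carrier (a stand-in for the adaptive empirical wavelet sub-bands).
        spectrum = np.fft.rfft(signal)
        freqs = np.fft.rfftfreq(512, d=x[1] - x[0])
        band = (freqs > 10) & (freqs < 40)   # keep the fundamental fringe band
        fringes = np.fft.irfft(spectrum * band, n=512)

        # Phase of the fundamental component via the analytic signal.
        phase = np.unwrap(np.angle(hilbert(fringes)))
        print(phase[:5])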

  9. An Intelligent Information Retrieval Approach Based on Two Degrees of Uncertainty Fuzzy Ontology

    Directory of Open Access Journals (Sweden)

    Maryam Hourali

    2011-01-01

    Full Text Available Despite the voluminous studies in the field of intelligent retrieval systems, effective information retrieval has remained an important unsolved problem. Incorporating conceptual knowledge, such as ontologies, into the information retrieval process has been considered a way to enhance the quality of results. Furthermore, the conceptual formalism supported by a typical ontology may not be sufficient to represent uncertainty, due to the lack of clear-cut boundaries between the concepts of a domain. To tackle this type of problem, one possible solution is to insert fuzzy logic into the ontology construction process. In this article, a novel approach for fuzzy ontology generation with two uncertainty degrees is proposed. Using linguistic variables, the uncertainty level of the domain's concepts (in the Software Maintenance Engineering (SME) domain) has been modeled, and the ontology relations have been modeled by fuzzy theory accordingly. We then combined these uncertain models and proposed a new ontology with two degrees of uncertainty, both in concept expression and in relation expression. The generated fuzzy ontology was implemented for the expansion of initial user queries in the SME domain. Experimental results showed that the proposed model has better overall retrieval performance compared to keyword-based or crisp ontology-based retrieval systems.
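
    A toy sketch of query expansion over a fuzzy ontology, where each relation carries a membership degree; the concepts, degrees, and threshold are invented placeholders for the SME domain, not values from the paper:

        # Toy fuzzy ontology: each relation carries a membership degree in [0, 1].
        fuzzy_relations = {
            "refactoring": [("code smell", 0.9), ("regression test", 0.6)],
            "bug fix": [("defect report", 0.8), ("patch", 0.7)],
        }

        def expand_query(terms, threshold=0.65):
            # Expand a query with related concepts whose membership degree
            # passes the cut-off, keeping the degree as a term weight.
            expanded = dict.fromkeys(terms, 1.0)
            for term in terms:
                for concept, degree in fuzzy_relations.get(term, []):
                    if degree >= threshold and concept not in expanded:
                        expanded[concept] = degree
            return expanded

        print(expand_query(["refactoring", "bug fix"]))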

  10. Performance Verification of GOSAT-2 FTS-2 Simulator and Sensitivity Analysis for Greenhouse Gases Retrieval

    Science.gov (United States)

    Kamei, A.; Yoshida, Y.; Dupuy, E.; Hiraki, K.; Matsunaga, T.

    2015-12-01

    The GOSAT-2, which is scheduled for launch in early 2018, is the successor mission to the Greenhouse gases Observing Satellite (GOSAT). The FTS-2 onboard the GOSAT-2 is a Fourier transform spectrometer with three bands in the near to short-wavelength infrared (SWIR) region and two bands in the thermal infrared (TIR) region, which observes infrared light reflected and emitted from the Earth's surface and atmosphere with high-resolution spectra. Column amounts and vertical profiles of major greenhouse gases such as carbon dioxide (CO2) and methane (CH4) are retrieved from the acquired radiance spectra. In addition, the FTS-2 has several improvements over the FTS onboard the GOSAT: 1) added spectral coverage in the SWIR region for carbon monoxide (CO) retrieval, 2) increased signal-to-noise ratio (SNR) for all bands, 3) extended range of along-track pointing angles for sunglint observations, 4) intelligent pointing to avoid cloud contamination. Since 2012, we have been developing a software tool, called the GOSAT-2 FTS-2 simulator, to simulate the spectral radiance data that will be acquired by the GOSAT-2 FTS-2. Its objective is to analyze/optimize data with respect to the sensor specification, the parameters for Level 1 processing, and the improvement of Level 2 retrieval algorithms. It consists of six components: 1) overall control, 2) sensor carrying platform, 3) spectral radiance calculation, 4) Fourier transform module, 5) Level 1B (L1B) processing, and 6) L1B data output. More realistic and faster simulations have been made possible by the improvement of details about sensor characteristics, the sophistication of data processing and algorithms, the addition of various observation modes, the use of surface and atmospheric ancillary data, and the speed-up and parallelization of the radiative transfer code. The simulator is confirmed to be working properly through its reproduction of GOSAT FTS L1B data, given the appropriate ancillary data. We will summarize the

  11. Retrieval of atmospheric SO2 and O3 columns in the UV region using mobile DOAS

    International Nuclear Information System (INIS)

    Galicia, R.; La Rosa, J. de la; Stolik, S.

    2012-01-01

    We present the use of a passive DOAS system to retrieve SO2 and O3 columns emitted by industrial chimneys. It runs software built in LabVIEW on a PC linked to a mini-spectrometer and a GPS receiver. The system uses sunlight as its light source, together with a telescope, a fiber optic, the mini-spectrometer and the GPS. The spectrometer and the GPS are linked to a PC where the system is controlled and where all data are processed to retrieve the SO2 and O3 slant columns. (Author)
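
    The core DOAS inversion can be sketched as a least-squares fit of the measured optical depth against reference absorption cross sections plus a low-order polynomial for broadband extinction; the spectra and cross sections below are synthetic stand-ins in arbitrary units (NumPy assumed):

        import numpy as np

        # Synthetic spectra on a UV wavelength grid, with fake Gaussian
        # differential cross sections for SO2 and O3 (arbitrary units).
        wl = np.linspace(300, 325, 200)
        sigma_so2 = np.exp(-((wl - 310) / 4.0) ** 2)
        sigma_o3 = np.exp(-((wl - 318) / 6.0) ** 2)
        true_scd = np.array([0.025, 0.008])    # slant columns, arbitrary units
        i0 = np.full_like(wl, 1000.0)          # reference (sun) intensity
        i = i0 * np.exp(-(sigma_so2 * true_scd[0] + sigma_o3 * true_scd[1]))

        # Beer-Lambert: tau = ln(I0/I) = sum_i SCD_i * sigma_i + polynomial.
        tau = np.log(i0 / i)
        design = np.column_stack([sigma_so2, sigma_o3, np.ones_like(wl), wl])
        coeffs, *_ = np.linalg.lstsq(design, tau, rcond=None)
        print(f"SO2 slant column ~ {coeffs[0]:.4f}, O3 ~ {coeffs[1]:.4f}")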

  12. Retrieval-Based Learning: Positive Effects of Retrieval Practice in Elementary School Children

    Directory of Open Access Journals (Sweden)

    Jeffrey D. Karpicke

    2016-03-01

    Full Text Available A wealth of research has demonstrated that practicing retrieval is a powerful way to enhance learning. However, nearly all prior research has examined retrieval practice with college students. Little is known about retrieval practice in children, and even less is known about possible individual differences in retrieval practice. In three experiments, 88 children (mean age 10 years) studied a list of words and either restudied the items or practiced retrieving them. They then took a final free recall test (Experiments 1 and 2) or recognition test (Experiment 3). In all experiments, children showed robust retrieval practice effects. Although a range of individual differences in reading comprehension and processing speed were observed among these children, the benefits of retrieval practice were independent of these factors. The results contribute to the growing body of research supporting the mnemonic benefits of retrieval practice and provide preliminary evidence that practicing retrieval may be an effective learning strategy for children with varying levels of reading comprehension and processing speed.

  13. Software quality assurance plans for safety-critical software

    International Nuclear Information System (INIS)

    Liddle, P.

    2006-01-01

    Application software is defined as safety-critical if a fault in the software could prevent the system components from performing their nuclear-safety functions. Therefore, for nuclear-safety systems, the AREVA TELEPERM® XS (TXS) system is classified 1E, as defined in the Institute of Electrical and Electronics Engineers (IEEE) Std 603-1998. The application software is classified as Software Integrity Level (SIL)-4, as defined in IEEE Std 7-4.3.2-2003. The AREVA NP Inc. Software Program Manual (SPM) describes the measures taken to ensure that the TELEPERM XS application software attains a level of quality commensurate with its importance to safety. The manual also describes how TELEPERM XS correctly performs the required safety functions and conforms to established technical and documentation requirements, conventions, rules, and standards. The program manual covers the requirements definition, detailed design, integration, and test phases for the TELEPERM XS application software, and supporting software created by AREVA NP Inc. The SPM is required for all safety-related TELEPERM XS system applications. The program comprises several basic plans and practices: 1. A Software Quality-Assurance Plan (SQAP) that describes the processes necessary to ensure that the software attains a level of quality commensurate with its importance to safety. 2. A Software Safety Plan (SSP) that identifies the process to reasonably ensure that safety-critical software performs as intended during all abnormal conditions and events, and does not introduce any new hazards that could jeopardize the health and safety of the public. 3. A Software Verification and Validation (V and V) Plan that describes the method of ensuring the software is in accordance with the requirements. 4. A Software Configuration Management Plan (SCMP) that describes the method of maintaining the software in an identifiable state at all times. 5. A Software Operations and Maintenance Plan (SO and MP) that

  14. Proposal for a New ’Rights in Software’ Clause for Software Acquisitions by the Department of Defense.

    Science.gov (United States)

    1986-09-01

    point here is that the capital cost of design and development (including the cost of software tools and/or CAD/CAM programs which aided in the development...and capitalization, software is in many ways more like a hardware component than it is like the technical documentation which supports the hardware...invoked, the owner of intellectual property rights in software may attach appropriate copyright notices to software delivered under this contract. 2.2.2

  15. Customer configuration updating in a software supply network

    NARCIS (Netherlands)

    Jansen, S.R.L.

    2007-01-01

    Product software development is the activity of development, modification, reuse, re-engineering, maintenance, or any other activities that result in packaged configurations of software components or software-based services that are released for and traded in a specific market (Xu and Brinkkemper).

  16. ROSMOD: A Toolsuite for Modeling, Generating, Deploying, and Managing Distributed Real-time Component-based Software using ROS

    Directory of Open Access Journals (Sweden)

    Pranav Srinivas Kumar

    2016-09-01

    Full Text Available This paper presents the Robot Operating System Model-driven development tool suite (ROSMOD), an integrated development environment for rapid prototyping of component-based software for the Robot Operating System (ROS) middleware. ROSMOD is well suited for the design, development and deployment of large-scale distributed applications on embedded devices. We present the various features of ROSMOD including the modeling language, the graphical user interface, code generators, and deployment infrastructure. We demonstrate the utility of this tool with a real-world case study: an Autonomous Ground Support Equipment (AGSE) robot that was designed and prototyped using ROSMOD for the NASA Student Launch competition, 2014-2015.

  17. A stochastic cloud model for cloud and ozone retrievals from UV measurements

    International Nuclear Information System (INIS)

    Efremenko, Dmitry S.; Schüssler, Olena; Doicu, Adrian; Loyola, Diego

    2016-01-01

    The new generation of satellite instruments provides measurements in and around the Oxygen A-band on a global basis and with a relatively high spatial resolution. These data are commonly used for the determination of cloud properties. A stochastic model and radiative transfer model, previously developed by the authors, is used as the forward model component in retrievals of cloud parameters and ozone total and partial columns. The cloud retrieval algorithm combines local and global optimization routines, and yields a retrieval accuracy of about 1% and a fast computational time. Retrieved parameters are the cloud optical thickness and the cloud-top height. It was found that the use of the independent pixel approximation instead of the stochastic cloud model leads to large errors in the retrieved cloud parameters, as well as, in the retrieved ozone height resolved partial columns. The latter can be reduced by using the stochastic cloud model to compute the optimal value of the regularization parameter in the framework of Tikhonov regularization. - Highlights: • A stochastic radiative transfer model for retrieving clouds/ozone is designed. • Errors of independent pixel approximation (IPA) for O3 total column are small. • The error of IPA for ozone profile retrieval may become large. • The use of stochastic model reduces the error of ozone profile retrieval.
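
    The regularization step mentioned in the highlights can be illustrated with a toy linear retrieval; the smoothing kernel, profile, and noise level are invented, and the parameter is scanned here rather than computed from the stochastic cloud model:

        import numpy as np

        rng = np.random.default_rng(1)

        # Toy ill-posed linear retrieval y = K x + noise, with a smoothing kernel.
        n = 40
        K = np.array([[np.exp(-0.5 * ((i - j) / 3.0) ** 2) for j in range(n)]
                      for i in range(n)])
        x_true = np.sin(np.linspace(0, np.pi, n))  # stand-in partial-column profile
        y = K @ x_true + 0.01 * rng.standard_normal(n)

        def tikhonov(K, y, alpha):
            # Closed-form minimizer of ||Kx - y||^2 + alpha * ||x||^2.
            return np.linalg.solve(K.T @ K + alpha * np.eye(K.shape[1]), K.T @ y)

        # Scan the regularization parameter; the residual should approach the
        # noise level when alpha is well chosen (discrepancy principle).
        for alpha in [1e-4, 1e-3, 1e-2, 1e-1]:
            x_hat = tikhonov(K, y, alpha)
            print(alpha, np.linalg.norm(K @ x_hat - y))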

  18. A system for automatic evaluation of simulation software

    Science.gov (United States)

    Ryan, J. P.; Hodges, B. C.

    1976-01-01

    Within the field of computer software, simulation and verification are complementary processes. Simulation methods can be used to verify software by performing variable range analysis. More general verification procedures, such as those described in this paper, can be implicitly viewed as attempts at modeling the end-product software. From the standpoint of software requirements methodology, each component of the verification system has some element of simulation to it. Conversely, general verification procedures can be used to analyze simulation software. A dynamic analyzer is described which can be used to obtain properly scaled variables for an analog simulation, which is first digitally simulated. In a similar way, it is thought that the other system components, and indeed the whole system itself, have the potential of being effectively used in a simulation environment.

  19. ABCD, an Open Source Software for Modern Libraries

    Directory of Open Access Journals (Sweden)

    Sangeeta Namdev Dhamdhere

    2011-12-01

    Full Text Available Nowadays, librarians are using various kinds of open source software for different purposes such as library automation, digitization, institutional repositories, and content management. ABCD, an acronym for Automatización de Bibliotecas y Centros de Documentación, is one such software package. It caters to almost all present needs of modern libraries of any size. It offers a solution for library automation with ISBD as well as local formats. It has excellent indexing and retrieval features based on UNESCO's ISIS technology, a web OPAC, and a library portal with an integrated meta-search and content management system to manage online as well as offline digital resources and physical documents and media.

  20. Bayesian aerosol retrieval algorithm for MODIS AOD retrieval over land

    Science.gov (United States)

    Lipponen, Antti; Mielonen, Tero; Pitkänen, Mikko R. A.; Levy, Robert C.; Sawyer, Virginia R.; Romakkaniemi, Sami; Kolehmainen, Ville; Arola, Antti

    2018-03-01

    We have developed a Bayesian aerosol retrieval (BAR) algorithm for the retrieval of aerosol optical depth (AOD) over land from the Moderate Resolution Imaging Spectroradiometer (MODIS). In the BAR algorithm, we simultaneously retrieve all dark land pixels in a granule, utilize spatial correlation models for the unknown aerosol parameters, use a statistical prior model for the surface reflectance, and take into account the uncertainties due to fixed aerosol models. The retrieved parameters are total AOD at 0.55 µm, fine-mode fraction (FMF), and surface reflectances at four different wavelengths (0.47, 0.55, 0.64, and 2.1 µm). The accuracy of the new algorithm is evaluated by comparing the AOD retrievals to Aerosol Robotic Network (AERONET) AOD. The results show that the BAR significantly improves the accuracy of AOD retrievals over the operational Dark Target (DT) algorithm. A reduction of about 29 % in the AOD root mean square error and decrease of about 80 % in the median bias of AOD were found globally when the BAR was used instead of the DT algorithm. Furthermore, the fraction of AOD retrievals inside the ±(0.05+15 %) expected error envelope increased from 55 to 76 %. In addition to retrieving the values of AOD, FMF, and surface reflectance, the BAR also gives pixel-level posterior uncertainty estimates for the retrieved parameters. The BAR algorithm always results in physical, non-negative AOD values, and the average computation time for a single granule was less than a minute on a modern personal computer.
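
    The Bayesian machinery can be sketched in its simplest Gaussian, linearized form, which also yields the pixel-level posterior uncertainties the record highlights; the Jacobian, covariances, and prior below are invented toy numbers, not BAR's actual models:

        import numpy as np

        # Linearized toy forward model: reflectance = K @ [AOD, FMF, surf] + eps.
        K = np.array([[0.45, 0.10, 0.80],
                      [0.30, 0.05, 0.85],
                      [0.20, 0.02, 0.90]])    # placeholder Jacobian
        y = np.array([0.31, 0.29, 0.28])      # observed TOA reflectances
        S_eps = np.diag([1e-4, 1e-4, 1e-4])   # observation error covariance
        x_a = np.array([0.15, 0.5, 0.25])     # prior mean (AOD, FMF, surface)
        S_a = np.diag([0.04, 0.09, 0.01])     # prior covariance

        # Gaussian MAP estimate with its posterior covariance.
        S_post = np.linalg.inv(K.T @ np.linalg.inv(S_eps) @ K + np.linalg.inv(S_a))
        x_map = x_a + S_post @ K.T @ np.linalg.inv(S_eps) @ (y - K @ x_a)
        print("retrieved state:", x_map)
        print("posterior std:", np.sqrt(np.diag(S_post)))

    In the BAR itself the state vector additionally couples neighbouring pixels through spatial correlation models, which this single-pixel toy omits.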

  1. BENCHMARKING MACHINE LEARNING TECHNIQUES FOR SOFTWARE DEFECT DETECTION

    OpenAIRE

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning approaches are good at solving problems for which limited information is available. In most cases, problems in the software domain can be characterized as learning processes that depend on various circumstances and change accordingly. A predictive model is constructed using machine learning approaches to classify modules as defective or non-defective. Machine learning techniques help developers to retrieve useful information after the classification and enable them to analyse data...
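
    A minimal sketch of the defective/non-defective classification setting, assuming scikit-learn and synthetic module metrics (LOC, complexity, and churn are illustrative feature choices, not the paper's dataset):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(2)

        # Placeholder per-module metrics with a synthetic defect label.
        X = rng.random((500, 3)) * [1000, 50, 20]   # LOC, complexity, churn
        y = (0.002 * X[:, 0] + 0.04 * X[:, 1]
             + rng.standard_normal(500) > 1.5).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                  random_state=0)
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X_tr, y_tr)
        print(f"holdout accuracy: {clf.score(X_te, y_te):.2f}")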

  2. A retrieval algorithm of hydrometeor profiles for a submillimeter-wave radiometer

    Science.gov (United States)

    Liu, Yuli; Buehler, Stefan; Liu, Heguang

    2017-04-01

    Vertical profiles of particle microphysics play a vital role in the estimation of climate feedbacks. This paper proposes a new algorithm to retrieve profiles of hydrometeor parameters (i.e., ice, snow, rain, liquid cloud, graupel) based on passive submillimeter-wave measurements. These parameters include water content and particle size. The first part of the algorithm builds the database and retrieves the integrated quantities. The database is built with the Atmospheric Radiative Transfer Simulator (ARTS), which uses atmospheric data to simulate the corresponding brightness temperatures. A neural network, trained on the precalculated database, is developed to retrieve the water path for each type of particle. The second part of the algorithm analyses the statistical relationship between the water path and the vertical parameter profiles. Based on the strong dependence between vertical layers in the profiles, the Principal Component Analysis (PCA) technique is applied. The third part of the algorithm uses the forward model explicitly to retrieve the hydrometeor profiles. A cost function is calculated in each iteration, and the Differential Evolution (DE) algorithm is used to adjust the parameter values during the evolutionary process. The performance of this algorithm is planned to be verified on both a simulation database and measurement data, by retrieving profiles and comparing them with the initial ones. Results show that this algorithm retrieves the hydrometeor profiles efficiently. The combination of ARTS and an optimization algorithm yields much better results than the commonly used database approach. Meanwhile, the fact that ARTS can be used explicitly in the retrieval process shows great potential for solving other retrieval problems.
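
    The third stage, an explicit forward-model inversion driven by differential evolution, can be sketched with SciPy; the two-parameter toy forward model below merely stands in for ARTS, and all numbers are invented:

        import numpy as np
        from scipy.optimize import differential_evolution

        # Toy forward model standing in for ARTS: brightness temperatures in
        # three channels as a function of (water content, particle size).
        def forward(params):
            wc, size = params
            return np.array([260 - 30 * wc, 255 - 20 * wc * size, 250 - 5 * size])

        observed = forward([0.4, 1.2]) + 0.1 * np.random.randn(3)  # synthetic

        def cost(params):
            # Sum-of-squares misfit between simulated and observed radiances.
            return np.sum((forward(params) - observed) ** 2)

        result = differential_evolution(cost,
                                        bounds=[(0.0, 1.0), (0.5, 3.0)],
                                        seed=0)
        print("retrieved (water content, size):", result.x)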

  3. The Systems Biology Research Tool: evolvable open-source software

    Directory of Open Access Journals (Sweden)

    Wright Jeremiah

    2008-06-01

    Full Text Available Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement the mathematical models and algorithms required for simulation and theoretical predictions at the system level. Results We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.

  4. Organizing Scientific Information with Mendeley© software

    OpenAIRE

    Vlčková Kateřina; Lojdová Kateřina; Mareš Jan

    2013-01-01

    The main goal of this workshop was to provide academics and doctoral students with the knowledge and skills necessary to efficiently organize and retrieve information using the Mendeley© software. During the research process, the next step after finding information is to organize it. In this digital era, the skills we need to efficiently find information are different from those of the print era (Tuominen, 2007). The same applies to the subsequent process of organizing that information. Furthermore,...

  5. A development methodology for scientific software

    International Nuclear Information System (INIS)

    Cort, G.; Barrus, D.M.; Goldstone, J.A.; Miller, L.; Nelson, R.O.; Poore, R.V.

    1985-01-01

    We present the details of a software development methodology that addresses all phases of the software life cycle, yet is well suited for application by small projects with limited resources. The methodology has been developed at the Los Alamos Weapons Neutron Research (WNR) Facility and was utilized during the recent development of the WNR Data Acquisition Command Language. The methodology emphasizes the development and maintenance of comprehensive documentation for all software components. The impact of the methodology upon software quality and programmer productivity is assessed

  6. Development methodology for the software life cycle process of the safety software

    Energy Technology Data Exchange (ETDEWEB)

    Kim, D. H.; Lee, S. S. [BNF Technology, Taejon (Korea, Republic of); Cha, K. H.; Lee, C. S.; Kwon, K. C.; Han, H. B. [KAERI, Taejon (Korea, Republic of)

    2002-05-01

    A methodology for developing software life cycle processes (SLCP) is proposed to successfully develop the digital safety-critical Engineered Safety Features - Component Control System (ESF-CCS). The software life cycle model selected is a hybrid model mixing waterfall, prototyping, and spiral models, and it is composed of two stages: the development of an ESF-CCS prototype and the development of the ESF-CCS itself. To produce the software life cycle (SLC) for the development of the digital reactor safety system, the Activities referenced in IEEE Std. 1074-1997 are mapped onto the hybrid model. The SLCP is established after the available Organizational Process Assets (OPAs) are applied to the SLC Activities and the known constraints are reconciled. The established SLCP describes well the software life cycle activities that are provided to the Regulatory Authority.

  7. Development methodology for the software life cycle process of the safety software

    International Nuclear Information System (INIS)

    Kim, D. H.; Lee, S. S.; Cha, K. H.; Lee, C. S.; Kwon, K. C.; Han, H. B.

    2002-01-01

    A methodology for developing software life cycle processes (SLCP) is proposed to successfully develop the digital safety-critical Engineered Safety Features - Component Control System (ESF-CCS). The software life cycle model selected is a hybrid model mixing waterfall, prototyping, and spiral models, and it is composed of two stages: the development of an ESF-CCS prototype and the development of the ESF-CCS itself. To produce the software life cycle (SLC) for the development of the digital reactor safety system, the Activities referenced in IEEE Std. 1074-1997 are mapped onto the hybrid model. The SLCP is established after the available Organizational Process Assets (OPAs) are applied to the SLC Activities and the known constraints are reconciled. The established SLCP describes well the software life cycle activities that are provided to the Regulatory Authority.

  8. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety critical software, the software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing or checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase[1]. A new software verification methodology was developed and applied to the Shutdown System No. 1 and 2 (SDS1,2) for the Wolsung 2,3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy the new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2,3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designers. Outputs from the Wolsung 2,3 and 4 project have demonstrated that the use of this methodology results in a high quality, cost-effective product. 15 refs., 6 figs. (author)

  9. A Four-Component Model of Age-Related Memory Change

    Science.gov (United States)

    Healey, M. Karl; Kahana, Michael J.

    2015-01-01

    We develop a novel, computationally explicit, theory of age-related memory change within the framework of the context maintenance and retrieval (CMR2) model of memory search. We introduce a set of benchmark findings from the free recall and recognition tasks that includes aspects of memory performance that show both age-related stability and decline. We test aging theories by lesioning the corresponding mechanisms in a model fit to younger adult free recall data. When effects are considered in isolation, many theories provide an adequate account, but when all effects are considered simultaneously, the existing theories fail. We develop a novel theory by fitting the full model (i.e., allowing all parameters to vary) to individual participants and comparing the distributions of parameter values for older and younger adults. This theory implicates four components: 1) the ability to sustain attention across an encoding episode, 2) the ability to retrieve contextual representations for use as retrieval cues, 3) the ability to monitor retrievals and reject intrusions, and 4) the level of noise in retrieval competitions. We extend CMR2 to simulate a recognition memory task using the same mechanisms the free recall model uses to reject intrusions. Without fitting any additional parameters, the four-component theory that accounts for age differences in free recall predicts the magnitude of age differences in recognition memory accuracy. Confirming a prediction of the model, free recall intrusion rates correlate positively with recognition false alarm rates. Thus we provide a four-component theory of a complex pattern of age differences across two key laboratory tasks. PMID:26501233

  10. The Evolution of Software Publication in Astronomy

    Science.gov (United States)

    Cantiello, Matteo

    2018-01-01

    Software is a fundamental component of the scientific research process. As astronomical discoveries increasingly rely on complex numerical calculations and the analysis of big data sets, publishing and documenting software is a fundamental step in ensuring transparency and reproducibility of results. I will briefly discuss the recent history of software publication and highlight the challenges and opportunities ahead.

  11. Enhanced Deep Blue Aerosol Retrieval Algorithm: The Second Generation

    Science.gov (United States)

    Hsu, N. C.; Jeong, M.-J.; Bettenhausen, C.; Sayer, A. M.; Hansell, R.; Seftor, C. S.; Huang, J.; Tsay, S.-C.

    2013-01-01

    The aerosol products retrieved using the MODIS collection 5.1 Deep Blue algorithm have provided useful information about aerosol properties over bright-reflecting land surfaces, such as desert, semi-arid, and urban regions. However, many components of the C5.1 retrieval algorithm needed to be improved; for example, the use of a static surface database to estimate surface reflectances. This is particularly important over regions of mixed vegetated and non- vegetated surfaces, which may undergo strong seasonal changes in land cover. In order to address this issue, we develop a hybrid approach, which takes advantage of the combination of pre-calculated surface reflectance database and normalized difference vegetation index in determining the surface reflectance for aerosol retrievals. As a result, the spatial coverage of aerosol data generated by the enhanced Deep Blue algorithm has been extended from the arid and semi-arid regions to the entire land areas.
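
    The hybrid surface-reflectance idea can be caricatured as an NDVI-weighted blend between a static database value and a vegetation-dependent estimate; the thresholds, weighting scheme, and the vegetated-surface model below are invented placeholders, not the enhanced Deep Blue coefficients:

        import numpy as np

        def surface_reflectance(db_reflectance, red, nir,
                                ndvi_bare=0.2, ndvi_veg=0.6):
            # Weight moves from the static database value (bare surfaces)
            # toward a vegetation-based estimate as NDVI increases.
            ndvi = (nir - red) / (nir + red)
            w = np.clip((ndvi - ndvi_bare) / (ndvi_veg - ndvi_bare), 0.0, 1.0)
            veg_estimate = 0.5 * red + 0.02  # placeholder vegetated-surface model
            return (1 - w) * db_reflectance + w * veg_estimate

        print(surface_reflectance(db_reflectance=0.12, red=0.08, nir=0.30))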

  12. A new vector radiative transfer model as a part of SCIATRAN 3.0 software package.

    Science.gov (United States)

    Rozanov, Alexei; Rozanov, Vladimir; Burrows, John P.

    The SCIATRAN 3.0 package is a result of the further development of the SCIATRAN 2.x software family which, similar to previous versions, comprises a radiative transfer model and a retrieval block. A major improvement over previous software versions was achieved by adding a vector mode to the radiative transfer model. Thus, the well-established discrete ordinate solver can now be run in the vector mode to calculate the scattered solar radiation including polarization, i.e., to simulate all four components of the Stokes vector. As in the scalar version, the simulations can be performed for any viewing geometry typical for atmospheric observations in the UV-Vis-NIR spectral range (nadir, limb, off-axis, etc.) as well as for any observer position within or outside the Earth's atmosphere. Like its precursor, the new model is freely available for non-commercial use via the web page of the University of Bremen. In this presentation a short description of the software package, and especially of the new vector radiative transfer model, will be given, including remarks on its availability for the scientific community. Furthermore, comparisons to other vector models will be shown and some example problems will be considered where the polarization of the observed radiation must be accounted for to obtain high quality results.

  13. Model-driven software migration a methodology

    CERN Document Server

    Wagner, Christian

    2014-01-01

    Today, reliable software systems are the basis of any business or company. The continuous further development of those systems is the central component of software evolution. It requires a huge amount of time and manpower, as well as financial resources. The challenges are the size, seniority and heterogeneity of those software systems. Christian Wagner addresses software evolution: the inherent problems and uncertainties in the process. He presents a model-driven method which leads to a synchronization between source code and design. As a result the model layer will be the central part in further e

  14. Formalizing the ISDF Software Development Methodology

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2015-01-01

    Full Text Available The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and the software development lifecycle. The ISDF methodology was built especially for innovative software development projects and was developed empirically, by trial and error, in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of what makes a web application innovative. Innovation in software development is presented from the end-user's, project owner's and project manager's points of view. The main components of a software development methodology are identified. Thus a software development methodology should account for people, roles, skills, teams, tools, techniques, processes, activities, standards, quality measuring tools, and team values. Current software development models are presented and briefly analysed. The need for a dedicated innovation-oriented software development methodology is emphasized by highlighting the shortcomings of current software development methodologies when tackling innovation. The ISDF methodology is presented in the context of developing an actual application. The ALHPA application is used as a case study for emphasizing the characteristics of the ISDF methodology. The development life cycle of the ISDF methodology includes research, planning, prototyping, design, development, testing, setup and maintenance. Artefacts generated by the ISDF methodology are presented. Quality is managed in the ISDF methodology by assessing compliance, usability, reliability, repeatability, availability and security. In order to properly assess each quality component a dedicated indicator is built. A template for interpreting each indicator is provided. Conclusions are formulated and new related research topics are submitted for debate.

  15. Challenges of the Open Source Component Marketplace in the Industry

    Science.gov (United States)

    Ayala, Claudia; Hauge, Øyvind; Conradi, Reidar; Franch, Xavier; Li, Jingyue; Velle, Ketil Sandanger

    The reuse of Open Source Software components available on the Internet is playing a major role in the development of Component Based Software Systems. Nevertheless, the special nature of the OSS marketplace has taken the "classical" concept of software reuse based on centralized repositories to a completely different arena based on massive reuse over the Internet. In this paper we provide an overview of the actual state of the OSS marketplace, and report preliminary findings about how companies interact with this marketplace to reuse OSS components. Such data was gathered from interviews in software companies in Spain and Norway. Based on these results we identify some challenges aimed at improving the industrial reuse of OSS components.

  16. Software-based acoustical measurements

    CERN Document Server

    Miyara, Federico

    2017-01-01

    This textbook provides a detailed introduction to the use of software in combination with simple and economical hardware (a sound level meter with calibrated AC output and a digital recording system) to obtain sophisticated measurements usually requiring expensive equipment. It emphasizes the use of free, open source, and multiplatform software. Many commercial acoustical measurement systems use software algorithms as an integral component; however the methods are not disclosed. This book enables the reader to develop useful algorithms and provides insight into the use of digital audio editing tools to document features in the signal. Topics covered include acoustical measurement principles, in-depth critical study of uncertainty applied to acoustical measurements, digital signal processing from the basics, and metrologically-oriented spectral and statistical analysis of signals. The student will gain a deep understanding of the use of software for measurement purposes; the ability to implement software-based...

  17. Software qualification in safety applications

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    2000-01-01

    The developers of safety-critical instrumentation and control systems must qualify the design of the components used, including the software in the embedded computer systems, in order to ensure that the component can be trusted to perform its safety function under the full range of operating conditions. There are well known ways to qualify analog systems using the facts that: (1) they are built from standard modules with known properties; (2) design documents are available and described in a well understood language; (3) the performance of the component is constrained by physics; and (4) physics models exist to predict the performance. These properties are not generally available for qualifying software, and one must fall back on extensive testing and qualification of the design process. Neither of these is completely satisfactory. The research reported here is exploring an alternative approach that is intended to permit qualification for an important subset of instrumentation software. The research goal is to determine if a combination of static analysis and limited testing can be used to qualify a class of simple, but practical, computer-based instrumentation components for safety application. These components are of roughly the complexity of a motion detector alarm controller. This goal is accomplished by identifying design constraints that enable meaningful analysis and testing. Once such design constraints are identified, digital systems can be designed to allow for analysis and testing, or existing systems may be tested for conformance to the design constraints as a first step in a qualification process. This will considerably reduce the cost and monetary risk involved in qualifying commercial components for safety-critical service

  18. First International Workshop on Variability in Software Architecture (VARSA 2011)

    NARCIS (Netherlands)

    Galster, Matthias; Avgeriou, Paris; Weyns, Danny; Mannisto, Tomi

    2011-01-01

    Variability is the ability of a software artifact to be changed for a specific context. Mechanisms to accommodate variability include software product lines, configuration wizards and tools in commercial software, configuration interfaces of software components, or the dynamic runtime composition of

  19. Contrasting Views of Software Engineering Journals: Author Cocitation Choices and Indexer Vocabulary Assignments.

    Science.gov (United States)

    Marion, Linda S.; McCain, Katherine W.

    2001-01-01

    Explores the intellectual subject structure and research themes in software engineering through the identification and analysis of a core journal literature via two expert perspectives: the author, through cocitation analysis; and the indexer, through subject terms selected to facilitate retrieval. Topics include cluster analysis, multidimensional…

  20. Web accessibility and open source software.

    Science.gov (United States)

    Obrenović, Zeljko

    2009-07-01

    A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration is complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse projects called Accessibility Tools Framework (ACTF), the aim of which is development of extensible infrastructure, upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.

  1. Increased gamma band power during movement planning coincides with motor memory retrieval.

    Science.gov (United States)

    Thürer, Benjamin; Stockinger, Christian; Focke, Anne; Putze, Felix; Schultz, Tanja; Stein, Thorsten

    2016-01-15

    The retrieval of motor memory requires a previous memory encoding and subsequent consolidation of the specific motor memory. Previous work showed that motor memory seems to rely on different memory components (e.g., implicit, explicit). However, it is still unknown whether explicit components contribute to the retrieval of motor memories formed by dynamic adaptation tasks and which neural correlates are linked to memory retrieval. We investigated the lower and higher gamma bands of subjects' electroencephalography during encoding and retrieval of a dynamic adaptation task. A total of 24 subjects were randomly assigned to a treatment group and a control group. Both groups adapted to a force field A on day 1 and were re-exposed to the same force field A on day 3 of the experiment. On day 2, the treatment group learned an interfering force field B, whereas the control group had a rest day. Kinematic analyses showed that the control group improved their initial motor performance from day 1 to day 3 but the treatment group did not. This behavioral result coincided with an increased higher gamma band power in the electrodes over prefrontal areas on the initial trials of day 3 for the control but not the treatment group. Intriguingly, this effect vanished with the subsequent re-adaptation on day 3. We suggest that improved re-test performance in a dynamic motor adaptation task is supported by explicit memory and that gamma bands in the electrodes over the prefrontal cortex are linked to these explicit components. Furthermore, we suggest that the contribution of explicit memory vanishes with the subsequent re-adaptation as task automaticity increases. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Retrieval Attempts Enhance Learning, but Retrieval Success (versus Failure) Does Not Matter

    Science.gov (United States)

    Kornell, Nate; Klein, Patricia Jacobs; Rawson, Katherine A.

    2015-01-01

    Retrieving information from memory enhances learning. We propose a 2-stage framework to explain the benefits of retrieval. Stage 1 takes place as one attempts to retrieve an answer, which activates knowledge related to the retrieval cue. Stage 2 begins when the answer becomes available, at which point appropriate connections are strengthened and…

  3. Interactions among emotional attention, encoding, and retrieval of ambiguous information: An eye-tracking study.

    Science.gov (United States)

    Everaert, Jonas; Koster, Ernst H W

    2015-10-01

    Emotional biases in attention modulate encoding of emotional material into long-term memory, but little is known about the role of such attentional biases during emotional memory retrieval. The present study investigated how emotional biases in memory are related to attentional allocation during retrieval. Forty-nine individuals encoded emotionally positive and negative meanings derived from ambiguous information and then searched their memory for encoded meanings in response to a set of retrieval cues. The remember/know/new procedure was used to classify memories as recollection-based or familiarity-based, and gaze behavior was monitored throughout the task to measure attentional allocation. We found that a bias in sustained attention during recollection-based, but not familiarity-based, retrieval predicted subsequent memory bias toward positive versus negative material following encoding. Thus, during emotional memory retrieval, attention affects controlled forms of retrieval (i.e., recollection) but does not modulate relatively automatic, familiarity-based retrieval. These findings enhance understanding of how distinct components of attention regulate the emotional content of memories. Implications for theoretical models and emotion regulation are discussed. (c) 2015 APA, all rights reserved.

  4. Retrieval options study

    International Nuclear Information System (INIS)

    1980-03-01

    This Retrieval Options Study is part of the systems analysis activities of the Office of Nuclear Waste Isolation to develop the scientific and technological bases for radioactive waste repositories in various geologic media. The study considers two waste forms, high level waste and spent fuel, and defines various classes of waste retrieval and recovery. A methodology and data base are developed which allow the relative evaluation of retrieval and recovery costs and the following technical criteria: safety; technical feasibility; ease of retrieval; probable intact retrieval time; safeguards; monitoring; criticality; and licensability. A total of 505 repository options are defined and the cost and technical criteria evaluated utilizing a combination of facts and engineering judgments. The repositories evaluated are selected combinations of the following parameters: Geologic Media (salt, granite, basalt, shale); Retrieval Time after Emplacement (5 and 25 years); Emplacement Design (nominal hole, large hole, carbon steel canister, corrosion resistant canister, backfill in hole, nominal sleeves, thick wall sleeves); Emplacement Configuration (single vertical, multiple vertical, single horizontal, multiple horizontal, vaults); Thermal Considerations (normal design, reduced density, once-through ventilation, recirculated ventilation); Room Backfill (none, run-of-mine, early, 5 year delay, 25 year delay, decommissioned); and Rate of Retrieval (same as emplacement, variably slower depending on repository/canister condition).

  5. Computer, Network, Software, and Hardware Engineering with Applications

    CERN Document Server

    Schneidewind, Norman F

    2012-01-01

    There are many books on computers, networks, and software engineering but none that integrate the three with applications. Integration is important because, increasingly, software dominates the performance, reliability, maintainability, and availability of complex computer systems. Books on software engineering typically portray software as if it exists in a vacuum with no relationship to the wider system. This is wrong because a system is more than software. It is comprised of people, organizations, processes, hardware, and software. All of these components must be considered in an integr

  6. Technology for an intelligent, free-flying robot for crew and equipment retrieval in space

    Science.gov (United States)

    Erickson, J. D.; Reuter, G. J.; Healey, Kathleen J.; Phinney, D. E.

    1990-01-01

    Crew rescue and equipment retrieval is a Space Station Freedom requirement. During Freedom's lifetime, there is a high probability that a number of objects will accidentally become separated. Members of the crew, replacement units, and key tools are examples. Retrieval of these objects within a short time is essential. Systems engineering studies were conducted to identify system requirements and candidate approaches. One such approach, based on a voice-supervised, intelligent, free-flying robot was selected for further analysis. A ground-based technology demonstration, now in its second phase, was designed to provide an integrated robotic hardware and software testbed supporting design of a space-borne system. The ground system, known as the EVA Retriever, is examining the problem of autonomously planning and executing a target rendezvous, grapple, and return to base while avoiding stationary and moving obstacles. The current prototype is an anthropomorphic manipulator unit with dexterous arms and hands attached to a robot body and latched in a manned maneuvering unit. A precision air-bearing floor is used to simulate space. Sensor data include two vision systems and force/proximity/tactile sensors on the hands and arms. Planning for a shuttle flight experiment is underway. A set of scenarios and strawman requirements were defined to support conceptual development. Initial design activities are expected to begin in late 1989 with the flight occurring in 1994. The flight hardware and software will be based on lessons learned from both the ground prototype and computer simulations.

  7. Retrieval and phenomenology of autobiographical memories in blind individuals.

    Science.gov (United States)

    Tekcan, Ali İ.; Yılmaz, Engin; Kızılöz, Burcu Kaya; Karadöller, Dilay Z.; Mutafoğlu, Merve; Erciyes, Aslı Aktan

    2015-01-01

    Although visual imagery is argued to be an essential component of autobiographical memory, there have been surprisingly few studies on autobiographical memory processes in blind individuals, who have had no or limited visual input. The purpose of the present study was to investigate how blindness affects retrieval and phenomenology of autobiographical memories. We asked 48 congenital/early blind and 48 sighted participants to recall autobiographical memories in response to six cue words, and to fill out the Autobiographical Memory Questionnaire measuring a number of variables including imagery, belief and recollective experience associated with each memory. Blind participants retrieved fewer memories and reported higher auditory imagery at retrieval than sighted participants. Moreover, within the blind group, participants with total blindness reported higher auditory imagery than those with some light perception. Blind participants also assigned higher importance, belief and recollection ratings to their memories than sighted participants. Importantly, these group differences remained the same for recent as well as childhood memories.

  8. Detecting Genomic Signatures of Natural Selection with Principal Component Analysis: Application to the 1000 Genomes Data.

    Science.gov (United States)

    Duforet-Frebourg, Nicolas; Luu, Keurcien; Laval, Guillaume; Bazin, Eric; Blum, Michael G B

    2016-04-01

    To characterize natural selection, various analytical methods for detecting candidate genomic regions have been developed. We propose to perform genome-wide scans of natural selection using principal component analysis (PCA). We show that the common FST index of genetic differentiation between populations can be viewed as the proportion of variance explained by the principal components. Considering the correlations between genetic variants and each principal component provides a conceptual framework to detect genetic variants involved in local adaptation without any prior definition of populations. To validate the PCA-based approach, we consider the 1000 Genomes data (phase 1) considering 850 individuals coming from Africa, Asia, and Europe. The number of genetic variants is of the order of 36 million, obtained with a low-coverage sequencing depth (3×). The correlations between genetic variation and each principal component provide well-known targets for positive selection (EDAR, SLC24A5, SLC45A2, DARC), and also new candidate genes (APPBPP2, TP1A1, RTTN, KCNMA, MYO5C) and noncoding RNAs. In addition to identifying genes involved in biological adaptation, we identify two biological pathways involved in polygenic adaptation that are related to the innate immune system (beta defensins) and to lipid metabolism (fatty acid omega oxidation). An additional analysis of European data shows that a genome scan based on PCA retrieves classical examples of local adaptation even when there are no well-defined populations. PCA-based statistics, implemented in the PCAdapt R package and the PCAdapt fast open-source software, retrieve well-known signals of human adaptation, which is encouraging for future whole-genome sequencing projects, especially when defining populations is difficult. © The Author(s) 2015. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
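
    The core of such a PCA-based scan can be sketched in a few lines: standardize the genotype matrix, take the leading principal components, and flag variants whose squared correlation with any retained component is an outlier. The genotypes below are synthetic, and the sketch is not the PCAdapt implementation:

      import numpy as np

      rng = np.random.default_rng(0)
      G = rng.integers(0, 3, size=(200, 5000)).astype(float)  # individuals x variants (0/1/2)

      # Centre and scale each variant, then take the leading principal components
      Gs = (G - G.mean(axis=0)) / (G.std(axis=0) + 1e-12)
      U, s, Vt = np.linalg.svd(Gs, full_matrices=False)
      K = 2
      scores = U[:, :K] * s[:K]            # individual coordinates on the first K PCs

      # Correlation of every variant with every retained PC; outliers are candidates
      n = G.shape[0]
      corr = (Gs.T @ scores) / (n * scores.std(axis=0))
      stat = (corr ** 2).max(axis=1)       # one outlier statistic per variant
      print(np.argsort(stat)[-10:])        # ten most extreme variants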

  9. Hippocampal-prefrontal engagement and dynamic causal interactions in the maturation of children's fact retrieval.

    Science.gov (United States)

    Cho, Soohyun; Metcalfe, Arron W S; Young, Christina B; Ryali, Srikanth; Geary, David C; Menon, Vinod

    2012-09-01

    Children's gains in problem-solving skills during the elementary school years are characterized by shifts in the mix of problem-solving approaches, with inefficient procedural strategies being gradually replaced with direct retrieval of domain-relevant facts. We used a well-established procedure for strategy assessment during arithmetic problem solving to investigate the neural basis of this critical transition. We indexed behavioral strategy use by focusing on the retrieval frequency and examined changes in brain activity and connectivity associated with retrieval fluency during arithmetic problem solving in second- and third-grade (7- to 9-year-old) children. Children with higher retrieval fluency showed elevated signal in the right hippocampus, parahippocampal gyrus (PHG), lingual gyrus (LG), fusiform gyrus (FG), left ventrolateral PFC (VLPFC), bilateral dorsolateral PFC (DLPFC), and posterior angular gyrus. Critically, these effects were not confounded by individual differences in problem-solving speed or accuracy. Psychophysiological interaction analysis revealed significant effective connectivity of the right hippocampus with bilateral VLPFC and DLPFC during arithmetic problem solving. Dynamic causal modeling analysis revealed strong bidirectional interactions between the hippocampus and the left VLPFC and DLPFC. Furthermore, causal influences from the left VLPFC to the hippocampus served as the main top-down component, whereas causal influences from the hippocampus to the left DLPFC served as the main bottom-up component of this retrieval network. Our study highlights the contribution of hippocampal-prefrontal circuits to the early development of retrieval fluency in arithmetic problem solving and provides a novel framework for studying dynamic developmental processes that accompany children's development of problem-solving skills.

  10. Flight Software Math Library

    Science.gov (United States)

    McComas, David

    2013-01-01

    The flight software (FSW) math library is a collection of reusable math components that provides typical math utilities required by spacecraft flight software. These utilities are intended to increase flight software quality, reusability, and maintainability by providing a set of consistent, well-documented, and tested math utilities. This library only has dependencies on ANSI C, so it is easily ported. Prior to this library, each mission typically created its own math utilities using ideas/code from previous missions. Part of the reason for this is that math libraries can be written with different strategies in areas like error handling, parameter orders, naming conventions, etc. Changing the utilities for each mission introduces risks and costs. The obvious risks and costs are that the utilities must be coded and revalidated. The hidden risks and costs arise in miscommunication between engineers. These utilities must be understood by both the flight software engineers and other subsystem engineers (primarily guidance, navigation, and control). The FSW math library is part of a larger goal to produce a library of reusable Guidance Navigation and Control (GN&C) FSW components. A GN&C FSW library cannot be created unless a standardized math basis is created. This library solves the standardization problem by defining a common feature set and establishing policies for the library's design. This allows the libraries to be maintained with the same strategy used in its initial development, which supports a library of reusable GN&C FSW components. The FSW math library is written for an embedded software environment in C. This places restrictions on the language features that can be used by the library. Another advantage of the FSW math library is that it can be used in the FSW as well as other environments like the GN&C analyst's simulators. This helps communication between the teams because they can use the same utilities with the same feature set and syntax.

  11. Global validation of two-channel AVHRR aerosol optical thickness retrievals over the oceans

    International Nuclear Information System (INIS)

    Liu Li; Mishchenko, Michael I.; Geogdzhayev, Igor; Smirnov, Alexander; Sakerin, Sergey M.; Kabanov, Dmitry M.; Ershov, Oleg A.

    2004-01-01

    The paper presents validation results for the aerosol optical thickness derived by applying a two-channel retrieval algorithm to Advanced Very High Resolution Radiometer (AVHRR) radiance data. The satellite retrievals are compared with ship-borne sun-photometer results. The comparison of spatial and temporal statistics of the AVHRR results and the ship measurements shows a strong correlation. The satellite retrieval results obtained with the original algorithm for a wavelength of 0.55μm are systematically higher than the sun-photometer measurements in the cases of low aerosol loads. The ensemble averaged satellite-retrieved optical thickness overestimates the ensemble averaged sun-photometer data by about 11% with a random error of about 0.04. Increasing the diffuse component of the ocean surface reflectance from 0.002 to 0.004 in the AVHRR algorithm produces a better match, with the ensemble-averaged AVHRR-retrieved optical thickness differing by only about 3.6% from the sun-photometer truth and having a small offset of 0.03
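
    The ensemble comparison reported above reduces to a bias and scatter computation over matched satellite/sun-photometer pairs. A minimal sketch with synthetic numbers (not the study's data):

      import numpy as np

      # Paired aerosol optical thickness at 0.55 um: satellite retrieval vs
      # ship-borne sun photometer (illustrative values only)
      tau_avhrr = np.array([0.12, 0.18, 0.09, 0.25, 0.16])
      tau_sunph = np.array([0.11, 0.16, 0.08, 0.24, 0.15])

      bias_pct = 100.0 * (tau_avhrr.mean() / tau_sunph.mean() - 1.0)  # ensemble bias
      rmse = np.sqrt(np.mean((tau_avhrr - tau_sunph) ** 2))           # random error
      print(f"bias = {bias_pct:.1f}%, rmse = {rmse:.3f}")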

  12. A four-component model of age-related memory change.

    Science.gov (United States)

    Healey, M Karl; Kahana, Michael J

    2016-01-01

    We develop a novel, computationally explicit theory of age-related memory change within the framework of the context maintenance and retrieval (CMR2) model of memory search. We introduce a set of benchmark findings from the free recall and recognition tasks that include aspects of memory performance that show both age-related stability and decline. We test aging theories by lesioning the corresponding mechanisms in a model fit to younger adult free recall data. When effects are considered in isolation, many theories provide an adequate account, but when all effects are considered simultaneously, the existing theories fail. We develop a novel theory by fitting the full model (i.e., allowing all parameters to vary) to individual participants and comparing the distributions of parameter values for older and younger adults. This theory implicates 4 components: (a) the ability to sustain attention across an encoding episode, (b) the ability to retrieve contextual representations for use as retrieval cues, (c) the ability to monitor retrievals and reject intrusions, and (d) the level of noise in retrieval competitions. We extend CMR2 to simulate a recognition memory task using the same mechanisms the free recall model uses to reject intrusions. Without fitting any additional parameters, the 4-component theory that accounts for age differences in free recall predicts the magnitude of age differences in recognition memory accuracy. Confirming a prediction of the model, free recall intrusion rates correlate positively with recognition false alarm rates. Thus, we provide a 4-component theory of a complex pattern of age differences across 2 key laboratory tasks. (c) 2015 APA, all rights reserved.

  13. Software reliability prediction using SPN | Abbasabadee | Journal of ...

    African Journals Online (AJOL)

    Software reliability prediction using SPN. ... In this research, for computation of software reliability, a component reliability model based on SPN is proposed. An isomorphic Markov ...

  14. Towards a Component Based Model for Database Systems

    Directory of Open Access Journals (Sweden)

    Octavian Paul ROTARU

    2004-02-01

    Due to their effectiveness in the design and development of software applications and due to their recognized advantages in terms of reusability, Component-Based Software Engineering (CBSE) concepts have been arousing a great deal of interest in recent years. This paper presents and extends a component-based approach to object-oriented database systems (OODB) introduced by us in [1] and [2]. Components are proposed as a new abstraction level for database systems, logical partitions of the schema. In this context, the scope is introduced as an escalated property for transactions. Components are studied from the integrity, consistency, and concurrency control perspective. The main benefits of our proposed component model for OODB are the reusability of the database design, including the access statistics required for a proper query optimization, and a smooth information exchange. The integration of crosscutting concerns into the component database model using aspect-oriented techniques is also discussed. One of the main goals is to define a method for the assessment of component composition capabilities. These capabilities are restricted by the component’s interface and measured in terms of adaptability, degree of compose-ability and acceptability level. The above-mentioned metrics are extended from database components to generic software components. This paper extends and consolidates into one common view the ideas previously presented by us in [1, 2, 3]. [1] Octavian Paul Rotaru, Marian Dobre, Component Aspects in Object Oriented Databases, Proceedings of the International Conference on Software Engineering Research and Practice (SERP’04), Volume II, ISBN 1-932415-29-7, pages 719-725, Las Vegas, NV, USA, June 2004. [2] Octavian Paul Rotaru, Marian Dobre, Mircea Petrescu, Integrity and Consistency Aspects in Component-Oriented Databases, Proceedings of the International Symposium on Innovation in Information and Communication Technology (ISIICT

  15. Less we forget: retrieval cues and release from retrieval-induced forgetting.

    Science.gov (United States)

    Jonker, Tanya R; Seli, Paul; Macleod, Colin M

    2012-11-01

    Retrieving some items from memory can impair the subsequent recall of other related but not retrieved items, a phenomenon called retrieval-induced forgetting (RIF). The dominant explanation of RIF, the inhibition account, asserts that forgetting occurs because related items are suppressed during retrieval practice to reduce retrieval competition. This item inhibition persists, making it more difficult to recall the related items on a later test. In our set of experiments, each category was designed such that each exemplar belonged to one of two subcategories (e.g., each BIRD exemplar was either a bird of prey or a pet bird), but this subcategory information was not made explicit during study or retrieval practice. Practicing retrieval of items from only one subcategory led to RIF for items from the other subcategory when cued only with the overall category label (BIRD) at test. However, adapting the technique of Gardiner, Craik, and Birtwistle (Journal of Verbal Learning and Verbal Behavior 11:778-783, 1972), providing subcategory cues during the final test eliminated RIF. The results challenge the inhibition account's fundamental assumption of cue independence but are consistent with a cue-based interference account.

  16. Relating the new language models of information retrieval to the traditional retrieval models

    NARCIS (Netherlands)

    Hiemstra, Djoerd; de Vries, A.P.

    During the last two years, exciting new approaches to information retrieval were introduced by a number of different research groups that use statistical language models for retrieval. This paper relates the retrieval algorithms suggested by these approaches to widely accepted retrieval algorithms.
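
    The language-modelling approach discussed here scores a document by the probability that a language model estimated from that document generates the query, smoothed with collection statistics. A minimal query-likelihood sketch with Jelinek-Mercer smoothing; the toy corpus and smoothing weight are illustrative:

      import math
      from collections import Counter

      docs = {
          "d1": "software component retrieval from a reuse repository".split(),
          "d2": "statistical language models for information retrieval".split(),
      }

      # Collection model used for smoothing unseen terms
      coll = Counter(w for words in docs.values() for w in words)
      coll_len = sum(coll.values())

      def score(query, words, lam=0.5):
          """log P(query | document) under Jelinek-Mercer smoothing."""
          tf = Counter(words)
          s = 0.0
          for q in query:
              p_doc = tf[q] / len(words)
              p_coll = coll[q] / coll_len
              s += math.log(lam * p_doc + (1 - lam) * p_coll + 1e-12)
          return s

      query = "component retrieval".split()
      print(sorted(docs, key=lambda d: score(query, docs[d]), reverse=True))  # d1 first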

  17. Guidelines for the verification and validation of expert system software and conventional software. Volume 7, User's manual: Final report

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    Reliable software is required for nuclear power industry applications. Verification and validation techniques applied during the software development process can help eliminate errors that could inhibit the proper operation of digital systems and cause availability and safety problems. Most of the techniques described in this report are valid for conventional software systems as well as for expert systems. The project resulted in a set of 16 V&V guideline packages and 11 sets of procedures based on the class, development phase, and system component being tested. These guideline packages and procedures help a utility define the level of V&V, which involves evaluating the complexity and type of software component along with the consequences of failure. In all, the project identified 153 V&V techniques for conventional software systems and demonstrated their application to all aspects of expert systems except for the knowledge base, which requires specially developed tools. Each of these conventional techniques covers anywhere from 2 to 52 types of conventional software defects, and each defect is covered by 21 to 50 V&V techniques. The project also identified automated tools to support V&V activities

  18. Empirical usability testing in a component-based environment : improving test efficiency with component-specific usability measures

    NARCIS (Netherlands)

    Brinkman, W.P.; Haakma, R.; Bouwhuis, D.G.; Bastide, R.; Palanque, P.; Roth, J.

    2005-01-01

    This paper addresses the issue of usability testing in a component-based software engineering environment, specifically measuring the usability of different versions of a component in a more powerful manner than other, more holistic, usability methods. Three component-specific usability measures are

  19. Gazing through Windows at component software development

    International Nuclear Information System (INIS)

    Foster, David

    1996-01-01

    What has been presented here is an overview of the architectural plan for distributed computing by Microsoft. The business opportunity is tied to the rapid growth of consumer computing, which is happening now and will continue far into the future. Creating a computing environment that is logically centralized, through the use of interface standards, yet physically distributed, where anyone can provide services, is a major challenge. Managing complexity and creating a consistent framework through the use of componentware technology is paramount to its success. The ability to scale distributed processing, manage diverse groups involved in data analysis and facilitate collaboration at all levels are the business processes of particular interest to the HEP community. In realizing the business opportunity they see, Microsoft and others will help solve many of the basic problems facing HEP in the next ten years. By closely tracking the software developments and investing in understanding the technologies presented here, HEP will gain great benefit from commodity computing. (author)

  20. Retrieval options study

    Energy Technology Data Exchange (ETDEWEB)

    1980-03-01

    This Retrieval Options Study is part of the systems analysis activities of the Office of Nuclear Waste Isolation to develop the scientific and technological bases for radioactive waste repositories in various geologic media. The study considers two waste forms, high level waste and spent fuel, and defines various classes of waste retrieval and recovery. A methodology and data base are developed which allow the relative evaluation of retrieval and recovery costs and the following technical criteria: safety; technical feasibility; ease of retrieval; probable intact retrieval time; safeguards; monitoring; criticality; and licensability. A total of 505 repository options are defined and the cost and technical criteria evaluated utilizing a combination of facts and engineering judgments. The repositories evaluated are selected combinations of the following parameters: Geologic Media (salt, granite, basalt, shale); Retrieval Time after Emplacement (5 and 25 years); Emplacement Design (nominal hole, large hole, carbon steel canister, corrosion resistant canister, backfill in hole, nominal sleeves, thick wall sleeves); Emplacement Configuration (single vertical, multiple vertical, single horizontal, multiple horizontal, vaults); Thermal Considerations (normal design, reduced density, once-through ventilation, recirculated ventilation); Room Backfill (none, run-of-mine, early, 5 year delay, 25 year delay, decommissioned); and Rate of Retrieval (same as emplacement, variably slower depending on repository/canister condition).

  1. The caCORE Software Development Kit: Streamlining construction of interoperable biomedical information services

    Directory of Open Access Journals (Sweden)

    Warzel Denise

    2006-01-01

    Background: Robust, programmatically accessible biomedical information services that syntactically and semantically interoperate with other resources are challenging to construct. Such systems require the adoption of common information models, data representations and terminology standards as well as documented application programming interfaces (APIs). The National Cancer Institute (NCI) developed the cancer common ontologic representation environment (caCORE) to provide the infrastructure necessary to achieve interoperability across the systems it develops or sponsors. The caCORE Software Development Kit (SDK) was designed to provide developers both within and outside the NCI with the tools needed to construct such interoperable software systems. Results: The caCORE SDK requires a Unified Modeling Language (UML) tool to begin the development workflow with the construction of a domain information model in the form of a UML Class Diagram. Models are annotated with concepts and definitions from a description logic terminology source using the Semantic Connector component. The annotated model is registered in the Cancer Data Standards Repository (caDSR) using the UML Loader component. System software is automatically generated using the Codegen component, which produces middleware that runs on an application server. The caCORE SDK was initially tested and validated using a seven-class UML model, and has been used to generate the caCORE production system, which includes models with dozens of classes. The deployed system supports access through object-oriented APIs with consistent syntax for retrieval of any type of data object across all classes in the original UML model. The caCORE SDK is currently being used by several development teams, including by participants in the cancer biomedical informatics grid (caBIG) program, to create compatible data services. caBIG compatibility standards are based upon caCORE resources, and thus the caCORE SDK has

  2. Gunther Tulip Retrievable Inferior Vena Caval Filters: Indications, Efficacy, Retrieval, and Complications

    International Nuclear Information System (INIS)

    Looby, S.; Given, M.F.; Geoghegan, T.; McErlean, A.; Lee, M.J.

    2007-01-01

    Purpose. We evaluated the Gunther Tulip (GT) retrievable inferior vena cava (IVC) filter with regard to indications, filtration efficacy, complications, retrieval window, and use of anticoagulation. Method. A retrospective study was performed of 147 patients (64 men, 83 women; mean age 58.8 years) who underwent retrievable GT filter insertion between 2001 and 2005. The indications for placement included a diagnosis of pulmonary embolism or deep venous thrombosis with a contraindication to anticoagulation (n = 68), pulmonary embolism or deep venous thrombosis while on anticoagulation (n = 49), prophylactic filter placement for high-risk surgical patients with a past history of pulmonary embolism or deep venous thrombosis (n = 20), and a high risk of pulmonary embolism or deep venous thrombosis (n = 10). Forty-nine of the 147 patients did not receive anticoagulation (33.7%) while 96 of 147 patients did, 82 of these receiving warfarin (56.5%), 11 receiving low-molecular weight heparins (7.58%), and 3 receiving antiplatelet agents alone (2.06%). Results. Filter placement was successful in 147 patients (100%). Two patients had two filters inserted. Of the 147 patients, filter deployment was on a permanent basis in 102 and with an intention to retrieve in 45 patients. There were 36 (80%) successful retrievals and 9 (20%) failed retrievals. The mean time to retrieval was 33.6 days. The reasons for failed retrieval included filter struts tightly adherent to the IVC wall (5/9), extreme filter tilt (2/9), and extensive filter thrombus (2/9). Complications included pneumothorax (n = 4), failure of filter expansion (n = 1), and breakthrough pulmonary embolism (n = 1). No IVC thrombotic episodes were recorded. Discussion. The Gunther Tulip retrievable filter can be used as a permanent or a retrievable filter. It is safe and efficacious. GT filters can be safely retrieved at a mean time interval of 33.6 days. The newly developed Celect filter may extend the retrieval interval

  3. Automating Software Development Process using Fuzzy Logic

    NARCIS (Netherlands)

    Marcelloni, Francesco; Aksit, Mehmet; Damiani, Ernesto; Jain, Lakhmi C.; Madravio, Mauro

    2004-01-01

    In this chapter, we aim to highlight how fuzzy logic can be a valid expressive tool to manage the software development process. We characterize a software development method in terms of two major components: artifact types and methodological rules. Classes, attributes, operations, and inheritance

  4. Workflow-Based Software Development Environment

    Science.gov (United States)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment

  5. Health software: a new CEI Guide for software management in medical environment.

    Science.gov (United States)

    Giacomozzi, Claudia; Martelli, Francesco

    2016-01-01

    The increasing spread of software components in the healthcare context renders explanatory guides relevant and mandatory to interpret laws and standards, and to support safe management of software products in healthcare. In 2012 a working group was established for the above purposes at the Italian Electrotechnical Committee (CEI), made up of experts from the Italian National Institute of Health (ISS), representatives of industry, and representatives of the healthcare organizations. As a first outcome of the group's activity, Guide CEI 62-237 was published in February 2015. The Guide incorporates an innovative approach based on the proper contextualization of software products, either medical devices or not, to the specific healthcare scenario, and addresses the risk management of IT systems. The Guide provides operators and manufacturers with interpretative support and many detailed examples to facilitate the proper contextualization and management of health software, in compliance with related European and international regulations and standards.

  6. X-ray differential phase-contrast tomographic reconstruction with a phase line integral retrieval filter

    International Nuclear Information System (INIS)

    Fu, Jian; Hu, Xinhua; Li, Chen

    2015-01-01

    We report an alternative reconstruction technique for x-ray differential phase-contrast computed tomography (DPC-CT). This approach is based on a new phase line integral projection retrieval filter, which is rooted in the derivative property of the Fourier transform and counteracts the differential nature of the DPC-CT projections. It first retrieves the phase line integral from the DPC-CT projections. Then the standard filtered back-projection (FBP) algorithms popular in x-ray absorption-contrast CT are directly applied to the retrieved phase line integrals to reconstruct the DPC-CT images. Compared with the conventional DPC-CT reconstruction algorithms, the proposed method removes the Hilbert imaginary filter and allows for the direct use of absorption-contrast FBP algorithms. Consequently, FBP-oriented image processing techniques and reconstruction acceleration software that have already been successfully used in absorption-contrast CT can be directly adopted to improve the DPC-CT image quality and speed up the reconstruction
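
    A minimal sketch of this reconstruction route, assuming parallel-beam geometry with the detector coordinate along axis 0; the differential sinogram here is synthesized from a phantom, and this is not the authors' code:

      import numpy as np
      from skimage.data import shepp_logan_phantom
      from skimage.transform import radon, iradon, rescale

      def integrate_along_detector(dpc_sino):
          """Retrieve phase line integrals from differential projections by
          inverting the derivative in Fourier space (division by 2*pi*i*f)."""
          n = dpc_sino.shape[0]                    # detector samples per view
          f = np.fft.fftfreq(n)
          filt = np.zeros(n, dtype=complex)
          filt[1:] = 1.0 / (2j * np.pi * f[1:])    # DC component left at zero
          spec = np.fft.fft(dpc_sino, axis=0) * filt[:, None]
          return np.real(np.fft.ifft(spec, axis=0))

      # Synthetic stand-in for a DPC sinogram: an ordinary parallel-beam
      # sinogram differentiated along the detector axis
      img = rescale(shepp_logan_phantom(), 0.25)
      angles = np.linspace(0.0, 180.0, 90, endpoint=False)
      dpc_sino = np.gradient(radon(img, theta=angles), axis=0)

      # Retrieve the line integrals, then reuse a standard absorption-CT FBP
      rec = iradon(integrate_along_detector(dpc_sino), theta=angles)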

  7. An expert system based software sizing tool, phase 2

    Science.gov (United States)

    Friedlander, David

    1990-01-01

    A software tool was developed for predicting the size of a future computer program at an early stage in its development. The system is intended to enable a user who is not expert in Software Engineering to estimate software size in lines of source code with an accuracy similar to that of an expert, based on the program's functional specifications. The project was planned as a knowledge based system with a field prototype as the goal of Phase 2 and a commercial system planned for Phase 3. The researchers used techniques from Artificial Intelligence and knowledge from human experts and existing software from NASA's COSMIC database. They devised a classification scheme for the software specifications, and a small set of generic software components that represent complexity and apply to large classes of programs. The specifications are converted to generic components by a set of rules and the generic components are input to a nonlinear sizing function which makes the final prediction. The system developed for this project predicted code sizes from the database with a bias factor of 1.06 and a fluctuation factor of 1.77, an accuracy similar to that of human experts but without their significant optimistic bias.

  8. Model Driven Software Development for Agricultural Robotics

    DEFF Research Database (Denmark)

    Larsen, Morten

    The design and development of agricultural robots consists of mechanical, electrical, and software components. All these components must be designed and combined such that the overall goal of the robot is fulfilled. The design and development of these systems require collaboration between ... processing, control engineering, etc. This thesis proposes a Model-Driven Software Development based approach to model, analyse and partially generate the software implementation of an agricultural robot. Furthermore, guidelines for modelling the architecture of agricultural robots are provided ..., assisting with bridging the different engineering disciplines. Timing plays an important role in agricultural robotic applications: synchronisation of robot movement and implement actions is important in order to achieve precision spraying, mechanical weeding, individual feeding, etc. Discovering...

  9. Model-based magnetization retrieval from holographic phase images

    Energy Technology Data Exchange (ETDEWEB)

    Röder, Falk, E-mail: f.roeder@hzdr.de [Helmholtz-Zentrum Dresden-Rossendorf, Institut für Ionenstrahlphysik und Materialforschung, Bautzner Landstr. 400, D-01328 Dresden (Germany); Triebenberg Labor, Institut für Strukturphysik, Technische Universität Dresden, D-01062 Dresden (Germany); Vogel, Karin [Triebenberg Labor, Institut für Strukturphysik, Technische Universität Dresden, D-01062 Dresden (Germany); Wolf, Daniel [Helmholtz-Zentrum Dresden-Rossendorf, Institut für Ionenstrahlphysik und Materialforschung, Bautzner Landstr. 400, D-01328 Dresden (Germany); Triebenberg Labor, Institut für Strukturphysik, Technische Universität Dresden, D-01062 Dresden (Germany); Hellwig, Olav [Helmholtz-Zentrum Dresden-Rossendorf, Institut für Ionenstrahlphysik und Materialforschung, Bautzner Landstr. 400, D-01328 Dresden (Germany); AG Magnetische Funktionsmaterialien, Institut für Physik, Technische Universität Chemnitz, D-09126 Chemnitz (Germany); HGST, A Western Digital Company, 3403 Yerba Buena Rd., San Jose, CA 95135 (United States); Wee, Sung Hun [HGST, A Western Digital Company, 3403 Yerba Buena Rd., San Jose, CA 95135 (United States); Wicht, Sebastian; Rellinghaus, Bernd [IFW Dresden, Institute for Metallic Materials, P.O. Box 270116, D-01171 Dresden (Germany)

    2017-05-15

    The phase shift of the electron wave is a useful measure for the projected magnetic flux density of magnetic objects at the nanometer scale. More important for materials science, however, is the knowledge about the magnetization in a magnetic nano-structure. As demonstrated here, a dominating presence of stray fields prohibits a direct interpretation of the phase in terms of magnetization modulus and direction. We therefore present a model-based approach for retrieving the magnetization by considering the projected shape of the nano-structure and assuming a homogeneous magnetization therein. We apply this method to FePt nano-islands epitaxially grown on a SrTiO₃ substrate, which indicates an inclination of their magnetization direction relative to the structural easy magnetic [001] axis. By means of this real-world example, we discuss prospects and limits of this approach. - Highlights: • Retrieval of the magnetization from holographic phase images. • Magnetostatic model constructed for a magnetic nano-structure. • Decomposition into homogeneously magnetized components. • Discretization of each component by elementary cuboids. • Analytic solution for the phase of a magnetized cuboid considered. • Fitting a set of magnetization vectors to experimental phase images.
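
    Because the phase is linear in the magnetization vector, the fitting step of such a model-based retrieval reduces to linear least squares against precomputed unit-magnetization phase maps. In this sketch the basis maps are random placeholders standing in for the analytic cuboid phases of the forward model; it illustrates only the fitting step and is not the authors' code:

      import numpy as np

      # basis_x / basis_y: simulated phase maps for unit magnetization along x
      # and y (placeholders here); phi_obs: the measured holographic phase
      ny, nx = 64, 64
      rng = np.random.default_rng(1)
      basis_x = rng.normal(size=(ny, nx))
      basis_y = rng.normal(size=(ny, nx))
      m_true = np.array([0.8, -0.3])
      phi_obs = (m_true[0] * basis_x + m_true[1] * basis_y
                 + 0.01 * rng.normal(size=(ny, nx)))        # model + noise

      # Solve for the homogeneous magnetization components in one shot
      A = np.column_stack([basis_x.ravel(), basis_y.ravel()])
      m_fit, *_ = np.linalg.lstsq(A, phi_obs.ravel(), rcond=None)
      print(m_fit)                                           # ~ [0.8, -0.3]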

  10. Real-time software for multi-isotopic source term estimation

    International Nuclear Information System (INIS)

    Goloubenkov, A.; Borodin, R.; Sohier, A.

    1996-01-01

    Consideration is given to the development of software for one of the crucial components of RODOS: assessment of the source rate (SR) from indirect measurements. Four components of the software are described in the paper. The first component is a GRID system, which allows stochastic meteorological and radioactivity fields to be prepared from measured data. The second is a model of atmospheric transport which can be adapted to emulate practically any gamma dose/spectrum detector. The third is a method which allows space-time and quantitative discrepancies in measured and modelled data to be taken into account simultaneously; it is based on a preference scheme selected by an expert. The last component is a special optimization method for calculation of the multi-isotopic SR and its uncertainties. Results of a validation of the software using tracer experiment data and a Chernobyl source estimation for the main dose-forming isotopes are included in the paper

  11. RETRIEVAL EVENTS EVALUATION

    International Nuclear Information System (INIS)

    Wilson, T.

    1999-01-01

    The purpose of this analysis is to evaluate impacts to the retrieval concept presented in the Design Analysis "Retrieval Equipment and Strategy" (Reference 6), from abnormal events based on Design Basis Events (DBE) and Beyond Design Basis Events (BDBE) as defined in two recent analyses: (1) DBE/Scenario Analysis for Preclosure Repository Subsurface Facilities (Reference 4); and (2) Preliminary Preclosure Design Basis Event Calculations for the Monitored Geologic Repository (Reference 5). The objective of this task is to determine what impacts the DBEs and BDBEs have on the equipment developed for retrieval. The analysis lists potential impacts and recommends changes to be analyzed in subsequent design analyses for developed equipment, or recommends where additional equipment may be needed, to allow retrieval to be performed in all DBE or BDBE situations. This analysis supports License Application design and therefore complies with the requirements of the Systems Description Document input criteria comparison as presented in Section 7, Conclusions. In addition, the analysis discusses the impacts associated with not using concrete inverts in the emplacement drifts. The "Retrieval Equipment and Strategy" analysis was based on a concrete invert configuration in the emplacement drift. The scope of the analysis, as presented in "Development Plan for Retrieval Events Evaluation" (Reference 3), includes evaluation and criteria of the following: impacts to retrieval from the emplacement drift based on DBE/BDBEs, and changes to the invert configuration for the preclosure period; and impacts to retrieval from the main drifts based on DBE/BDBEs for the preclosure period

  12. Development of a monitoring system for Software Defined Networks (SDN)

    OpenAIRE

    Navarro Sánchez, Albert

    2015-01-01

    Software Defined Networks (SDN) are an emerging technology that allows software components to extend functionality over the network. This project studies how Polygraph, a monitoring system for traditional networks, can fit into an SDN scenario.

  13. Private information retrieval

    CERN Document Server

    Yi, Xun; Bertino, Elisa

    2013-01-01

    This book deals with Private Information Retrieval (PIR), a technique allowing a user to retrieve an element from a server in possession of a database without revealing to the server which element is retrieved. PIR has been widely applied to protect the privacy of the user in querying a service provider on the Internet. For example, by PIR, one can query a location-based service provider about the nearest car park without revealing his location to the server. The first PIR approach was introduced by Chor, Goldreich, Kushilevitz and Sudan in 1995 in a multi-server setting, where the user retriev
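
    The flavor of the original two-server construction can be shown in a few lines: each server holds a replica of a bit database and sees only a uniformly random index set, yet the XOR of the two one-bit answers is exactly the requested bit. This toy version has linear communication; practical PIR schemes do much better:

      import secrets

      def xor_bits(db, subset):
          """Server side: XOR of the database bits indexed by subset."""
          out = 0
          for j in subset:
              out ^= db[j]
          return out

      def pir_query(db1, db2, i, n):
          """Client side: neither server learns i, but the answers XOR to bit i."""
          s1 = {j for j in range(n) if secrets.randbits(1)}   # random subset
          s2 = s1 ^ {i}                                       # symmetric difference
          return xor_bits(db1, s1) ^ xor_bits(db2, s2)

      db = [secrets.randbits(1) for _ in range(32)]           # replicated database
      assert all(pir_query(db, db, i, 32) == db[i] for i in range(32))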

  14. Emotional responses as independent components in EEG

    DEFF Research Database (Denmark)

    Jensen, Camilla Birgitte Falk; Petersen, Michael Kai; Larsen, Jakob Eg

    2014-01-01

    ... or unpleasant images; early posterior negativity (EPN) and late positive potential (LPP). Recent studies suggest that several time course components may be modulated by emotional content in images or text. However, these neural signatures are characterized by small voltage changes that would be highly susceptible to noise if captured in a mobile context. Hypothesizing that retrieval of emotional responses in mobile usage scenarios could be enhanced through spatial filtering, we compare a standard EEG electrode based analysis against an approach based on independent component analysis (ICA). By clustering ... by emotional content. We propose that similar approaches to spatial filtering might allow us to retrieve more robust signals in real life mobile usage scenarios, and potentially facilitate design of cognitive interfaces that adapt the selection of media to our emotional responses.
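
    The ICA-based spatial filtering compared in this record can be sketched with scikit-learn's FastICA on a synthetic multi-channel mixture; the data below are stand-ins, not the study's EEG:

      import numpy as np
      from sklearn.decomposition import FastICA

      # Three latent sources (an ERP-like transient, an oscillation, noise)
      # mixed into 8 channels, as a crude stand-in for multi-channel EEG
      rng = np.random.default_rng(2)
      t = np.linspace(0, 1, 512)
      sources = np.stack([
          np.exp(-((t - 0.4) ** 2) / 0.002),     # transient component
          np.sin(2 * np.pi * 10 * t),            # 10 Hz oscillation
          rng.normal(size=t.size),               # broadband noise
      ])
      mixing = rng.normal(size=(8, 3))
      eeg = mixing @ sources                     # channels x samples

      # Spatial filtering: unmix the channels back into independent components
      ica = FastICA(n_components=3, random_state=0)
      components = ica.fit_transform(eeg.T).T    # components x samples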

  15. Refinement and verification in component-based model-driven design

    DEFF Research Database (Denmark)

    Chen, Zhenbang; Liu, Zhiming; Ravn, Anders Peter

    2009-01-01

    Modern software development is complex as it has to deal with many different and yet related aspects of applications. In practical software engineering this is now handled by a UML-like modelling approach in which different aspects are modelled by different notations. Component-based and object-o...... be integrated in computer-aided software engineering (CASE) tools for adding formally supported checking, transformation and generation facilities.

  16. New GPIB Control Software at Jefferson Lab

    International Nuclear Information System (INIS)

    Matthew Bickley; Pavel Chevtsov

    2005-01-01

    The control of GPIB devices at Jefferson Lab is based on the GPIB device/driver library. The library is a part of the device/driver development framework. It is activated with the use of the device configuration files that define all hardware components used in the control system to communicate with GPIB devices. As soon as the software is activated, it is ready to handle any device connected to these components and only needs to know the set of commands that the device can understand. The old GPIB control software at Jefferson Lab requires the definition of these commands in the form of a device control software module written in C for each device. Though such modules are relatively simple, they have to be created, successfully compiled, and supported for all control computer platforms. In the new version of GPIB control software all device communication commands are defined in device protocol (ASCII text) files. This makes the support of GPIB devices in the control system much easier
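
    The protocol-file idea can be illustrated as follows; the file format and the SCPI-style command strings are invented for illustration, since the record does not publish the actual Jefferson Lab format:

      import io

      PROTO = """\
      # dmm.proto: hypothetical device protocol file
      read_voltage = MEAS:VOLT:DC?
      set_range    = VOLT:RANG {0}
      identify     = *IDN?
      """

      def load_protocol(lines):
          """Parse 'logical_name = device command' lines into a dict."""
          cmds = {}
          for line in lines:
              line = line.strip()
              if line and not line.startswith("#"):
                  name, _, cmd = line.partition("=")
                  cmds[name.strip()] = cmd.strip()
          return cmds

      cmds = load_protocol(io.StringIO(PROTO))
      print(cmds["set_range"].format(10))   # VOLT:RANG 10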

  17. Statistical Language Models and Information Retrieval: Natural Language Processing Really Meets Retrieval

    NARCIS (Netherlands)

    Hiemstra, Djoerd; de Jong, Franciska M.G.

    2001-01-01

    Traditionally, natural language processing techniques for information retrieval have always been studied outside the framework of formal models of information retrieval. In this article, we introduce a new formal model of information retrieval based on the application of statistical language models.

  18. Computerized literature reference system: use of an optical scanner and optical character recognition software.

    Science.gov (United States)

    Lossef, S V; Schwartz, L H

    1990-09-01

    A computerized reference system for radiology journal articles was developed by using an IBM-compatible personal computer with a hand-held optical scanner and optical character recognition software. This allows direct entry of scanned text from printed material into word processing or data-base files. Additionally, line diagrams and photographs of radiographs can be incorporated into these files. A text search and retrieval software program enables rapid searching for keywords in scanned documents. The hand scanner and software programs are commercially available, relatively inexpensive, and easily used. This permits construction of a personalized radiology literature file of readily accessible text and images requiring minimal typing or keystroke entry.
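
    The text search and retrieval component can be reduced to a small inverted index over the OCR output; the document ids and titles below are invented:

      import re
      from collections import defaultdict

      index = defaultdict(set)          # keyword -> ids of documents containing it

      def add_document(doc_id, text):
          for word in re.findall(r"[a-z]+", text.lower()):
              index[word].add(doc_id)

      def search(*keywords):
          """Documents containing all of the keywords (AND query)."""
          sets = [index[k.lower()] for k in keywords]
          return set.intersection(*sets) if sets else set()

      add_document("smith1989", "MR imaging of the knee: meniscal tears")
      add_document("jones1990", "CT of acute pulmonary embolism")
      print(search("imaging", "knee"))   # {'smith1989'}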

  19. Comparison of the effectiveness of alternative feature sets in shape retrieval of multicomponent images

    Science.gov (United States)

    Eakins, John P.; Edwards, Jonathan D.; Riley, K. Jonathan; Rosin, Paul L.

    2001-01-01

    Many different kinds of features have been used as the basis for shape retrieval from image databases. This paper investigates the relative effectiveness of several types of global shape feature, both singly and in combination. The features compared include well-established descriptors such as Fourier coefficients and moment invariants, as well as recently-proposed measures of triangularity and ellipticity. Experiments were conducted within the framework of the ARTISAN shape retrieval system, and retrieval effectiveness assessed on a database of over 10,000 images, using 24 queries and associated ground truth supplied by the UK Patent Office. Our experiments revealed only minor differences in retrieval effectiveness between different measures, suggesting that a wide variety of shape feature combinations can provide adequate discriminating power for effective shape retrieval in multi-component image collections such as trademark registries. Marked differences between measures were observed for some individual queries, suggesting that there could be considerable scope for improving retrieval effectiveness by providing users with an improved framework for searching multi-dimensional feature space.
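
    One of the global feature types compared (moment invariants) can be sketched with OpenCV's Hu moments, ranking binary shapes by distance in log-scaled invariant space; this is an illustration, not the ARTISAN feature set:

      import cv2
      import numpy as np

      def hu_signature(mask):
          """Seven Hu moment invariants of a binary mask, log-scaled so that
          distances are comparable across orders of magnitude."""
          hu = cv2.HuMoments(cv2.moments(mask.astype(np.uint8), binaryImage=True))
          return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

      def shape_distance(a, b):
          return float(np.linalg.norm(hu_signature(a) - hu_signature(b)))

      # A disc should be nearer another disc than an elongated bar
      yy, xx = np.mgrid[0:128, 0:128]
      disc  = ((xx - 64) ** 2 + (yy - 64) ** 2) < 30 ** 2
      disc2 = ((xx - 60) ** 2 + (yy - 70) ** 2) < 20 ** 2
      bar   = (np.abs(xx - 64) < 50) & (np.abs(yy - 64) < 8)
      print(shape_distance(disc, disc2) < shape_distance(disc, bar))   # True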

  20. Analogical reasoning and prefrontal cortex: evidence for separable retrieval and integration mechanisms.

    Science.gov (United States)

    Bunge, Silvia A; Wendelken, Carter; Badre, David; Wagner, Anthony D

    2005-03-01

    The present study examined the contributions of prefrontal cortex (PFC) subregions to two component processes underlying verbal analogical reasoning: semantic retrieval and integration. Event-related functional magnetic resonance imaging data were acquired while subjects performed propositional analogy and semantic decision tasks. On each trial, subjects viewed a pair of words (pair 1), followed by an instructional cue and a second word pair (pair 2). On analogy trials, subjects evaluated whether pair 2 was semantically analogous to pair 1. On semantic trials, subjects indicated whether the pair 2 words were semantically related to each other. Thus, analogy--but not semantic--trials required integration across multiple retrieved relations. To identify regions involved in semantic retrieval, we manipulated the associative strength of pair 1 words in both tasks. Anterior left inferior PFC (aLIPC) was modulated by associative strength, consistent with a role in controlled semantic retrieval. Left frontopolar cortex was insensitive to associative strength, but was more sensitive to integration demands than was aLIPC, consistent with a role in integrating the products of semantic retrieval to evaluate whether distinct representations are analogous. Right dorsolateral PFC exhibited a profile consistent with a role in response selection rather than retrieval or integration. These findings indicate that verbal analogical reasoning depends on multiple, PFC-mediated computations.

  1. EMMA: a new paradigm in configurable software

    International Nuclear Information System (INIS)

    Nogiec, J. M.; Trombly-Freytag, K.

    2017-01-01

    EMMA is a framework designed to create a family of configurable software systems, with emphasis on extensibility and flexibility. It is based on a loosely coupled, event driven architecture. The EMMA framework has been built upon the premise of composing software systems from independent components. It opens up opportunities for reuse of components and their functionality and composing them together in many different ways. As a result, it provides the developer of test and measurement applications with a lightweight alternative to microservices, while sharing their various advantages, including composability, loose coupling, encapsulation, and reuse.
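
    The record does not show EMMA's actual interfaces, so the following is only a minimal sketch of the loosely coupled, event-driven composition style it describes: independent components that never reference each other directly, wired together through an event bus (all names are hypothetical):

        from collections import defaultdict

        class EventBus:
            """Minimal publish/subscribe bus; components are composed only through topics."""
            def __init__(self):
                self._subscribers = defaultdict(list)

            def subscribe(self, topic, handler):
                self._subscribers[topic].append(handler)

            def publish(self, topic, payload):
                for handler in self._subscribers[topic]:
                    handler(payload)

        # Two independent "measurement" components; neither imports the other.
        def voltmeter(bus):
            bus.subscribe("trigger", lambda _: bus.publish("reading", {"volts": 4.2}))

        def logger(bus):
            bus.subscribe("reading", lambda r: print("logged:", r))

        bus = EventBus()
        voltmeter(bus)
        logger(bus)
        bus.publish("trigger", None)   # -> logged: {'volts': 4.2}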

  2. EMMA: a new paradigm in configurable software

    Science.gov (United States)

    Nogiec, J. M.; Trombly-Freytag, K.

    2017-10-01

    EMMA is a framework designed to create a family of configurable software systems, with emphasis on extensibility and flexibility. It is based on a loosely coupled, event driven architecture. The EMMA framework has been built upon the premise of composing software systems from independent components. It opens up opportunities for reuse of components and their functionality and composing them together in many different ways. It provides the developer of test and measurement applications with a lightweight alternative to microservices, while sharing their various advantages, including composability, loose coupling, encapsulation, and reuse.

  3. Automated Freedom from Interference Analysis for Automotive Software

    OpenAIRE

    Leitner-Fischer, Florian; Leue, Stefan; Liu, Sirui

    2016-01-01

    Freedom from Interference for automotive software systems developed according to the ISO 26262 standard means that a fault in a less safety critical software component will not lead to a fault in a more safety critical component. It is an important concern in the realm of functional safety for automotive systems. We present an automated method for the analysis of concurrency-related interferences based on the QuantUM approach and tool that we have previously developed....

  4. The design of software system of intelligentized γ-camera

    International Nuclear Information System (INIS)

    Zhao Shujun; Li Suxiao; Wang Jing

    2006-01-01

    The software system of the γ-camera adopts a visual, interactive human-computer interface, collecting and displaying patient data in real time. After a series of processing steps on the collected data, it outputs the medical record in Chinese. The system can also retrieve and back up patient data. In addition, its clinical quantitative analysis function can assist the doctor in diagnosing the illness. (authors)

  5. The Effect of Governance on Global Software Development: An Empirical Research in Transactive Memory Systems.

    NARCIS (Netherlands)

    Manteli, C.; van den Hooff, B.J.; van Vliet, J.C.

    2014-01-01

    Context: The way global software development (GSD) activities are managed impacts knowledge transactions between team members. The former is captured in governance decisions, and the latter in a transactive memory system (TMS), a shared cognitive system for encoding, storing and retrieving knowledge.

  6. Probabilistic and machine learning-based retrieval approaches for biomedical dataset retrieval

    Science.gov (United States)

    Karisani, Payam; Qin, Zhaohui S; Agichtein, Eugene

    2018-01-01

    The bioCADDIE dataset retrieval challenge brought together different approaches to retrieval of biomedical datasets relevant to a user’s query, expressed as a text description of a needed dataset. We describe experiments in applying a data-driven, machine learning-based approach to biomedical dataset retrieval as part of this challenge. We report on a series of experiments carried out to evaluate the performance of both probabilistic and machine learning-driven techniques from information retrieval, as applied to this challenge. Our experiments with probabilistic information retrieval methods, such as query term weight optimization, automatic query expansion and simulated user relevance feedback, demonstrate that automatically boosting the weights of important keywords in a verbose query is more effective than other methods. We also show that although there is a rich space of potential representations and features available in this domain, machine learning-based re-ranking models are not able to improve on probabilistic information retrieval techniques with the currently available training data. The models and algorithms presented in this paper can serve as a viable implementation of a search engine to provide access to biomedical datasets. The retrieval performance is expected to be further improved by using additional training data that is created by expert annotation, or gathered through usage logs, clicks and other processes during natural operation of the system. Database URL: https://github.com/emory-irlab/biocaddie
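
    As a toy illustration of the winning idea reported here (boosting the weights of important keywords in a verbose query), the sketch below ranks documents with a simple TF-IDF scorer carrying per-term boosts; in the paper the boosts are derived automatically, whereas the corpus and weights below are invented:

        import math
        from collections import Counter

        docs = {
            "d1": "rna seq expression dataset human liver",
            "d2": "mouse genome assembly dataset",
            "d3": "human liver proteomics dataset",
        }
        tokens = {d: text.split() for d, text in docs.items()}
        N = len(tokens)
        df = Counter(t for toks in tokens.values() for t in set(toks))

        def score(query_weights, doc_tokens):
            # TF-IDF with per-term boosts: boosted terms dominate the ranking.
            tf = Counter(doc_tokens)
            return sum(w * tf[t] * math.log(N / df[t])
                       for t, w in query_weights.items() if t in df)

        # Hand-set boosts on the key terms of a verbose query (purely illustrative).
        query = {"human": 2.0, "liver": 2.0, "expression": 1.0, "dataset": 0.3}
        ranked = sorted(tokens, key=lambda d: score(query, tokens[d]), reverse=True)
        print(ranked)   # -> ['d1', 'd3', 'd2']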

  7. Computer software design description for the integrated control and data acquisition system LDUA system

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1998-01-01

    This Computer Software Design Description (CSDD) document provides the overview of the software design for all the software that is part of the integrated control and data acquisition system of the Light Duty Utility Arm System (LDUA). It describes the major software components and how they interface. It also references the documents that contain the detailed design description of the components

  8. Software Design Concepts for Archiving and Retrieving Control System Data

    International Nuclear Information System (INIS)

    Christopher Larrieu; Matt Bickley

    2001-01-01

    To develop and operate the control system effectively at the Thomas Jefferson National Accelerator Facility, users require the ability to diagnose its behavior not only in real-time, but also in retrospect. The new Jefferson Lab data logging system provides an acquisition and storage component capable of archiving enough data to provide suitable context for such analyses. In addition, it provides an extraction and presentation facility which efficiently fulfills requests for both raw and processed data. This paper discusses several technologies and design methodologies which contribute to the system's overall utility. The Application Programming Interface (API) which developers use to access the data derives from a view of the storage system as a specialized relational database. An object-oriented and compartmental design contributes to its portability at several levels, and the use of CORBA facilitates interaction between distributed components in an industry-standard fashion. This work was supported by the U.S. DOE contract No. DE-AC05-84ER40150
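
    A minimal sketch of the "storage system as a specialized relational database" view described above, with SQLite standing in for the archiver (the schema, channel name and API are illustrative assumptions, not Jefferson Lab's actual interface):

        import sqlite3

        # Toy archiver: control-system history exposed through a relational view.
        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE samples (channel TEXT, t REAL, value REAL)")

        def archive(channel, t, value):
            db.execute("INSERT INTO samples VALUES (?, ?, ?)", (channel, t, value))

        def fetch(channel, t0, t1):
            """Raw data request: all samples for a channel in [t0, t1]."""
            cur = db.execute(
                "SELECT t, value FROM samples WHERE channel = ? AND t BETWEEN ? AND ? ORDER BY t",
                (channel, t0, t1))
            return cur.fetchall()

        for i in range(10):
            archive("beam:current", float(i), 100.0 + i)
        print(fetch("beam:current", 2.0, 5.0))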

  9. Monetary rewards influence retrieval orientations.

    Science.gov (United States)

    Halsband, Teresa M; Ferdinand, Nicola K; Bridger, Emma K; Mecklinger, Axel

    2012-09-01

    Reward anticipation during learning is known to support memory formation, but its role in retrieval processes is so far unclear. Retrieval orientations, as a reflection of controlled retrieval processing, are one aspect of retrieval that might be modulated by reward. These processes can be measured using the event-related potentials (ERPs) elicited by retrieval cues from tasks with different retrieval requirements, such as via changes in the class of targeted memory information. To determine whether retrieval orientations of this kind are modulated by reward during learning, we investigated the effects of high and low reward expectancy on the ERP correlates of retrieval orientation in two separate experiments. The reward manipulation at study in Experiment 1 was associated with later memory performance, whereas in Experiment 2, reward was directly linked to accuracy in the study task. In both studies, the participants encoded mixed lists of pictures and words preceded by high- or low-reward cues. After 24 h, they performed a recognition memory exclusion task, with words as the test items. In addition to a previously reported material-specific effect of retrieval orientation, a frontally distributed, reward-associated retrieval orientation effect was found in both experiments. These findings suggest that reward motivation during learning leads to the adoption of a reward-associated retrieval orientation to support the retrieval of highly motivational information. Thus, ERP retrieval orientation effects not only reflect retrieval processes related to the sought-for materials, but also relate to the reward conditions with which items were combined during encoding.

  10. Practical issues of retrieving isolated attosecond pulses

    International Nuclear Information System (INIS)

    Wang He; Chini, Michael; Khan, Sabih D; Chen, Shouyuan; Gilbertson, Steve; Feng Ximao; Mashiko, Hiroki; Chang Zenghu

    2009-01-01

    The attosecond streaking technique is used for the characterization of isolated extreme ultraviolet (XUV) attosecond pulses. This type of measurement suffers from low photoelectron counts in the streaked spectrogram, and is thus susceptible to shot noise. For the retrieval of few- or mono-cycle attosecond pulses, high-intensity streaking laser fields are required, which cause the energy spectrum of above-threshold ionized (ATI) electrons to overlap with that of the streaked photoelectrons. It is found by using the principal component generalized projections algorithm that the XUV attosecond pulse can accurately be retrieved for simulated and experimental spectrograms with a peak value of 50 or more photoelectron counts. Also, the minimum streaking intensity is found to be more than 50 times smaller than that required by the classical streaking camera for retrieval of pulses with a spectral bandwidth supporting 90 as transform-limited pulse durations. Furthermore, spatial variation of the streaking laser intensity, collection angle of streaked electrons and time delay jitter between the XUV pulse and streaking field can degrade the quality of the streaked spectrogram. We find that even when the XUV and streaking laser focal spots are comparable in size, the streaking electrons are collected from a 4π solid angle, or the delay fluctuates by more than the attosecond pulse duration, the attosecond pulses can still be accurately retrieved. In order to explain the insusceptibility of the streaked spectrogram to these factors, the linearity of the streaked spectrogram with respect to the streaking field is derived under the saddle point approximation.

  11. Formal synthesis of application and platform behaviors of embedded software systems

    DEFF Research Database (Denmark)

    Kim, Jin Hyun; Kang, Inhye; Choi, Jin-Young

    2015-01-01

    Two main embedded software components, application software and platform software, i.e., the real-time operating system (RTOS), interact with each other in order to achieve the functionality of the system. However, they are so different in behaviors that one behavior modeling language is not sufficient to model both styles of behaviors and to reason about the characteristics of their individual behaviors as well as their parallel behavior and interaction properties. In this paper, we present a formal approach to the synthesis of the application software and the RTOS behavior models...

  12. Prioritizing the refactoring need for critical component using combined approach

    Directory of Open Access Journals (Sweden)

    Rajni Sehgal

    2018-10-01

    One of the most promising strategies for smoothing out the maintainability issues of software is refactoring. Due to the lack of a proper design approach, code often inherits bad smells, which may lead to improper functioning of the code, especially when it is subject to change and requires maintenance. Many studies have been performed to optimize the refactoring strategy, which is also a very expensive process. In this paper, a component-based system is considered, and a Fuzzy Multi Criteria Decision Making (FMCDM) model is proposed, combining subjective and objective weights to rank the components by their urgency of refactoring. The JDeodorant tool is used to detect the code smells in the individual components of a software system. The objective method uses the entropy approach to rank the components having code smells. The subjective method uses the Fuzzy TOPSIS approach, based on decision makers’ judgement, to identify the criticality and dependency of these code smells on the overall software. The suggested approach is implemented on a component-based software system having 15 components. The constituent components are ranked based on refactoring requirements.
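
    The entropy method used here for the objective weights can be sketched in a few lines; the decision matrix below (components as rows, code-smell counts as columns) is invented for illustration and is not the paper's data:

        import numpy as np

        def entropy_weights(X):
            """Objective criteria weights via the entropy method.
            X: m alternatives (rows) x n criteria (columns)."""
            P = X / X.sum(axis=0)                    # column-wise normalisation
            m = X.shape[0]
            with np.errstate(divide="ignore", invalid="ignore"):
                E = -(np.where(P > 0, P * np.log(P), 0.0)).sum(axis=0) / np.log(m)
            d = 1.0 - E                              # degree of divergence per criterion
            return d / d.sum()

        # Illustrative matrix: 4 components x 3 code-smell counts (not the paper's data).
        X = np.array([[12, 3, 0],
                      [ 4, 9, 2],
                      [ 7, 1, 5],
                      [ 2, 2, 1]], dtype=float)
        print(entropy_weights(X).round(3))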

  13. Using TinyOS Components for the Design of an Adaptive Ubiquitous System

    NARCIS (Netherlands)

    Kaya, O.S.; Durmaz, O.; Dulman, S.O.; Gemesi, R.; Jansen, P.G.; Havinga, Paul J.M.

    2005-01-01

    This work is an initiative attempt toward component-based software engineering in ubiquitous computing systems. Software components cooperate in a distributed manner to meet a demand, and adapt their software bindings during run-time depending on the context information. There are two main research

  14. Using TinyOS Components for the Design of an Adaptive Ubiquitous System

    NARCIS (Netherlands)

    Kaya, O.S.; Durmaz, O.; Dulman, S.O.; Gemesi, R.; Jansen, P.G.; Havinga, Paul J.M.

    This work is an initiative attempt toward component-based software engineering in ubiquitous computing systems. Software components cooperate in a distributed manner to meet a demand, and adapt their software bindings during run-time depending on the context information. There are two main research

  15. Retrieval-practice task affects relationship between working memory capacity and retrieval-induced forgetting.

    Science.gov (United States)

    Storm, Benjamin C; Bui, Dung C

    2016-11-01

    Retrieving a subset of items from memory can cause forgetting of other items in memory, a phenomenon referred to as retrieval-induced forgetting (RIF). Individuals who exhibit greater amounts of RIF have been shown to also exhibit superior working memory capacity (WMC) and faster stop-signal reaction times (SSRTs), results which have been interpreted as suggesting that RIF reflects an inhibitory process that is mediated by the processes of executive control. Across four experiments, we sought to further elucidate this issue by manipulating the way in which participants retrieved items during retrieval practice and examining how the resulting effects of forgetting correlated with WMC (Experiments 1-3) and SSRT (Experiment 4). Significant correlations were observed when participants retrieved items from an earlier study phase (within-list retrieval practice), but not when participants generated items from semantic memory (extra-list retrieval practice). These results provide important new insight into the role of executive-control processes in RIF.

  16. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling, two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
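
    As a minimal sketch of the idea behind identification and mitigation trees, the toy rules below match invented data-flow-diagram elements to threats and mitigation advice; AutSEC's real tree structures and element model are considerably richer:

        # Toy identification "tree": predicates over data-flow-diagram elements mapped
        # to threats and advice (element fields and rules are invented for illustration).
        dfd = [
            {"type": "data_flow", "encrypted": False, "crosses_trust_boundary": True},
            {"type": "data_store", "contains_credentials": True, "hashed": False},
        ]

        rules = [
            (lambda e: e["type"] == "data_flow" and not e["encrypted"]
                       and e["crosses_trust_boundary"],
             "information disclosure: plaintext flow across trust boundary",
             "mitigation: apply TLS to the channel"),
            (lambda e: e["type"] == "data_store" and e.get("contains_credentials")
                       and not e.get("hashed"),
             "credential theft: unhashed secrets at rest",
             "mitigation: hash credentials with a salted KDF"),
        ]

        for element in dfd:
            for predicate, threat, advice in rules:
                if predicate(element):
                    print(threat, "->", advice)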

  17. Optimal structure of fault-tolerant software systems

    International Nuclear Information System (INIS)

    Levitin, Gregory

    2005-01-01

    This paper considers software systems consisting of fault-tolerant components. These components are built from functionally equivalent but independently developed versions characterized by different reliability and execution time. Because of hardware resource constraints, the number of versions that can run simultaneously is limited. The expected system execution time and its reliability (defined as the probability of obtaining the correct output within a specified time) strictly depend on the parameters of the software versions and the sequence of their execution. The system structure optimization problem is formulated in which one has to choose software versions for each component and find the sequence of their execution in order to achieve the greatest system reliability subject to cost constraints. The versions are to be chosen from a list of available products. Each version is characterized by its reliability, execution time and cost. The suggested optimization procedure is based on an algorithm for determining the system execution time distribution that uses the moment generating function approach, and on the genetic algorithm. Both N-version programming and the recovery block scheme are considered within a universal model. An illustrative example is presented.
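
    Setting aside execution time and the genetic algorithm, the core reliability-versus-cost trade-off can be illustrated with a brute-force toy for a single N-version component (independent version failures assumed, as in this model family; all numbers are invented):

        from itertools import combinations

        # Candidate versions for one component: (reliability, cost).
        versions = [(0.90, 10), (0.85, 7), (0.75, 4), (0.70, 3)]
        BUDGET = 14

        def nvp_reliability(subset):
            """N-version style estimate: the component fails only if every
            selected version fails (version failures assumed independent)."""
            p_fail = 1.0
            for r, _ in subset:
                p_fail *= (1.0 - r)
            return 1.0 - p_fail

        best = max(
            (s for k in range(1, len(versions) + 1)
               for s in combinations(versions, k)
               if sum(c for _, c in s) <= BUDGET),
            key=nvp_reliability)
        print(best, round(nvp_reliability(best), 4))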

  18. In-field inspection support software: A status report on the Common Inspection On-site Software Package (CIOSP) project

    International Nuclear Information System (INIS)

    Novatchev, Dimitre; Titov, Pavel; Siradjov, Bakhtiiar; Vlad, Ioan; Xiao Jing

    2001-01-01

    Full text: IAEA has invested much thought and effort into developing software that can assist inspectors during their inspection work. Experience with such applications has been steadily growing, and IAEA has recently commissioned a next-generation software package. Such software must accommodate inspection tasks that vary substantially in function depending on the type of installation being inspected, while ensuring that the resulting package has a wide range of usability and precludes excessive development of plant-specific applications. The Common Inspection On-site Software Package is being developed in the Department of Safeguards to address the limitations of the existing software and to expand its coverage of the inspection process. CIOSP is 'common' in that it is aimed at providing support for as many facilities as possible with minimum re-configuration. At the same time it has to cater to the varying needs of individual facilities and the different instrumentation and verification methods used. A component-based approach was taken to successfully tackle the challenges that the development of this software presented. CIOSP consists of the following major components: a framework into which individual plug-ins supporting various inspection activities can integrate at run-time; a central data store containing all facility configuration data and all data collected during inspections; a local data store, which resides on the inspector's computer, where the current inspection's data is stored; and a set of services used by all plug-ins (i.e. data transformation, authentication, replication services, etc.). This architecture allows for incremental development and extension of the software with plug-ins that support individual inspection activities. The core set of components, along with the framework and the Inventory Verification, Book Examination, and Records and Reports Comparison plug-ins, have been developed. The development of the Short Notice Random

  19. Agile Software Development in Defense Acquisition: A Mission Assurance Perspective

    Science.gov (United States)

    2012-03-23

    ...based information retrieval system, we might say that this program works like a hive of bees, going out for pollen and bringing it back to the hive... developers... Six Sigma is registered in the U.S. Patent and Trademark Office by Motorola... Major Areas in a Typical Software... requirements: capturing and evaluating quality metrics, identifying common problem areas... Despite its positive impact on quality, pair programming...

  20. High-level Component Interfaces for Collaborative Development: A Proposal

    Directory of Open Access Journals (Sweden)

    Thomas Marlowe

    2009-12-01

    Software development has rapidly moved toward collaborative development models where multiple partners collaborate in creating and evolving software-intensive systems or components of sophisticated ubiquitous socio-technical ecosystems. In this paper we extend the concept of a software interface to a flexible high-level interface as a means for accommodating change and for localizing, controlling and managing the exchange of knowledge and of functional, behavioral, quality, project- and business-related information between the partners and between the developed components.

  1. Development of internet-based information systems using software components with the emphasis on the application in the military organization

    Directory of Open Access Journals (Sweden)

    Miloš J. Pejanović

    2011-01-01

    The development of personal computers and Internet technology causes continuous changes in methodological approaches to and concepts of information systems development. Most existing information systems, due to their heterogeneity, have a problem with the integration of subsystems. In order to overcome this problem, software vendors offer different solutions. In this work we explore different approaches and propose an optimal one, with special emphasis on its application in the military organization. By applying modern approaches to the development of information systems based on the concept of distributed component systems, we arrive at a set of proposed solutions from different manufacturers. The solutions concern the mechanisms which should ensure that components written in different languages cooperate with each other in heterogeneous systems residing in different nodes of a computer network. This work describes the concept of component-based distributed information systems built on Internet technology, describes their capabilities, and offers a solution specifying the implementation environment in the military organization. Approaches to the development of information systems: In the development of information systems, an important role is given to the choice of appropriate methods and tools. For large systems such as military organizations, standardized procedures and methodologies for the development of information systems are recommended. There are different methodological approaches to the development of information systems: a systematic integrated approach to development (from design and implementation to deployment and maintenance), and the development of information systems as technical-technological structures (standard computer and network services). The combination of these two approaches leads to the concept of 'open systems' that allow different standards and IT services to operate on these systems. The UML system description of the process of software

  2. Retrieving simulated volcanic, desert dust and sea-salt particle properties from two/three-component particle mixtures using UV-VIS polarization lidar and T matrix

    Directory of Open Access Journals (Sweden)

    G. David

    2013-07-01

    During transport by advection, atmospheric nonspherical particles, such as volcanic ash, desert dust or sea-salt particles, experience several chemical and physical processes, leading to a complex vertical atmospheric layering at remote sites where intrusion episodes occur. In this paper, a new methodology is proposed to analyse this complex vertical layering in the case of two/three-component particle external mixtures. This methodology relies on an analysis of the spectral and polarization properties of the light backscattered by atmospheric particles. It is based on combining a sensitive and accurate UV-VIS polarization lidar experiment with T-matrix numerical simulations and air mass back trajectories. The Lyon UV-VIS polarization lidar is used to efficiently partition the particle mixture into its nonspherical components, while the T-matrix method is used for simulating the backscattering and depolarization properties of nonspherical volcanic ash, desert dust and sea-salt particles. It is shown that the particle mixtures' depolarization ratio δp differs from the nonspherical particles' depolarization ratio δns due to the presence of spherical particles in the mixture. Hence, after identifying a tracer for nonspherical particles, particle backscattering coefficients specific to each nonspherical component can be retrieved in a two-component external mixture. For three-component mixtures, the spectral properties of light must in addition be exploited by using a dual-wavelength polarization lidar. Hence, for the first time, in a three-component external mixture, the nonsphericity of each particle is taken into account in a so-called 2β + 2δ formalism. Applications of this new methodology are then demonstrated in two case studies carried out in Lyon, France, related to the mixing of Eyjafjallajökull volcanic ash with sulfate particles (a case of a two-component mixture) and to the mixing of dust with sea-salt and water-soluble particles
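
    A standard two-component separation from the general polarization-lidar literature illustrates how a depolarization tracer partitions particle backscatter into spherical and nonspherical parts; the endmember values below are invented, and this sketch is not the paper's full 2β + 2δ formalism:

        def nonspherical_backscatter(beta_p, delta_p, delta_ns, delta_s):
            """Partition particle backscatter beta_p into its nonspherical part,
            given the measured particle depolarization ratio delta_p and assumed
            intrinsic ratios for the nonspherical (delta_ns) and spherical
            (delta_s) endmembers. Common two-component lidar separation."""
            return beta_p * ((delta_p - delta_s) * (1 + delta_ns)) / (
                            (delta_ns - delta_s) * (1 + delta_p))

        # Illustrative values: measured delta_p = 0.15, ash-like delta_ns = 0.35,
        # spherical sulfate delta_s = 0.02, beta_p in arbitrary units.
        beta_ns = nonspherical_backscatter(1.0, 0.15, 0.35, 0.02)
        print(round(beta_ns, 3), round(1.0 - beta_ns, 3))  # nonspherical vs spherical share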

  3. Quantitative Microbial Risk Assessment Tutorial Installation of Software for Watershed Modeling in Support of QMRA - Updated 2017

    Science.gov (United States)

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • QMRA Installation • SDMProjectBuilder (which includes the Microbial ...

  4. Cost estimation in software engineering projects with web components development

    Directory of Open Access Journals (Sweden)

    Javier de Andrés

    2015-01-01

    There is a multitude of models proposed for cost prediction in software projects, some oriented specifically to Web projects. This work analyzes whether specific models for Web projects are justified, by examining the differential behavior of costs between Web and non-Web software development projects. Two aspects of cost estimation are analyzed: diseconomies of scale, and the impact of certain characteristics of these projects that are used as cost drivers. Two hypotheses are stated: (a) in these projects the diseconomies of scale are greater, and (b) the cost increase caused by the cost drivers is smaller for Web projects. These hypotheses were tested by analyzing a set of real projects. The results suggest that both hypotheses hold. Therefore, the main contribution of this research to the literature is that the development of models specific to Web projects is justified.

  5. Why Free Software Matters for Literacy Educators.

    Science.gov (United States)

    Brunelle, Michael D.; Bruce, Bertram C.

    2002-01-01

    Notes that understanding what "free software" means and its implications for access and use of new technologies is an important component of the new literacies. Concludes that if free speech and free press are essential to the development of a general literacy, then free software can promote the development of computer literacy. (SG)

  6. Can We Retrieve the Information Which Was Intentionally Forgotten? Electrophysiological Correlates of Strategic Retrieval in Directed Forgetting

    Directory of Open Access Journals (Sweden)

    Xinrui Mao

    2017-08-01

    The retrieval inhibition hypothesis of directed forgetting effects assumes that TBF (to-be-forgotten) items are not retrieved intentionally, while the selective rehearsal hypothesis assumes that the memory representation of retrieved TBF items is weaker than that of TBR (to-be-remembered) items. Previous studies indicated that directed forgetting effects in the item-cueing method result from selective rehearsal at encoding, but the mechanism of retrieval inhibition that affects directed forgetting of TBF items is not clear. Strategic retrieval is a control process allowing the selective retrieval of target information, which includes retrieval orientation and strategic recollection. Retrieval orientation, via the comparison of tasks, refers to the specific form of processing resulting from retrieval efforts. Strategic recollection is the type of strategy used to recollect studied items for the retrieval success of targets. Using a "directed forgetting" paradigm combined with a memory exclusion task, our investigation of strategic retrieval in directed forgetting helped to explore how retrieval inhibition plays a role in directed forgetting effects. When TBF items were targeted, retrieval orientation showed more positive ERPs to new items, indicating that TBF items demanded more retrieval effort. The results for strategic recollection indicated that: (a) when TBR items were retrieval targets, late parietal old/new effects were evoked only by TBR items and not by TBF items, indicating retrieval inhibition of TBF items; (b) when TBF items were retrieval targets, the late parietal old/new effect was evoked by both TBR items and TBF items, indicating that strategic retrieval could overcome retrieval inhibition of TBF items. These findings suggest a modulation of strategic retrieval on retrieval inhibition in directed forgetting, supporting the view that directed forgetting effects are caused not only by selective rehearsal, but also retrieval

  7. A Conceptual Framework for Lean Regulated Software Development

    DEFF Research Database (Denmark)

    Cawley, Oisin; Richardson, Ita; Wang, Xiaofeng

    2015-01-01

    for software development within a regulated environment? This poster presents the results of our empirical research into lean and regulated software development. Built from a combination of data sources, we have developed a conceptual framework comprising five primary components. In addition, the relationships they have with both the central focus of the framework (the situated software development practices) and with each other are indicated.

  8. MEGACELL: A nanocrystal model construction software for HRTEM multislice simulation

    International Nuclear Information System (INIS)

    Stroppa, Daniel G.; Righetto, Ricardo D.; Montoro, Luciano A.; Ramirez, Antonio J.

    2011-01-01

    Image simulation has an invaluable importance for the accurate analysis of High Resolution Transmission Electron Microscope (HRTEM) results, especially due to its non-linear image formation mechanism. Because the as-obtained images cannot be interpreted in a straightforward fashion, the retrieval of both qualitative and quantitative information from HRTEM micrographs requires an iterative process including the simulation of a nanocrystal model and its comparison with experimental images. However most of the available image simulation software requires atom-by-atom coordinates as input for the calculations, which can be prohibitive for large finite crystals and/or low-symmetry systems and zone axis orientations. This paper presents an open source citation-ware tool named MEGACELL, which was developed to assist on the construction of nanocrystal models. It allows the user to build nanocrystals with virtually any convex polyhedral geometry and to retrieve its atomic positions either as a plain text file or as an output compatible with EMS (Electron Microscopy Software) input protocol. In addition to the description of this tool features, some construction examples and its application for scientific studies are presented. These studies show MEGACELL as a handy tool, which allows an easier construction of complex nanocrystal models and improves the quantitative information extraction from HRTEM images. Highlights: (1) a software tool to support the HRTEM image simulation of nanocrystals in actual size; (2) MEGACELL allows the construction of complex nanocrystal models for multislice image simulation; (3) examples of improved nanocrystalline system characterization are presented, including the analysis of 3D morphology and growth behavior.

  9. Survey package: Technical and contracting strategies for single-shell tank waste retrieval on the Hanford Site

    International Nuclear Information System (INIS)

    Ramsower, D.C.

    1995-01-01

    Westinghouse Hanford Company is interested in innovative, commercially available or adaptable retrieval system equipment, concepts, and contracting strategies that will add to existing Hanford Site technology and significantly reduce cost and/or risk from the baseline retrieval approach of sluicing (hydraulically mining) the waste from the SSTs onsite. The objective of this request is to gather information from industry to identify and summarize a suite of retrieval-related components, systems, and contracting approaches. This information will be used to ensure that WHC understands the various waste retrieval alternative approaches, their risks, and their application on the Hanford Site tanks for those occasions when sluicing is not sufficiently effective, appropriate, or cost-effective. An additional objective is to facilitate industry's understanding of the tank and site interface requirements for SST waste retrieval and the complex statutory, legal, regulatory, labor, and other institutional standards being applied to the Hanford Site. This effort will identify and summarize retrieval solutions by the end of September 1996 so that a clear basis for future retrieval program decisions can be established

  10. Framework Programmable Platform for the Advanced Software Development Workstation: Preliminary system design document

    Science.gov (United States)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, John W., IV; Henderson, Richard; Futrell, Michael T.

    1991-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The focus here is on the design of components that make up the FPP. These components serve as supporting systems for the Integration Mechanism and the Framework Processor and provide the 'glue' that ties the FPP together. Also discussed are the components that allow the platform to operate in a distributed, heterogeneous environment and to manage the development and evolution of software system artifacts.

  11. Co-sourcing in software development offshoring

    DEFF Research Database (Denmark)

    Schlichter, Bjarne Rerup; Persson, John Stouby

    2013-01-01

    Software development projects are increasingly geographically distributed with offshoring, which introduces complex risks that can lead to project failure. Co-sourcing is a highly integrative and cohesive approach to software development offshoring that has been seen to be successful. However, research on how co-sourcing ... -taking by high attention to the closely interrelated structure and technology components in terms of CMMI and the actors’ cohesion and integration in terms of Scrum.

  12. Neural Mechanisms of Episodic Retrieval Support Divergent Creative Thinking.

    Science.gov (United States)

    Madore, Kevin P; Thakral, Preston P; Beaty, Roger E; Addis, Donna Rose; Schacter, Daniel L

    2017-11-17

    Prior research has indicated that brain regions and networks that support semantic memory, top-down and bottom-up attention, and cognitive control are all involved in divergent creative thinking. Kernels of evidence suggest that neural processes supporting episodic memory-the retrieval of particular elements of prior experiences-may also be involved in divergent thinking, but such processes have typically been characterized as not very relevant for, or even a hindrance to, creative output. In the present study, we combine functional magnetic resonance imaging with an experimental manipulation to test formally, for the first time, episodic memory's involvement in divergent thinking. Following a manipulation that facilitates detailed episodic retrieval, we observed greater neural activity in the hippocampus and stronger connectivity between a core brain network linked to episodic processing and a frontoparietal brain network linked to cognitive control during divergent thinking relative to an object association control task that requires little divergent thinking. Stronger coupling following the retrieval manipulation extended to a subsequent resting-state scan. Neural effects of the episodic manipulation were consistent with behavioral effects of enhanced idea production on divergent thinking but not object association. The results indicate that conceptual frameworks should accommodate the idea that episodic retrieval can function as a component process of creative idea generation, and highlight how the brain flexibly utilizes the retrieval of episodic details for tasks beyond simple remembering.

  13. Connectionist Interaction Information Retrieval.

    Science.gov (United States)

    Dominich, Sandor

    2003-01-01

    Discussion of connectionist views for adaptive clustering in information retrieval focuses on a connectionist clustering technique and activation spreading-based information retrieval model using the interaction information retrieval method. Presents theoretical as well as simulation results as regards computational complexity and includes…

  14. Noncompetitive retrieval practice causes retrieval-induced forgetting in cued recall but not in recognition.

    Science.gov (United States)

    Grundgeiger, Tobias

    2014-04-01

    Retrieving a subset of learned items can lead to the forgetting of related items. Such retrieval-induced forgetting (RIF) can be explained by the inhibition of irrelevant items in order to overcome retrieval competition when the target item is retrieved. According to the retrieval inhibition account, such retrieval competition is a necessary condition for RIF. However, research has indicated that noncompetitive retrieval practice can also cause RIF by strengthening cue-item associations. According to the strength-dependent competition account, the strengthened items interfere with the retrieval of weaker items, resulting in impaired recall of weaker items in the final memory test. The aim of this study was to replicate RIF caused by noncompetitive retrieval practice and to determine whether this forgetting is also observed in recognition tests. In the context of RIF, it has been assumed that recognition tests circumvent interference and, therefore, should not be sensitive to forgetting due to strength-dependent competition. However, this has not been empirically tested, and it has been suggested that participants may reinstate learned cues as retrieval aids during the final test. In the present experiments, competitive practice or noncompetitive practice was followed by either final cued-recall tests or recognition tests. In cued-recall tests, RIF was observed in both competitive and noncompetitive conditions. However, in recognition tests, RIF was observed only in the competitive condition and was absent in the noncompetitive condition. The result underscores the contribution of strength-dependent competition to RIF. However, recognition tests seem to be a reliable way of distinguishing between RIF due to retrieval inhibition or strength-dependent competition.

  15. The Software Architecture of Global Climate Models

    Science.gov (United States)

    Alexander, K. A.; Easterbrook, S. M.

    2011-12-01

    It has become common to compare and contrast the output of multiple global climate models (GCMs), such as in the Climate Model Intercomparison Project Phase 5 (CMIP5). However, intercomparisons of the software architecture of GCMs are almost nonexistent. In this qualitative study of seven GCMs from Canada, the United States, and Europe, we attempt to fill this gap in research. We describe the various representations of the climate system as computer programs, and account for architectural differences between models. Most GCMs now practice component-based software engineering, where Earth system components (such as the atmosphere or land surface) are present as highly encapsulated sub-models. This architecture facilitates a mix-and-match approach to climate modelling that allows for convenient sharing of model components between institutions, but it also leads to difficulty when choosing where to draw the lines between systems that are not encapsulated in the real world, such as sea ice. We also examine different styles of couplers in GCMs, which manage interaction and data flow between components. Finally, we pay particular attention to the varying levels of complexity in GCMs, both between and within models. Many GCMs have some components that are significantly more complex than others, a phenomenon which can be explained by the respective institution's research goals as well as the origin of the model components. In conclusion, although some features of software architecture have been adopted by every GCM we examined, other features show a wide range of different design choices and strategies. These architectural differences may provide new insights into variability and spread between models.
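
    As a toy sketch of the component-plus-coupler architecture described here, two encapsulated sub-models exchange fields only through a coupler object; the "physics" is a made-up relaxation used purely for illustration:

        class Atmosphere:
            def __init__(self):
                self.surface_temp = 288.0  # K
            def step(self, sst):
                # Toy physics: relax air temperature toward the sea surface temperature.
                self.surface_temp += 0.1 * (sst - self.surface_temp)
                return self.surface_temp

        class Ocean:
            def __init__(self):
                self.sst = 290.0  # K
            def step(self, air_temp):
                self.sst += 0.05 * (air_temp - self.sst)
                return self.sst

        class Coupler:
            """Owns all inter-component data flow; the components never see each other."""
            def __init__(self, atm, ocn):
                self.atm, self.ocn = atm, ocn
            def run(self, n_steps):
                sst, air = self.ocn.sst, self.atm.surface_temp
                for _ in range(n_steps):
                    air = self.atm.step(sst)
                    sst = self.ocn.step(air)
                return air, sst

        print(Coupler(Atmosphere(), Ocean()).run(100))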

  16. [Competition-dependence on retrieval-induced forgetting: the influence of the amount of retrieval cues].

    Science.gov (United States)

    Yamada, Yohei; Tsukimoto, Takashi; Hirano, Tetsuji

    2010-02-01

    Remembering some of the studied (target) items impairs subsequent remembrance of relevant (non-target) items. This phenomenon, retrieval-induced forgetting (RIF), occurs when non-targets actively compete with the retrieval of a target. Researchers suggest that suppression mechanisms reduce interference from relevant items to facilitate the retrieval of target items (Anderson, 2003). Competition-dependence is one of the properties that support the suppression hypothesis (Anderson, Bjork, & Bjork, 1994). In the present study, we manipulated the type of retrieval practice (normal, last-letter, or category-name) in order to vary the degree of competition between the target and the non-targets. For the high-scoring retrieval practice group, RIF occurred in the normal retrieval condition, but not in the last-letter or in the category-name conditions. For the low-scoring retrieval practice group, RIF did not occur in any of the conditions. These findings provide new evidence that the occurrence of RIF depends on the degree of competition between a target item and related non-target items during retrieval practice.

  17. SDI Software Technology Program Plan Version 1.5

    Science.gov (United States)

    1987-06-01

    Display Generator (SDG) [Patterson 83]: SDG supports the creation, display, modification, storage, and retrieval of components of simulation models via... COMPASS '86, Washington D.C., July 7-11, 1986. [Parnas 85] Parnas, D.L., and Weiss, D.M., Active Design Reviews: Principles and Practices, NRL Report

  18. Retrieval of Electron Density Profile for KOMPSAT-5 GPS Radio Occultation

    Directory of Open Access Journals (Sweden)

    Woo-Kyoung Lee

    2007-12-01

    Full Text Available The AOPOD (Atmosphere Occultation and Precision Orbit Determination system, the secondary payload of KOMPSAT (KOrea Multi-Purpose SATellite-5 scheduled to be launched in 2010, shall provide GPS radio occultation data. In this paper, we simulated the GPS radio occultation characteristic of KOMPSAT-5 and retrieved electron density profiles using KROPS (KASI Radio Occultation Processing Software. The electron density retrieved from CHAMP (CHAllenging Minisatellite Payload GPS radio occultation data on June 20, 2004 was compared with IRI (International Reference Ionosphere - 2001, PLP (Planar Langmuir Probe, and ionosonde measurements. When the result was compared with ionosonde measurements, the discrepancies were 5 km on the F_2 peak height (hmF_2 and 3×10^{10} el/m^3 on the electron density of the F_2 peak height (NmF_2. By comparing with the Langmuir Probe measurements of CHAMP satellite (PLP, both agrees with 1.6×10^{11} el/m^3 at the height of 365.6 km.

  19. SOFTWARE IN TOURISM INDUSTRY : A Study On Emerging New Niches Of Software In Hotel Industry

    OpenAIRE

    Regmi, Krishna Kumar; Thapa, Bikesh

    2010-01-01

    This study was structured as part of a Bachelor's degree thesis in the Tourism Degree Programme at Laurea University of Applied Sciences. The study examines the role of software as a major component of ICTs (Information and Communication Technologies) in the hotel industry in Finland. The study was conducted in three major hotel chains in Finland in order to identify the scope and possibility of developing a new software module within the periphery of contemporary Property Management Systems (PMS). Marke...

  20. Translator for Optimizing Fluid-Handling Components

    Science.gov (United States)

    Landon, Mark; Perry, Ernest

    2007-01-01

    A software interface has been devised to facilitate optimization of the shapes of valves, elbows, fittings, and other components used to handle fluids under extreme conditions. This software interface translates data files generated by PLOT3D (a NASA grid-based plotting-and-data-display program) and by computational fluid dynamics (CFD) software into a format in which the files can be read by Sculptor, which is a shape-deformation-and-optimization program. Sculptor enables the user to interactively, smoothly, and arbitrarily deform the surfaces and volumes in two- and three-dimensional CFD models. Sculptor also includes design-optimization algorithms that can be used in conjunction with the arbitrary-shape-deformation components to perform automatic shape optimization. In the optimization process, the output of the CFD software is used as feedback while the optimizer strives to satisfy design criteria that could include, for example, improved values of pressure loss, velocity, flow quality, mass flow, etc.

  1. Creating and Testing Simulation Software

    Science.gov (United States)

    Heinich, Christina M.

    2013-01-01

    The goal of this project is to learn about the software development process, specifically the process of testing and fixing components of the software. The paper will cover the techniques of testing code, and the benefits of using one style of testing over another. It will also discuss the overall software design and development lifecycle, and how code testing plays an integral role in it. Coding is notorious for always needing to be debugged due to coding errors or faulty program design. Writing tests, either before or during program creation, that cover all aspects of the code provides a relatively easy way to locate and fix errors, which will in turn decrease the need to fix a program after it is released for common use. The backdrop for this paper is the Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI), a project whose goal is to simulate a launch using simulated models of the ground systems and the connections between them and the control room. The simulations will be used for training and to ensure that all possible outcomes and complications are prepared for before the actual launch day. The code being tested is the Programmable Logic Controller Interface (PLCIF) code, the component responsible for transferring the information from the models to the model Programmable Logic Controllers (PLCs), basic computers that are used for very simple tasks.
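
    As a minimal sketch of the test-first style discussed here, the unittest case below covers a hypothetical unit-conversion helper of the kind that interface code such as PLCIF might contain (the function and its rules are invented for illustration):

        import unittest

        def scale_reading(raw, gain, offset=0.0):
            """Hypothetical helper: convert a raw sensor count into engineering units."""
            if gain == 0:
                raise ValueError("gain must be nonzero")
            return raw * gain + offset

        class TestScaleReading(unittest.TestCase):
            def test_nominal(self):
                self.assertAlmostEqual(scale_reading(100, 0.5, offset=2.0), 52.0)

            def test_zero_gain_rejected(self):
                with self.assertRaises(ValueError):
                    scale_reading(100, 0)

        if __name__ == "__main__":
            unittest.main()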

  2. Retrieving self-vocalized information: An event-related potential (ERP) study on the effect of retrieval orientation.

    Science.gov (United States)

    Rosburg, Timm; Johansson, Mikael; Sprondel, Volker; Mecklinger, Axel

    2014-11-18

    Retrieval orientation refers to a pre-retrieval process and conceptualizes the specific form of processing that is applied to a retrieval cue. In the current event-related potential (ERP) study, we sought to find evidence for an involvement of the auditory cortex when subjects attempt to retrieve vocalized information, and hypothesized that adopting retrieval orientation would be beneficial for retrieval accuracy. During study, participants saw object words that they subsequently vocalized or visually imagined. At test, participants had to identify object names of one study condition as targets and to reject object names of the second condition together with new items. Target category switched after half of the test trials. Behaviorally, participants responded less accurately and more slowly to targets of the vocalize condition than to targets of the imagine condition. ERPs to new items varied at a single left electrode (T7) between 500 and 800ms, indicating a moderate retrieval orientation effect in the subject group as a whole. However, whereas the effect was strongly pronounced in participants with high retrieval accuracy, it was absent in participants with low retrieval accuracy. A current source density (CSD) mapping of the retrieval orientation effect indicated a source over left temporal regions. Independently from retrieval accuracy, the ERP retrieval orientation effect was surprisingly also modulated by test order. Findings are suggestive for an involvement of the auditory cortex in retrieval attempts of vocalized information and confirm that adopting retrieval orientation is potentially beneficial for retrieval accuracy. The effects of test order on retrieval-related processes might reflect a stronger focus on the newness of items in the more difficult test condition when participants started with this condition. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. Harnessing user generated multimedia content in the creation of collaborative classification structures and retrieval learning games

    Science.gov (United States)

    Borchert, Otto Jerome

    This paper describes a software tool to assist groups of people in the classification and identification of real world objects called the Classification, Identification, and Retrieval-based Collaborative Learning Environment (CIRCLE). A thorough literature review identified current pedagogical theories that were synthesized into a series of five tasks: gathering, elaboration, classification, identification, and reinforcement through game play. This approach is detailed as part of an included peer reviewed paper. Motivation is increased through the use of formative and summative gamification; getting points completing important portions of the tasks and playing retrieval learning based games, respectively, which is also included as a peer-reviewed conference proceedings paper. Collaboration is integrated into the experience through specific tasks and communication mediums. Implementation focused on a REST-based client-server architecture. The client is a series of web-based interfaces to complete each of the tasks, support formal classroom interaction through faculty accounts and student tracking, and a module for peers to help each other. The server, developed using an in-house JavaMOO platform, stores relevant project data and serves data through a series of messages implemented as a JavaScript Object Notation Application Programming Interface (JSON API). Through a series of two beta tests and two experiments, it was discovered the second, elaboration, task requires considerable support. While students were able to properly suggest experiments and make observations, the subtask involving cleaning the data for use in CIRCLE required extra support. When supplied with more structured data, students were enthusiastic about the classification and identification tasks, showing marked improvement in usability scores and in open ended survey responses. CIRCLE tracks a variety of educationally relevant variables, facilitating support for instructors and researchers. Future

  4. Safety equipment list for 241-C-106 waste retrieval, Project W-320: Revision 1

    International Nuclear Information System (INIS)

    Conner, J.C.

    1994-01-01

    The goals of the C-106 sluicing operation are: (1) to stabilize the tank by reducing the heat load in the tank to less than 42 MJ/hr (40,000 Btu/hour), and (2) to initiate demonstration of single-shell tank (SST) retrieval technology. The purpose of this supporting document (SD) is as follows: (1) to provide safety classifications for items (systems, structures, equipment, components, or parts) for the waste retrieval sluicing system (WRSS), and (2) to document the methodology used to develop safety classifications. Appropriate references are made with regard to the use of existing systems, structures, equipment, components, and parts for the C-106 single-shell transfer tank located in the C Tank Farm, and the 241-AY-102 (AY-102) double-shell receiver tank (DST) located in the Aging Waste Facility (AWF). The Waste Retrieval Sluicing System consists of two transfer lines that would connect the two tanks, one to carry the sluiced waste slurry to AY-102, and the other to return the supernatant liquid to C-106. The supernatant, or alternate fluid, will be used to mobilize waste in C-106 for the sluicing process. The equipment necessary for the WRSS includes pumps in each tank, sluicers to direct the supernatant stream in C-106, a slurry distributor in AY-102, HVAC for C-106, instrumentation and control devices, and other existing components as required

  5. Beyond information retrieval: information discovery and multimedia information retrieval

    OpenAIRE

    Roberto Raieli

    2017-01-01

    The paper compares the current methodologies for the search and discovery of information and information resources: terminological search and term-based language, characteristic of information retrieval (IR); semantic search and information discovery, being developed mainly through the language of linked data; and semiotic search and content-based language, explored by multimedia information retrieval (MIR). The semiotic methodology of MIR is then detailed.

  6. Development of a smart-antenna test-bed, demonstrating software defined digital beamforming

    NARCIS (Netherlands)

    Kluwer, T.; Slump, Cornelis H.; Schiphorst, Roelof; Hoeksema, F.W.

    2001-01-01

    This paper describes a smart-antenna test-bed consisting of ‘commercial off-the-shelf’ (COTS) hardware and software defined radio components. The use of software radio components enables a flexible platform to implement and test mobile communication systems as a real-world system. The test-bed is

  7. Geological Disposal of Radioactive Waste: Technological Implications for Retrievability

    International Nuclear Information System (INIS)

    2009-01-01

    Various IAEA Member States are discussing whether and to what degree reversibility (including retrievability) might be built into management strategies for radioactive waste. This is particularly the case in relation to the disposal of long lived and/or high level waste and spent nuclear fuel (SNF) in geological repositories. It is generally accepted that such repositories should be designed to be passively safe with no intention of retrieving the waste. Nevertheless, various reasons have been advanced for including the concept of reversibility and the ability to retrieve the emplaced wastes in the disposal strategy. The intention is to increase the level of flexibility and to provide the ability to cope with, or to benefit from, new technical advances in waste management and materials technologies, and to respond to changing social, economic and political opinion. The technological implications of retrievability in geological disposal concepts are explored in this report. Scenarios for retrieving emplaced waste packages are considered and the report aims to identify and describe any related technological provisions that should be incorporated into the design, construction, operational and closure phases of the repository. This is based on a number of reference concepts for the geological disposal of radioactive waste (including SNF) which are currently being developed in Member States with advanced development programmes. The report begins with a brief overview of various repository concepts, starting with a summary of the types of radioactive waste that are typically considered for deep geological disposal. The main host rocks considered are igneous crystalline and volcanic rocks, argillaceous clay rocks and salts. The typical design features of repositories are provided with a description of repository layouts, an overview of the key features of the major repository components, comprising the waste package, the emplacement cells and repository access facilities

  8. Component Composability Issues in Object-Oriented Programming

    NARCIS (Netherlands)

    Aksit, Mehmet; Tekinerdogan, B.

    1997-01-01

    Building software from reusable components is considered important in reducing development costs. Object-oriented languages such as C++, Smalltalk and Java, however, are not capable of expressing certain aspects of applications in a composable way. Software engineers may experience difficulties in

  9. Year 2000 compliance concerns with the ISA Thermoluminescent Dosimetry Data Processing (TL-DP) software system

    Energy Technology Data Exchange (ETDEWEB)

    Saviz, K.

    1998-05-26

    The year 2000 is rapidly approaching, and there is a good chance that computer systems that utilize two-digit year dates will experience problems in the retrieval of date information. The ISA Thermoluminescent Dosimetry Data Processing (TL-DP) software and computer system has been reviewed for Year 2000 compliance issues.

  10. Year 2000 compliance concerns with the ISA Thermoluminescent Dosimetry Data Processing (TL-DP) software system

    International Nuclear Information System (INIS)

    Saviz, K.

    1998-01-01

    The year 2000 is rapidly approaching, and there is a good chance that computer systems that utilize two-digit year dates will experience problems in the retrieval of date information. The ISA Thermoluminescent Dosimetry Data Processing (TL-DP) software and computer system has been reviewed for Year 2000 compliance issues

  11. Next-generation business intelligence software with Silverlight 3

    CERN Document Server

    Czernicki, Bart

    2010-01-01

    Business Intelligence (BI) software is the code and tools that allow you to view different components of a business using a single visual platform, making comprehending mountains of data easier. Applications that include reports, analytics, statistics, and historical and predictive modeling are all examples of BI applications. Currently, we are in the second generation of BI software, called BI 2.0. This generation is focused on writing BI software that is predictive, adaptive, simple, and interactive. As computers and software have evolved, more data can be presented to end users with increas

  12. The Use of Utility Accounting Software at Miami University.

    Science.gov (United States)

    Wenner, Paul

    1999-01-01

    Describes how Miami University successfully developed an accounting software package that tracked and recorded their utility usage, including examples of its graphics and reporting components. Background information examining the decision to pursue an energy management software package is included. (GR)

  13. The ATLAS Data Management Software Engineering Process

    CERN Document Server

    Lassnig, M; The ATLAS collaboration; Stewart, G A; Barisits, M; Beermann, T; Vigne, R; Serfon, C; Goossens, L; Nairz, A; Molfetas, A

    2014-01-01

    Rucio is the next-generation data management system of the ATLAS experiment. The software engineering process to develop Rucio is fundamentally different to existing software development approaches in the ATLAS distributed computing community. Based on a conceptual design document, development takes place using peer-reviewed code in a test-driven environment. The main objectives are to ensure that every engineer understands the details of the full project, even components usually not touched by them, that the design and architecture are coherent, that temporary contributors can be productive without delay, that programming mistakes are prevented before being committed to the source code, and that the source is always in a fully functioning state. This contribution will illustrate the workflows and products used, and demonstrate the typical development cycle of a component from inception to deployment within this software engineering process. Next to the technological advantages, this contribution will also highlight the social aspects of an environment where every action is subject to detailed scrutiny.

  14. The ATLAS Data Management Software Engineering Process

    CERN Document Server

    Lassnig, M; The ATLAS collaboration; Stewart, G A; Barisits, M; Beermann, T; Vigne, R; Serfon, C; Goossens, L; Nairz, A

    2013-01-01

    Rucio is the next-generation data management system of the ATLAS experiment. The software engineering process to develop Rucio is fundamentally different to existing software development approaches in the ATLAS distributed computing community. Based on a conceptual design document, development takes place using peer-reviewed code in a test-driven environment. The main objectives are to ensure that every engineer understands the details of the full project, even components usually not touched by them, that the design and architecture are coherent, that temporary contributors can be productive without delay, that programming mistakes are prevented before being committed to the source code, and that the source is always in a fully functioning state. This contribution will illustrate the workflows and products used, and demonstrate the typical development cycle of a component from inception to deployment within this software engineering process. Next to the technological advantages, this contribution will also highlight the social aspects of an environment where every action is subject to detailed scrutiny.

  15. Development of a fatigue analysis software system

    International Nuclear Information System (INIS)

    Choi, B. I.; Lee, H. J.; Han, S. W.; Kim, J. Y.; Hwang, K. H.; Kang, J. Y.

    2001-01-01

    A general-purpose fatigue analysis software package to predict the fatigue lives of mechanical components and structures was developed. This software has some characteristic features, including functions for searching weak regions on the free surface in order to reduce computing time significantly, a database of fatigue properties for various materials, and an expert system which can assist users in obtaining more appropriate results. The software can be used in environments consisting of commercial finite element packages. Using the software, fatigue analyses for a SAE keyhole specimen and an automobile knuckle were carried out. The results were observed to agree well with those from commercial packages

  16. Data storage and retrieval system abstract

    Science.gov (United States)

    Matheson, Barbara

    1992-09-01

    The STX mass storage system design is intended for environments requiring high speed access to large volumes of data (terabyte and greater). Prior to commitment to a product design plan, STX conducted an exhaustive study of the commercially available off-the-shelf hardware and software. STX also conducted research into the area of emerging technologies in networks and storage media so that the design could easily accommodate new interfaces and peripherals as they came on the market. All the selected system elements were brought together in a demo suite sponsored jointly by STX and ALLIANT, where the system elements were evaluated based on actual operation using a client-server mirror image configuration. Testing was conducted to assess the various component overheads, and results were compared against vendor data claims. The resultant system, while adequate to meet our capacity requirements, fell short of transfer speed expectations. A product team led by STX was assembled and chartered with solving the bottleneck issues. Optimization efforts yielded a 60 percent improvement in throughput performance. The ALLIANT computer platform provided the I/O flexibility needed to accommodate a multitude of peripheral interfaces, including the following: up to twelve 25MB/s VME I/O channels; up to five HiPPI I/O full duplex channels; IPI-s, SCSI, SMD, and RAID disk array support; standard networking software support for TCP/IP, NFS, and FTP; open architecture based on standard RISC processors; and V.4/POSIX-based operating system (Concentrix). All components including the software are modular in design and can be reconfigured as needs and system uses change. Users can begin with a small system and add modules as needed in the field. Most add-ons can be accomplished seamlessly without revision, recompilation or re-linking of software.

  17. Simple solution to the medical instrumentation software problem

    Science.gov (United States)

    Leif, Robert C.; Leif, Suzanne B.; Leif, Stephanie H.; Bingue, E.

    1995-04-01

    Medical devices now include a substantial software component, which is both difficult and expensive to produce and maintain. Medical software must be developed according to 'Good Manufacturing Practices' (GMP). Good Manufacturing Practices, as specified by the FDA and ISO, require the definition of, and compliance with, a software process that ensures quality products by specifying a detailed method of software construction. The software process should be based on accepted standards. US Department of Defense software standards and technology can both facilitate the development and improve the quality of medical systems. We describe the advantages of employing Mil-Std-498, Software Development and Documentation, and the Ada programming language. Ada provides the very broad range of functionality, from embedded real-time systems to management information systems, required by many medical devices. It also includes advanced facilities for object-oriented programming and software engineering.

  18. Combining Passive Microwave Sounders with CYGNSS information for improved retrievals: Observations during Hurricane Harvey

    Science.gov (United States)

    Schreier, M. M.

    2017-12-01

    The launch of CYGNSS (Cyclone Global Navigation Satellite System) has added an interesting component to satellite observations: it can provide wind speeds in the tropics with a high repetition rate. Passive microwave sounders overpassing the same region can benefit from this information when it comes to the retrieval of temperature or water profiles: uncertainty about wind speed has a strong impact on emissivity and reflectivity calculations with respect to surface temperature, and this in turn strongly affects the uncertainty of the retrieved temperature and water content, especially under extreme weather conditions. Adding CYGNSS information to the retrieval can help to reduce errors and provide a significantly better sounder retrieval. Based on observations during Hurricane Harvey, we want to show the impact of CYGNSS data on the retrievals of passive microwave sensors. We will show examples of the impact on retrievals from polar-orbiting instruments, such as the Advanced Technology Microwave Sounder (ATMS) and AMSU-A/B on NOAA-18 and -19. In addition, we will also show the impact on retrievals from HAMSR (High Altitude MMIC Sounding Radiometer), which was flying on the Global Hawk during the EPOCH campaign. We will compare the results with other observations and estimate the impact of additional CYGNSS information on the microwave retrieval, especially on error and uncertainty reduction. We think that a synergistic use of these different data sources could significantly help to produce better products for forecast assimilation.

  19. Trial-to-trial dynamics of selective long-term-memory retrieval with continuously changing retrieval targets.

    Science.gov (United States)

    Kizilirmak, Jasmin M; Rösler, Frank; Khader, Patrick H

    2014-10-01

    How do we control the successive retrieval of behaviorally relevant information from long-term memory (LTM) without being distracted by other potential retrieval targets associated with the same retrieval cues? Here, we approach this question by investigating the nature of trial-by-trial dynamics of selective LTM retrieval, i.e., the extent to which retrieval in one trial has detrimental or facilitatory effects on selective retrieval in the following trial. Participants first learned associations between retrieval cues and targets, with one cue always being linked to three targets, forming small associative networks. In successive trials, participants had to access either the same or a different target belonging to either the same or a different cue. We found that retrieval times were faster for targets that had already been relevant in the previous trial, with this facilitatory effect being substantially weaker when the associative network in which the targets were embedded changed. Moreover, staying within the same network still had a facilitatory effect even if the target changed, which became evident in relatively higher memory performance in comparison to a network change. Furthermore, event-related brain potentials (ERPs) showed topographically and temporally dissociable correlates of these effects, suggesting that they result from the combined influences of distinct processes that aid memory retrieval when relevant and irrelevant targets change their status from trial to trial. Taken together, the present study provides insight into the different processing stages of memory retrieval when fast switches between retrieval targets are required.

  20. The MIGHTI Wind Retrieval Algorithm: Description and Verification

    Science.gov (United States)

    Harding, Brian J.; Makela, Jonathan J.; Englert, Christoph R.; Marr, Kenneth D.; Harlander, John M.; England, Scott L.; Immel, Thomas J.

    2017-10-01

    We present an algorithm to retrieve thermospheric wind profiles from measurements by the Michelson Interferometer for Global High-resolution Thermospheric Imaging (MIGHTI) instrument on NASA's Ionospheric Connection Explorer (ICON) mission. MIGHTI measures interferometric limb images of the green and red atomic oxygen emissions at 557.7 nm and 630.0 nm, spanning 90-300 km. The Doppler shift of these emissions represents a remote measurement of the wind at the tangent point of the line of sight. Here we describe the algorithm which uses these images to retrieve altitude profiles of the line-of-sight wind. By combining the measurements from two MIGHTI sensors with perpendicular lines of sight, both components of the vector horizontal wind are retrieved. A comprehensive truth model simulation that is based on TIME-GCM winds and various airglow models is used to determine the accuracy and precision of the MIGHTI data product. Accuracy is limited primarily by spherical asymmetry of the atmosphere over the spatial scale of the limb observation, a fundamental limitation of space-based wind measurements. For 80% of the retrieved wind samples, the accuracy is found to be better than 5.8 m/s (green) and 3.5 m/s (red). As expected, significant errors are found near the day/night boundary and occasionally near the equatorial ionization anomaly, due to significant variations of wind and emission rate along the line of sight. The precision calculation includes pointing uncertainty and shot, read, and dark noise. For average solar minimum conditions, the expected precision meets requirements, ranging from 1.2 to 4.7 m/s.
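
    The abstract notes that both components of the vector horizontal wind are retrieved by combining measurements from two MIGHTI sensors with perpendicular lines of sight. Purely as a sketch of that geometric step (not the mission's production algorithm), the following solves the two line-of-sight projection equations for the zonal and meridional wind; the azimuth angles, function name, and wind values are invented for illustration.

```python
import numpy as np

def vector_wind(v_los_a, v_los_b, az_a_deg, az_b_deg):
    """Recover (zonal u, meridional v) wind from two line-of-sight winds.

    Each LOS wind is the projection of the horizontal wind onto a unit
    vector set by the sensor's azimuth (degrees clockwise from north).
    """
    az_a, az_b = np.radians([az_a_deg, az_b_deg])
    # Rows are the LOS unit vectors in (east, north) components.
    A = np.array([[np.sin(az_a), np.cos(az_a)],
                  [np.sin(az_b), np.cos(az_b)]])
    u, v = np.linalg.solve(A, np.array([v_los_a, v_los_b]))
    return u, v

# Perpendicular lines of sight at 45 and 135 degrees azimuth:
print(vector_wind(70.7, -70.7, 45.0, 135.0))  # ~ (0.0, 100.0) m/s
```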

  1. Predicting Software Suitability Using a Bayesian Belief Network

    Science.gov (United States)

    Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.

    2005-01-01

    The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
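
    The abstract names the three driving factors and the Bayesian-network machinery, but not the actual network structure or probabilities. The toy sketch below, with made-up priors and an illustrative conditional probability table, shows how such a network turns the hypothesized factors into a suitability forecast by marginalizing over all parent configurations.

```python
from itertools import product

# Made-up priors for the three driving factors (1 = favourable state).
priors = {"skill": 0.7, "maturity": 0.6, "low_complexity": 0.5}

def p_suitable(skill, maturity, low_complexity):
    """Illustrative CPT: P(product suitable | parent states)."""
    return min(0.95, 0.2 + 0.3 * skill + 0.25 * maturity + 0.2 * low_complexity)

# Marginal P(suitable), enumerating all 2^3 parent configurations.
names = list(priors)
marginal = 0.0
for states in product([0, 1], repeat=3):
    weight = 1.0
    for name, s in zip(names, states):
        weight *= priors[name] if s else 1.0 - priors[name]
    marginal += weight * p_suitable(*states)

print(f"P(suitable) = {marginal:.3f}")
```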

  2. The contribution of instrumentation and control software to system reliability

    International Nuclear Information System (INIS)

    Fryer, M.O.

    1984-01-01

    Advanced instrumentation and control systems are usually implemented using computers that monitor the instrumentation and issue commands to control elements. The control commands are based on instrument readings and software control logic. The reliability of the total system will be affected by the software design. When comparing software designs, an evaluation of how each design can contribute to the reliability of the system is desirable. Unfortunately, the science of reliability assessment of combined hardware and software systems is in its infancy. Reliability assessment of combined hardware/software systems is often based on over-simplified assumptions about software behavior. A new method of reliability assessment of combined software/hardware systems is presented. The method is based on a procedure called fault tree analysis which determines how component failures can contribute to system failure. Fault tree analysis is a well developed method for reliability assessment of hardware systems and produces quantitative estimates of failure probability based on component failure rates. It is shown how software control logic can be mapped into a fault tree that depicts both software and hardware contributions to system failure. The new method is important because it provides a way for quantitatively evaluating the reliability contribution of software designs. In many applications, this can help guide designers in producing safer and more reliable systems. An application to the nuclear power research industry is discussed
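
    The paper's actual fault trees and failure rates are not reproduced in the abstract, so the following is only a minimal sketch of the quantitative step: with independent basic events, an OR gate fails if any input fails and an AND gate only if all inputs fail. The tree shape and the probabilities below are hypothetical.

```python
import math

def or_gate(*p):   # fails if any independent input fails
    return 1.0 - math.prod(1.0 - pi for pi in p)

def and_gate(*p):  # fails only if all independent inputs fail
    return math.prod(p)

# Hypothetical per-mission failure probabilities.
p_sensor, p_cpu, p_software, p_actuator = 1e-3, 5e-4, 2e-3, 1e-3

# Redundant sensors: sensing fails only if both sensors fail (AND gate);
# the system fails if sensing, the computer, the software control logic,
# or the actuator fails (top-level OR gate).
p_sensing = and_gate(p_sensor, p_sensor)
p_system = or_gate(p_sensing, p_cpu, p_software, p_actuator)
print(f"P(system failure) = {p_system:.2e}")
```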

  3. Development of an automation software for reconciliation of INIS/ETDE thesauruses

    International Nuclear Information System (INIS)

    Singh, Manoj; Gupta, Rajiv; Prakasan, E.R.; Vijai Kumar

    1999-01-01

    ETDE (Energy Technology Data Exchange) and INIS (International Nuclear Information System) thesauruses contain nearly twenty thousand descriptors and are not necessarily identical. A project has been undertaken by the international organisations to create a common thesaurus for both INIS and ETDE, to facilitate better exchange and retrieval of information between and from these databases. This paper describes the automation implemented during our participation in the project to reconcile the structures of the word blocks in the ETDE and INIS thesauruses, with respect to the descriptors currently in the two thesauruses, through a PC-based RDBMS software package. The software, THEMERGE, was developed in the FoxPro 2.5 relational database management system. It handles all possible reconciliation recommendations suggested by specialists, printing the recommendation sheet for later uploading. This has not only widened the scope of flexibility, portability and convertibility of recommendations, but also helped to achieve quicker project completion. (author)

  4. Retrieval of Saharan desert dust optical depth from thermal infrared measurements by IASI

    Science.gov (United States)

    Vandenbussche, S.; Kochenova, S.; Vandaele, A.-C.; Kumps, N.; De Mazière, M.

    2012-04-01

    Aerosols are a major actor in the climate system. They are responsible for climate forcing through both direct effects (emission, absorption and scattering) and indirect effects (for example, by altering cloud microphysics). A better knowledge of aerosol optical properties, of the atmospheric aerosol load, and of aerosol sources and sinks may therefore significantly improve the modeling of climate change. Aerosol optical depth and other properties are retrieved on an operational basis from daytime measurements in the visible and near-infrared spectral range by a number of instruments, such as the satellite instruments MODIS, CALIOP, POLDER and MISR, and ground-based sunphotometers. Aerosol retrievals from day and night measurements at thermal infrared (TIR) wavelengths (for example, from the SEVIRI, AIRS and IASI satellite instruments) are less common, but they have received growing interest in recent years. Among these TIR measuring instruments, IASI on METOP has one major advantage for aerosol retrievals: its large continuous spectral coverage, allowing it to better capture the broadband signature of aerosols. Furthermore, IASI has a high spectral resolution (0.5 cm-1 after apodization), which allows a large number of trace gases to be retrieved at the same time; it will nominally be in orbit for 15 years and offers quasi-global Earth coverage twice a day. Here we will show recently obtained results for desert aerosol properties (concentration, altitude, optical depth) retrieved from IASI TIR measurements, using the ASIMUT software (BIRA-IASB, Belgium) linked to (V)LIDORT (R. Spurr, RTsolutions Inc, US) and to SPHER (M. Mishchenko, NASA GISS, USA). In particular, we will address the case of Saharan desert dust storms, which are a major source of desert dust particles in the atmosphere. These storms frequently transport sand to Europe, Western Asia or even South America. We will show some test-case comparisons between our retrievals and measurements from other instruments like those listed

  5. Compression and fast retrieval of SNP data.

    Science.gov (United States)

    Sambo, Francesco; Di Camillo, Barbara; Toffolo, Gianna; Cobelli, Claudio

    2014-11-01

    The increasing interest in rare genetic variants and epistatic genetic effects on complex phenotypic traits is currently pushing genome-wide association study design towards datasets of increasing size, both in the number of studied subjects and in the number of genotyped single nucleotide polymorphisms (SNPs). This, in turn, is leading to a compelling need for new methods for compression and fast retrieval of SNP data. We present a novel algorithm and file format for compressing and retrieving SNP data, specifically designed for large-scale association studies. Our algorithm is based on two main ideas: (i) compress linkage disequilibrium blocks in terms of differences with a reference SNP and (ii) compress reference SNPs exploiting information on their call rate and minor allele frequency. Tested on two SNP datasets and compared with several state-of-the-art software tools, our compression algorithm is shown to be competitive in terms of compression rate and to outperform all tools in terms of time to load compressed data. Our compression and decompression algorithms are implemented in a C++ library, are released under the GNU General Public License and are freely downloadable from http://www.dei.unipd.it/~sambofra/snpack.html.
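
    Of the two ideas in the abstract, the first is concrete enough to illustrate: SNPs within a linkage-disequilibrium block are stored as differences from a reference SNP, so in high-LD blocks most per-SNP records are nearly empty. The sketch below is a toy version of that difference encoding with invented genotype data; the published bit-level SNPack format differs in detail.

```python
def compress_block(block):
    """Encode an LD block of genotype vectors (lists of 0/1/2 calls) as a
    reference vector plus sparse per-SNP (position, value) differences."""
    reference = block[0]
    diffs = [[(i, g) for i, (g, r) in enumerate(zip(snp, reference)) if g != r]
             for snp in block[1:]]
    return reference, diffs

def decompress_block(reference, diffs):
    block = [list(reference)]
    for d in diffs:
        snp = list(reference)
        for i, g in d:   # re-apply the stored deviations
            snp[i] = g
        block.append(snp)
    return block

block = [[0, 1, 2, 0, 1], [0, 1, 2, 0, 2], [0, 1, 2, 0, 1]]
ref, diffs = compress_block(block)
assert decompress_block(ref, diffs) == block
print(diffs)  # [[(4, 2)], []] -- high LD means few stored differences
```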

  6. DALIS: a computer-assisted document retrieval system for the FFTF

    International Nuclear Information System (INIS)

    Harves, W.G.

    1981-01-01

    The FFTF (Fast Flux Test Facility) is a liquid sodium cooled, fast flux reactor designed specifically for irradiation testing of fuels and components for liquid metal fast breeder reactors. The Department of Energy and the Nuclear Regulatory Commission require that all pertinent documentation for maintenance, operation, and safety of the FFTF be readily accessible and retrievable, both during initial startup and for the lifetime of the plant. That amounts to a lot of information which has to be retrievable. The indexing system finally developed is called the DALIS system, short for Document and Location Indexing System. This system was designed by an engineer (Michael Theo) for use by engineers. DALIS uses descriptors and keywords to identify each document in the system. The descriptors give such information as the document number, the date of issuance, the title, the originating organization, and the microfilm or hardcopy location of the document. The keywords are words or phrases that describe the content of the document and permit retrieval by means of a computer search for documents with the stated keywords

  7. FORLI radiative transfer and retrieval code for IASI

    International Nuclear Information System (INIS)

    Hurtmans, D.; Coheur, P.-F.; Wespes, C.; Clarisse, L.; Scharf, O.; Clerbaux, C.; Hadji-Lazaro, J.; George, M.; Turquety, S.

    2012-01-01

    This paper lays down the theoretical bases and the methods used in the Fast Optimal Retrievals on Layers for IASI (FORLI) software, which is developed and maintained at the “Université Libre de Bruxelles” (ULB) with the support of the “Laboratoire Atmosphères, Milieux, Observations Spatiales” (LATMOS) to process radiance spectra from the Infrared Atmospheric Sounding Interferometer (IASI) for local to global chemistry applications. The forward radiative transfer model (RTM) and the retrieval approaches are formulated, and the numerical approximations are described. The aim of FORLI is the near-real-time provision of global-scale concentrations of trace gases from IASI, either integrated over the altitude range of the atmosphere (total columns) or vertically resolved. To this end, FORLI uses precalculated tables of absorbances. At the time of writing, three gas-specific versions of this algorithm have been set up: FORLI-CO, FORLI-O3 and FORLI-HNO3. The performance of each is reviewed, and illustrations of results and early validations are provided, with links to recent scientific publications. In this paper we stress the challenges raised by near-real-time processing of IASI, briefly describe the processing chain set up at ULB, and draw perspectives for future developments and applications.
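
    The abstract does not reproduce FORLI's equations, but the name (Fast Optimal Retrievals on Layers) places it in the optimal-estimation family. Purely as a reminder of the generic machinery such retrieval codes implement, and not as FORLI's actual formulation, here is one Gauss-Newton optimal-estimation update in the usual Rodgers notation with invented toy matrices.

```python
import numpy as np

def oem_update(x_i, x_a, y, F_xi, K, S_e, S_a):
    """One Gauss-Newton optimal-estimation step:
    x_{i+1} = x_a + G [y - F(x_i) + K (x_i - x_a)],
    with gain G = (K^T S_e^-1 K + S_a^-1)^-1 K^T S_e^-1."""
    Se_inv, Sa_inv = np.linalg.inv(S_e), np.linalg.inv(S_a)
    G = np.linalg.inv(K.T @ Se_inv @ K + Sa_inv) @ K.T @ Se_inv
    return x_a + G @ (y - F_xi + K @ (x_i - x_a))

# Toy problem: 3 radiance channels, 2 state elements (all values invented).
K = np.array([[1.0, 0.2], [0.5, 1.0], [0.1, 0.8]])  # forward-model Jacobian
x_a = np.array([1.0, 1.0])                          # a priori state
x_i = x_a.copy()                                    # current iterate
y = np.array([1.3, 1.6, 1.0])                       # measured radiances
F_xi = K @ x_i                                      # linear toy forward model
S_e, S_a = 0.01 * np.eye(3), 1.0 * np.eye(2)        # noise / prior covariances
print(oem_update(x_i, x_a, y, F_xi, K, S_e, S_a))
```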

  8. Survey the role of emotions in information retrieval

    Directory of Open Access Journals (Sweden)

    Hassan Behzadi

    2016-03-01

    Full Text Available The present study was conducted to identify users' emotions at various stages of information retrieval, based on a model of information retrieval on the web. Methodologically, the present study is experimental in design and applied in type. The population comprised all MA students in different humanities branches studying at Imam Reza International University; the sample consisted of 30 participants, with the sample size determined through stratified random sampling via the G*Power software. Data collection was carried out using a questionnaire on demographics and prior experience of internet use, a post-search questionnaire, and recorded videos of users' faces. The findings of the study demonstrated that: (1) during the initial stages of searching, apprehension was the most frequent emotion, and during the link-tracking stage in general, negative emotions (49.3 percent overall) were more frequent than other emotions, while in the browsing and differentiation stages happiness was the most frequent emotion; (2) these variations resulted in significant relations among the different emotions of users throughout the four stages of information retrieval; (3) in simple search, respondents displayed happiness most frequently and aversion least frequently, whereas in complicated search, apprehension and aversion were the most and least frequently cited emotions, respectively. Overall, negative emotions were reported more frequently in complicated search than in simple search, demonstrating that changes in the difficulty level of a search task cause users to exhibit different types of emotions.

  9. Instrument control software development process for the multi-star AO system ARGOS

    Science.gov (United States)

    Kulas, M.; Barl, L.; Borelli, J. L.; Gässler, W.; Rabien, S.

    2012-09-01

    The ARGOS project (Advanced Rayleigh guided Ground layer adaptive Optics System) will upgrade the Large Binocular Telescope (LBT) with an AO system consisting of six Rayleigh laser guide stars. This adaptive optics system integrates several control loops and many different components, like lasers, calibration swing arms and slope computers, that are dispersed throughout the telescope. The purpose of the instrument control software (ICS) is to run this AO system and to provide convenient client interfaces to the instruments and the control loops. The challenges for the ARGOS ICS are the development of a distributed and safety-critical software system with no defects in a short time, the creation of huge and complex software programs with a maintainable code base, the delivery of software components with the desired functionality and the support of geographically distributed project partners. To tackle these difficult tasks, the ARGOS software engineers reuse existing software like the novel middleware from LINC-NIRVANA, an instrument for the LBT, provide many tests at different functional levels like unit tests and regression tests, agree about code and architecture style and deliver software incrementally while closely collaborating with the project partners. Many ARGOS ICS components are already successfully in use in the laboratories for testing ARGOS control loops.

  10. Manifesto for the Software Development Professionalization

    Directory of Open Access Journals (Sweden)

    Red Latinoamericana en Ingeniería de Software (RedLatinaIS

    2013-12-01

    Full Text Available One of the central problems for current economic, social, and scientific development and for industrial competitiveness is the complexity of large, software-intensive systems and of the processes for their development and implementation. This complexity is defined by the amount and heterogeneity of the interactions between hardware and software components, their inter-relationships, the incorporation of the technical and organizational environments, and the interfaces to humans. Mastering these systems requires hierarchical and systematic scientific thought and action; moreover, the success of products, services and organizations is increasingly determined by the availability of suitable software products. Therefore, highly qualified professionals are needed who are able to understand and master such systems, are involved in the entire software engineering life cycle, and can adopt different roles during development. This is the reasoning that guides this Manifesto, whose aim is to achieve the professionalization of software development.

  11. The common component architecture for particle accelerator simulations

    International Nuclear Information System (INIS)

    Dechow, D.R.; Norris, B.; Amundson, J.

    2007-01-01

    Synergia2 is a beam dynamics modeling and simulation application for high-energy accelerators such as the Tevatron at Fermilab and the International Linear Collider, which is now under planning and development. Synergia2 is a hybrid, multilanguage software package comprised of two separate accelerator physics packages (Synergia and MaryLie/Impact) and one high-performance computer science package (PETSc). We describe our approach to producing a set of beam dynamics-specific software components based on the Common Component Architecture specification. Among other topics, we describe particular experiences with the following tasks: using Python steering to guide the creation of interfaces and to prototype components; working with legacy Fortran codes; and an example component-based, beam dynamics simulation.

  12. Achieving strategic surety for high consequence software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1996-09-01

    A strategic surety roadmap for high consequence software systems under the High Integrity Software (HIS) Program at Sandia National Laboratories guides research in identifying methodologies to improve software surety. Selected research tracks within this roadmap are identified and described, detailing current technology and outlining the advancements to be pursued over the coming decade to reach HIS goals. The tracks discussed herein focus on Correctness by Design and System Immunology™. Specific projects are discussed, with greater detail given on projects involving Correct Specification via Visualization, Synthesis, & Analysis; Visualization of Abstract Objects; and Correct Implementation of Components.

  13. Retrieval of ion distributions in RC from TWINS ENA images by CT technique

    Science.gov (United States)

    Ma, S.; Yan, W.; Xu, L.; Goldstein, J.; McComas, D. J.

    2010-12-01

    The Two Wide-angle Imaging Neutral-atom Spectrometers (TWINS) mission is the first constellation to employ imagers on two separate spacecraft to measure energetic neutral atoms (ENA) produced by charge exchange between ring current energetic ions and cold exospheric neutral atoms. By applying the 3-D volumetric pixel (voxel) computed tomography (CT) inversion method to TWINS images, parent ion populations in the ring current (RC) and auroral regions are retrieved from their ENA signals. This methodology is implemented for data obtained during the main phase of a moderate geomagnetic storm on 11 October 2008. For this storm the two TWINS satellites were located in nearly the same meridian plane at vantage points widely separated in magnetic local time, and both more than 5 RE geocentric distance from the Earth. In the retrieval process, the energetic ion fluxes to be retrieved are assumed being isotropic with respect to pitch angle. The ENA data used in this study are differential fluxes averaged over 12 sweeps (corresponding to an interval of 16 min.) at different energy levels ranging throughout the full 1--100 keV energy range of TWINS. The ENA signals have two main components: (1) a low-latitude/ high-altitude signal from trapped RC ions and (2) a low-altitude signal from precipitating ions in the auroral/subauroral ionosphere. In the retrieved ion distributions, the main part of the RC component is located around midnight toward dawn sector with L from 3 to 7 or farther, while the subauroral low-altitude component is mainly at pre-midnight. It seems that the dominant energy of the RC ions for this storm is at the lowest energy level of 1-2 keV, with another important energy band centered about 44 keV. The low-altitude component is consistent with in situ observations by DMSP/SSJ4. The result of this study demonstrates that with satellite constellations such as TWINS, using all-sky ENA imagers deployed at multiple vantage points, 3-D distribution of RC ion
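
    The voxel CT inversion applied to the TWINS images is only named in the abstract, not specified. As a generic illustration of the algebraic tomography such retrievals build on (an assumption, not the authors' exact method), the sketch below runs a Kaczmarz/ART iteration on a made-up system of line-of-sight sums over three voxels, clipping to non-negative values since ion fluxes cannot be negative.

```python
import numpy as np

def kaczmarz(A, b, n_sweeps=50, relax=1.0):
    """Algebraic reconstruction: cycle over the rows A[i] @ x = b[i],
    projecting the current estimate onto each row's hyperplane."""
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for a_i, b_i in zip(A, b):
            x += relax * (b_i - a_i @ x) / (a_i @ a_i) * a_i
    return np.clip(x, 0.0, None)  # retrieved fluxes must be non-negative

# Toy geometry: 4 line-of-sight ENA measurements through 3 voxels.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])
x_true = np.array([2.0, 0.5, 1.0])   # "true" voxel ion fluxes
print(kaczmarz(A, A @ x_true))       # ~ [2.0, 0.5, 1.0]
```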

  14. RAGE Architecture for Reusable Serious Gaming Technology Components

    Directory of Open Access Journals (Sweden)

    Wim van der Vegt

    2016-01-01

    Full Text Available To seize the potential of serious games, the RAGE project—funded by the Horizon-2020 Programme of the European Commission—will make available an interoperable set of advanced technology components (software assets) that support game studios in serious game development. This paper describes the overall software architecture and design conditions needed for the easy integration and reuse of such software assets in existing game platforms. Based on the component-based software engineering paradigm, the RAGE architecture takes into account the portability of assets to different operating systems, different programming languages, and different game engines. It avoids dependencies on external software frameworks and minimises code that may hinder integration with game engine code. Furthermore, it relies on a limited set of standard software patterns and well-established coding practices. The RAGE architecture has been successfully validated by implementing and testing basic software assets in four major programming languages (C#, C++, Java, and TypeScript/JavaScript, respectively). A demonstrator implementation of asset integration with an existing game engine was created and validated. The presented RAGE architecture paves the way for the large-scale development and application of cross-engine reusable software assets for enhancing the quality and diversity of serious gaming.

  15. Introduction of the digitization software GDgraph

    International Nuclear Information System (INIS)

    Chen Guochang; Jin Yongli; Wang Jimin

    2015-01-01

    Evaluators and experimenters always desire complete and up-to-date experimental data sets. However, in some publications and journals the data are published only as figures, without any numerical values. Furthermore, the quality of the figures is not always good enough, especially for figures scanned from hard copies of old publications. On the other hand, researchers would like to retrieve the data directly from the EXFOR database. Digitization of figures is the only method to obtain the numerical data and the associated uncertainties when only figures are available in publications. Therefore a digitization tool was needed at CNDC to meet the requirements of evaluation, measurement and EXFOR compilation. Before 2000, there was no common software to digitize experimental and evaluated data, and the quality of digitization results obtained with traditional coordinate paper or rulers could not meet the requirements of evaluation and measurement. By the end of the twentieth century, PCs had developed so quickly that developing digitization software became possible. Since 1997, CNDC has devoted effort to designing and developing such software. Four years later, the first version of the digitization software GDGraph, written in VC++, was released at CNDC. Although the first version of GDGraph met only the basic requirements of digitization (it could digitize only one data group at a time, excluding data errors; it supported only the BMP image format; and it could not delete arbitrary digitized data points), it achieved higher-quality digitization results and better efficiency than the traditional method

  16. Multilevel architectures for electronic document retrieval

    International Nuclear Information System (INIS)

    Rome, J.A.; Tolliver, J.S.

    1997-01-01

    Traditionally, most classified computer systems run at the highest level of any of the data on the system, and all users must be cleared to this security level. This architecture precludes the use of low-level (pay and clearance) personnel for such tasks as data entry, and makes sharing data with other entities difficult. The government is trying to solve this problem by the introduction of multilevel-secure (MLS) computer systems. In addition, wherever possible, there is pressure to use commercial off-the-shelf software (COTS) to improve reliability, and to reduce purchase and maintenance costs. This paper presents two architectures for an MLS electronic document retrieval system using COTS products. Although the authors believe that the resulting systems represent a real advance in usability, scalability, and scope, the disconnect between existing security rules and regulations and the rapidly-changing state of technology will make accreditation of such systems a challenge

  17. On the Antecedents of an Electrophysiological Signature of Retrieval Mode.

    Directory of Open Access Journals (Sweden)

    Angharad N Williams

    Full Text Available It has been proposed that people employ a common set of sustained operations (retrieval mode when preparing to remember different kinds of episodic information. In two experiments, however, there was no evidence for the pattern of brain activity commonly assumed to index these operations. In both experiments event-related potentials (ERPs were recorded time-locked to alternating preparatory cues signalling that participants should prepare for different retrieval tasks. One cue signalled episodic retrieval: remember the location where the object was presented in a prior study phase. The other signalled semantic retrieval: identify the location where the object is most commonly found (Experiment 1 or identify the typical size of the object (Experiment 2. In both experiments, only two trials of the same task were completed in succession. This enabled ERP contrasts between 'repeat' trials (the cue on the preceding trial signalled the same retrieval task, and 'switch' trials (the cue differed from the preceding trial. There were differences between the ERPs elicited by the preparatory task cues in Experiment 1 only: these were evident only on switch trials and comprised more positive-going activity over right-frontal scalp for the semantic than for the episodic task. These findings diverge from previous outcomes where the activity differentiating cues signalling preparation for episodic or semantic retrieval has been restricted to right-frontal scalp sites, comprising more positive-going activity for the episodic than for the semantic task. While these findings are consistent with the view that there is not a common set of operations engaged when people prepare to remember different kinds of episodic information, an alternative account is offered here, which is that these outcomes are a consequence of structural and temporal components of the experiment designs.

  18. On the Antecedents of an Electrophysiological Signature of Retrieval Mode.

    Science.gov (United States)

    Williams, Angharad N; Evans, Lisa H; Herron, Jane E; Wilding, Edward L

    2016-01-01

    It has been proposed that people employ a common set of sustained operations (retrieval mode) when preparing to remember different kinds of episodic information. In two experiments, however, there was no evidence for the pattern of brain activity commonly assumed to index these operations. In both experiments event-related potentials (ERPs) were recorded time-locked to alternating preparatory cues signalling that participants should prepare for different retrieval tasks. One cue signalled episodic retrieval: remember the location where the object was presented in a prior study phase. The other signalled semantic retrieval: identify the location where the object is most commonly found (Experiment 1) or identify the typical size of the object (Experiment 2). In both experiments, only two trials of the same task were completed in succession. This enabled ERP contrasts between 'repeat' trials (the cue on the preceding trial signalled the same retrieval task), and 'switch' trials (the cue differed from the preceding trial). There were differences between the ERPs elicited by the preparatory task cues in Experiment 1 only: these were evident only on switch trials and comprised more positive-going activity over right-frontal scalp for the semantic than for the episodic task. These findings diverge from previous outcomes where the activity differentiating cues signalling preparation for episodic or semantic retrieval has been restricted to right-frontal scalp sites, comprising more positive-going activity for the episodic than for the semantic task. While these findings are consistent with the view that there is not a common set of operations engaged when people prepare to remember different kinds of episodic information, an alternative account is offered here, which is that these outcomes are a consequence of structural and temporal components of the experiment designs.

  19. Innovative grout/retrieval demonstration final report

    International Nuclear Information System (INIS)

    Loomis, G.G.; Thompson, D.N.

    1995-01-01

    This report presents the results of an evaluation of an innovative retrieval technique for buried transuranic waste. Application of this retrieval technique was originally designed for full pit retrieval; however, it applies equally to a hot spot retrieval technology. The technique involves grouting the buried soil waste matrix with a jet grouting procedure, applying an expansive demolition grout to the matrix, and retrieving the debris. The grouted matrix provides an agglomeration of fine soil particles and contaminants resulting in an inherent contamination control during the dusty retrieval process. A full-scale field demonstration of this retrieval technique was performed on a simulated waste pit at the Idaho National Engineering Laboratory. Details are reported on all phases of this proof-of-concept demonstration including pit construction, jet grouting activities, application of the demolition grout, and actual retrieval of the grouted pit. A quantitative evaluation of aerosolized soils and rare earth tracer spread is given for all phases of the demonstration, and these results are compared to a baseline retrieval activity using conventional retrieval means. 8 refs., 47 figs., 10 tabs

  20. Reliability analysis of software based safety functions

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1993-05-01

    The methods applicable to the reliability analysis of software-based safety functions are described in the report. Although the safety functions also include other components, the main emphasis in the report is on the reliability analysis of software. Checklist-type qualitative reliability analysis methods, such as failure mode and effects analysis (FMEA), are described, as well as software fault tree analysis. Safety analysis based on Petri nets is discussed. The most essential concepts and models of quantitative software reliability analysis are described. The most common software metrics and their combined use with software reliability models are discussed. The application of software reliability models in PSA is evaluated; it is observed that recent software reliability models do not directly produce the estimates needed in PSA. Some recommendations and conclusions are drawn from the study, among them the need for formal methods in the analysis and development of software-based systems, the applicability of qualitative reliability engineering methods in connection with PSA, and the need to make the requirements for software-based systems, and for their analyses in the regulatory guides, more precise. (orig.). (46 refs., 13 figs., 1 tab.)
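
    The report surveys quantitative software reliability models without reproducing one. As a concrete example of the kind of model meant (chosen here for illustration, not taken from the report), the sketch below evaluates the widely used Goel-Okumoto NHPP model, in which the expected cumulative number of failures by test time t is m(t) = a(1 - e^(-bt)) and the failure intensity is its derivative; the parameter values are invented.

```python
import math

def go_mean(t, a, b):
    """Goel-Okumoto expected cumulative failures: m(t) = a (1 - e^{-bt})."""
    return a * (1.0 - math.exp(-b * t))

def go_intensity(t, a, b):
    """Failure intensity lambda(t) = m'(t) = a b e^{-bt}."""
    return a * b * math.exp(-b * t)

a, b = 120.0, 0.02   # a: total expected faults; b: detection rate (invented)
for t in (10, 50, 100, 200):
    print(f"t = {t:3d} h   m(t) = {go_mean(t, a, b):6.1f}   "
          f"lambda(t) = {go_intensity(t, a, b):.3f} /h")
```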

  1. The ATLAS data management software engineering process

    International Nuclear Information System (INIS)

    Lassnig, M; Garonne, V; Stewart, G A; Barisits, M; Serfon, C; Goossens, L; Nairz, A; Beermann, T; Vigne, R; Molfetas, A

    2014-01-01

    Rucio is the next-generation data management system of the ATLAS experiment. The software engineering process to develop Rucio is fundamentally different to existing software development approaches in the ATLAS distributed computing community. Based on a conceptual design document, development takes place using peer-reviewed code in a test-driven environment. The main objectives are to ensure that every engineer understands the details of the full project, even components usually not touched by them, that the design and architecture are coherent, that temporary contributors can be productive without delay, that programming mistakes are prevented before being committed to the source code, and that the source is always in a fully functioning state. This contribution will illustrate the workflows and products used, and demonstrate the typical development cycle of a component from inception to deployment within this software engineering process. Next to the technological advantages, this contribution will also highlight the social aspects of an environment where every action is subject to detailed scrutiny.

  2. The ATLAS data management software engineering process

    Science.gov (United States)

    Lassnig, M.; Garonne, V.; Stewart, G. A.; Barisits, M.; Beermann, T.; Vigne, R.; Serfon, C.; Goossens, L.; Nairz, A.; Molfetas, A.; Atlas Collaboration

    2014-06-01

    Rucio is the next-generation data management system of the ATLAS experiment. The software engineering process to develop Rucio is fundamentally different to existing software development approaches in the ATLAS distributed computing community. Based on a conceptual design document, development takes place using peer-reviewed code in a test-driven environment. The main objectives are to ensure that every engineer understands the details of the full project, even components usually not touched by them, that the design and architecture are coherent, that temporary contributors can be productive without delay, that programming mistakes are prevented before being committed to the source code, and that the source is always in a fully functioning state. This contribution will illustrate the workflows and products used, and demonstrate the typical development cycle of a component from inception to deployment within this software engineering process. Next to the technological advantages, this contribution will also highlight the social aspects of an environment where every action is subject to detailed scrutiny.

  3. Future Scenarios for Software-Defined Metro and Access Networks and Software-Defined Photonics

    Directory of Open Access Journals (Sweden)

    Tommaso Muciaccia

    2017-01-01

    Full Text Available In recent years, architectures, devices, and components in telecommunication networks have been challenged by evolutionary and revolutionary factors which are drastically changing traffic characteristics. Most of these changes imply the need for greater re-configurability and programmability, not only in data centers and core networks, but also in the metro-access segment. In a wide variety of contexts, this necessity has been addressed by the proposed introduction of the innovative paradigm of software-defined networks (SDNs). Several solutions inspired by the SDN model have recently been proposed for metro and access networks as well, where the adoption of a new generation of software-defined reconfigurable integrated photonic devices is highly desirable. In this paper, we review possible future application scenarios for software-defined metro and access networks and software-defined photonics (SDP), on the basis of analytics, statistics, and surveys. This work describes the reasons underpinning the presented radical change of paradigm and summarizes the most significant solutions proposed in the literature, with specific emphasis on physical-layer reconfigurable networks and a focus on both architectures and devices.

  4. Topological Aspects of Information Retrieval.

    Science.gov (United States)

    Egghe, Leo; Rousseau, Ronald

    1998-01-01

    Discusses topological aspects of theoretical information retrieval, including retrieval topology; similarity topology; pseudo-metric topology; document spaces as topological spaces; Boolean information retrieval as a subsystem of any topological system; and proofs of theorems. (LRW)

  5. COMDES-II: A Component-Based Framework for Generative Development of Distributed Real-Time Control Systems

    DEFF Research Database (Denmark)

    Ke, Xu; Sierszecki, Krzysztof; Angelov, Christo K.

    2007-01-01

    The paper presents a generative development methodology and the component models of COMDES-II, a component-based software framework for distributed embedded control systems with real-time constraints. The adopted methodology allows for rapid modeling and validation of control software at a higher level of abstraction. The paper presents the methodology for COMDES-II from a general perspective, describes the component models in detail, and demonstrates their application through a DC-motor control system case study.

  6. Complex Retrieval of Embedded IVC Filters: Alternative Techniques and Histologic Tissue Analysis

    International Nuclear Information System (INIS)

    Kuo, William T.; Cupp, John S.; Louie, John D.; Kothary, Nishita; Hofmann, Lawrence V.; Sze, Daniel Y.; Hovsepian, David M.

    2012-01-01

    Purpose: We evaluated the safety and effectiveness of alternative endovascular methods to retrieve embedded optional and permanent filters in order to manage or reduce risk of long-term complications from implantation. Histologic tissue analysis was performed to elucidate the pathologic effects of chronic filter implantation. Methods: We studied the safety and effectiveness of alternative endovascular methods for removing embedded inferior vena cava (IVC) filters in 10 consecutive patients over 12 months. Indications for retrieval were symptomatic chronic IVC occlusion, caval and aortic perforation, and/or acute PE (pulmonary embolism) from filter-related thrombus. Retrieval was also performed to reduce risk of complications from long-term filter implantation and to eliminate the need for lifelong anticoagulation. All retrieved specimens were sent for histologic analysis. Results: Retrieval was successful in all 10 patients. Filter types and implantation times were as follows: one Venatech (1,495 days), one Simon-Nitinol (1,485 days), one Optease (300 days), one G2 (416 days), five Günther-Tulip (GTF; mean 606 days, range 154–1,010 days), and one Celect (124 days). There were no procedural complications or adverse events at a mean follow-up of 304 days after removal (range 196–529 days). Histology revealed scant native intima surrounded by a predominance of neointimal hyperplasia and dense fibrosis in all specimens. Histologic evidence of photothermal tissue ablation was confirmed in three laser-treated specimens. Conclusion: Complex retrieval methods can now be used in select patients to safely remove embedded optional and permanent IVC filters previously considered irretrievable. Neointimal hyperplasia and dense fibrosis are the major components that must be separated to achieve successful retrieval of chronic filter implants.

  7. Advances in audio source separation and multisource audio content retrieval

    Science.gov (United States)

    Vincent, Emmanuel

    2012-06-01

    Audio source separation aims to extract the signals of individual sound sources from a given recording. In this paper, we review three recent advances which improve the robustness of source separation in real-world challenging scenarios and enable its use for multisource content retrieval tasks, such as automatic speech recognition (ASR) or acoustic event detection (AED) in noisy environments. We present a Flexible Audio Source Separation Toolkit (FASST) and discuss its advantages compared to earlier approaches such as independent component analysis (ICA) and sparse component analysis (SCA). We explain how cues as diverse as harmonicity, spectral envelope, temporal fine structure or spatial location can be jointly exploited by this toolkit. We subsequently present the uncertainty decoding (UD) framework for the integration of audio source separation and audio content retrieval. We show how the uncertainty about the separated source signals can be accurately estimated and propagated to the features. Finally, we explain how this uncertainty can be efficiently exploited by a classifier, both at the training and the decoding stage. We illustrate the resulting performance improvements in terms of speech separation quality and speaker recognition accuracy.
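
    The abstract states that uncertainty about the separated source signals is propagated to the features and exploited by the classifier, without giving the formula. For Gaussian models, the standard uncertainty-decoding move is to add the feature uncertainty to the class covariance when scoring; the sketch below shows this for diagonal covariances with invented numbers, as an illustration of the general idea rather than FASST's exact scheme.

```python
import numpy as np

def uncertain_log_likelihood(x_hat, var_x, mu, var_c):
    """Score an uncertain feature x ~ N(x_hat, var_x) under a class model
    N(mu, var_c): marginalizing over x just inflates the class variance."""
    var = var_c + var_x               # combined per-dimension variance
    diff = x_hat - mu
    return -0.5 * np.sum(np.log(2.0 * np.pi * var) + diff ** 2 / var)

x_hat = np.array([0.2, -0.1])         # point estimate from source separation
var_x = np.array([0.5, 0.1])          # its per-dimension uncertainty
classes = {"speech": (np.array([0.0, 0.0]), np.array([1.0, 1.0])),
           "noise":  (np.array([2.0, 2.0]), np.array([1.0, 1.0]))}
scores = {c: uncertain_log_likelihood(x_hat, var_x, mu, var)
          for c, (mu, var) in classes.items()}
print(max(scores, key=scores.get), scores)
```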

  8. RETRIEVAL EQUIPMENT DESCRIPTIONS

    International Nuclear Information System (INIS)

    J. Steinhoff

    1997-01-01

    The objective and the scope of this document are to list and briefly describe the major mobile equipment necessary for waste package (WP) retrieval from the proposed subsurface nuclear waste repository at Yucca Mountain. Primary performance characteristics and some specialized design features of the equipment are explained and summarized in the individual subsections of this document. There are no quality assurance requirements or QA controls in this document. Retrieval under normal conditions is accomplished with the same fleet of equipment as is used for emplacement. Descriptions of equipment used for retrieval under normal conditions are found in Emplacement Equipment Descriptions, DI: BCAF00000-01717-5705-00002 (a document in progress). Equipment used for retrieval under abnormal conditions is addressed in this document and consists of the following: (1) Inclined Plane Hauler; (2) Bottom Lift Transporter; (3) Load Haul Dump (LHD) Loader; (4) Heavy Duty Forklift for Emplacement Drifts; (5) Covered Shuttle Car; (6) Multipurpose Vehicle; and (7) Scaler

  9. A semi-automated workflow for biodiversity data retrieval, cleaning, and quality control

    Directory of Open Access Journals (Sweden)

    Cherian Mathew

    2014-12-01

    The compilation and cleaning of data needed for analyses and prediction of species distributions is a time-consuming process requiring a solid understanding of data formats and service APIs provided by biodiversity informatics infrastructures. We designed and implemented a Taverna-based Data Refinement Workflow that integrates taxonomic data retrieval, data cleaning, and data selection into a consistent, standards-based, and effective system hiding the complexity of underlying service infrastructures. The workflow can be freely used both locally and through a web-portal which does not require additional software installations by users.
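
    One of the cleaning steps such a workflow automates can be sketched as follows: a minimal Python pass that drops records with missing or out-of-range coordinates and collapses exact duplicates. The field names are hypothetical, not those of the Taverna workflow.

        def clean_occurrences(records):
            """Minimal cleaning pass over species occurrence records:
            drop missing/invalid coordinates, collapse exact duplicates."""
            seen, cleaned = set(), []
            for rec in records:
                lat, lon = rec.get("lat"), rec.get("lon")
                if lat is None or lon is None:
                    continue
                if not (-90 <= lat <= 90 and -180 <= lon <= 180):
                    continue  # coordinate outside valid range
                key = (rec.get("species"), lat, lon)
                if key in seen:
                    continue  # exact duplicate
                seen.add(key)
                cleaned.append(rec)
            return cleaned

        print(clean_occurrences([
            {"species": "Puma concolor", "lat": 4.6, "lon": -74.1},
            {"species": "Puma concolor", "lat": 4.6, "lon": -74.1},   # duplicate
            {"species": "Puma concolor", "lat": 120.0, "lon": 0.0},  # invalid lat
        ]))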

  10. Dictionary Pruning with Visual Word Significance for Medical Image Retrieval.

    Science.gov (United States)

    Zhang, Fan; Song, Yang; Cai, Weidong; Hauptmann, Alexander G; Liu, Sidong; Pujol, Sonia; Kikinis, Ron; Fulham, Michael J; Feng, David Dagan; Chen, Mei

    2016-02-12

    Content-based medical image retrieval (CBMIR) is an active research area for disease diagnosis and treatment, but it can be problematic given the small visual variations between anatomical structures. We propose a retrieval method based on a bag-of-visual-words (BoVW) model to identify discriminative characteristics between different medical images, with a Pruned Dictionary based on Latent Semantic Topic description. We refer to this as PD-LST retrieval. Our method has two main components. First, we calculate a topic-word significance value for each visual word given a certain latent topic to evaluate how the word is connected to this latent topic. The latent topics are learnt based on the relationship between the images and words, and are employed to bridge the gap between low-level visual features and high-level semantics. These latent topics describe the images and words semantically and can thus facilitate more meaningful comparisons between the words. Second, we compute an overall-word significance value to evaluate the significance of a visual word within the entire dictionary. We designed an iterative ranking method to measure overall-word significance by considering the relationship between all latent topics and words. The words with higher values are considered more meaningful, with greater discriminative power in differentiating medical images. We evaluated our method on two public medical imaging datasets and it showed improved retrieval accuracy and efficiency.
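
    The two significance values can be illustrated with a small sketch. Assuming a learnt topic-word matrix P(word | topic), topic-word significance here normalizes each word's weight across topics, and overall-word significance is approximated by a topic-weighted combination; this is a simplification of the paper's iterative ranking, and all numbers are invented.

        import numpy as np

        rng = np.random.default_rng(0)
        # Hypothetical topic model: P(word | topic) for 5 latent topics, 50 visual words
        topic_word = rng.dirichlet(np.ones(50), size=5)   # each row sums to 1
        topic_weight = rng.dirichlet(np.ones(5))          # corpus-level topic importance

        # Topic-word significance: how strongly each word is tied to each topic
        topic_word_sig = topic_word / topic_word.sum(axis=0, keepdims=True)

        # Overall-word significance: topic-weighted combination (a simplification
        # of the paper's iterative ranking over all topics and words)
        overall_sig = topic_weight @ topic_word_sig

        # Prune the dictionary: keep only the most discriminative visual words
        keep = np.argsort(overall_sig)[-20:]
        print(sorted(keep.tolist()))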

  11. Improving the software fault localization process through testability information

    NARCIS (Netherlands)

    Gonzalez-Sanchez, A.; Abreu, R.; Gross, H.; Van Gemund, A.

    2010-01-01

    When failures occur during software testing, automated software fault localization helps to diagnose their root causes and identify the defective components of a program to support debugging. Diagnosis is carried out by selecting test cases in such a way that their pass or fail information will narrow
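
    The record's abstract is truncated in the source. For context, spectrum-based fault localization of the kind studied by these authors ranks program components by how strongly their involvement correlates with failing tests, for example with the Ochiai coefficient. A minimal sketch follows; the program spectra are invented for illustration.

        import math

        # Rows: test cases; columns: components touched by the test (1 = involved)
        spectra = [[1, 1, 0], [0, 1, 1], [1, 0, 1], [1, 1, 1]]
        failed  = [1, 0, 1, 1]  # pass/fail outcome per test (1 = failed)

        def ochiai(spectra, failed):
            """Suspiciousness per component: n_ef / sqrt(n_f * (n_ef + n_ep)),
            where n_ef/n_ep count failing/passing tests that executed it."""
            n_f = sum(failed)
            scores = []
            for c in range(len(spectra[0])):
                n_ef = sum(s[c] and f for s, f in zip(spectra, failed))
                n_ep = sum(s[c] and not f for s, f in zip(spectra, failed))
                scores.append(n_ef / math.sqrt(n_f * (n_ef + n_ep)) if n_ef else 0.0)
            return scores

        print(ochiai(spectra, failed))  # highest score = most suspect component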

  12. Designing dependable process-oriented software - a CSP-based approach

    NARCIS (Netherlands)

    Jovanovic, D.S.

    2006-01-01

    This thesis advocates dependability as a crucial aspect of software quality. Process orientation, as it is defined in this thesis, concentrates on the notion of a process as a basic building component of a dataflow-centred software architecture. The dependability approach in the proposed variant of

  13. Open Component Portability Infrastructure (OPENCPI)

    Science.gov (United States)

    2013-03-01

    declaration of the authored worker that must be implemented. If there is existing legacy VHDL entity architecture, then it is wrapped or modified to... that the software implementations were written to. Since all of the original code was VHDL, the HDL Authoring Model for VHDL was enhanced to meet... engineering process. This application was completed for the execution of all the components, the software implementations, and the VHDL skeletons for the

  14. The software and algorithms for hyperspectral data processing

    Science.gov (United States)

    Shyrayeva, Anhelina; Martinov, Anton; Ivanov, Victor; Katkovsky, Leonid

    2017-04-01

    Hyperspectral remote sensing is widely used for collecting and processing information about objects on the Earth's surface. Hyperspectral data are combined to form a three-dimensional (x, y, λ) data cube. The Department of Aerospace Research of the Institute of Applied Physical Problems of the Belarusian State University presents a general model of the software for hyperspectral image data analysis and processing. The software runs in a Windows XP/7/8/8.1/10 environment on any personal computer. It has been written in C++ using the Qt framework and OpenGL for graphical data visualization. The software has a flexible structure consisting of a set of independent plugins. Each plugin is compiled as a Qt plugin and built as a Windows dynamic-link library (DLL). Plugins can be categorized in terms of data reading types, data visualization (3D, 2D, 1D), and data processing. The software has various built-in functions for statistical and mathematical analysis and signal processing, such as moving-average smoothing, Savitzky-Golay smoothing, RGB correction, histogram transformation, and atmospheric correction. The software provides two of the authors' engineering techniques for the solution of the atmospheric correction problem: an iterative method for refining spectral albedo parameters using Libradtran, and an analytical least-squares method. The main advantages of these methods are a high processing rate (several minutes for 1 GB of data) and a low relative error in albedo retrieval (less than 15%). The software also supports work with spectral libraries, region of interest (ROI) selection, and spectral analysis such as cluster-type image classification and automatic comparison of hypercube spectra, by a similarity criterion, against entries from spectral libraries, and vice versa. The software deals with different kinds of spectral information in order to identify and distinguish spectrally unique materials. Also, the following advantages
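
    Of the signal-processing functions listed, Savitzky-Golay smoothing is easy to illustrate. The sketch below uses SciPy rather than the authors' C++/Qt implementation, and the spectrum is synthetic.

        import numpy as np
        from scipy.signal import savgol_filter

        # Synthetic noisy reflectance spectrum (wavelengths in nm, illustrative)
        wavelengths = np.linspace(400, 1000, 301)
        spectrum = 0.3 + 0.2 * np.sin(wavelengths / 80.0)
        noisy = spectrum + np.random.default_rng(1).normal(0, 0.01, wavelengths.size)

        # Savitzky-Golay: fit a low-order polynomial in a sliding window,
        # preserving peak shape better than a plain moving average
        smoothed = savgol_filter(noisy, window_length=11, polyorder=3)
        print(float(np.abs(smoothed - spectrum).mean()))  # residual after smoothing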

  15. Multimedia medical data archive and retrieval server on the Internet

    Science.gov (United States)

    Komo, Darmadi; Levine, Betty A.; Freedman, Matthew T.; Mun, Seong K.; Tang, Y. K.; Chiang, Ted T.

    1997-05-01

    The Multimedia Medical Data Archive and Retrieval Server has been installed at the Imaging Science and Information Systems (ISIS) Center at Georgetown University Medical Center to provide medical data archive and retrieval support for medical researchers. The medical data include text, images, sound, and video. All medical data are keyword-indexed using a database management system, placed temporarily in a staging area, and then transferred to a StorageTek one-terabyte tape library system with a robotic arm for permanent archiving. There are two methods of interaction with the system. The first is to use a web browser with HTML functions to perform insert, query, update, and retrieve operations. These generate dynamic SQL calls to the database and produce StorageTek API calls to the tape library. The HTML functions consist of a database, a StorageTek interface, an HTTP server, a common gateway interface, and Java programs. The second method is to issue a DICOM store command, which is translated by the system's DICOM server into SQL calls and then into StorageTek API calls to the tape library. The system acts as both an Internet and a DICOM server using standard protocols such as HTTP, HTML, Java, and DICOM. Users with proper authentication can log on to the server from anywhere on the Internet using a standard web browser, resulting in a user-friendly, open, and platform-independent solution for archiving multimedia medical data. The system represents a complex integration of different components, including a robotic tape storage system, a database, a user interface, WWW protocols, and TCP/IP networking. The user deals only with the WWW and DICOM server components of the system; the database and robotic tape library are transparent, and the user will not know that the medical data are stored on magnetic tapes. The server provides researchers a cost-effective tool for archiving and retrieving medical data across a TCP/IP network environment. It will

  16. Business engineering. Generic Software Architecture in an Object Oriented View

    Directory of Open Access Journals (Sweden)

    Mihaela MURESAN

    2006-01-01

    The generic software architecture offers a solution for information system development and implementation. A generic software/non-software model can be developed by integrating the enterprise blueprint concept (Zachman) and the object-oriented paradigm (Coad's archetype concept). The standardization of generic software architecture for various specific software components could be a direction of crucial importance, offering a guarantee of model quality and increasing the efficiency of software design, development, and implementation. This approach is also useful for the implementation of ERP systems designed to fit the user's particular requirements.

  17. ORNL's DCAL software package

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    2007-01-01

    Oak Ridge National Laboratory has released its Dose and Risk Calculation software, DCAL. DCAL, developed with the support of the U.S. Environmental Protection Agency, consists of a series of computational modules, driven in either an interactive or a batch mode for computation of dose and risk coefficients from intakes of radionuclides or exposure to radionuclides in environmental media. The software package includes extensive libraries of biokinetic and dosimetric data that represent the current state of the art. The software has unique capability for addressing intakes of radionuclides by non-adults. DCAL runs as 32-bit extended DOS and console applications under Windows 98/NT/2000/XP. It is intended for users familiar with the basic elements of computational radiation dosimetry. Components of DCAL have been used to prepare U.S. Environmental Protection Agency's Federal Guidance Reports 12 and 13 and several publications of the International Commission on Radiological Protection. (author)

  18. Laparoscopic specimen retrieval bags.

    Science.gov (United States)

    Smorgick, Noam

    2014-10-01

    Specimen retrieval bags have long been used in laparoscopic gynecologic surgery for the contained removal of adnexal cysts and masses. More recently, concerns regarding the spread of malignant cells during mechanical morcellation of myomas have led to an additional use of specimen retrieval bags for contained "in-bag" morcellation. This review will discuss the indications for the use of retrieval bags in gynecologic endoscopy and describe the different specimen bags available to date.

  19. Automated Improvement of Software Architecture Models for Performance and Other Quality Attributes

    OpenAIRE

    Koziolek, Anne

    2013-01-01

    Quality attributes, such as performance or reliability, are crucial for the success of a software system and are largely influenced by the software architecture. Their quantitative prediction supports systematic, goal-oriented software design and forms the basis of an engineering approach to software design. This thesis proposes a method and tool to automatically improve component-based software architecture (CBA) models based on such quantitative quality prediction techniques.

  20. Exploiting database technology for object based event storage and retrieval

    International Nuclear Information System (INIS)

    Rawat, Anil; Rajan, Alpana; Tomar, Shailendra Singh; Bansal, Anurag

    2005-01-01

    This paper discusses the storage and retrieval of experimental data in relational databases. Physics experiments carried out using reactors and particle accelerators generate huge amounts of data. Also, most data analysis and simulation programs are developed using object-oriented programming concepts. Hence, one of the most important design features of an experiment-related software framework is the way object persistency is handled. We discuss these issues in the light of the module developed by us for storing C++ objects in relational databases such as Oracle. This module was developed under the POOL persistency framework being developed for the LHC grid at CERN. (author)
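
    The persistency idea, mapping object attributes to relational columns and back, can be sketched independently of the authors' C++/Oracle module. Below is a minimal Python illustration with SQLite; the class and schema are hypothetical.

        import sqlite3

        class Event:
            """Toy analysis object to be persisted relationally."""
            def __init__(self, run, energy):
                self.run, self.energy = run, energy

        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE event (id INTEGER PRIMARY KEY, run INTEGER, energy REAL)")

        def store(obj):
            # Map each attribute to a column, as a persistency layer would
            db.execute("INSERT INTO event (run, energy) VALUES (?, ?)",
                       (obj.run, obj.energy))

        def load(event_id):
            # Reconstruct the object from its relational representation
            run, energy = db.execute(
                "SELECT run, energy FROM event WHERE id = ?", (event_id,)).fetchone()
            return Event(run, energy)

        store(Event(42, 13.6))
        db.commit()
        print(load(1).energy)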

  1. A role for relational databases in high energy physics software systems

    International Nuclear Information System (INIS)

    Lauer, R.; Slaughter, A.J.; Wolin, E.

    1987-01-01

    This paper presents the design and initial implementation of software which uses a relational database management system for storage and retrieval of real and Monte Carlo generated events from a charm and beauty spectrometer with a vertex detector. The purpose of the software is to graphically display and interactively manipulate the events, fit tracks and vertices and calculate physics quantities. The INGRES database forms the core of the system, while the DI3000 graphics package is used to plot the events. The paper introduces relational database concepts and their applicability to high energy physics data. It also evaluates the environment provided by INGRES, particularly its usefulness in code development and its Fortran interface. Specifics of the database design we have chosen are detailed as well. (orig.)

  2. Retrieving autobiographical memories: How different retrieval strategies associated with different cues explain reaction time differences.

    Science.gov (United States)

    Uzer, Tugba

    2016-02-01

    Previous research has shown that memories cued by concrete concepts, such as objects, are retrieved faster than those cued by more abstract concepts, such as emotions. This effect has been explained by the fact that more memories are directly retrieved from object versus emotion cues. In the present study, we tested whether RT differences between memories cued by emotion versus object terms occur not only because object cues elicit direct retrieval of more memories (Uzer, Lee, & Brown, 2012), but also because of differences in memory generation in response to emotions versus objects. One hundred university students retrieved memories in response to basic-level (e.g. orange), superordinate-level (e.g. plant), and emotion (e.g. surprised) cues. Retrieval speed was measured and participants reported whether memories were directly retrieved or generated on each trial. Results showed that memories were retrieved faster in response to basic-level versus superordinate-level and emotion cues because a) basic-level cues elicited more directly retrieved memories, and b) generating memories was more difficult when cues were abstract versus concrete. These results suggest that generative retrieval is a cue generation process in which additional cues that provide contextual information including the target event are produced. Memories are retrieved more slowly in response to emotion cues in part because emotion labels are less effective cues of appropriate contextual information. This particular finding is inconsistent with the idea that emotion is a primary organizational unit for autobiographical memories. In contrast, the difficulty of emotional memory generation implies that emotions represent low-level event information in the organization of autobiographical memory. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Memory networks supporting retrieval effort and retrieval success under conditions of full and divided attention.

    Science.gov (United States)

    Skinner, Erin I; Fernandes, Myra A; Grady, Cheryl L

    2009-01-01

    We used a multivariate analysis technique, partial least squares (PLS), to identify distributed patterns of brain activity associated with retrieval effort and retrieval success. Participants performed a recognition memory task under full attention (FA) or two different divided attention (DA) conditions during retrieval. Behaviorally, recognition was disrupted when a word, but not digit-based distracting task, was performed concurrently with retrieval. PLS was used to identify patterns of brain activation that together covaried with the three memory conditions and which were functionally connected with activity in the right hippocampus to produce successful memory performance. Results indicate that activity in the right dorsolateral frontal cortex increases during conditions of DA at retrieval, and that successful memory performance in the DA-digit condition is associated with activation of the same network of brain regions functionally connected to the right hippocampus, as under FA, which increases with increasing memory performance. Finally, DA conditions that disrupt successful memory performance (DA-word) interfere with recruitment of both retrieval-effort and retrieval-success networks.
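
    The PLS variant used in this literature extracts latent variables from the singular value decomposition of the cross-covariance between brain data and the design or behavioural measures. A minimal sketch follows; the data shapes and values are invented.

        import numpy as np

        rng = np.random.default_rng(2)
        X = rng.normal(size=(30, 200))   # 30 scans x 200 voxels (hypothetical)
        Y = rng.normal(size=(30, 3))     # 30 scans x 3 condition/behaviour measures

        def zscore(a):
            return (a - a.mean(0)) / a.std(0)

        # Cross-covariance between design and brain data, then SVD: each
        # singular vector pair is one distributed "pattern" (latent variable)
        R = zscore(Y).T @ zscore(X)
        U, s, Vt = np.linalg.svd(R, full_matrices=False)

        brain_scores = zscore(X) @ Vt[0]   # expression of the first pattern per scan
        print(s**2 / (s**2).sum())         # proportion of cross-covariance explained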

  4. Influence of gravity compensation on muscle activity during reach and retrieval in healthy elderly.

    NARCIS (Netherlands)

    Prange, Grada Berendina; Kallenberg, L.A.C.; Jannink, M.J.A.; Stienen, Arno; van der Kooij, Herman; IJzerman, Maarten Joost; Hermens, Hermanus J.

    2007-01-01

    INTRODUCTION: Arm support like gravity compensation may improve arm movements during stroke rehabilitation. It is unknown how gravity compensation affects muscle activation patterns during reach and retrieval movements. Since muscle activity during reach is represented by a component varying with

  5. RT-Syn: A real-time software system generator

    Science.gov (United States)

    Setliff, Dorothy E.

    1992-01-01

    This paper presents research into providing highly reusable and maintainable components by using automatic software synthesis techniques. This proposal uses domain knowledge combined with automatic software synthesis techniques to engineer large-scale mission-critical real-time software. The hypothesis centers on a software synthesis architecture that specifically incorporates application-specific (in this case real-time) knowledge. This architecture synthesizes complex system software to meet a behavioral specification and external interaction design constraints. Some examples of these external constraints are communication protocols, precisions, timing, and space limitations. The incorporation of application-specific knowledge facilitates the generation of mathematical software metrics which are used to narrow the design space, thereby making software synthesis tractable. Success has the potential to dramatically reduce mission-critical system life-cycle costs, not only by reducing development time but, more importantly, by facilitating maintenance, modifications, and extensions of complex mission-critical software systems, which currently dominate life-cycle costs.

  6. Component Verification and Certification in NASA Missions

    Science.gov (United States)

    Giannakopoulou, Dimitra; Penix, John; Norvig, Peter (Technical Monitor)

    2001-01-01

    Software development for NASA missions is a particularly challenging task. Missions are extremely ambitious scientifically, have very strict time frames, and must be accomplished with a maximum degree of reliability. Verification technologies must therefore be pushed far beyond their current capabilities. Moreover, reuse and adaptation of software architectures and components must be incorporated in software development within and across missions. This paper discusses NASA applications that we are currently investigating from these perspectives.

  7. Techniques and tools for software qualification in KNICS

    International Nuclear Information System (INIS)

    Cha, Kyung H.; Lee, Yeong J.; Cheon, Se W.; Kim, Jang Y.; Lee, Jang S.; Kwon, Kee C.

    2004-01-01

    This paper describes techniques and tools for qualifying safety software in the Korea Nuclear Instrumentation and Control System (KNICS). Safety software is developed and applied for a Reactor Protection System (RPS), an Engineered Safety Features and Component Control System (ESF-CCS), and a safety Programmable Logic Controller (PLC) in the KNICS. Requirements and design specifications of safety software are written in both natural language and formal specification languages. Statechart is used for formal specification of the software of the ESF-CCS and the safety PLC, while NuSCR is used for the software of the RPS. pSET (POSCON Software Engineering Tool) has been developed and utilized as a software development tool for IEC 61131-3 based PLC programming. Qualification of the safety software consists of software verification and validation (V and V) throughout the software life cycle, software safety analysis, software configuration management, software quality assurance, and COTS (Commercial-Off-The-Shelf) dedication. The criteria and requirements for qualifying the safety software have been established in accordance with Software Review Plan (SRP)/Branch Technical Position (BTP)-14, IEEE Std. 7-4.3.2-1998, NUREG/CR-6463, IEEE Std. 1012-1998, and so on. Figure 1 summarizes the qualification techniques and tools for the safety software

  8. Applying of component system development in object methodology, case study

    Directory of Open Access Journals (Sweden)

    Milan Mišovič

    2013-01-01

    Creating target software as a component system has been a strong requirement throughout the last 20 years of software development. Architectural components are self-contained units that present both partial and overall system behavior and cooperate with each other on the basis of their interfaces. Among other benefits, components allow flexible modification of the processes whose behavior underlies component behavior, without disturbing the life of the component system. On the other hand, a component system makes it possible, at design time, to create numerous new connections between components and thus produce modified system behaviors. All this enables company management to make, at design time, the required behavioral changes to processes in accordance with the demands of changing production and markets. Software development, generally referred to as SDP (Software Development Process), has two directions. The first, called CBD (Component-Based Development), is dedicated to the development of component-based systems (CBS, Component-Based System); the second targets the development of software under the influence of SOA (Service-Oriented Architecture). Both directions are equipped with their own development methodologies. The subject of this paper is only the first direction and the application of component-based system development within object-oriented methodologies. Today's requirement is that the development of component-based systems be carried out within established object-oriented methodologies as a dominant style. In some well-known methodologies, however, this development is not completely transparent and is not even recognized as dominant. In some cases, it is corrected by special meta-integration models that embed component-system development into an object methodology. This paper presents a case study
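
    The component notions described above, self-contained units cooperating through interfaces whose connections can be rewired at design time to change system behavior, can be made concrete in a minimal sketch. All names below are hypothetical.

        from typing import Protocol

        class Pricing(Protocol):
            def price(self, sku: str) -> float: ...

        class ListPricing:
            """Self-contained component exposing the Pricing interface."""
            def price(self, sku: str) -> float:
                return {"A": 10.0, "B": 25.0}.get(sku, 0.0)

        class DiscountPricing:
            """Drop-in replacement: rewiring it changes system behavior
            without touching the Checkout component."""
            def price(self, sku: str) -> float:
                return 0.9 * ListPricing().price(sku)

        class Checkout:
            def __init__(self, pricing: Pricing):   # connection made at design time
                self.pricing = pricing
            def total(self, skus):
                return sum(self.pricing.price(s) for s in skus)

        print(Checkout(ListPricing()).total(["A", "B"]))      # 35.0
        print(Checkout(DiscountPricing()).total(["A", "B"]))  # 31.5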

  9. The Wikipedia Image Retrieval Task

    NARCIS (Netherlands)

    T. Tsikrika (Theodora); J. Kludas

    2010-01-01

    The Wikipedia image retrieval task at ImageCLEF provides a testbed for the system-oriented evaluation of visual information retrieval from a collection of Wikipedia images. The aim is to investigate the effectiveness of retrieval approaches that exploit textual and visual evidence in the

  10. DORS: DDC Online Retrieval System.

    Science.gov (United States)

    Liu, Songqiao; Svenonius, Elaine

    1991-01-01

    Describes the Dewey Online Retrieval System (DORS), which was developed at the University of California, Los Angeles (UCLA), to experiment with classification-based search strategies in online catalogs. Classification structures in automated information retrieval are discussed; and specifications for a classification retrieval interface are…

  11. Single-Shell Tank (SST) Retrieval Project Plan for Tank 241-C-104 Retrieval

    International Nuclear Information System (INIS)

    DEFIGH PRICE, C.

    2000-01-01

    In support of the SST Interim Closure Project, Project W-523, "Tank 241-C-104 Waste Retrieval System," will provide systems for retrieval and transfer of radioactive waste from tank 241-C-104 (C-104) to the DST staging tank 241-AY-101 (AY-101). At the conclusion of Project W-523, a retrieval system will have been designed and tested to meet the requirements for Acceptance of Beneficial Use and turned over to operations. Completion of construction and operation of the C-104 retrieval system will meet the recently proposed near-term Tri-Party Agreement milestone M-45-03F (proposed Tri-Party Agreement change request M-45-00-01A, August 30, 2000) for demonstrating the limits of retrieval technologies on sludge and hard heels in SSTs, reduce near-term storage risks associated with aging SSTs, and provide feed for the tank waste treatment plant. This Project Plan documents the methodology for managing Project W-523; formalizes responsibilities; identifies key interfaces required to complete the retrieval action; establishes the technical, cost, and schedule baselines; and identifies project organizational requirements pertaining to the engineering process, such as environmental, safety, quality assurance, change control, design verification, testing, and operational turnover

  12. Flexible event reconstruction software chains with the ALICE High-Level Trigger

    International Nuclear Information System (INIS)

    Ram, D; Breitner, T; Szostak, A

    2012-01-01

    The ALICE High-Level Trigger (HLT) has a large high-performance computing cluster at CERN whose main objective is to perform real-time analysis on the data generated by the ALICE experiment and scale it down to at most 4 GB/s, which is the current maximum mass-storage bandwidth available. Data flow in this cluster is controlled by a custom-designed software framework. It consists of a set of components that can communicate with each other via a common control interface. The software framework also supports the creation of different configurations based on the detectors participating in the HLT. These configurations define a logical data-processing “chain” of detector data-analysis components. Data flow through this software chain in a pipelined fashion so that several events can be processed at the same time. An instance of such a chain can run and manage a few thousand physics analysis and data-flow components. The HLT software and the configuration scheme used in the 2011 heavy-ion runs of ALICE are discussed in this contribution.
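
    The pipelined chain described above can be sketched as components joined by queues, so that several events are in flight at once. The sketch below is a toy illustration in Python, not the actual HLT framework.

        import queue, threading

        def component(name, fn, inbox, outbox):
            """One chain component: pull an event, process it, push it on."""
            while True:
                event = inbox.get()
                if event is None:            # shutdown marker propagates downstream
                    outbox.put(None)
                    break
                outbox.put(fn(event))

        q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
        threading.Thread(target=component, args=("calib", lambda e: e * 2, q1, q2)).start()
        threading.Thread(target=component, args=("track", lambda e: e + 1, q2, q3)).start()

        for event in range(5):
            q1.put(event)                    # events flow through the chain pipelined
        q1.put(None)

        while (out := q3.get()) is not None:
            print(out)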

  13. Review of Software Reliability Assessment Methodologies for Digital I and C Software of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jae Hyun; Lee, Seung Jun; Jung, Won Dea [KAERI, Daejeon (Korea, Republic of)

    2014-08-15

    Digital instrumentation and control (I and C) systems are increasingly being applied in current nuclear power plants (NPPs) due to their advantages: zero drift, advanced data-calculation capacity, and design flexibility. Accordingly, safety issues concerning software, the main part of a digital I and C system, have been raised. As with hardware components, a software failure in an NPP could lead to a large disaster; therefore, failure-rate testing and reliability assessment of software should be properly performed before the software is adopted in NPPs. However, reliability assessment of software is quite different from that of hardware, owing to the difference in nature between the two. One of the most significant differences is that software failures arise from design faults ('error crystals'), whereas hardware failures are caused by deficiencies in design, production, and maintenance. For this reason, software reliability assessment has focused on the optimal release time from an economic point of view. However, the safety goals and public acceptance of NPPs are so distinct from those of other industries that software in NPPs depends on quantitative reliability values rather than economics. The safety goals of NPPs are exceptionally high compared to other industries, so conventional software reliability assessment methodologies already used in other industries cannot meet them. Thus, a new reliability assessment methodology for digital I and C software in NPPs needs to be developed. In this paper, existing software reliability assessment methodologies are reviewed to identify their pros and cons, and the usefulness of each method for NPP software is assessed.

  14. Review of Software Reliability Assessment Methodologies for Digital I and C Software of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Cho, Jae Hyun; Lee, Seung Jun; Jung, Won Dea

    2014-01-01

    Digital instrumentation and control (I and C) systems are increasingly being applied in current nuclear power plants (NPPs) due to their advantages: zero drift, advanced data-calculation capacity, and design flexibility. Accordingly, safety issues concerning software, the main part of a digital I and C system, have been raised. As with hardware components, a software failure in an NPP could lead to a large disaster; therefore, failure-rate testing and reliability assessment of software should be properly performed before the software is adopted in NPPs. However, reliability assessment of software is quite different from that of hardware, owing to the difference in nature between the two. One of the most significant differences is that software failures arise from design faults ('error crystals'), whereas hardware failures are caused by deficiencies in design, production, and maintenance. For this reason, software reliability assessment has focused on the optimal release time from an economic point of view. However, the safety goals and public acceptance of NPPs are so distinct from those of other industries that software in NPPs depends on quantitative reliability values rather than economics. The safety goals of NPPs are exceptionally high compared to other industries, so conventional software reliability assessment methodologies already used in other industries cannot meet them. Thus, a new reliability assessment methodology for digital I and C software in NPPs needs to be developed. In this paper, existing software reliability assessment methodologies are reviewed to identify their pros and cons, and the usefulness of each method for NPP software is assessed.
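
    The release-time-oriented methodologies the review refers to are typically reliability growth models; a classic example is the Goel-Okumoto NHPP model, m(t) = a(1 - e^(-bt)), where a is the expected total number of faults and b the detection rate. The sketch below fits it to invented failure-count data.

        import numpy as np
        from scipy.optimize import curve_fit

        def goel_okumoto(t, a, b):
            """Expected cumulative failures by time t: m(t) = a * (1 - exp(-b t))."""
            return a * (1.0 - np.exp(-b * t))

        # Hypothetical cumulative failure counts over 10 weeks of testing
        weeks = np.arange(1, 11, dtype=float)
        failures = np.array([8, 14, 19, 23, 26, 28, 30, 31, 32, 33], dtype=float)

        (a, b), _ = curve_fit(goel_okumoto, weeks, failures, p0=(40.0, 0.2))
        print(f"estimated total faults a = {a:.1f}, detection rate b = {b:.3f}")
        print(f"residual faults after 10 weeks ~ {a - goel_okumoto(10.0, a, b):.1f}")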

  15. Retrieval process development and enhancements: Hydraulic test bed integrated testing. Fiscal year 1995 technology development summary report

    International Nuclear Information System (INIS)

    Hatchell, B.K.; Smalley, J.T.; Tucker, J.C.

    1996-02-01

    The Retrieval Process Development and Enhancements Program is sponsored by the U.S. Department of Energy (DOE) Office of Science and Technology to investigate waste dislodging and conveyance processes suitable for the retrieval of high-level radioactive waste. This program, drawing on industry, national laboratories, and academia, is testing the performance of high-pressure waterjet dislodging integrated with pneumatic conveyance as a scarifier, a means of retrieving waste from inside waste storage tanks. Waste simulants have been designed to challenge this retrieval process, and the technology has been shown to mobilize and convey the waste simulants at target retrieval rates while operating within the space envelope and the dynamic-loading constraints of postulated deployment systems. The approach has been demonstrated to be versatile in dislodging and conveying a broad range of waste forms, from hard wastes to soft sludge wastes, through the use of simple and reliable in-tank components

  16. Evaluation of Solid Rocket Motor Component Data Using a Commercially Available Statistical Software Package

    Science.gov (United States)

    Stefanski, Philip L.

    2015-01-01

    Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics that numerically and graphically summarize both sample and population data, (2) inferential statistics that draw conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work conducted using Statgraphics® statistical software to obtain useful information regarding solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn-rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating an experimental design process that yields meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results derive not only from the inherent power of the software package, but also from the skill and understanding of the data analyst.
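
    The probability determinations mentioned in (3) often take the form of one-sided tolerance bounds on strength data. A minimal sketch using SciPy follows; the measurements are invented, and this is not the Statgraphics workflow.

        import numpy as np
        from scipy import stats

        # Hypothetical propellant tensile strengths (MPa)
        x = np.array([2.41, 2.55, 2.38, 2.47, 2.52, 2.44, 2.50, 2.46, 2.43, 2.49])
        n, mean, sd = x.size, x.mean(), x.std(ddof=1)

        # One-sided lower tolerance bound covering 99% of the population with
        # 95% confidence: k = nct.ppf(conf, df=n-1, nc=z_p*sqrt(n)) / sqrt(n)
        p, conf = 0.99, 0.95
        k = stats.nct.ppf(conf, df=n - 1, nc=stats.norm.ppf(p) * np.sqrt(n)) / np.sqrt(n)
        allowable = mean - k * sd
        print(f"99/95 lower allowable = {allowable:.3f} MPa")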

  17. Applying the metro map to software development management

    Science.gov (United States)

    Aguirregoitia, Amaia; Dolado, J. Javier; Presedo, Concepción

    2010-01-01

    This paper presents MetroMap, a new graphical representation model for controlling and managing the software development process. MetroMap uses metaphors and visual representation techniques to explore several key indicators in order to support problem detection and resolution. The resulting visualization addresses diverse management tasks, such as tracking deviations from the plan, analysis of patterns of failure detection and correction, overall assessment of change management policies, and estimation of product quality. The proposed visualization uses a metro map metaphor along with various interactive techniques to represent information concerning the software development process and to deal efficiently with multivariate visual queries. Finally, the paper shows the implementation of the tool in JavaFX with data from a real project, and the results of testing the tool with that data and with users attempting several information retrieval tasks. The conclusion presents the results of analyzing user response time and efficiency using the MetroMap visualization system. The utility of the tool was positively evaluated.

  18. Terra Harvest software architecture

    Science.gov (United States)

    Humeniuk, Dave; Klawon, Kevin

    2012-06-01

    Under the Terra Harvest Program, the DIA has the objective of developing a universal Controller for the Unattended Ground Sensor (UGS) community. The mission is to define, implement, and thoroughly document an open architecture that universally supports UGS missions, integrating disparate systems, peripherals, etc. The Controller's inherent interoperability with numerous systems enables the integration of both legacy and future UGS System (UGSS) components, while the design's open architecture supports rapid third-party development to ensure operational readiness. The successful accomplishment of these objectives by the program's Phase 3b contractors is demonstrated via integration of the companies' respective plug-'n'-play contributions that include controllers, various peripherals, such as sensors, cameras, etc., and their associated software drivers. In order to independently validate the Terra Harvest architecture, L-3 Nova Engineering, along with its partner, the University of Dayton Research Institute, is developing the Terra Harvest Open Source Environment (THOSE), a Java Virtual Machine (JVM) running on an embedded Linux Operating System. The Use Cases on which the software is developed support the full range of UGS operational scenarios such as remote sensor triggering, image capture, and data exfiltration. The Team is additionally developing an ARM microprocessor-based evaluation platform that is both energy-efficient and operationally flexible. The paper describes the overall THOSE architecture, as well as the design decisions for some of the key software components. Development process for THOSE is discussed as well.

  19. Test process for the safety-critical embedded software

    International Nuclear Information System (INIS)

    Sung, Ahyoung; Choi, Byoungju; Lee, Jangsoo

    2004-01-01

    Digitalization of nuclear Instrumentation and Control (I and C) systems requires high reliability of not only hardware but also software. A Verification and Validation (V and V) process is recommended for software reliability, but a more quantitative method, such as software testing, is also necessary. Most of the software in nuclear I and C systems is safety-critical embedded software. Safety-critical embedded software is specified, verified, and developed according to the V and V process. Hence, two types of software testing techniques are necessary for the developed code. First, code-based software testing is required to examine the developed code itself. Second, after code-based testing, testing of the software as affected by the hardware is required to reveal interaction faults that may cause unexpected results. We call this testing of hardware's influence on software interaction testing. In the case of safety-critical embedded software, it is important to consider the interaction between hardware and software: even if no faults are detected when testing either hardware or software alone, combining these components may lead to unexpected results due to the interaction. In this paper, we propose a software test process that embraces test levels, test techniques, and the required test tasks and documents for safety-critical embedded software. We apply the proposed test process to safety-critical embedded software as a case study and show its effectiveness. (author)
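
    The interaction testing described above, exercising the software across combinations of hardware conditions, can be sketched as follows. The states, inputs, and the injected fault are all invented for illustration.

        import itertools

        hardware_states = {"voltage": ["nominal", "low"], "temp": ["25C", "70C"]}
        software_inputs = [0, 1, 255]

        def run_case(volts, temp, value):
            """Stand-in for executing the embedded code on real hardware:
            here, a fault appears only for the low-voltage / high-temp pair."""
            return not (volts == "low" and temp == "70C" and value == 255)

        # Exhaustive hardware x software combinations (pairwise selection
        # would be used when the full product is too large)
        for volts, temp, value in itertools.product(hardware_states["voltage"],
                                                    hardware_states["temp"],
                                                    software_inputs):
            if not run_case(volts, temp, value):
                print("interaction fault:", volts, temp, value)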

  20. Development of retrievability plans

    International Nuclear Information System (INIS)

    Richardson, P.J.

    1999-03-01

    It has become clear, from monitoring many national programmes for the siting of final repositories for radioactive waste, that the potential (or otherwise) for retrieval of emplaced wastes is the one issue repeatedly raised during public consultation and interaction. Although even those repositories that may be constructed over the next decades will operate for many decades more and be sealed only after a long-term monitoring phase, there is little operational pressure to finalise retrievability concepts. However, as siting processes require detailed conceptual designs to be developed, as do the associated safety assessment exercises, it is becoming increasingly recognised that the potential for retrieval must be examined now. This report is the culmination of a short project carried out for the Swedish National Co-ordinator for Nuclear Waste Disposal to examine the situation regarding the development and possible implementation of retrievability as an integral part of a disposal concept for nuclear waste. Because of the short work period involved, it can at best be only an overview, designed to provide a broad picture of current plans. The Swedish Nuclear Power Inspectorate has begun to examine the issue, and a report is due later in 1999. A major collaborative investigation, which began in March 1998, is also currently underway under the auspices of the EU, but it involves only implementing agencies from the various Member States. This report is intended to serve as background to these other studies when they appear. Utilising currently available information, as well as personal contacts, those countries currently examining retrievability or reversibility of disposal in some form have been identified. Information regarding these proposals has been collated, and contact made with relevant agencies and national regulatory bodies where possible. The report includes some review of the technical aspects of retrievability, with especial