WorldWideScience

Sample records for globally distributed software

  1. Management of Globally Distributed Component-Based Software Development Projects

    NARCIS (Netherlands)

    J. Kotlarsky (Julia)

    2005-01-01

    Globally Distributed Component-Based Development (GD CBD) is expected to become a promising area, as increasing numbers of companies are setting up software development in a globally distributed environment and at the same time are adopting CBD methodologies. Being an emerging area, the

  2. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    Professional software products and IT systems and services today are developed mostly by globally distributed teams, projects, and companies. Successfully orchestrating Global Software Engineering (GSE) has become the major success factor both for organizations and practitioners. Yet, more than...... and experience reported at the IEEE International Conference on Global Software Engineering (ICGSE) series. The outcomes of our analysis show GSE as a field highly attached to industry and, thus, a considerable share of ICGSE papers address the transfer of Software Engineering concepts and solutions to the global stage...

  3. A conceptual framework to study the role of communication through social software for coordination in globally-distributed software teams

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2015-01-01

    Background In Global Software Development (GSD) the lack of face-to-face communication is a major challenge and effective computer-mediated practices are necessary to mitigate the effect of physical distance. Communication through Social Software (SoSo) supports team coordination, helping to deal...... with geographical distance; however, in Software Engineering literature, there is a lack of suitable theoretical concepts to analyze and describe everyday practices of globally-distributed software development teams and to study the role of communication through SoSo. Objective The paper proposes a theoretical...... framework for analyzing how communicative and coordinative practices are constituted and maintained in globally-distributed teams. Method The framework is based on the concepts of communicative genres and coordination mechanisms; it is motivated and explicated through examples from two qualitative empirical...

  4. A cloud based model to facilitate software development outsourcing to globally distributed locations

    OpenAIRE

    Hashmi, Sajid Ibrahim; Richardson, Ita

    2013-01-01

    Outsourcing is an essential part of global software development and entails software development distributed across geographical borders. More specifically, it deals with software development teams dispersed across multiple geographical locations to carry out software development activities. By means of this business model, organizations expect to benefit from enhanced corporate value through advantages such as round the clock software development, availability of skills and ...

  5. Global Software and IT: A Guide to Distributed Development, Projects, and Outsourcing

    CERN Document Server

    Ebert, Christof

    2011-01-01

    Global software engineering, implying both internal and outsourced development, is a fast-growing scenario within industry; the growth rates in some sectors are more than 20% per year. However, half of all offshoring activities are cancelled within the first 2 years, at tremendous unanticipated cost to the organization. This book will provide a more balanced framework for planning global development, covering topics such as managing people in distributed sites, managing a project across locations, mitigating the risk of offshoring, processes for global development, practical outsourcing

  6. Understanding flexible and distributed software development processes

    OpenAIRE

    Agerfalk, Par J.; Fitzgerald, Brian

    2006-01-01

    The minitrack on Flexible and Distributed Software Development Processes addresses two important and partially intertwined current themes in software development: process flexibility and globally distributed software development.

  7. Management of Globally Distributed Software Development Projects in Multiple-Vendor Constellations

    Science.gov (United States)

    Schott, Katharina; Beck, Roman; Gregory, Robert Wayne

    Global information systems development outsourcing is an apparent trend that is expected to continue in the foreseeable future. Thereby, IS-related services are not only increasingly provided from different geographical sites simultaneously but beyond that from multiple service providers based in different countries. The purpose of this paper is to understand how the involvement of multiple service providers affects the management of the globally distributed information systems development projects. As research on this topic is scarce, we applied an exploratory in-depth single-case study design as research approach. The case we analyzed comprises a global software development outsourcing project initiated by a German bank together with several globally distributed vendors. For data collection and data analysis we have adopted techniques suggested by the grounded theory method. Whereas the extant literature points out the increased management overhead associated with multi-sourcing, the analysis of our case suggests that the required effort for managing global outsourcing projects with multiple vendors depends among other things on the maturation level of the cooperation within the vendor portfolio. Furthermore, our data indicate that this interplay maturity is positively impacted through knowledge about the client that has been derived based on already existing client-vendor relationships. The paper concludes by offering theoretical and practical implications.

  8. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  9. Socio-Cultural Challenges in Global Software Engineering Education

    Science.gov (United States)

    Hoda, Rashina; Babar, Muhammad Ali; Shastri, Yogeshwar; Yaqoob, Humaa

    2017-01-01

    Global software engineering education (GSEE) is aimed at providing software engineering (SE) students with knowledge, skills, and understanding of working in globally distributed arrangements so they can be prepared for the global SE (GSE) paradigm. It is important to understand the challenges involved in GSEE for improving the quality and…

  10. How Social Software Supports Cooperative Practices in a Globally Distributed Software Project

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2014-01-01

    In Global Software Development (GSD), the lack of face- to-face communication is a major challenge and effective computer-mediated practices are necessary. This paper analyzes cooperative practices supported by Social Software (SoSo) in a GSD student project. The empirical results show...... that the role of SoSo is to support informal communication, enabling social talks and metawork, both necessary for establishing and for maintaining effective coordination mechanisms, thus successful cooperation....

  11. Coordination and Control of Globally Distributed Software Projects

    NARCIS (Netherlands)

    P.C. van Fenema (Paul)

    2002-01-01

    Recently, software development and implementation projects have globalized at a rapid pace. Companies in North America, Europe, and the Far East are beginning to integrate international Information Technology (IT) resources to support operations across the globe. Offshore IT services

  12. Exploring the Role of Social Software in Global Software Development Projects

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Y.

    2011-01-01

    We present a PhD project that investigates the use of Social Software (SoSo) in Global Software Development (GSD) teams. Since SoSo is unstructured and informal in its very nature, we explore how informal communication, which is challenging in GSD, is supported by SoSo in distributed teams and how...

  13. Of deadlocks and peopleware - collaborative work practices in global software development

    OpenAIRE

    Avram, Gabriela

    2007-01-01

    As part of a research project dedicated to the Social Organizational and Cultural Aspects of Global Software Development, the author has chosen to focus on collaborative work practices and knowledge management aspects of collaborative work. More precisely, the focus is on how the global distribution of software development affects collaborative work. The current paper is a first attempt to unveil, through a concrete situation observed in a distributed software development ...

  14. Supporting Trust in Globally Distributed Software Teams: The Impact of Visualized Collaborative Traces on Perceived Trustworthiness

    Science.gov (United States)

    Trainer, Erik Harrison

    2012-01-01

    Trust plays an important role in collaborations because it creates an environment in which people can openly exchange ideas and information with one another and engineer innovative solutions together with less perceived risk. The rise in globally distributed software development has created an environment in which workers are likely to have less…

  15. Globally distributed software defined storage (proposal)

    Science.gov (United States)

    Shevel, A.; Khoruzhnikov, S.; Grudinin, V.; Sadov, O.; Kairkanov, A.

    2017-10-01

    The volume of incoming data in HEP is growing, and so is the volume of data that must be held for a long time. Such large volumes of data (big data) are distributed around the planet, so methods and approaches for organizing and managing globally distributed data storage are required. Several distributed storage systems exist for personal needs, such as own-cloud.org, pydio.com, seafile.com, and sparkleshare.org. At the enterprise level there are a number of systems, such as SWIFT (the distributed storage component of Openstack) and CEPH, which are mostly object storage. When the resources of several data centers are integrated, the organization of data links becomes a very important issue, especially if several parallel data links between data centers are used. The situation in data centers and in data links may vary from hour to hour, which means each part of the distributed data storage has to be able to rearrange its usage of data links and storage servers in each data center. In addition, different customers of the distributed storage may have different requirements. These topics are planned for discussion in the data storage proposal.

  16. Global Software Development with Cloud Platforms

    Science.gov (United States)

    Yara, Pavan; Ramachandran, Ramaseshan; Balasubramanian, Gayathri; Muthuswamy, Karthik; Chandrasekar, Divya

    Offshore and outsourced distributed software development models and processes are facing challenges, previously unknown, with respect to computing capacity, bandwidth, storage, security, complexity, reliability, and business uncertainty. Clouds promise to address these challenges by adopting recent advances in virtualization, parallel and distributed systems, utility computing, and software services. In this paper, we envision a cloud-based platform that addresses some of these core problems. We outline a generic cloud architecture, its design and our first implementation results for three cloud forms - a compute cloud, a storage cloud and a cloud-based software service - in the context of globally distributed software development (GSD). Our "compute cloud" provides computational services such as continuous code integration and a compile server farm, the "storage cloud" offers storage (block or file-based) services with an on-line virtual storage service, whereas the on-line virtual labs represent a useful cloud service. We note some of the use cases for clouds in GSD, the lessons learned with our prototypes and identify challenges that must be conquered before realizing the full business benefits. We believe that in the future, software practitioners will focus more on these cloud computing platforms and see clouds as a means of supporting an ecosystem of clients, developers and other key stakeholders.

  17. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    SOFTWARE, LIKE ALL industry products, is the result of complex multinational supply chains with many partners from concept to development to production and maintenance. Global software engineering (GSE), IT outsourcing, and business process outsourcing during the past decade have shown growth...... rates of 10 to 20 percent per year. This instalment of Practitioner’s Digest summarizes experiences and guidance from industry to facilitate knowledge and technology transfer for GSE. It’s based on industry feedback from the annual IEEE International Conference on Global Software Engineering, which had...

  18. Is Scrum fit for global software engineering?

    DEFF Research Database (Denmark)

    Lous, Pernille; Kuhrmann, Marco; Tell, Paolo

    2017-01-01

    Distributed software engineering and agility are strongly pushing on today's software industry. Due to inherent incompatibilities, for years, studying Scrum and its application in distributed setups has been subject to theoretical and applied research, and an increasing body of knowledge reports...... insights into this combination. Through a systematic literature review, this paper contributes a collection of experiences on the application of Scrum to global software engineering (GSE). In total, we identified 40 challenges in 19 categories practitioners face when using Scrum in GSE. Among...... the challenges, scaling Scrum to GSE and adopting practices accordingly are the most frequently named. Our findings also show that most solution proposals aim at modifying elements of the Scrum core processes. We thus conclude that, even though Scrum allows for extensive modification, Scrum itself represents...

  19. Towards a New Paradigm of Software Development: an Ambassador Driven Process in Distributed Software Companies

    Science.gov (United States)

    Kumlander, Deniss

    The globalization of companies' operations and competition between software vendors demand improved quality of delivered software and decreased overall cost. At the same time, this introduces many problems into the software development process, as it produces distributed organizations that break the co-location rule of modern software development methodologies. Here we propose a reformulation of the ambassador position, increasing its productivity in order to bridge communication and workflow gaps by managing the entire communication process rather than concentrating purely on the communication result.

  20. Getting agile methods to work for Cordys global software product development

    NARCIS (Netherlands)

    van Hillegersberg, Jos; Ligtenberg, Gerwin; Aydin, M.N.; Kotlarsky, J.; Willcocks, L.P.; Oshri, I.

    2011-01-01

    Getting agile methods to work in global software development is a potentially rewarding but challenging task. Agile methods are relatively young and still maturing. The application to globally distributed projects is in its early stages. Various guidelines on how to apply and sometimes adapt agile

  1. A Quantitative Study of Global Software Development Teams, Requirements, and Software Projects

    Science.gov (United States)

    Parker, Linda L.

    2016-01-01

    The study explored the relationship between global software development teams, effective software requirements, and stakeholders' perception of successful software development projects within the field of information technology management. It examined the critical relationship between Global Software Development (GSD) teams creating effective…

  2. Managing Distributed Software Projects

    DEFF Research Database (Denmark)

    Persson, John Stouby

    Increasingly, software projects are becoming geographically distributed, with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. This PhD study reports on how we can understand and support the management...... of distributed software projects, based on a literature study and a case study. The main emphasis of the literature study was on how to support the management of distributed software projects, but also contributed to an understanding of these projects. The main emphasis of the case study was on how to understand...... the management of distributed software projects, but also contributed to supporting the management of these projects. The literature study integrates what we know about risks and risk-resolution techniques, into a framework for managing risks in distributed contexts. This framework was developed iteratively...

  3. Global software development

    DEFF Research Database (Denmark)

    Matthiesen, Stina

    2016-01-01

    This overview presents the mid stages of my doctoral research-based on ethnographic work conducted in IT companies in India and in Denmark-on collaborative work within global software development (GSD). In the following I briefly introduce how this research seeks to spark a debate in CSCW...... by challenging contemporary ideals about software development outsourcing through the exploration of the multiplicities and asymmetric dynamics inherent in the collaborative work of GSD....

  4. Software project management tools in global software development: a systematic mapping study.

    Science.gov (United States)

    Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio

    2016-01-01

    Global software development (GSD), which is a growing trend in the software industry, is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in the literature that provide GSD project managers with support and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the data extracted and presented the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.

  5. Can agile software tools bring the benefits of a task board to globally distributed teams?

    NARCIS (Netherlands)

    Katsma, Christiaan; Amrit, Chintan Amrit; van Hillegersberg, Jos; Sikkel, Nicolaas; Oshri, Ilan; Kotlarsky, Julia; Willcocks, Leslie P.

    Software-based tooling has become an essential part of globally distributed software development. In this study we focus on the usage of such tools and task boards in particular. We investigate the deployment of these tools through field research in 4 different companies that feature agile and

  6. Auto-Erecting Virtual Office Walls: Constructing a Virtual Office for Global Software Engineers

    NARCIS (Netherlands)

    Van Gameren, B.J.A.

    2014-01-01

    Due to the globalization of business and the rising popularity of working from home, global software engineering is becoming increasingly common. In such a distributed environment, team members no longer share a physical work environment and should be provided with information they need to

  7. Teamwork in Distributed Agile Software Development

    OpenAIRE

    Gurram, Chaitanya; Bandi, Srinivas Goud

    2013-01-01

    Context: Distributed software development has become a highly desirable way of developing software. The application of agile development methodologies in distributed environments has become a new trend in developing software due to its benefits of improved communication and collaboration. Teamwork is an important concept that agile methodologies facilitate, and it is one of the potential determinants of team performance that has not been a focus in distributed agile software development. Objectives: This res...

  8. Agile distributed software development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Aaen, Ivan

    2012-01-01

    While face-to-face interaction is fundamental in agile software development, distributed environments must rely extensively on mediated interactions. Practicing agile principles in distributed environments therefore poses particular control challenges related to balancing fixed vs. evolving quality...... requirements and people vs. process-based collaboration. To investigate these challenges, we conducted an in-depth case study of a successful agile distributed software project with participants from a Russian firm and a Danish firm. Applying Kirsch’s elements of control framework, we offer an analysis of how...

  9. Recent Topical Research on Global, Energy, Health & Medical, and Tourism Economics, and Global Software

    OpenAIRE

    Chang, Chia-Lin; McAleer, Michael

    2017-01-01

    The paper presents an overview of recent topical research on global, energy, health & medical, and tourism economics, and global software. We have interpreted “global” in the title of the Journal of Reviews on Global Economics to cover contributions that have a global impact on economics, thereby making it “global economics”. In this sense, the paper is concerned with papers on global, energy, health & medical, and tourism economics, as well as global software algorithms that have...

  10. Software testing and global industry future paradigms

    CERN Document Server

    Casey, Valentine; Richardson, Ita

    2009-01-01

    Today software development has truly become a globally sourced commodity. This trend has been facilitated by the availability of highly skilled software professionals in low cost locations in Eastern Europe, Latin America and the Far East. Organisations

  11. Fighting Software Piracy: Some Global Conditional Policy Instruments

    OpenAIRE

    Asongu, Simplice A; Singh, Pritam; Le Roux, Sara

    2016-01-01

    This study examines the efficiency of tools for fighting software piracy in the conditional distributions of software piracy. Our paper examines software piracy in 99 countries for the period 1994-2010, using contemporary and non-contemporary quantile regressions. The intuition for modelling distributions contingent on existing levels of software piracy is that the effectiveness of tools against piracy may consistently decrease or increase simultaneously with increasing levels of software piracy...
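
    The modelling idea (policy effects estimated at different points of the conditional distribution of piracy) can be illustrated with an ordinary quantile regression. The sketch below uses synthetic data and invented variable names; it is not the authors' dataset or specification.

    ```python
    # Illustrative conditional-quantile regression; data and variable names are synthetic.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({"ipr_enforcement": rng.uniform(0, 10, n)})
    # Noise grows with enforcement, so the estimated effect differs across quantiles,
    # mimicking a tool whose effectiveness depends on the prevailing piracy level.
    df["piracy_rate"] = 80 - 2.5 * df["ipr_enforcement"] \
        + rng.normal(0, 2 + 1.5 * df["ipr_enforcement"].to_numpy(), n)

    # Estimate the effect at several points of the conditional distribution.
    for q in (0.10, 0.50, 0.90):
        res = smf.quantreg("piracy_rate ~ ipr_enforcement", df).fit(q=q)
        print(f"q={q:.2f}  marginal effect = {res.params['ipr_enforcement']:.3f}")
    ```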

  12. How does Software Process Improvement Address Global Software Engineering?

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    For decades, Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. To set up, guide, and carry out SPI projects, and to measure SPI state, impact, and success, a multitude of different SPI approaches and considerable experience are available. SPI addresses many aspects ranging from individual developer skills to entire organizations. It comprises for instance the optimization of specific activities in the software lifecycle as well as the creation of organization awareness and project culture. In the course of conducting a systematic mapping study on the state-of-the-art in SPI from a general perspective, we observed Global Software Engineering (GSE) becoming a topic of interest in recent years. Therefore, in this paper, we provide a detailed investigation of those papers from the overall systematic mapping study that were...

  13. Recent topical research on global, energy, health & medical, and tourism economics, and global software: An overview

    OpenAIRE

    Chang, Chia-Lin; McAleer, Michael

    2017-01-01

    The paper presents an overview of recent topical research on global, energy, health & medical, and tourism economics, and global software. We have interpreted "global" in the title of the Journal of Reviews on Global Economics to cover contributions that have a global impact on economics, thereby making it "global economics". In this sense, the paper is concerned with papers on global, energy, health & medical, and tourism economics, as well as global software algorithms that have...

  14. Knowledge coordination in distributed software management

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars

    2012-01-01

    Software organizations are increasingly relying on cross-organizational and cross-border collaboration, requiring effective coordination of distributed knowledge. However, such coordination is challenging due to spatial separation, diverging communities-of-practice, and unevenly distributed...... communication breakdowns on recordings of their combined teleconferencing and real-time collaborative modeling. As a result, we offer theoretical propositions that explain how distributed software managers can deal with communication breakdowns and effectively coordinate knowledge through multimodal virtual...

  15. Distribution and communication in software engineering environments. Application to the HELIOS Software Bus.

    OpenAIRE

    Jean, F. C.; Jaulent, M. C.; Coignard, J.; Degoulet, P.

    1991-01-01

    Modularity, distribution and integration are current trends in Software Engineering. To reach these goals HELIOS, a distributed Software Engineering Environment dedicated to the medical field, has been conceived and a prototype implemented. This environment is built from the collaboration of several well-encapsulated Software Components. This paper presents the architecture chosen to allow communication between the different components and focuses on the implementation details of the Software ...

  16. Exploring the role of instant messaging in a global software development project

    DEFF Research Database (Denmark)

    Dittrich, Y.; Giuffrida, Rosalba

    2011-01-01

    Communication plays a vital role in software development projects. Globally distributed teams use a mix of different communication channels to get the work done. In this paper, we report on an empirical study of a team distributed across Denmark and India. This paper explores the integration...... documentation. Our analysis provides an indication that IM can play a special role in such socio-technical communication systems: IM acts as a real-time glue between different channels. The communication through IM also provides a means to build trust and social relationships with co-workers....

  17. Architecture design in global and model-centric software development

    NARCIS (Netherlands)

    Heijstek, Werner

    2012-01-01

    This doctoral dissertation describes a series of empirical investigations into representation, dissemination and coordination of software architecture design in the context of global software development. A particular focus is placed on model-centric and model-driven software development.

  18. Designing Project Management for Global Software Development

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte; B. Balogh, Maria; Iversen, Cathrine

    2014-01-01

    Software development in distributed teams remains challenging despite rapid technical improvement in tools for communication and collaboration across distance. The challenges stem from geographical, temporal and sociocultural distance and manifest themselves in a variety of difficulties ... of distributed software teams, based on a practice study and informed by well-known theories. Our work pinpoints the difficulties of handling the vital informal processes in distributed collaboration that are so vulnerable because the distances risk detaining their growth and increasing their decay rate...

  19. A Reference Architecture for Providing Tools as a Service to Support Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef

    2014-01-01

    Global Software Development (GSD) teams encounter challenges that are associated with distribution of software development activities across multiple geographic regions. The limited support for performing collaborative development and engineering activities and lack of sufficient support ... -based solutions. The restricted ability of the organizations to have desired alignment of tools with software engineering and development processes results in administrative and managerial overhead that incur increased development cost and poor product quality. Moreover, stakeholders involved in the projects have ... -computing paradigm for addressing above-mentioned issues by providing a framework to select appropriate tools as well as associated services and reference architecture of the cloud-enabled middleware platform that allows on demand provisioning of software engineering Tools as a Service (TaaS) with focus...

  20. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    Science.gov (United States)

    Post, J. V.

    1981-01-01

    Software quality metrics were extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.

  1. Towards a Reference Architecture to Provision Tools as a Service for Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef; Babar, Muhammad Ali

    2014-01-01

    Organizations involved in Global Software Development (GSD) face challenges in terms of having access to an appropriate set of tools for performing distributed engineering and development activities, integration between heterogeneous desktop and web-based tools, management of artifacts developed...... distributed environment. In this paper, we argue the need to have a cloud-enabled platform for supporting GSD and propose a reference architecture of a cloud-based Platform for providing support to provision an ecosystem of the Tools as a Service (PTaaS)....

  2. Recent Topical Research on Global, Energy, Health & Medical, and Tourism Economics, and Global Software

    NARCIS (Netherlands)

    C-L. Chang (Chia-Lin); M.J. McAleer (Michael)

    2017-01-01

    The paper presents an overview of recent topical research on global, energy, health & medical, and tourism economics, and global software. We have interpreted “global” in the title of the Journal of Reviews on Global Economics to cover contributions that have a global impact on

  3. Software reliability growth models with normal failure time distributions

    International Nuclear Information System (INIS)

    Okamura, Hiroyuki; Dohi, Tadashi; Osaki, Shunji

    2013-01-01

    This paper proposes software reliability growth models (SRGMs) in which the software failure time follows a normal distribution. The proposed model is mathematically tractable and has sufficient ability to fit software failure data. In particular, we consider the parameter estimation algorithm for the SRGM with normal distribution. The developed algorithm is based on an EM (expectation-maximization) algorithm and is quite simple to implement as a software application. Numerical experiments investigate the fitting ability of the SRGMs with normal distribution using 16 types of failure time data collected in real software projects.
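
    For orientation only, a standard NHPP-type formulation of an SRGM with normally distributed failure times can be sketched as follows; the symbols a, mu and sigma are assumed parameter names, not necessarily the notation used by the authors.

    ```latex
    % Illustrative NHPP-type SRGM with a normal failure-time distribution.
    \[
      m(t) = a\,\Phi\!\left(\frac{t-\mu}{\sigma}\right), \qquad
      \lambda(t) = \frac{dm(t)}{dt} = \frac{a}{\sigma}\,\phi\!\left(\frac{t-\mu}{\sigma}\right),
    \]
    % where a is the expected total number of faults, \Phi and \phi are the
    % standard normal CDF and PDF, m(t) the mean value function, and \lambda(t)
    % the fault-detection intensity. An EM algorithm, as mentioned in the
    % abstract, would alternate between an expectation step over unobserved
    % (censored) failure times and a maximization step updating (a, \mu, \sigma).
    ```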

  4. Compiling software for a hierarchical distributed processing system

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-12-31

    Compiling software for a hierarchical distributed processing system including providing to one or more compiling nodes software to be compiled, wherein at least a portion of the software to be compiled is to be executed by one or more nodes; compiling, by the compiling node, the software; maintaining, by the compiling node, any compiled software to be executed on the compiling node; selecting, by the compiling node, one or more nodes in a next tier of the hierarchy of the distributed processing system in dependence upon whether any compiled software is for the selected node or the selected node's descendents; sending to the selected node only the compiled software to be executed by the selected node or selected node's descendent.
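
    The claim reads as a tier-by-tier compile-and-forward procedure. Below is a minimal, hypothetical sketch of that control flow; all class, field and function names are invented for illustration and are not taken from the patent.

    ```python
    # Hypothetical sketch of compiling on one node and forwarding binaries only
    # to the next-tier node whose subtree must execute them.
    from dataclasses import dataclass, field

    @dataclass
    class Node:
        name: str
        children: list = field(default_factory=list)
        binaries: dict = field(default_factory=dict)   # compiled software kept locally

    @dataclass
    class Unit:
        source: str
        target: str        # name of the node that must execute this unit

    def descendants(node):
        """All nodes below `node` in the hierarchy."""
        for child in node.children:
            yield child
            yield from descendants(child)

    def compile_source(unit):
        """Stand-in for the real compiler invocation."""
        return f"compiled({unit.source})"

    def distribute(compiling_node, units):
        """Compile everything, keep local binaries, forward the rest selectively."""
        for unit in units:
            binary = compile_source(unit)
            if unit.target == compiling_node.name:
                compiling_node.binaries[unit.source] = binary      # executed here
                continue
            # send only to the next-tier child whose subtree contains the target
            for child in compiling_node.children:
                subtree = {child.name} | {n.name for n in descendants(child)}
                if unit.target in subtree:
                    child.binaries[unit.source] = binary           # "send" downstream
                    break
    ```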

  5. Effects of Individual Success on Globally Distributed Team Performance

    OpenAIRE

    Yılmaz, Onur

    2013-01-01

    The necessity of different competencies with a high level of knowledge makes it inevitable that software development is team work. With today's technology, teams can communicate both synchronously and asynchronously using different online collaboration tools throughout the world. Research indicates that there are many factors that affect team success, and in this paper the effect of individual success on globally distributed team performance is analyzed. Student team projects undertaken...

  6. Empirical Studies on the Use of Social Software in Global Software Development - a Systematic Mapping Study

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2013-01-01

    of empirical studies on the usage of SoSo are available in related fields, there exists no comprehensive overview of what has been investigated to date across them. Objective: The aim of this review is to map empirical studies on the usage of SoSo in Software Engineering projects and in distributed teams...... for collaborative work, fostering awareness, knowledge management and coordination among team members. Contrary to the evident high importance of the social aspects offered by SoSo, socialization is not the most important usage reported. Conclusions: This review reports how SoSo is used in GSD and how it is capable...... of supporting GSD teams. Four emerging themes in global software engineering were identified: the appropriation and development of usage structures; understanding how an ecology of communication channels and tools are used by teams; the role played by SoSo either as a subtext or as an explicit goal; and finally...

  7. Revisiting the Global Software Engineering Terminology

    DEFF Research Database (Denmark)

    Tell, Paolo; Giuffrida, Rosalba; Shah, Hina

    2013-01-01

    Even though Global Software Engineering (GSE) has been a research topic of interest for many years, some of its ground terminology is still lacking a unified, coherent, and shared definition and/or classification. The purpose of this report is to collect, outline, and relate several fundamental...

  8. Software Distribution Statement and Disclaimer | OSTI, US Dept of Energy

    Science.gov (United States)

    Rights-in-technical-data clauses for many ... The following distribution statement and disclaimer meet those requirements for software and should be affixed to all distributed DOE-sponsored software. Contractors may have specific requirements and required

  9. Empowering global software development with business intelligence

    OpenAIRE

    Maté Morga, Alejandro; Trujillo Mondéjar, Juan Carlos; García, Félix; Serrano Martín, Manuel; Piattini, Mario

    2016-01-01

    Context: Global Software Development (GSD) allows companies to take advantage of talent spread across the world. Most research has been focused on the development aspect. However, little if any attention has been paid to the management of GSD projects. Studies report a lack of adequate support for management’s decisions made during software development, further accentuated in GSD since information is scattered throughout multiple factories, stored in different formats and standards. Objective...

  10. NHPP-Based Software Reliability Models Using Equilibrium Distribution

    Science.gov (United States)

    Xiao, Xiao; Okamura, Hiroyuki; Dohi, Tadashi

    Non-homogeneous Poisson processes (NHPPs) have gained much popularity in actual software testing phases to estimate the software reliability, the number of remaining faults in software and the software release timing. In this paper, we propose a new modeling approach for the NHPP-based software reliability models (SRMs) to describe the stochastic behavior of software fault-detection processes. The fundamental idea is to apply the equilibrium distribution to the fault-detection time distribution in NHPP-based modeling. We also develop efficient parameter estimation procedures for the proposed NHPP-based SRMs. Through numerical experiments, it can be concluded that the proposed NHPP-based SRMs outperform the existing ones in many data sets from the perspective of goodness-of-fit and prediction performance.
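
    As a sketch of the idea (using the standard definition of an equilibrium distribution; the symbols omega, F and F_e are assumptions, not necessarily the authors' notation), the approach replaces the fault-detection time distribution in the NHPP mean value function by its equilibrium counterpart:

    ```latex
    % Equilibrium (stationary-excess) distribution of F and the resulting
    % NHPP mean value function; notation is illustrative.
    \[
      F_e(t) = \frac{1}{\mu}\int_{0}^{t}\bigl(1 - F(x)\bigr)\,dx, \qquad
      \mu = \int_{0}^{\infty}\bigl(1 - F(x)\bigr)\,dx,
    \]
    \[
      m(t) = \omega\,F_e(t),
    \]
    % where F is an ordinary fault-detection time distribution, F_e its
    % equilibrium distribution, \omega the expected total number of detectable
    % faults, and m(t) the expected cumulative number of faults found by time t.
    ```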

  11. Recent topical research on global, energy, health & medical, and tourism economics, and global software: An overview

    NARCIS (Netherlands)

    C-L. Chang (Chia-Lin); M.J. McAleer (Michael)

    2017-01-01

    The paper presents an overview of recent topical research on global, energy, health & medical, and tourism economics, and global software. We have interpreted "global" in the title of the Journal of Reviews on Global Economics to cover contributions that have a global impact on

  12. The Use of Kanban to Alleviate Collaboration and Communication Challenges of Global Software Development

    Directory of Open Access Journals (Sweden)

    Maureen Tanner

    2017-05-01

    Aim/Purpose: This paper aims to describe how various Kanban elements can help alleviate two prominent types of challenges, communication and collaboration, in Global Software Development (GSD). Background: Iterative and Lean development methodologies like Kanban have gained significance in the software development industry, both in co-located and globally distributed contexts. However, little is known about how such methodologies can help mitigate various challenges that occur in a globally distributed software development context. Methodology: The study was conducted using a single-case study based on a general inductive approach to analysis and theory development. Through the literature review, collaboration and communication challenges that GSD teams face were identified. Data collected through semi-structured interviews was then inductively analyzed to describe how the case-study teams employed various Kanban elements to mitigate communication and collaboration challenges they face during GSD. Findings: The study found that some Kanban elements, when properly employed, can help alleviate collaboration and communication challenges that occur within GSD teams. These relate to Inclusion Criteria, Reverse Items, Kanban Board, Policies, Avatars, and Backlog. Contribution: The paper contributes to knowledge by proposing two simple concept maps that detail the specific types of communication and collaboration challenges which can be alleviated by the aforementioned Kanban elements in GSD. Recommendations for Practitioners: This paper is relevant to GSD teams who are seeking ways to enhance their team collaboration and communication, as these are the most important elements that contribute to GSD project success. It is recommended that relevant Kanban elements be used to that effect, depending on the challenges that they aim to alleviate. Future Research: Future research can investigate the same research questions (or similar ones) using a

  13. Toward an Agile Approach to Managing the Effect of Requirements on Software Architecture during Global Software Development

    Directory of Open Access Journals (Sweden)

    Abdulaziz Alsahli

    2016-01-01

    Requirement change management (RCM) is a critical activity during software development because poor RCM results in the occurrence of defects, thereby resulting in software failure. To achieve RCM, efficient impact analysis is mandatory. A common repository is a good approach to maintain changed requirements, reusing and reducing effort. Thus, a better approach is needed to tailor knowledge for better change management of requirements and architecture during global software development (GSD). The objective of this research is to introduce an innovative approach for handling requirements and architecture changes simultaneously during global software development. The approach makes use of Case-Based Reasoning (CBR) and agile practices. Agile practices make our approach iterative, whereas CBR stores requirements and makes them reusable. Twin Peaks is our base model, meaning that requirements and architecture are handled simultaneously. For this research, grounded theory has been applied; similarly, interviews with domain experts were conducted. Interview and literature transcripts formed the basis of data collection in grounded theory. Physical saturation of theory has been achieved through a published case study and a developed tool. Expert reviews and statistical analysis have been used for evaluation. The proposed approach resulted in effective change management of requirements and architecture simultaneously during global software development.
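
    To make the CBR ingredient concrete, here is a toy sketch of the retrieve step that such an approach relies on: storing past requirement-change cases and fetching the most similar one for reuse. The fields, similarity weights and example cases are invented and do not come from the paper or its tool.

    ```python
    # Toy retrieve step of Case-Based Reasoning over requirement-change cases;
    # all fields, weights and example cases are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class ChangeCase:
        keywords: set               # terms describing the changed requirement
        affected_components: set    # architecture elements touched (Twin Peaks view)
        resolution: str             # how the change was handled previously

    def jaccard(a, b):
        return len(a & b) / len(a | b) if (a | b) else 0.0

    def similarity(keywords, components, case):
        # equally weighted similarity over requirement terms and architecture impact
        return 0.5 * jaccard(keywords, case.keywords) \
             + 0.5 * jaccard(components, case.affected_components)

    def retrieve(case_base, keywords, components):
        """Return the stored case most similar to the incoming change request."""
        return max(case_base, key=lambda c: similarity(keywords, components, c))

    case_base = [
        ChangeCase({"login", "oauth"}, {"auth-service"}, "extend adapter, no schema change"),
        ChangeCase({"report", "export"}, {"reporting", "storage"}, "introduce new service"),
    ]
    print(retrieve(case_base, {"oauth", "token"}, {"auth-service"}).resolution)
    ```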

  14. Why Closely Coupled Work Matters in Global Software Development

    DEFF Research Database (Denmark)

    Jensen, Rasmus Eskild

    2014-01-01

    We report on an ethnographic study of an offshore global software development project between Danish and Philippine developers in a Danish company called GlobalSoft. We investigate why the IT developers chose to engage in more closely coupled work as the project progressed and argue that closely...

  15. Managing Risks in Distributed Software Projects: An Integrative Framework

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Boeg, Jesper

    2009-01-01

    Software projects are increasingly geographically distributed with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. While risk management has been adopted with success to address other challenges within software development, there are currently no frameworks available for managing risks related to geographical distribution. On this background, we systematically review the literature on geographically distributed software projects. Based on the review, we synthesize what we know about risks and risk resolution techniques into an integrative framework for managing risks in distributed contexts. Subsequent implementation of a Web-based tool helped us refine the framework based on empirical evaluation of its practical usefulness. We conclude by discussing implications for both research and practice.

  16. Using Software Architectures for Designing Distributed Embedded Systems

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    In this paper, we outline an on-going project of designing distributed embedded systems for closed-loop process control. The project is a joint effort between software architecture researchers and developers from two companies that produce commercial embedded process control systems. The project...... has a strong emphasis on software architectural issues and terminology in order to envision, design and analyze design alternatives. We present two results. First, we outline how focusing on software architecture, architectural issues and qualities is beneficial in designing distributed, embedded systems. Second, we present two different architectures for closed-loop process control and discuss benefits and reliabilities....

  17. Software Image J to study soil pore distribution

    Directory of Open Access Journals (Sweden)

    Sabrina Passoni

    2014-04-01

    In soil science, a direct method that allows the study of soil pore distribution is bi-dimensional (2D) digital image analysis. Such a technique provides quantitative results on soil pore shape, number and size. The use of specific software for the treatment and processing of images allows a fast and efficient method to quantify the soil porous system. However, due to the high cost of commercial software, public-domain programs can be an interesting alternative for soil structure analysis. The objective of this work was to evaluate the quality of data provided by the ImageJ software (public domain) used to characterize the voids of two soils, characterized as Geric Ferralsol and Rhodic Ferralsol, from the southeast region of Brazil. The pore distribution analysis technique from impregnated soil blocks was utilized for this purpose. The 2D image acquisition was carried out by using a CCD camera coupled to a conventional optical microscope. After acquisition and treatment of images, they were processed and analyzed by the software Noesis Visilog 5.4® (chosen as the reference program) and ImageJ. The parameters chosen to characterize the soil voids were: shape, number and pore size distribution. For both soils, the results obtained for the image total porosity (%), the total number of pores and the pore size distribution showed that ImageJ is a suitable software package to be applied in the characterization of the voids of soil samples impregnated with resin.
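
    The measurements reported (image total porosity, pore count, pore size distribution) can also be reproduced outside ImageJ. The sketch below uses Python with scikit-image; the file name and binarization threshold are placeholders, and this is not the workflow used in the study.

    ```python
    # Python/scikit-image analogue of the 2D pore measurements described above;
    # the image path and the threshold are placeholders.
    import numpy as np
    from skimage import io, measure

    img = io.imread("soil_block_section.png", as_gray=True)   # hypothetical image
    pores = img > 0.6                      # binary mask of resin-impregnated voids

    total_porosity = 100.0 * pores.sum() / pores.size          # image total porosity (%)
    labels = measure.label(pores, connectivity=2)              # label individual pores
    regions = measure.regionprops(labels)
    areas = np.array([r.area for r in regions])                # pore sizes in pixels

    print(f"porosity: {total_porosity:.1f}%   pores: {len(regions)}")
    print("pore size quartiles (px):", np.percentile(areas, [25, 50, 75]))
    ```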

  18. The design of a real-time software system for the distributed control of power station plant

    International Nuclear Information System (INIS)

    Maples, G.C.

    1980-01-01

    As the application of computers to the control of generating plants widens, the problems of resourcing several individual projects over their life cycle can become formidable. This paper indicates the factors relevant to containing the resource requirements associated with software, and outlines the benefits of adopting a standard machine-independent software system which enables engineers rather than computer specialists to develop programs for specific projects. The design objectives which have led to the current development within C.E.G.B. of CUTLASS (Computer Users Technical Languages and Applications Software System) are then considered. CUTLASS is intended to be a standard software system applicable to the majority of future on-line computing projects in the area of generation and is appropriate to stand-alone schemes or distributed schemes having a host/target configuration. The CUTLASS system software provides the necessary environment in which to develop, test, and run the applications software, the latter being created by the user by means of a set of engineer-orientated languages. The paper describes the various facilities within CUTLASS, i.e. those considered essential to meet the requirements of future process control applications, concentrating on the system software relating to the executive functions and the organisation of global data and communications within distributed systems. The salient features of the engineer-orientated language sets are also discussed. (auth)

  19. Toward an Agile Approach to Managing the Effect of Requirements on Software Architecture during Global Software Development

    OpenAIRE

    Alsahli, Abdulaziz; Khan, Hameed; Alyahya, Sultan

    2016-01-01

    Requirement change management (RCM) is a critical activity during software development because poor RCM results in occurrence of defects, thereby resulting in software failure. To achieve RCM, efficient impact analysis is mandatory. A common repository is a good approach to maintain changed requirements, reusing and reducing effort. Thus, a better approach is needed to tailor knowledge for better change management of requirements and architecture during global software development (GSD).The o...

  20. Collaboration in Global Software Engineering Based on Process Description Integration

    Science.gov (United States)

    Klein, Harald; Rausch, Andreas; Fischer, Edward

    Globalization is one of the big trends in software development. Development projects need a variety of different resources with appropriate expert knowledge to be successful. More and more of these resources are nowadays obtained from specialized organizations and countries all over the world, varying in development approaches, processes, and culture. As seen with early outsourcing attempts, collaboration may fail due to these differences. Hence, the major challenge in global software engineering is to streamline collaborating organizations towards a successful conjoint development. Based on typical collaboration scenarios, this paper presents a structured approach to integrate processes in a comprehensible way.

  1. The role of original equipment manufacturers in software distribution

    Directory of Open Access Journals (Sweden)

    Herţanu, A.

    2012-01-01

    The software distribution channels have a significant impact on the marketing mix not only for big companies in this domain, but also for small companies active in this domain. The Original Equipment Manufacturer's (OEM) distribution channel has a significant impact on the marketing strategy of different companies. While traditional distribution channels are still in use, OEM channels are used more and more to distribute software products or services not only to the customer segment formed by companies, but also to the segment formed by individual users.

  2. Global Landslide Hazard Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Landslide Hazard Distribution is a 2.5 minute grid of global landslide and snow avalanche hazards based upon work of the Norwegian Geotechnical Institute...

  3. A Case Study of Coordination in Distributed Agile Software Development

    Science.gov (United States)

    Hole, Steinar; Moe, Nils Brede

    Global Software Development (GSD) has gained significant popularity as an emerging paradigm. Companies also show interest in applying agile approaches in distributed development to combine the advantages of both approaches. However, in their most radical forms, agile and GSD can be placed at opposite ends of a plan-based/agile spectrum because of how work is coordinated. We describe how three GSD projects applying agile methods coordinate their work. We found that trust is needed to reduce the need for standardization and direct supervision when coordinating work in a GSD project, and that electronic chatting supports mutual adjustment. Further, co-location and modularization mitigate communication problems, enable agility in at least part of a GSD project, and render the implementation of Scrum of Scrums possible.

  4. Modular Software for Spacecraft Navigation Using the Global Positioning System (GPS)

    Science.gov (United States)

    Truong, S. H.; Hartman, K. R.; Weidow, D. A.; Berry, D. L.; Oza, D. H.; Long, A. C.; Joyce, E.; Steger, W. L.

    1996-01-01

    The Goddard Space Flight Center Flight Dynamics and Mission Operations Divisions have jointly investigated the feasibility of engineering modular Global Positioning System (GPS) navigation software to support both real-time flight and ground postprocessing configurations. The goals of this effort are to define standard GPS data interfaces and to engineer standard, reusable navigation software components that can be used to build a broad range of GPS navigation support applications. The paper discusses the GPS modular software (GMOD) system and operations concepts, major requirements, candidate software architecture, feasibility assessment and recommended software interface standards. In addition, ongoing efforts to broaden the scope of the initial study and to develop modular software to support autonomous navigation using GPS are addressed.

  5. Laboratory and software applications for clinical trials: the global laboratory environment.

    Science.gov (United States)

    Briscoe, Chad

    2011-11-01

    The Applied Pharmaceutical Software Meeting is held annually. It is sponsored by The Boston Society, a not-for-profit organization that coordinates a series of meetings within the global pharmaceutical industry. The meeting generally focuses on laboratory applications, but in recent years has expanded to include some software applications for clinical trials. The 2011 meeting emphasized the global laboratory environment. Global clinical trials generate massive amounts of data in many locations that must be centralized and processed for efficient analysis. Thus, the meeting had a strong focus on establishing networks and systems for dealing with the computer infrastructure to support such environments. In addition to the globally installed laboratory information management system, electronic laboratory notebook and other traditional laboratory applications, cloud computing is quickly becoming the answer to provide efficient, inexpensive options for managing the large volumes of data and computing power, and thus it served as a central theme for the meeting.

  6. Distributed controller clustering in software defined networks.

    Directory of Open Access Journals (Sweden)

    Ahmed Abdelaziz

    Software Defined Networking (SDN) is an emerging and promising paradigm for network management because of its centralized network intelligence. However, the centralized control architecture of software-defined networks (SDNs) brings novel challenges of reliability, scalability, fault tolerance and interoperability. In this paper, we proposed a novel clustered distributed controller architecture in the real setting of SDNs. The distributed cluster implementation comprises multiple popular SDN controllers. The proposed mechanism is evaluated using a real-world network topology running on top of an emulated SDN environment. The result shows that the proposed distributed controller clustering mechanism is able to significantly reduce the average latency from 8.1% to 1.6% and the packet loss from 5.22% to 4.15%, compared to a distributed controller without clustering running on HP Virtual Application Network (VAN) SDN and Open Network Operating System (ONOS) controllers respectively. Moreover, the proposed method also shows reasonable CPU utilization results. Furthermore, the proposed mechanism makes it possible to handle unexpected load fluctuations while maintaining a continuous network operation, even when there is a controller failure. The paper is a potential contribution stepping towards addressing the issues of reliability, scalability, fault tolerance, and inter-operability.

  7. Distributed controller clustering in software defined networks.

    Science.gov (United States)

    Abdelaziz, Ahmed; Fong, Ang Tan; Gani, Abdullah; Garba, Usman; Khan, Suleman; Akhunzada, Adnan; Talebian, Hamid; Choo, Kim-Kwang Raymond

    2017-01-01

    Software Defined Networking (SDN) is an emerging and promising paradigm for network management because of its centralized network intelligence. However, the centralized control architecture of software-defined networks (SDNs) brings novel challenges of reliability, scalability, fault tolerance and interoperability. In this paper, we propose a novel clustered distributed controller architecture in a real SDN setting. The distributed cluster implementation comprises multiple popular SDN controllers. The proposed mechanism is evaluated using a real-world network topology running on top of an emulated SDN environment. The results show that the proposed distributed controller clustering mechanism is able to significantly reduce the average latency from 8.1% to 1.6% and the packet loss from 5.22% to 4.15%, compared to distributed controllers without clustering running on the HP Virtual Application Network (VAN) SDN and Open Network Operating System (ONOS) controllers respectively. Moreover, the proposed method also shows reasonable CPU utilization results. Furthermore, the proposed mechanism makes it possible to handle unexpected load fluctuations while maintaining continuous network operation, even when there is a controller failure. The paper is a potential contribution stepping towards addressing the issues of reliability, scalability, fault tolerance, and interoperability.
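
    As an illustration of the kind of emulated SDN environment mentioned above, the sketch below starts a small Mininet topology and attaches it to an externally running controller. It is a generic example, not the paper's actual setup: the topology, controller address and OpenFlow port are assumptions, and a controller such as ONOS must already be listening there.

```python
from mininet.net import Mininet
from mininet.node import RemoteController
from mininet.topo import SingleSwitchTopo

# Three hosts behind one switch; the controller runs out-of-band (address is an assumption).
net = Mininet(topo=SingleSwitchTopo(k=3), controller=None)
net.addController('c0', controller=RemoteController, ip='127.0.0.1', port=6653)
net.start()
print('packet loss (%):', net.pingAll())   # quick reachability check through the remote controller
net.stop()
```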

  8. Software-Based Challenges of Developing the Future Distribution Grid

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Emma; Kiliccote, Sila; McParland, Charles

    2014-06-01

    The software that the utility industry currently uses may be insufficient to analyze the distribution grid as it rapidly modernizes to include active resources such as distributed generation, switch and voltage control, automation, and increasingly complex loads. Although planners and operators have traditionally viewed the distribution grid as a passive load, utilities and consultants increasingly need enhanced analysis that incorporates active distribution grid loads in order to ensure grid reliability. Numerous commercial and open-source tools are available for analyzing distribution grid systems. These tools vary in complexity from providing basic load-flow and capacity analysis under steady-state conditions to time-series analysis and even geographical representations of dynamic and transient events. The need for each type of analysis is not well understood in the industry, nor are the reasons that distribution analysis requires different techniques and tools both from those now available and from those used for transmission analysis. In addition, there is limited understanding of the basic capability of the tools and how they should be practically applied to the evolving distribution system. The study reviews the features and state-of-the-art capability of current tools, including usability and visualization, basic analysis functionality, advanced analysis including inverters, and renewable generation and load modeling. We also discuss the need for each type of distribution grid system analysis. In addition to reviewing the basic functionality of current models, we discuss dynamics and transient simulation in detail and draw conclusions about existing software's ability to address the needs of the future distribution grid as well as the barriers to modernization of the distribution grid that are posed by the current state of software and model development. Among our conclusions are that accuracy, data transfer, and data processing abilities are key to future

  9. Methods of Run-Time Error Detection in Distributed Process Control Software

    DEFF Research Database (Denmark)

    Drejer, N.

    In this thesis, methods of run-time error detection in application software for distributed process control are designed. The error detection is based upon a monitoring approach in which application software is monitored by system software during the entire execution. The thesis includes definition...... and constraint evaluation is designed for the most interesting error types. These include: a) semantical errors in data communicated between application tasks; b) errors in the execution of application tasks; and c) errors in the timing of distributed events emitted by the application software. The design...... of error detection methods includes a high-level software specification. This has the purpose of illustrating that the design can be used in practice....

  10. Global Software Innovators Strengthening the Software Innovation Capacity of Europe and Korea

    OpenAIRE

    Lillis, Deirdre; Doyle, Paul; Collins, Michael; Keegan, Brian; Longo, Luca; O'Mahony, William; Manifold, Peter

    2017-01-01

    Global entrepreneurial talent management is a key challenge for the software sector internationally, where competition for high-end skills is intense. SMEs are at a significant disadvantage when competing with major multinationals to access these skills. The Information and Communications Technology sector accounts for 5% of all employment in the EU and there are 900,000 vacancies in this sector in 2017 [1]; however, over 50% of senior ICT managers believe graduates lack the necessary combinati...

  11. Global Mangrove Forests Distribution, 2000

    Data.gov (United States)

    National Aeronautics and Space Administration — The Global Mangrove Forests Distribution, 2000 data set is a compilation of the extent of mangroves forests from the Global Land Survey and the Landsat archive with...

  12. Advancing global marine biogeography research with open-source GIS software and cloud-computing

    Science.gov (United States)

    Fujioka, Ei; Vanden Berghe, Edward; Donnelly, Ben; Castillo, Julio; Cleary, Jesse; Holmes, Chris; McKnight, Sean; Halpin, Patrick

    2012-01-01

    Across many scientific domains, the ability to aggregate disparate datasets enables more meaningful global analyses. Within marine biology, the Census of Marine Life served as the catalyst for such a global data aggregation effort. Under the Census framework, the Ocean Biogeographic Information System was established to coordinate an unprecedented aggregation of global marine biogeography data. The OBIS data system now contains 31.3 million observations, freely accessible through a geospatial portal. The challenges of storing, querying, disseminating, and mapping a global data collection of this complexity and magnitude are significant. In the face of declining performance and expanding feature requests, a redevelopment of the OBIS data system was undertaken. Following an Open Source philosophy, the OBIS technology stack was rebuilt using PostgreSQL, PostGIS, GeoServer and OpenLayers. This approach has markedly improved the performance and online user experience while maintaining a standards-compliant and interoperable framework. Due to the distributed nature of the project and increasing needs for storage, scalability and deployment flexibility, the entire hardware and software stack was built on a Cloud Computing environment. The flexibility of the platform, combined with the power of the application stack, enabled rapid re-development of the OBIS infrastructure, and ensured complete standards-compliance.
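
    A query of the kind such a PostGIS-backed system serves might look like the sketch below: counting occurrence records inside a longitude/latitude bounding box. The connection parameters and the table/column names (occurrences, geom) are hypothetical, since the actual OBIS schema is not described in this record.

```python
import psycopg2

# Hypothetical connection details and schema; adjust to the real database.
conn = psycopg2.connect(host="localhost", dbname="obis", user="reader", password="secret")
with conn, conn.cursor() as cur:
    cur.execute(
        """
        SELECT count(*)
        FROM occurrences
        WHERE ST_Contains(
            ST_MakeEnvelope(%s, %s, %s, %s, 4326),  -- xmin, ymin, xmax, ymax in WGS84
            geom
        )
        """,
        (-10.0, 30.0, 5.0, 45.0),
    )
    print("records in box:", cur.fetchone()[0])
```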

  13. Monitoring extensions for component-based distributed software

    NARCIS (Netherlands)

    Diakov, N.K.; Papir, Z.; van Sinderen, Marten J.; Quartel, Dick

    2000-01-01

    This paper defines a generic class of monitoring extensions to component-based distributed enterprise software. Introducing a monitoring extension to a legacy application system can be very costly. In this paper, we identify the minimum support for application monitoring within the generic

  14. Coordinating Management Activities in Distributed Software Development Projects

    OpenAIRE

    Bendeck, Fawsy; Goldmann, Sigrid; Holz, Harald; Kötting, Boris

    1999-01-01

    Coordinating distributed processes, especially engineering and software design processes, has been a research topic for some time now. Several approaches have been published that aim at coordinating large projects in general, and large software development processes in particular. However, most of these approaches focus on the technical part of the design process and omit management activities like planning and scheduling the project, or monitoring it during execution. In this paper, we focus o...

  15. Toward a User Driven Innovation for Distributed Software Teams

    Science.gov (United States)

    Hossain, Liaquat; Zhou, David

    The software industry has emerged to include some of the most revolutionized distributed work groups; however, not all such groups achieve their set goals and some even fail miserably. The distributed nature of open source software project teams provides an intriguing context for the study of distributed coordination. OSS team structures have traditionally been geographically dispersed and, therefore, the coordination of post-release activities such as testing is made difficult by the fact that the only means of communication is electronic, such as e-mail or message boards and forums. Nevertheless, large-scale, complex, and innovative software packages have been the fruits of labor for some OSS teams set in such coordination-unfriendly environments, while others end in flames. Why are some distributed work groups more effective than others? In our current communication-enriched environment, best practices for coordination are adopted by all software projects, yet some still fall by the wayside. Does the team structure have a bearing on the success of the project? How does the communication between the team and external parties affect the project's ultimate success or failure? In this study, we seek to answer these questions by applying existing theories from social networks and their analytical methods to the coordination of defect management activities found in OSS projects. We propose a social network based theoretical model for exploring distributed coordination structures and apply it to the case of the OSS defect management process to explore the structural properties which induce the greatest coordination performance. The outcome suggests that there is a correlation between certain network measures such as density, centrality, and betweenness and coordination performance measures of defect management systems such as quality and timeliness.
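
    The network measures named at the end of the abstract are straightforward to compute once a defect-coordination network has been built. The sketch below uses NetworkX on a tiny made-up graph (contributors linked when they worked on the same defect report); it only illustrates the measures, not the study's model or data.

```python
import networkx as nx

# Hypothetical coordination network: nodes are contributors, an edge means
# two people commented on or fixed the same defect report.
G = nx.Graph()
G.add_edges_from([
    ("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
    ("carol", "dave"), ("dave", "erin"),
])

print("density:", nx.density(G))
print("degree centrality:", nx.degree_centrality(G))
print("betweenness centrality:", nx.betweenness_centrality(G))
```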

  16. Software defined networking applications in distributed datacenters

    CERN Document Server

    Qi, Heng

    2016-01-01

    This SpringerBrief provides essential insights into SDN application design and deployment in distributed datacenters. Three key problems are discussed: SDN application design, SDN deployment and SDN management. The book demonstrates how to design an SDN-based request allocation application in distributed datacenters. It also presents solutions for SDN controller placement to deploy SDN in distributed datacenters. Finally, an SDN management system is proposed to guarantee the performance of datacenter networks which are covered and controlled by many heterogeneous controllers. Researchers and practitioners alike will find this book a valuable resource for further study on Software Defined Networking.

  17. Software Comparison for Renewable Energy Deployment in a Distribution Network

    Energy Technology Data Exchange (ETDEWEB)

    Gao, David Wenzhong [Alternative Power Innovations, LLC, Sharonville, OH (United States); Muljadi, Eduard [National Renewable Energy Lab. (NREL), Golden, CO (United States); Tian, Tian [National Renewable Energy Lab. (NREL), Golden, CO (United States); Miller, Mackay [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-02-22

    The main objective of this report is to evaluate different software options for performing robust distributed generation (DG) power system modeling. The features and capabilities of four simulation tools, OpenDSS, GridLAB-D, CYMDIST, and PowerWorld Simulator, are compared to analyze their effectiveness in analyzing distribution networks with DG. OpenDSS and GridLAB-D, two open-source packages, have the capability to simulate networks with fluctuating data values; they allow a simulation to be run at each time instant by iterating only the main script file. CYMDIST, a commercial package, allows for time-series simulation to study variations in network controls. PowerWorld Simulator, another commercial tool, has a batch-mode simulation function through the 'Time Step Simulation' tool, which obtains solutions for a list of specified time points. PowerWorld Simulator is intended for analysis of transmission-level systems, while the other three are designed for distribution systems. CYMDIST and PowerWorld Simulator feature easy-to-use graphical user interfaces (GUIs). OpenDSS and GridLAB-D, on the other hand, are based on command-line programs, which increases the time necessary to become familiar with the software packages.
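
    For readers unfamiliar with the command-line style of OpenDSS, the sketch below runs a snapshot power flow through the OpenDSSDirect.py bindings. The feeder file name is a placeholder and the calls are a minimal illustration, not part of the report.

```python
import opendssdirect as dss

# 'master.dss' is a placeholder for an existing OpenDSS feeder description file.
dss.run_command("Redirect master.dss")
dss.Solution.Solve()

print("converged:", dss.Solution.Converged())
print("nodes:", dss.Circuit.AllNodeNames()[:5])
print("per-unit voltage magnitudes:", dss.Circuit.AllBusMagPu()[:5])
```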

  18. Methods of Run-Time Error Detection in Distributed Process Control Software

    DEFF Research Database (Denmark)

    Drejer, N.

    In this thesis, methods of run-time error detection in application software for distributed process control are designed. The error detection is based upon a monitoring approach in which application software is monitored by system software during the entire execution. The thesis includes definition of generic run-time error types, design of methods of observing application software behavior during execution and design of methods of evaluating run-time constraints. In the definition of error types it is attempted to cover all relevant aspects of the application software behavior. Methods of observation and constraint evaluation are designed for the most interesting error types. These include: a) semantical errors in data communicated between application tasks; b) errors in the execution of application tasks; and c) errors in the timing of distributed events emitted by the application software. The design...
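
    The timing-error class c) above can be illustrated with a small monitoring sketch. This is a generic watchdog written for illustration only, not the monitoring design developed in the thesis; the 0.5 s constraint and event times are made up.

```python
class TimingMonitor:
    """Checks that consecutive events emitted by an application task respect a maximum interval."""

    def __init__(self, max_interval_s: float):
        self.max_interval_s = max_interval_s
        self.last_event = None

    def observe(self, timestamp: float) -> bool:
        """Record an event and return True if the timing constraint holds."""
        ok = True
        if self.last_event is not None:
            ok = (timestamp - self.last_event) <= self.max_interval_s
        self.last_event = timestamp
        return ok


monitor = TimingMonitor(max_interval_s=0.5)
for t in (0.0, 0.4, 1.2):            # the third event violates the 0.5 s constraint
    if not monitor.observe(t):
        print(f"timing error detected at t={t}")
```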

  19. Some software issues in mapping of power distribution feeders

    International Nuclear Information System (INIS)

    Mufti, I.A.

    1994-01-01

    This paper is about the in-house developed software for a distribution feeder mapping project. It first gives a bird's-eye view of the project and highlights its technical complexity in management and logistics, introduced by the sheer size of the project. It gives an overview of the software developed and then moves on to describe circuit tracing, the circuit model, leaves isolation (for the tree-structured network) and backtracking in more detail, among the many different parts of the software, a description of all of which is not possible because of space limitations. (author)

  20. Hardware-assisted software clock synchronization for homogeneous distributed systems

    Science.gov (United States)

    Ramanathan, P.; Kandlur, Dilip D.; Shin, Kang G.

    1990-01-01

    A clock synchronization scheme that strikes a balance between hardware and software solutions is proposed. The proposed scheme is a software algorithm that uses minimal additional hardware to achieve reasonably tight synchronization. Unlike other software solutions, the guaranteed worst-case skews can be made insensitive to the maximum variation of message transit delay in the system. The scheme is particularly suitable for large partially connected distributed systems with topologies that support simple point-to-point broadcast algorithms. Examples of such topologies include the hypercube and the mesh interconnection structures.
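
    For orientation, a purely software round-trip probe of the kind such schemes improve upon can be sketched as below. This is a generic Cristian-style offset estimate, not the hardware-assisted algorithm of the paper; the fake remote clock is only there to make the snippet runnable.

```python
import time

def estimate_offset(read_remote_clock):
    """Round-trip offset estimate between the local clock and a remote one."""
    t0 = time.monotonic()
    remote = read_remote_clock()       # e.g. an RPC that returns the peer's clock reading
    t1 = time.monotonic()
    local_midpoint = (t0 + t1) / 2.0
    return remote - local_midpoint     # positive result: the remote clock is ahead

# Usage with a fake remote clock running 2 ms ahead of the local one.
print(estimate_offset(lambda: time.monotonic() + 0.002))
```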

  1. Real-time Control Mediation in Agile Distributed Software Development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Aaen, Ivan; Mathiassen, Lars

    2008-01-01

    Agile distributed environments pose particular challenges related to control of quality and collaboration in software development. Moreover, while face-to-face interaction is fundamental in agile development, distributed environments must rely extensively on mediated interactions. On this backdrop...... control was mediated over distance by technology through real-time exchanges. Contrary to previous research, the analysis suggests that both formal and informal elements of real-time mediated control were used; that evolving goals and adjustment of expectations were two of the main issues in real......-time mediated control exchanges; and, that the actors, despite distances in space and culture, developed a clan-like pattern mediated by technology to help control quality and collaboration in software development....

  2. Global Volcano Mortality Risks and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Volcano Mortality Risks and Distribution is a 2.5 minute grid representing global volcano mortality risks. The data set was constructed using historical...

  3. Climate Science's Globally Distributed Infrastructure

    Science.gov (United States)

    Williams, D. N.

    2016-12-01

    The Earth System Grid Federation (ESGF) is primarily funded by the Department of Energy's (DOE's) Office of Science (the Office of Biological and Environmental Research [BER] Climate Data Informatics Program and the Office of Advanced Scientific Computing Research Next Generation Network for Science Program), the National Oceanic and Atmospheric Administration (NOAA), the National Aeronautics and Space Administration (NASA), the National Science Foundation (NSF), the European Infrastructure for the European Network for Earth System Modeling (IS-ENES), and the Australian National University (ANU). Support also comes from other U.S. federal and international agencies. The federation works across multiple worldwide data centers and spans seven international network organizations to provide users with the ability to access, analyze, and visualize data using a globally federated collection of networks, computers, and software. Its architecture employs a series of geographically distributed peer nodes that are independently administered and united by common federation protocols and application programming interfaces (APIs). The full ESGF infrastructure has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the Coupled Model Intercomparison Project (CMIP; output used by the Intergovernmental Panel on Climate Change assessment reports), multiple model intercomparison projects (MIPs; endorsed by the World Climate Research Programme [WCRP]), and the Accelerated Climate Modeling for Energy (ACME; ESGF is included in the overarching ACME workflow process to store model output). ESGF is a successful example of integration of disparate open-source technologies into a cohesive functional system that serves the needs of the global climate science community. Data served by ESGF includes not only model output but also observational data from satellites and instruments, reanalysis, and generated images.

  4. Global Multihazard Mortality Risks and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Multihazard Mortality Risks and Distribution is a 2.5 minute grid identifying and characterizing the nature of multihazard risk at the global scale. For this...

  5. Using the cloud to facilitate global software development challenges

    NARCIS (Netherlands)

    Hashimi, S.; Clerc, V.; Razavian, M.; Manteli, C.; Lago, P.; Di Nitto, E.; Richardson, I.; Tamburri, D.A.

    2011-01-01

    With the expansion of national markets beyond geographical limits, success of any business often depends on using software for competitive advantage. Furthermore, as technological boundaries are expanding, projects distributed across different geographical locations have become a norm for the

  6. Global Drought Mortality Risks and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Drought Mortality Risks and Distribution is a 2.5 minute grid of global drought mortality risks. Gridded Population of the World, Version 3 (GPWv3) data...

  7. Atlas of the global distribution of atmospheric heating during the global weather experiment

    Science.gov (United States)

    Schaack, Todd K.; Johnson, Donald R.

    1991-01-01

    Global distributions of atmospheric heating for the annual cycle of the Global Weather Experiment are estimated from the European Centre for Medium-Range Weather Forecasts (ECMWF) Level 3b data set. Distributions of monthly, seasonally, and annually averaged heating are presented for isentropic and isobaric layers within the troposphere and for the troposphere as a whole. The distributions depict a large-scale structure of atmospheric heating that appears spatially and temporally consistent with known features of the global circulation and the seasonal evolution.

  8. Distributed caching mechanism for various MPE software services

    CERN Document Server

    Svec, Andrej

    2017-01-01

    The MPE Software Section provides multiple software services to facilitate the testing and the operation of the CERN Accelerator complex. Continuous growth in the number of users and the amount of processed data results in a requirement for high scalability. Our current priority is to move towards a distributed and properly load-balanced set of services based on containers. The aim of this project is to implement a generic caching mechanism applicable to our services and the chosen architecture. The project will first require research into the different aspects of distributed caching (persistence, no gc-caching, cache consistency, etc.) and the available technologies, followed by the implementation of the chosen solution. In the last phase of the project, a monitoring layer will be implemented and integrated with the current ELK stack in order to validate the correctness and performance of the implementation.
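
    As a point of reference only, the simplest form of caching that such a project would compare against is plain in-process memoisation with a time-to-live, sketched below; the distributed, container-based mechanism the project aims at is considerably more involved.

```python
import time
import functools

def ttl_cache(ttl_seconds: float):
    """Generic time-bounded memoisation; illustrative only, not the MPE design."""
    def decorator(fn):
        store = {}

        @functools.wraps(fn)
        def wrapper(*args):
            now = time.monotonic()
            hit = store.get(args)
            if hit is not None and now - hit[1] < ttl_seconds:
                return hit[0]          # fresh cached value
            value = fn(*args)
            store[args] = (value, now)
            return value
        return wrapper
    return decorator

@ttl_cache(ttl_seconds=5.0)
def expensive_lookup(device_id: str) -> str:
    return f"settings for {device_id}"     # stands in for a slow service call

print(expensive_lookup("magnet-17"))
print(expensive_lookup("magnet-17"))       # served from the cache within 5 s
```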

  9. Global review of open access risk assessment software packages valid for global or continental scale analysis

    Science.gov (United States)

    Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan

    2015-04-01

    Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood risk have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and whether they were open access. This process was used to select a subset of 31 models that includes 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, making the compendium of risk software tools in excess of 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. It was seen that many software tools could be improved by enabling user

  10. Understanding How the "Open" of Open Source Software (OSS) Will Improve Global Health Security.

    Science.gov (United States)

    Hahn, Erin; Blazes, David; Lewis, Sheri

    2016-01-01

    Improving global health security will require bold action in all corners of the world, particularly in developing settings, where poverty often contributes to an increase in emerging infectious diseases. In order to mitigate the impact of emerging pandemic threats, enhanced disease surveillance is needed to improve early detection and rapid response to outbreaks. However, the technology to facilitate this surveillance is often unattainable because of high costs, software and hardware maintenance needs, limited technical competence among public health officials, and internet connectivity challenges experienced in the field. One potential solution is to leverage open source software, a concept that is unfortunately often misunderstood. This article describes the principles and characteristics of open source software and how it may be applied to solve global health security challenges.

  11. Trust in agile teams in distributed software development

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte; Fransgård, Mette; Skalkam, Signe

    2012-01-01

    Distributed software development (DSD) is becoming everyday practice in the software market. Difficult challenges and difficulty reaching the expected benefits are well documented. Recently, agile software development has become common in DSD, even though important incompatibilities between...... that leads to team success. This article reports from a study of two agile DSD teams with very different organization and collaboration patterns. It addresses the role of trust and distrust in DSD by analyzing how the team members' trust developed and eroded through the lifetime of the two collaborations...... and how management actions influenced this. We find that some agile practices can empower teams to take over responsibility for managing their own trust building and sustaining it, and that management neglect of trust-building in other situations can hinder the development of beneficial balanced agile DSD...

  12. Approximator: Predicting Interruptibility in Software Development with Commodity Computers

    DEFF Research Database (Denmark)

    Tell, Paolo; Jalaliniya, Shahram; Andersen, Kristian S. M.

    2015-01-01

    Assessing the presence and availability of a remote colleague is key in coordination in global software development but is not easily done using existing computer-mediated channels. Previous research has shown that automated estimation of interruptibility is feasible and can achieve a precision....... These early but promising results represent a starting point for designing tools with support for interruptibility capable of improving distributed awareness and cooperation to be used in global software development....

  13. An analysis software of tritium distribution in food and environmental water in China

    International Nuclear Information System (INIS)

    Li Wenhong; Xu Cuihua; Ren Tianshan; Deng Guilong

    2006-01-01

    Objective: The purpose of developing this analysis software for the distribution of tritium in food and environmental water is to collect tritium monitoring data, to analyze the data automatically, statistically and graphically, and to study and share the data. Methods: Based on the data obtained before, the analysis software was written using VC++.NET as the development tool. The software first transfers data from EXCEL into a database. It has an additional data-append function, so operators can incorporate new monitoring data easily. Results: After the monitoring data saved as EXCEL files by the original researchers are turned into a database, people can easily access them. The software provides a tool for distribution analysis of tritium. Conclusion: This software is a first attempt at analyzing data on tritium levels in food and environmental water in China. Data archiving, searching and analysis become easy and direct with the software. (authors)

  14. The IceCube Data Acquisition Software: Lessons Learned during Distributed, Collaborative, Multi-Disciplined Software Development.

    Energy Technology Data Exchange (ETDEWEB)

    Beattie, Keith S; Beattie, Keith; Day Ph.D., Christopher; Glowacki, Dave; Hanson Ph.D., Kael; Jacobsen Ph.D., John; McParland, Charles; Patton Ph.D., Simon

    2007-09-21

    In this experiential paper we report on lessons learned during the development of the data acquisition software for the IceCube project - specifically, how to effectively address the unique challenges presented by a distributed, collaborative, multi-institutional, multi-disciplined project such as this. While development progress in software projects is often described solely in terms of technical issues, our experience indicates that non- and quasi-technical interactions play a substantial role in the effectiveness of large software development efforts. These include: selection and management of multiple software development methodologies, the effective use of various collaborative communication tools, project management structure and roles, and the impact and apparent importance of these elements when viewed through the differing perspectives of hardware, software, scientific and project office roles. Even in areas clearly technical in nature, success is still influenced by non-technical issues that can escape close attention. In particular we describe our experiences with software requirements specification, development methodologies and communication tools. We make observations on what tools and techniques have and have not been effective in this geographically dispersed (including the South Pole) collaboration and offer suggestions on how similarly structured future projects may build upon our experiences.

  15. The ALMA Common Software as a Basis for a Distributed Software Development

    Science.gov (United States)

    Raffi, Gianni; Chiozzi, Gianluca; Glendenning, Brian

    The Atacama Large Millimeter Array (ALMA) is a joint project involving astronomical organizations in Europe, North America and Japan. ALMA will consist of 64 12-m antennas operating in the millimetre and sub-millimetre wavelength range, with baselines of more than 10 km. It will be located at an altitude above 5000 m in the Chilean Atacama desert. The ALMA Computing group is a joint group with staff scattered across 3 continents and is responsible for all the control and data flow software related to ALMA, including tools ranging from support of proposal preparation to archive access of automatically created images. Early in the project it was decided that an ALMA Common Software (ACS) would be developed as a way to provide all partners involved in the development with a common software platform. The original assumption was that some key middleware like communication via CORBA and the use of XML and Java would be part of the project. It was intended from the beginning to develop this software in an incremental way based on releases, so that it would then evolve into an essential embedded part of all ALMA software applications. In this way we would build a basic unity and coherence into a system that will have been developed in a distributed fashion. This paper evaluates our progress after 1.5 years of work, following a few tests and preliminary releases. It analyzes the advantages and difficulties of such an ambitious approach, which creates an interface across all the various control and data flow applications.

  16. Distributed inter process communication framework of BES III DAQ online software

    International Nuclear Information System (INIS)

    Li Fei; Liu Yingjie; Ren Zhenyu; Wang Liang; Chinese Academy of Sciences, Beijing; Chen Mali; Zhu Kejun; Zhao Jingwei

    2006-01-01

    The DAQ (Data Acquisition) system is an important part of BES III, a large-scale high-energy physics detector at the BEPC. The inter-process communication (IPC) of online software in distributed environments is pivotal for the design and implementation of a DAQ system. This article introduces a distributed inter-process communication framework, which is based on CORBA and used in the BES III DAQ online software. The article mainly presents the design and implementation of the IPC framework and applications based on IPC. (authors)

  17. BEANS - a software package for distributed Big Data analysis

    Science.gov (United States)

    Hypki, Arkadiusz

    2018-03-01

    BEANS software is a web based, easy to install and maintain, new tool to store and analyse in a distributed way a massive amount of data. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of datasets. Its main purpose is to simplify the process of storing, examining and finding new relations in huge datasets. The software is an answer to a growing need of the astronomical community to have a versatile tool to store, analyse and compare the complex astrophysical numerical simulations with observations (e.g. simulations of the Galaxy or star clusters with the Gaia archive). However, this software was built in a general form and it is ready to use in any other research field. It can be used as a building block for other open source software too.

  18. The use of software agents and distributed objects to integrate enterprises: Compatible or competing technologies?

    Energy Technology Data Exchange (ETDEWEB)

    Pancerella, C.M.

    1998-04-01

    Distributed object and software agent technologies are two integration methods for connecting enterprises. The two technologies have overlapping goals--interoperability and architectural support for integrating software components--though to date little or no integration of the two technologies has been made at the enterprise level. The primary difference between these two technologies is that distributed object technologies focus on the problems inherent in connecting distributed heterogeneous systems whereas software agent technologies focus on the problems involved with coordination and knowledge exchange across domain boundaries. This paper addresses the integration of these technologies in support of enterprise integration across organizational and geographic boundaries. The authors discuss enterprise integration issues, review their experiences with both technologies, and make recommendations for future work. Neither technology is a panacea. Good software engineering techniques must be applied to integrate an enterprise because scalability and a distributed software development team are realities.

  19. The Software Architecture of Global Climate Models

    Science.gov (United States)

    Alexander, K. A.; Easterbrook, S. M.

    2011-12-01

    It has become common to compare and contrast the output of multiple global climate models (GCMs), such as in the Climate Model Intercomparison Project Phase 5 (CMIP5). However, intercomparisons of the software architecture of GCMs are almost nonexistent. In this qualitative study of seven GCMs from Canada, the United States, and Europe, we attempt to fill this gap in research. We describe the various representations of the climate system as computer programs, and account for architectural differences between models. Most GCMs now practice component-based software engineering, where Earth system components (such as the atmosphere or land surface) are present as highly encapsulated sub-models. This architecture facilitates a mix-and-match approach to climate modelling that allows for convenient sharing of model components between institutions, but it also leads to difficulty when choosing where to draw the lines between systems that are not encapsulated in the real world, such as sea ice. We also examine different styles of couplers in GCMs, which manage interaction and data flow between components. Finally, we pay particular attention to the varying levels of complexity in GCMs, both between and within models. Many GCMs have some components that are significantly more complex than others, a phenomenon which can be explained by the respective institution's research goals as well as the origin of the model components. In conclusion, although some features of software architecture have been adopted by every GCM we examined, other features show a wide range of different design choices and strategies. These architectural differences may provide new insights into variability and spread between models.

  20. Using E-markets for Globally Distributed Work

    NARCIS (Netherlands)

    van Hillegersberg, Jos; Amrit, Chintan Amrit; Oshri, Ilan; Kotlarsky, Julia; Willcocks, Leslie P.

    2015-01-01

    For over a decade, dedicated E-markets have been facilitating globally distributed systems development by enhancing the traditionally high-risk global sourcing processes. At the same time, the success and potential of E-markets for sourcing projects globally can be questioned, as E-markets embody a

  1. Distributed software framework and continuous integration in hydroinformatics systems

    Science.gov (United States)

    Zhou, Jianzhong; Zhang, Wei; Xie, Mengfei; Lu, Chengwei; Chen, Xiao

    2017-08-01

    When encountering multiple and complicated models, multisource structured and unstructured data, and complex requirements analysis, the platform design and integration of hydroinformatics systems become a challenge. To properly solve these problems, we describe a distributed software framework and its continuous integration process in hydroinformatics systems. This distributed framework mainly consists of a server cluster for models, a distributed database, GIS (Geographic Information System) servers, a master node and clients. Based on it, a GIS-based decision support system for joint regulation of water quantity and water quality of a group of lakes in Wuhan, China, is established.

  2. A combined Component-Based Approach for the Design of Distributed Software Systems

    NARCIS (Netherlands)

    Guareis de farias, Cléver; Ferreira Pires, Luis; van Sinderen, Marten J.; Quartel, Dick; Yang, H.; Gupta, S.

    2001-01-01

    Component-based software development enables the construction of software artefacts by assembling binary units of production, distribution and deployment, the so-called components. Several approaches to component-based development have been proposed recently. Most of these approaches are based on

  3. A Software Rejuvenation Framework for Distributed Computing

    Science.gov (United States)

    Chau, Savio

    2009-01-01

    A performability-oriented conceptual framework for software rejuvenation has been constructed as a means of increasing levels of reliability and performance in distributed stateful computing. As used here, performability-oriented signifies that the construction of the framework is guided by the concept of analyzing the ability of a given computing system to deliver services with gracefully degradable performance. The framework is especially intended to support applications that involve stateful replicas of server computers.

  4. Firm Size Distribution in Fortune Global 500

    Science.gov (United States)

    Chen, Qinghua; Chen, Liujun; Liu, Kai

    By analyzing the data of Fortune Global 500 firms from 1996 to 2008, we found that their ranks and revenues always obey the same distribution, which implies that the worldwide firm structure has been stable for a long time. The fitting results show that a simple Zipf distribution is not an ideal model for global firms, while SCL and FSS have better goodness of fit, and a lognormal fit is the best. We then propose a simple explanation.
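
    A lognormal fit of the kind referred to can be reproduced with SciPy as sketched below. The revenues are synthetic stand-ins, since the Fortune Global 500 data are not part of this record.

```python
import numpy as np
from scipy import stats

# Synthetic revenues (billions USD) standing in for the Fortune Global 500 data.
rng = np.random.default_rng(0)
revenues = rng.lognormal(mean=3.0, sigma=1.0, size=500)

# Fit a lognormal with location fixed at zero and report a goodness-of-fit statistic.
shape, loc, scale = stats.lognorm.fit(revenues, floc=0)
ks = stats.kstest(revenues, "lognorm", args=(shape, loc, scale))
print("sigma:", shape, "scale:", scale, "KS statistic:", ks.statistic)
```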

  5. Software/hardware distributed processing network supporting the Ada environment

    Science.gov (United States)

    Wood, Richard J.; Pryk, Zen

    1993-09-01

    A high-performance, fault-tolerant, distributed network has been developed, tested, and demonstrated. The network is based on the MIPS Computer Systems, Inc. R3000 RISC for processing, VHSIC ASICs for high-speed, reliable inter-node communications, and compatible commercial memory and I/O boards. The network is an evolution of the Advanced Onboard Signal Processor (AOSP) architecture. It supports Ada application software with an Ada-implemented operating system. A six-node implementation (capable of expansion up to 256 nodes) of the RISC multiprocessor architecture provides 120 MIPS of scalar throughput, 96 Mbytes of RAM and 24 Mbytes of non-volatile memory. The network provides for all ground processing applications, has merit as a space-qualified RISC-based network, and interfaces to advanced Computer Aided Software Engineering (CASE) tools for application software development.

  6. Revised spatially distributed global livestock emissions

    Science.gov (United States)

    Asrar, G.; Wolf, J.; West, T. O.

    2015-12-01

    Livestock play an important role in agricultural carbon cycling through consumption of biomass and emissions of methane. Quantification and spatial distribution of methane and carbon dioxide produced by livestock is needed to develop bottom-up estimates for carbon monitoring. These estimates serve as stand-alone international emissions estimates, as input to global emissions modeling, and as comparisons or constraints to flux estimates from atmospheric inversion models. Recent results for the US suggest that the 2006 IPCC default coefficients may underestimate livestock methane emissions. In this project, revised coefficients were calculated for cattle and swine in all global regions, based on reported changes in body mass, quality and quantity of feed, milk production, and management of living animals and manure for these regions. New estimates of livestock methane and carbon dioxide emissions were calculated using the revised coefficients and global livestock population data. Spatial distribution of population data and associated fluxes was conducted using the MODIS Land Cover Type 5, version 5.1 (i.e. MCD12Q1 data product), and a previously published downscaling algorithm for reconciling inventory and satellite-based land cover data at 0.05 degree resolution. Preliminary results for 2013 indicate greater emissions than those calculated using the IPCC 2006 coefficients. Global total enteric fermentation methane increased by 6%, while manure management methane increased by 38%, with variation among species and regions resulting in improved spatial distributions of livestock emissions. These new estimates of total livestock methane are comparable to other recently reported studies for the entire US and the State of California. These new regional/global estimates will improve the ability to reconcile top-down and bottom-up estimates of methane production as well as provide updated global estimates for use in development and evaluation of Earth system models.

  7. Software for Distributed Computation on Medical Databases: A Demonstration Project

    Directory of Open Access Journals (Sweden)

    Balasubramanian Narasimhan

    2017-05-01

    Full Text Available Bringing together the information latent in distributed medical databases promises to personalize medical care by enabling reliable, stable modeling of outcomes with rich feature sets (including patient characteristics and treatments received. However, there are barriers to aggregation of medical data, due to lack of standardization of ontologies, privacy concerns, proprietary attitudes toward data, and a reluctance to give up control over end use. Aggregation of data is not always necessary for model fitting. In models based on maximizing a likelihood, the computations can be distributed, with aggregation limited to the intermediate results of calculations on local data, rather than raw data. Distributed fitting is also possible for singular value decomposition. There has been work on the technical aspects of shared computation for particular applications, but little has been published on the software needed to support the "social networking" aspect of shared computing, to reduce the barriers to collaboration. We describe a set of software tools that allow the rapid assembly of a collaborative computational project, based on the flexible and extensible R statistical software and other open source packages, that can work across a heterogeneous collection of database environments, with full transparency to allow local officials concerned with privacy protections to validate the safety of the method. We describe the principles, architecture, and successful test results for the site-stratified Cox model and rank-k singular value decomposition.
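
    The core idea of aggregating intermediate results rather than raw data can be sketched with a much simpler model than the site-stratified Cox model described above. The snippet below fits a logistic regression by summing per-site gradients of the log-likelihood; the data are synthetic and the fixed-step ascent is only for illustration.

```python
import numpy as np

def local_gradient(X, y, beta):
    """Gradient of the logistic log-likelihood computed on one site's data only."""
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    return X.T @ (y - p)

# Synthetic data split across three 'sites'; only gradients leave each site.
rng = np.random.default_rng(1)
sites = [(rng.normal(size=(50, 3)), rng.integers(0, 2, size=50)) for _ in range(3)]

beta = np.zeros(3)
n_total = sum(len(y) for _, y in sites)
for _ in range(200):                     # fixed-step gradient ascent on the pooled likelihood
    grad = sum(local_gradient(X, y, beta) for X, y in sites)
    beta += 0.5 * grad / n_total
print("coefficients:", beta)
```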

  8. Project Management Software for Distributed Industrial Companies

    Science.gov (United States)

    Dobrojević, M.; Medjo, B.; Rakin, M.; Sedmak, A.

    This paper gives an overview of the development of a new software solution for project management, intended mainly for use in an industrial environment. The main concern of the proposed solution is application in everyday engineering practice in various, mainly distributed, industrial companies. Having this in mind, special care has been devoted to the development of appropriate tools for tracking, storing and analysis of information about the project, and its in-time delivery to the right team members or other responsible persons. The proposed solution is Internet-based and uses the LAMP/WAMP (Linux or Windows - Apache - MySQL - PHP) platform, because of its stability, versatility, open source technology and simple maintenance. The modular structure of the software makes it easy to customize according to client-specific needs, with a very short implementation period. Its main advantages are simple usage, quick implementation, easy system maintenance, short training and only basic computer skills needed for operators.

  9. Minimizing communication cost among distributed controllers in software defined networks

    Science.gov (United States)

    Arlimatti, Shivaleela; Elbreiki, Walid; Hassan, Suhaidi; Habbal, Adib; Elshaikh, Mohamed

    2016-08-01

    Software Defined Networking (SDN) is a new paradigm that increases the flexibility of today's networks by promising a programmable network. The fundamental idea behind this new architecture is to simplify network complexity by decoupling the control plane and the data plane of the network devices, and by making the control plane centralized. Recently, controllers have been distributed to solve the problem of a single point of failure, and to increase scalability and flexibility during workload distribution. Even though controllers are flexible and scalable enough to accommodate more network switches, the problem of intercommunication cost between distributed controllers is still a challenging issue in the Software Defined Network environment. This paper aims to fill the gap by proposing a new mechanism that minimizes intercommunication cost with a graph partitioning algorithm, an NP-hard problem. The methodology proposed in this paper is the swapping of network elements between controller domains to minimize communication cost by calculating communication gain. The swapping of elements minimizes inter- and intra-communication cost among network domains. We validate our work with the OMNeT++ simulation environment. Simulation results show that the proposed mechanism minimizes the inter-domain communication cost among controllers compared to traditional distributed controllers.
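
    Though the paper's own partitioning mechanism is not reproduced here, the underlying idea of splitting switches into controller domains so that inter-domain traffic stays small can be illustrated with NetworkX. The traffic graph and weights below are made up.

```python
import networkx as nx
from networkx.algorithms.community import kernighan_lin_bisection

# Hypothetical switch-to-switch graph; edge weights model control traffic exchanged.
G = nx.Graph()
G.add_weighted_edges_from([
    ("s1", "s2", 10), ("s2", "s3", 1), ("s3", "s4", 8),
    ("s1", "s3", 2), ("s2", "s4", 1),
])

# Bisect the switches into two controller domains and report the inter-domain cost.
domain_a, domain_b = kernighan_lin_bisection(G, weight="weight")
cut_cost = nx.cut_size(G, domain_a, domain_b, weight="weight")
print(domain_a, domain_b, "inter-domain cost:", cut_cost)
```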

  10. Software-Enabled Distributed Network Governance: The PopMedNet Experience.

    Science.gov (United States)

    Davies, Melanie; Erickson, Kyle; Wyner, Zachary; Malenfant, Jessica; Rosen, Rob; Brown, Jeffrey

    2016-01-01

    The expanded availability of electronic health information has led to increased interest in distributed health data research networks. The distributed research network model leaves data with and under the control of the data holder. Data holders, network coordinating centers, and researchers have distinct needs and challenges within this model. The concerns of network stakeholders are addressed in the design and governance models of the PopMedNet software platform. PopMedNet features include distributed querying, customizable workflows, and auditing and search capabilities. Its flexible role-based access control system enables the enforcement of varying governance policies. Four case studies describe how PopMedNet is used to enforce network governance models. Trust is an essential component of a distributed research network and must be built before data partners may be willing to participate further. The complexity of the PopMedNet system must be managed as networks grow and new data, analytic methods, and querying approaches are developed. The PopMedNet software platform supports a variety of network structures, governance models, and research activities through customizable features designed to meet the needs of network stakeholders.

  11. Optimizing Distribution Problems using WinQSB Software

    Directory of Open Access Journals (Sweden)

    Daniel Mihai Amariei

    2015-07-01

    Full Text Available In the present paper we present a distribution problem using the Network Modeling Module of the WinQSB software, in which 5 athletes must each be assigned to the optimal event, as a function of the times obtained, so as to obtain the maximum output from the athletes. We also analyze the case of an accident to 2 of the athletes, in which the coupling of the remaining 3 athletes with 5 different athletic events yielding the maximum matching is computed using the Hungarian algorithm.
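
    The same assignment problem can be solved outside WinQSB with SciPy's implementation of the Hungarian algorithm, as sketched below. The times are invented, since the paper's data are not reproduced in this record; minimising total time here plays the role of maximising the athletes' output.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical times (seconds) of 5 athletes over 5 events.
times = np.array([
    [11.2, 22.9, 51.0, 120.3, 14.1],
    [11.5, 23.4, 50.2, 118.9, 14.7],
    [11.1, 23.1, 52.3, 121.5, 13.9],
    [11.8, 22.7, 50.9, 119.7, 14.4],
    [11.4, 23.0, 51.5, 120.1, 14.2],
])

athletes, events = linear_sum_assignment(times)   # Hungarian algorithm
print(list(zip(athletes, events)), "total time:", times[athletes, events].sum())
```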

  12. Examples of testing global identifiability of biological and biomedical models with the DAISY software.

    Science.gov (United States)

    Saccomani, Maria Pia; Audoly, Stefania; Bellu, Giuseppina; D'Angiò, Leontina

    2010-04-01

    DAISY (Differential Algebra for Identifiability of SYstems) is a recently developed computer algebra software tool which can be used to automatically check global identifiability of (linear and) nonlinear dynamic models described by differential equations involving polynomial or rational functions. Global identifiability is a fundamental prerequisite for model identification which is important not only for biological or medical systems but also for many physical and engineering systems derived from first principles. Lack of identifiability implies that the parameter estimation techniques may not fail but any obtained numerical estimates will be meaningless. The software does not require understanding of the underlying mathematical principles and can be used by researchers in applied fields with a minimum of mathematical background. We illustrate the DAISY software by checking the a priori global identifiability of two benchmark nonlinear models taken from the literature. The analysis of these two examples includes comparison with other methods and demonstrates how identifiability analysis is simplified by this tool. We then illustrate the identifiability analysis of two other examples, including discussion of some specific aspects related to the role of observability and knowledge of initial conditions in testing identifiability and to the computational complexity of the software. The main focus of this paper is not on the description of the mathematical background of the algorithm, which has been presented elsewhere, but on illustrating its use and on some of its more interesting features. DAISY is available on the web site http://www.dei.unipd.it/~pia/. 2010 Elsevier Ltd. All rights reserved.

  13. An IMRT dose distribution study using commercial verification software

    International Nuclear Information System (INIS)

    Grace, M.; Liu, G.; Fernando, W.; Rykers, K.

    2004-01-01

    Full text: The introduction of IMRT requires users to confirm that the isodose distributions and relative doses calculated by their planning system match the doses delivered by their linear accelerators. To this end the commercially available software VeriSoft™ (PTW-Freiburg, Germany) was trialled to determine if the tools and functions it offered would be of benefit to this process. The CMS Xio (Computer Medical System) treatment planning system was used to generate IMRT plans that were delivered with an upgraded Elekta SL15 linac. Kodak EDR2 film sandwiched in RW3 solid water (PTW-Freiburg, Germany) was used to measure the IMRT fields delivered with 6 MV photons. The isodoses and profiles measured with the film generally agreed to within ± 3% or ± 3 mm with the planned doses; in some regions (outside the IMRT field) the match fell to within ± 5%. The isodose distributions of the planning system and the film could be compared on screen, which allows electronic records of the comparison to be kept if so desired. The features and versatility of this software have been of benefit to our IMRT QA program. Furthermore, the VeriSoft™ software allows for quick and accurate, automated planar film analysis. Copyright (2004) Australasian College of Physical Scientists and Engineers in Medicine

  14. Interorganizational Boundary Spanning in Global Software Development

    DEFF Research Database (Denmark)

    Søderberg, Anne-Marie; Romani, Laurence

    This paper, which draws on a case study of collaborative work in a global software development project, focuses on key boundary spanners in an Indian vendor company, who are responsible for developing trustful and sustainable client relations and coordinating complex projects across multiple cultures, languages, organisational boundaries, time zones and geographical distances. It looks into how these vendor managers get prepared for their complex boundary spanning work, which cross-cultural challenges they experience in their collaboration with Western clients, and which skills and competencies they draw on in their efforts to deal with emerging cross-cultural issues in a way that paves the ground for developing a shared understanding and common platform for the client and vendor representatives. A framework of boundary spanning leadership practices is adapted to virtuality and cultural diversity.

  15. PlanetLab Europe as Geographically-Distributed Testbed for Software Development and Evaluation

    Directory of Open Access Journals (Sweden)

    Dan Komosny

    2015-01-01

    Full Text Available In this paper, we analyse the use of PlanetLab Europe for the development and evaluation of geographically-oriented Internet services. PlanetLab is a global research network whose main purpose is to support the development of new Internet services and protocols. PlanetLab is divided into several branches; one of them is PlanetLab Europe. PlanetLab Europe consists of about 350 nodes at 150 geographically different sites. The nodes are accessible by remote login, and users can run their software on the nodes. In the paper, we study PlanetLab's properties that are significant for its use as a geographically distributed testbed. This includes node position accuracy, service availability and stability. We find a considerable number of location inaccuracies and a number of services that cannot be considered reliable. Based on the results, we propose a simple approach to node selection in testbeds for the development and evaluation of geographically-oriented Internet services.

  16. Social software in global software development

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2010-01-01

    variety of tools such as: instant messaging, internet forums, mailing lists, blogs, wikis, social network sites, social bookmarking, social libraries, virtual worlds. Though normally rather belonging to the private realm, the use of social software in corporate context has been reported, e.g. as a way...

  17. Globalized Newton-Krylov-Schwarz Algorithms and Software for Parallel Implicit CFD

    Science.gov (United States)

    Gropp, W. D.; Keyes, D. E.; McInnes, L. C.; Tidriri, M. D.

    1998-01-01

    Implicit solution methods are important in applications modeled by PDEs with disparate temporal and spatial scales. Because such applications require high resolution with reasonable turnaround, "routine" parallelization is essential. The pseudo-transient matrix-free Newton-Krylov-Schwarz (Psi-NKS) algorithmic framework is presented as an answer. We show that, for the classical problem of three-dimensional transonic Euler flow about an M6 wing, Psi-NKS can simultaneously deliver: globalized, asymptotically rapid convergence through adaptive pseudo-transient continuation and Newton's method; reasonable parallelizability for an implicit method through deferred synchronization and favorable communication-to-computation scaling in the Krylov linear solver; and high per-processor performance through attention to distributed memory and cache locality, especially through the Schwarz preconditioner. Two discouraging features of Psi-NKS methods are their sensitivity to the coding of the underlying PDE discretization and the large number of parameters that must be selected to govern convergence. We therefore distill several recommendations from our experience and from our reading of the literature on various algorithmic components of Psi-NKS, and we describe a freely available, MPI-based portable parallel software implementation of the solver employed here.
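
    The matrix-free Newton-Krylov idea at the core of Psi-NKS can be illustrated on a much smaller problem. The sketch below assumes SciPy's general-purpose newton_krylov solver and uses a 1-D steady viscous Burgers equation as a stand-in for the Euler flow case; it illustrates only the algorithmic principle, not the MPI-based parallel implementation described in the record.

```python
import numpy as np
from scipy.optimize import newton_krylov

N = 100
h = 1.0 / (N + 1)
nu = 0.05  # viscosity (illustrative value)

def residual(u):
    """Residual of steady viscous Burgers: u*u_x - nu*u_xx, with u(0)=0, u(1)=1."""
    F = np.zeros_like(u)
    up = np.concatenate(([0.0], u, [1.0]))  # append Dirichlet boundary values
    for i in range(1, N + 1):
        conv = up[i] * (up[i + 1] - up[i - 1]) / (2 * h)
        diff = nu * (up[i + 1] - 2 * up[i] + up[i - 1]) / h ** 2
        F[i - 1] = conv - diff
    return F

u0 = np.linspace(0.0, 1.0, N)  # initial guess
# Matrix-free Newton iteration with a GMRES inner (Krylov) solver.
sol = newton_krylov(residual, u0, method="gmres", f_tol=1e-8)
print("residual norm:", np.linalg.norm(residual(sol)))
```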

  18. Global Earthquake Hazard Frequency and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Frequency and Distribution is a 2.5 minute grid utilizing Advanced National Seismic System (ANSS) Earthquake Catalog data of actual...

  19. SDMdata: A Web-Based Software Tool for Collecting Species Occurrence Records.

    Directory of Open Access Journals (Sweden)

    Xiaoquan Kong

    Full Text Available It is important to easily and efficiently obtain high quality species distribution data for predicting the potential distribution of species using species distribution models (SDMs). There is a need for a powerful software tool to automatically or semi-automatically assist in identifying and correcting errors. Here, we use Python to develop a web-based software tool (SDMdata) to easily collect occurrence data from the Global Biodiversity Information Facility (GBIF) and to check species names and the accuracy of coordinates (latitude and longitude). It is open source software (licensed under the GNU Affero General Public License/AGPL), allowing anyone to access and manipulate the source code. SDMdata is available online free of charge.
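
    A hedged sketch of the kind of workflow the record describes: pull occurrence records for a species from the public GBIF occurrence API and flag obviously bad coordinates. The species name and the simple range checks are illustrative assumptions, not SDMdata's actual implementation.

```python
import requests

GBIF_URL = "https://api.gbif.org/v1/occurrence/search"

def fetch_occurrences(species, limit=100):
    """Query the GBIF occurrence search API for georeferenced records of a species."""
    params = {"scientificName": species, "hasCoordinate": "true", "limit": limit}
    resp = requests.get(GBIF_URL, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json().get("results", [])

def coordinate_ok(record):
    """Basic sanity checks on latitude/longitude (illustrative, not exhaustive)."""
    lat = record.get("decimalLatitude")
    lon = record.get("decimalLongitude")
    if lat is None or lon is None:
        return False
    return -90 <= lat <= 90 and -180 <= lon <= 180 and not (lat == 0 and lon == 0)

records = fetch_occurrences("Cenchrus ciliaris")
good = [r for r in records if coordinate_ok(r)]
print(f"{len(good)} of {len(records)} records passed the basic coordinate checks")
```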

  20. How Does Software Process Improvement Address Global Software Engineering?

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    For decades, Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. To set up, guide, and carry out SPI projects, and to measure SPI state, impact, and success, a multitude of different SPI approaches and considerable...

  1. Global Drought Hazard Frequency and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Drought Hazard Frequency and Distribution is a 2.5 minute grid based upon the International Research Institute for Climate Prediction's (IRI) Weighted Anomaly...

  2. Distributed Sensor Network Software Development Testing through Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Brennan, Sean M. [Univ. of New Mexico, Albuquerque, NM (United States)

    2003-12-01

    The distributed sensor network (DSN) presents a novel and highly complex computing platform with difficulties and opportunities that are just beginning to be explored. The potential of sensor networks extends from monitoring for threat reduction, to conducting instant and remote inventories, to ecological surveys. Developing and testing for robust and scalable applications is currently practiced almost exclusively in hardware. The Distributed Sensors Simulator (DSS) is an infrastructure that allows the user to debug and test software for DSNs independent of hardware constraints. The flexibility of DSS allows developers and researchers to investigate topological, phenomenological, networking, robustness and scaling issues, to explore arbitrary algorithms for distributed sensors, and to defeat those algorithms through simulated failure. The user specifies the topology, the environment, the application, and any number of arbitrary failures; DSS provides the virtual environmental embedding.

  3. Global Software Development: A Review of the State-Of-The-Art (2007-2011)

    DEFF Research Database (Denmark)

    Ali Babar, Muhammad; Zahedi, Mansooreh

    a state-of-the-art review of the GSD research literature published in the main venue of Global Software Engineering in order to identify the main research trends and gaps that need to be filled by future research. We were also interested in placing the findings of our review with respect to a practice......-driven GSD research agenda. Method: We used a structured literature review methodology for which we decided to select and review the recently published research papers (i.e., 2007 - 2011) from the International Conference on Global Software Engineering (ICGSE). We used a framework for organizing GSD research...... challenges and threats and a practice-driven research agenda for extracting and organizing the data from the reviewed papers. We used theoretical reasoning for classifying the reviewed papers under different categories, which were mainly based on the framework, a decision that also enabled us to propose...

  4. Why does site visit matter in global software development: A knowledge-based perspective

    DEFF Research Database (Denmark)

    Zahedi, Mansooreh; Babar, Muhammad Ali

    2016-01-01

    Context: Face-to-Face (F2F) interaction is a strong means to foster social relationships and effective knowledge sharing within a team. However, communication in Global Software Development (GSD) teams is usually restricted to computer-mediated conversation that is perceived to be less effective...

  5. Distributed Software Development with One Hand Tied Behind the Back

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Münch, Jürgen

    2016-01-01

    Software development consists to a large extent of human-based processes with continuously increasing demands regarding interdisciplinary team work. Understanding the dynamics of software teams can be seen as highly important to successful project execution. Hence, for future project managers......, knowledge about non-technical processes in teams is significant. In this paper, we present a course unit that provides an environment in which students can learn and experience the role of different communication patterns in distributed agile software development. In particular, students gain awareness...... in virtual teams. We provide a detailed design of the course unit to allow for implementation in further courses. Furthermore, we provide experiences obtained from implementing this course unit with 16 graduate students. We observed students struggling with technical aspects and team coordination in general...

  6. Software Test Description (STD) for the Globally Relocatable Navy Tide/Atmospheric Modeling System (PCTides)

    National Research Council Canada - National Science Library

    Posey, Pamela

    2002-01-01

    The purpose of this Software Test Description (STD) is to establish formal test cases to be used by personnel tasked with the installation and verification of the Globally Relocatable Navy Tide/Atmospheric Modeling System (PCTides...

  7. Modeling a distributed environment for a petroleum reservoir engineering application with software product line

    International Nuclear Information System (INIS)

    Scheidt, Rafael de Faria; Vilain, Patrícia; Dantas, M A R

    2014-01-01

    Petroleum reservoir engineering is a complex and interesting field that requires a large amount of computational resources to achieve successful results. Usually, software environments for this field are developed without taking into account the possible interactions and extensibility required by reservoir engineers. In this paper, we present a research work characterized by the design and implementation of a software product line model for a real distributed reservoir engineering environment. Experimental results indicate the successful use of this approach for the design of a distributed software architecture. In addition, all components of the proposal provided greater visibility of the organization and processes for the reservoir engineers

  8. Modeling a distributed environment for a petroleum reservoir engineering application with software product line

    Science.gov (United States)

    de Faria Scheidt, Rafael; Vilain, Patrícia; Dantas, M. A. R.

    2014-10-01

    Petroleum reservoir engineering is a complex and interesting field that requires a large amount of computational resources to achieve successful results. Usually, software environments for this field are developed without taking into account the possible interactions and extensibility required by reservoir engineers. In this paper, we present a research work characterized by the design and implementation of a software product line model for a real distributed reservoir engineering environment. Experimental results indicate the successful use of this approach for the design of a distributed software architecture. In addition, all components of the proposal provided greater visibility of the organization and processes for the reservoir engineers.

  9. SOFTWARE OPEN SOURCE, SOFTWARE GRATIS?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available The enactment of the Intellectual Property Rights Law (HAKI) has opened up a new alternative: the use of open source software. The use of open source software is spreading along with current global issues in Information and Communication Technology (ICT). Several organizations and companies have begun to take open source software into consideration. There are many conceptions of open source software, ranging from software that is free of charge to software that is unlicensed. Not all of the claims about open source software are true, so the concept of open source software needs to be introduced, covering its history, its licenses and how to choose a license, as well as the considerations in selecting among the available open source software. Keywords: License, Open Source, HAKI

  10. Strategic Orientation in the Globalization of Software Firms

    Science.gov (United States)

    Dedrick, Jason; Kraemer, Kenneth L.; Carmel, Erran; Dunkle, Debora

    In the search for profits, software firms are globalizing their development activities. Some firms achieve greater profits by becoming more efficient, whereas others do so by reaching new markets; some do both. This paper creates an a priori typology of strategies based on the extent to which firms are focused on operational improvement or market access, have a dual focus or are unfocused. We find that firms with these strategies differ in degree of internationalization, organization of offshoring and performance outcomes related to offshoring. Market-oriented firms receive a greater proportion of their total revenue from sales outside the U.S., showing a greater international orientation. They keep more of their offshore development in-house via captive operations. They also are most likely to report increased non-U.S. sales as a result of offshoring. On the other hand, operations-oriented firms have lower levels of international sales, are more likely to go offshore via outsourced software development, and achieve greater costs savings and labor force flexibility as a result of offshoring. Operations-oriented firms also face more obstacles in offshoring, perhaps because of their reliance on outsourcing. Dual focus firms generally achieve some of the best of both strategies, whereas unfocused firms achieve lower cost benefits.

  11. Exploring Coordination Structures in Open Source Software Development

    NARCIS (Netherlands)

    van Hillegersberg, Jos; Harmsen, Frank; Hegeman, J.H.; Amrit, Chintan Amrit; Geisberger, Eva; Keil, Patrick; Kuhrmann, Marco

    2007-01-01

    Coordination is difficult to achieve in a large globally distributed project setting. The problem is multiplied in open source software development projects, where most of the traditional means of coordination such as plans, system-level designs, schedules and defined process are not used. In order

  12. Globalization and the income distribution between the countries

    OpenAIRE

    Georgieva Svrtinov, Vesna; Gorgieva-Trajkovska, Olivera; Temjanovski, Riste

    2014-01-01

    Globalization is a contested concept. In general, it is considered to be beneficial for economic growth. But there are also many adverse effects of globalization on growth in many developing countries: it can increase poverty and worsen the income distribution. Globalization has raised a powerful debate between optimists and pessimists, with strong arguments on both sides. The objective of this paper is to analyze the relationship betwe...

  13. Statistical distributions of optimal global alignment scores of random protein sequences

    Directory of Open Access Journals (Sweden)

    Tang Jiaowei

    2005-10-01

    Full Text Available Abstract Background The inference of homology from statistically significant sequence similarity is a central issue in sequence alignments. So far the statistical distribution function underlying the optimal global alignments has not been completely determined. Results In this study, random and real but unrelated sequences prepared in six different ways were selected as reference datasets to obtain their respective statistical distributions of global alignment scores. All alignments were carried out with the Needleman-Wunsch algorithm and optimal scores were fitted to the Gumbel, normal and gamma distributions respectively. The three-parameter gamma distribution performs the best as the theoretical distribution function of global alignment scores, as it agrees perfectly well with the distribution of alignment scores. The normal distribution also agrees well with the score distribution frequencies when the shape parameter of the gamma distribution is sufficiently large, since in that case the normal distribution can be viewed as an approximation of the gamma distribution. Conclusion We have shown that the optimal global alignment scores of random protein sequences fit the three-parameter gamma distribution function. This would be useful for the inference of homology between sequences whose relationship is unknown, through the evaluation of gamma distribution significance between sequences.
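
    The fitting step described in the record can be reproduced in outline with SciPy: fit a three-parameter gamma distribution (shape, location, scale) to a sample of optimal global alignment scores and compare it with a normal fit. The scores array below is a synthetic stand-in; in the study the scores come from Needleman-Wunsch alignments of random protein sequences.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
scores = rng.gamma(shape=20.0, scale=3.0, size=5000) - 10.0  # synthetic stand-in data

a, loc, scale = stats.gamma.fit(scores)   # three-parameter gamma fit
mu, sigma = stats.norm.fit(scores)        # normal fit for comparison

# Kolmogorov-Smirnov goodness-of-fit tests against both fitted distributions.
ks_gamma = stats.kstest(scores, "gamma", args=(a, loc, scale))
ks_norm = stats.kstest(scores, "norm", args=(mu, sigma))
print(f"gamma : shape={a:.2f}, loc={loc:.2f}, scale={scale:.2f}, KS p={ks_gamma.pvalue:.3f}")
print(f"normal: mu={mu:.2f}, sigma={sigma:.2f}, KS p={ks_norm.pvalue:.3f}")
```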

  14. Global marine plankton functional type biomass distributions : Phaeocystis spp

    NARCIS (Netherlands)

    Vogt, M.; O'Brien, C.; Peloquin, J.; Schoemann, V.; Breton, E.; Estrada, M.; Gibson, J.; Karentz, D.; van Leeuwe, M. A.; Stefels, J.; Widdicombe, C.; Peperzak, L.

    2012-01-01

    The planktonic haptophyte Phaeocystis has been suggested to play a fundamental role in the global biogeochemical cycling of carbon and sulphur, but little is known about its global biomass distribution. We have collected global microscopy data of the genus Phaeocystis and converted abundance data to

  15. Proceedings of the Workshop on software tools for distributed intelligent control systems

    Energy Technology Data Exchange (ETDEWEB)

    Herget, C.J. (ed.)

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation, to identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation, to formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools, and to recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  16. Portable software for distributed readout controllers and event builders in FASTBUS and VME

    International Nuclear Information System (INIS)

    Pordes, R.; Berg, D.; Berman, E.; Bernett, M.; Brown, D.; Constanta-Fanourakis, P.; Dorries, T.; Haire, M.; Joshi, U.; Kaczar, K.; Mackinnon, B.; Moore, C.; Nicinski, T.; Oleynik, G.; Petravick, D.; Sergey, G.; Slimmer, D.; Streets, J.; Votava, M.; White, V.

    1989-12-01

    We report on software developed as part of the PAN-DA system to support the functions of front end readout controllers and event builders in multiprocessor, multilevel, distributed data acquisition systems. For the next generation data acquisition system we have undertaken to design and implement software tools that are easily transportable to new modules. The first implementation of this software is for Motorola 68K series processor boards in FASTBUS and VME and will be used in the Fermilab accelerator run at the beginning of 1990. We use a Real Time Kernel Operating System. The software provides general connectivity tools for control, diagnosis and monitoring. 17 refs., 7 figs

  17. A Real-Time Fault Management Software System for Distributed Environments, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — DyMA-FM (Dynamic Multivariate Assessment for Fault Management) is a software architecture for real-time fault management. Designed to run in a distributed...

  18. Web-based surveillance and global Salmonella distribution, 2000-2002

    DEFF Research Database (Denmark)

    Galanis, E.; Wong, Danilo Lo Fo; Patrick, M.E.

    2006-01-01

    Salmonellae are a common cause of foodborne disease worldwide. The World Health Organization (WHO) supports international foodborne disease surveillance through WHO Global Salm-Surv and other activities. WHO Global Salm-Surv members annually report the 15 most frequently isolated Salmonella...... serotypes to a Web-based country databank. We describe the global distribution of reported Salmonella serotypes from human and nonhuman sources from 2000 to 2002. Among human isolates, Salmonella enterica serovar Enteritidis was the most common serotype, accounting for 65% of all isolates. Among nonhuman...... professionals to explore hypotheses related to the sources and distribution of salmonellae worldwide....

  19. DYNAMIC SOFTWARE TESTING MODELS WITH PROBABILISTIC PARAMETERS FOR FAULT DETECTION AND ERLANG DISTRIBUTION FOR FAULT RESOLUTION DURATION

    Directory of Open Access Journals (Sweden)

    A. D. Khomonenko

    2016-07-01

    Full Text Available Subject of Research. Software reliability and test planning models are studied, taking into account the probabilistic nature of error detection and discovery. Modeling of software testing enables the planning of resources and final quality at early stages of project execution. Methods. Two dynamic models of processes (strategies) are suggested for software testing, using an error detection probability for each software module. The Erlang distribution is used to approximate an arbitrary distribution of fault resolution duration, and the exponential distribution to approximate fault discovery. For each strategy, modified labeled graphs are built, along with differential equation systems and their numerical solutions. The latter make it possible to compute probabilistic characteristics of the test processes and states: state probabilities, distribution functions for fault detection and elimination, mathematical expectations of random variables, and the number of detected or fixed errors. Evaluation of Results. Probabilistic characteristics for software development projects were calculated using the suggested models. The strategies were compared by their quality indexes. The required debugging time to achieve the specified quality goals was calculated. The calculation results are used for time and resource planning of new projects. Practical Relevance. The proposed models make it possible to use reliability estimates for each individual module. The Erlang approximation removes restrictions on the use of arbitrary time distributions for fault resolution duration. It improves the accuracy of software test process modeling and helps to take into account the viability (power) of the tests. With the use of these models we can search for ways to improve software reliability by generating tests which detect errors with the highest probability.
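
    A brief illustration of the distributional assumptions named in the record: exponentially distributed fault-discovery times and Erlang-distributed fault-resolution durations. The rates and the shape parameter below are illustrative values, not figures taken from the paper.

```python
from scipy import stats

detect_rate = 0.5                       # faults discovered at 0.5 per hour (assumption)
resolve_shape, resolve_rate = 3, 1.0    # Erlang(k=3, lambda=1.0) resolution time (assumption)

detection = stats.expon(scale=1.0 / detect_rate)
resolution = stats.erlang(resolve_shape, scale=1.0 / resolve_rate)

t = 4.0  # hours
print(f"P(fault discovered within {t} h) = {detection.cdf(t):.3f}")
print(f"P(fault resolved within {t} h)   = {resolution.cdf(t):.3f}")
print(f"mean resolution time             = {resolution.mean():.2f} h")
```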

  20. Development of Ada language control software for the NASA power management and distribution test bed

    Science.gov (United States)

    Wright, Ted; Mackin, Michael; Gantose, Dave

    1989-01-01

    The Ada language software developed to control the NASA Lewis Research Center's Power Management and Distribution testbed is described. The testbed is a reduced-scale prototype of the electric power system to be used on space station Freedom. It is designed to develop and test hardware and software for a 20-kHz power distribution system. The distributed, multiprocessor, testbed control system has an easy-to-use operator interface with an understandable English-text format. A simple interface for algorithm writers that uses the same commands as the operator interface is provided, encouraging interactive exploration of the system.

  1. Web based parallel/distributed medical data mining using software agents

    Energy Technology Data Exchange (ETDEWEB)

    Kargupta, H.; Stafford, B.; Hamzaoglu, I.

    1997-12-31

    This paper describes an experimental parallel/distributed data mining system PADMA (PArallel Data Mining Agents) that uses software agents for local data accessing and analysis and a web based interface for interactive data visualization. It also presents the results of applying PADMA for detecting patterns in unstructured texts of postmortem reports and laboratory test data for Hepatitis C patients.

  2. Intellectual Property Protection of Software – At the Crossroads of Software Patents and Open Source Software

    OpenAIRE

    Tantarimäki, Maria

    2018-01-01

    The thesis considers the intellectual property protection of software in Europe and in the US, which is an increasingly important subject as the world is globalizing and digitalizing. The special nature of software challenges intellectual property rights. The current protection of software is based on copyright protection, but in this thesis two other options are considered: software patents and open source software. Software patents provide strong protection for software whereas the pur...

  3. Gridded Species Distribution, Version 1: Global Amphibians Presence Grids

    Data.gov (United States)

    National Aeronautics and Space Administration — The Global Amphibians Presence Grids of the Gridded Species Distribution, Version 1 is a reclassified version of the original grids of amphibian species distribution...

  4. The dBoard: a Digital Scrum Board for Distributed Software Development

    DEFF Research Database (Denmark)

    Esbensen, Morten; Tell, Paolo; Cholewa, Jacob Benjamin

    2015-01-01

    In this paper we present the dBoard - a digital Scrum Board for distributed Agile software development teams. The dBoard is designed as a 'virtual window' between two Scrum team spaces. It connects two locations with live video and audio, which is overlaid with a synchronized and interactive...... digital Scrum board, and it adapts the fidelity of the video/audio to the presence of people in front of it. The dBoard is designed to work (i) as a passive information radiator from which it is easy to get an overview of the status of work, (ii) as a media space providing awareness about the presence...... of remote co-workers, and (iii) as an active meeting support tool. The paper presents a case study of distributed Scrum in a large software company that motivates the design of the dBoard, and details the design and technical implementation of the dBoard. The paper also reports on an initial user study...

  5. Mapserver – Information Flow Management Software for The Border Guard Distributed Data Exchange System

    OpenAIRE

    Blok Marek; Kaczmarek Sylwester; Młynarczuk Magdalena; Narloch Marcin

    2016-01-01

    In this paper the architecture of the software designed for the management of position and identification data of floating and flying objects in maritime areas controlled by the Polish Border Guard is presented. The software was designed for managing information stored in a distributed system with two variants of the software: one for a mobile device installed on a vessel, an airplane or a car, and a second for a central server. The details of the implementation of all functionalities of the MapServer in bo...

  6. The global distribution of deep-water Antipatharia habitat

    Science.gov (United States)

    Yesson, Chris; Bedford, Faye; Rogers, Alex D.; Taylor, Michelle L.

    2017-11-01

    Antipatharia are a diverse group of corals with many species found in deep water. Many Antipatharia are habitat for associates, have extreme longevity and some species can occur beyond 8500 m depth. As they are major constituents of 'coral gardens', which are Vulnerable Marine Ecosystems (VMEs), knowledge of their distribution and environmental requirements is an important prerequisite for informed conservation planning, particularly where the expense and difficulty of deep-sea sampling prohibit comprehensive surveys. This study uses a global database of Antipatharia distribution data to perform habitat suitability modelling using the Maxent methodology to estimate the global extent of black coral habitat suitability. The model of habitat suitability is driven by temperature, but there is notable influence from other variables of topography, surface productivity and oxygen levels. This model can be used to predict areas of suitable habitat, which can be useful for conservation planning. The global distribution of Antipatharia habitat suitability shows a marked contrast with the distribution of specimen observations, indicating that many potentially suitable areas have not been sampled, and that sampling effort has been disproportionately focused on shallow, accessible areas inside marine protected areas (MPAs). Although 25% of Antipatharia observations are located in MPAs, only 7-8% of predicted suitable habitat is protected, which is short of the Convention on Biological Diversity target to protect 10% of ocean habitats by 2020.

  7. The Effect of Governance on Global Software Development: An Empirical Research in Transactive Memory Systems.

    NARCIS (Netherlands)

    Manteli, C.; van den Hooff, B.J.; van Vliet, J.C.

    2014-01-01

    Context The way global software development (GSD) activities are managed impacts knowledge transactions between team members. The former is captured in governance decisions, and the latter in a transactive memory system (TMS), a shared cognitive system for encoding, storing and retrieving knowledge

  8. Distributed control software of high-performance control-loop algorithm

    CERN Document Server

    Blanc, D

    1999-01-01

    The majority of industrial cooling and ventilation plants require the control of complex processes. All these processes are highly important for the operation of the machines. The stability and reliability of these processes are leading factors in the quality of the service provided. The control system architecture, and the software structure as well, are required to have high dynamic performance and robust behaviour. Intelligent systems based on PID or RST controllers are used for their high level of stability and accuracy. The design and tuning of these complex controllers require the dynamic model of the plant to be known (generally obtained by identification) and the desired performance of the various control loops to be specified in order to achieve good performance. The concept of having distributed control algorithm software provides full automation facilities with well-adapted functionality and good performance, giving the methodology, means and tools to master the dynamic process optimization an...
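
    The PID control loops mentioned in the record follow a standard discrete form; the sketch below shows that form on a toy first-order plant. The gains and the plant model are illustrative assumptions, not the tuning of the cooling and ventilation plants described here.

```python
class PID:
    """Minimal discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Toy first-order plant: the measured temperature responds to the actuator output u.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=1.0)
temperature, setpoint = 15.0, 21.0
for _ in range(30):
    u = pid.step(setpoint, temperature)
    temperature += 0.1 * (u - (temperature - 15.0)) * pid.dt  # toy plant dynamics
print(f"temperature after 30 steps: {temperature:.2f} °C (setpoint {setpoint} °C)")
```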

  9. Action-embedded transformational leadership in self-managing global information systems development teams

    NARCIS (Netherlands)

    Eseryel, U. Yeliz; Eseryel, Deniz

    While software development teams are becoming more and more distributed around the globe, most software development methodologies used by global teams prescribe self-managing teams. Transformational leadership is the key to successful information systems development and use for competitive

  10. Harmonic Domain Modeling of a Distribution System Using the DIgSILENT PowerFactory Software

    DEFF Research Database (Denmark)

    Wasilewski, J.; Wiechowski, Wojciech Tomasz; Bak, Claus Leth

    The first part of this paper presents the comparison between two models of a distribution system created in the computer simulation software PowerFactory (PF). Model A is an existing simplified equivalent model of the distribution system used by the Transmission System Operator (TSO) Eltra for balanced load...

  11. Forecasting an invasive species’ distribution with global distribution data, local data, and physiological information

    Science.gov (United States)

    Jarnevich, Catherine S.; Young, Nicholas E.; Talbert, Marian; Talbert, Colin

    2018-01-01

    Understanding invasive species distributions and potential invasions often requires broad‐scale information on the environmental tolerances of the species. Further, resource managers are often faced with knowing these broad‐scale relationships as well as nuanced environmental factors related to their landscape that influence where an invasive species occurs and potentially could occur. Using invasive buffelgrass (Cenchrus ciliaris), we developed global models and local models for Saguaro National Park, Arizona, USA, based on location records and literature on physiological tolerances to environmental factors to investigate whether environmental relationships of a species at a global scale are also important at local scales. In addition to correlative models with five commonly used algorithms, we also developed a model using a priori user‐defined relationships between occurrence and environmental characteristics based on a literature review. All correlative models at both scales performed well based on statistical evaluations. The user‐defined curves closely matched those produced by the correlative models, indicating that the correlative models may be capturing mechanisms driving the distribution of buffelgrass. Given climate projections for the region, both global and local models indicate that conditions at Saguaro National Park may become more suitable for buffelgrass. Combining global and local data with correlative models and physiological information provided a holistic approach to forecasting invasive species distributions.

  12. Understanding Structures and Affordances of Extended Teams in Global Software Development

    DEFF Research Database (Denmark)

    Ali Babar, Muhammad; Zahedi, Mansooreh

    2013-01-01

    Growing popularity of Global Software Development (GSD) has resulted in an increasing number of cross-organizational teams that are formed according to Extended Team Model (ETM). There is little known about the structures (work, social, and communication) that may exist in ETM and what affordances...... in the studied team help deal with different GSD challenges, these structures appear to have certain challenges inherent in them and the affordances they provide. We make a few recommendations for improving the current structures to deal with the observed challenges. Our findings are expected to provide insights...

  13. A resilient and secure software platform and architecture for distributed spacecraft

    Science.gov (United States)

    Otte, William R.; Dubey, Abhishek; Karsai, Gabor

    2014-06-01

    A distributed spacecraft is a cluster of independent satellite modules flying in formation that communicate via ad-hoc wireless networks. This system in space is a cloud platform that facilitates sharing sensors and other computing and communication resources across multiple applications, potentially developed and maintained by different organizations. Effectively, such an architecture can realize the functions of monolithic satellites at a reduced cost and with improved adaptivity and robustness. The openness of these architectures poses special challenges because the distributed software platform has to support applications from different security domains and organizations, and information flows have to be carefully managed and compartmentalized. If the platform is used as a robust shared resource, its management, configuration, and resilience become a challenge in themselves. We have designed and prototyped a distributed software platform for such architectures. The core element of the platform is a new operating system whose services were designed to restrict access to the network and the file system, and to enforce resource management constraints for all non-privileged processes. Mixed-criticality applications operating at different security labels are deployed and controlled by a privileged management process that also pre-configures all information flows. This paper describes the design and objectives of this layer.

  14. Simulating awareness in global software engineering: a comparative analysis of Scrum and Agile Service Networks

    NARCIS (Netherlands)

    Tamburri, D.A.; Razo Zapata, I.S.; Fernandez, H.; Tedeschi, C.

    2012-01-01

    Abstract—Global software engineering (GSE) is a business strategy to realize a business idea (i.e. the development project) faster, through round-the-clock productivity. However, GSE creates a volatile and unstable process in which many actors interact together against unpredictable premises (e.g.

  15. Global Distribution of Marine Radioactivity. Chapter 2

    International Nuclear Information System (INIS)

    Zal U'yun Wan Mahmood; Abdul Kadir Ishak; Norfaizal Mohamad; Wo, Y.M.; Kamarudin Samuding

    2015-01-01

    The global distribution of radionuclide activity in marine environments differs markedly between regions. This is because the sources of supply, the spatial and temporal scales, the season, the physical, chemical and geochemical characteristics, and the physical behaviour of the ocean (waves) all differ between them.

  16. "Figure Out How to Code with the Hands of Others”: Recognizing Cultural Blind Spots in Global Software Development

    DEFF Research Database (Denmark)

    Matthiesen, Stina; Bjørn, Pernille; Petersen, Lise Møller

    2014-01-01

    We report on an ethnographic study of an outsourcing global software development (GSD) setup between an Indian IT vendor and an IT development division of a Danish bank. We investigate how the local IT development work is shaped by the global setup in GSD and argue that the bank had cultural blin...

  17. Variability in global ocean phytoplankton distribution over 1979-2007

    Science.gov (United States)

    Masotti, I.; Alvain, S.; Moulin, C.; Antoine, D.

    2009-04-01

    Recently, reanalysis of long-term ocean color data (CZCS and SeaWiFS; Antoine et al., 2005) has shown that world ocean average phytoplankton chlorophyll levels show an increase of 20% over the last two decades. It is however unknown whether this increase is associated with a change in the distribution of phytoplankton groups or whether it simply corresponds to an increase in productivity. Within the framework of the GLOBPHY project, the distribution of the phytoplankton groups was monitored by applying the PHYSAT method (Alvain et al., 2005) to the historical ocean color data series from the CZCS, OCTS and SeaWiFS sensors. The PHYSAT algorithm allows identification of several phytoplankton groups, such as nanoeucaryotes, prochlorococcus, synechococcus and diatoms. Because both sensors (OCTS-SeaWiFS) are very similar, OCTS data were processed with the standard PHYSAT algorithm to cover the 1996-1997 period during which a large El Niño event occurred, just before the SeaWiFS era. Our analysis of this dataset (1996-2006) evidences a strong variability in the distribution of phytoplankton groups at both regional and global scales. In the equatorial region (0°-5°S), a three-fold increase in nanoeucaryotes frequency was detected, as opposed to a two-fold decrease in synechococcus, during the early stages of El Niño conditions (May-June 1997, OCTS). The impact of this El Niño is however not confined to the Equatorial Pacific and has affected the global ocean. The processing of CZCS data with PHYSAT has required several adaptations of this algorithm due to the lower performance and the reduced number of spectral bands of the sensor. Despite higher uncertainties, the phytoplankton group distribution obtained with CZCS is globally consistent with that of SeaWiFS. A comparison of variability in global phytoplankton distribution between 1979-1982 (CZCS) and 1999-2002 (SeaWiFS) suggests an increase in nanoeucaryotes at high latitudes (>40°) and in the equatorial region (10°S-10

  18. Activity Theory applied to Global Software Engineering: Theoretical Foundations and Implications for Tool Builders

    DEFF Research Database (Denmark)

    Tell, Paolo; Ali Babar, Muhammad

    2012-01-01

    Although a plethora of tools are available for Global Software Engineering (GSE) teams, it is being realized increasingly that the most prevalent desktop metaphor underpinning the majority of tools has several inherent limitations. We have proposed that Activity-Based Computing (ABC) can be a pr...... in building supporting infrastructure for GSE, and describe a proof of concept prototype.

  19. Requisite Information Collaboration and Distributed Knowledge Management in Software Development

    DEFF Research Database (Denmark)

    Petersen, Mogens K.; Bjørn, Pernille; Frank, L.

    distributed knowledge management product state models. The paper draws upon a series of discussions with Scandinavian IT Group (SIG). With an interest in how performance in their new organization develops, SIG invited the research group to study measures of organizational performance and the use and effect...... of knowledge management tools in software development. The paper does not represent the viewpoint of SIG but outlines our framework and major research questions....

  20. Global Distribution of Dissected Duricrust on Mars

    Science.gov (United States)

    Mustard, J. F.; Cooper, C. D.

    2000-01-01

    Evidence for dissected duricrust was identified in high resolution MOC images. Analysis of all available images was used to map the global distribution of this terrain. It is apparently restricted to two latitude bands: 30-60 deg. N and 30-60 deg. S.

  1. Global Land Carbon Uptake from Trait Distributions

    Science.gov (United States)

    Butler, E. E.; Datta, A.; Flores-Moreno, H.; Fazayeli, F.; Chen, M.; Wythers, K. R.; Banerjee, A.; Atkin, O. K.; Kattge, J.; Reich, P. B.

    2016-12-01

    Historically, functional diversity in land surface models has been represented through a range of plant functional types (PFTs), each of which has a single value for all of its functional traits. Here we expand the diversity of the land surface by using a distribution of trait values for each PFT. The data for these trait distributions is from a sub-set of the global database of plant traits, TRY, and this analysis uses three leaf traits: mass based nitrogen and phosphorus content and specific leaf area, which influence both photosynthesis and respiration. The data are extrapolated into continuous surfaces through two methodologies. The first, a categorical method, classifies the species observed in TRY into satellite estimates of their plant functional type abundances - analogous to how traits are currently assigned to PFTs in land surface models. Second, a Bayesian spatial method which additionally estimates how the distribution of a trait changes in accord with both climate and soil covariates. These two methods produce distinct patterns of diversity which are incorporated into a land surface model to estimate how the range of trait values affects the global land carbon budget.

  2. Mapping local and global variability in plant trait distributions

    Energy Technology Data Exchange (ETDEWEB)

    Butler, Ethan E.; Datta, Abhirup; Flores-Moreno, Habacuc; Chen, Ming; Wythers, Kirk R.; Fazayeli, Farideh; Banerjee, Arindam; Atkin, Owen K.; Kattge, Jens; Amiaud, Bernard; Blonder, Benjamin; Boenisch, Gerhard; Bond-Lamberty, Ben; Brown, Kerry A.; Byun, Chaeho; Campetella, Giandiego; Cerabolini, Bruno E. L.; Cornelissen, Johannes H. C.; Craine, Joseph M.; Craven, Dylan; de Vries, Franciska T.; Díaz, Sandra; Domingues, Tomas F.; Forey, Estelle; González-Melo, Andrés; Gross, Nicolas; Han, Wenxuan; Hattingh, Wesley N.; Hickler, Thomas; Jansen, Steven; Kramer, Koen; Kraft, Nathan J. B.; Kurokawa, Hiroko; Laughlin, Daniel C.; Meir, Patrick; Minden, Vanessa; Niinemets, Ülo; Onoda, Yusuke; Peñuelas, Josep; Read, Quentin; Sack, Lawren; Schamp, Brandon; Soudzilovskaia, Nadejda A.; Spasojevic, Marko J.; Sosinski, Enio; Thornton, Peter E.; Valladares, Fernando; van Bodegom, Peter M.; Williams, Mathew; Wirth, Christian; Reich, Peter B.

    2017-12-01

    Accurate trait-environment relationships and global maps of plant trait distributions represent a needed stepping stone in global biogeography and are critical constraints of key parameters for land models. Here, we use a global data set of plant traits to map trait distributions closely coupled to photosynthesis and foliar respiration: specific leaf area (SLA), and dry mass-based concentrations of leaf nitrogen (Nm) and phosphorus (Pm). We propose two models to extrapolate geographically sparse point data to continuous spatial surfaces. The first is a categorical model using species mean trait values, categorized into plant functional types (PFTs) and extrapolated to PFT occurrence ranges identified by remote sensing. The second is a Bayesian spatial model that incorporates information about PFT, location and environmental covariates to estimate trait distributions. Both models are further stratified by varying the number of PFTs. The performance of the models was evaluated based on their explanatory and predictive ability. The Bayesian spatial model leveraging the largest number of PFTs produced the best maps. The interpolation of full trait distributions enables a wider diversity of vegetation to be represented across the land surface. These maps may be used as input to Earth System Models and to evaluate other estimates of functional diversity.

  3. Translocality in Global Software Development

    DEFF Research Database (Denmark)

    Bjørn, Pernille; Søderberg, Anne-Marie; Krishna, S.

    2017-01-01

    What happens when agile methods are introduced in global outsourcing set-ups? Agile methods are designed to empower IT developers in decision-making through self-managing collocated teams. We studied how agile methods were introduced into global outsourcing from the Indian IT vendor's perspective. We explored how agile processes in global outsourcing impact the work conditions of the Indian IT developers, and were surprised to find that agile methodologies, even after 3 years of implementation, created a stressful and inflexible work environment that negatively impacted their personal lives. Many of the negative aspects of work, which agile methodologies were developed to reduce, were evident in the global agile outsourcing set-up. We propose translocality to repudiate the dichotomy of global/local, reminding us that methodologies and technologies must be understood as immediately localized and situated...

  4. Mapserver – Information Flow Management Software for The Border Guard Distributed Data Exchange System

    Directory of Open Access Journals (Sweden)

    Blok Marek

    2016-09-01

    Full Text Available In this paper the architecture of the software designed for the management of position and identification data of floating and flying objects in maritime areas controlled by the Polish Border Guard is presented. The software was designed for managing information stored in a distributed system with two variants of the software: one for a mobile device installed on a vessel, an airplane or a car, and a second for a central server. The details of the implementation of all functionalities of the MapServer in both the mobile and central versions are briefly presented on the basis of information flow diagrams.

  5. Globalization and the distribution of income: The economic arguments

    Science.gov (United States)

    Jones, Ronald W.

    2003-01-01

    One of the issues currently being debated in the ongoing discussion of the pros and cons of today's globalization concerns the effects of greater world trade as well as of the changes in technology on a country's internal distribution of income, especially on skilled versus unskilled wage rates. In this article, I attempt to spell out some of the arguments concerning internal income distribution that have been put forth both by labor economists and international trade theorists. The impact of globalization on the wage premium between the skilled and unskilled may not be as obvious as is first imagined. PMID:12960390

  6. A systematic review of knowledge sharing challenges and practices in global software development

    DEFF Research Database (Denmark)

    Zahedi, Mansooreh; Shahin, Mojtaba; Babar, Muhammad Ali

    2016-01-01

    Context: Global Software Development (GSD) presents significant challenges to share and understand knowledge required for developing software. Organizations are expected to implement appropriate practices to address knowledge-sharing challenges in GSD. With the growing literature on GSD and its...... the data extracted from the reviewed primary studies. Results: Our findings revealed that knowledge sharing challenges and practices in GSD could be classified in 6 main themes: management, team structure, work processes/practices, team cognition, social attributes and technology. In regard to contextual...... and practices fall under the theme of “work practices”. (c) The technology related knowledge-sharing challenges are the least reported; we discussed the available technologies for supporting knowledge sharing needs in GSD. (d) The organizational contextual information is missing from a large number of studies...

  7. Collaborative Windows – A User Interface Concept for Distributed Collaboration

    DEFF Research Database (Denmark)

    Esbensen, Morten

    2016-01-01

    where close collaboration and frequent meetings drive the work. One way to achieve this way of working is to implement the Scrum software development framework. Implementing Scrum in a globalized context, however, requires transforming the Scrum development methods to a distributed setup and extensive use...... of collaboration technologies. In this dissertation, I explore how novel collaboration technologies can support closely coupled distributed work such as that in distributed Scrum. This research is based on three different studies: an ethnographic field study of distributed Scrum between Danish and Indian software...

  8. Global and local consistencies in distributed fault diagnosis for discrete-event systems

    NARCIS (Netherlands)

    Su, R.; Wonham, W.M.

    2005-01-01

    In this paper, we present a unified framework for distributed diagnosis. We first introduce the concepts of global and local consistency in terms of supremal global and local supports, then present two distributed diagnosis problems based on them. After that, we provide algorithms to achieve

  9. Towards an Understanding of Enabling Process Knowing in Global Software Development: A Case Study

    DEFF Research Database (Denmark)

    Zahedi, Mansooreh; Babar, Muhammad Ali

    2014-01-01

    Shared understanding of Software Engineering (SE) processes, which we call process knowing, is required for effective communication and coordination within a team in order to improve team performance. SE process knowledge can include roles, responsibilities and the flow of information over a project lifecycle. Developing and sustaining process knowledge can be more challenging in Global Software Development (GSD). GSD distances can limit the ability of a team to develop a common understanding of processes. Anecdotes of the problems caused by lack of common understanding of processes...... challenges of lack of process knowing and how an organization can enable process knowing for achieving the desired results that also help in increasing social interactions and positive behavioral changes...

  10. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1999-12-31

    In this paper a prototype of a Requirement Tracking and Verification System (RTVS) for a Distributed Control System was implemented and tested. The RTVS is a software design and verification tool. The main functions required of the RTVS are the management, tracking and verification of the software requirements listed in the documentation of the DCS. The analysis of DCS software design procedures and interfaces with documents was performed to define the users of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)

  11. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    In this paper a prototype of a Requirement Tracking and Verification System (RTVS) for a Distributed Control System was implemented and tested. The RTVS is a software design and verification tool. The main functions required of the RTVS are the management, tracking and verification of the software requirements listed in the documentation of the DCS. The analysis of DCS software design procedures and interfaces with documents was performed to define the users of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)

  12. NASA JPL Distributed Systems Technology (DST) Object-Oriented Component Approach for Software Inter-Operability and Reuse

    Science.gov (United States)

    Hall, Laverne; Hung, Chaw-Kwei; Lin, Imin

    2000-01-01

    The purpose of this paper is to provide a description of the NASA JPL Distributed Systems Technology (DST) Section's object-oriented component approach to open inter-operable systems software development and software reuse. It will address what is meant by the term object component software, give an overview of the component-based development approach and how it relates to infrastructure support of software architectures and promotes reuse, enumerate the benefits of this approach, and give examples of application prototypes demonstrating its usage and advantages. Utilization of the object-oriented component technology approach for system development and software reuse will apply to several areas within JPL, and possibly across other NASA Centers.

  13. Systematic Software Development

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel

    2015-01-01

    The speed of innovation and the global allocation of resources to accelerate development or to reduce cost put pressure on the software industry. In the global competition, especially so-called high-price countries have to present arguments why the higher development cost is justified and what...... makes these countries an attractive host for software companies. Often, high-quality engineering and excellent quality of products, e.g., machinery and equipment, are mentioned. Yet, the question is: Can such arguments be also found for the software industry? We aim at investigating the degree...... of professionalism and systematization of software development to draw a map of strengths and weaknesses. To this end, we conducted as a first step an exploratory survey in Germany, presented in this paper. In this survey, we focused on the perceived importance of the two general software engineering process areas...

  14. The global distribution of mineral dust

    International Nuclear Information System (INIS)

    Tegen, I; Schepanski, K

    2009-01-01

    Dust aerosol particles produced by wind erosion in arid and semi-arid regions affect climate and air quality, but the magnitude of these effects is largely unquantified. The major dust source regions include the Sahara and the Arabian and Asian deserts; global annual dust emissions are currently estimated to range between 1000 and 3000 Mt/yr. Dust aerosol can be transported over long distances of thousands of kilometers, e.g. from source regions in the Saharan desert over the North Atlantic, or from the Asian deserts towards the Pacific Ocean. The atmospheric dust load varies considerably on different timescales. While dust aerosol distribution and dust effects are important on global scales, they strongly depend on dust emissions that are controlled on small spatial and temporal scales.

  15. Global distribution of urban parameters derived from high-resolution global datasets for weather modelling

    Science.gov (United States)

    Kawano, N.; Varquez, A. C. G.; Dong, Y.; Kanda, M.

    2016-12-01

    Numerical models such as the Weather Research and Forecasting model coupled with a single-layer Urban Canopy Model (WRF-UCM) are among the powerful tools used to investigate the urban heat island. Urban parameters such as average building height (Have), plan area index (λp) and frontal area index (λf) are necessary inputs for the model. In general, these parameters are assumed uniform in WRF-UCM, but this leads to an unrealistic urban representation. Distributed urban parameters can also be incorporated into WRF-UCM to consider detailed urban effects. The problem is that distributed building information is not readily available for most megacities, especially in developing countries. Furthermore, acquiring real building parameters often requires a huge amount of time and money. In this study, we investigated the potential of using globally available satellite-captured datasets for the estimation of the parameters Have, λp, and λf. The global datasets comprised a high spatial resolution population dataset (LandScan by Oak Ridge National Laboratory), nighttime lights (NOAA), and vegetation fraction (NASA). True samples of Have, λp, and λf were acquired from actual building footprints from satellite images and the 3D building databases of Tokyo, New York, Paris, Melbourne, Istanbul, Jakarta and so on. Regression equations were then derived from the block-averaging of spatial pairs of real parameters and global datasets. Results show that two regression curves to estimate Have and λf from the combination of population and nightlight are necessary, depending on the city's level of development. An index which can be used to decide which equation to use for a city is the Gross Domestic Product (GDP). On the other hand, λp has less dependence on GDP but indicated a negative relationship to vegetation fraction. Finally, a simplified but precise approximation of urban parameters through readily-available, high-resolution global datasets and our derived regressions can be utilized to estimate a
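
    The regression step described in the record can be sketched with ordinary least squares: estimate average building height (Have) from population density and nighttime-light intensity. The arrays below are synthetic placeholders for the block-averaged training pairs derived from real building footprints, so the fitted coefficients carry no physical meaning.

```python
import numpy as np

rng = np.random.default_rng(2)
population = rng.uniform(1e2, 2e4, size=500)   # people per grid cell (synthetic)
nightlight = rng.uniform(5, 63, size=500)      # nighttime-light DN values (synthetic)
have_true = 2.0 + 0.0008 * population + 0.15 * nightlight
have_obs = have_true + rng.normal(0.0, 1.5, size=500)  # "observed" Have with noise

# Ordinary least squares: Have ~ intercept + population + nightlight
X = np.column_stack([np.ones_like(population), population, nightlight])
coef, *_ = np.linalg.lstsq(X, have_obs, rcond=None)
print("intercept, population coef, nightlight coef:", np.round(coef, 4))
```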

  16. Patterns of inequality: Dynamics of income distribution in USA and global energy consumption distribution

    Science.gov (United States)

    Banerjee, Anand; Yakovenko, Victor

    2010-03-01

    Applying the principle of entropy maximization, we argued that the distribution of money in a closed economic system should be exponential [1], see also the recent review [2]. In this talk, we show that the income distribution in the USA is exponential for the majority of the population (about 97%). However, the high-income tail follows a power law and is highly dynamical, i.e., out of equilibrium. The fraction of income going to the tail swelled to 20% of all income in 2000 and 2006 at the peaks of speculative bubbles followed by spectacular crashes. Next, we analyze the global distribution of energy consumption per capita among different countries. In the first approximation, it is reasonably well captured by the exponential function. Comparing the data for 1990 and 2005, we observe that the distribution is getting closer to the exponential, presumably as a result of globalization of the world economy. [1] A. A. Dragulescu and V. M. Yakovenko, Eur. Phys. J. B 17, 723 (2000). [2] V. M. Yakovenko and J. B. Rosser, to appear in Rev. Mod. Phys. (2009), arXiv:0905.1518.
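
    A toy sketch of the two-class picture described above can make it concrete: an exponential bulk characterized by a single "income temperature" (its mean) plus a heavy power-law tail inspected on the complementary CDF. The income values below are synthetic and do not reproduce the authors' data or analysis.

      import numpy as np

      rng = np.random.default_rng(0)
      incomes = np.concatenate([
          rng.exponential(scale=40_000, size=9_700),          # ~97% exponential bulk
          40_000 * (1 + rng.pareto(a=1.5, size=300)),         # ~3% power-law tail
      ])

      # The exponential "temperature" T is the maximum-likelihood mean of the bulk.
      bulk = np.sort(incomes)[: int(0.97 * incomes.size)]
      T = bulk.mean()

      # Complementary CDF: an exponential bulk is a straight line on a log-linear plot.
      x = np.sort(incomes)
      ccdf = 1.0 - np.arange(1, x.size + 1) / x.size
      print(f"fitted income temperature T = {T:,.0f}")
      print("CCDF at 3T (empirical vs exponential):",
            ccdf[np.searchsorted(x, 3 * T)], np.exp(-3))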

  17. A fully traits-based approach to modeling global vegetation distribution

    NARCIS (Netherlands)

    Bodegom, van P.M.; Douma, J.C.; Verheijen, L.M.

    2014-01-01

    Dynamic Global Vegetation Models (DGVMs) are indispensable for our understanding of climate change impacts. The application of traits in DGVMs is increasingly refined. However, a comprehensive analysis of the direct impacts of trait variation on global vegetation distribution does not yet exist.

  18. An alternative model to distribute VO software to WLCG sites based on CernVM-FS: a prototype at PIC Tier1

    International Nuclear Information System (INIS)

    Lanciotti, E; Merino, G; Blomer, J; Bria, A

    2011-01-01

    In a distributed computing model such as WLCG, experiment-specific application software has to be efficiently distributed to any site of the Grid. Application software is currently installed in a shared area of the site, visible to all Worker Nodes (WNs) of the site through some protocol (NFS, AFS or other). The software is installed at the site by jobs which run on a privileged node of the computing farm where the shared area is mounted in write mode. This model presents several drawbacks which cause a non-negligible rate of job failures. An alternative model for software distribution based on the CERN Virtual Machine File System (CernVM-FS) has been tried at PIC, the Spanish Tier1 site of WLCG. The test bed used and the results are presented in this paper.

  19. Orchestration of Globally Distributed Knowledge for Innovation in Multinational Companies

    DEFF Research Database (Denmark)

    Sajadirad, Solmaz; Lassen, Astrid Heidemann

    Conducting a multiple-case study in five companies from Danish industry, this paper explores how multinational companies orchestrate knowledge from their globally distributed subsidiaries for innovation. Comparisons of knowledge orchestration within headquarters and subsidiaries for improvement...... and innovation show that a combination of the dynamic use of inter-firm objects and a well-established knowledge orchestration process underlies knowledge orchestration for innovation in multinational companies, as it advances headquarters’ abilities to effectively acquire, evaluate, disseminate, and utilize...... globally distributed knowledge. This study contributes to the understanding of knowledge orchestration between headquarters and distributed subsidiaries in multinational companies and how it is related to innovation. Specifically, this paper has important implications regarding the use of inter-firm objects...

  20. Harmonic Domain Modeling of a Distribution System Using the DIgSILENT PowerFactory Software

    OpenAIRE

    Wasilewski, J.; Wiechowski, Wojciech Tomasz; Bak, Claus Leth

    2005-01-01

    The first part of this paper presents the comparison between two models of a distribution system created in the computer simulation software PowerFactory (PF). Model A is an existing simplified equivalent model of the distribution system used by the Transmission System Operator (TSO) Eltra for balanced load-flow calculations and stability studies. Model B is an accurate model of the distribution system created on the basis of detailed data of the investigated network and is used as a reference. The har...

  1. Climate Controls AM Fungal Distributions from Global to Local Scales

    Science.gov (United States)

    Kivlin, S. N.; Hawkes, C.; Muscarella, R.; Treseder, K. K.; Kazenel, M.; Lynn, J.; Rudgers, J.

    2016-12-01

    Arbuscular mycorrhizal (AM) fungi have key functions in terrestrial biogeochemical processes; thus, determining the relative importance of climate, edaphic factors, and plant community composition on their geographic distributions can improve predictions of their sensitivity to global change. Local adaptation by AM fungi to plant hosts, soil nutrients, and climate suggests that all of these factors may control fungal geographic distributions, but their relative importance is unknown. We created species distribution models for 142 AM fungal taxa at the global scale with data from GenBank. We compared climate variables (BioClim and soil moisture), edaphic variables (phosphorus, carbon, pH, and clay content), and plant variables using model selection on models with (1) all variables, (2) climatic variables only (including soil moisture) and (3) resource-related variables only (all other soil parameters and NPP) using the MaxEnt algorithm evaluated with ENMEval. We also evaluated whether drivers of AM fungal distributions were phylogenetically conserved. To test whether global correlates of AM fungal distributions were reflected at local scales, we then surveyed AM fungi in nine plant hosts along three elevation gradients in the Upper Gunnison Basin, Colorado, USA. At the global scale, the distributions of 55% of AM fungal taxa were affected by both climate and soil resources, whereas 16% were only affected by climate and 29% were only affected by soil resources. Even for AM fungi that were affected by both climate and resources, the effects of climatic variables nearly always outweighed those of resources. Soil moisture and isothermality were the main climatic and NPP and soil carbon the main resource related factors influencing AM fungal distributions. Distributions of closely related AM fungal taxa were similarly affected by climate, but not by resources. Local scale surveys of AM fungi across elevations confirmed that climate was a key driver of AM fungal

  2. Software quality in 1997

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C. [Software Productivity Research, Inc., Burlington, MA (United States)

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  3. Prediction of monthly average global solar radiation based on statistical distribution of clearness index

    International Nuclear Information System (INIS)

    Ayodele, T.R.; Ogunjuyigbe, A.S.O.

    2015-01-01

    In this paper, a probability distribution of the clearness index is proposed for the prediction of global solar radiation. First, the clearness index is obtained from past data of global solar radiation; then, the parameters of the distribution that best fits the clearness index are determined. The global solar radiation is thereafter predicted from the clearness index using the inverse transformation of the cumulative distribution function. To validate the proposed method, eight years of global solar radiation data (2000–2007) for Ibadan, Nigeria are used to determine the parameters of the appropriate probability distribution for the clearness index. The calculated parameters are then used to predict the future monthly average global solar radiation for the following year (2008). The predicted values are compared with the measured values using four statistical tests: the Root Mean Square Error (RMSE), the Mean Absolute Error (MAE), the Mean Absolute Percentage Error (MAPE) and the coefficient of determination (R²). The proposed method is also compared to existing regression models. The results show that the logistic distribution provides the best fit for the clearness index of Ibadan and that the proposed method is effective in predicting the monthly average global solar radiation, with an overall RMSE of 0.383 MJ/m²/day, MAE of 0.295 MJ/m²/day, MAPE of 2% and R² of 0.967. - Highlights: • A distribution of the clearness index is proposed for the prediction of global solar radiation. • The clearness index is obtained from past data of global solar radiation. • The parameters of the distribution that best fit the clearness index are determined. • Solar radiation is predicted from the clearness index using the inverse transformation. • The method is effective in predicting the monthly average global solar radiation.
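
    The prediction scheme lends itself to a short sketch using standard statistics libraries: fit a distribution to historical clearness-index values, draw uniform quantiles, and invert the CDF to obtain predicted indices that are then scaled by extraterrestrial radiation. The clearness-index sample and the extraterrestrial radiation value below are made up for illustration; only the logistic choice follows the paper.

      import numpy as np
      from scipy import stats

      kt_history = np.array([0.42, 0.55, 0.61, 0.48, 0.58, 0.50, 0.63, 0.45, 0.57, 0.52])

      # Fit the logistic distribution (reported as the best fit for Ibadan).
      loc, scale = stats.logistic.fit(kt_history)

      # Inverse transformation of the CDF: uniform quantiles -> clearness index.
      u = np.random.default_rng(1).uniform(size=30)
      kt_predicted = np.clip(stats.logistic.ppf(u, loc=loc, scale=scale), 0.0, 1.0)

      H0 = 35.0                               # assumed extraterrestrial radiation, MJ/m2/day
      H_predicted = kt_predicted * H0         # predicted global solar radiation
      print(H_predicted.mean())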

  4. Global time-size distribution of volcanic eruptions on Earth.

    Science.gov (United States)

    Papale, Paolo

    2018-05-01

    Volcanic eruptions differ enormously in their size and impacts, ranging from quiet lava flow effusions along the volcano flanks to colossal events with the potential to affect our entire civilization. Knowledge of the time and size distribution of volcanic eruptions is of obvious relevance for understanding the dynamics and behavior of the Earth system, as well as for defining global volcanic risk. From the analysis of recent global databases of volcanic eruptions extending back more than 2 million years, I show here that the return times of eruptions with similar magnitude follow an exponential distribution. The associated relative frequency of eruptions with different magnitude displays a power-law, scale-invariant distribution over at least six orders of magnitude. These results suggest that similar mechanisms underlie explosive eruptions from small to colossal, raising concerns about the theoretical possibility of predicting the magnitude and impact of impending volcanic eruptions.
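
    The two statistical findings can be illustrated with a small sketch: the rate of an exponential return-time distribution is estimated from the mean inter-event time, and a power-law magnitude-frequency relation shows up as a straight line of log-counts against magnitude. The events below are synthetic, not the databases analysed in the paper.

      import numpy as np

      rng = np.random.default_rng(2)

      # Exponential return times: the maximum-likelihood rate is 1 / mean interval.
      inter_event_years = rng.exponential(scale=120.0, size=200)
      rate = 1.0 / inter_event_years.mean()
      print(f"estimated eruption rate: {rate:.4f} per year")

      # Power-law relative frequency of magnitudes: slope of log10(count) vs magnitude.
      magnitudes = np.floor(rng.pareto(a=1.0, size=5000) + 4)
      m, counts = np.unique(magnitudes[magnitudes <= 9], return_counts=True)
      slope = np.polyfit(m, np.log10(counts), 1)[0]
      print(f"log-frequency slope: {slope:.2f} per magnitude unit")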

  5. Mapping the spatial distribution of global anthropogenic mercury atmospheric emission inventories

    Science.gov (United States)

    Wilson, Simon J.; Steenhuisen, Frits; Pacyna, Jozef M.; Pacyna, Elisabeth G.

    This paper describes the procedures employed to spatially distribute global inventories of anthropogenic emissions of mercury to the atmosphere, prepared by Pacyna, E.G., Pacyna, J.M., Steenhuisen, F., Wilson, S. [2006. Global anthropogenic mercury emission inventory for 2000. Atmospheric Environment, this issue, doi:10.1016/j.atmosenv.2006.03.041], and briefly discusses the results of this work. A new spatially distributed global emission inventory for the (nominal) year 2000, and a revised version of the 1995 inventory, are presented. Emission estimates for total mercury and major species groups are distributed within latitude/longitude-based grids with resolutions of 1°×1° and 0.5°×0.5°. A key component in the spatial distribution procedure is the use of population distribution as a surrogate parameter to distribute emissions from sources that cannot be accurately geographically located. In this connection, new gridded population datasets were prepared, based on the CIESIN GPW3 datasets (CIESIN, 2004. Gridded Population of the World (GPW), Version 3. Center for International Earth Science Information Network (CIESIN), Columbia University and Centro Internacional de Agricultura Tropical (CIAT). GPW3 data are available at http://beta.sedac.ciesin.columbia.edu/gpw/index.jsp). The spatially distributed emission inventories and population datasets prepared in the course of this work are available on the Internet at www.amap.no/Resources/HgEmissions/

  6. Center for Adaptive Optics | Software

    Science.gov (United States)

    The Center for Adaptive Optics acts as a clearing house for distributing software to institutes and gives specialists in adaptive optics a place to distribute their software. All software is shared on an "as-is" basis and users should consult with the software authors with any

  7. Seasonal distributions of diabatic heating during the First GARP Global Experiment

    OpenAIRE

    Ying Wei, Ming; Johnson, Donald R.; Townsend, Ronald D.

    2011-01-01

    The seasonal and annual global distributions of diabatic heating during the First GARP Global Experiment (FGGE) are estimated using the isentropic mass continuity equation. The data used are from the FGGE Level IIIa analyses generated by the United States National Meteorological Center. Spatially and temporally coherent diabatic heating distributions are obtained from the isentropic planetary scale mass circulation that is forced by large-scale heat sources and sinks. The diabatic heating in...

  8. Distributed computing for global health

    CERN Multimedia

    CERN. Geneva; Schwede, Torsten; Moore, Celia; Smith, Thomas E; Williams, Brian; Grey, François

    2005-01-01

    Distributed computing harnesses the power of thousands of computers within organisations or over the Internet. In order to tackle global health problems, several groups of researchers have begun to use this approach to exceed by far the computing power of a single lab. This event illustrates how companies, research institutes and the general public are contributing their computing power to these efforts, and what impact this may have on a range of world health issues. Grids for neglected diseases Vincent Breton, CNRS/EGEE This talk introduces the topic of distributed computing, explaining the similarities and differences between Grid computing, volunteer computing and supercomputing, and outlines the potential of Grid computing for tackling neglected diseases where there is little economic incentive for private R&D efforts. Recent results on malaria drug design using the Grid infrastructure of the EU-funded EGEE project, which is coordinated by CERN and involves 70 partners in Europe, the US and Russi...

  9. Delivering LHC software to HPC compute elements

    CERN Document Server

    Blomer, Jakob; Hardi, Nikola; Popescu, Radu

    2017-01-01

    In recent years, there was a growing interest in improving the utilization of supercomputers by running applications of experiments at the Large Hadron Collider (LHC) at CERN when idle cores cannot be assigned to traditional HPC jobs. At the same time, the upcoming LHC machine and detector upgrades will produce some 60 times higher data rates and challenge LHC experiments to use so far untapped compute resources. LHC experiment applications are tailored to run on high-throughput computing resources and they have a different anatomy than HPC applications. LHC applications comprise a core framework that allows hundreds of researchers to plug in their specific algorithms. The software stacks easily accumulate to many gigabytes for a single release. New releases are often produced on a daily basis. To facilitate the distribution of these software stacks to world-wide distributed computing resources, LHC experiments use a purpose-built, global, POSIX file system, the CernVM File System. CernVM-FS pre-processes dat...

  10. Global quantity for dyons with various charge distributions

    International Nuclear Information System (INIS)

    Koh, I.G.; Kim, Y.

    1980-06-01

    The spatial volume integral of Tr(*F F) characterizes the dyons globally. This integral is investigated for dyons with various electric and magnetic charge distributions, which can be probed by the scattering of a test particle in these dyon fields. (author)

  11. Global distribution of pauses observed with satellite measurements

    Indian Academy of Sciences (India)

    We present global distribution of altitudes and temperatures of these pauses observed with long-term space borne high- ... metries between northern and southern hemispheres continue up to the mesopause. We analyze ..... the mean temperature increases from the equa- .... monsoon circulation causes zonal asymmetry in.

  12. BETR global - A geographically-explicit global-scale multimedia contaminant fate model

    International Nuclear Information System (INIS)

    MacLeod, Matthew; Waldow, Harald von; Tay, Pascal; Armitage, James M.; Woehrnschimmel, Henry; Riley, William J.; McKone, Thomas E.; Hungerbuhler, Konrad

    2011-01-01

    We present two new software implementations of the BETR Global multimedia contaminant fate model. The model uses steady-state or non-steady-state mass-balance calculations to describe the fate and transport of persistent organic pollutants using a desktop computer. The global environment is described using a database of long-term average monthly conditions on a 15° x 15° grid. We demonstrate BETR Global by modeling the global sources, transport, and removal of decamethylcyclopentasiloxane (D5). - Two new software implementations of the Berkeley-Trent Global Contaminant Fate Model are available. The new model software is illustrated using a case study of the global fate of decamethylcyclopentasiloxane (D5).
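
    The steady-state mass-balance idea behind box models of this kind can be sketched generically: first-order transfer and loss coefficients between compartments go into a matrix A, constant emissions into a vector e, and the steady-state masses solve A m + e = 0. The three-compartment system and all rate constants below are invented for illustration and are not BETR Global's parameterization.

      import numpy as np

      # Compartments: 0 = air, 1 = water, 2 = soil. Units: 1/h for rates, kg/h for emissions.
      k_air_water, k_water_air = 0.02, 0.005
      k_air_soil, k_soil_air = 0.01, 0.001
      k_loss = np.array([0.05, 0.01, 0.002])          # degradation / advection out

      A = np.array([
          [-(k_air_water + k_air_soil + k_loss[0]), k_water_air,                 k_soil_air],
          [k_air_water,                             -(k_water_air + k_loss[1]),  0.0],
          [k_air_soil,                              0.0,                         -(k_soil_air + k_loss[2])],
      ])
      e = np.array([100.0, 5.0, 0.0])                  # emissions to each compartment

      m_steady = np.linalg.solve(A, -e)                # masses satisfying A m + e = 0
      print(m_steady)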

  13. Global spatiotemporal distribution of soil respiration modeled using a global database

    Science.gov (United States)

    Hashimoto, S.; Carvalhais, N.; Ito, A.; Migliavacca, M.; Nishina, K.; Reichstein, M.

    2015-07-01

    The flux of carbon dioxide from the soil to the atmosphere (soil respiration) is one of the major fluxes in the global carbon cycle. At present, the accumulated field observation data cover a wide range of geographical locations and climate conditions. However, there are still large uncertainties in the magnitude and spatiotemporal variation of global soil respiration. Using a global soil respiration data set, we developed a climate-driven model of soil respiration by modifying and updating Raich's model, and the global spatiotemporal distribution of soil respiration was examined using this model. The model was applied at a spatial resolution of 0.5° and a monthly time step. Soil respiration was divided into the heterotrophic and autotrophic components of respiration using an empirical model. The estimated mean annual global soil respiration was 91 Pg C yr-1 (between 1965 and 2012; Monte Carlo 95 % confidence interval: 87-95 Pg C yr-1) and increased at the rate of 0.09 Pg C yr-2. The contribution of soil respiration from boreal regions to the total increase in global soil respiration was on the same order of magnitude as that of tropical and temperate regions, despite a lower absolute magnitude of soil respiration in boreal regions. The estimated annual global heterotrophic respiration and global autotrophic respiration were 51 and 40 Pg C yr-1, respectively. The global soil respiration responded to the increase in air temperature at the rate of 3.3 Pg C yr-1 °C-1, and Q10 = 1.4. Our study scaled up observed soil respiration values from field measurements to estimate global soil respiration and provide a data-oriented estimate of global soil respiration. The estimates are based on a semi-empirical model parameterized with over one thousand data points. Our analysis indicates that the climate controls on soil respiration may translate into an increasing trend in global soil respiration and our analysis emphasizes the relevance of the soil carbon flux from soil to
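
    The reported temperature sensitivity can be illustrated with the usual Q10 response that climate-driven respiration models build on; the reference rate and reference temperature below are illustrative assumptions, and the sketch omits the moisture dependence of the actual model.

      def soil_respiration(temp_c, r_ref=91.0, t_ref=15.0, q10=1.4):
          """Respiration (Pg C / yr) at air temperature temp_c, scaled from a reference rate."""
          return r_ref * q10 ** ((temp_c - t_ref) / 10.0)

      # +1 degree C of warming relative to the reference climate:
      print(soil_respiration(16.0) - soil_respiration(15.0))   # ~3.1 Pg C / yr increase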

  14. NEAMS Software Licensing, Release, and Distribution: Implications for FY2013 Work Package Planning

    International Nuclear Information System (INIS)

    Bernholdt, David E.

    2012-01-01

    The vision of the NEAMS program is to bring truly predictive modeling and simulation (M&S) capabilities to the nuclear engineering community in order to enable a new approach to the analysis of nuclear systems. NEAMS anticipates issuing in FY 2018 a full release of its computational 'Fermi Toolkit' aimed at advanced reactors and fuel cycles. The NEAMS toolkit involves extensive software development activities, some of which have already been underway for several years; however, the Advanced Modeling and Simulation Office (AMSO), which sponsors the NEAMS program, has not yet issued any official guidance regarding software licensing, release, and distribution policies. This motivated an FY12 task in the Capability Transfer work package to develop and recommend an appropriate set of policies. The current preliminary report is intended to provide awareness of issues with implications for work package planning for FY13. We anticipate a small amount of effort associated with putting into place formal licenses and contributor agreements for NEAMS software that does not already have them. We do not anticipate any additional effort or costs associated with software release procedures or schedules beyond those dictated by the quality expectations for the software. The largest potential costs we anticipate are associated with the setup and maintenance of shared code repositories for development and early access to NEAMS software products. We also anticipate an opportunity, with modest associated costs, to work with the Radiation Safety Information Computational Center (RSICC) to clarify export control assessment policies for software under development.

  15. Longitudinal observations of globally distributed design teams: The impacts on Product Development

    DEFF Research Database (Denmark)

    Taylor, Thomas Paul; Ahmed-Kristensen, Saeema

    2015-01-01

    Factors impacting the success of Product Development (PD) projects are intensified when teams are distributed globally, making it a challenging task for project management to deal with effects on time, cost and quality. It is important for project management to understand when challenges......, such as communication difficulties, a lack of common vision between team members or issues related to documentation, may occur during PD projects, enabling them to take the necessary preventative action (Edmondson and Nembhard, 2009). When investigating factors impacting the success of PD, the majority of research...... studies of globally distributed design teams in PD projects. This paper aims to contribute to the further understanding of the factors impacting the success of PD projects when teams are distributed globally. With the results from a longitudinal observational study over 8 months, the factors impacting...

  16. Free software, Open source software, licenses. A short presentation including a procedure for research software and data dissemination

    OpenAIRE

    Gomez-Diaz , Teresa

    2014-01-01

    4 pages. A Spanish version is available: "Software libre, software de código abierto, licencias. Donde se propone un procedimiento de distribución de software y datos de investigación" (Free software, open source software, licenses: where a procedure for the distribution of research software and data is proposed). The main goal of this document is to help the research community to understand the basic concepts of software distribution: free software, open source software, licenses. This document also includes a procedure for research software and data dissemination.

  17. DAISY: a new software tool to test global identifiability of biological and physiological systems.

    Science.gov (United States)

    Bellu, Giuseppina; Saccomani, Maria Pia; Audoly, Stefania; D'Angiò, Leontina

    2007-10-01

    A priori global identifiability is a structural property of biological and physiological models. It is considered a prerequisite for well-posed estimation, since it concerns the possibility of recovering uniquely the unknown model parameters from measured input-output data, under ideal conditions (noise-free observations and an error-free model structure). Of course, determining whether the parameters can be uniquely recovered from observed data is essential before investing resources, time and effort in performing actual biomedical experiments. Many interesting biological models are nonlinear, but identifiability analysis for nonlinear systems turns out to be a difficult mathematical problem. Different methods have been proposed in the literature to test identifiability of nonlinear models but, to the best of our knowledge, so far no software tools have been proposed for automatically checking identifiability of nonlinear models. In this paper, we describe a software tool implementing a differential algebra algorithm to perform parameter identifiability analysis for (linear and) nonlinear dynamic models described by polynomial or rational equations. Our goal is to provide the biological investigator with a completely automated software tool, requiring minimal prior knowledge of mathematical modelling and no in-depth understanding of the mathematical tools. The DAISY (Differential Algebra for Identifiability of SYstems) software will potentially be useful in biological modelling studies, especially in physiology and clinical medicine, where research experiments are particularly expensive and/or difficult to perform. Practical examples of use of the software tool DAISY are presented. DAISY is available at the web site http://www.dei.unipd.it/~pia/.

  18. Data acquisition software for the CMS strip tracker

    International Nuclear Information System (INIS)

    Bainbridge, R; Cripps, N; Fulcher, J; Radicci, V; Wingham, M; Baulieu, G; Bel, S; Delaere, C; Drouhin, F; Gill, K; Mirabito, L; Cole, J; Jesus, A C A; Giassi, A; Giordano, D; Gross, L; Hahn, K; Mersi, S; Nikolic, M; Tkaczyk, S

    2008-01-01

    The CMS silicon strip tracker, providing a sensitive area of approximately 200 m² and comprising 10 million readout channels, has recently been completed at the tracker integration facility at CERN. The strip tracker community is currently working to develop and integrate the online and offline software frameworks, known as XDAQ and CMSSW respectively, for the purposes of data acquisition and detector commissioning and monitoring. Recent developments have seen the integration of many new services and tools within the online data acquisition system, such as event building, online distributed analysis, an online monitoring framework, and data storage management. We review the various software components that comprise the strip tracker data acquisition system and the software architectures used for stand-alone and global data-taking modes. Our experiences in commissioning and operating one of the largest ever silicon micro-strip tracking systems are also reviewed.

  19. Cloud manufacturing distributed computing technologies for global and sustainable manufacturing

    CERN Document Server

    Mehnen, Jörn

    2013-01-01

    Global networks, which are the primary pillars of the modern manufacturing industry and supply chains, can only cope with the new challenges, requirements and demands when supported by new computing and Internet-based technologies. Cloud Manufacturing: Distributed Computing Technologies for Global and Sustainable Manufacturing introduces a new paradigm for scalable, service-oriented, sustainable and globally distributed manufacturing systems. The eleven chapters in this book provide an updated overview of the latest technological developments and applications in relevant research areas. Following an introduction to the essential features of Cloud Computing, chapters cover a range of methods and applications such as the factors that actually affect adoption of Cloud Computing technology in manufacturing companies and a new geometrical simplification method to stream 3-dimensional design and manufacturing data via the Internet. This is further supported by case studies and real-life data for Waste Electrical ...

  20. Nuclear model codes and related software distributed by the OECD/NEA Data Bank

    International Nuclear Information System (INIS)

    Sartori, E.

    1993-01-01

    Software and data for nuclear energy applications is acquired, tested and distributed by several information centres; in particular, relevant computer codes are distributed internationally by the OECD/NEA Data Bank (France) and by ESTSC and EPIC/RSIC (United States). This activity is coordinated among the centres and is extended outside the OECD area through an arrangement with the IAEA. This article covers more specifically the availability of nuclear model codes and also those codes which further process their results into data sets needed for specific nuclear application projects. (author). 2 figs

  1. The Newcastle connection: A software subsystem for constructing distributed UNIX systems

    International Nuclear Information System (INIS)

    Randell, B.

    1985-01-01

    The Newcastle connection is a software subsystem that can be added to each of a set of physically interconnected UNIX or UNIX look-alike systems, so as to construct a distributed system which is functionally indistinguishable at both the user and the program level from a conventional single-processor UNIX system. The techniques used are applicable to a variety and multiplicity of both local and wide area networks, and enable all issues of inter-processor communication, network protocols, etc., to be hidden. A brief account is given of experience with such distributed systems, the first of which was constructed in 1982 using a set of PDP11s running UNIX Version 7, and connected by a Cambridge Ring - since this date the Connection has been used to construct distributed systems based on various other computers and versions of UNIX, both at Newcastle and elsewhere. The final sections compare our scheme to various precursor schemes and discuss its potential relevance to other operating systems. (orig.)

  2. The social disutility of software ownership.

    Science.gov (United States)

    Douglas, David M

    2011-09-01

    Software ownership allows the owner to restrict the distribution of software and to prevent others from reading the software's source code and building upon it. However, free software is released to users under software licenses that give them the right to read the source code, modify it, reuse it, and distribute the software to others. Proponents of free software such as Richard M. Stallman and Eben Moglen argue that the social disutility of software ownership is a sufficient justification for prohibiting it. This social disutility includes the social instability of disregarding laws and agreements covering software use and distribution, inequality of software access, and the inability to help others by sharing software with them. Here I consider these and other social disutility claims against withholding specific software rights from users, in particular, the rights to read the source code, duplicate, distribute, modify, imitate, and reuse portions of the software within new programs. I find that generally while withholding these rights from software users does cause some degree of social disutility, only the rights to duplicate, modify and imitate cannot legitimately be denied to users on this basis. The social disutility of withholding the rights to distribute the software, read its source code and reuse portions of it in new programs is insufficient to prohibit software owners from denying them to users. A compromise between the software owner and user can minimise the social disutility of withholding these particular rights from users. However, the social disutility caused by software patents is sufficient for rejecting such patents as they restrict the methods of reducing social disutility possible with other forms of software ownership.

  3. Distributed Arithmetic for Efficient Base-Band Processing in Real-Time GNSS Software Receivers

    Directory of Open Access Journals (Sweden)

    Grégoire Waelchli

    2010-01-01

    The growing market of GNSS-capable mobile devices is driving interest in GNSS software solutions, as they can share many system resources (processor, memory), reducing both the size and the cost of their integration. Indeed, with the increasing performance of modern processors, it now becomes feasible to implement in software a multichannel GNSS receiver operating in real time. However, a major issue with this approach is the large computing resources required for the base-band processing, in particular for the correlation operations. Therefore, new algorithms need to be developed in order to reduce the overall complexity of the receiver architecture. Towards that aim, this paper first introduces the challenges of the software implementation of a GPS receiver, with a main focus given to the base-band processing and correlation operations. It then describes the already existing solutions and, from this, introduces a new algorithm based on distributed arithmetic.
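
    A much simplified sketch of the lookup-table principle is given below for 1-bit quantised samples: the multiply-accumulate correlation is replaced by lookups into precomputed partial sums of the local code, indexed by groups of incoming sign bits. Real distributed-arithmetic implementations operate on the bit planes of multi-bit samples; this toy only conveys the idea and is not the algorithm proposed in the paper.

      import numpy as np

      rng = np.random.default_rng(4)
      N, K = 1024, 4                                    # samples per integration, group size
      code = rng.choice([-1, 1], size=N)                # local replica (e.g. PRN chips)
      signal = code * rng.choice([1, 1, 1, -1], size=N) # noisy 1-bit received samples

      # Precompute, for every group of K code chips, the partial sum for each of the
      # 2**K possible sign patterns of the incoming samples.
      groups = code.reshape(-1, K)
      patterns = np.array([[1 if (p >> k) & 1 else -1 for k in range(K)] for p in range(2 ** K)])
      lut = patterns @ groups.T                         # shape (2**K, N//K)

      # Correlate: turn each group of incoming signs into a pattern index, then look up.
      bits = (signal.reshape(-1, K) > 0).astype(int)
      idx = (bits * (1 << np.arange(K))).sum(axis=1)
      corr_da = lut[idx, np.arange(N // K)].sum()

      assert corr_da == int(np.dot(signal, code))       # matches the direct correlation
      print(corr_da)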

  4. Isobio software: biological dose distribution and biological dose volume histogram from physical dose conversion using linear-quadratic-linear model.

    Science.gov (United States)

    Jaikuna, Tanwiwat; Khadsiri, Phatchareewan; Chawapun, Nisa; Saekho, Suwit; Tharavichitkul, Ekkasit

    2017-02-01

    To develop an in-house software program that is able to calculate and generate the biological dose distribution and biological dose volume histogram by physical dose conversion using the linear-quadratic-linear (LQL) model. The Isobio software was developed using MATLAB version 2014b to calculate and generate the biological dose distribution and biological dose volume histograms. The physical dose from each voxel in the treatment plan was extracted through the Computational Environment for Radiotherapy Research (CERR), and the accuracy was verified by the differences between the dose volume histograms from CERR and the treatment planning system. An equivalent dose in 2 Gy fractions (EQD2) was calculated using the biological effective dose (BED) based on the LQL model. The software calculation and the manual calculation were compared for EQD2 verification with paired t-test statistical analysis using IBM SPSS Statistics version 22 (64-bit). Two- and three-dimensional biological dose distributions and biological dose volume histograms were displayed correctly by the Isobio software. Different physical doses were found between CERR and the treatment planning system (TPS) in Oncentra, with 3.33% in the high-risk clinical target volume (HR-CTV) determined by D90%, 0.56% in the bladder and 1.74% in the rectum when determined by D2cc, and less than 1% in Pinnacle. The EQD2 from the software calculation and the manual calculation did not differ significantly (0.00%, with p-values of 0.820, 0.095, and 0.593 for external beam radiation therapy (EBRT) and 0.240, 0.320, and 0.849 for brachytherapy (BT) in HR-CTV, bladder, and rectum, respectively). The Isobio software is a feasible tool to generate the biological dose distribution and biological dose volume histogram for treatment plan evaluation in both EBRT and BT.
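
    For orientation, the sketch below shows the standard linear-quadratic conversion that EQD2 is built on; the full LQL model used by Isobio additionally switches to a linear dose-response above a transition dose, which this simplified example omits. The alpha/beta value and fractionation are illustrative only.

      def bed(n_fractions, dose_per_fraction, alpha_beta):
          """Biologically effective dose for n fractions of d Gy (LQ model)."""
          return n_fractions * dose_per_fraction * (1.0 + dose_per_fraction / alpha_beta)

      def eqd2(n_fractions, dose_per_fraction, alpha_beta):
          """Equivalent dose in 2 Gy fractions derived from the BED."""
          return bed(n_fractions, dose_per_fraction, alpha_beta) / (1.0 + 2.0 / alpha_beta)

      # Example: HR-CTV (alpha/beta = 10 Gy) receiving 4 brachytherapy fractions of 7 Gy.
      print(eqd2(4, 7.0, 10.0))   # ~39.7 Gy EQD2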

  5. 32Si as natural tracer : measurement, global distribution and application

    International Nuclear Information System (INIS)

    Morgenstern, U.

    1997-01-01

    Cosmogenic 32Si (half-life 140 years) can be applied to the study of environmental circulation processes in the time range of the last 1000 years, a key period for modelling past climate change. Its non-gaseous nature and fairly constant production rate are favourable to quantifying its role in environmental processes. Applications of the 32Si method have been limited due to uncertainties in the half-life and poor knowledge of its global distribution, and to its very small natural concentration. Recent developments concerning these problems will be presented with special emphasis on measurement and global distribution, and on applications in the study of groundwater recharge and flow, glacier dynamics, soil erosion rates and sedimentation in lakes and oceans. (author)

  6. Why Replacing Legacy Systems Is So Hard in Global Software Development: An Information Infrastructure Perspective

    DEFF Research Database (Denmark)

    Matthiesen, Stina; Bjørn, Pernille

    2015-01-01

    We report on an ethnographic study of an outsourcing global software development (GSD) setup between a Danish IT company and an Indian IT vendor developing a system to replace a legacy system for social services administration in Denmark. Physical distance and GSD collaboration issues tend...... to be obvious explanations for why GSD tasks fail to reach completion; however, we locate the difficulties within the technical nature of the software task itself. We use the framework of information infrastructure to show how replacing a legacy system in governmental information infrastructures includes...... the work of tracing back to knowledge concerning law, technical specifications, as well as how information infrastructures have dynamically evolved over time. Not easily carried out in a GSD setup is the work around technical tasks that requires careful examination of mundane technical aspects, standards...

  7. The Global Climate Dashboard: a Software Interface to Stream Comprehensive Climate Data

    Science.gov (United States)

    Gardiner, N.; Phillips, M.; NOAA Climate Portal Dashboard

    2011-12-01

    The Global Climate Dashboard is an integral component of NOAA's web portal to climate data, services, and value-added content for decision-makers, teachers, and the science-attentive public (www.climate.gov). The dashboard provides a rapid view of observational data that demonstrate climate change and variability, as well as outputs from the Coupled Model Intercomparison Project version 3, which was built to support the Intergovernmental Panel on Climate Change fourth assessment. The data shown in the dashboard therefore span a range of climate science disciplines with applications that serve audiences with diverse needs. The dashboard is designed with reusable software components that allow it to be implemented incrementally on a wide range of platforms including desktops, tablet devices, and mobile phones. The underlying software components support live streaming of data and provide a way of encapsulating graph styles and other presentation details into a device-independent standard format that results in a common visual look and feel across all platforms. Here we describe the pedagogical objectives, technical implementation, and deployment of the dashboard through climate.gov and partner web sites, and describe plans to develop a mobile application using the same framework.

  8. Global pyrogeography: the current and future distribution of wildfire.

    Directory of Open Access Journals (Sweden)

    Meg A Krawchuk

    Climate change is expected to alter the geographic distribution of wildfire, a complex abiotic process that responds to a variety of spatial and environmental gradients. How future climate change may alter global wildfire activity, however, is still largely unknown. As a first step to quantifying potential change in global wildfire, we present a multivariate quantification of environmental drivers for the observed, current distribution of vegetation fires using statistical models of the relationship between fire activity and resources to burn, climate conditions, human influence, and lightning flash rates at a coarse spatiotemporal resolution (100 km, over one decade). We then demonstrate how these statistical models can be used to project future changes in global fire patterns, highlighting regional hotspots of change in fire probabilities under future climate conditions as simulated by a global climate model. Based on current conditions, our results illustrate how the availability of resources to burn and climate conditions conducive to combustion jointly determine why some parts of the world are fire-prone and others are fire-free. In contrast to any expectation that global warming should necessarily result in more fire, we find that regional increases in fire probabilities may be counter-balanced by decreases at other locations, due to the interplay of temperature and precipitation variables. Despite this net balance, our models predict substantial invasion and retreat of fire across large portions of the globe. These changes could have important effects on terrestrial ecosystems since alteration in fire activity may occur quite rapidly, generating ever more complex environmental challenges for species dispersing and adjusting to new climate conditions. Our findings highlight the potential for widespread impacts of climate change on wildfire, suggesting severely altered fire regimes and the need for more explicit inclusion of fire in research

  9. Global pyrogeography: the current and future distribution of wildfire.

    Science.gov (United States)

    Krawchuk, Meg A; Moritz, Max A; Parisien, Marc-André; Van Dorn, Jeff; Hayhoe, Katharine

    2009-01-01

    Climate change is expected to alter the geographic distribution of wildfire, a complex abiotic process that responds to a variety of spatial and environmental gradients. How future climate change may alter global wildfire activity, however, is still largely unknown. As a first step to quantifying potential change in global wildfire, we present a multivariate quantification of environmental drivers for the observed, current distribution of vegetation fires using statistical models of the relationship between fire activity and resources to burn, climate conditions, human influence, and lightning flash rates at a coarse spatiotemporal resolution (100 km, over one decade). We then demonstrate how these statistical models can be used to project future changes in global fire patterns, highlighting regional hotspots of change in fire probabilities under future climate conditions as simulated by a global climate model. Based on current conditions, our results illustrate how the availability of resources to burn and climate conditions conducive to combustion jointly determine why some parts of the world are fire-prone and others are fire-free. In contrast to any expectation that global warming should necessarily result in more fire, we find that regional increases in fire probabilities may be counter-balanced by decreases at other locations, due to the interplay of temperature and precipitation variables. Despite this net balance, our models predict substantial invasion and retreat of fire across large portions of the globe. These changes could have important effects on terrestrial ecosystems since alteration in fire activity may occur quite rapidly, generating ever more complex environmental challenges for species dispersing and adjusting to new climate conditions. Our findings highlight the potential for widespread impacts of climate change on wildfire, suggesting severely altered fire regimes and the need for more explicit inclusion of fire in research on global

  10. Global study of nuclear modifications on parton distribution functions

    Directory of Open Access Journals (Sweden)

    Rong Wang

    2017-07-01

    A global analysis of nuclear medium modifications of parton distributions is presented using deeply inelastic scattering data of various nuclear targets. Two data sets are provided for quark and gluon nuclear modification factors, referred to as nIMParton16. One is from the global fit only to the experimental data of isospin-scalar nuclei (Set A), and the other is from the fit to all the measured nuclear data (Set B). The scale dependence is described by DGLAP equations with nonlinear corrections in this work. The Fermi motion and off-shell effect, nucleon swelling, and parton–parton recombination are taken into account together for modeling the complicated x-dependence of nuclear modification. The nuclear gluon shadowing in this paper is dynamically generated by the QCD evolution of parton splitting and recombination processes with zero gluon density at the input scale. Sophisticated nuclear dependence of nuclear medium effects is studied with only two free parameters. With the free parameters obtained from the global analysis, the nuclear modifications of the parton distribution functions of unmeasured nuclei can be predicted in our model. The nuclear modification of the deuteron is also predicted and shown with a recent measurement at JLab.

  11. Evaluation of Distribution Analysis Software for DER Applications

    Energy Technology Data Exchange (ETDEWEB)

    Staunton, RH

    2003-01-23

    unstoppable. In response, energy providers will be forced to both fully acknowledge the trend and plan for accommodating DER [3]. With bureaucratic barriers [4], lack of time/resources, tariffs, etc. still seen in certain regions of the country, changes still need to be made. Given continued technical advances in DER, the time is fast approaching when the industry, nation-wide, must not only accept DER freely but also provide or review in-depth technical assessments of how DER should be integrated into and managed throughout the distribution system. Characterization studies are needed to fully understand how both the utility system and DER devices themselves will respond to all reasonable events (e.g., grid disturbances, faults, rapid growth, diverse and multiple DER systems, large reactive loads). Some of this work has already begun as it relates to operation and control of DER [5] and microturbine performance characterization [6,7]. One of the most urgently needed tools that can provide these types of analyses is a distribution network analysis program in combination with models for various DER. Together, they can be used for (1) analyzing DER placement in distribution networks and (2) helping to ensure that adequate transmission reliability is maintained. Surveys of the market show products that represent a partial match to these needs; specifically, software that has been developed to plan electrical distribution systems and analyze reliability (in a near total absence of DER). The first part of this study (Sections 2 and 3 of the report) looks at a number of these software programs and provides both summary descriptions and comparisons. The second part of this study (Section 4 of the report) considers the suitability of these analysis tools for DER studies. It considers steady state modeling and assessment work performed by ORNL using one commercially available tool on feeder data provided by a southern utility. Appendix A provides a technical report on the results of

  12. Global mammal distributions, biodiversity hotspots, and conservation.

    Science.gov (United States)

    Ceballos, Gerardo; Ehrlich, Paul R

    2006-12-19

    Hotspots, which have played a central role in the selection of sites for reserves, require careful rethinking. We carried out a global examination of distributions of all nonmarine mammals to determine patterns of species richness, endemism, and endangerment, and to evaluate the degree of congruence among hotspots of these three measures of diversity in mammals. We then compare congruence of hotspots in two animal groups (mammals and birds) to assess the generality of these patterns. We defined hotspots as the richest 2.5% of cells in a global equal-area grid comparable to 1° latitude × 1° longitude. Hotspots of species richness, "endemism," and extinction threat were noncongruent. Only 1% of cells and 16% of species were common to the three types of mammalian hotspots. Congruence increased with increases in both the geographic scope of the analysis and the percentage of cells defined as being hotspots. The within-mammal hotspot noncongruence was similar to the pattern recently found for birds. Thus, assigning global conservation priorities based on hotspots is at best a limited strategy.
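
    The hotspot definition and the congruence measure described above reduce to a simple computation: threshold each diversity layer at its 97.5th percentile and intersect the resulting cell sets. The random grids below merely stand in for the richness, endemism, and threat layers.

      import numpy as np

      rng = np.random.default_rng(3)
      richness, endemism, threat = (rng.gamma(2.0, 10.0, size=10_000) for _ in range(3))

      def hotspot_cells(layer, fraction=0.025):
          """Indices of the richest `fraction` of grid cells."""
          threshold = np.quantile(layer, 1.0 - fraction)
          return set(np.flatnonzero(layer >= threshold))

      h_rich, h_end, h_thr = map(hotspot_cells, (richness, endemism, threat))
      shared = h_rich & h_end & h_thr
      print(f"cells common to all three hotspot types: {len(shared)} of {len(h_rich)}")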

  13. Global exponential stability of mixed discrete and distributively delayed cellular neural network

    International Nuclear Information System (INIS)

    Yao Hong-Xing; Zhou Jia-Yan

    2011-01-01

    This paper concerns the analysis of the global exponential stability of a class of recurrent neural networks with mixed discrete and distributed delays. It first proves the existence and uniqueness of the equilibrium point; then, by employing a Lyapunov-Krasovskii functional and the Young inequality, it gives a sufficient condition for the global exponential stability of cellular neural networks with mixed discrete and distributed delays. In addition, an example is provided to illustrate the applicability of the result. (general)

  14. Spatial distribution of coefficients for determination of global radiation in Serbia

    Directory of Open Access Journals (Sweden)

    Nikolić Jugoslav L.

    2012-01-01

    The aim of this paper is the creation of the spatial distribution of the corresponding coefficients for the indirect determination of global radiation, using all direct measurement data of this shortwave radiation balance component in Serbia in the standard climate period (1961-1990). Based on global radiation direct measurement data recorded in the past and routine measurements/observations of cloudiness and sunshine duration, maps of the spatially distributed coefficients required for the calculation of global radiation from sunshine/cloudiness at an arbitrary point on the territory of Serbia were produced. In addition, a specific verification of the proposed empirical formula was performed. This paper contributes to a wide range of practical applications, as direct measurements of global radiation are relatively rare and are not carried out in Serbia today. Significant application is possible in the domain of renewable energy sources. The development of a method for the determination of global radiation is important from the aspect of environmental protection; it also has economic importance through applications in numerous commercial projects, as it does not require special measurements or additional financial investments.
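
    The indirect determination referred to above is typically based on an Angström-Prescott type relation between relative sunshine duration and relative global radiation; the classical form is reproduced below only for orientation, since the paper's own spatially distributed coefficients are not given here.

      \[
        \frac{H}{H_0} \;=\; a + b\,\frac{S}{S_0}
      \]

    where H is the monthly mean global radiation, H0 the extraterrestrial radiation, S the measured sunshine duration, S0 the maximum possible sunshine duration, and a, b the empirical coefficients mapped over the territory.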

  15. Sharp conditions for global stability of Lotka-Volterra systems with distributed delays

    Science.gov (United States)

    Faria, Teresa

    We give a criterion for the global attractivity of a positive equilibrium of n-dimensional non-autonomous Lotka-Volterra systems with distributed delays. For a class of autonomous Lotka-Volterra systems, we show that such a criterion is sharp, in the sense that it provides necessary and sufficient conditions for the global asymptotic stability independently of the choice of the delay functions. The global attractivity of positive equilibria is established by imposing a diagonal dominance of the instantaneous negative feedback terms, and relies on auxiliary results showing the boundedness of all positive solutions. The paper improves and generalizes known results in the literature, namely by considering systems with distributed delays rather than discrete delays.
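
    For orientation, a typical form of an n-dimensional Lotka-Volterra system with distributed delays is sketched below; the notation and the precise assumptions in the paper may differ, so this should be read only as an illustration of what "distributed delays" and "instantaneous negative feedback" refer to.

      \[
        \dot{x}_i(t) \;=\; x_i(t)\Big[r_i(t) \;-\; a_i\,x_i(t) \;-\; \sum_{j=1}^{n}\int_{0}^{\infty} x_j(t-s)\,d\eta_{ij}(s)\Big],
        \qquad i = 1,\dots,n,
      \]

    where the terms a_i x_i(t) are the instantaneous negative feedbacks whose diagonal dominance is imposed, and the measures \eta_{ij} encode the distributed delays.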

  16. FORMATION OF DISTRIBUTION SYSTEMS WITH THE INVOLVEMENT OF THE GLOBAL RETAIL CHAINS

    Directory of Open Access Journals (Sweden)

    L. Kudyrko

    2015-08-01

    The purpose of the article is to analyze the features of the formation and functioning of multi-marketing distribution systems involving global retail chains, to identify the causes of conflicts between participants in the global supply chain, and to suggest possible ways to eliminate them. The article presents the author's definition of the term "global retail chain". The role and responsibilities of international retailers in the formation of marketing structures on international markets are described. The algorithm of the relationship between the participants in the traditional vertical marketing system, with the definition of its international components, is determined. The causes of conflict between participants within the distribution channel are identified.

  17. Web Based Interactive Software in International Business: The Case of the Global Market Potential System Online (GMPSO[C])

    Science.gov (United States)

    Janavaras, Basil J.; Gomes, Emanuel; Young, Richard

    2008-01-01

    This paper seeks to confirm whether students using the Global Market Potential System Online (GMPSO) web-based software (http://globalmarketpotential.com) for their class project enhanced their knowledge and understanding of international business. The challenge most business instructors and practitioners face is to determine how to bring the…

  18. The global distribution of diet breadth in insect herbivores.

    Science.gov (United States)

    Forister, Matthew L; Novotny, Vojtech; Panorska, Anna K; Baje, Leontine; Basset, Yves; Butterill, Philip T; Cizek, Lukas; Coley, Phyllis D; Dem, Francesca; Diniz, Ivone R; Drozd, Pavel; Fox, Mark; Glassmire, Andrea E; Hazen, Rebecca; Hrcek, Jan; Jahner, Joshua P; Kaman, Ondrej; Kozubowski, Tomasz J; Kursar, Thomas A; Lewis, Owen T; Lill, John; Marquis, Robert J; Miller, Scott E; Morais, Helena C; Murakami, Masashi; Nickel, Herbert; Pardikes, Nicholas A; Ricklefs, Robert E; Singer, Michael S; Smilanich, Angela M; Stireman, John O; Villamarín-Cortez, Santiago; Vodka, Stepan; Volf, Martin; Wagner, David L; Walla, Thomas; Weiblen, George D; Dyer, Lee A

    2015-01-13

    Understanding variation in resource specialization is important for progress on issues that include coevolution, community assembly, ecosystem processes, and the latitudinal gradient of species richness. Herbivorous insects are useful models for studying resource specialization, and the interaction between plants and herbivorous insects is one of the most common and consequential ecological associations on the planet. However, uncertainty persists regarding fundamental features of herbivore diet breadth, including its relationship to latitude and plant species richness. Here, we use a global dataset to investigate host range for over 7,500 insect herbivore species covering a wide taxonomic breadth and interacting with more than 2,000 species of plants in 165 families. We ask whether relatively specialized and generalized herbivores represent a dichotomy rather than a continuum from few to many host families and species attacked and whether diet breadth changes with increasing plant species richness toward the tropics. Across geographic regions and taxonomic subsets of the data, we find that the distribution of diet breadth is fit well by a discrete, truncated Pareto power law characterized by the predominance of specialized herbivores and a long, thin tail of more generalized species. Both the taxonomic and phylogenetic distributions of diet breadth shift globally with latitude, consistent with a higher frequency of specialized insects in tropical regions. We also find that more diverse lineages of plants support assemblages of relatively more specialized herbivores and that the global distribution of plant diversity contributes to but does not fully explain the latitudinal gradient in insect herbivore specialization.

  19. Pascal software structures achieve definite control of the 24 MFTF sustaining neutral-beam power supplies

    International Nuclear Information System (INIS)

    Anon.

    1982-01-01

    Precise control of large, complex systems is not assured unless there are known to be no unintended interactions in the control system. The software controlling the sustaining neutral-beam power supplies of the Mirror Fusion Test Facility accomplishes this feat. The software structures comprise some 16,000 lines of commented Pascal code, distributed among 10 different tasks. Each task may control any of the 24 power supplies. All the tasks are strictly event-driven and are not subject to any system mode. Since there is no global information in the software, we know that all the power supplies are controlled independently

  20. The assessment of water loss from a damaged distribution pipe using the FEFLOW software

    Directory of Open Access Journals (Sweden)

    Iwanek Małgorzata

    2017-01-01

    Common reasons for real water loss in distribution systems are leakages caused by failures or pipe breakages. Depending on the intensity of the leakage from a damaged buried pipe, water can flow to the soil surface just after the failure occurs, much later, or never at all. The localization of the place where the pipe breakage occurred is relatively easy when the water outflow reaches the soil surface. The volume of lost water strongly depends on the time it takes to localize the place of the pipe breakage. The aim of this paper was to predict the volume of water lost between the moment a failure occurs and the moment of water outflow on the soil surface, during a prospective failure in a distribution system. The basis of the analysis was a numerical simulation of a water pipe failure using the FEFLOW v. 5.3 software (Finite Element subsurface FLOW system) for a real middle-sized distribution system. Simulations were conducted for variants depending on the pipe diameter (80÷200 mm) and for the minimal and maximal hydraulic pressure head in the system (20.14 and 60.41 m H2O, respectively). The FEFLOW software application made it possible to select places in the water system where possible failures would be difficult to detect.

  1. Residence time distribution software analysis. User's manual

    International Nuclear Information System (INIS)

    1996-01-01

    Radiotracer applications cover a wide range of industrial activities in chemical and metallurgical processes, water treatment, mineral processing, environmental protection and civil engineering. Experiment design, data acquisition, treatment and interpretation are the basic elements of tracer methodology. The application of radiotracers to determine the impulse response, such as the residence time distribution (RTD), as well as the technical conditions for conducting experiments in industry and in the environment, creates a need for data processing using special software. Important progress has been made during recent years in the preparation of software programs for data treatment and interpretation. The software package developed for industrial process analysis and diagnosis by stimulus-response methods contains all the methods needed for processing data from radiotracer experiments.
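
    The core stimulus-response computation such a package automates can be illustrated with a short, hedged sketch: normalize a measured tracer response into a residence time distribution E(t) and take its first moment. The function name and the synthetic pulse data below are illustrative and are not part of the package described above.

```python
import numpy as np

def rtd_from_tracer(t, c):
    """Turn a measured tracer response c(t) into a residence time distribution E(t)
    and return the mean residence time (the first moment of E)."""
    t = np.asarray(t, dtype=float)
    c = np.asarray(c, dtype=float)
    area = np.trapz(c, t)          # total tracer recovered
    E = c / area                   # E(t) integrates to 1
    mean_rt = np.trapz(t * E, t)   # mean residence time
    return E, mean_rt

# Illustrative pulse-injection data (arbitrary units).
t = np.linspace(0, 60, 121)        # minutes
c = t * np.exp(-t / 10.0)          # synthetic detector response
E, tau = rtd_from_tracer(t, c)
print(f"mean residence time ~ {tau:.1f} min")
```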

  2. Global distribution and evolvement of urbanization and PM2.5 (1998-2015)

    Science.gov (United States)

    Yang, Dongyang; Ye, Chao; Wang, Xiaomin; Lu, Debin; Xu, Jianhua; Yang, Haiqing

    2018-06-01

    PM2.5 concentrations have increased and become one of the major social issues accompanying rapid urbanization in many regions of the world in recent decades. Because the development of urbanization differed among regions, PM2.5 pollution also presented a discrepant distribution across the world. Thus, this paper aimed to characterize the global distribution of urbanization and PM2.5 and their evolutionary relationships. Based on global data for the proportion of the urban population and PM2.5 concentrations in 1998-2015, this paper investigated the spatial distribution, temporal variation, and evolutionary relationships of global urbanization and PM2.5. The results showed that PM2.5 presented an increasing trend along with urbanization during the study period, but there was a variety of evolutionary relationships in different countries and regions. Most countries in East Asia, Southeast Asia, South Asia, and some African countries developed with a rapid increase in both urbanization and PM2.5. Under the impact of other socioeconomic factors, such as industry and economic growth, the development of urbanization increased PM2.5 concentrations in most Asian countries and some African countries, but decreased PM2.5 concentrations in most European and American countries. The findings of this study reveal the spatial distributions of global urbanization and PM2.5 pollution and provide an interpretation of the evolution of urbanization-PM2.5 relationships, which can contribute to urbanization policy making aimed at successful PM2.5 pollution control and abatement.

  3. Analysing the Outbound logistics process enhancements in Nokia-Siemens Networks Global Distribution Center

    OpenAIRE

    Marjeta, Katri

    2011-01-01

    Marjeta, Katri. 2011. Analysing the outbound logistics process enhancements in Nokia-Siemens Networks Global Distribution Center. Master's thesis. Kemi-Tornio University of Applied Sciences. Business and Culture. Pages 57. Due to confidentiality issues, this work has been modified from its original form. The aim of this Master's thesis is to describe and analyze the outbound logistics process enhancement projects executed in Nokia-Siemens Networks Global Distribution Center after the N...

  4. Spatiotemporal distribution and national measurement of the global carbonate carbon sink.

    Science.gov (United States)

    Li, Huiwen; Wang, Shijie; Bai, Xiaoyong; Luo, Weijun; Tang, Hong; Cao, Yue; Wu, Luhua; Chen, Fei; Li, Qin; Zeng, Cheng; Wang, Mingming

    2018-06-21

    The magnitudes, spatial distributions and contributions to the global carbon budget of the global carbonate carbon sink (CCS) remain uncertain, leaving the problem of national measurement of the CCS unresolved, which directly influences the fairness of global carbon markets and emission trading. Here, based on high spatiotemporal resolution ecological and meteorological raster data and chemical field monitoring data, and combining a highly reliable machine learning algorithm with the thermodynamic dissolution equilibrium model, we estimated a new CCS of 0.89 ± 0.23 petagrams of carbon per year (Pg C yr⁻¹), amounting to 74.50% of the global net forest sink and accounting for 28.75% of terrestrial sinks or 46.81% of the missing sink. Our measurement of the CCS for 142 nations showed that Russia, Canada, China and the USA contribute over half of the global CCS. We also present the first global flux maps of the CCS at a spatial resolution of 0.05°, exhibiting two peaks, in equatorial regions (10°S to 10°N) and at low latitudes (10°N to 35°N) in the Northern Hemisphere; by contrast, there are no peaks in the Southern Hemisphere. The greatest average carbon sink flux (CCSF) for 2000 to 2014, 2.12 tC ha⁻¹ yr⁻¹, was contributed by the tropical rainforest climate near the equator, and the smallest average CCSF, 0.26 tC ha⁻¹ yr⁻¹, was found in tropical arid zones. This research estimated the magnitudes, spatial distributions, variations and contributions to the global carbon budget of the CCS with higher spatiotemporal representativeness and expandability, introducing an important sink into the terrestrial carbon sink system and the global missing sink. It can help further reveal and support our understanding of rock weathering carbon sequestration, the terrestrial carbon sink system and global carbon cycle dynamics, making our understanding of global change more comprehensive.

  5. When Distribution of Tasks and Skills Are Fundamentally Problematic

    DEFF Research Database (Denmark)

    Matthiesen, Stina; Bjørn, Pernille

    2017-01-01

    within a global software project, which relied heavily on feedback from mundane project tools utilized for everyday coordination and monitoring. Our study reveals that these tools hid serious issues relating to both the distribution of sociotechnical skills and a discharge of accountability in task...

  6. ATLAS Distributed Computing

    CERN Document Server

    Schovancova, J; The ATLAS collaboration

    2011-01-01

    The poster details the different aspects of the ATLAS Distributed Computing experience after the first year of LHC data taking. We describe the performance of the ATLAS distributed computing system and the lessons learned during the 2010 run, pointing out parts of the system which were in good shape and spotting areas which required improvements. Improvements ranged from hardware upgrades on the ATLAS Tier-0 computing pools to improve data distribution rates, through tuning of FTS channels between CERN and Tier-1s, to studying data access patterns for Grid analysis to improve the global processing rate. We show recent software development driven by operational needs, with emphasis on data management and job execution in the ATLAS production system.

  7. Tier-3 Monitoring Software Suite (T3MON) proposal

    CERN Document Server

    Andreeva, J; The ATLAS collaboration; Klimentov, A; Korenkov, V; Oleynik, D; Panitkin, S; Petrosyan, A

    2011-01-01

    The ATLAS Distributed Computing activities have so far concentrated on the "central" part of the computing system of the experiment, namely the first 3 tiers (the CERN Tier-0, the 10 Tier-1 centres and the 60+ Tier-2s). This is a coherent system for performing data processing and management on a global scale, hosting (re)processing and simulation activities down to group and user analysis. Many ATLAS institutes and national communities have built (or plan to build) Tier-3 facilities. The definition of the Tier-3 concept has been outlined (REFERENCE). Tier-3 centres consist of non-pledged resources, mostly dedicated to data analysis by geographically close or local scientific groups. Tier-3 sites comprise a range of architectures and many do not possess Grid middleware, which would render application of Tier-2 monitoring systems useless. This document describes a strategy to develop a software suite for monitoring of the Tier-3 sites. This software suite will enable local monitoring of the Tier-3 sites and the global vie...

  8. A QDWH-Based SVD Software Framework on Distributed-Memory Manycore Systems

    KAUST Repository

    Sukkari, Dalal

    2017-01-01

    This paper presents a high performance software framework for computing a dense SVD on distributed-memory manycore systems. Originally introduced by Nakatsukasa et al. (Nakatsukasa et al. 2010; Nakatsukasa and Higham 2013), the SVD solver relies on the polar decomposition using the QR Dynamically-Weighted Halley algorithm (QDWH). Although the QDWH-based SVD algorithm performs a significant amount of extra floating-point operations compared to the traditional SVD with the one-stage bidiagonal reduction, the inherent high level of concurrency associated with Level 3 BLAS compute-bound kernels ultimately compensates for the arithmetic complexity overhead. Using the ScaLAPACK two-dimensional block cyclic data distribution with a rectangular processor topology, the resulting QDWH-SVD further reduces excessive communications during the panel factorization, while increasing the degree of parallelism during the update of the trailing submatrix, as opposed to relying on the default square processor grid. After detailing the algorithmic complexity and the memory footprint of the algorithm, we conduct a thorough performance analysis and study the impact of the grid topology on the performance by looking at the communication and computation profiling trade-offs. We report performance results against state-of-the-art existing QDWH software implementations (e.g., Elemental) and their SVD extensions on large-scale distributed-memory manycore systems based on commodity Intel x86 Haswell processors and the Knights Landing (KNL) architecture. The QDWH-SVD framework achieves up to 3-fold and 8-fold speedups on the Haswell- and KNL-based platforms, respectively, against ScaLAPACK PDGESVD and turns out to be a competitive alternative for well- and ill-conditioned matrices. We finally derive a performance model based on these empirical results. Our QDWH-based polar decomposition and its SVD extension are freely available at https://github.com/ecrc/qdwh.git and https
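
    For readers unfamiliar with the approach, the following hedged sketch shows the mathematical idea behind a polar-decomposition-based SVD: factor A into an orthogonal polar factor and a symmetric positive semi-definite factor, then eigendecompose the symmetric factor. It is a single-node NumPy/SciPy illustration only; it does not implement the QDWH iteration itself, nor the ScaLAPACK block-cyclic distribution described in the paper.

```python
import numpy as np
from scipy.linalg import polar, eigh

def svd_via_polar(A):
    """SVD built from a polar decomposition, the idea underlying QDWH-SVD.

    A = Up @ H (polar), with H symmetric positive semi-definite.
    H = V @ diag(s) @ V.T (eigendecomposition), hence A = (Up @ V) @ diag(s) @ V.T.
    """
    Up, H = polar(A)            # SciPy's polar decomposition
    s, V = eigh(H)              # ascending eigenvalues of the symmetric factor
    idx = np.argsort(s)[::-1]   # reorder to the usual descending singular values
    s, V = s[idx], V[:, idx]
    return Up @ V, s, V.T

A = np.random.default_rng(0).standard_normal((6, 6))
U, s, Vt = svd_via_polar(A)
print(np.allclose(A, U @ np.diag(s) @ Vt))   # True up to round-off
```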

  9. Global carbon monoxide vertical distributions from spaceborne high-resolution FTIR nadir measurements

    Directory of Open Access Journals (Sweden)

    B. Barret

    2005-01-01

    Full Text Available This paper presents the first global distributions of CO vertical profiles retrieved from a thermal infrared FTS working in the nadir geometry. It is based on the exploitation of the high resolution and high quality spectra measured by the Interferometric Monitor of Greenhouse gases (IMG), which flew onboard the Japanese ADEOS platform in 1996-1997. The retrievals are performed with an algorithm based on the Optimal Estimation Method (OEM) and are characterized in terms of vertical sensitivity and error budget. It is found that most of the IMG measurements contain between 1.5 and 2.2 independent pieces of information about the vertical distribution of CO from the lower troposphere to the upper troposphere-lower stratosphere (UTLS). The retrievals are validated against coincident NOAA/CMDL in situ surface measurements and NDSC/FTIR total column measurements. The retrieved global distributions of CO are also found to be in good agreement with the distributions modeled by the GEOS-CHEM 3D CTM, highlighting the ability of IMG to capture the horizontal as well as the vertical structure of the CO distributions.
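
    The "independent pieces of information" quoted above is the trace of the averaging kernel in an Optimal Estimation retrieval. The sketch below is a generic linear OEM step with synthetic matrices, not the IMG processing code; all names and numbers are illustrative assumptions.

```python
import numpy as np

def oem_linear_retrieval(y, K, xa, Sa, Se):
    """One linear optimal-estimation step (Rodgers-style):
        x_hat = xa + (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 (y - K xa)
    Returns the retrieved state, the averaging kernel A, and trace(A),
    the 'degrees of freedom for signal' (independent pieces of information)."""
    Se_inv = np.linalg.inv(Se)
    Sa_inv = np.linalg.inv(Sa)
    gain = np.linalg.inv(K.T @ Se_inv @ K + Sa_inv) @ K.T @ Se_inv
    x_hat = xa + gain @ (y - K @ xa)
    A = gain @ K
    return x_hat, A, np.trace(A)

# Tiny synthetic example: 4 spectral channels constraining a 3-level profile.
rng = np.random.default_rng(0)
K = rng.standard_normal((4, 3))
x_true = np.array([1.2, 0.9, 1.1])
y = K @ x_true + 0.05 * rng.standard_normal(4)
x_hat, A, dofs = oem_linear_retrieval(y, K, xa=np.ones(3),
                                      Sa=0.25 * np.eye(3), Se=0.05**2 * np.eye(4))
print(dofs)   # roughly how many independent pieces of information the retrieval has
```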

  10. Managing distributed software development in the Virtual Astronomical Observatory

    Science.gov (United States)

    Evans, Janet D.; Plante, Raymond L.; Boneventura, Nina; Busko, Ivo; Cresitello-Dittmar, Mark; D'Abrusco, Raffaele; Doe, Stephen; Ebert, Rick; Laurino, Omar; Pevunova, Olga; Refsdal, Brian; Thomas, Brian

    2012-09-01

    The U.S. Virtual Astronomical Observatory (VAO) is a product-driven organization that provides new scientific research capabilities to the astronomical community. Software development for the VAO follows a lightweight framework that guides development of science applications and infrastructure. Challenges to be overcome include distributed development teams, part-time efforts, and highly constrained schedules. We describe the process we followed to conquer these challenges while developing Iris, the VAO application for analysis of 1-D astronomical spectral energy distributions (SEDs). Iris was successfully built and released in less than a year with a team distributed across four institutions. The project followed existing International Virtual Observatory Alliance inter-operability standards for spectral data and contributed a SED library as a by-product of the project. We emphasize lessons learned that will be folded into future development efforts. In our experience, a well-defined process that provides guidelines to ensure the project is cohesive and stays on track is key to success. Internal product deliveries with a planned test and feedback loop are critical. Release candidates are measured against use cases established early in the process, and provide the opportunity to assess priorities and make course corrections during development. Also key is the participation of a stakeholder such as a lead scientist who manages the technical questions, advises on priorities, and is actively involved as a lead tester. Finally, frequent scheduled communications (for example a bi-weekly tele-conference) assure issues are resolved quickly and the team is working toward a common vision.

  11. Phosphorus in agricultural soils: drivers of its distribution at the global scale

    Energy Technology Data Exchange (ETDEWEB)

    Ringeval, Bruno [ISPA, Villenave d' Ornon (France); Augusto, Laurent [ISPA, Villenave d' Ornon (France); Monod, Herve [Univ. Paris-Saclay, Jouy-en-Josas (France); van Apeldoorn, Dirk [Utrecht Univ., Utrecht (The Netherlands); Bouwman, Lex [Utrecht Univ., Utrecht (The Netherlands); Yang, Xiaojuan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Achat, David L. [ISPA, Villenave d' Ornon (France); Chini, Louise P. [Univ. of Maryland, College Park, MD (United States); Van Oost, Kristof [Univ. Catholique de Louvain, Louvain-la-Neuve (Belgium); Guenet, Bertrand [Univ. Paris-Saclay, Gif-sur-Yvette (France); Wang, Rong [Univ. Paris-Saclay, Gif-sur-Yvette (France); Peking Univ., Beijing (China); Decharme, Bertrand [CNRS/Meteo-France, Toulouse (France); Nesme, Thomas [ISPA, Villenave d' Ornon (France); Pellerin, Sylvain [ISPA, Villenave d' Ornon (France)

    2017-01-09

    Phosphorus (P) availability in soils limits crop yields in many regions of the world, while an excess of soil P triggers aquatic eutrophication in other regions. Numerous processes drive the global spatial distribution of P in agricultural soils, but their relative roles remain unclear. Here, we combined several global datasets describing these drivers with a soil P dynamics model to simulate the distribution of P in agricultural soils and to assess the contributions of the different drivers at the global scale. We analyzed both the labile inorganic P (PILAB), a proxy of the pool involved in plant nutrition, and the total soil P (PTOT). We found that the soil biogeochemical background (BIOG) and farming practices (FARM) were the main drivers of the spatial variability in cropland soil P content but that their contribution differed between PTOT and PILAB. Indeed, 97% of the PTOT spatial variability could be explained by BIOG, while BIOG and FARM explained 41% and 58% of the PILAB spatial variability, respectively. Other drivers such as climate, soil erosion, atmospheric P deposition and soil buffering capacity made only very small contributions. Lastly, our study is a promising approach for investigating the potential effect of P as a limiting factor for agricultural ecosystems and for global food production. Additionally, we quantified the anthropogenic perturbation of the P cycle and demonstrated how the different drivers combine to explain the global distribution of agricultural soil P.

  12. Progressive retry for software error recovery in distributed systems

    Science.gov (United States)

    Wang, Yi-Min; Huang, Yennun; Fuchs, W. K.

    1993-01-01

    In this paper, we describe a method of execution retry for bypassing software errors based on checkpointing, rollback, message reordering and replaying. We demonstrate how rollback techniques, previously developed for transient hardware failure recovery, can also be used to recover from software faults by exploiting message reordering to bypass software errors. Our approach intentionally increases the degree of nondeterminism and the scope of rollback when a previous retry fails. Examples from our experience with telecommunications software systems illustrate the benefits of the scheme.

  13. Open Source Software The Challenge Ahead

    CERN Multimedia

    CERN. Geneva

    2007-01-01

    The open source community has done amazingly well in terms of challenging the historical epicenter of computing - the supercomputer and data center - and driving change there. Linux now represents a healthy and growing share of infrastructure in large organisations globally. Apache and other infrastructural components have established the new de facto standard for software in the back office: freedom. It would be easy to declare victory. But the real challenge lies ahead - taking free software to the mass market, to your grandparents, to your nieces and nephews, to your friends. This is the next wave, and if we are to be successful we need to articulate the audacious goals clearly and loudly - because that's how the community process works best. Speaker Bio: Mark Shuttleworth founded the Ubuntu Project in early 2004. Ubuntu is an enterprise Linux distribution that is freely available worldwide and has both desktop and enterprise server editions. Mark studied finance and information technology at the Universit...

  14. Software Architecture Coupling Metric for Assessing Operational Responsiveness of Trading Systems

    Directory of Open Access Journals (Sweden)

    Claudiu VINTE

    2012-01-01

    Full Text Available The empirical observation that motivates our research relies on the difficulty of assessing the performance of a trading architecture beyond a few synthetic indicators like response time, system latency, availability or volume capacity. Trading systems involve complex software architectures of distributed resources. However, in the context of a large brokerage firm, which offers global coverage from both market and client perspectives, the term distributed gains a critical significance indeed. Offering a low latency ordering system by today's standards is relatively easily achievable, but integrating it in a flexible manner within the broader information system architecture of a broker/dealer requires operational aspects to be factored in. We propose a metric for measuring the coupling level within a software architecture, and employ it to identify architectural designs that can offer a higher level of operational responsiveness, which ultimately would raise the overall real-world performance of a trading system.
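
    The abstract does not give the metric's formula, so the sketch below shows only one generic way to score architectural coupling from module-level dependencies (the share of a component's outgoing dependencies that cross a component boundary). Component and module names are invented for illustration and are not the paper's metric.

```python
from collections import defaultdict

def coupling_scores(module_component, module_deps):
    """module_component: {module: component it belongs to}.
    module_deps: iterable of (module, depended-on module) pairs.
    Returns, per component, the share of its modules' dependencies that cross
    a component boundary - a simple architecture-level coupling score."""
    ext = defaultdict(int)
    tot = defaultdict(int)
    for src, dst in module_deps:
        c = module_component[src]
        tot[c] += 1
        if module_component[dst] != c:
            ext[c] += 1
    return {c: ext[c] / tot[c] for c in tot}

# Hypothetical trading-system modules grouped into components.
components = {"order_api": "order-entry", "order_book": "order-entry",
              "fix_gateway": "market-access", "risk_check": "risk"}
deps = [("order_api", "order_book"), ("order_api", "risk_check"),
        ("order_api", "fix_gateway"), ("risk_check", "order_book")]
print(coupling_scores(components, deps))   # e.g. {'order-entry': 0.67, 'risk': 1.0}
```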

  15. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information

  16. Global body posture and plantar pressure distribution in individuals with and without temporomandibular disorder: a preliminary study.

    Science.gov (United States)

    Souza, Juliana A; Pasinato, Fernanda; Corrêa, Eliane C R; da Silva, Ana Maria T

    2014-01-01

    The aim of this study was to evaluate body posture and the distribution of plantar pressure at physiologic rest of the mandible and during maximal intercuspal positions in subjects with and without temporomandibular disorder (TMD). Fifty-one subjects were assessed by the Diagnostic Criteria for Research on Temporomandibular Disorders and divided into a symptomatic group (21) and an asymptomatic group (30). Postural analysis for both groups was conducted using photogrammetry (SAPo version 0.68; University of São Paulo, São Paulo, Brazil). The distribution of plantar pressures was evaluated by means of baropodometry (Footwork software), at physiologic rest and maximal intercuspal positions. Of 18 angular measurements, 3 (17%) were statistically different between the groups in photogrammetric evaluation. The symptomatic group showed more pronounced cervical distance (P = .0002), valgus of the right calcaneus (P = .0122), and lower pelvic tilt (P = .0124). The baropodometry results showed the TMD subjects presented significantly higher rearfoot and lower forefoot distribution than those in the asymptomatic group. No differences were verified in maximal intercuspal position in the between-group analysis and between the 2 mandibular positions in the within-group analysis. Subjects with and without TMD presented with global body posture misalignment. Postural changes were more pronounced in the subjects with TMD. In addition, symptomatic subjects presented with abnormal plantar pressure distribution, suggesting that TMD may have an influence on the postural system. Copyright © 2014 National University of Health Sciences. Published by Elsevier Inc. All rights reserved.

  17. PROGRAMMING OF METHODS FOR THE NEEDS OF LOGISTICS DISTRIBUTION SOLVING PROBLEMS

    Directory of Open Access Journals (Sweden)

    Andrea Štangová

    2014-06-01

    Full Text Available Logistics has become one of the dominant factors affecting the successful management, competitiveness and mentality of the global economy. Distribution logistics materializes the connection between production and the consumer market. It uses different methodologies and methods of multicriterial evaluation and allocation. This thesis addresses the problem of the costs of securing the distribution of a product. It was therefore relevant to design a software product that would be helpful in solving the problems related to distribution logistics. Elodis – an electronic distribution logistics program – was designed on the basis of a theoretical analysis of the issue of distribution logistics and an analysis of the software products market. The program uses multicriterial evaluation methods to determine the appropriate type, and mathematical and geometrical methods to determine an appropriate allocation, of the distribution center, warehouse and company.
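
    A classic geometrical method for siting a distribution centre, of the kind the abstract alludes to, is the demand-weighted centre-of-gravity calculation sketched below. This is a generic textbook method with invented data, not necessarily what Elodis implements.

```python
def center_of_gravity(points):
    """points: iterable of (x, y, demand_weight) for customers or warehouses.
    Returns the demand-weighted centroid, the classic first guess for locating
    a distribution centre."""
    total = sum(w for _, _, w in points)
    x = sum(px * w for px, _, w in points) / total
    y = sum(py * w for _, py, w in points) / total
    return x, y

# Hypothetical customer coordinates (km) and annual demand weights.
customers = [(2.0, 5.0, 120), (8.0, 3.0, 300), (4.0, 9.0, 80), (6.0, 6.0, 200)]
print(center_of_gravity(customers))   # -> approximately (5.94, 4.89)
```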

  18. Distributed team cohesion – not an oxymoron. The impact of information and communications technologies on teamness in globally distributed IT projects

    Directory of Open Access Journals (Sweden)

    Olga Stawnicza

    2015-01-01

    Full Text Available Globally distributed IT projects are common practice in today's globalized world. Typically, project team members work on interdependent tasks, with a common goal to be achieved as one team. However, being split between multiple locations impedes communication among team members and hampers the development of trust. Information and communications media enable communication between geographically distributed project team members and help to create and maintain trust within project units. Communication and trust are particularly significant for fostering a feeling of oneness among project team members. Oneness, also referred to as "teamness", is repeatedly mentioned as one of the challenges facing global project teams. However, prior literature on teamness is very scarce and its importance is underrepresented. This research contributes to the field in two ways. First, the theoretical study based on a systematic literature review examines available evidence of teamness in globally distributed projects. Secondly, an empirical study based on interviews conducted with global project managers fills the current gap in literature on the link between use of ICT and establishing a sense of team unity. This paper draws practitioners' attention to the importance of striving for teamness in spite of the geographical distance that exists between project team members.

  19. Spatio-temporal distribution of global solar radiation for Mexico using GOES data

    Science.gov (United States)

    Bonifaz, R.; Cuahutle, M.; Valdes, M.; Riveros, D.

    2013-05-01

    Increased need for sustainable and renewable energies around the world requires studies of the amount and distribution of such types of energy. The distribution of global solar radiation in space and time is a key component in order to know the availability of this energy for different applications. Using GOES hourly data, the heliosat model was implemented for Mexico. Details about the model and its components are discussed step by step, and once the global solar radiation images were obtained, different time datasets (hourly, daily, monthly and seasonal) were built in order to know the spatio-temporal behavior of this type of energy. Preliminary maps of the available global solar radiation energy for Mexico are presented, and the amount and variation of the solar radiation by region are analyzed and discussed. Future work includes a better parametrization of the model using calibrated ground-station data and the use of more complex models for better results.

  20. Mapping the global distribution of livestock.

    Science.gov (United States)

    Robinson, Timothy P; Wint, G R William; Conchedda, Giulia; Van Boeckel, Thomas P; Ercoli, Valentina; Palamara, Elisa; Cinardi, Giuseppina; D'Aietti, Laura; Hay, Simon I; Gilbert, Marius

    2014-01-01

    Livestock contributes directly to the livelihoods and food security of almost a billion people and affects the diet and health of many more. With estimated standing populations of 1.43 billion cattle, 1.87 billion sheep and goats, 0.98 billion pigs, and 19.60 billion chickens, reliable and accessible information on the distribution and abundance of livestock is needed for many reasons. These include analyses of the social and economic aspects of the livestock sector; the environmental impacts of livestock such as the production and management of waste, greenhouse gas emissions and livestock-related land-use change; and large-scale public health and epidemiological investigations. The Gridded Livestock of the World (GLW) database, produced in 2007, provided modelled livestock densities of the world, adjusted to match official (FAOSTAT) national estimates for the reference year 2005, at a spatial resolution of 3 minutes of arc (about 5×5 km at the equator). Recent methodological improvements have significantly enhanced these distributions: more up-to-date and detailed sub-national livestock statistics have been collected; a new, higher resolution set of predictor variables is used; and the analytical procedure has been revised and extended to include a more systematic assessment of model accuracy and the representation of uncertainties associated with the predictions. This paper describes the current approach in detail and presents new global distribution maps at 1 km resolution for cattle, pigs and chickens, and a partial distribution map for ducks. These digital layers are made publicly available via the Livestock Geo-Wiki (http://www.livestock.geo-wiki.org), as will be the maps of other livestock types as they are produced.

  1. Aging transition in systems of oscillators with global distributed-delay coupling.

    Science.gov (United States)

    Rahman, B; Blyuss, K B; Kyrychko, Y N

    2017-09-01

    We consider a globally coupled network of active (oscillatory) and inactive (nonoscillatory) oscillators with distributed-delay coupling. Conditions for aging transition, associated with suppression of oscillations, are derived for uniform and gamma delay distributions in terms of coupling parameters and the proportion of inactive oscillators. The results suggest that for the uniform distribution increasing the width of distribution for the same mean delay allows aging transition to happen for a smaller coupling strength and a smaller proportion of inactive elements. For gamma distribution with sufficiently large mean time delay, it may be possible to achieve aging transition for an arbitrary proportion of inactive oscillators, as long as the coupling strength lies in a certain range.
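
    A minimal numerical illustration of an aging transition is sketched below, assuming instantaneous (not distributed-delay) global coupling of Stuart-Landau oscillators; parameters are arbitrary and chosen only to show how the mean field can collapse when enough oscillators are inactive. It is not the paper's distributed-delay model.

```python
import numpy as np

# N globally coupled Stuart-Landau oscillators, a fraction p of them "inactive"
# (negative growth rate). Coupling here is instantaneous, unlike the paper's
# distributed-delay coupling.
N, p, K, omega = 100, 0.6, 1.5, 2.0
alpha = np.where(np.arange(N) < p * N, -1.0, 1.0)   # inactive vs active units
rng = np.random.default_rng(1)
z = rng.standard_normal(N) + 1j * rng.standard_normal(N)

dt = 0.01
for _ in range(20000):
    mean_field = z.mean()
    dz = (alpha + 1j * omega - np.abs(z) ** 2) * z + K * (mean_field - z)
    z = z + dt * dz   # forward Euler step

# A small order parameter indicates that oscillations have been suppressed (aging).
print("order parameter |<z>| =", abs(z.mean()))
```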

  2. Federated software defined network operations for LHC experiments

    Science.gov (United States)

    Kim, Dongkyun; Byeon, Okhwan; Cho, Kihyeon

    2013-09-01

    The most well-known high-energy physics collaboration, the Large Hadron Collider (LHC), which is based on e-Science, has been facing several challenges presented by its extraordinary instruments in terms of the generation, distribution, and analysis of large amounts of scientific data. Currently, data distribution issues are being resolved by adopting an advanced Internet technology called software defined networking (SDN). Stability of SDN operations and management is required to keep the federated LHC data distribution networks reliable. Therefore, in this paper, an SDN operation architecture based on the distributed virtual network operations center (DvNOC) is proposed to enable LHC researchers to assume full control of their own global end-to-end data dissemination. This may achieve enhanced data delivery performance based on data traffic offloading with delay variation. The evaluation results indicate that the overall end-to-end data delivery performance can be improved over multi-domain SDN environments based on the proposed federated SDN/DvNOC operation framework.

  3. A Systematic Mapping Study of Tools for Distributed Software Development Teams

    DEFF Research Database (Denmark)

    Tell, Paolo; Ali Babar, Muhammad

    Context: A wide variety of technologies have been developed to support Global Software Development (GSD). However, the information about the dozens of available solutions is quite diverse and scattered, making it quite difficult to have an overview able to identify common trends and unveil research gaps. Objective: The objective of this research is to systematically identify and classify a comprehensive list of the technologies that have been developed and/or used for supporting GSD teams. Method: This study has been undertaken as a Systematic Mapping Study (SMS). Our searches identified 1958... schemas for providing a framework that can help identify the categories that have attracted a significant amount of research and commercial effort, and the research areas where there are gaps to be filled. Conclusions: The findings show that whilst commercial and open source solutions are predominantly...

  4. Conversion and distribution of bibliographic information for further use on microcomputers with database software such as CDS/ISIS

    International Nuclear Information System (INIS)

    Nieuwenhuysen, P.; Besemer, H.

    1990-05-01

    This paper describes methods to work on microcomputers with data obtained from bibliographic and related databases distributed by online data banks, on CD-ROM or on tape. We also mention some user reactions to this technique, and we list the different types of software needed to perform these services. Afterwards, we report on our development of software to convert data so that they can be entered into UNESCO's CDS/ISIS program (Version 2.3) for local database management on IBM microcomputers or compatibles; this software allows the preservation of the structure of the source data in records, fields, subfields and field occurrences. (author). 10 refs, 1 fig

  5. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabanne

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templa

  6. Software architecture 1

    CERN Document Server

    Oussalah , Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural template

  7. Terminological recommendations for software localization

    Directory of Open Access Journals (Sweden)

    Klaus-Dirk Schmitz

    2012-08-01

    Full Text Available After an explosive growth of data processing and software starting at the beginning of the 1980s, the software industry shifted toward a strong orientation in non-US markets at the beginning of the 1990s. Today we see the global marketing of software in almost all regions of the world. Since software is no longer used by IT experts only, and since European and national regulations require user interfaces, manuals and documentation to be provided in the language of the customer, the market for software translation, i.e. for software localization, is the fastest growing market in the translation business.

  8. Terminological recommendations for software localization

    Directory of Open Access Journals (Sweden)

    Klaus-Dirk Schmitz

    2009-03-01

    Full Text Available After an explosive growth of data processing and software starting at the beginning of the 1980s, the software industry shifted toward a strong orientation in non-US markets at the beginning of the 1990s. Today we see the global marketing of software in almost all regions of the world. Since software is no longer used by IT experts only, and since European and national regulations require user interfaces, manuals and documentation to be provided in the language of the customer, the market for software translation, i.e. for software localization, is the fastest growing market in the translation business.

  9. Opportunities drive the global distribution of protected areas

    Directory of Open Access Journals (Sweden)

    Germán Baldi

    2017-02-01

    Full Text Available Background Protected areas, regarded today as a cornerstone of nature conservation, result from an array of multiple motivations and opportunities. We explored at global and regional levels the current distribution of protected areas along biophysical, human, and biological gradients, and assessed to what extent protection has pursued (i) a balanced representation of biophysical environments, (ii) a set of preferred conditions (biological, spiritual, economic, or geopolitical), or (iii) existing opportunities for conservation regardless of any representation or preference criteria. Methods We used histograms to describe the distribution of terrestrial protected areas along biophysical, human, and biological independent gradients and linear and non-linear regression and correlation analyses to describe the sign, shape, and strength of the relationships. We used a random forest analysis to rank the importance of different variables related to conservation preferences and opportunity drivers, and an evenness metric to quantify representativeness. Results We find that protection at a global level is primarily driven by the opportunities provided by isolation and a low population density (variable importance = 34.6 and 19.9, respectively). Preferences play a secondary role, with a bias towards tourism attractiveness and proximity to international borders (variable importance = 12.7 and 3.4, respectively). Opportunities shape protection strongly in "North America & Australia–NZ" and "Latin America & Caribbean," while the importance of the representativeness of biophysical environments is higher in "Sub-Saharan Africa" (1.3 times the average of other regions). Discussion Environmental representativeness and biodiversity protection are top priorities in land conservation agendas. However, our results suggest that they have been minor players driving current protection at both global and regional levels. Attempts to increase their relevance will

  10. Software and the future of programming languages.

    Science.gov (United States)

    Aho, Alfred V

    2004-02-27

    Although software is the key enabler of the global information infrastructure, the amount and extent of software in use in the world today are not widely understood, nor are the programming languages and paradigms that have been used to create the software. The vast size of the embedded base of existing software and the increasing costs of software maintenance, poor security, and limited functionality are posing significant challenges for the software R&D community.

  11. A global survey of the distribution of free gas in marine sediments

    Science.gov (United States)

    Fleischer, Peter; Orsi, Tim; Richardson, Michael

    2003-10-01

    Following the work of Aubrey Anderson in the Gulf of Mexico, we have attempted to quantify the global distribution of free gas in shallow marine sediments, and have identified and indexed over one hundred documented cases in the scientific and engineering literature. Our survey confirms previous assumptions, primarily that gas bubbles are ubiquitous in the organic-rich muds of coastal waters and shallow adjacent seas. Acoustic turbidity as recorded during seismo-acoustic surveys is the most frequently cited evidence used to infer the presence of seafloor gas. Biogenic methane predominates within these shallow subbottom deposits. The survey also reveals significant imbalances in the geographic distribution of studies, which might be addressed in the future by accessing proprietary data or local studies with limited distribution. Because of their global prevalence, growing interest in gassy marine sediments is understandable as their presence has profound scientific, engineering and environmental implications.

  12. Global heating distributions for January 1979 calculated from GLA assimilated and simulated model-based datasets

    Science.gov (United States)

    Schaack, Todd K.; Lenzen, Allen J.; Johnson, Donald R.

    1991-01-01

    This study surveys the large-scale distribution of heating for January 1979 obtained from five sources of information. Through intercomparison of these distributions, with emphasis on satellite-derived information, an investigation is conducted into the global distribution of atmospheric heating and the impact of observations on the diagnostic estimates of heating derived from assimilated datasets. The results indicate a substantial impact of satellite information on diagnostic estimates of heating in regions where there is a scarcity of conventional observations. The addition of satellite data provides information on the atmosphere's temperature and wind structure that is important for estimation of the global distribution of heating and energy exchange.

  13. Path to 'Stardom' in Globally Distributed Hybrid Teams

    DEFF Research Database (Denmark)

    Sarker, Suprateek; Hove-Kirkeby, Sarah; Sarker, Saonee

    2011-01-01

    Although distributed teams have been researched extensively in information systems and decision science disciplines, a review of the literature suggests that the dominant focus has been on understanding the factors affecting performance at the team level. There has, however, been an increasing recognition that specific individuals within such teams are often critical to the team's performance. Consequently, existing knowledge about such teams may be enhanced by examining the factors that affect the performance of individual team members. This study attempts to address this need by identifying individuals who emerge as "stars" in globally distributed teams involved in knowledge work such as information systems development (ISD). Specifically, the study takes a knowledge-centered view in explaining which factors lead to "stardom" in such teams. Further, it adopts a social network approach consistent...

  14. The leaf angle distribution of natural plant populations: assessing the canopy with a novel software tool.

    Science.gov (United States)

    Müller-Linow, Mark; Pinto-Espinosa, Francisco; Scharr, Hanno; Rascher, Uwe

    2015-01-01

    Three-dimensional canopies form complex architectures with temporally and spatially changing leaf orientations. Variations in canopy structure are linked to canopy function, and they occur within the scope of genetic variability as well as in reaction to environmental factors like light, water and nutrient supply, and stress. An important key measure to characterize these structural properties is the leaf angle distribution, which in turn requires knowledge of the 3-dimensional single leaf surface. Despite a large number of 3-d sensors and methods, only a few systems are applicable for fast and routine measurements in plants and natural canopies. A suitable approach is stereo imaging, which combines depth and color information and allows for easy segmentation of green leaf material and the extraction of plant traits such as the leaf angle distribution. We developed a software package which provides tools for the quantification of leaf surface properties within natural canopies via 3-d reconstruction from stereo images. Our approach includes a semi-automatic selection process for single leaves and different modes of surface characterization via polygon smoothing or surface model fitting. Based on the resulting surface meshes, leaf angle statistics are computed on the whole-leaf level or from local derivations. We include a case study to demonstrate the functionality of our software. 48 images of small sugar beet populations (4 varieties) were analyzed on the basis of their leaf angle distribution in order to investigate seasonal, genotypic and fertilization effects on leaf angle distributions. We could show that leaf angle distributions change during the course of the season, with all varieties having a comparable development. Additionally, different varieties had different leaf angle orientations that could be separated in a principal component analysis. In contrast, nitrogen treatment had no effect on leaf angles. We show that a stereo imaging setup together with the
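
    The per-leaf geometry step such a tool performs can be illustrated with a hedged sketch: once a surface mesh is fitted, each facet's inclination follows from its normal vector. The function and the toy normals below are illustrative only, not the package's API.

```python
import numpy as np

def inclination_deg(normal):
    """Leaf inclination angle (0 deg = horizontal leaf) from a surface-normal vector,
    the kind of per-facet quantity a reconstructed leaf mesh provides."""
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    return np.degrees(np.arccos(abs(n[2])))   # angle between the normal and the vertical

# Normals of three hypothetical mesh facets.
for n in [(0, 0, 1), (0, 1, 1), (1, 0, 0)]:
    print(n, "->", round(inclination_deg(n), 1), "deg")
# (0,0,1) -> 0.0 (flat leaf), (0,1,1) -> 45.0, (1,0,0) -> 90.0 (vertical leaf)
```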

  15. Distributed Software-Attestation Defense against Sensor Worm Propagation

    Directory of Open Access Journals (Sweden)

    Jun-Won Ho

    2015-01-01

    Full Text Available Wireless sensor networks are vulnerable to sensor worm attacks in which the attacker compromises a few nodes and makes these compromised nodes initiate worm spread over the network, targeting worm infection of all nodes in the network. Several defense mechanisms have been proposed to prevent worm propagation in wireless sensor networks. Although these proposed schemes use a software diversity technique for worm propagation prevention, under the belief that different software versions do not have a common vulnerability, they have a fundamental drawback in that it is difficult to realize this belief on sensor motes. To resolve this problem, we propose an on-demand software-attestation based scheme to defend against worm propagation in sensor networks. The main idea of our proposed scheme is to perform software attestations against sensor nodes in an on-demand manner and detect the nodes infected by the worm, thereby blocking worm propagation in the network. Through analysis, we show that our proposed scheme defends against worm propagation in an efficient and robust manner. Through simulation, we demonstrate that our proposed scheme stops worm propagation at reasonable overhead while preventing a majority of sensor nodes from being infected by the worm.
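
    A minimal sketch of the challenge-response pattern behind software attestation is given below: the verifier issues a fresh nonce, the node hashes the nonce together with its code image, and the verifier compares the result against a digest computed from its reference image. This is only the bare idea with invented data; it omits everything that makes attestation hard on real motes (memory traversal order, timing checks, on-demand scheduling) and is not the paper's protocol.

```python
import hashlib, os

def attest(firmware_image: bytes, nonce: bytes) -> bytes:
    """What the sensor node computes: a digest over the nonce plus its code image."""
    return hashlib.sha256(nonce + firmware_image).digest()

def verify(node_response: bytes, known_good_image: bytes, nonce: bytes) -> bool:
    """What the verifier computes from its reference copy of the firmware."""
    return node_response == hashlib.sha256(nonce + known_good_image).digest()

good = b"\x90" * 1024                         # reference firmware image (toy data)
infected = good[:-16] + b"WORMPAYLOAD!!!!!"   # 16 bytes overwritten by a worm
nonce = os.urandom(16)

print(verify(attest(good, nonce), good, nonce))       # True  -> node passes
print(verify(attest(infected, nonce), good, nonce))   # False -> flagged as infected
```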

  16. Tool support for distributed software engineering

    NARCIS (Netherlands)

    Spanjers, H.; Ter Huurne, M.; Bendas, D.; Graaf, B.; Lormans, M.; Van Solingen, R.

    2006-01-01

    Developing a software system in collaboration with other partners, and on different geographical locations is a big challenge for organizations. In this article we first discuss a system that automates build and test processes: SoftFab. This system has been successfully applied in practice in the

  17. Global Dynamics of Infectious Disease with Arbitrary Distributed Infectious Period on Complex Networks

    Directory of Open Access Journals (Sweden)

    Xiaoguang Zhang

    2014-01-01

    Full Text Available Most current epidemic models assume that the infectious period follows an exponential distribution. However, due to individual heterogeneity and epidemic diversity, these models fail to describe the distribution of infectious periods precisely. We establish a SIS epidemic model with multistaged progression of infectious periods on complex networks, which can be used to characterize arbitrary distributions of the infectious periods of individuals. Using mathematical analysis, the basic reproduction number R0 for the model is derived. We verify that R0 depends on the average distributions of the infection periods for different types of infective individuals, which extends the general theory obtained from single-infectious-period epidemic models. It is proved that if R0<1, then the disease-free equilibrium is globally asymptotically stable; otherwise a unique endemic equilibrium exists and is globally asymptotically attractive. Finally, numerical simulations are given to confirm the validity of our theoretical results.
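
    A hedged simulation sketch of the modelling idea follows: splitting the infectious period into k stages makes its distribution Erlang rather than exponential. The graph, parameters and discrete-time approximation below are invented for illustration and are not the paper's analytical network model.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
G = nx.erdos_renyi_graph(500, 0.02, seed=0)
A = nx.to_numpy_array(G)

k, beta, gamma, dt = 3, 0.06, 0.2, 0.1       # k infection stages -> Erlang period, mean 1/gamma
stage = np.zeros(len(G), dtype=int)          # 0 = susceptible, 1..k = infectious stage
stage[rng.choice(len(G), 10, replace=False)] = 1

for _ in range(2000):
    infectious = (stage > 0).astype(float)
    pressure = A @ infectious                            # number of infectious neighbours
    p_inf = 1.0 - (1.0 - beta * dt) ** pressure          # per-step infection probability
    new_inf = (stage == 0) & (rng.random(len(G)) < p_inf)
    advance = (stage > 0) & (rng.random(len(G)) < k * gamma * dt)
    stage[advance] += 1                                  # move to the next stage
    stage[stage > k] = 0                                 # leaving the last stage -> susceptible again
    stage[new_inf] = 1

print("endemic prevalence ~", (stage > 0).mean())
```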

  18. The global distribution and dynamics of chromophoric dissolved organic matter.

    Science.gov (United States)

    Nelson, Norman B; Siegel, David A

    2013-01-01

    Chromophoric dissolved organic matter (CDOM) is a ubiquitous component of the open ocean dissolved matter pool, and is important owing to its influence on the optical properties of the water column, its role in photochemistry and photobiology, and its utility as a tracer of deep ocean biogeochemical processes and circulation. In this review, we discuss the global distribution and dynamics of CDOM in the ocean, concentrating on developments in the past 10 years and restricting our discussion to open ocean and deep ocean (below the main thermocline) environments. CDOM has been demonstrated to exert primary control on ocean color by its absorption of light energy, which matches or exceeds that of phytoplankton pigments in most cases. This has important implications for assessing the ocean biosphere via ocean color-based remote sensing and the evaluation of ocean photochemical and photobiological processes. The general distribution of CDOM in the global ocean is controlled by a balance between production (primarily microbial remineralization of organic matter) and photolysis, with vertical ventilation circulation playing an important role in transporting CDOM to and from intermediate water masses. Significant decadal-scale fluctuations in the abundance of global surface ocean CDOM have been observed using remote sensing, indicating a potentially important role for CDOM in ocean-climate connections through its impact on photochemistry and photobiology.

  19. 76 FR 32231 - International Business Machines (IBM), Sales and Distribution Business Unit, Global Sales...

    Science.gov (United States)

    2011-06-03

    ... for the workers and former workers of International Business Machines (IBM), Sales and Distribution... reconsideration alleges that IBM outsourced to India and China. During the reconsideration investigation, it was..., Armonk, New York. The subject worker group supply computer software development and maintenance services...

  20. Integrating Remote Sensing with Species Distribution Models; Mapping Tamarisk Invasions Using the Software for Assisted Habitat Modeling (SAHM).

    Science.gov (United States)

    West, Amanda M; Evangelista, Paul H; Jarnevich, Catherine S; Young, Nicholas E; Stohlgren, Thomas J; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan

    2016-10-11

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fit with satellite remote sensing data to map invasive tamarisk (Tamarix spp.) along the Arkansas River in Southeastern Colorado. The models tested included boosted regression trees (BRT), Random Forest (RF), multivariate adaptive regression splines (MARS), generalized linear model (GLM), and Maxent. These analyses were conducted using a newly developed software package called the Software for Assisted Habitat Modeling (SAHM). All models were trained with 499 presence points, 10,000 pseudo-absence points, and predictor variables acquired from the Landsat 5 Thematic Mapper (TM) sensor over an eight-month period to distinguish tamarisk from native riparian vegetation using detection of phenological differences. From the Landsat scenes, we used individual bands and calculated Normalized Difference Vegetation Index (NDVI), Soil-Adjusted Vegetation Index (SAVI), and tasseled cap transformations. All five models identified current tamarisk distribution on the landscape successfully based on threshold independent and threshold dependent evaluation metrics with independent location data. To account for model specific differences, we produced an ensemble of all five models with map output highlighting areas of agreement and areas of uncertainty. Our results demonstrate the usefulness of species distribution models in analyzing remotely sensed data and the utility of ensemble mapping, and showcase the capability of SAHM in pre-processing and executing multiple complex models.
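
    The vegetation-index predictors named above are straightforward to compute from reflectance bands; a hedged sketch follows, assuming Landsat 5 TM band 3 as red and band 4 as near-infrared and the conventional soil factor L = 0.5 for SAVI. It reproduces only the index arithmetic with toy data, not the SAHM workflow or the tasseled cap transformation.

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index from red and near-infrared reflectance."""
    return (nir - red) / (nir + red)

def savi(red, nir, L=0.5):
    """Soil-Adjusted Vegetation Index; L=0.5 is the conventional soil-brightness factor."""
    return (nir - red) * (1.0 + L) / (nir + red + L)

# Toy 2x2 reflectance "scenes" standing in for Landsat 5 TM band 3 (red) and band 4 (NIR).
red = np.array([[0.08, 0.10], [0.25, 0.05]])
nir = np.array([[0.45, 0.40], [0.30, 0.50]])
print(ndvi(red, nir))
print(savi(red, nir))
```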

  1. Integrating remote sensing with species distribution models; Mapping tamarisk invasions using the Software for Assisted Habitat Modeling (SAHM)

    Science.gov (United States)

    West, Amanda M.; Evangelista, Paul H.; Jarnevich, Catherine S.; Young, Nicholas E.; Stohlgren, Thomas J.; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan

    2016-01-01

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fit with satellite remote sensing data to map invasive tamarisk (Tamarix spp.) along the Arkansas River in Southeastern Colorado. The models tested included boosted regression trees (BRT), Random Forest (RF), multivariate adaptive regression splines (MARS), generalized linear model (GLM), and Maxent. These analyses were conducted using a newly developed software package called the Software for Assisted Habitat Modeling (SAHM). All models were trained with 499 presence points, 10,000 pseudo-absence points, and predictor variables acquired from the Landsat 5 Thematic Mapper (TM) sensor over an eight-month period to distinguish tamarisk from native riparian vegetation using detection of phenological differences. From the Landsat scenes, we used individual bands and calculated Normalized Difference Vegetation Index (NDVI), Soil-Adjusted Vegetation Index (SAVI), and tasseled cap transformations. All five models identified current tamarisk distribution on the landscape successfully based on threshold independent and threshold dependent evaluation metrics with independent location data. To account for model specific differences, we produced an ensemble of all five models with map output highlighting areas of agreement and areas of uncertainty. Our results demonstrate the usefulness of species distribution models in analyzing remotely sensed data and the utility of ensemble mapping, and showcase the capability of SAHM in pre-processing and executing multiple complex models.

  2. Modeling the UT effect in global distribution of ionospheric electric fields

    DEFF Research Database (Denmark)

    Lukianova, R.; Christiansen, Freddy

    2008-01-01

    A new approach for modeling the global distribution of ionospheric electric potentials utilizing high-precision maps of field-aligned currents (FACs) derived from measurements by the Orsted and Magsat satellites as input to a comprehensive numerical scheme is presented. We simulate the universal ...

  3. Global Crisis as Enterprise Software Motivator: from Lifecycle Optimization to Efficient Implementation Series

    Directory of Open Access Journals (Sweden)

    Sergey V. Zykov

    2012-04-01

    Full Text Available It is generally known that the software system development lifecycle (SSDL) should be managed adequately. The global economic crisis and subsequent depression have taught us certain lessons on the subject, which is so vital for enterprises. The paper presents an adaptive methodology for the enterprise SSDL which makes it possible to avoid "local crises" while producing large-scale software. The methodology is based on extracting common ERP module-level patterns and applying them to a series of heterogeneous implementations. The approach includes a lifecycle model which extends the conventional spiral model with formal data representation/management models and DSL-based "low-level" CASE tools supporting the formalisms. The methodology has been successfully implemented as a series of portal-based ERP systems in the ITERA oil-and-gas corporation, and in a number of trading/banking enterprise applications for other enterprises. A semantic network-based airline dispatch system and a 6D-model-driven nuclear power plant construction support system are currently in progress. Combining various SSDL models is discussed. Terms-and-cost reduction factors are examined. Adjusting the SSDL according to project size and scope is reviewed. The so-called "human factor errors" resulting from a non-systematic SSDL approach, and their influence on crisis and depression, are analyzed. The ways to a systematic and efficient SSDL are outlined. Troubleshooting advice is given for the problems concerned.

  4. Generation of real-time global ionospheric map based on the global GNSS stations with only a sparse distribution

    Science.gov (United States)

    Li, Zishen; Wang, Ningbo; Li, Min; Zhou, Kai; Yuan, Yunbin; Yuan, Hong

    2017-04-01

    The Earth's ionosphere is the part of the atmosphere stretching from an altitude of about 50 km to more than 1000 km. When the Global Navigation Satellite System (GNSS) signal emitted from a satellite travels through the ionosphere before it reaches a receiver on or near the Earth's surface, the signal is significantly delayed by the ionosphere, and this delay has been considered one of the major errors in GNSS measurements. A real-time global ionospheric map calculated from real-time data obtained by global stations is an essential method for mitigating the ionospheric delay in real-time positioning. The generation of an accurate global ionospheric map generally depends on global stations with a dense distribution; however, the number of global stations that can produce real-time data is very limited at present, which makes it very difficult to generate a highly accurate global ionospheric map using only the current stations with real-time data. In view of this, a new approach is proposed for calculating the real-time global ionospheric map based only on the current stations with real-time data. This new approach is developed on the basis of the post-processed and the one-day predicted global ionospheric maps from our research group. The performance of the proposed approach is tested with the current global stations with real-time data, and the test results are compared with the IGS-released final global ionospheric map products.

  5. Software Design of SMD LEDs for Homogeneous Distribution of Irradiation in the Model of Dark Room

    Directory of Open Access Journals (Sweden)

    Andrej Liner

    2014-01-01

    Full Text Available This article describes wireless optical data networks that use the visible spectrum of optical radiation, with a focus on interior areas with a direct line of sight (LOS). This type of network represents a progressively evolving area of information technology. The development of lighting technologies based on white power LEDs was the impulse for the development of visible light communication (VLC) networks based on the visible spectrum of optical radiation. Their basic advantage is user flexibility: users no longer have to stay in one place while sharing data. Wireless optical data networks represent an alternative to metallic and fiber networks [1], [2]. This paper deals with the software simulation of the homogeneous distribution of optical irradiation in a dark room model, carried out in the LightTools software. First, an optical source composed of 9 SMD LEDs of type LW G6SP-EAFA-JKQL-1 was designed. In subsequent simulations, various numbers and arrangements of LEDs were used, placed at the ceiling of the dark room. Finally, the resulting homogeneity of optical irradiation is compared.
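
    The article's simulations are done in LightTools; purely as an illustration of the quantity being optimised, the sketch below evaluates the irradiance produced on the floor by a ceiling grid of idealised Lambertian LEDs and reports a simple min/max homogeneity figure. The LED intensity, Lambertian order, and room geometry are assumed values, not the parameters of the LW G6SP-EAFA-JKQL-1 device.

```python
# Minimal sketch: irradiance on the floor from downward-pointing Lambertian
# LEDs mounted at height h, using E = I0 * h**(m+1) / d**(m+3) per LED.
import numpy as np

def irradiance_map(led_xy, h=2.5, i0=0.5, m=1, room=(5.0, 5.0), step=0.05):
    x = np.arange(0, room[0] + step, step)
    y = np.arange(0, room[1] + step, step)
    gx, gy = np.meshgrid(x, y)
    e = np.zeros_like(gx)
    for lx, ly in led_xy:
        d = np.sqrt((gx - lx) ** 2 + (gy - ly) ** 2 + h ** 2)  # LED-to-point distance
        e += i0 * h ** (m + 1) / d ** (m + 3)
    return e

# assumed 3 x 3 LED grid on the ceiling of a 5 m x 5 m room
leds = [(x, y) for x in (1.25, 2.5, 3.75) for y in (1.25, 2.5, 3.75)]
e = irradiance_map(leds)
print("homogeneity (min/max irradiance):", round(float(e.min() / e.max()), 3))
```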

  6. Distribution of known macrozooplankton abundance and biomass in the global ocean

    Science.gov (United States)

    Moriarty, R.; Buitenhuis, E. T.; Le Quéré, C.; Gosselin, M.-P.

    2013-07-01

    Macrozooplankton are an important link between higher and lower trophic levels in the oceans. They serve as the primary food for fish, reptiles, birds and mammals in some regions, and play a role in the export of carbon from the surface to the intermediate and deep ocean. Little, however, is known of their global distribution and biomass. Here we compiled a dataset of macrozooplankton abundance and biomass observations for the global ocean from a collection of four datasets. We harmonise the data to common units, calculate additional carbon biomass where possible, and bin the dataset in a global 1 × 1 degree grid. This dataset is part of a wider effort to provide a global picture of carbon biomass data for key plankton functional types, in particular to support the development of marine ecosystem models. Over 387 700 abundance data and 1330 carbon biomass data have been collected from pre-existing datasets. A further 34 938 abundance data were converted to carbon biomass data using species-specific length frequencies or using species-specific abundance to carbon biomass data. Depth-integrated values are used to calculate known epipelagic macrozooplankton biomass concentrations and global biomass. Global macrozooplankton biomass, to a depth of 350 m, has a mean of 8.4 μg C L-1, median of 0.2 μg C L-1 and a standard deviation of 63.5 μg C L-1. The global annual average estimate of macrozooplankton biomass in the top 350 m, based on the median value, is 0.02 Pg C. There are, however, limitations on the dataset; abundance observations have good coverage except in the South Pacific mid-latitudes, but biomass observation coverage is only good at high latitudes. Biomass is restricted to data that is originally given in carbon or to data that can be converted from abundance to carbon. Carbon conversions from abundance are restricted by the lack of information on the size of the organism and/or the absence of taxonomic information. Distribution patterns of global
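
    As a rough illustration of the harmonisation step described above (bringing observations to common units and binning them into a 1 × 1 degree grid), the following sketch bins a few hypothetical biomass observations and reports grid-cell means and a global median. The column names and values are invented for the example and are not part of the published dataset.

```python
# Toy example of gridding biomass observations into 1-degree cells.
import numpy as np
import pandas as pd

obs = pd.DataFrame({
    "lat": [61.2, 61.7, -45.3, 10.0],
    "lon": [-20.4, -20.1, 170.8, 65.5],
    "biomass_ugC_per_L": [12.5, 8.1, 0.3, 0.05],   # hypothetical values
})

obs["lat_bin"] = np.floor(obs["lat"]).astype(int)   # 1-degree bins
obs["lon_bin"] = np.floor(obs["lon"]).astype(int)
gridded = obs.groupby(["lat_bin", "lon_bin"])["biomass_ugC_per_L"].mean()

print(gridded)
print("median biomass:", obs["biomass_ugC_per_L"].median(), "ug C L-1")
```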

  7. Distributed Supervisory Protection Interlock System

    International Nuclear Information System (INIS)

    Walz, H.V.; Agostini, R.C.; Barker, L.; Cherkassky, R.; Constant, T.; Matheson, R.

    1989-03-01

    The Distributed Supervisory Protection Interlock System, DSPI, is under development at the Stanford Linear Accelerator Center for requirements in the areas of personnel protection, beam containment and equipment protection interlocks. The DSPI system, distributed over the application site, consists of segments with microprocessor-based controller and I/O modules, local area networks for communication, and a global supervisor computer. Segments are implemented with commercially available controller and I/O modules arranged in local interlock clusters, and associated software. Segments provide local interlock data acquisition, processing and control. Local area networks provide the communication backbone between segments and a global supervisor processor. The supervisor processor monitors the overall system, reports detail status and provides human interfaces. Details of an R and D test system, which will implement the requirements for personnel protection of 4 typical linear accelerator sectors, will be described. 4 refs., 2 figs

  8. The equipment access software for a distributed UNIX-based accelerator control system

    International Nuclear Information System (INIS)

    Trofimov, Nikolai; Zelepoukine, Serguei; Zharkov, Eugeny; Charrue, Pierre; Gareyte, Claire; Poirier, Herve

    1994-01-01

    This paper presents a generic equipment access software package for a distributed control system using computers with UNIX or UNIX-like operating systems. The package consists of three main components, an application Equipment Access Library, Message Handler and Equipment Data Base. An application task, which may run in any computer in the network, sends requests to access equipment through Equipment Library calls. The basic request is in the form Equipment-Action-Data and is routed via a remote procedure call to the computer to which the given equipment is connected. In this computer the request is received by the Message Handler. According to the type of the equipment connection, the Message Handler either passes the request to the specific process software in the same computer or forwards it to a lower level network of equipment controllers using MIL1553B, GPIB, RS232 or BITBUS communication. The answer is then returned to the calling application. Descriptive information required for request routing and processing is stored in the real-time Equipment Data Base. The package has been written to be portable and is currently available on DEC Ultrix, LynxOS, HPUX, XENIX, OS-9 and Apollo domain. ((orig.))
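
    A schematic sketch of the Equipment-Action-Data flow described above follows. The device names, hosts, and addresses are hypothetical; the real package is a C library whose requests travel over remote procedure calls, and the Python below only mimics the routing from the Equipment Data Base entry to the connection-specific handler.

```python
# Illustrative routing of an Equipment-Action-Data request (names are made up).
EQUIPMENT_DB = {  # descriptive info used for request routing
    "PS-QF1": {"host": "feic3", "connection": "MIL1553B", "address": 0x12},
    "BPM-07": {"host": "feic7", "connection": "GPIB", "address": 5},
}

def message_handler(entry, action, data):
    """Dispatch to the field-bus specific handler on the front-end computer."""
    handlers = {
        "MIL1553B": lambda e, a, d: f"MIL1553B {a} @ {e['address']:#x}",
        "GPIB": lambda e, a, d: f"GPIB {a} on address {e['address']}",
    }
    return handlers[entry["connection"]](entry, action, data)

def equipment_access(equipment, action, data=None):
    """Look up the device and route the request; in the real system this hop
    is a remote procedure call to the computer named in entry['host']."""
    entry = EQUIPMENT_DB[equipment]
    return message_handler(entry, action, data)

print(equipment_access("PS-QF1", "READ"))   # routed to the MIL1553B handler
```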

  9. Mycobacterium tuberculosis Lineage 4 comprises globally distributed and geographically restricted sublineages

    Science.gov (United States)

    Coscolla, Mireia; Liu, Qingyun; Trauner, Andrej; Fenner, Lukas; Rutaihwa, Liliana; Borrell, Sonia; Luo, Tao; Gao, Qian; Kato-Maeda, Midori; Ballif, Marie; Egger, Matthias; Macedo, Rita; Mardassi, Helmi; Moreno, Milagros; Tudo Vilanova, Griselda; Fyfe, Janet; Globan, Maria; Thomas, Jackson; Jamieson, Frances; Guthrie, Jennifer L.; Asante-Poku, Adwoa; Yeboah-Manu, Dorothy; Wampande, Eddie; Ssengooba, Willy; Joloba, Moses; Henry Boom, W.; Basu, Indira; Bower, James; Saraiva, Margarida; Vaconcellos, Sidra E. G.; Suffys, Philip; Koch, Anastasia; Wilkinson, Robert; Gail-Bekker, Linda; Malla, Bijaya; Ley, Serej D.; Beck, Hans-Peter; de Jong, Bouke C.; Toit, Kadri; Sanchez-Padilla, Elisabeth; Bonnet, Maryline; Gil-Brusola, Ana; Frank, Matthias; Penlap Beng, Veronique N.; Eisenach, Kathleen; Alani, Issam; Wangui Ndung’u, Perpetual; Revathi, Gunturu; Gehre, Florian; Akter, Suriya; Ntoumi, Francine; Stewart-Isherwood, Lynsey; Ntinginya, Nyanda E.; Rachow, Andrea; Hoelscher, Michael; Cirillo, Daniela Maria; Skenders, Girts; Hoffner, Sven; Bakonyte, Daiva; Stakenas, Petras; Diel, Roland; Crudu, Valeriu; Moldovan, Olga; Al-Hajoj, Sahal; Otero, Larissa; Barletta, Francesca; Jane Carter, E.; Diero, Lameck; Supply, Philip; Comas, Iñaki; Niemann, Stefan; Gagneux, Sebastien

    2016-01-01

    Generalist and specialist species differ in the breadth of their ecological niche. Little is known about the niche width of obligate human pathogens. Here we analyzed a global collection of Mycobacterium tuberculosis Lineage 4 clinical isolates, the most geographically widespread cause of human tuberculosis. We show that Lineage 4 comprises globally distributed and geographically restricted sublineages, suggesting a distinction between generalists and specialists. Population genomic analyses showed that while the majority of human T cell epitopes were conserved in all sublineages, the proportion of variable epitopes was higher in generalists. Our data further support a European origin for the most common generalist sublineage. Hence, the global success of Lineage 4 reflects distinct strategies adopted by different sublineages and the influence of human migration. PMID:27798628

  10. Dtest Testing Software

    Science.gov (United States)

    Jain, Abhinandan; Cameron, Jonathan M.; Myint, Steven

    2013-01-01

    This software runs a suite of arbitrary software tests spanning various software languages and types of tests (unit level, system level, or file comparison tests). The dtest utility can be set to automate periodic testing of large suites of software, as well as running individual tests. It supports distributing multiple tests over multiple CPU cores, if available. The dtest tool is a utility program (written in Python) that scans through a directory (and its subdirectories) and finds all directories that match a certain pattern and then executes any tests in that directory as described in simple configuration files.
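
    The following is a much-reduced sketch of the behaviour described above: walk a directory tree, find directories that contain a test configuration file, and run the declared tests in parallel across CPU cores. The marker file name and its one-command-per-line format are assumptions for the example, not dtest's actual configuration syntax.

```python
# Simplified directory-scanning, multi-core test runner (not the real dtest).
import os
import subprocess
from concurrent.futures import ProcessPoolExecutor

def find_test_dirs(root, marker="DTESTDEFS"):
    """Yield every directory under root containing the (hypothetical) marker file."""
    for dirpath, _dirs, files in os.walk(root):
        if marker in files:
            yield dirpath

def run_test(dirpath):
    """Run each command listed in the config file; pass only if all succeed."""
    cfg = os.path.join(dirpath, "DTESTDEFS")
    with open(cfg) as f:
        cmds = [line.strip() for line in f if line.strip()]
    codes = [subprocess.run(c, shell=True, cwd=dirpath).returncode for c in cmds]
    return dirpath, all(rc == 0 for rc in codes)

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:            # distribute tests over cores
        for dirpath, ok in pool.map(run_test, find_test_dirs(".")):
            print("PASS" if ok else "FAIL", dirpath)
```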

  11. Fast BPM data distribution for global orbit feedback using commercial gigabit ethernet technology

    International Nuclear Information System (INIS)

    Hulsart, R.; Cerniglia, P.; Michnoff, R.; Minty, M.

    2011-01-01

    In order to correct beam perturbations in RHIC at around 10 Hz, a new fast data distribution network was required to deliver BPM position data at rates several orders of magnitude above the capability of the existing system. The urgency of the project limited the amount of custom hardware that could be developed, which dictated the use of as much commercially available equipment as possible. The selected architecture uses a custom hardware interface to the existing RHIC BPM electronics together with commercially available Gigabit Ethernet switches to distribute position data to devices located around the collider ring. Using the minimum Ethernet packet size and field programmable gate array (FPGA) based state machine logic instead of a software-based driver, real-time and deterministic data delivery is possible over Ethernet. The method of adapting this protocol for low-latency data delivery, bench testing of Ethernet hardware, and the logic to construct Ethernet packets in FPGA hardware are discussed. A robust communications system using almost all commercial off-the-shelf equipment was developed in under a year, which enabled retrofitting of the existing RHIC BPM system to provide 10 kHz data delivery for a global orbit feedback scheme using 72 BPMs. Total latencies from data acquisition at the BPMs to delivery at the controller modules, including very long transmission distances, were kept under 100 μs, which introduces very little phase error in correcting the 10 Hz oscillations. Leveraging the speed of Gigabit Ethernet and the wide availability of Ethernet products enabled this solution to be fully implemented in a much shorter time and at lower cost than if a similar network had been developed using a proprietary method.

  12. The distributed development environment for SDSS software

    International Nuclear Information System (INIS)

    Berman, E.; Gurbani, V.; Mackinnon, B.; Newberg, H.; Nicinski, T.; Petravick, D.; Pordes, R.; Sergey, G.; Stoughton, C.; Lupton, R.

    1994-04-01

    The authors present an integrated science software development environment, code maintenance and support system for the Sloan Digital Sky Survey (SDSS) now being actively used throughout the collaboration

  13. The FRISBEE tool, a software for optimising the trade-off between food quality, energy use, and global warming impact of cold chains

    NARCIS (Netherlands)

    Gwanpua, S.G.; Verboven, P.; Leducq, D.; Brown, T.; Verlinden, B.E.; Bekele, E.; Aregawi, W.; Evans, J.; Foster, A.; Duret, S.; Hoang, H.M.; Sluis, S. van der; Wissink, E.; Hendriksen, L.J.A.M.; Taoukis, P.; Gogou, E.; Stahl, V.; El Jabri, M.; Le Page, J.F.; Claussen, I.; Indergård, E.; Nicolai, B.M.; Alvarez, G.; Geeraerd, A.H.

    2015-01-01

    Food quality (including safety) along the cold chain, energy use and global warming impact of refrigeration systems are three key aspects in assessing cold chain sustainability. In this paper, we present the framework of a dedicated software, the FRISBEE tool, for optimising quality of refrigerated

  14. Distribution of mesozooplankton biomass in the global ocean

    Directory of Open Access Journals (Sweden)

    R. Moriarty

    2013-02-01

    Full Text Available Mesozooplankton are cosmopolitan within the sunlit layers of the global ocean. They are important in the pelagic food web, having a significant feedback to primary production through their consumption of phytoplankton and microzooplankton. In many regions of the global ocean, they are also the primary contributors to vertical particle flux. Through both processes they affect the biogeochemical cycling of carbon and other nutrients in the oceans. Little, however, is known about their global distribution and biomass. While global maps of mesozooplankton biomass do exist in the literature, they are usually hand-drawn maps for which the original data are not readily available. The dataset presented in this synthesis has been in development since the late 1990s, is an integral part of the Coastal and Oceanic Plankton Ecology, Production, and Observation Database (COPEPOD), and is now also part of a wider community effort to provide a global picture of carbon biomass data for key plankton functional types, in particular to support the development of marine ecosystem models. A total of 153 163 biomass values were collected, from a variety of sources, for mesozooplankton. Of those 2% were originally recorded as dry mass, 26% as wet mass, 5% as settled volume, and 68% as displacement volume. Using a variety of non-linear biomass conversions from the literature, the data have been converted from their original units to carbon biomass. Depth-integrated values were then used to calculate an estimate of mesozooplankton global biomass. Global epipelagic mesozooplankton biomass, to a depth of 200 m, had a mean of 5.9 μg C L−1, median of 2.7 μg C L−1 and a standard deviation of 10.6 μg C L−1. The global annual average estimate of mesozooplankton in the top 200 m, based on the median value, was 0.19 Pg C. Biomass was highest in the Northern Hemisphere, and there were slight decreases from polar oceans (40

  15. Knowledge-Based Software Management

    International Nuclear Information System (INIS)

    Sally Schaffner; Matthew Bickley; Brian Bevins; Leon Clancy; Karen White

    2003-01-01

    Management of software in a dynamic environment such as is found at Jefferson Lab can be a daunting task. Software development tasks are distributed over a wide range of people with varying skill levels. The machine configuration is constantly changing, requiring upgrades to software at both the hardware control level and the operator control level. In order to obtain high quality support from vendor service agreements, which is vital to maintaining 24/7 operations, hardware and software must be kept at industry's current levels. This means that periodic upgrades independent of machine configuration changes must take place. It is often difficult to identify and organize the information needed to guide the process of development, upgrades and enhancements. Dependencies between support software and applications need to be consistently identified to prevent introducing errors during upgrades and to allow adequate testing to be planned and performed. Developers also need access to information regarding compilers, make files and organized distribution directories. This paper describes a system under development at Jefferson Lab which will provide software developers and managers with this type of information in a timely, user-friendly fashion. The current status and future plans for the system are detailed.
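
    As a toy illustration of the dependency bookkeeping such a system needs, the sketch below inverts a dependency map and computes the set of applications affected by upgrading one support package, so that testing can be planned. The package and application names are invented, not Jefferson Lab's.

```python
# Toy dependency tracking: which applications must be re-tested after an upgrade?
from collections import defaultdict, deque

depends_on = {                 # application/library -> support software it uses
    "orbit_display": ["epics_base", "plot_lib"],
    "plot_lib": ["epics_base"],
    "rf_control": ["epics_base", "fieldbus_drv"],
}

used_by = defaultdict(list)    # invert to: support software -> direct dependents
for app, deps in depends_on.items():
    for dep in deps:
        used_by[dep].append(app)

def affected_by_upgrade(package):
    """Transitive closure of everything that depends on the upgraded package."""
    seen, queue = set(), deque([package])
    while queue:
        for dependent in used_by[queue.popleft()]:
            if dependent not in seen:
                seen.add(dependent)
                queue.append(dependent)
    return sorted(seen)

print(affected_by_upgrade("epics_base"))  # ['orbit_display', 'plot_lib', 'rf_control']
```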

  16. Integrating Remote Sensing with Species Distribution Models; Mapping Tamarisk Invasions Using the Software for Assisted Habitat Modeling (SAHM)

    OpenAIRE

    West, Amanda M.; Evangelista, Paul H.; Jarnevich, Catherine S.; Young, Nicholas E.; Stohlgren, Thomas J.; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan

    2016-01-01

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common, however conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fit with satellite remote sensing data to map invasive tamarisk (Tama...

  17. Globalization impact on consumption and distribution in society

    Directory of Open Access Journals (Sweden)

    Panfilova Olga

    2018-01-01

    Full Text Available One of the greatest threats posed by the globalization of capital reproduction is a significant increase in unemployment. Outsourcing creates workplaces outside the framework of TNCs, but within the framework of the national economy; from our point of view, this is a way to maximize the financial result. Unemployment, even technological unemployment, increases social spending. At the same time, the idea of a basic income should change the priorities in the system of distribution of the results of reproduction. The key beneficiary of the sharing-economy approach is the consumer, who can save something on intermediaries or transaction costs. The development of the sharing economy rather indicates that TNCs are not planning to lower their rate of return. However, a solution has been found for growing technological unemployment combined with the stimulation of demand: the joint consumption of goods produced by the sharing economy. Historical examples of solutions to the problem of distribution clearly demonstrate the importance of transforming the financial architecture of the reproductive cycle. Models operating on the basis of fair distribution not only have prospects for development, but are also capable of changing society's attitude to the concept of, and approaches to, consumption.

  18. System support software for TSTA

    International Nuclear Information System (INIS)

    Claborn, G.W.; Mann, L.W.; Nielson, C.W.

    1987-01-01

    The software at the Tritium Systems Test Assembly (TSTA) is logically broken into two parts: the system support software and the subsystem software. The purpose of the system support software is to isolate the subsystem software from the physical hardware; in this sense the system support software forms the kernel of the software at TSTA. The kernel performs several functions. It gathers data from CAMAC modules and makes that data available to subsystem processes. It services requests to send commands to CAMAC modules. It provides a system of logging functions and a system-wide global program state that allows highly structured interaction between subsystem processes. The kernel's most visible function is to provide the Man-Machine Interface (MMI), which gives the operators a window into the physical hardware and subsystem process state. Finally, the kernel provides a data archiving and compression function that allows archival data to be accessed and plotted. The kernel software as developed and implemented at TSTA is described.

  19. Software Atom: An approach towards software components structuring to improve reusability

    Directory of Open Access Journals (Sweden)

    Muhammad Hussain Mughal

    2017-12-01

    Full Text Available The diversity of application domains compels the design of a sustainable classification scheme for an ever-growing software repository. Atomic reusable software components are articulated to improve component reusability in a volatile industry. Numerous approaches to software classification have been proposed over the past decades; each has limitations related to coupling and cohesion. In this paper, we propose a novel approach that constitutes software from radical functionalities to improve software reusability. We analyze the semantics of elements in the Periodic Table used in chemistry to design our classification approach, present it as a tree-based classification to curtail the search-space complexity of the software repository, and further refine it with semantic search techniques. We developed a Globally Unique Identifier (GUID) for indexing the functions and related components, and exploited the correlation between chemical elements and software elements to establish a one-to-one mapping between them. Inspired by the chemical periodic table, we propose a software periodic table (SPT) representing atomic software components extracted from real application software. Parsing and extraction over the SPT-classified repository tree enable users to build their software by customizing the ingredients of their software requirements. The classified repository of software ingredients helps users convey their requirements to software engineers and enables requirements engineers to develop large-scale prototypes rapidly. Furthermore, the usability of the categorized repository is predicted based on user feedback. The proposed repository will be continuously fine-tuned based on utilization, and the SPT will be gradually optimized with ant colony optimization techniques. In short, this would help automate the software development process.

  20. NOx from lightning: 1. Global distribution based on lightning physics

    Science.gov (United States)

    Price, Colin; Penner, Joyce; Prather, Michael

    1997-03-01

    This paper begins a study on the role of lightning in maintaining the global distribution of nitrogen oxides (NOx) in the troposphere. It presents the first global and seasonal distributions of lightning-produced NOx (LNOx) based on the observed distribution of electrical storms and the physical properties of lightning strokes. We derive a global rate for cloud-to-ground (CG) flashes of 20-30 flashes/s with a mean energy per flash of 6.7×10^9 J. Intracloud (IC) flashes are more frequent, 50-70 flashes/s, but have 10% of the energy of CG strokes and, consequently, produce significantly less NOx. It appears to us that the majority of previous studies have mistakenly assumed that all lightning flashes produce the same amount of NOx, thus overestimating the NOx production by a factor of 3. On the other hand, we feel these same studies have underestimated the energy released in CG flashes, resulting in two negating assumptions. For CG energies we adopt a production rate of 10×10^16 molecules NO/J based on the current literature. Using a method to simulate global lightning frequencies from satellite-observed cloud data, we have calculated the LNOx on various spatial (regional, zonal, meridional, and global) and temporal scales (daily, monthly, seasonal, and interannual). Regionally, the production of LNOx is concentrated over tropical continental regions, predominantly in the summer hemisphere. The annual mean production rate is calculated to be 12.2 Tg N/yr, and we believe it extremely unlikely that this number is less than 5 or more than 20 Tg N/yr. Although most of the LNOx is produced in the lowest 5 km by CG lightning, convective mixing in the thunderstorms is likely to deposit large amounts of NOx in the upper troposphere where it is important in ozone production. On an annual basis, 64% of the LNOx is produced in the northern hemisphere, implying that the northern hemisphere should have natural ozone levels as much as 2 times greater than the southern hemisphere
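
    As a back-of-the-envelope check, the sketch below combines the flash rates, per-flash energies, and NO yield quoted in the abstract into an annual nitrogen production; with mid-range flash rates it lands in the same 5-20 Tg N/yr window as the paper's 12.2 Tg N/yr estimate. Choosing the mid-range values is the only assumption added here.

```python
# Rough annual LNOx estimate from the figures quoted in the abstract.
AVOGADRO = 6.022e23          # molecules per mole
M_N = 14.0                   # g of nitrogen per mole
SEC_PER_YEAR = 3.15e7

cg_rate = 25.0               # CG flashes per second (mid-range of 20-30)
ic_rate = 60.0               # IC flashes per second (mid-range of 50-70)
e_cg = 6.7e9                 # J per CG flash
e_ic = 0.1 * e_cg            # IC flashes carry ~10% of the CG energy
yield_no = 10e16             # molecules NO per joule

power = cg_rate * e_cg + ic_rate * e_ic          # J/s dissipated by lightning
molecules_per_s = power * yield_no               # NO molecules produced per second
tg_n_per_year = molecules_per_s / AVOGADRO * M_N * SEC_PER_YEAR / 1e12  # g -> Tg

print(f"{tg_n_per_year:.1f} Tg N/yr")  # ~15 Tg N/yr; the paper's detailed estimate is 12.2
```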

  1. Innovation in globally distributed teams: the role of LMX, communication frequency, and member influence on team decisions.

    Science.gov (United States)

    Gajendran, Ravi S; Joshi, Aparna

    2012-11-01

    For globally distributed teams charged with innovation, member contributions to the team are crucial for effective performance. Prior research, however, suggests that members of globally distributed teams often feel isolated and excluded from their team's activities and decisions. How can leaders of such teams foster member inclusion in team decisions? Drawing on leader-member exchange (LMX) theory, we propose that for distributed teams, LMX and communication frequency jointly shape member influence on team decisions. Findings from a test of our hypotheses using data from 40 globally distributed teams suggest that LMX can enhance member influence on team decisions when it is sustained through frequent leader-member communication. This joint effect is strengthened as team dispersion increases. At the team level, member influence on team decisions has a positive effect on team innovation. (c) 2012 APA, all rights reserved.

  2. Global distribution of mean age of stratospheric air from MIPAS SF6 measurements

    Directory of Open Access Journals (Sweden)

    H. Fischer

    2008-02-01

    Full Text Available Global distributions of profiles of sulphur hexafluoride (SF6) have been retrieved from limb emission spectra recorded by the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS) on Envisat covering the period September 2002 to March 2004. Individual SF6 profiles have a precision of 0.5 pptv below 25 km altitude and a vertical resolution of 4–6 km up to 35 km altitude. These data have been validated versus in situ observations obtained during balloon flights of a cryogenic whole-air sampler. For the tropical troposphere a trend of 0.230±0.008 pptv/yr has been derived from the MIPAS data, which is in excellent agreement with the trend from ground-based flask and in situ measurements from the National Oceanic and Atmospheric Administration Earth System Research Laboratory, Global Monitoring Division. For the data set currently available, based on at least three days of data per month, monthly 5° latitude mean values have a 1σ standard error of 1%. From the global SF6 distributions, global daily and monthly distributions of the apparent mean age of air are inferred by application of the tropical tropospheric trend derived from MIPAS data. The inferred mean ages are provided for the full globe up to 90° N/S, and have a 1σ standard error of 0.25 yr. They range between 0 (near the tropical tropopause) and 7 years (except for situations of mesospheric intrusions) and agree well with earlier observations. The seasonal variation of the mean age of stratospheric air indicates episodes of severe intrusion of mesospheric air during each Northern and Southern polar winter observed, long-lasting remnants of old, subsided polar winter air over the spring and summer poles, and a rather short period of mixing with midlatitude air and/or upward transport during fall in October/November (NH) and April/May (SH), respectively, with small latitudinal gradients, immediately before the new polar vortex starts to form. The mean age distributions further
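
    Purely as an illustration of the lag-time idea behind "apparent mean age" (the time by which a stratospheric SF6 value lags the tropical tropospheric time series), the snippet below applies the linear trend quoted in the abstract to a hypothetical observation. Real mean-age calculations must also account for nonlinearity of the trend and the width of the age spectrum; the input values here are invented.

```python
# Simplified lag-time estimate of apparent mean age from one SF6 observation,
# assuming a strictly linear tropospheric trend (an approximation).
def mean_age_years(sf6_obs_pptv, sf6_tropics_now_pptv, trend_pptv_per_yr=0.230):
    """Apparent mean age = lag between the stratospheric sample and the
    tropical tropospheric SF6 time series under a linear-trend assumption."""
    return (sf6_tropics_now_pptv - sf6_obs_pptv) / trend_pptv_per_yr

# hypothetical values: 4.5 pptv observed aloft vs. 5.4 pptv in the tropics today
print(round(mean_age_years(4.5, 5.4), 1), "years")   # -> 3.9 years
```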

  3. Distribution and habitat of brazilian-pine according to global climate change

    Directory of Open Access Journals (Sweden)

    Marcos Silveira Wrege

    2017-09-01

    Full Text Available Araucaria angustifolia (Bertol.) O. Kuntze, also known as brazilian-pine, is a native forest species of Brazil. A. angustifolia is particularly vulnerable to global climate change, considering that it lives in the cold and humid mountain regions of southern and southeastern Brazil. Among the native Brazilian forest species, it has one of the greatest potentials for growth and genetic gain. It shows excellent wood quality and can also be used in human and animal food, presenting great economic, social and environmental value. In order to determine the current distribution of the species and better understand its habitat, we worked in the regions representing the borders of its natural occurrence, identifying populations and recording the altitude and geographical position of trees. Field information, along with secondary data from the Environmental Information Center (CRIA), was used to map the current distribution of brazilian-pine and to project its distribution in the coming decades under future climate scenarios. Mapping ecological niches in present and future climate scenarios, and characterizing the environments in which the species lives, is essential for a better understanding of the risk of species extinction and of which mitigating measures could be adequate to reduce the impacts of global climate change, thus contributing to the conservation and knowledge of this important species.

  4. Global patterns of city size distributions and their fundamental drivers.

    Directory of Open Access Journals (Sweden)

    Ethan H Decker

    2007-09-01

    Full Text Available Urban areas and their voracious appetites are increasingly dominating the flows of energy and materials around the globe. Understanding the size distribution and dynamics of urban areas is vital if we are to manage their growth and mitigate their negative impacts on global ecosystems. For over 50 years, city size distributions have been assumed to universally follow a power function, and many theories have been put forth to explain what has become known as Zipf's law (the instance where the exponent of the power function equals unity). Most previous studies, however, only include the largest cities that comprise the tail of the distribution. Here we show that national, regional and continental city size distributions, whether based on census data or inferred from cluster areas of remotely-sensed nighttime lights, are in fact lognormally distributed through the majority of cities and only approach power functions for the largest cities in the distribution tails. To explore generating processes, we use a simple model incorporating only two basic human dynamics, migration and reproduction, that nonetheless generates distributions very similar to those found empirically. Our results suggest that macroscopic patterns of human settlements may be far more constrained by fundamental ecological principles than more fine-scale socioeconomic factors.
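
    The abstract mentions a simple generating model built only on migration and reproduction. The toy simulation below is one hedged reading of that idea (proportionate stochastic growth plus random redistribution of a small migrating fraction), not the authors' model, and all parameter values are arbitrary; its point is merely that multiplicative growth of this kind tends to produce a lognormal-looking body of settlement sizes.

```python
# Toy settlement-size model: "reproduction" as noisy proportionate growth and
# "migration" as a small fraction of people moving between settlements.
import numpy as np

rng = np.random.default_rng(0)
n_cities, years = 2000, 500
pop = np.full(n_cities, 1000.0)

for _ in range(years):
    pop *= rng.normal(1.01, 0.05, n_cities).clip(0.5)       # reproduction
    movers = 0.02 * pop                                      # 2% emigrate ...
    pop = pop - movers + movers.sum() * rng.dirichlet(np.ones(n_cities))  # ... and resettle

log_sizes = np.log(pop)
print("mean, std of log sizes:", log_sizes.mean().round(2), log_sizes.std().round(2))
# A histogram of log_sizes is close to bell-shaped, i.e. sizes are roughly lognormal.
```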

  5. Software packages for food engineering needs

    OpenAIRE

    Abakarov, Alik

    2011-01-01

    The graphic user interface (GUI) software packages “ANNEKs” and “OPT-PROx” are developed to meet food engineering needs. “OPT-PROx” (OPTimal PROfile) is software developed to carry out thermal food processing optimization based on variable retort temperature processing and a global optimization technique. “ANNEKs” (Artificial Neural Network Enzyme Kinetics) is software designed for determining the kinetics of enzyme hydrolysis of protein at different initial reaction parameters based on the...

  6. Maintenance simulation: Software issues

    Energy Technology Data Exchange (ETDEWEB)

    Luk, C.H.; Jette, M.A.

    1995-07-01

    The maintenance of a distributed software system in a production environment involves: (1) maintaining software integrity, (2) maintaining database integrity, (3) adding new features, and (4) adding new systems. These issues will be discussed in general: what they are and how they are handled. This paper will present our experience with a distributed resource management system that accounts, in real time, for resources consumed on a network of heterogeneous computers. The simulated environments used to maintain this system will be presented as they relate to the four maintenance areas.

  7. IT Project Management in Very Small Software Companies

    DEFF Research Database (Denmark)

    Shakir, Shahid Nadeem; Nørbjerg, Jacob

    2013-01-01

    In developing countries very small software companies (VSSCs) with only 1-10 employees play an important role both in the local economy and as providers of software and services to customers in other parts of the world. Understanding and improving their IT project management (ITPM) practices...... and challenges are, therefore, important in the local as well as the larger context of globalized software development. There is, however, very little research into small shop software practices in developing countries. The current paper explores actual ITPM practices in Pakistani VSSCs based on a qualitative...... study of seven Pakistani VSSCs. We find that some Pakistani ITPM practices are similar to what is reported from VSSCs in other parts of the world, while others seem to be related to the companies' position in the global software development chain. This paper is part of a larger research project aiming...

  8. Free Software and Free Textbooks

    Science.gov (United States)

    Takhteyev, Yuri

    2012-01-01

    Some of the world's best and most sophisticated software is distributed today under "free" or "open source" licenses, which allow the recipients of such software to use, modify, and share it without paying royalties or asking for permissions. If this works for software, could it also work for educational resources, such as books? The economics of…

  9. Concrete containment integrity software: Procedure manual and guidelines

    International Nuclear Information System (INIS)

    Dameron, R.A.; Dunham, R.S.; Rashid, Y.R.

    1990-06-01

    This report is an executive summary describing the concrete containment analysis methodology and software that was developed in the EPRI-sponsored research to predict the overpressure behavior and leakage of concrete containments. A set of guidelines has been developed for performing reliable 2D axisymmetric concrete containment analysis with a cracking concrete constitutive model developed by ANATECH. The software package developed during this research phase is designed for use in conjunction with ABAQUS-EPGEN; it provides the concrete model and automates axisymmetric grid preparation, and rebar generation for 2D and 3D grids. The software offers the option of generating pre-programmed axisymmetric grids that can be tailored to a specific containment by input of a few geometry parameters. The goal of simplified axisymmetric analysis within the framework of the containment leakage prediction methodology is to compute global liner strain histories at various locations within the containment. A simplified approach for generating peak liner strains at structural discontinuities as function of the global liner strains has been presented in a separate leakage criteria document; the curves for strain magnification factors and liner stress triaxiality factors found in that document are intended to be applied to the global liner strain histories developed through global 2D analysis. This report summarizes the procedures for global 2D analysis and gives an overview of the constitutive model and the special purpose concrete containment analysis software developed in this research phase. 8 refs., 10 figs

  10. Gammasphere software development. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information.

  11. Software Validation in ATLAS

    International Nuclear Information System (INIS)

    Hodgkinson, Mark; Seuster, Rolf; Simmons, Brinick; Sherwood, Peter; Rousseau, David

    2012-01-01

    The ATLAS collaboration operates an extensive set of protocols to validate the quality of the offline software in a timely manner. This is essential in order to process the large amounts of data being collected by the ATLAS detector in 2011 without complications on the offline software side. We will discuss a number of different strategies used to validate the ATLAS offline software; running the ATLAS framework software, Athena, in a variety of configurations daily on each nightly build via the ATLAS Nightly System (ATN) and Run Time Tester (RTT) systems; the monitoring of these tests and checking the compilation of the software via distributed teams of rotating shifters; monitoring of and follow up on bug reports by the shifter teams and periodic software cleaning weeks to improve the quality of the offline software further.

  12. GlobAl Distribution of GEnetic Traits (GADGET) web server: polygenic trait scores worldwide.

    Science.gov (United States)

    Chande, Aroon T; Wang, Lu; Rishishwar, Lavanya; Conley, Andrew B; Norris, Emily T; Valderrama-Aguirre, Augusto; Jordan, I King

    2018-05-18

    Human populations from around the world show striking phenotypic variation across a wide variety of traits. Genome-wide association studies (GWAS) are used to uncover genetic variants that influence the expression of heritable human traits; accordingly, population-specific distributions of GWAS-implicated variants may shed light on the genetic basis of human phenotypic diversity. With this in mind, we developed the GlobAl Distribution of GEnetic Traits web server (GADGET http://gadget.biosci.gatech.edu). The GADGET web server provides users with a dynamic visual platform for exploring the relationship between worldwide genetic diversity and the genetic architecture underlying numerous human phenotypes. GADGET integrates trait-implicated single nucleotide polymorphisms (SNPs) from GWAS, with population genetic data from the 1000 Genomes Project, to calculate genome-wide polygenic trait scores (PTS) for 818 phenotypes in 2504 individual genomes. Population-specific distributions of PTS are shown for 26 human populations across 5 continental population groups, with traits ordered based on the extent of variation observed among populations. Users of GADGET can also upload custom trait SNP sets to visualize global PTS distributions for their own traits of interest.
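
    As a minimal illustration of the quantity GADGET visualises, the snippet below computes a polygenic trait score for one individual as the sum of effect-allele dosages weighted by GWAS effect sizes. The SNP identifiers, effect sizes, and genotypes are fabricated for the example and do not come from the GADGET server or the 1000 Genomes data.

```python
# Minimal polygenic trait score (PTS): sum of effect-allele dosages * effect sizes.
trait_snps = {            # rsID -> (effect allele, GWAS effect size); made-up values
    "rs1111": ("A", 0.12),
    "rs2222": ("T", -0.08),
    "rs3333": ("G", 0.30),
}

individual = {            # rsID -> dosage of the effect allele (0, 1 or 2)
    "rs1111": 2,
    "rs2222": 1,
    "rs3333": 0,
}

def polygenic_score(snps, genotype):
    return sum(beta * genotype.get(rsid, 0) for rsid, (_, beta) in snps.items())

print(polygenic_score(trait_snps, individual))   # 0.12*2 - 0.08*1 + 0.30*0 = 0.16
```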

  13. Cross-Cultural Management Learning through Innovative Pedagogy: An Exploratory Study of Globally Distributed Student Teams

    Science.gov (United States)

    Bartel-Radic, Anne; Moos, J. Chris; Long, Suzanna K.

    2015-01-01

    This article presents an innovative pedagogy based on student participation in globally distributed project teams. The study questions the link between student learning of intercultural competence and the global teaming experience. Data was collected from 115 students participating in 22 virtual intercultural teams. Results revealed that students…

  14. Relation work in collocated and distributed collaboration

    DEFF Research Database (Denmark)

    Christensen, Lars Rune; Jensen, Rasmus Eskild; Bjørn, Pernille

    2014-01-01

    Creating social ties is important for collaborative work; however, in geographically distributed organizations, e.g. global software development, making social ties requires extra work: relation work. We find that characteristics of relation work are based upon shared history and experiences......, emergent in personal and often humorous situations. Relation work is intertwined with other activities such as articulation work, and it is rhythmic, following the work patterns of the participants. By comparing how relation work is conducted in collocated and geographically distributed settings we...... in this paper identify basic differences in relation work. Whereas collocated relation work is spontaneous, place-centric, and yet mobile, relation work in a distributed setting is semi-spontaneous, technology-mediated, and requires extra effort....

  15. When to make proprietary software open source

    NARCIS (Netherlands)

    Caulkins, J.P.; Feichtinger, G.; Grass, D.; Hartl, R.F.; Kort, P.M.; Seidl, A.

    Software can be distributed closed source (proprietary) or open source (developed collaboratively). While a firm cannot sell open source software, and so loses potential sales revenue, the open source software development process can have a substantial positive impact on the quality of a software,

  16. Software Development and Test Methodology for a Distributed Ground System

    Science.gov (United States)

    Ritter, George; Guillebeau, Pat; McNair, Ann R. (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes in an effort to minimize unnecessary overhead while maximizing process benefits. The Software processes that have evolved still emphasize requirements capture, software configuration management, design documenting, and making sure the products that have been developed are accountable to initial requirements. This paper will give an overview of how the Software Processes have evolved, highlighting the positives as well as the negatives. In addition, we will mention the COTS tools that have been integrated into the processes and how the COTS have provided value to the project.

  17. Trust in Co-sourced Software Development

    DEFF Research Database (Denmark)

    Schlichter, Bjarne Rerup; Persson, John Stouby

    2014-01-01

    Software development projects are increasingly geographically distributed through offshoring. Co-sourcing is a highly integrative and cohesive approach to software development offshoring that has been seen to be successful. However, research on how dynamic aspects of trust are shaped in co-sourcing activities is limited...... understanding or personal trust relations. The paper suggests how certain work practices among developers and managers can be explained using a dynamic trust lens based on Abstract Systems, especially dis- and re-embedding mechanisms.

  18. Remotely Sensed High-Resolution Global Cloud Dynamics for Predicting Ecosystem and Biodiversity Distributions.

    Directory of Open Access Journals (Sweden)

    Adam M Wilson

    2016-03-01

    Full Text Available Cloud cover can influence numerous important ecological processes, including reproduction, growth, survival, and behavior, yet our assessment of its importance at the appropriate spatial scales has remained remarkably limited. If captured over a large extent yet at sufficiently fine spatial grain, cloud cover dynamics may provide key information for delineating a variety of habitat types and predicting species distributions. Here, we develop new near-global, fine-grain (≈1 km) monthly cloud frequencies from 15 y of twice-daily Moderate Resolution Imaging Spectroradiometer (MODIS) satellite images that expose spatiotemporal cloud cover dynamics of previously undocumented global complexity. We demonstrate that cloud cover varies strongly in its geographic heterogeneity and that the direct, observation-based nature of cloud-derived metrics can improve predictions of habitats, ecosystem, and species distributions with reduced spatial autocorrelation compared to commonly used interpolated climate data. These findings support the fundamental role of remote sensing as an effective lens through which to understand and globally monitor the fine-grain spatial variability of key biodiversity and ecosystem properties.
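
    The cloud climatology described above is, at its core, a per-pixel frequency of cloudy observations. The sketch below shows that reduction on a random stand-in array (observations × latitude × longitude); it is a deliberate simplification that ignores the viewing-geometry and artefact corrections the real MODIS processing requires.

```python
# Toy monthly cloud frequency from twice-daily binary cloud flags per pixel.
import numpy as np

rng = np.random.default_rng(1)
# dims: (~2 observations per day * 30 days, lat pixels, lon pixels) -- random stand-in
cloud_flag = rng.integers(0, 2, size=(60, 180, 360))

monthly_cloud_frequency = cloud_flag.mean(axis=0)   # fraction of cloudy looks per pixel
print(monthly_cloud_frequency.shape, round(float(monthly_cloud_frequency.mean()), 3))
```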

  19. On the spatial and temporal distribution of global thunderstorm cells

    International Nuclear Information System (INIS)

    Mezuman, Keren; Price, Colin; Galanti, Eli

    2014-01-01

    Estimates of global thunderstorm activity have been made predominately by direct measurements of lightning discharges around the globe, either by optical measurements from satellites, or using ground-based radio antennas. In this paper we propose a new methodology in which thunderstorm clusters are constructed based on the lightning strokes detected by the World Wide Lightning Location Network (WWLLN) in the very low frequency range. We find that even with low lightning detection efficiency on a global scale, the spatial and temporal distribution of global thunderstorm cells is well reproduced. This is validated by comparing the global diurnal variations of the thunderstorm cells, and the currents produced by these storms, with the well-known Carnegie Curve, which represents the mean diurnal variability of the global atmospheric electric circuit, driven by thunderstorm activity. While the Carnegie Curve agrees well with our diurnal thunderstorm cluster variations, there is little agreement between the Carnegie Curve and the diurnal variation in the number of lightning strokes detected by the WWLLN. When multiplying the number of clusters we detect by the mean thunderstorm conduction current for land and ocean thunderstorms (Mach et al 2011 J. Geophys. Res. 116 D05201) we get a total average current of about 760 A. Our results show that thunderstorms alone explain more than 90% in the variability of the global electric circuit. However, while it has been previously shown that 90% of the global lightning occurs over continental landmasses, we show that around 50% of the thunderstorms are over the oceans, and from 00-09UTC there are more thunderstorm cells globally over the oceans than over the continents. Since the detection efficiency of the WWLLN system has increased over time, we estimate that the lower bound of the mean number of global thunderstorm cells in 2012 was around 1050 per hour, varying from around 840 at 03UTC to 1150 storms at 19UTC. (letter)
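
    The methodology above clusters WWLLN strokes into thunderstorm cells. As an illustrative stand-in (not the paper's algorithm or thresholds), the sketch below groups strokes that are close in both space and time using DBSCAN after rescaling the coordinates, then counts the resulting cells; the stroke list and scaling choices are invented.

```python
# Toy space-time clustering of lightning strokes into thunderstorm cells.
import numpy as np
from sklearn.cluster import DBSCAN

# columns: time (s), latitude (deg), longitude (deg) of detected strokes (made up)
strokes = np.array([
    [0, 10.0, 20.0], [120, 10.1, 20.1], [300, 10.2, 19.9],   # one storm
    [60, -30.0, 150.0], [400, -30.1, 150.2],                  # another storm
])

# rescale so that ~0.5 degree in space and ~15 minutes in time are comparable
scaled = np.column_stack([strokes[:, 0] / 900.0, strokes[:, 1:] / 0.5])
labels = DBSCAN(eps=1.0, min_samples=2).fit_predict(scaled)
print("thunderstorm cells found:", len(set(labels) - {-1}))   # -> 2
```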

  20. Response of the mean global vegetation distribution to interannual climate variability

    Energy Technology Data Exchange (ETDEWEB)

    Notaro, Michael [University of Wisconsin-Madison, Center for Climatic Research, Madison, WI (United States)

    2008-06-15

    The impact of interannual variability in temperature and precipitation on global terrestrial ecosystems is investigated using a dynamic global vegetation model driven by gridded climate observations for the twentieth century. Contrasting simulations are driven either by repeated mean climatology or raw climate data with interannual variability included. Interannual climate variability reduces net global vegetation cover, particularly over semi-arid regions, and favors the expansion of grass cover at the expense of tree cover, due to differences in growth rates, fire impacts, and interception. The area burnt by global fires is substantially enhanced by interannual precipitation variability. The current position of the central United States' ecotone, with forests to the east and grasslands to the west, is largely attributed to climate variability. Among woody vegetation, climate variability supports expanded deciduous forest growth and diminished evergreen forest growth, due to difference in bioclimatic limits, leaf longevity, interception rates, and rooting depth. These results offer insight into future ecosystem distributions since climate models generally predict an increase in climate variability and extremes. (orig.)

  1. Software for people fundamentals, trends and best practices

    CERN Document Server

    Maedche, Alexander; Neer, Ludwig

    2012-01-01

    The highly competitive and globalized software market is creating pressure on software companies. Given the current boundary conditions, it is critical to continuously shorten time-to-market and reduce development costs. In parallel, driven by private-life experiences with mobile computing devices, the World Wide Web and software-based services, people's general expectations with regard to software are growing. They expect software that is simple and joyful to use. In the light of the changes that have taken place in recent years, software companies need to fundamentally reconsider the way th

  2. Climate Impacts of CALIPSO-Guided Corrections to Black Carbon Aerosol Vertical Distributions in a Global Climate Model

    International Nuclear Information System (INIS)

    Kovilakam, Mahesh; Mahajan, Salil; Saravanan, R.; Chang, Ping

    2017-01-01

    Here, we alleviate the bias in the tropospheric vertical distribution of black carbon aerosols (BC) in the Community Atmosphere Model (CAM4) using the Cloud-Aerosol and Infrared Pathfinder Satellite Observations (CALIPSO)-derived vertical profiles. A suite of sensitivity experiments are conducted with 1x, 5x, and 10x the present-day model estimated BC concentration climatology, with (corrected, CC) and without (uncorrected, UC) CALIPSO-corrected BC vertical distribution. The globally averaged top of the atmosphere radiative flux perturbation of CC experiments is ~8–50% smaller compared to uncorrected (UC) BC experiments largely due to an increase in low-level clouds. The global average surface temperature increases, the global average precipitation decreases, and the ITCZ moves northward with the increase in BC radiative forcing, irrespective of the vertical distribution of BC. Further, tropical expansion metrics for the poleward extent of the Northern Hemisphere Hadley cell (HC) indicate that simulated HC expansion is not sensitive to existing model biases in BC vertical distribution.

  3. Scalable and fail-safe deployment of the ATLAS Distributed Data Management system Rucio

    Science.gov (United States)

    Lassnig, M.; Vigne, R.; Beermann, T.; Barisits, M.; Garonne, V.; Serfon, C.

    2015-12-01

    This contribution details the deployment of Rucio, the ATLAS Distributed Data Management system. The main complication is that Rucio interacts with a wide variety of external services, and connects globally distributed data centres under different technological and administrative control, at an unprecedented data volume. It is therefore not possible to create a duplicate instance of Rucio for testing or integration. Every software upgrade or configuration change is thus potentially disruptive and requires fail-safe software and automatic error recovery. Rucio uses a three-layer scaling and mitigation strategy based on quasi-realtime monitoring. This strategy mainly employs independent stateless services, automatic failover, and service migration. The technologies used for deployment and mitigation include OpenStack, Puppet, Graphite, HAProxy and Apache. In this contribution, the interplay between these components, their deployment, software mitigation, and the monitoring strategy are discussed.

  4. Scalable and fail-safe deployment of the ATLAS Distributed Data Management system Rucio

    CERN Document Server

    Lassnig, Mario; The ATLAS collaboration; Beermann, Thomas Alfons; Barisits, Martin-Stefan; Garonne, Vincent; Serfon, Cedric

    2015-01-01

    This contribution details the deployment of Rucio, the ATLAS Distributed Data Management system. The main complication is that Rucio interacts with a wide variety of external services, and connects globally distributed data centres under different technological and administrative control, at an unprecedented data volume. It is therefore not possible to create a duplicate instance of Rucio for testing or integration. Every software upgrade or configuration change is thus potentially disruptive and requires fail-safe software and automatic error recovery. Rucio uses a three-layer scaling and mitigation strategy based on quasi-realtime monitoring. This strategy mainly employs independent stateless services, automatic failover, and service migration. The technologies used for deployment and mitigation include OpenStack, Puppet, Graphite, HAProxy and Apache. In this contribution, the interplay between these components, their deployment, software mitigation, and the monitoring strategy are discussed.

  5. Open source software and libraries

    OpenAIRE

    Randhawa, Sukhwinder

    2008-01-01

    Open source software is software that users have the ability to run, copy, distribute, study, change, share and improve for any purpose. Open source library software does not require the initial cost of commercial software and enables libraries to have greater control over their working environment. Library professionals should be aware of the advantages of open source software and should be involved in its development. They should have basic knowledge about the selection, installation and main...

  6. Next generation software process improvement

    OpenAIRE

    Turnas, Daniel

    2003-01-01

    Approved for public release; distribution is unlimited. Software is often developed under a process that can at best be described as ad hoc. While it is possible to develop quality software under an ad hoc process, formal processes can be developed to help increase the overall quality of the software under development. The application of these processes allows an organization to mature. The software maturity level, and process improvement, of an organization can be measured with the Cap

  7. Forest Distribution on Small Isolated Hills and Implications on Woody Plant Distribution under Threats of Global Warming

    Directory of Open Access Journals (Sweden)

    Chi-Cheng Liao

    2012-09-01

    Full Text Available Treelines have been found to be lower on small isolated hilltops, but the specific dynamics behind this unique phenomenon are unknown. This study investigates the distribution patterns of woody plants in Yangmingshan National Park (YMSNP), Northern Taiwan, in search of the limitation mechanisms unique to small isolated hills, and to evaluate potential threats under global warming. Forests are distributed between 200 and 900 m above sea level (ASL). Remnant forest fragments between 400 and 900 m ASL have the highest species richness and should be protected to ensure future forest recovery from the former extensive artificial disturbance. The lower boundary is threatened by urban and agricultural development. The lack of native woody species in these low-elevation zones may create a gap susceptible to invasive species. A consistent forest line at 100 m below the mountain tops, regardless of elevation, suggests a topography-induced rather than an elevation-related limiting mechanism. Therefore, an upward shift of forests caused by global warming might be limited at 100 m below the hilltops of small isolated hills because of topography-related factors. The spatial range of woody plants along the altitudinal gradient is thus likely to become narrower under the combined pressures of global warming, limited elevation, exposure-related stress, and artificial disturbance. Management priorities for forest recovery should include preservation of remnant forest fragments, increasing forest connectivity, and increasing seedling establishment in the grasslands.

  8. SU-E-J-80: A Comparative Analysis of MIM and Pinnacle Software for Adaptive Planning

    Energy Technology Data Exchange (ETDEWEB)

    Stanford, J; Duggar, W; Morris, B; Yang, C [University of Mississippi Med. Center, Jackson, MS (United States)

    2015-06-15

    Purpose: IMRT treatment is often administered with image guidance and small PTV margins. Changes in body habitus, such as weight loss, and tumor response during the course of a treatment can be significant, warranting re-simulation and re-planning. Adaptive planning is challenging and places a significant burden on the staff; as such, some commercial vendors are now offering adaptive planning software to streamline the process of re-planning and dose accumulation between different CT data sets. The purpose of this abstract is to compare the adaptive planning tools of Pinnacle version 9.8 and MIM 6.4. Methods: Head and neck cases of previously treated patients who experienced anatomical changes during the course of their treatment were chosen for evaluation. The new CT data set from the re-simulation was imported into Pinnacle and MIM. The dynamic planning tool in Pinnacle was used to recalculate the old plan with fixed MU settings on the new CT data. In MIM, the old CT was registered to the new data set, followed by a dose transformation to the new CT. The dose distributions to the PTV and critical structures from each software package were analyzed and compared. Results: A 9% difference was observed between the global maximum doses reported by the two software packages. Mean doses to organs at risk and PTVs were within 6%; however, Pinnacle showed a greater change in PTV coverage. Conclusion: MIM adaptive planning corrects for geometrical changes without considering the effect of radiological path length on the dose distribution, whereas Pinnacle corrects for both the geometric and the radiological effects on the dose distribution. Pinnacle therefore gives a better estimate of the dosimetric impact of anatomical changes.

  9. Toward Baseline Software Anomalies in NASA Missions

    Science.gov (United States)

    Layman, Lucas; Zelkowitz, Marvin; Basili, Victor; Nikora, Allen P.

    2012-01-01

    In this fast abstract, we provide preliminary findings from an analysis of 14,500 spacecraft anomalies from unmanned NASA missions. We provide baselines for the distributions of software vs. non-software anomalies in spaceflight systems, the risk ratings of software anomalies, and the corrective actions associated with software anomalies.

  10. The global distribution and dynamics of surface soil moisture

    Science.gov (United States)

    McColl, Kaighin A.; Alemohammad, Seyed Hamed; Akbar, Ruzbeh; Konings, Alexandra G.; Yueh, Simon; Entekhabi, Dara

    2017-01-01

    Surface soil moisture has a direct impact on food security, human health and ecosystem function. It also plays a key role in the climate system, and in the development and persistence of extreme weather events such as droughts, floods and heatwaves. However, sparse and uneven observations have made it difficult to quantify the global distribution and dynamics of surface soil moisture. Here we introduce a metric of soil moisture memory and use a full year of global observations from NASA's Soil Moisture Active Passive mission to show that surface soil moisture--a storage believed to make up less than 0.001% of the global freshwater budget by volume, equivalent on average to an 8-mm-thick layer of water covering all land surfaces--plays a significant role in the water cycle. Specifically, we find that surface soil moisture retains a median 14% of precipitation falling on land after three days. Furthermore, the retained fraction of the surface soil moisture storage after three days is highest over arid regions, and in regions where drainage to groundwater storage is lowest. We conclude that lower groundwater storage in these regions is due not only to lower precipitation, but also to the complex partitioning of the water cycle by the surface soil moisture storage layer at the land surface.

  11. Service software engineering for innovative infrastructure for global financial services

    OpenAIRE

    MAAD , Soha; MCCARTHY , James B.; GARBAYA , Samir; Beynon , Meurig; Nagarajan , Rajagopal

    2010-01-01

    International audience; The recent financial crisis motivates our re-thinking of the engineering principles for service software and infrastructures intended to create business value in vital sectors. Existing monolithic, inward-directed, cost-insensitive and highly regulated technical and organizational infrastructures for financial services make it difficult for the domain to benefit from opportunities offered by new computing models such as cloud computing, software as a service, hardware a...

  12. Software piracy: A study of causes, effects and preventive measures

    OpenAIRE

    Khadka, Ishwor

    2015-01-01

    Software piracy is a serious issue that has been affecting software companies for decades. According to the Business Software Alliance (BSA), the global software piracy rate in 2013 was 43 percent and the commercial value of unlicensed software installations was $62.7 billion, which resulted in millions in lost revenue and lost jobs at software companies. The goal of this study was to better understand software piracy behaviours, how it happens, how it affects individuals and software compani...

  13. Finding upper bounds for software failure probabilities - experiments and results

    International Nuclear Information System (INIS)

    Kristiansen, Monica; Winther, Rune

    2005-09-01

    This report looks into some aspects of using Bayesian hypothesis testing to find upper bounds for software failure probabilities. In the first part, the report evaluates the Bayesian hypothesis testing approach for finding upper bounds for the failure probabilities of single software components. The report shows how different choices of prior probability distribution for a software component's failure probability influence the number of tests required to obtain adequate confidence in the component. In the evaluation, both the effect of the shape of the prior distribution and the effect of one's prior confidence in the software component were investigated. In addition, different choices of prior probability distributions are discussed based on their relevance in a software context. In the second part, ideas are given on how the Bayesian hypothesis testing approach can be extended to assess systems consisting of multiple software components. One of the main challenges when assessing such systems is to include dependency aspects in the software reliability models. However, different types of failure dependencies between software components must be modelled differently. Identifying different types of failure dependencies is therefore an important condition for choosing a prior probability distribution which correctly reflects one's prior belief in the probability of software components failing dependently. In this report, software components include both general in-house software components and pre-developed software components (e.g. COTS, SOUP, etc). (Author)
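
    A minimal sketch of one special case of this approach -- a conjugate Beta prior combined with failure-free testing -- shows how the prior shape drives the number of tests needed before one can claim, with a given confidence, that the failure probability lies below a target bound (illustrative numbers only; the report's actual priors and test regimes are not reproduced here):

      from scipy.stats import beta

      def tests_needed(a, b, p_bound, confidence):
          """Smallest n such that P(p < p_bound | n failure-free tests) >= confidence
          under a Beta(a, b) prior on the failure probability p."""
          n = 0
          while beta.cdf(p_bound, a, b + n) < confidence:
              n += 1
          return n

      # A flat prior demands far more tests than a prior already favouring small p.
      for a, b in [(1, 1), (1, 10)]:
          print(f"Beta({a},{b}) prior:", tests_needed(a, b, p_bound=1e-3, confidence=0.95), "tests")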

  14. Global robust stability of neural networks with multiple discrete delays and distributed delays

    International Nuclear Information System (INIS)

    Gao Ming; Cui Baotong

    2009-01-01

    The problem of global robust stability is investigated for a class of uncertain neural networks with both multiple discrete time-varying delays and distributed time-varying delays. The uncertainties are assumed to be of norm-bounded form and the activation functions are supposed to be bounded and globally Lipschitz continuous. Based on the Lyapunov stability theory and linear matrix inequality technique, some robust stability conditions guaranteeing the global robust convergence of the equilibrium point are derived. The proposed LMI-based criteria are computationally efficient as they can be easily checked by using recently developed algorithms in solving LMIs. Two examples are given to show the effectiveness of the proposed results.
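
    To illustrate how LMI-based criteria of this kind are checked numerically in practice, the following sketch poses a generic Lyapunov LMI for a delay-free linear system as a feasibility problem in CVXPY (this is not the delayed criterion derived in the paper, only an example of the numerical machinery involved):

      import numpy as np
      import cvxpy as cp

      A = np.array([[-2.0, 1.0], [0.5, -3.0]])   # illustrative system matrix
      n = A.shape[0]
      eps = 1e-6

      # Find P > 0 with A'P + PA < 0, i.e. a quadratic Lyapunov certificate.
      P = cp.Variable((n, n), symmetric=True)
      constraints = [P >> eps * np.eye(n),
                     A.T @ P + P @ A << -eps * np.eye(n)]
      prob = cp.Problem(cp.Minimize(0), constraints)
      prob.solve()
      print("LMI feasible:", prob.status == cp.OPTIMAL)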

  15. Software takes command

    CERN Document Server

    Manovich, Lev

    2013-01-01

    Software has replaced a diverse array of physical, mechanical, and electronic technologies used before the 21st century to create, store, distribute and interact with cultural artifacts. It has become our interface to the world, to others, to our memory and our imagination - a universal language through which the world speaks, and a universal engine on which the world runs. What electricity and the combustion engine were to the early 20th century, software is to the early 21st century. Offering the first theoretical and historical account of software for media authoring and its effects on the prac

  16. Local and global stability for Lotka-Volterra systems with distributed delays and instantaneous negative feedbacks

    Science.gov (United States)

    Faria, Teresa; Oliveira, José J.

    This paper addresses the local and global stability of n-dimensional Lotka-Volterra systems with distributed delays and instantaneous negative feedbacks. Necessary and sufficient conditions for local stability independent of the choice of the delay functions are given, by imposing a weak nondelayed diagonal dominance which cancels the delayed competition effect. The global asymptotic stability of positive equilibria is established under conditions slightly stronger than the ones required for the linear stability. For the case of monotone interactions, however, sharper conditions are presented. This paper generalizes known results for discrete delays to systems with distributed delays. Several applications illustrate the results.
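
    For orientation, a generic n-dimensional Lotka-Volterra system with distributed delays and instantaneous negative feedbacks can be written in the following standard form (the paper's exact assumptions on the coefficients and delay kernels may differ):

      \dot{x}_i(t) = x_i(t)\left[ r_i - a_i x_i(t)
          + \sum_{j=1}^{n} b_{ij} \int_{0}^{\infty} k_{ij}(s)\, x_j(t-s)\, \mathrm{d}s \right],
      \qquad i = 1,\dots,n,

    where the terms -a_i x_i(t) with a_i > 0 are the instantaneous negative feedbacks and each kernel k_{ij} is a normalised delay distribution; stability conditions of the diagonal-dominance type compare the a_i against the strengths |b_{ij}| of the delayed interactions.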

  17. Ecology and equity in global fisheries: Modelling policy options using theoretical distributions

    NARCIS (Netherlands)

    Rammelt, C.F.; van Schie, Maarten

    2016-01-01

    Global fisheries present a typical case of political ecology or environmental injustice, i.e. a problem of distribution of resources within ecological limits. We built a stock-flow model to visualize this challenge and its dynamics, with both an ecological and a social dimension. We incorporated

  18. Managing Software Process Evolution

    DEFF Research Database (Denmark)

    This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines...... the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides...... essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation...

  19. Highly resolved global distribution of tropospheric NO2 using GOME narrow swath mode data

    Directory of Open Access Journals (Sweden)

    S. Beirle

    2004-01-01

    Full Text Available The Global Ozone Monitoring Experiment (GOME) allows the retrieval of tropospheric vertical column densities (VCDs) of NO2 on a global scale. Regions with enhanced industrial activity can clearly be detected, but the standard spatial resolution of the GOME ground pixels (320x40 km2) is insufficient to resolve regional trace gas distributions or individual cities. Every 10 days within the nominal GOME operation, measurements are executed in the so-called narrow swath mode with a much better spatial resolution (80x40 km2). We use these data (1997-2001) to construct a detailed picture of the mean global tropospheric NO2 distribution. Since - due to the narrow swath - the global coverage of the high resolution observations is rather poor, it has proved to be essential to deseasonalize the single narrow swath mode observations to retrieve adequate mean maps. This is done by using the GOME backscan information. The retrieved high resolution map illustrates the shortcomings of the standard size GOME pixels and reveals an unprecedented wealth of details in the global distribution of tropospheric NO2. Localised spots of enhanced NO2 VCD can be directly associated with cities, heavy industry centers and even large power plants. Thus our result helps to check emission inventories. The small spatial extent of NO2 'hot spots' allows us to estimate an upper limit of the mean lifetime of boundary layer NOx of 17 h on a global scale. The long time series of GOME data allows a quantitative comparison of the narrow swath mode data to the nominal resolution. Thus we can analyse the dependency of NO2 VCDs on pixel size. This is important for comparing GOME data to results of new satellite instruments like SCIAMACHY (launched March 2002 on ENVISAT), OMI (launched July 2004 on AURA) or GOME II (to be launched in 2005) with an improved spatial resolution.

  20. Assessment of best practice of software development in developing ...

    African Journals Online (AJOL)

    ... Understand the technology of the software (4.03), Memory limit set (3.91), Application pool not shared (3.88) and other parameters were examined for software development. The analysis shows the variance of the assessment of best practices in Software development firms and they are in conformity with the global trend.

  1. Active resources concept of computation for enterprise software

    Directory of Open Access Journals (Sweden)

    Koryl Maciej

    2017-06-01

    Full Text Available Traditional computational models for enterprise software are still, to a great extent, centralized. However, the rapid growth of modern computation techniques and frameworks means that contemporary software is becoming more and more distributed. Towards the development of a new, complete and coherent solution for distributed enterprise software construction, a synthesis of three well-grounded concepts is proposed: the Domain-Driven Design technique of software engineering, the REST architectural style and the actor model of computation. As a result, a new resource-based framework arises which, after its first cases of use, appears useful and worthy of further research.
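
    A toy sketch of the idea of an "active resource" -- a domain entity realised as an actor that processes REST-like messages from its own mailbox -- is given below; it only illustrates how the three concepts can be combined and is not the framework proposed in the paper:

      import queue, threading

      class ResourceActor:
          """One domain entity owned by one actor; state is mutated only by its own loop."""
          def __init__(self, state):
              self.state, self.mailbox = state, queue.Queue()
              threading.Thread(target=self._run, daemon=True).start()

          def send(self, verb, payload=None):
              reply = queue.Queue()
              self.mailbox.put((verb, payload, reply))
              return reply.get()

          def _run(self):
              while True:
                  verb, payload, reply = self.mailbox.get()
                  if verb == "GET":
                      reply.put(dict(self.state))
                  elif verb == "PUT":
                      self.state.update(payload)
                      reply.put(dict(self.state))
                  else:
                      reply.put({"error": "unsupported verb"})

      order = ResourceActor({"id": 42, "status": "new"})
      print(order.send("GET"))
      print(order.send("PUT", {"status": "confirmed"}))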

  2. Persistent Discontinuities in Global Software Development Teams: Adaption through Closely Coupled Work Practices

    DEFF Research Database (Denmark)

    Jensen, Rasmus Eskild

    this as a starting point, it is clear that researchers still know little about how practitioners adjust and adapt to persistent discontinuities in globally distributed teams or how practitioners coordinate the work to bridge persistent discontinuities. Investigating the data material from an ethnographic work place...... and personal connections on several levels. These connections made the team more resistant to frequent changes in the team composition and made it easier to trace commitment in the everyday work, which was essential for completing the task. In conclusion, the dissertation found that changes...

  3. A Modular GIS-Based Software Architecture for Model Parameter Estimation using the Method of Anchored Distributions (MAD)

    Science.gov (United States)

    Ames, D. P.; Osorio-Murillo, C.; Over, M. W.; Rubin, Y.

    2012-12-01

    The Method of Anchored Distributions (MAD) is an inverse modeling technique that is well-suited for estimation of spatially varying parameter fields using limited observations and Bayesian methods. This presentation will discuss the design, development, and testing of a free software implementation of the MAD technique using the open source DotSpatial geographic information system (GIS) framework, R statistical software, and the MODFLOW groundwater model. This new tool, dubbed MAD-GIS, is built using a modular architecture that supports the integration of external analytical tools and models for key computational processes, including a forward model (e.g. MODFLOW, HYDRUS) and geostatistical analysis (e.g. R, GSLIB). The GIS-based graphical user interface provides a relatively simple way for new users of the technique to prepare the spatial domain, to identify observation and anchor points, to perform the MAD analysis using a selected forward model, and to view results. MAD-GIS uses the Managed Extensibility Framework (MEF) provided by the Microsoft .NET programming platform to support the integration of different modeling and analytical tools at run-time through a custom "driver." Each driver establishes a connection with external programs through a programming interface, which provides the elements for communicating with the core MAD software. This presentation gives an example of adapting MODFLOW to serve as the external forward model in MAD-GIS for inferring the distribution functions of key MODFLOW parameters. Additional drivers for other models are being developed, and it is expected that the open source nature of the project will engender the development of additional model drivers by third-party scientists.
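
    A hypothetical sketch of the "driver" plug-in pattern described above (the class and method names are invented for illustration; the real MAD-GIS drivers are .NET/MEF components):

      from abc import ABC, abstractmethod

      class ForwardModelDriver(ABC):
          """Common contract behind which each external forward model is wrapped,
          so the core MAD workflow never touches model-specific details."""
          @abstractmethod
          def write_inputs(self, parameter_field, workdir): ...
          @abstractmethod
          def run(self, workdir): ...
          @abstractmethod
          def read_outputs(self, workdir): ...

      class ToyGroundwaterDriver(ForwardModelDriver):
          """Stand-in driver; a real one would generate input files and launch MODFLOW."""
          def write_inputs(self, parameter_field, workdir):
              self.field = parameter_field
          def run(self, workdir):
              pass  # a real driver would launch the external executable here
          def read_outputs(self, workdir):
              return {"simulated_heads": [sum(self.field) / len(self.field)]}

      driver = ToyGroundwaterDriver()
      driver.write_inputs([1.0, 2.0, 3.0], "/tmp/run1")
      driver.run("/tmp/run1")
      print(driver.read_outputs("/tmp/run1"))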

  4. A Reference Architecture for Distributed Software Deployment

    NARCIS (Netherlands)

    Van der Burg, S.

    2013-01-01

    Nowadays, software systems are bigger and more complicated than people may think. Apart from the fact that a system has to be correctly constructed and should meet the client's wishes, it also has to be made ready for use by end-users or in an isolated test environment. This process is known as

  5. Global Distribution of Two Fungal Pathogens Threatening Endangered Sea Turtles

    OpenAIRE

    Sarmiento-Ramírez, Jullie M.; Abella-Pérez, Elena; Phillott, Andrea D.; Sim, Jolene; van West, Pieter; Martín, María P.; Marco, Adolfo; Diéguez-Uribeondo, Javier

    2014-01-01

    Nascent fungal infections are currently considered as one of the main threats for biodiversity and ecosystem health, and have driven several animal species into critical risk of extinction. Sea turtles are one of the most endangered groups of animals and only seven species have survived to date. Here, we described two pathogenic species, i.e., Fusarium falciforme and Fusarium keratoplasticum, that are globally distributed in major turtle nesting areas for six sea turtle species and that are i...

  6. Employing peer-to-peer software distribution in ALICE Grid Services to enable opportunistic use of OSG resources

    CERN Multimedia

    CERN. Geneva; Sakrejda, Iwona

    2012-01-01

    The ALICE Grid infrastructure is based on AliEn, a lightweight open source framework built on Web Services and a Distributed Agent Model in which job agents are submitted onto a grid site to prepare the environment and pull work from a central task queue located at CERN. In the standard configuration, each ALICE grid site supports an ALICE-specific VO box as a single point of contact between the site and the ALICE central services. VO box processes monitor site utilization and job requests (ClusterMonitor), monitor dynamic job and site properties (MonaLisa), perform job agent submission (CE) and deploy job-specific software (PackMan). In particular, requiring a VO box at each site simplifies deployment of job software, done onto a shared file system at the site, and adds redundancy to the overall Grid system. ALICE offline computing, however, has also implemented a peer-to-peer method (based on BitTorrent) for downloading job software directly onto each worker node as needed. By utilizing both this peer-...

  7. The global distribution of fatal pesticide self-poisoning: systematic review

    DEFF Research Database (Denmark)

    Gunnell, David; Eddleston, Michael; Phillips, Michael R

    2007-01-01

    BACKGROUND: Evidence is accumulating that pesticide self-poisoning is one of the most commonly used methods of suicide worldwide, but the magnitude of the problem and the global distribution of these deaths is unknown. METHODS: We have systematically reviewed the worldwide literature to estimate......-poisoning worldwide each year, accounting for 30% (range 27% to 37%) of suicides globally. Official data from India probably underestimate the incidence of suicides; applying evidence-based corrections to India's official data, our estimate for world suicides using pesticides increases to 371,594 (range 347......, not the quantity used, that influences the likelihood they will be used in acts of fatal self-harm. CONCLUSION: Pesticide self-poisoning accounts for about one-third of the world's suicides. Epidemiological and toxicological data suggest that many of these deaths might be prevented if (a) the use of pesticides...

  8. Software Quality Measurement for Distributed Systems. Volume 3. Distributed Computing Systems: Impact on Software Quality.

    Science.gov (United States)

    1983-07-01

    Distributed Computing Systems: impact on software quality. Topics addressed include "C3I Application", "Space Systems Network", "Need for Distributed Database Management", and "Adaptive Routing", along with data reduction, buffering, encryption, and error detection and correction functions. Examples of such data streams include imagery data and video.

  9. The importance of the human footprint in shaping the global distribution of terrestrial, freshwater and marine invaders.

    Directory of Open Access Journals (Sweden)

    Belinda Gallardo

    Full Text Available Human activities such as transport, trade and tourism are likely to influence the spatial distribution of non-native species and yet, Species Distribution Models (SDMs) that aim to predict the future broad scale distribution of invaders often rely on environmental (e.g. climatic) information only. This study investigates if and to what extent human activities that directly or indirectly influence nature (hereafter the human footprint) affect the global distribution of invasive species in terrestrial, freshwater and marine ecosystems. We selected 72 species including terrestrial plants, terrestrial animals, freshwater and marine invasive species of concern in a focus area located in NW Europe (encompassing Great Britain, France, The Netherlands and Belgium). Species Distribution Models were calibrated with the global occurrence of species and a set of high-resolution (9×9 km) environmental (e.g. topography, climate, geology) layers and human footprint proxies (e.g. the human influence index, population density, road proximity). Our analyses suggest that the global occurrence of a wide range of invaders is primarily limited by climate. Temperature tolerance was the most important factor and explained on average 42% of species distribution. Nevertheless, factors related to the human footprint explained a substantial amount (23% on average) of species distributions. When global models were projected into the focus area, spatial predictions integrating the human footprint featured the highest cumulative risk scores close to transport networks (proxy for invasion pathways) and in habitats with a high human influence index (proxy for propagule pressure). We conclude that human-related information, currently available in the form of easily accessible maps and databases, should be routinely implemented into predictive frameworks to inform upon policies to prevent and manage invasions. Otherwise we might be seriously underestimating the species and areas under

  10. Guide to software export

    CERN Document Server

    Philips, Roger A

    2014-01-01

    An ideal reference source for CEOs, marketing and sales managers, sales consultants, and students of international marketing, Guide to Software Export provides a step-by-step approach to initiating or expanding international software sales. It teaches you how to examine critically your candidate product for exportability; how to find distributors, agents, and resellers abroad; how to identify the best distribution structure for export; and much, much more!Not content with providing just the guidelines for setting up, expanding, and managing your international sales channels, Guide to Software

  11. Upgrade Software and Computing

    CERN Document Server

    The LHCb Collaboration, CERN

    2018-01-01

    This document reports the Research and Development activities that are carried out in the software and computing domains in view of the upgrade of the LHCb experiment. The implementation of a full software trigger implies major changes in the core software framework, in the event data model, and in the reconstruction algorithms. The increase of the data volumes for both real and simulated datasets requires a corresponding scaling of the distributed computing infrastructure. An implementation plan in both domains is presented, together with a risk assessment analysis.

  12. Building flexible, distributed collaboration tools using type-based publish/subscribe - The Distributed Knight case

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Damm, Christian Heide

    2004-01-01

    Distributed collaboration is becoming increasingly important also in software development. Combined with an increasing interest in experimental and agile approaches to software development, this poses challenges to tool support for software development. Specifically, tool support is needed...... for flexible, distributed collaboration. We introduce the Distributed Knight tool that provides flexible and lightweight support for distributed collaboration in object-oriented modelling. The Distributed Knight implementation builds crucially on the type-based publish/subscribe distributed communication...... paradigm, which provides an effective and natural abstraction for developing distributed collaboration tools....

  13. Assessment of the integration capability of system architectures from a complex and distributed software systems perspective

    Science.gov (United States)

    Leuchter, S.; Reinert, F.; Müller, W.

    2014-06-01

    Procurement and design of system architectures capable of network centric operations demand an assessment scheme in order to compare different alternative realizations. In this contribution an assessment method for system architectures targeted at the C4ISR domain is presented. The method addresses the integration capability of software systems from a complex and distributed software system perspective, focusing on communication, interfaces and software. The aim is to evaluate the capability to integrate a system or its functions within a system-of-systems network. The method uses approaches from software architecture quality assessment and applies them at the system architecture level. It features a specific goal tree with several dimensions that are relevant for enterprise integration. These dimensions have to be weighed against each other and totalized using methods from normative decision theory in order to reflect the intention of the particular enterprise integration effort. The indicators and measurements for many of the considered quality features rely on a model-based view of systems, networks, and the enterprise. That means the method is applicable to system-of-systems specifications based on enterprise architectural frameworks relying on defined meta-models or domain ontologies for defining views and viewpoints. In the defense context we use the NATO Architecture Framework (NAF) to ground the respective system models. The proposed assessment method allows evaluating and comparing competing system designs with regard to their future integration potential. It is a contribution to the system-of-systems engineering methodology.
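
    A minimal sketch of the weighted aggregation step (the dimension names and weights are hypothetical; the paper's actual goal tree is not reproduced here):

      def weighted_score(scores, weights):
          """Additively totalise per-dimension scores with normalised weights."""
          total = sum(weights.values())
          return sum(scores[d] * weights[d] for d in scores) / total

      # Two competing architecture candidates scored on three example dimensions.
      weights  = {"communication": 0.40, "interfaces": 0.35, "software": 0.25}
      design_a = {"communication": 0.80, "interfaces": 0.60, "software": 0.70}
      design_b = {"communication": 0.50, "interfaces": 0.90, "software": 0.80}
      print("A:", weighted_score(design_a, weights), " B:", weighted_score(design_b, weights))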

  14. ReSOLV: Applying Cryptocurrency Blockchain Methods to Enable Global Cross-Platform Software License Validation

    Directory of Open Access Journals (Sweden)

    Alan Litchfield

    2018-05-01

    Full Text Available This paper presents a method for a decentralised peer-to-peer software license validation system using cryptocurrency blockchain technology to ameliorate software piracy, and to provide a mechanism for software developers to protect copyrighted works. Protecting software copyright has been an issue since the late 1970s and software license validation has been a primary method employed in an attempt to minimise software piracy and protect software copyright. The method described creates an ecosystem in which the rights and privileges of participants are observed.
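
    A minimal sketch of the underlying idea -- recording license grants in a hash-chained ledger that any peer can re-verify without a central server -- follows; it is purely illustrative and is not the ReSOLV protocol itself:

      import hashlib, json, time

      def make_block(prev_hash, license_record):
          """Append-only block linking one license record to the previous block's hash."""
          body = {"prev": prev_hash, "time": time.time(), "license": license_record}
          digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
          return {"hash": digest, **body}

      def chain_valid(chain):
          """Recompute every hash and check the prev-hash links."""
          for i, blk in enumerate(chain):
              body = {k: blk[k] for k in ("prev", "time", "license")}
              digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
              if digest != blk["hash"] or (i > 0 and blk["prev"] != chain[i - 1]["hash"]):
                  return False
          return True

      genesis = make_block("0" * 64, {"product": "ExampleApp", "key": "ABC-123", "owner": "alice"})
      block2  = make_block(genesis["hash"], {"product": "ExampleApp", "key": "DEF-456", "owner": "bob"})
      print(chain_valid([genesis, block2]))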

  15. Tropospheric Ozone Assessment Report: Assessment of global-scale model performance for global and regional ozone distributions, variability, and trends

    Directory of Open Access Journals (Sweden)

    P. J. Young

    2018-01-01

    Full Text Available The goal of the Tropospheric Ozone Assessment Report (TOAR) is to provide the research community with an up-to-date scientific assessment of tropospheric ozone, from the surface to the tropopause. While a suite of observations provides significant information on the spatial and temporal distribution of tropospheric ozone, observational gaps make it necessary to use global atmospheric chemistry models to synthesize our understanding of the processes and variables that control tropospheric ozone abundance and its variability. Models facilitate the interpretation of the observations and allow us to make projections of future tropospheric ozone and trace gas distributions for different anthropogenic or natural perturbations. This paper assesses the skill of current-generation global atmospheric chemistry models in simulating the observed present-day tropospheric ozone distribution, variability, and trends. Drawing upon the results of recent international multi-model intercomparisons and using a range of model evaluation techniques, we demonstrate that global chemistry models are broadly skillful in capturing the spatio-temporal variations of tropospheric ozone over the seasonal cycle, for extreme pollution episodes, and changes over interannual to decadal periods. However, models are consistently biased high in the northern hemisphere and biased low in the southern hemisphere, throughout the depth of the troposphere, and are unable to replicate particular metrics that define the longer term trends in tropospheric ozone as derived from some background sites. When the models compare unfavorably against observations, we discuss the potential causes of model biases and propose directions for future developments, including improved evaluations that may be able to better diagnose the root cause of the model-observation disparity. Overall, model results should be approached critically, including determining whether the model performance is acceptable for

  16. Code HEX-Z-DMG for support of accounting for and control of nuclear material software system as part of international safeguards system at BN-350 site

    International Nuclear Information System (INIS)

    Bushmakin, A.G.; Schaefer, B.

    1999-01-01

    A code for the computation of the global neutron distribution in three-dimensional hexagonal-z geometry and the multi-group diffusion approximation was developed at BN-350 as the main part of the BN-350 accounting for and control of nuclear material software system. This software system includes: the model for stationary distributions of neutrons; the model for calculating changes in isotope composition; and the model of refueling operations. To develop this system, two principal problems had to be solved: to produce a micro cross-section library for all nuclides of the BN-350 reactor core, and to develop the code for the computation of the global neutron distribution. To solve the first task, a twenty-six-energy-group micro cross-section library for more than seventy nuclides was produced. To solve the second task, a three-dimensional hexagonal-z geometry, multi-group diffusion approximation code was developed. This code (HEX-Z-DMG) was based on the solution of the multi-group diffusion equation using a standard mesh approach. A series of calculations was performed in the twenty-six-energy-group representation using this code. We compared eigenvalues (k_eff), the worth added during refueling operations, and spatial and energy-group-dependent neutron flux distributions with the results of calculations using another code (DIF3D). After this series of calculations we can say that the HEX-Z-DMG code is well suited for use as part of the BN-350 accounting for and control of nuclear material software system. (author)
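
    For reference, the steady-state multi-group neutron diffusion equation that such a code solves can be written in its standard form (the abstract does not detail the hexagonal-z discretisation actually used in HEX-Z-DMG):

      -\nabla \cdot D_g(\mathbf{r}) \nabla \phi_g(\mathbf{r}) + \Sigma_{r,g}(\mathbf{r})\,\phi_g(\mathbf{r})
        = \sum_{g' \neq g} \Sigma_{s,\,g' \to g}(\mathbf{r})\,\phi_{g'}(\mathbf{r})
        + \frac{\chi_g}{k_{\mathrm{eff}}} \sum_{g'=1}^{G} \nu\Sigma_{f,g'}(\mathbf{r})\,\phi_{g'}(\mathbf{r}),
        \qquad g = 1,\dots,G,

    where \phi_g is the group flux, D_g the diffusion coefficient, \Sigma_{r,g} the removal cross section, \Sigma_{s,g'\to g} the scattering transfer cross sections, \chi_g the fission spectrum and \nu\Sigma_{f,g'} the fission production cross sections; here G = 26 energy groups, and k_eff is the eigenvalue compared against DIF3D in the text.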

  17. Global distributions of water vapour isotopologues retrieved from IMG/ADEOS data

    Directory of Open Access Journals (Sweden)

    H. Herbin

    2007-07-01

    Full Text Available The isotopologic composition of water vapour in the atmosphere provides valuable information on many climate, chemical and dynamical processes. The accurate measurement of the water isotopologues by remote-sensing techniques remains a challenge, due to the large spatial and temporal variations. Simultaneous profile retrievals of the main water isotopologues (i.e. H2(16)O, H2(18)O and HDO) and their ratios are presented here for the first time, along with their retrieved global distributions. The results are obtained by exploiting the high resolution infrared spectra recorded by the Interferometric Monitor for Greenhouse gases (IMG) instrument, which operated in the nadir geometry onboard the ADEOS satellite between 1996 and 1997. The retrievals are performed on cloud-free radiances, measured during ten days of April 1997, considering two atmospheric windows (1205–1228 cm−1; 2004–2032 cm−1) and using a line-by-line radiative transfer model and an inversion procedure based on the Optimal Estimation Method (OEM). Characterizations in terms of vertical sensitivity and error budget are provided. We show that a relatively high vertical resolution is achieved for H2(16)O (~4–5 km), and that the retrieved profiles are in fair agreement with local sonde measurements at different latitudes. The retrieved global distributions of H2(16)O, H2(18)O, HDO and their ratios are presented and found to be consistent with previous experimental studies and models. The Ocean-Continent difference, the latitudinal and vertical dependence of the water vapour amount and the isotopologic depletion are notably well reproduced. Other trends, possibly related to small-scale variations in the vertical profiles, are also discussed. Despite the difficulties encountered for computing accurately the isotopologic ratios, our results demonstrate the ability
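
    The Optimal Estimation Method mentioned above minimises, in its standard formulation, a cost function that balances the fit to the measured radiances against an a priori state (the paper's specific covariance choices are not reproduced here):

      J(\mathbf{x}) = \left[\mathbf{y} - F(\mathbf{x})\right]^{T} \mathbf{S}_{\epsilon}^{-1} \left[\mathbf{y} - F(\mathbf{x})\right]
        + \left(\mathbf{x} - \mathbf{x}_{a}\right)^{T} \mathbf{S}_{a}^{-1} \left(\mathbf{x} - \mathbf{x}_{a}\right),

    where y is the measured spectrum, F the line-by-line forward model, x_a the a priori profile, and S_\epsilon and S_a the measurement and a priori covariance matrices; the vertical sensitivity discussed in the abstract is characterised by the averaging kernel matrix A = \partial\hat{x}/\partial x.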

  18. Modelling global distribution, risk and mitigation strategies of floating plastic pollution

    Science.gov (United States)

    van Sebille, Erik; Wilcox, Chris; Sherman, Peter; Hardesty, Britta Denise; Lavender Law, Kara

    2016-04-01

    Microplastic debris floating at the ocean surface can harm marine life. Understanding the severity of this harm requires knowledge of plastic abundance and distributions. Dozens of expeditions measuring microplastics have been carried out since the 1970s, but they have primarily focused on the North Pacific and North Atlantic accumulation zones, with much sparser coverage elsewhere. Here, we use the largest dataset of microplastic measurements assembled to date to assess the confidence we can have in global estimates of microplastic abundance and mass. We use a rigorous statistical framework to standardise a global dataset of plastic marine debris measured using surface-trawling plankton nets and couple this with three different ocean circulation models to spatially interpolate the observations. Our estimates show that the accumulated number of microplastic particles in 2014 ranges from 15 to 51 trillion particles, weighing between 93 and 236 thousand metric tons. A large fraction of the uncertainty in these estimates comes from sparse sampling in coastal and Southern Hemisphere regions. We then use this global distribution of small floating plastic debris to map out where in the ocean the risk to marine life (in particular seabirds and plankton growth) is greatest, using a quantitative risk framework. We show that the largest risk occurs not necessarily in regions of high plastic concentration, but rather in regions of extensive foraging with medium-high plastic concentrations such as coastal upwelling regions and the Southern Ocean. Finally, we use the estimates of distribution to investigate where in the ocean plastic can most optimally be removed, assuming hypothetical clean-up booms following the ideas from The Ocean Cleanup project. We show that mitigation of the plastic problem can most aptly be done near coastlines, particularly in Asia, rather than in the centres of the gyres. Based on these results, we propose more focus on the coastal zones when

  19. Global distributions of cloud properties for CERES

    Science.gov (United States)

    Sun-Mack, S.; Minnis, P.; Heck, P.; Young, D.

    2003-04-01

    The microphysical and macrophysical properties of clouds play a crucial role in the earth's radiation budget. Simultaneous measurement of the radiation and cloud fields on a global basis has long been recognized as a key component in understanding and modeling the interaction between clouds and radiation at the top of the atmosphere, at the surface, and within the atmosphere. With the implementation of the NASA Clouds and Earth's Radiant Energy System (CERES) in 1998, this need is being met. Broadband shortwave and longwave radiance measurements taken by the CERES scanners at resolutions between 10 and 20 km on the Tropical Rainfall Measuring Mission (TRMM), Terra, and Aqua satellites are matched to simultaneous retrievals of cloud height, phase, particle size, water path, and optical depth from the TRMM Visible Infrared Scanner and the Moderate Resolution Imaging Spectroradiometer (MODIS) on Terra and Aqua. The combined cloud-radiation product has already been used for developing new, highly accurate anisotropic directional models for converting broadband radiances to flux. They also provide a consistent measure of cloud properties at different times of day over the globe since January 1998. These data will be valuable for determining the indirect effects of aerosols and for linking cloud water to cloud radiation. This paper provides an overview of the CERES cloud products from the three satellites including the retrieval methodology, validation, and global distributions. Availability and access to the datasets will also be discussed.

  20. Global synchronization algorithms for the Intel iPSC/860

    Science.gov (United States)

    Seidel, Steven R.; Davis, Mark A.

    1992-01-01

    In a distributed memory multicomputer that has no global clock, global processor synchronization can only be achieved through software. Global synchronization algorithms are used in tridiagonal systems solvers, CFD codes, sequence comparison algorithms, and sorting algorithms. They are also useful for event simulation, debugging, and for solving mutual exclusion problems. For the Intel iPSC/860 in particular, global synchronization can be used to ensure the most effective use of the communication network for operations such as the shift, where each processor in a one-dimensional array or ring concurrently sends a message to its right (or left) neighbor. Three global synchronization algorithms are considered for the iPSC/860: the gsync() primitive provided by Intel, the PICL primitive sync0(), and a new recursive doubling synchronization (RDS) algorithm. The performance of these algorithms is compared to the performance predicted by communication models of both the long and forced message protocols. Measurements of the cost of shift operations preceded by global synchronization show that the RDS algorithm always synchronizes the nodes more precisely and costs only slightly more than the other two algorithms.
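
    A minimal sketch of the recursive doubling idea (assuming a power-of-two process count and using mpi4py for illustration; the original algorithm was written against the iPSC/860 message-passing primitives):

      from mpi4py import MPI

      def rds_barrier(comm):
          """Recursive doubling barrier: in round k, rank i exchanges a token with
          rank i XOR 2**k, so all ranks are synchronised after log2(P) rounds."""
          rank, size = comm.Get_rank(), comm.Get_size()
          assert size & (size - 1) == 0, "sketch assumes a power-of-two process count"
          step = 1
          while step < size:
              partner = rank ^ step
              comm.sendrecv("token", dest=partner, source=partner)
              step <<= 1

      comm = MPI.COMM_WORLD
      rds_barrier(comm)
      print(f"rank {comm.Get_rank()} passed the barrier")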

  1. 2016 International Conference on Software Process Improvement

    CERN Document Server

    Muñoz, Mirna; Rocha, Álvaro; Feliu, Tomas; Peña, Adriana

    2017-01-01

    This book offers a selection of papers from the 2016 International Conference on Software Process Improvement (CIMPS’16), held between the 12th and 14th of October 2016 in Aguascalientes, Aguascalientes, México. The CIMPS’16 is a global forum for researchers and practitioners to present and discuss the most recent innovations, trends, results, experiences and concerns in the different aspects of software engineering with a focus on, but not limited to, software processes, security in information and communication technology, and big data. The main topics covered include: organizational models, standards and methodologies, knowledge management, software systems, applications and tools, information and communication technologies and processes in non-software domains (mining, automotive, aerospace, business, health care, manufacturing, etc.) with a clear focus on software process challenges.

  2. Global Distributions of Temperature Variances At Different Stratospheric Altitudes From Gps/met Data

    Science.gov (United States)

    Gavrilov, N. M.; Karpova, N. V.; Jacobi, Ch.

    The GPS/MET measurements at altitudes 5 - 35 km are used to obtain global distributions of small-scale temperature variances at different stratospheric altitudes. Individual temperature profiles are smoothed using second order polynomial approximations in 5 - 7 km thick layers centered at 10, 20 and 30 km. Temperature deviations from the averaged values and their variances obtained for each profile are averaged for each month of the year during the GPS/MET experiment. Global distributions of temperature variances have inhomogeneous structure. Locations and latitude distributions of the maxima and minima of the variances depend on altitudes and season. One of the reasons for the small-scale temperature perturbations in the stratosphere could be internal gravity waves (IGWs). Some assumptions are made about peculiarities of IGW generation and propagation in the tropo-stratosphere based on the results of GPS/MET data analysis.
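
    A minimal sketch of the described procedure -- fit a second-order polynomial to the temperatures inside one layer, take the deviations from the fit, and use their variance as the small-scale perturbation measure (the synthetic profile and layer widths are illustrative only):

      import numpy as np

      def layer_variance(z_km, temp_k, centre_km, half_width_km=3.0):
          """Variance of deviations from a 2nd-order polynomial fit within one layer."""
          mask = np.abs(z_km - centre_km) <= half_width_km
          z, t = z_km[mask], temp_k[mask]
          trend = np.polyval(np.polyfit(z, t, 2), z)
          return np.var(t - trend)

      # Synthetic profile: smooth background plus a small wave-like perturbation.
      z = np.linspace(5, 35, 300)
      temp = 220 + 0.5 * (z - 20) ** 2 / 20 + 0.8 * np.sin(2 * np.pi * z / 2.5)
      for centre in (10, 20, 30):
          print(centre, "km:", round(layer_variance(z, temp, centre), 3), "K^2")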

  3. The impact of global warming on the range distribution of different climatic groups of Aspidoscelis costata costata.

    Science.gov (United States)

    Güizado-Rodríguez, Martha Anahí; Ballesteros-Barrera, Claudia; Casas-Andreu, Gustavo; Barradas-Miranda, Victor Luis; Téllez-Valdés, Oswaldo; Salgado-Ugarte, Isaías Hazarmabeth

    2012-12-01

    The ectothermic nature of reptiles makes them especially sensitive to global warming. Although climate change and its implications are a frequent topic of detailed studies, most of these studies are carried out without making a distinction between populations. Here we present the first study of an Aspidoscelis species that evaluates the effects of global warming on its distribution using ecological niche modeling. The aims of our study were (1) to understand whether predicted warmer climatic conditions affect the geographic potential distribution of different climatic groups of Aspidoscelis costata costata and (2) to identify potential altitudinal changes of these groups under global warming. We used the maximum entropy species distribution model (MaxEnt) to project the potential distributions expected for the years 2020, 2050, and 2080 under a single simulated climatic scenario. Our analysis suggests that some climatic groups of Aspidoscelis costata costata will exhibit reductions in their distribution and others expansions, with potential upward shifts toward higher elevation in response to climate warming. Different climatic groups were revealed in our analysis that subsequently showed heterogeneous responses to climatic change, illustrating the complex nature of species' geographic responses to environmental change and the importance of modeling climatic or geographic groups and/or populations instead of the entire species' range treated as a homogeneous entity.

  4. ETICS: the international software engineering service for the grid

    Energy Technology Data Exchange (ETDEWEB)

    Meglio, A D; Begin, M-E [CERN (Switzerland); Couvares, P [University of Wisconsin-Madison (United States); Ronchieri, E [INFN CNAF (Italy); Takacs, E [4D SOFT Ltd (Hungary)], E-mail: alberto.di.meglio@cern.ch

    2008-07-15

    The ETICS system is a distributed software configuration, build and test system designed to fulfil the needs of improving the quality, reliability and interoperability of distributed software in general and grid software in particular. The ETICS project is a consortium of five partners (CERN, INFN, Engineering Ingegneria Informatica, 4D Soft and the University of Wisconsin-Madison). The ETICS service consists of a build and test job execution system based on the Metronome software and an integrated set of web services and software engineering tools to design, maintain and control build and test scenarios. The ETICS system allows taking into account complex dependencies among applications and middleware components and provides a rich environment to perform static and dynamic analysis of the software and execute deployment, system and interoperability tests. This paper gives an overview of the system architecture and functionality set and then describes how the EC-funded EGEE, DILIGENT and OMII-Europe projects are using the software engineering services to build, validate and distribute their software. Finally a number of significant use and test cases will be described to show how ETICS can be used in particular to perform interoperability tests of grid middleware using the grid itself.

  5. ETICS: the international software engineering service for the grid

    Science.gov (United States)

    Meglio, A. D.; Bégin, M.-E.; Couvares, P.; Ronchieri, E.; Takacs, E.

    2008-07-01

    The ETICS system is a distributed software configuration, build and test system designed to fulfil the needs of improving the quality, reliability and interoperability of distributed software in general and grid software in particular. The ETICS project is a consortium of five partners (CERN, INFN, Engineering Ingegneria Informatica, 4D Soft and the University of Wisconsin-Madison). The ETICS service consists of a build and test job execution system based on the Metronome software and an integrated set of web services and software engineering tools to design, maintain and control build and test scenarios. The ETICS system allows taking into account complex dependencies among applications and middleware components and provides a rich environment to perform static and dynamic analysis of the software and execute deployment, system and interoperability tests. This paper gives an overview of the system architecture and functionality set and then describes how the EC-funded EGEE, DILIGENT and OMII-Europe projects are using the software engineering services to build, validate and distribute their software. Finally a number of significant use and test cases will be described to show how ETICS can be used in particular to perform interoperability tests of grid middleware using the grid itself.

  6. ETICS: the international software engineering service for the grid

    International Nuclear Information System (INIS)

    Meglio, A D; Begin, M-E; Couvares, P; Ronchieri, E; Takacs, E

    2008-01-01

    The ETICS system is a distributed software configuration, build and test system designed to fulfil the needs of improving the quality, reliability and interoperability of distributed software in general and grid software in particular. The ETICS project is a consortium of five partners (CERN, INFN, Engineering Ingegneria Informatica, 4D Soft and the University of Wisconsin-Madison). The ETICS service consists of a build and test job execution system based on the Metronome software and an integrated set of web services and software engineering tools to design, maintain and control build and test scenarios. The ETICS system allows taking into account complex dependencies among applications and middleware components and provides a rich environment to perform static and dynamic analysis of the software and execute deployment, system and interoperability tests. This paper gives an overview of the system architecture and functionality set and then describes how the EC-funded EGEE, DILIGENT and OMII-Europe projects are using the software engineering services to build, validate and distribute their software. Finally a number of significant use and test cases will be described to show how ETICS can be used in particular to perform interoperability tests of grid middleware using the grid itself

  7. A Combined Approach for Component-based Software Design

    NARCIS (Netherlands)

    Guareis de farias, Cléver; van Sinderen, Marten J.; Ferreira Pires, Luis; Quartel, Dick; Baldoni, R.

    2001-01-01

    Component-based software development enables the construction of software artefacts by assembling binary units of production, distribution and deployment, the so-called software components. Several approaches addressing component-based development have been proposed recently. Most of these

  8. Automatic Type Recognition and Mapping of Global Tropical Cyclone Disaster Chains (TDC

    Directory of Open Access Journals (Sweden)

    Ran Wang

    2016-10-01

    Full Text Available The catastrophic events caused by meteorological disasters are becoming more severe in the context of global warming. The disaster chains triggered by tropical cyclones cause serious losses of population and economy. It is necessary to make the regional type recognition of the Tropical Cyclone Disaster Chain (TDC) effective in order to enable targeted prevention. This study mainly explores a method for the automatic recognition and mapping of TDC and designs a software system. We constructed an automatic recognition system in terms of the characteristics of the hazard-formative environment, based on the theory of a natural disaster system. The ArcEngine components enable an intelligent software system to present results via an automatic mapping approach. The study data come from global metadata such as the Digital Elevation Model (DEM), terrain slope, population density and Gross Domestic Product (GDP). The results show that: (1) according to the characteristics of geomorphology type, we establish a type recognition system for global TDC; (2) based on the recognition principle, we design a software system with the functions of automatic recognition and mapping; and (3) we validate the type distribution in terms of real cases of TDC. The results show that the automatic recognition function has good reliability. The study can provide the basis for a targeted regional disaster prevention strategy, as well as regional sustainable development.

  9. Implementing the Gaia Astrometric Global Iterative Solution (AGIS) in Java

    OpenAIRE

    O'Mullane, William; Lammers, Uwe; Lindegren, Lennart; Hernandez, Jose; Hobbs, David

    2011-01-01

    This paper provides a description of the Java software framework which has been constructed to run the Astrometric Global Iterative Solution for the Gaia mission. This is the mathematical framework that provides the rigid reference frame for Gaia observations from the Gaia data itself. This process makes Gaia a self-calibrated, and input-catalogue-independent, mission. The framework is highly distributed, typically running on a cluster of machines with a database back end. All code is written in ...

  10. Global asymptotic stability of Cohen-Grossberg neural network with continuously distributed delays

    International Nuclear Information System (INIS)

    Wan Li; Sun Jianhua

    2005-01-01

    The convergence dynamical behaviors of Cohen-Grossberg neural network with continuously distributed delays are discussed. By using Brouwer's fixed point theorem, matrix theory and analysis techniques such as Gronwall inequality, some new sufficient conditions guaranteeing the existence, uniqueness of an equilibrium point and its global asymptotic stability are obtained. An example is given to illustrate the theoretical results

  11. Software testing for evolutionary iterative rapid prototyping

    OpenAIRE

    Davis, Edward V., Jr.

    1990-01-01

    Approved for public release; distribution unlimited. Rapid prototyping is emerging as a promising software development paradigm. It provides a systematic and automatable means of developing a software system under circumstances where initial requirements are not well known or where requirements change frequently during development. Providing high software quality assurance requires sufficient software testing. The unique nature of evolutionary iterative prototyping is not well-suited for ...

  12. Quantifying the global and distributional aspects of American household carbon footprint

    International Nuclear Information System (INIS)

    Weber, Christopher L.; Matthews, H. Scott

    2008-01-01

    Analysis of household consumption and its environmental impact remains one of the most important topics in sustainability research. Nevertheless, much past and recent work has focused on domestic national averages, neglecting both the growing importance of international trade on household carbon footprint and the variation between households of different income levels and demographics. Using consumer expenditure surveys and multi-country life cycle assessment techniques, this paper analyzes the global and distributional aspects of American household carbon footprint. We find that due to recently increased international trade, 30% of total US household CO2 impact in 2004 occurred outside the US. Further, households vary considerably in their CO2 responsibilities: at least a factor of ten difference exists between low and high-impact households, with total household income and expenditure being the best predictors of both domestic and international portions of the total CO2 impact. The global location of emissions, which cannot be calculated using standard input-output analysis, and the variation of household impacts with income, have important ramifications for policies designed to lower consumer impacts on climate change, such as carbon taxes. The effectiveness and fairness of such policies hinges on a proper understanding of how income distributions, rebound effects, and international trade affect them. (author)

  13. Global Distribution of Active Volcanism on Io as Known at the End of the Galileo Mission

    Science.gov (United States)

    Lopes, Rosaly M. C.; Kamp, Lucas W.; Smythe, W. D.; Radebaugh, J.; Turtle, E.; Perry, J.; Bruno, B.

    2004-01-01

    Hot spots are manifestations of Io's mechanism of internal heating and heat transfer. Therefore, the global distribution of hot spots and their power output has important implications for how Io is losing heat. The end of the Galileo mission is an opportune time to revisit studies of the distribution of hot spots on Io, and to investigate the distribution of their power output.

  14. An Analysis of Security and Privacy Issues in Smart Grid Software Architectures on Clouds

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Kumbhare, Alok; Cao, Baohua; Prasanna, Viktor K.

    2011-07-09

    Power utilities globally are increasingly upgrading to Smart Grids that use bi-directional communication with the consumer to enable an information-driven approach to distributed energy management. Clouds offer features well suited for Smart Grid software platforms and applications, such as elastic resources and shared services. However, the security and privacy concerns inherent in an information rich Smart Grid environment are further exacerbated by their deployment on Clouds. Here, we present an analysis of security and privacy issues in a Smart Grids software architecture operating on different Cloud environments, in the form of a taxonomy. We use the Los Angeles Smart Grid Project that is underway in the largest U.S. municipal utility to drive this analysis that will benefit both Cloud practitioners targeting Smart Grid applications, and Cloud researchers investigating security and privacy.

  15. Measurement of software project management effectiveness

    OpenAIRE

    Demir, Kadir Alpaslan

    2008-01-01

    Approved for public release; distribution is unlimited. Evaluating, monitoring, and improving the effectiveness of project management can contribute to successful acquisition of software systems. In this dissertation, we introduce a quantitative metric for gauging the effectiveness of managing a software-development project. The metric may be used to evaluate and monitor project management effectiveness in software projects by project managers, technical managers, executive man...

  16. Free for All: Open Source Software

    Science.gov (United States)

    Schneider, Karen

    2008-01-01

    Open source software has become a catchword in libraryland. Yet many remain unclear about open source's benefits--or even what it is. So what is open source software (OSS)? It's software that is free in every sense of the word: free to download, free to use, and free to view or modify. Most OSS is distributed on the Web and one doesn't need to…

  17. The global distribution of ammonia emissions from seabird colonies

    Science.gov (United States)

    Riddick, S. N.; Dragosits, U.; Blackall, T. D.; Daunt, F.; Wanless, S.; Sutton, M. A.

    2012-08-01

    Seabird colonies represent a significant source of atmospheric ammonia (NH3) in remote maritime systems, producing a source of nitrogen that may encourage plant growth, alter terrestrial plant community composition and affect the surrounding marine ecosystem. To investigate seabird NH3 emissions on a global scale, we developed a contemporary seabird database covering a total seabird population of 261 million breeding pairs. We used this in conjunction with a bioenergetics model to estimate the mass of nitrogen excreted by all seabirds at each breeding colony. The results, combined with the findings of mid-latitude field studies of volatilization rates, were used to estimate the global distribution of NH3 emissions from seabird colonies on an annual basis. The largest uncertainty in our emission estimate concerns the potential temperature dependence of NH3 emission. To investigate this we calculated and compared temperature-independent emission estimates with a maximum feasible temperature-dependent emission, based on the thermodynamic dissociation and solubility equilibria. Using the temperature-independent approach, we estimate global NH3 emissions from seabird colonies at 404 Gg NH3 per year. By comparison, since most seabirds are located in relatively cold circumpolar locations, the thermodynamically dependent estimate is 136 Gg NH3 per year. Actual global emissions are expected to lie within these bounds, as other factors, such as non-linear interactions with water availability and surface infiltration, moderate the theoretical temperature response. Combining sources of error from temperature (±49%), seabird population estimates (±36%), variation in diet composition (±23%) and non-breeder attendance (±13%) gives a mid estimate with an overall uncertainty range of NH3 emission from seabird colonies of 270 [97-442] Gg NH3 per year. These emissions are environmentally relevant as they primarily occur as "hot-spots" in otherwise pristine environments with low anthropogenic
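
    Treating the four uncertainty sources as independent and combining them in quadrature (an assumption; the abstract does not state the exact combination rule) gives a result roughly consistent with the quoted 97-442 Gg range:

      import math

      relative_errors = {"temperature": 0.49, "population": 0.36,
                         "diet": 0.23, "non_breeder_attendance": 0.13}
      combined = math.sqrt(sum(e ** 2 for e in relative_errors.values()))
      mid = 270.0  # Gg NH3 per year, mid estimate quoted in the abstract
      print(f"combined relative uncertainty ~ {combined:.0%}")
      print(f"approximate range: {mid * (1 - combined):.0f} - {mid * (1 + combined):.0f} Gg NH3/yr")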

  18. COTS-based OO-component approach for software inter-operability and reuse (software systems engineering methodology)

    Science.gov (United States)

    Yin, J.; Oyaki, A.; Hwang, C.; Hung, C.

    2000-01-01

    The purpose of this research and study paper is to provide a summary description and results of rapid development accomplishments at NASA/JPL in the area of advanced distributed computing technology, using a Commercial-Off-The-Shelf (COTS)-based object-oriented component approach to open, inter-operable software development and software reuse.

  19. Global stability of stochastic high-order neural networks with discrete and distributed delays

    International Nuclear Information System (INIS)

    Wang Zidong; Fang Jianan; Liu Xiaohui

    2008-01-01

    High-order neural networks can be considered as an expansion of Hopfield neural networks, and have stronger approximation properties, faster convergence rates, greater storage capacity, and higher fault tolerance than lower-order neural networks. In this paper, the global asymptotic stability analysis problem is considered for a class of stochastic high-order neural networks with discrete and distributed time-delays. Based on a Lyapunov-Krasovskii functional and the stochastic stability analysis theory, several sufficient conditions are derived which guarantee the global asymptotic convergence of the equilibrium point in the mean square. It is shown that the stochastic high-order delayed neural networks under consideration are globally asymptotically stable in the mean square if two linear matrix inequalities (LMIs) are feasible, where the feasibility of the LMIs can be readily checked by the Matlab LMI toolbox. It is also shown that the main results in this paper cover some recently published works. A numerical example is given to demonstrate the usefulness of the proposed global stability criteria
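
    The criterion reduces stability checking to an LMI feasibility test (the record mentions the Matlab LMI toolbox). As a much simpler, hedged analogue using open-source tooling, the sketch below checks the classical delay-free Lyapunov condition A^T P + P A = -Q with P > 0; it is not the paper's delay-dependent criterion.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Illustrative only: the paper's result checks feasibility of two LMIs.
# Here we test the elementary Lyapunov condition for a delay-free linear
# system x' = A x by solving A^T P + P A = -Q and checking that P > 0.
A = np.array([[-2.0, 1.0],
              [0.5, -3.0]])
Q = np.eye(2)

P = solve_continuous_lyapunov(A.T, -Q)          # solves A^T P + P A = -Q
eigvals = np.linalg.eigvalsh((P + P.T) / 2.0)   # symmetrize before eigen-check
print("P positive definite (system stable):", bool(np.all(eigvals > 0)))
```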

  20. On numerical simulation of the global distribution of sulfate aerosol produced by a large volcanic eruption

    Energy Technology Data Exchange (ETDEWEB)

    Pudykiewicz, J.A.; Dastoor, A.P. [Atmospheric Environment Service, Quebec (Canada)]

    1994-12-31

    Volcanic eruptions play an important role in the global sulfur cycle of the Earth's atmosphere and can significantly perturb global atmospheric chemistry. The large amount of sulfate aerosol produced by the oxidation of SO2 injected into the atmosphere during volcanic eruptions also has a considerable influence on the radiative equilibrium of the Earth's climatic system. The submicron particles of the sulfate aerosol reflect solar radiation more effectively than they trap radiation in the infrared range. The effect of this is observed as cooling of the Earth's surface. The modification of the global radiation budget following a volcanic eruption can subsequently cause significant fluctuations of atmospheric variables on a subclimatic scale. The resulting perturbation of weather patterns has been observed and well documented since the eruptions of Mt. Krakatau and Mt. Tambora. The impact of the sulfate aerosol from volcanic eruptions on the radiative equilibrium of the Earth's atmosphere was also confirmed by studies done with Global Circulation Models designed to simulate climate. The objective of the present paper is to present a simple and effective method to estimate the global distribution of the sulfate aerosol produced as a consequence of volcanic eruptions. In this study we will present results of the simulation of the global distribution of sulfate aerosol from the eruption of Mt. Pinatubo.

  1. The software environment of RODOS

    International Nuclear Information System (INIS)

    Schuele, O.; Rafat, M.; Kossykh, V.

    1996-01-01

    The Software Environment of RODOS provides tools for processing and managing a large variety of different types of information, including those which are categorized in terms of meteorology, radiology, economy, emergency actions and countermeasures, rules, preferences, facts, maps, statistics, catalogues, models and methods. The main tasks of the Operating Subsystem OSY, which is based on the Client-Server Model, are the control of system operation, data management, and the exchange of information among various modules as well as the interaction with users in distributed computer systems. The paper describes the software environment of RODOS, in particular, the individual modules of its Operating Subsystem OSY, its distributed database, the geographical information system RoGIS, the on-line connections to radiological and meteorological networks and the software environment for the integration of external programs into the RODOS system

  2. The software environment of RODOS

    International Nuclear Information System (INIS)

    Schuele, O.; Rafat, M.

    1998-01-01

    The Software Environment of RODOS provides tools for processing and managing a large variety of different types of information, including those which are categorised in terms of meteorology, radiology, economy, emergency actions and countermeasures, rules, preferences, facts, maps, statistics, catalogues, models and methods. The main tasks of the Operating Subsystem OSY, which is based on the Client-Server Model, are the control of system operation, data management, and the exchange of information among various modules as well as the interaction with users in distributed computer systems. The paper describes the software environment of RODOS, in particular, the individual modules of its Operating Subsystem OSY, its distributed database, the geographical information system RoGIS, the on-line connections to radiological and meteorological networks and the software environment for the integration of external programs into the RODOS system. (orig.)

  3. The software environment of RODOS

    Energy Technology Data Exchange (ETDEWEB)

    Schuele, O; Rafat, M [Forschungszentrum Karlsruhe, Institut fuer Neutronenphysik und Reaktortechnik, Karlsruhe (Germany); Kossykh, V [Scientific Production Association 'TYPHOON', Emergency Centre, Obninsk (Russian Federation)]

    1996-07-01

    The Software Environment of RODOS provides tools for processing and managing a large variety of different types of information, including those which are categorized in terms of meteorology, radiology, economy, emergency actions and countermeasures, rules, preferences, facts, maps, statistics, catalogues, models and methods. The main tasks of the Operating Subsystem OSY, which is based on the Client-Server Model, are the control of system operation, data management, and the exchange of information among various modules as well as the interaction with users in distributed computer systems. The paper describes the software environment of RODOS, in particular, the individual modules of its Operating Subsystem OSY, its distributed database, the geographical information system RoGIS, the on-line connections to radiological and meteorological networks and the software environment for the integration of external programs into the RODOS system.

  4. Considerations for control system software verification and validation specific to implementations using distributed processor architectures

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.

    1993-01-01

    Until recently, digital control systems have been implemented on centralized processing systems to function in one of several ways: (1) as a single processor control system; (2) as a supervisor at the top of a hierarchical network of multiple processors; or (3) in a client-server mode. Each of these architectures uses a very different set of communication protocols. The latter two architectures also belong to the category of distributed control systems. Distributed control systems can have a central focus, as in the cases just cited, or be quite decentralized in a loosely coupled, shared responsibility arrangement. This last architecture is analogous to autonomous hosts on a local area network. Each of the architectures identified above will have a different set of architecture-associated issues to be addressed in the verification and validation activities during software development. This paper summarizes results of efforts to identify, describe, contrast, and compare these issues

  5. Adaptation of Black-Box Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2008-01-01

    Full Text Available The globalization of the software market creates serious problems for software companies. Increased competition forces them to release new software products at ever shorter intervals, which shortens both time to market and product life cycles and, in turn, the time available for research and development. Competitive pressure also drives prices down, reducing the return on investment. A key challenge for software companies is therefore an effective research and development process that keeps these problems under control. One way to do so is to reuse existing software components, adapting them to new functionality or to mismatched interfaces; complete redevelopment of a software product is more expensive and time consuming than developing adapter components. The approach introduced here presents a novel technique, together with a supportive environment, that enables developers to cope with the adaptability of black-box software components. The environment checks the compatibility of black-box components against their specifications, and generated adapter components take over the adaptation and extend the functionality. In addition, a pool of software components, consisting of black-box components and adapter components that can be connected on demand, can be used to compose applications that satisfy customer needs.
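
    A minimal, hypothetical sketch of the adapter idea described above: a black-box component with a mismatched interface is wrapped by a small adapter component that exposes the interface the client expects. All names are illustrative and not taken from the paper.

```python
class LegacyTemperatureSensor:
    """Black-box component: its interface cannot be changed."""
    def read_fahrenheit(self) -> float:
        return 98.6


class CelsiusSensorAdapter:
    """Adapter component: provides the interface the new application expects."""
    def __init__(self, wrapped: LegacyTemperatureSensor):
        self._wrapped = wrapped

    def read_celsius(self) -> float:
        # Adaptation logic: convert the black-box result to the expected unit.
        return (self._wrapped.read_fahrenheit() - 32.0) * 5.0 / 9.0


def application(sensor) -> None:
    """Client code composed from the component pool; expects read_celsius()."""
    print(f"{sensor.read_celsius():.1f} degrees C")


application(CelsiusSensorAdapter(LegacyTemperatureSensor()))
```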

  6. Global Mobile Satellite Service Interference Analysis for the AeroMACS

    Science.gov (United States)

    Wilson, Jeffrey D.; Apaza, Rafael D.; Hall, Ward; Phillips, Brent

    2013-01-01

    The AeroMACS (Aeronautical Mobile Airport Communications System), which is based on the IEEE 802.16-2009 mobile wireless standard, is envisioned as the wireless network which will cover all areas of airport surfaces for next generation air transportation. It is expected to be implemented in the 5091-5150 MHz frequency band which is also occupied by mobile satellite service uplinks. Thus the AeroMACS must be designed to avoid interference with this incumbent service. Simulations using Visualyse software were performed utilizing a global database of 6207 airports. Variations in base station and subscriber antenna distribution and gain pattern were examined. Based on these simulations, recommendations for global airport base station and subscriber antenna power transmission limitations are provided.

  7. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    as the software development and distribution by a set of actors dependent on each other and the ecosystem. We commence on the hypothesis that the establishment of a software ecosystem on the telemedicine services of Denmark would address these issues and investigate how a software ecosystem can foster...... the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities...... of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) strong social network of actors, (iv) robust business...

  8. Evaluation and selection of security products for authentication of computer software

    Science.gov (United States)

    Roenigk, Mark W.

    2000-04-01

    Software piracy is estimated to cost software companies over eleven billion dollars per year in lost revenue worldwide. Over fifty-three percent of all intellectual property in the form of software is pirated on a global basis. Software piracy also has a dramatic effect on employment figures for the information industry: in the US alone, over 130,000 jobs are lost annually as a result of software piracy.

  9. Spatial and temporal patterns of global onshore wind speed distribution

    International Nuclear Information System (INIS)

    Zhou, Yuyu; Smith, Steven J

    2013-01-01

    Wind power, a renewable energy source, can play an important role in electrical energy generation. Information regarding wind energy potential is important both for energy related modeling and for decision-making in the policy community. While wind speed datasets with high spatial and temporal resolution are often ultimately used for detailed planning, simpler assumptions are often used in analysis work. An accurate representation of the wind speed frequency distribution is needed in order to properly characterize wind energy potential. Using a power density method, this study estimated global variation in wind parameters as fitted to a Weibull density function using NCEP/climate forecast system reanalysis (CFSR) data over land areas. The Weibull distribution performs well in fitting the time series wind speed data at most locations according to R², root mean square error, and power density error. The wind speed frequency distribution, as represented by the Weibull k parameter, exhibits a large amount of spatial variation, a regionally varying amount of seasonal variation, and relatively low decadal variation. We also analyzed the potential error in wind power estimation when a commonly assumed Rayleigh distribution (Weibull k = 2) is used. We find that the assumption of the same Weibull parameter across large regions can result in non-negligible errors. While large-scale wind speed data are often presented in the form of mean wind speeds, these results highlight the need to also provide information on the wind speed frequency distribution. (letter)
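
    A hedged sketch of the core fitting step: estimating the Weibull shape (k) and scale parameters from a wind speed series by maximum likelihood, here on synthetic data rather than the CFSR reanalysis and power density method used in the study.

```python
import numpy as np
from scipy import stats

# Synthetic wind speed series for illustration only; the study fits
# NCEP/CFSR reanalysis data with a power density method.
rng = np.random.default_rng(0)
wind_speeds = stats.weibull_min.rvs(c=2.0, scale=7.0, size=5000, random_state=rng)

# Fix the location parameter at zero so only the shape (k) and scale are fitted.
k, loc, scale = stats.weibull_min.fit(wind_speeds, floc=0)
print(f"fitted Weibull shape k = {k:.2f}, scale = {scale:.2f} m/s")

# A Rayleigh distribution corresponds to the special case k = 2; comparing the
# fitted k with 2 indicates how much error that common assumption introduces.
```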

  10. Software process in Geant4

    International Nuclear Information System (INIS)

    Cosmo, G.

    2001-01-01

    Since its earliest years of R&D, the GEANT4 simulation toolkit has been developed following software process standards which dictated the overall evolution of the project. The complexity of the software involved, the wide areas of application of the software product, the huge amount of code and Category complexity, and the size and distributed nature of the Collaboration itself are all ingredients which involve and correlate a wide variety of software processes. Although in 'production' and available to the public since December 1998, the GEANT4 software product includes Category Domains which are still under active development. Therefore they require different treatment also in terms of improvement of the development cycle, system testing and user support. This paper describes some of the software processes as they are applied in GEANT4 for development, testing and maintenance of the software

  11. Re-engineering software systems in the Department of Defense using integrated computer aided software engineering tools

    OpenAIRE

    Jennings, Charles A.

    1992-01-01

    Approved for public release; distribution is unlimited The Department of Defense (DoD) is plagued with severe cost overruns and delays in developing software systems. Existing software within the DoD, some of it developed 15 to 20 years ago, requires continual maintenance and modification. Major difficulties arise with maintaining older systems due to cryptic source code and a lack of adequate documentation. To remedy this situation, the DoD is pursuing the integrated computer aided software engi...

  12. LHCb software strategy

    CERN Document Server

    Van Herwijnen, Eric

    1998-01-01

    This document describes the software strategy of the LHCb experiment. The main objective is to reuse designs and code wherever possible; we will implement an architecturally driven design process; this architectural process will be implemented using Object Technology; we aim for platform independence; we will try to take advantage of distributed computing and will use industry standards, commercial software and profit from HEP developments; we will implement a common software process and development environment. One of the major problems that we are immediately faced with is the conversion of our current code from Fortran into an Object Oriented language and the conversion of our current developers to Object technology. Some technical terms related to OO programming are defined in Annex A.1

  13. BETR Global - A geographically explicit global-scale multimedia contaminant fate model

    Energy Technology Data Exchange (ETDEWEB)

    Macleod, M.; Waldow, H. von; Tay, P.; Armitage, J. M.; Wohrnschimmel, H.; Riley, W.; McKone, T. E.; Hungerbuhler, K.

    2011-04-01

    We present two new software implementations of the BETR Global multimedia contaminant fate model. The model uses steady-state or non-steady-state mass-balance calculations to describe the fate and transport of persistent organic pollutants using a desktop computer. The global environment is described using a database of long-term average monthly conditions on a 15° x 15° grid. We demonstrate BETR Global by modeling the global sources, transport, and removal of decamethylcyclopentasiloxane (D5).
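
    A hedged sketch of the kind of steady-state mass-balance calculation such multimedia models perform: with first-order inter-compartment transfer and loss rates collected in a matrix K and constant emissions s, the steady state solves K m + s = 0. The rates and emissions below are invented for illustration and are not BETR Global parameters.

```python
import numpy as np

# Compartments: 0 = air, 1 = water, 2 = soil (toy three-box system).
# dm/dt = K m + s = 0  ->  m = solution of K m = -s
K = np.array([
    [-0.50, 0.01, 0.02],   # losses from air; gains from water and soil
    [ 0.10, -0.20, 0.00],  # air -> water deposition; losses from water
    [ 0.15, 0.00, -0.05],  # air -> soil deposition; losses from soil
])
s = np.array([100.0, 0.0, 0.0])  # constant emission into air (e.g. kg/h)

m_steady = np.linalg.solve(K, -s)
print("steady-state masses per compartment:", np.round(m_steady, 1))
```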

  14. The Value of Open Source Software Tools in Qualitative Research

    Science.gov (United States)

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  15. Long-term preservation of analysis software environment

    International Nuclear Information System (INIS)

    Toppe Larsen, Dag; Blomer, Jakob; Buncic, Predrag; Charalampidis, Ioannis; Haratyunyan, Artem

    2012-01-01

    Long-term preservation of scientific data represents a challenge to experiments, especially regarding the analysis software. Preserving data is not enough; the full software and hardware environment is needed. Virtual machines (VMs) make it possible to preserve hardware “in software”. A complete infrastructure package has been developed for easy deployment and management of VMs, based on the CERN virtual machine (CernVM). Further, an HTTP-based file system, the CernVM file system (CVMFS), is used for the distribution of the software. It is possible to process data with any given software version, and a matching, regenerated VM version. A point-and-click web user interface is being developed for setting up the complete processing chain, including VM and software versions, number and type of processing nodes, and the particular type of analysis and data. This paradigm also allows for distributed cloud-computing on private and public clouds, for both legacy and contemporary experiments.

  16. Global Distribution, Public Health and Clinical Impact of the Protozoan Pathogen Cryptosporidium

    Directory of Open Access Journals (Sweden)

    Lorenza Putignani

    2010-01-01

    Full Text Available Cryptosporidium spp. are coccidians, oocyst-forming apicomplexan protozoa, which complete their life cycle both in humans and animals, through zoonotic and anthroponotic transmission, causing cryptosporidiosis. The global burden of this disease is still under-ascertained, due to a puzzling transmission modality, only partially unveiled, and to a plethora of detection systems that are still inadequate or only partially applied for worldwide surveillance. In children, the burden of cryptosporidiosis is even less well recorded and often misidentified, due to physiological reasons such as the immature immunological response at an early age. Furthermore, malnutrition in underdeveloped countries and clinical underestimation of protozoan etiology in developed countries contribute to the underestimation of the worldwide burden. Principal key indicators of the parasite distribution were associated with environmental determinants (e.g., geographic and temporal clusters, etc.) and host determinants of the infection (e.g., age, immunological status, travels, community behaviours). The distribution was geographically mapped to provide an updated picture of the global parasite ecosystems. The present paper aims to provide, by a critical analysis of existing literature, a link between observational epidemiological records and new insights on public health, and the diagnostic and clinical impact of cryptosporidiosis.

  17. Designing and Implementing a Distributed System Architecture for the Mars Rover Mission Planning Software (Maestro)

    Science.gov (United States)

    Goldgof, Gregory M.

    2005-01-01

    Distributed systems allow scientists from around the world to plan missions concurrently, while being updated on the revisions of their colleagues in real time. However, permitting multiple clients to simultaneously modify a single data repository can quickly lead to data corruption or inconsistent states between users. Since our message broker, the Java Message Service, does not ensure that messages will be received in the order they were published, we must implement our own numbering scheme to guarantee that changes to mission plans are performed in the correct sequence. Furthermore, distributed architectures must ensure that as new users connect to the system, they synchronize with the database without missing any messages or falling into an inconsistent state. Robust systems must also guarantee that all clients will remain synchronized with the database even in the case of multiple client failure, which can occur at any time due to lost network connections or a user's own system instability. The final design for the distributed system behind the Mars rover mission planning software fulfills all of these requirements and upon completion will be deployed to MER at the end of 2005 as well as Phoenix (2007) and MSL (2009).
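
    A minimal sketch of the sequence-numbering idea described above: each published plan change carries a monotonically increasing sequence number, and the receiver buffers out-of-order messages until the gap is filled. The names are illustrative, not taken from the Maestro code base.

```python
class OrderedPlanReceiver:
    """Applies plan changes strictly in sequence-number order."""

    def __init__(self):
        self.next_expected = 1   # next sequence number to apply
        self.buffer = {}         # out-of-order messages, keyed by sequence number

    def on_message(self, seq: int, change: str):
        self.buffer[seq] = change
        # Apply every buffered change that is now contiguous.
        while self.next_expected in self.buffer:
            self.apply(self.buffer.pop(self.next_expected))
            self.next_expected += 1

    def apply(self, change: str):
        print("applying:", change)


receiver = OrderedPlanReceiver()
receiver.on_message(2, "move rover waypoint")   # held back: seq 1 still missing
receiver.on_message(1, "add drive activity")    # releases 1, then 2, in order
```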

  18. Software for the decision making support on the design of natural gas distribution networks; Software de apoio a decisao para o projeto de rede urbanas de distribuicao de gas natural

    Energy Technology Data Exchange (ETDEWEB)

    Goldbarg, Marco C.; Goldbarg, Elizabeth F.G. [Universidade Federal do Rio Grande do Norte (UFRN), Natal, RN (Brazil); Campos, Michel F. [PETROBRAS, Rio de Janeiro, RJ (Brazil)]

    2004-07-01

    This work presents a computational system to aid the decision making process of installing new networks to distribute natural gas in an urban area. The system is called POM-DIGAS. The purpose of the software is to optimize the design of natural gas distribution networks. The general optimization problem comprises two combinatorial problems. The first one refers to the definition of the network layout. In this problem the objective is to minimize the total length of the network. The second combinatorial problem considers the pipe size optimization in which one must choose the diameters of the pipes regarding the demand requirements. POM-DIGAS is a composite of models and algorithms developed to tackle the two combinatorial problems. Furthermore, the software has a geographic information mode, a tool to automatically acquire several types of data concerning the project and a mode with distinct flow equations in order to allow the utilization of different methodologies for computing the network flows. The system was applied to a case study developed for the city of Natal, Rio Grande do Norte. This work was supported by RedeGasEnergia, FINEP and PETROBRAS. (author)
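
    The layout subproblem (minimising total network length over candidate pipe routes) resembles a minimum spanning tree computation; the sketch below uses Prim's algorithm on a toy candidate graph. This is a generic illustration under that assumption, not the POM-DIGAS models, which also handle demands, street constraints and pipe sizing.

```python
import heapq

def minimum_spanning_tree(nodes, edges):
    """edges: dict {(u, v): length}. Returns the chosen (u, v) pairs (Prim)."""
    adj = {n: [] for n in nodes}
    for (u, v), w in edges.items():
        adj[u].append((w, v))
        adj[v].append((w, u))
    visited = {nodes[0]}
    heap = [(w, nodes[0], v) for w, v in adj[nodes[0]]]
    heapq.heapify(heap)
    tree = []
    while heap and len(visited) < len(nodes):
        w, u, v = heapq.heappop(heap)
        if v in visited:
            continue
        visited.add(v)
        tree.append((u, v))
        for w2, nxt in adj[v]:
            if nxt not in visited:
                heapq.heappush(heap, (w2, v, nxt))
    return tree

# Hypothetical candidate routes (lengths in km) between a city gate and demand nodes.
nodes = ["city_gate", "A", "B", "C"]
edges = {("city_gate", "A"): 1.2, ("A", "B"): 0.8,
         ("city_gate", "C"): 2.0, ("B", "C"): 0.7}
print(minimum_spanning_tree(nodes, edges))
```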

  19. Global exponential stability of cellular neural networks with continuously distributed delays and impulses

    International Nuclear Information System (INIS)

    Wang Yixuan; Xiong Wanmin; Zhou Qiyuan; Xiao Bing; Yu Yuehua

    2006-01-01

    In this Letter cellular neural networks with continuously distributed delays and impulses are considered. Sufficient conditions for the existence and global exponential stability of a unique equilibrium point are established by using the fixed point theorem and differential inequality techniques. The results of this Letter are new and they complement previously known results

  20. Model checking software for phylogenetic trees using distribution and database methods

    Directory of Open Access Journals (Sweden)

    Requeno José Ignacio

    2013-12-01

    Full Text Available Model checking, a generic and formal paradigm stemming from computer science based on temporal logics, has been proposed for the study of biological properties that emerge from the labeling of the states defined over the phylogenetic tree. This strategy allows us to use generic software tools already present in the industry. However, the performance of traditional model checking is penalized when scaling the system for large phylogenies. To this end, two strategies are presented here. The first one consists of partitioning the phylogenetic tree into a set of subgraphs each one representing a subproblem to be verified so as to speed up the computation time and distribute the memory consumption. The second strategy is based on uncoupling the information associated to each state of the phylogenetic tree (mainly, the DNA sequence and exporting it to an external tool for the management of large information systems. The integration of all these approaches outperforms the results of monolithic model checking and helps us to execute the verification of properties in a real phylogenetic tree.
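
    A hedged sketch of the first strategy: cutting the phylogenetic tree into subtrees of bounded size so that each piece can be verified as an independent subproblem. The greedy bottom-up rule by leaf count used below is illustrative only, not the authors' partitioning method.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    children: list = field(default_factory=list)

def partition(node, max_leaves, partitions):
    """Return leaves still attached; cut off subtrees once they reach max_leaves."""
    if not node.children:
        return 1
    attached = 0
    for child in node.children:
        attached += partition(child, max_leaves, partitions)
    if attached >= max_leaves:
        partitions.append(node)   # verify this subtree as its own subproblem
        return 0                  # detach it from the remaining tree
    return attached

# Tiny example tree: ((a,b),(c,d))
root = Node("r", [Node("x", [Node("a"), Node("b")]),
                  Node("y", [Node("c"), Node("d")])])
parts = []
if partition(root, max_leaves=2, partitions=parts):
    parts.append(root)            # whatever remains attached is the last piece
print([p.name for p in parts])    # e.g. ['x', 'y']
```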

  1. Energy Science and Technology Software Center

    Energy Technology Data Exchange (ETDEWEB)

    Kidd, E.M.

    1995-03-01

    The Energy Science and Technology Software Center (ESTSC) is the U.S. Department of Energy's (DOE) centralized software management facility. It is operated under contract for the DOE Office of Scientific and Technical Information (OSTI) and is located in Oak Ridge, Tennessee. The ESTSC is authorized by DOE and the U.S. Nuclear Regulatory Commission (NRC) to license and distribute DOE- and NRC-sponsored software developed by national laboratories and other facilities and by contractors of DOE and NRC. ESTSC also has selected software from the Nuclear Energy Agency (NEA) of the Organisation for Economic Cooperation and Development (OECD) through a software exchange agreement that DOE has with the agency.

  2. Boundary Spanning in Global Software Development

    DEFF Research Database (Denmark)

    Søderberg, Anne-Marie; Romani, Laurence

    imbalances of power, exacerbated in the case of an Indian vendor and a European client, need to be taken into account. The paper thus contributes with a more context sensitive understanding of inter-organizational boundary work. Taking the vendor perspective also leads to problematization of common...... of Indian IT vendor managers who are responsible for developing client relations and coordinating complex global development projects. The authors revise a framework of boundary spanning leadership practices to adapt it to an offshore outsourcing context. The empirical investigation highlights how...

  3. A Web-Based Learning System for Software Test Professionals

    Science.gov (United States)

    Wang, Minhong; Jia, Haiyang; Sugumaran, V.; Ran, Weijia; Liao, Jian

    2011-01-01

    Fierce competition, globalization, and technology innovation have forced software companies to search for new ways to improve competitive advantage. Web-based learning is increasingly being used by software companies as an emergent approach for enhancing the skills of knowledge workers. However, the current practice of Web-based learning is…

  4. A voting-based star identification algorithm utilizing local and global distribution

    Science.gov (United States)

    Fan, Qiaoyun; Zhong, Xuyang; Sun, Junhua

    2018-03-01

    A novel star identification algorithm based on a voting scheme is presented in this paper. In the proposed algorithm, the global distribution and local distribution of sensor stars are fully utilized, and a stratified voting scheme is adopted to obtain the candidates for sensor stars. Database optimization is employed to reduce the memory requirement and improve the robustness of the proposed algorithm. The simulation shows that the proposed algorithm exhibits a 99.81% identification rate with 2-pixel standard deviations of positional noise and 0.322-Mv magnitude noise. Compared with two similar algorithms, the proposed algorithm is more robust towards noise, and its average identification time and required memory are lower. Furthermore, the real sky test shows that the proposed algorithm performs well on real star images.
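
    As a generic illustration of the voting idea (not the paper's stratified scheme, which also exploits the local and global star distribution and an optimised database), the sketch below lets each observed star pair vote for catalogue stars whose pairwise angular distance matches within a tolerance.

```python
def vote(observed_pairs, catalogue_pairs, tol=1e-3):
    """observed_pairs: (sensor_i, sensor_j, angle); catalogue_pairs: (cat_a, cat_b, angle)."""
    votes = {}
    for i, j, d_obs in observed_pairs:
        for a, b, d_cat in catalogue_pairs:
            if abs(d_obs - d_cat) < tol:
                # Each matching catalogue pair votes for both possible pairings.
                for sensor_star, cat_star in ((i, a), (i, b), (j, a), (j, b)):
                    votes.setdefault(sensor_star, {}).setdefault(cat_star, 0)
                    votes[sensor_star][cat_star] += 1
    # The catalogue star with the most votes becomes the candidate for each sensor star.
    return {s: max(c, key=c.get) for s, c in votes.items()}

# Invented angular distances (radians) and catalogue identifiers for illustration.
observed = [(0, 1, 0.1021), (1, 2, 0.2534)]
catalogue = [("HIP_1", "HIP_2", 0.1020), ("HIP_2", "HIP_3", 0.2533)]
print(vote(observed, catalogue))
```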

  5. A global analysis of recent experimental results: How well determined are the parton distribution functions?

    International Nuclear Information System (INIS)

    Morfin, J.G.

    1990-08-01

    Following is a brief summary of the results of an analysis of experimental data performed to extract the parton distribution functions. In contrast to other global analyses, this study investigated how the fit results depend on: Experimental Systematic Errors; Kinematic Cuts on the Analyzed Data; and Choice of Initial Functional Forms, with a prime goal being a close look at the range of low-x behavior allowed by data. This is crucial for predictions for the SSC/LHC, HERA, and even at Tevatron Collider energies. Since all details can be found in the just released Fermilab preprint Parton Distributions from a Global QCD Analysis of Deep Inelastic Scattering and Lepton-Pair Production by J. G. M. and Wu-Ki Tung, this summary will be only a brief outline of major results. 11 refs., 13 figs

  6. A Reusable Software Architecture for Small Satellite AOCS Systems

    DEFF Research Database (Denmark)

    Alminde, Lars; Bendtsen, Jan Dimon; Laursen, Karl Kaas

    2006-01-01

    This paper concerns the software architecture called Sophy, which is an abbreviation for Simulation, Observation, and Planning in HYbrid systems. We present a framework that allows execution of hybrid dynamical systems in an on-line distributed computing environment, which includes interaction...... with both hardware and on-board software. Some of the key issues addressed by the framework are automatic translation of mathematical specifications of hybrid systems into executable software entities, management of execution of coupled models in a parallel distributed environment, as well as interaction...... with external components, hardware and/or software, through generic interfaces. Sophy is primarily intended as a tool for development of model based reusable software for the control and autonomous functions of satellites and/or satellite clusters....

  7. Towards reference architectures as an enabler for software ecosystems

    DEFF Research Database (Denmark)

    Knodel, Jens; Manikas, Konstantinos

    2016-01-01

    Software ecosystems - a topic with increasingly growing interest in academia and industry in the past decade - arguably revolutionized many aspects of industrial software engineering (business models, architectures, platforms, project executions, collaboration models, distribution of assets......, to name a few). Software ecosystems enable the contribution of external actors with distinct center a common technology and the potential distribution of the actor contributions to an existing user set. Reference architectures have been proven successful and beneficial for software product lines...... and traditional software development within distinct domains. They arguably come with a set of benefits that severely counterweights the additional effort of design and implementation. But what is the role of reference architectures in an ecosystem setting? In this position paper, we argue for the use...

  8. 76 FR 21033 - International Business Machines (IBM), Sales and Distribution Business Unit, Global Sales...

    Science.gov (United States)

    2011-04-14

    ... DEPARTMENT OF LABOR Employment and Training Administration [TA-W-74,364] International Business Machines (IBM), Sales and Distribution Business Unit, Global Sales Solution Department, Off-Site Teleworker in Centerport, New York; Notice of Affirmative Determination Regarding Application for Reconsideration By application dated November 29, 2011,...

  9. Producing and supporting sharable software

    International Nuclear Information System (INIS)

    Johnstad, H.; Nicholls, J.

    1987-02-01

    A survey is reported that addressed the question of shareable software for the High Energy Physics community. Statistics are compiled for the responses of 54 people attending a conference on the subject of shareable software to a questionnaire which addressed the usefulness of shareable software, preference of programming language, and source management tools. The results are found to reflect a continued need for shareable software in the High Energy Physics community and that this effort be performed in coordination. A strong mandate is also claimed for large facilities to support the community with software and that these facilities should act as distribution points. Considerable interest is expressed in languages other than FORTRAN, and the desire for standards or rules in programming is expressed. A need is identified for source management tools

  10. Raising Virtual Laboratories in Australia onto global platforms

    Science.gov (United States)

    Wyborn, L. A.; Barker, M.; Fraser, R.; Evans, B. J. K.; Moloney, G.; Proctor, R.; Moise, A. F.; Hamish, H.

    2016-12-01

    Across the globe, Virtual Laboratories (VLs), Science Gateways (SGs), and Virtual Research Environments (VREs) are being developed that enable users who are not co-located to actively work together at various scales to share data, models, tools, software, workflows, best practices, etc. Outcomes range from enabling `long tail' researchers to more easily access specific data collections, to facilitating complex workflows on powerful supercomputers. In Australia, government funding has facilitated the development of a range of VLs through the National eResearch Collaborative Tools and Resources (NeCTAR) program. The VLs provide highly collaborative, research-domain oriented, integrated software infrastructures that meet user community needs. Twelve VLs have been funded since 2012, including the Virtual Geophysics Laboratory (VGL); Virtual Hazards, Impact and Risk Laboratory (VHIRL); Climate and Weather Science Laboratory (CWSLab); Marine Virtual Laboratory (MarVL); and Biodiversity and Climate Change Virtual Laboratory (BCCVL). These VLs share similar technical challenges, with common issues emerging on integration of tools, applications and access data collections via both cloud-based environments and other distributed resources. While each VL began with a focus on a specific research domain, communities of practice have now formed across the VLs around common issues, and facilitate identification of best practice case studies, and new standards. As a result, tools are now being shared where the VLs access data via data services using international standards such as ISO, OGC, W3C. The sharing of these approaches is starting to facilitate re-usability of infrastructure and is a step towards supporting interdisciplinary research. Whilst the focus of the VLs are Australia-centric, by using standards, these environments are able to be extended to analysis on other international datasets. Many VL datasets are subsets of global datasets and so extension to global is a

  11. Including Remote Participants and Artifacts: Visual, Audio, and Tactile Modalities in an Ethnographic Study of Globally Distributed Engineers

    DEFF Research Database (Denmark)

    Bjørn, Pernille; Pederson, Thomas

    2011-01-01

    In this paper we study how globally distributed Danish and Indian engineers co-construct and reconfigure a shared socio-technical collaborative place for global collaborative interaction: War Room meetings. We investigate the empirical case of War Room meetings based on three modalities in which ...

  12. Global direct pressures on biodiversity by large-scale metal mining: Spatial distribution and implications for conservation.

    Science.gov (United States)

    Murguía, Diego I; Bringezu, Stefan; Schaldach, Rüdiger

    2016-09-15

    Biodiversity loss is widely recognized as a serious global environmental change process. While large-scale metal mining activities do not belong to the top drivers of such change, these operations exert or may intensify pressures on biodiversity by adversely changing habitats, directly and indirectly, at local and regional scales. So far, analyses of global spatial dynamics of mining and its burden on biodiversity focused on the overlap between mines and protected areas or areas of high value for conservation. However, it is less clear how operating metal mines are globally exerting pressure on zones of different biodiversity richness; a similar gap exists for unmined but known mineral deposits. By using vascular plants' diversity as a proxy to quantify overall biodiversity, this study provides a first examination of the global spatial distribution of mines and deposits for five key metals across different biodiversity zones. The results indicate that mines and deposits are not randomly distributed, but concentrated within intermediate and high diversity zones, especially bauxite and silver. In contrast, iron, gold, and copper mines and deposits are closer to a more proportional distribution while showing a high concentration in the intermediate biodiversity zone. Considering the five metals together, 63% and 61% of available mines and deposits, respectively, are located in intermediate diversity zones, comprising 52% of the global land terrestrial surface. 23% of mines and 20% of ore deposits are located in areas of high plant diversity, covering 17% of the land. 13% of mines and 19% of deposits are in areas of low plant diversity, comprising 31% of the land surface. Thus, there seems to be potential for opening new mines in areas of low biodiversity in the future. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Estimating the daily global solar radiation spatial distribution from diurnal temperature ranges over the Tibetan Plateau in China

    International Nuclear Information System (INIS)

    Pan, Tao; Wu, Shaohong; Dai, Erfu; Liu, Yujie

    2013-01-01

    Highlights: ► Bristow–Campbell model was calibrated and validated over the Tibetan Plateau. ► Develop a simple method to rasterise the daily global solar radiation and get gridded information. ► The daily global solar radiation spatial distribution over the Tibetan Plateau was estimated. - Abstract: Daily global solar radiation is fundamental to most ecological and biophysical processes because it plays a key role in the local and global energy budget. However, gridded information about the spatial distribution of solar radiation is limited. This study aims to parameterise the Bristow–Campbell model for the daily global solar radiation estimation in the Tibetan Plateau and propose a method to rasterise the daily global solar radiation. Observed daily solar radiation and diurnal temperature data from eleven stations over the Tibetan Plateau during 1971–2010 were used to calibrate and validate the Bristow–Campbell radiation model. The extra-terrestrial radiation and clear sky atmospheric transmittance were calculated on a Geographic Information System (GIS) platform. Results show that the Bristow–Campbell model performs well after adjusting the parameters; the average Pearson's correlation coefficients (r), Nash–Sutcliffe equation (NSE), ratio of the root mean square error to the standard deviation of measured data (RSR), and root mean-square error (RMSE) of the 11 stations are 0.85, 2.81 MJ m⁻² day⁻¹, 0.3 and 0.77 respectively. Gridded maximum and minimum average temperature data were obtained using Parameter-elevation Regressions on Independent Slopes Model (PRISM) and validated by the Chinese Ecosystem Research Network (CERN) stations' data. The spatial daily global solar radiation distribution pattern was estimated and analysed by combining the solar radiation model (Bristow–Campbell model) and meteorological interpolation model (PRISM). Based on the overall results, it can be concluded that a calibrated Bristow–Campbell performs well
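
    A hedged sketch of the Bristow–Campbell form: daily global solar radiation is estimated from extraterrestrial radiation Ra and the diurnal temperature range dT through an empirical transmittance, Rs = Ra · A · (1 − exp(−B · dT^C)). The coefficients below are illustrative placeholders; the study calibrates them per station over the Tibetan Plateau and computes Ra on a GIS platform.

```python
import math

def bristow_campbell(ra_mj_m2_day, delta_t, a=0.75, b=0.01, c=2.4):
    """Daily global solar radiation from extraterrestrial radiation and the
    diurnal temperature range, using the Bristow-Campbell transmittance form.
    The coefficients a, b, c are placeholders, not the calibrated values."""
    return ra_mj_m2_day * a * (1.0 - math.exp(-b * delta_t ** c))

# Example: Ra = 30 MJ m^-2 day^-1 and a 12 degree C diurnal temperature range.
print(round(bristow_campbell(ra_mj_m2_day=30.0, delta_t=12.0), 2), "MJ m^-2 day^-1")
```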

  14. The optimisation of a water distribution system using Bentley WaterGEMS software

    Directory of Open Access Journals (Sweden)

    Świtnicka Karolina

    2017-01-01

    Full Text Available The proper maintenance of water distribution systems (WDSs) requires operators to take multiple actions in order to ensure optimal functioning, and usually all requirements have to be adjusted simultaneously. Therefore, the decision-making process is often supported by multi-criteria optimisation methods. Significant improvements in the operating conditions of WDSs can be achieved by connecting small water supply networks into group systems. Among the many potential tools supporting advanced maintenance and management of WDSs, particularly valuable are those that can find an optimal solution through an implemented metaheuristic method, such as the genetic algorithm. In this paper, an exemplary optimisation of WDS functioning is presented for a group water supply system. The optimised parameters included: maximisation of water flow velocity, regulation of pressure head, minimisation of water retention time in the network (water age) and minimisation of pump energy consumption. All simulations were performed in Bentley WaterGEMS software.

  15. Global marine plankton functional type biomass distributions: Phaeocystis spp.

    Directory of Open Access Journals (Sweden)

    C. Widdicombe

    2012-09-01

    Full Text Available The planktonic haptophyte Phaeocystis has been suggested to play a fundamental role in the global biogeochemical cycling of carbon and sulphur, but little is known about its global biomass distribution. We have collected global microscopy data of the genus Phaeocystis and converted abundance data to carbon biomass using species-specific carbon conversion factors. Microscopic counts of single-celled and colonial Phaeocystis were obtained both through the mining of online databases and by accepting direct submissions (both published and unpublished) from Phaeocystis specialists. We recorded abundance data from a total of 1595 depth-resolved stations sampled between 1955–2009. The quality-controlled dataset includes 5057 counts of individual Phaeocystis cells resolved to species level and information regarding life-stages from 3526 samples. 83% of stations were located in the Northern Hemisphere while 17% were located in the Southern Hemisphere. Most data were located in the latitude range of 50–70° N. While the seasonal distribution of Northern Hemisphere data was well-balanced, Southern Hemisphere data was biased towards summer months. Mean species- and form-specific cell diameters were determined from previously published studies. Cell diameters were used to calculate the cellular biovolume of Phaeocystis cells, assuming spherical geometry. Cell biomass was calculated using a carbon conversion factor for prymnesiophytes. For colonies, the number of cells per colony was derived from the colony volume. Cell numbers were then converted to carbon concentrations. An estimation of colonial mucus carbon was included a posteriori, assuming a mean colony size for each species. Carbon content per cell ranged from 9 pg C cell⁻¹ (single-celled Phaeocystis antarctica) to 29 pg C cell⁻¹ (colonial Phaeocystis globosa). Non-zero Phaeocystis cell biomasses (without mucus carbon) range from 2.9 × 10⁻⁵ to 5.4 × 10³ μg C l⁻¹, with a mean of 45.7 μg C
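
    A hedged sketch of the cell-to-carbon conversion described above: biovolume from the mean cell diameter assuming spherical geometry, multiplied by a volume-to-carbon factor. The factor used below is a placeholder, not the prymnesiophyte-specific value of the study, and colonial mucus carbon is not included.

```python
import math

def carbon_per_cell(diameter_um, pg_c_per_um3=0.2):
    """Carbon per cell from cell diameter, assuming spherical geometry.
    The volume-to-carbon factor (pg C per um^3) is an illustrative placeholder."""
    biovolume = (math.pi / 6.0) * diameter_um ** 3   # um^3 for a sphere
    return biovolume * pg_c_per_um3                  # pg C per cell

for d in (4.5, 6.0):                                 # illustrative cell diameters (um)
    print(f"d = {d} um -> {carbon_per_cell(d):.1f} pg C per cell")
```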

  16. 4th International Conference on Software Process Improvement

    CERN Document Server

    Muñoz, Mirna; Rocha, Álvaro; Calvo-Manzano, Jose

    2016-01-01

    This book contains a selection of papers from The 2015 International Conference on Software Process Improvement (CIMPS’15), held between the 28th and 30th of October in Mazatlán, Sinaloa, México. CIMPS’15 is a global forum for researchers and practitioners who present and discuss the most recent innovations, trends, results, experiences and concerns in several perspectives of Software Engineering with a clear relationship to, but not limited to, software processes, security in information and communication technology, and the Big Data field. The main topics covered are: Organizational Models, Standards and Methodologies, Knowledge Management, Software Systems, Applications and Tools, Information and Communication Technologies and Processes in non-software domains (mining, automotive, aerospace, business, health care, manufacturing, etc.) with a demonstrated relationship to software process challenges.

  17. On global stability criterion for neural networks with discrete and distributed delays

    International Nuclear Information System (INIS)

    Park, Ju H.

    2006-01-01

    Based on the Lyapunov functional stability analysis for differential equations and the linear matrix inequality (LMI) optimization approach, a new delay-dependent criterion for neural networks with discrete and distributed delays is derived to guarantee global asymptotic stability. The criterion is expressed in terms of LMIs, which can be solved easily by various convex optimization algorithms. Some numerical examples are given to show the effectiveness of the proposed method

  18. Web-based spatial analysis with the ILWIS open source GIS software and satellite images from GEONETCast

    Science.gov (United States)

    Lemmens, R.; Maathuis, B.; Mannaerts, C.; Foerster, T.; Schaeffer, B.; Wytzisk, A.

    2009-12-01

    This paper involves easily accessible, integrated web-based analysis of satellite images with plug-in based open source software. The paper is targeted at both users and developers of geospatial software. Guided by a use case scenario, we describe the ILWIS software and its toolbox to access satellite images through the GEONETCast broadcasting system. The last two decades have shown a major shift from stand-alone software systems to networked ones, often client/server applications using distributed geo-(web-)services. This allows organisations to combine their own data, without much effort, with remotely available data and processing functionality. Key to this integrated spatial data analysis is low-cost access to data from within a user-friendly and flexible software. Web-based open source software solutions are often a powerful option for developing countries. The Integrated Land and Water Information System (ILWIS) is a PC-based GIS & Remote Sensing software, comprising a complete package of image processing, spatial analysis and digital mapping, and was developed as commercial software from the early nineties onwards. Recent project efforts have migrated ILWIS into a modular, plug-in-based open source software, and provide web-service support for OGC-based web mapping and processing. The core objective of the ILWIS Open source project is to provide a maintainable framework for researchers and software developers to implement training components, scientific toolboxes and (web-) services. The latest plug-ins have been developed for multi-criteria decision making, water resources analysis and spatial statistics analysis. The development of this framework has been done since 2007 in the context of 52°North, which is an open initiative that advances the development of cutting edge open source geospatial software, using the GPL license. GEONETCast, as part of the emerging Global Earth Observation System of Systems (GEOSS), puts essential environmental data at the

  19. Optimization of traffic distribution control in software-configurable infrastructure of virtual data center based on a simulation model

    Directory of Open Access Journals (Sweden)

    I. P. Bolodurina

    2017-01-01

    Full Text Available Currently, the share of cloud computing technology in companies' business processes is growing steadily. Although it reduces the cost of owning and operating IT infrastructure, a number of problems related to the control of data centers remain. One such problem is the efficient use of the available compute and network resources. One direction of optimization is the control of the traffic of cloud applications and services in data centers; given the multi-tier architecture of modern data centers, this problem is not trivial. The advantage of modern virtual infrastructure is the ability to use software-configurable networks and software-configurable data storage. However, existing algorithmic optimization solutions do not take into account a number of features of the network traffic generated by multiple classes of applications. In this study, the problem of optimizing the distribution of traffic of cloud applications and services is solved for a software-controlled virtual data center infrastructure. A simulation model was developed that describes the traffic in the data center and the software-configurable network segments involved in processing user requests for applications and services, in a network environment that includes a heterogeneous cloud platform and software-configurable data storage. The developed model made it possible to implement a traffic management algorithm for cloud applications and to optimize access to the storage system through effective use of the data transmission channel. Experimental studies found that the developed algorithm can reduce the response time of cloud applications and services and, as a result, improve the performance of processing user requests and reduce the number of failures.

  20. A computer software system for the generation of global ocean tides including self-gravitation and crustal loading effects

    Science.gov (United States)

    Estes, R. H.

    1977-01-01

    A computer software system is described which computes global numerical solutions of the integro-differential Laplace tidal equations, including dissipation terms and ocean loading and self-gravitation effects, for arbitrary diurnal and semidiurnal tidal constituents. The integration algorithm features a successive approximation scheme for the integro-differential system, with time stepping forward differences in the time variable and central differences in spatial variables. Solutions for the M2, S2, N2, K2, K1, O1, P1 tidal constituents neglecting the effects of ocean loading and self-gravitation, and a converged M2 solution including ocean loading and self-gravitation effects, are presented in the form of cotidal and corange maps.
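
    As a generic illustration of the discretisation style mentioned (forward differences in time, central differences in space), the sketch below applies it to a simple 1-D diffusion equation; it is not the Laplace tidal equation solver itself, which is integro-differential and solved globally by successive approximation.

```python
import numpy as np

# Forward-time, centred-space differencing for u_t = D * u_xx on [0, 1],
# with fixed boundary values. dt is chosen so that D*dt/dx**2 <= 0.5 (stability).
nx, nt = 101, 500
dx, dt, D = 0.01, 2e-5, 1.0
x = np.linspace(0.0, 1.0, nx)
u = np.exp(-200.0 * (x - 0.5) ** 2)   # initial bump

for _ in range(nt):
    # central difference in space, forward (explicit) step in time
    u[1:-1] = u[1:-1] + D * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])

print("peak after diffusion:", round(float(u.max()), 3))
```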

  1. Architecture of a software quench management system

    International Nuclear Information System (INIS)

    Jerzy M. Nogiec et al.

    2001-01-01

    Testing superconducting accelerator magnets is inherently coupled with the proper handling of quenches; i.e., protecting the magnet and characterizing the quench process. Therefore, software implementations must include elements of both data acquisition and real-time controls. The architecture of the quench management software developed at Fermilab's Magnet Test Facility is described. This system consists of quench detection, quench protection, and quench characterization components that execute concurrently in a distributed system. Collaboration between the elements of quench detection, quench characterization and current control are discussed, together with a schema of distributed saving of various quench-related data. Solutions to synchronization and reliability in such a distributed quench system are also presented

  2. Advanced Transport Operating System (ATOPS) control display unit software description

    Science.gov (United States)

    Slominski, Christopher J.; Parks, Mark A.; Debure, Kelly R.; Heaphy, William J.

    1992-01-01

    The software created for the Control Display Units (CDUs), used for the Advanced Transport Operating Systems (ATOPS) project, on the Transport Systems Research Vehicle (TSRV) is described. Module descriptions are presented in a standardized format which contains module purpose, calling sequence, a detailed description, and global references. The global reference section includes subroutines, functions, and common variables referenced by a particular module. The CDUs, one for the pilot and one for the copilot, are used for flight management purposes. Operations performed with the CDU affects the aircraft's guidance, navigation, and display software.

  3. Analyzing Software Errors in Safety-Critical Embedded Systems

    Science.gov (United States)

    Lutz, Robyn R.

    1994-01-01

    This paper analyzes the root causes of safety-related software faults. Faults identified as potentially hazardous to the system are distributed somewhat differently over the set of possible error causes than non-safety-related software faults.

  4. Large-scale distribution patterns of mangrove nematodes: A global meta-analysis.

    Science.gov (United States)

    Brustolin, Marco C; Nagelkerken, Ivan; Fonseca, Gustavo

    2018-05-01

    Mangroves harbor diverse invertebrate communities, suggesting that macroecological distribution patterns of habitat-forming foundation species drive the associated faunal distribution. Whether these are driven by mangrove biogeography is still ambiguous. For small-bodied taxa, local factors and landscape metrics might be as important as macroecology. We performed a meta-analysis to address the following questions: (1) can richness of mangrove trees explain macroecological patterns of nematode richness? and (2) do local landscape attributes have equal or higher importance than biogeography in structuring nematode richness? Mangrove areas of Caribbean-Southwest Atlantic, Western Indian, Central Indo-Pacific, and Southwest Pacific biogeographic regions. We used random-effects meta-analyses based on natural logarithm of the response ratio (lnRR) to assess the importance of macroecology (i.e., biogeographic regions, latitude, longitude), local factors (i.e., aboveground mangrove biomass and tree richness), and landscape metrics (forest area and shape) in structuring nematode richness from 34 mangroves sites around the world. Latitude, mangrove forest area, and forest shape index explained 19% of the heterogeneity across studies. Richness was higher at low latitudes, closer to the equator. At local scales, richness increased slightly with landscape complexity and decreased with forest shape index. Our results contrast with biogeographic diversity patterns of mangrove-associated taxa. Global-scale nematode diversity may have evolved independently of mangrove tree richness, and diversity of small-bodied metazoans is probably more closely driven by latitude and associated climates, rather than local, landscape, or global biogeographic patterns.
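
    A minimal sketch of the effect size used above, the log response ratio lnRR = ln(mean of one group / mean of the other), computed here on invented richness values purely for illustration.

```python
import numpy as np

# Invented nematode richness values for two groups of sites (not study data).
richness_low_latitude = np.array([24.0, 31.0, 28.0])
richness_high_latitude = np.array([15.0, 18.0, 12.0])

ln_rr = np.log(richness_low_latitude.mean() / richness_high_latitude.mean())
print(round(ln_rr, 2))   # > 0 means higher richness in the first group
```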

  5. Software Reuse Within the Earth Science Community

    Science.gov (United States)

    Marshall, James J.; Olding, Steve; Wolfe, Robert E.; Delnore, Victor E.

    2006-01-01

    Scientific missions in the Earth sciences frequently require cost-effective, highly reliable, and easy-to-use software, which can be a challenge for software developers to provide. The NASA Earth Science Enterprise (ESE) spends a significant amount of resources developing software components and other software development artifacts that may also be of value if reused in other projects requiring similar functionality. In general, software reuse is often defined as utilizing existing software artifacts. Software reuse can improve productivity and quality while decreasing the cost of software development, as documented by case studies in the literature. Since large software systems are often the results of the integration of many smaller and sometimes reusable components, ensuring reusability of such software components becomes a necessity. Indeed, designing software components with reusability as a requirement can increase the software reuse potential within a community such as the NASA ESE community. The NASA Earth Science Data Systems (ESDS) Software Reuse Working Group is chartered to oversee the development of a process that will maximize the reuse potential of existing software components while recommending strategies for maximizing the reusability potential of yet-to-be-designed components. As part of this work, two surveys of the Earth science community were conducted. The first was performed in 2004 and distributed among government employees and contractors. A follow-up survey was performed in 2005 and distributed among a wider community, to include members of industry and academia. The surveys were designed to collect information on subjects such as the current software reuse practices of Earth science software developers, why they choose to reuse software, and what perceived barriers prevent them from reusing software. In this paper, we compare the results of these surveys, summarize the observed trends, and discuss the findings. The results are very

  6. Classification and global distribution of ocean precipitation types based on satellite passive microwave signatures

    Science.gov (United States)

    Gautam, Nitin

    The main objectives of this thesis are to develop a robust statistical method for the classification of ocean precipitation based on physical properties to which the SSM/I is sensitive and to examine how these properties vary globally and seasonally. A two-step approach is adopted for the classification of oceanic precipitation classes from multispectral SSM/I data: (1) we subjectively define precipitation classes using a priori information about the precipitating system and its possible distinct signature on SSM/I data, such as scattering by ice particles aloft in the precipitating cloud, emission by liquid rain water below the freezing level, the difference of polarization at 19 GHz (an indirect measure of optical depth), etc.; (2) we then develop an objective classification scheme which is found to reproduce the subjective classification with high accuracy. This hybrid strategy allows us to use the characteristics of the data to define and encode classes and helps retain the physical interpretation of classes. Classification methods based on k-nearest neighbor and neural network approaches are developed to objectively classify six precipitation classes. It is found that the classification method based on the neural network yields high accuracy for all precipitation classes. An inversion method based on a minimum variance approach was used to retrieve gross microphysical properties of these precipitation classes such as column integrated liquid water path, column integrated ice water path, and column integrated rain water path. This classification method is then applied to 2 years (1991-92) of SSM/I data to examine and document the seasonal and global distribution of precipitation frequency corresponding to each of these objectively defined six classes. The characteristics of the distribution are found to be consistent with assumptions used in defining these six precipitation classes and also with well known climatological patterns of precipitation regions. The seasonal and global
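
    As a rough illustration of the second, objective step described above, the sketch below classifies a pixel by a k-nearest-neighbour vote over a handful of subjectively labelled training pixels. The feature names, class labels and values are illustrative assumptions only, not the channels or classes actually used in the thesis.

    # Minimal k-nearest-neighbour sketch: pixels labelled subjectively on a few
    # SSM/I-like features are used to classify a new pixel objectively.
    import numpy as np

    def knn_classify(train_X, train_y, sample, k=1):
        """Assign `sample` the majority class among its k nearest training pixels."""
        dists = np.linalg.norm(train_X - sample, axis=1)   # distance in feature space
        nearest = np.argsort(dists)[:k]                    # indices of the k closest pixels
        labels, counts = np.unique(train_y[nearest], return_counts=True)
        return labels[np.argmax(counts)]

    # Toy training pixels; columns could stand for, e.g., an 85 GHz scattering index,
    # the 19 GHz polarization difference and a 37 GHz brightness temperature.
    train_X = np.array([[40.0,  5.0, 250.0],   # convective-like signature
                        [ 5.0, 30.0, 270.0],   # stratiform-like signature
                        [ 1.0, 60.0, 280.0]])  # non-raining background
    train_y = np.array(["convective", "stratiform", "clear"])

    print(knn_classify(train_X, train_y, np.array([35.0, 8.0, 252.0]), k=1))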

  7. Development of E-learning Software Based Multiplatform Components

    OpenAIRE

    Salamah, Irma; Ganiardi, M. Aris

    2017-01-01

    E-learning software is a product of information and communication technology used to support a dynamic and flexible learning process between teacher and student. The technology was first used to develop e-learning software in the form of web applications, whose advantages lie in the ease of development, installation, and distribution of data. Along with advances in mobile/wireless electronics technology, e-learning software is adapted to this technology...

  8. ETICS the international software engineering service for the grid

    CERN Document Server

    Di Meglio, A; Couvares, P; Ronchieri, E; Takács, E

    2008-01-01

    The ETICS system is a distributed software configuration, build and test system designed to fulfil the needs of improving the quality, reliability and interoperability of distributed software in general and grid software in particular. The ETICS project is a consortium of five partners (CERN, INFN, Engineering Ingegneria Informatica, 4D Soft and the University of Wisconsin-Madison). The ETICS service consists of a build and test job execution system based on the Metronome software and an integrated set of web services and software engineering tools to design, maintain and control build and test scenarios. The ETICS system allows taking into account complex dependencies among applications and middleware components and provides a rich environment to perform static and dynamic analysis of the software and execute deployment, system and interoperability tests. This paper gives an overview of the system architecture and functionality set and then describes how the EC-funded EGEE, DILIGENT and OMII-Europe projects ...

  9. Modular Software Performance Monitoring

    CERN Document Server

    Kruse, D F

    2011-01-01

    CPU clock frequency is not likely to be increased significantly in the coming years, and data analysis speed can be improved by using more processors or buying new machines only if one is willing to change the paradigm to a parallel one. Therefore, performance monitoring procedures and tools are needed to help programmers optimize existing software running on current and future hardware. Low-level information from hardware performance counters is vital to spot specific performance problems slowing program execution. HEP software is often huge and complex, and existing tools are unable to give results with the required granularity. We report on the approach we have chosen to solve this problem, which involves decomposing the application into parts and monitoring each of them separately. Both counting and sampling methods are used to allow an analysis with the required custom granularity: from the global level down to the function level. A set of tools (based on perfmon2 – a software interface to hardware co...
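
    The core idea, decomposing the application into parts and accumulating statistics for each part separately, can be illustrated with a simple decorator. The sketch below uses wall-clock timing only; it does not touch the hardware counters or the perfmon2 interface that the actual tools rely on, and the part and function names are invented.

    # Attribute the cost of each decorated function to a named "part" of the
    # application, giving a custom monitoring granularity.
    import time
    from collections import defaultdict
    from functools import wraps

    stats = defaultdict(lambda: {"calls": 0, "seconds": 0.0})

    def monitored(part_name):
        def decorator(func):
            @wraps(func)
            def wrapper(*args, **kwargs):
                start = time.perf_counter()
                try:
                    return func(*args, **kwargs)
                finally:
                    stats[part_name]["calls"] += 1
                    stats[part_name]["seconds"] += time.perf_counter() - start
            return wrapper
        return decorator

    @monitored("reconstruction")
    def reconstruct(event):
        return sum(x * x for x in event)   # stand-in for real work

    for evt in ([1, 2, 3], [4, 5]):
        reconstruct(evt)
    for part, s in stats.items():
        print(f"{part}: {s['calls']} calls, {s['seconds']:.6f} s")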

  10. MicROS-drt: supporting real-time and scalable data distribution in distributed robotic systems.

    Science.gov (United States)

    Ding, Bo; Wang, Huaimin; Fan, Zedong; Zhang, Pengfei; Liu, Hui

    A primary requirement in distributed robotic software systems is the dissemination of data to all interested collaborative entities in a timely and scalable manner. However, providing such a service in a highly dynamic and resource-limited robotic environment is a challenging task, and existing robot software infrastructure has limitations in this aspect. This paper presents a novel robot software infrastructure, micROS-drt, which supports real-time and scalable data distribution. The solution is based on a loosely coupled data publish-subscribe model with the ability to support various time-related constraints. To realize this model, a mature data distribution standard, the data distribution service for real-time systems (DDS), is adopted as the foundation of the transport layer of this software infrastructure. By elaborately adapting and encapsulating the capability of the underlying DDS middleware, micROS-drt can meet the requirement of real-time and scalable data distribution in distributed robotic systems. Evaluation results in terms of scalability, latency jitter and transport priority, as well as an experiment on real robots, validate the effectiveness of this work.
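
    The loosely coupled publish-subscribe model with a simple time-related constraint can be sketched in a few lines. The toy broker below is not the DDS middleware that micROS-drt adapts, and the topic name and deadline value are invented for illustration.

    # A toy publish-subscribe broker with a per-subscriber freshness deadline.
    import time
    from collections import defaultdict

    class Broker:
        def __init__(self):
            self.subscribers = defaultdict(list)   # topic -> [(callback, max_age_s)]

        def subscribe(self, topic, callback, max_age_s=None):
            self.subscribers[topic].append((callback, max_age_s))

        def publish(self, topic, data):
            stamp = time.monotonic()
            for callback, max_age_s in self.subscribers[topic]:
                age = time.monotonic() - stamp
                if max_age_s is None or age <= max_age_s:
                    callback(data)                 # deliver only if still fresh

    broker = Broker()
    broker.subscribe("/laser_scan", lambda msg: print("planner got", msg), max_age_s=0.1)
    broker.publish("/laser_scan", {"ranges": [1.2, 1.5, 0.9]})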

  11. Implementing the Gaia Astrometric Global Iterative Solution (AGIS) in Java

    Science.gov (United States)

    O'Mullane, William; Lammers, Uwe; Lindegren, Lennart; Hernandez, Jose; Hobbs, David

    2011-10-01

    This paper provides a description of the Java software framework which has been constructed to run the Astrometric Global Iterative Solution for the Gaia mission. This is the mathematical framework that provides the rigid reference frame for Gaia observations from the Gaia data itself. This process makes Gaia a self-calibrated, input-catalogue-independent mission. The framework is highly distributed, typically running on a cluster of machines with a database back end. All code is written in the Java language. We describe the overall architecture and some of the details of the implementation.

  12. A tool to include gamma analysis software into a quality assurance program.

    Science.gov (United States)

    Agnew, Christina E; McGarry, Conor K

    2016-03-01

    To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created, comprising two geometric images to independently test the distance to agreement (DTA) and dose difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field, and a clinical VMAT arc. The images were analysed using global and local gamma analysis with 2 in-house and 8 commercially available software packages encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3mm and over 2% at 1%/1mm was measured between software packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions. Copyright © 2015. Published by Elsevier Ireland Ltd.
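
    For readers unfamiliar with the algorithm being tested, the sketch below shows a minimal 1-D, globally normalised gamma calculation combining the DD and DTA criteria. Clinical software operates on interpolated 2-D or 3-D dose grids; the dose profile and tolerances here are invented.

    # 1-D gamma index: for each evaluated point take the minimum, over all
    # reference points, of the combined dose-difference / distance metric.
    import numpy as np

    def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dd_percent, dta_mm, norm_dose):
        dd_abs = dd_percent / 100.0 * norm_dose        # global normalisation
        gammas = []
        for xe, de in zip(eval_pos, eval_dose):
            dist_term = ((ref_pos - xe) / dta_mm) ** 2
            dose_term = ((ref_dose - de) / dd_abs) ** 2
            gammas.append(np.sqrt(np.min(dist_term + dose_term)))
        return np.array(gammas)

    ref_pos = np.linspace(0, 50, 501)                      # mm
    ref_dose = 2.0 * np.exp(-((ref_pos - 25) / 10) ** 2)   # toy dose profile
    eval_dose = ref_dose * 1.02                            # 2% scaled copy

    g = gamma_1d(ref_pos, ref_dose, ref_pos, eval_dose,
                 dd_percent=3, dta_mm=3, norm_dose=ref_dose.max())
    print(f"pass rate at 3%/3mm: {100 * np.mean(g <= 1):.1f}%")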

  13. Software error masking effect on hardware faults

    International Nuclear Information System (INIS)

    Choi, Jong Gyun; Seong, Poong Hyun

    1999-01-01

    Based on the Very High Speed Integrated Circuit (VHSIC) Hardware Description Language (VHDL), in this work, a simulation model for fault injection is developed to estimate the dependability of a digital system in the operational phase. We investigated the software masking effect on hardware faults through single bit-flip and stuck-at-x fault injection into the internal registers of the processor and memory cells. The fault locations cover all registers and memory cells, and the fault distribution over locations is chosen randomly from a uniform probability distribution. Using this model, we have predicted the reliability and masking effect of an application software in a digital system, the Interposing Logic System (ILS) of a nuclear power plant. We have considered four software operational profiles. From the results it was found that the software masking effect on hardware faults should be properly considered to predict the system dependability accurately in the operational phase, because the masking effect takes different values according to the operational profile
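
    A toy sketch of the two fault models mentioned above (a single bit-flip at a uniformly chosen location, and a stuck-at-x bit) is given below. The real campaign injects faults into the registers and memory cells of a VHDL processor model, not into Python integers, so the register value and width are illustrative.

    # Single bit-flip and stuck-at-x fault injection on a register value.
    import random

    def flip_random_bit(value, width=32, rng=random):
        bit = rng.randrange(width)          # uniform distribution over fault locations
        return value ^ (1 << bit), bit

    def stuck_at(value, bit, stuck_value, width=32):
        mask = 1 << bit
        return (value | mask) if stuck_value else (value & ~mask & (2 ** width - 1))

    register = 0x0000_00FF
    faulty, where = flip_random_bit(register)
    print(f"bit-flip at bit {where}: {register:#010x} -> {faulty:#010x}")
    print(f"stuck-at-1 on bit 31:   {register:#010x} -> {stuck_at(register, 31, 1):#010x}")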

  14. Selection and Management of Open Source Software in Libraries

    OpenAIRE

    Vimal Kumar, V.

    2007-01-01

    Open source software was a revolutionary concept among computer programmers and users. To a certain extent, open source solutions can provide an alternative to costly commercial software. Open source software is software that users have the ability to run, copy, distribute, study, change, share and improve for any purpose. Open source library software does not carry the initial cost of commercial software and enables libraries to have greater control over their working environmen...

  15. Software errors and complexity: An empirical investigation

    Science.gov (United States)

    Basili, Victor R.; Perricone, Berry T.

    1983-01-01

    The distributions and relationships derived from the change data collected during the development of a medium scale satellite software project show that meaningful results can be obtained which allow an insight into software traits and the environment in which it is developed. Modified and new modules were shown to behave similarly. An abstract classification scheme for errors which allows a better understanding of the overall traits of a software project is also shown. Finally, various size and complexity metrics are examined with respect to errors detected within the software yielding some interesting results.

  16. Assessing water resources in Azerbaijan using a local distributed model forced and constrained with global data

    Science.gov (United States)

    Bouaziz, Laurène; Hegnauer, Mark; Schellekens, Jaap; Sperna Weiland, Frederiek; ten Velden, Corine

    2017-04-01

    In many countries, data is scarce, incomplete and often not easily shared. In these cases, global satellite and reanalysis data provide an alternative to assess water resources. To assess water resources in Azerbaijan, a completely distributed and physically based hydrological wflow-sbm model was set-up for the entire Kura basin. We used SRTM elevation data, a locally available river map and one from OpenStreetMap to derive the drainage direction network at the model resolution of approximately 1x1 km. OpenStreetMap data was also used to derive the fraction of paved area per cell to account for the reduced infiltration capacity (c.f. Schellekens et al. 2014). We used the results of a global study to derive root zone capacity based on climate data (Wang-Erlandsson et al., 2016). To account for the variation in vegetation cover over the year, monthly averages of Leaf Area Index, based on MODIS data, were used. For the soil-related parameters, we used global estimates as provided by Dai et al. (2013). This enabled the rapid derivation of a first estimate of parameter values for our hydrological model. Digitized local meteorological observations were scarce and available only for limited time period. Therefore several sources of global meteorological data were evaluated: (1) EU-WATCH global precipitation, temperature and derived potential evaporation for the period 1958-2001 (Harding et al., 2011), (2) WFDEI precipitation, temperature and derived potential evaporation for the period 1979-2014 (by Weedon et al., 2014), (3) MSWEP precipitation (Beck et al., 2016) and (4) local precipitation data from more than 200 stations in the Kura basin were available from the NOAA website for a period up to 1991. The latter, together with data archives from Azerbaijan, were used as a benchmark to evaluate the global precipitation datasets for the overlapping period 1958-1991. By comparing the datasets, we found that monthly mean precipitation of EU-WATCH and WFDEI coincided well

  17. Network-based analysis of software change propagation.

    Science.gov (United States)

    Wang, Rongcun; Huang, Rubing; Qu, Binbin

    2014-01-01

    Object-oriented software systems frequently evolve to meet new change requirements. Understanding the characteristics of changes helps testers and system designers improve software quality. Identifying important modules becomes a key issue in the process of evolution. In this context, a novel network-based approach is proposed to comprehensively investigate change distributions and the correlation between centrality measures and the scope of change propagation. First, software dependency networks are constructed at the class level. Then, the number of times classes are co-changed is mined from software repositories. According to the dependency relationships and the co-change counts among classes, the scope of change propagation is calculated. Spearman rank correlation is used to analyse the correlation between centrality measures and the scope of change propagation. Three case studies on the Java open source software projects Findbugs, Hibernate, and Spring are conducted to investigate the characteristics of change propagation. Experimental results show that (i) the change distribution is very uneven; (ii) PageRank, Degree, and CIRank are significantly correlated to the scope of change propagation. Particularly, CIRank shows a higher correlation coefficient, which suggests it can be a more useful indicator for measuring the scope of change propagation of classes in an object-oriented software system.
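
    The analysis pipeline, building a class dependency network, computing centrality measures and correlating them with a change-propagation score via Spearman rank correlation, can be sketched as follows. The graph, the co-change-derived scores and the output are toy data, and the paper's CIRank measure is not reproduced here.

    # Correlate network centrality with a change-propagation score.
    import networkx as nx
    from scipy.stats import spearmanr

    # Directed dependency edges between classes (A depends on B, ...).
    g = nx.DiGraph([("A", "B"), ("A", "C"), ("B", "C"), ("D", "C"), ("E", "C"), ("E", "B")])

    # Toy change-propagation scope per class (e.g. derived from co-change counts).
    propagation = {"A": 1, "B": 4, "C": 9, "D": 1, "E": 2}

    pagerank = nx.pagerank(g)
    degree = dict(g.degree())

    classes = sorted(g.nodes())
    rho_pr, p_pr = spearmanr([pagerank[c] for c in classes], [propagation[c] for c in classes])
    rho_dg, p_dg = spearmanr([degree[c] for c in classes], [propagation[c] for c in classes])
    print(f"PageRank vs propagation: rho={rho_pr:.2f} (p={p_pr:.2f})")
    print(f"Degree   vs propagation: rho={rho_dg:.2f} (p={p_dg:.2f})")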

  18. Unbiased global determination of parton distributions and their uncertainties at NNLO and at LO

    NARCIS (Netherlands)

    Collaboration, The NNPDF; Ball, Richard D.; Bertone, Valerio; Cerutti, Francesco; Debbio, Luigi Del; Forte, Stefano; Guffanti, Alberto; Latorre, Jose I.; Rojo, Juan; Ubiali, Maria

    2012-01-01

    We present a determination of the parton distributions of the nucleon from a global set of hard scattering data using the NNPDF methodology at LO and NNLO in perturbative QCD, thereby generalizing to these orders the NNPDF2.1 NLO parton set. Heavy quark masses are included using the so-called FONLL

  19. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: An Earth Modeling System Software Framework Strawman Design that Integrates Cactus and UCLA/UCB Distributed Data Broker

    Science.gov (United States)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document proposes a strawman framework design for the climate community based on the integration of Cactus, from the relativistic physics community, and the UCLA/UCB Distributed Data Broker (DDB) from the climate community. This design is the result of an extensive survey of climate models and frameworks in the climate community as well as frameworks from many other scientific communities. The design addresses fundamental development and runtime needs using Cactus, a framework with interfaces for FORTRAN and C-based languages, and high-performance model communication needs using DDB. This document also specifically explores object-oriented design issues in the context of climate modeling as well as climate modeling issues in terms of object-oriented design.

  20. Distributed embedded controller development with petri nets application to globally-asynchronous locally-synchronous systems

    CERN Document Server

    Moutinho, Filipe de Carvalho

    2016-01-01

    This book describes a model-based development approach for globally-asynchronous locally-synchronous distributed embedded controllers. This approach uses Petri nets as the modeling formalism to create platform- and network-independent models supporting the use of design automation tools. To support this development approach, the Petri nets class in use is extended with time-domains and asynchronous-channels. The authors’ approach uses models not only to provide a better understanding of the distributed controller and to improve communication among the stakeholders, but also to support the entire lifecycle, including simulation, verification (using model-checking tools), implementation (relying on automatic code generators), and deployment of the distributed controller onto specific platforms. Uses a graphical and intuitive modeling formalism supported by design automation tools; Enables verification, ensuring that the distributed controller was correctly specified; Provides flex...

  1. Structure and software tools of AIDA.

    Science.gov (United States)

    Duisterhout, J S; Franken, B; Witte, F

    1987-01-01

    AIDA consists of a set of software tools to allow for fast development and easy-to-maintain Medical Information Systems. AIDA supports all aspects of such a system both during development and operation. It contains tools to build and maintain forms for interactive data entry and on-line input validation, a database management system including a data dictionary and a set of run-time routines for database access, and routines for querying the database and output formatting. Unlike an application generator, the user of AIDA may select parts of the tools to fulfill his needs and program other subsystems not developed with AIDA. The AIDA software uses as host language the ANSI-standard programming language MUMPS, an interpreted language embedded in an integrated database and programming environment. This greatly facilitates the portability of AIDA applications. The database facilities supported by AIDA are based on a relational data model. This data model is built on top of the MUMPS database, the so-called global structure. This relational model overcomes the restrictions of the global structure regarding string length. The global structure is especially powerful for sorting purposes. Using MUMPS as a host language allows the user an easy interface between user-defined data validation checks or other user-defined code and the AIDA tools. AIDA has been designed primarily for prototyping and for the construction of Medical Information Systems in a research environment which requires a flexible approach. The prototyping facility of AIDA operates terminal-independently and is, to a great extent, multi-lingual. Most of these features are table-driven; this allows on-line changes in the use of terminal type and language, but also causes overhead. AIDA has a set of optimizing tools by which it is possible to build faster, but (of course) less flexible code from these table definitions. By separating the AIDA software into a source and a run-time version, one is able to write

  2. A software for parameter optimization with Differential Evolution Entirely Parallel method

    Directory of Open Access Journals (Sweden)

    Konstantin Kozlov

    2016-08-01

    Full Text Available Summary. The Differential Evolution Entirely Parallel (DEEP) package is software for finding unknown real and integer parameters in dynamical models of biological processes by minimizing one or even several objective functions that measure the deviation of the model solution from data. Numerical solutions provided by the most efficient global optimization methods are often problem-specific and cannot be easily adapted to other tasks. In contrast, DEEP allows a user to describe both the mathematical model and the objective function in any programming language, such as R, Octave or Python, among others. Being implemented in C, DEEP demonstrates performance as good as the top three methods from the CEC-2014 (Competition on Evolutionary Computation) benchmark and was successfully applied to several biological problems. Availability. The DEEP method is open source, free software distributed under the terms of the GPL licence, version 3. The sources are available at http://deepmethod.sourceforge.net/ and binary packages for Fedora GNU/Linux are provided for the RPM package manager at https://build.opensuse.org/project/repositories/home:mackoel:compbio.
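
    As a point of reference for the kind of global optimization DEEP performs, the sketch below implements the classical DE/rand/1/bin differential evolution scheme for a single objective. DEEP itself is a parallel C implementation with its own extensions, so this is not its algorithm in detail, and the objective function and bounds are invented.

    # Classical differential evolution (DE/rand/1/bin) for a single objective.
    import numpy as np

    def differential_evolution(objective, bounds, pop_size=20, F=0.8, CR=0.9,
                               generations=200, rng=np.random.default_rng(0)):
        dim = len(bounds)
        lo, hi = np.array(bounds, dtype=float).T
        pop = rng.uniform(lo, hi, size=(pop_size, dim))
        cost = np.array([objective(x) for x in pop])
        for _ in range(generations):
            for i in range(pop_size):
                others = [j for j in range(pop_size) if j != i]
                a, b, c = pop[rng.choice(others, size=3, replace=False)]
                mutant = np.clip(a + F * (b - c), lo, hi)       # mutation
                cross = rng.random(dim) < CR
                cross[rng.integers(dim)] = True                 # keep at least one gene
                trial = np.where(cross, mutant, pop[i])         # binomial crossover
                trial_cost = objective(trial)
                if trial_cost <= cost[i]:                       # greedy selection
                    pop[i], cost[i] = trial, trial_cost
        best = np.argmin(cost)
        return pop[best], cost[best]

    # Toy objective: squared deviation of a two-parameter "model" from its optimum.
    best_x, best_f = differential_evolution(lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2,
                                            bounds=[(-5, 5), (-5, 5)])
    print(best_x, best_f)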

  3. Software Development and Testing Approach and Challenges in a distributed HEP Collaboration

    CERN Document Server

    Burckhart-Chromek, Doris

    2007-01-01

    In developing the ATLAS [1] Trigger and Data Acquisition (TDAQ) software, the team is applying the iterative waterfall model, evolutionary process management, formal software inspection, and lightweight review techniques. The long preparation phase, with a geographically widespread development team, required that the standard techniques be adapted to this HEP environment. The testing process is receiving special attention. Unit tests and check targets in nightly project builds form the basis for the subsequent software project release testing. The integrated software is then run on computing farms that give further opportunities for gaining experience, fault finding, and acquiring ideas for improvement. Dedicated tests on a farm of up to 1000 nodes address the large-scale aspect of the project. Integration test activities on the experimental site include the special purpose-built event readout hardware. Deployment in detector commissioning starts the countdown towards running the final ATLAS experiment. T...

  4. Global and Seasonal Distributions of CHOCHO and HCHO Observed by the Ozone Monitoring Instrument on EOS Aura

    Science.gov (United States)

    Kurosu, T. P.; Fu, T.; Volkamer, R.; Millet, D. B.; Chance, K.

    2006-12-01

    Over the two years since its launch in July 2004, the Ozone Monitoring Instrument (OMI) on EOS Aura has demonstrated the capability to routinely monitor the volatile organic compounds (VOCs) formaldehyde (HCHO) and glyoxal (CHOCHO). OMI's daily global coverage and spatial resolution as high as 13x24 km provides a unique data set of these molecules for the study of air quality from space. We present the first study of global seasonal distributions of CHOCHO from space, derived from a year of OMI observations. CHOCHO distributions are compared to simultaneous retrievals of HCHO from OMI, providing a first indication of seasonally resolved ratios of these VOCs on a global scale. Satellite retrievals are compared to global simulations of HCHO and CHOCHO, based on current knowledge of sources and sinks, using the GEOS-Chem global chemistry and transport model. Formaldehyde is both directly emitted and also produced from the oxidation of many VOCs, notably biogenic isoprene, and is removed by photolysis and oxidation. Precursors of glyoxal include isoprene, monoterpenes, and aromatics from anthropogenic, biogenic, and biomass burning emissions; it is removed by photolysis, oxidation by OH, dry/wet deposition, and aerosol uptake. As a case study, satellite observations will also be compared to ground-based measurements taken during the Pearl River Delta 2006 field campaign near Guangzhou, China, where high glyoxal concentrations are frequently observed from space.

  5. A contribution to the test software for the VXI electronic cards of the Eurogam multidetector in a Unix/VXWorks distributed environment

    International Nuclear Information System (INIS)

    Kadionik, P.

    1992-01-01

    The Eurogam gamma-ray multidetector involves, in a first phase, 45 hyper-pure Ge detectors, each surrounded by an anti-Compton shield of 10 BGO detectors. In order to ensure the highest reliability and an easy upgrade of the array, the electronic cards have been designed in the new VXI (VME Bus Extension to Instrumentation) standard; this allows the 495 detectors to be driven, with 4300 parameters adjustable by software. The data acquisition architecture is distributed over an Ethernet network. The set-up and test software for the VXI cards has been written in C; it uses a real-time kernel (VxWorks from Wind River Systems) interfaced to the Sun Unix environment. Inter-task communications use the Remote Procedure Call protocol. The inner shell of the software is connected to a database and to a graphic interface which gives engineers and physicists a very easy set-up for the many parameters to adjust

  6. Achieving Better Buying Power for Mobile Open Architecture Software Systems Through Diverse Acquisition Scenarios

    Science.gov (United States)

    2016-04-30

    largest acquirers of commodity and bespoke (custom) software systems. The Defense community further extends its reach and influence on a global basis...information system applications that support modern military operations at a regional, national, or global level. These applications may be focused to...alternatives for costing or charging for software that include franchising; enterprise licensing; metered usage; advertising supported; subscription

  7. Conceptual design for controller software of mechatronic systems

    NARCIS (Netherlands)

    Broenink, Johannes F.; Hilderink, G.H.; Bakkers, André; Bradshaw, Alan; Counsell, John

    1998-01-01

    The method and software tool presented here, aims at supporting the development of control software for mechatronic systems. Heterogeneous distributed embedded processors are considered as target hardware. Principles of the method are that the implementation process is a stepwise refinement from

  8. Co-sourcing in software development offshoring

    DEFF Research Database (Denmark)

    Schlichter, Bjarne Rerup; Persson, John Stouby

    2013-01-01

    Software development projects are increasingly geographically distributed through offshoring, which introduces complex risks that can lead to project failure. Co-sourcing is a highly integrative and cohesive approach to software development offshoring that has been seen to be successful. However, research on how co-sourcing shapes the perception and alleviation of common offshoring risks is limited. We present a case study of how a certified CMMI-level 5 Danish software supplier approaches these risks in offshore co-sourcing. The paper explains how common offshoring risks are perceived and alleviated when adopting the co...

  9. Global assessment of human losses due to earthquakes

    Science.gov (United States)

    Silva, Vitor; Jaiswal, Kishor; Weatherill, Graeme; Crowley, Helen

    2014-01-01

    Current studies have demonstrated a sharp increase in human losses due to earthquakes. These alarming levels of casualties suggest the need for large-scale investment in seismic risk mitigation, which, in turn, requires an adequate understanding of the extent of the losses, and location of the most affected regions. Recent developments in global and uniform datasets such as instrumental and historical earthquake catalogues, population spatial distribution and country-based vulnerability functions, have opened an unprecedented possibility for a reliable assessment of earthquake consequences at a global scale. In this study, a uniform probabilistic seismic hazard assessment (PSHA) model was employed to derive a set of global seismic hazard curves, using the open-source software OpenQuake for seismic hazard and risk analysis. These results were combined with a collection of empirical fatality vulnerability functions and a population dataset to calculate average annual human losses at the country level. The results from this study highlight the regions/countries in the world with a higher seismic risk, and thus where risk reduction measures should be prioritized.
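
    The country-level average annual loss calculation described above amounts to convolving a hazard curve with a vulnerability function. The sketch below shows that arithmetic on invented numbers; it is not the OpenQuake implementation.

    # Average annual loss from an annual hazard curve and a vulnerability function.
    import numpy as np

    # Hazard curve: annual rate of exceeding each ground-motion level (PGA in g).
    pga_levels = np.array([0.1, 0.2, 0.3, 0.4, 0.5])
    annual_exceedance = np.array([0.20, 0.05, 0.02, 0.008, 0.003])

    # Vulnerability: expected loss ratio of the exposed value at each level.
    loss_ratio = np.array([0.00, 0.01, 0.05, 0.15, 0.30])
    exposed_value = 1_000_000.0

    # Annual occurrence rate of each bin = difference of successive exceedance
    # rates; the highest bin keeps its exceedance rate.
    occurrence_rate = np.append(-np.diff(annual_exceedance), annual_exceedance[-1])

    average_annual_loss = exposed_value * np.sum(loss_ratio * occurrence_rate)
    print(f"average annual loss: {average_annual_loss:,.0f}")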

  10. Software To Secure Distributed Propulsion Simulations

    Science.gov (United States)

    Blaser, Tammy M.

    2003-01-01

    Distributed-object computing systems are presented with many security threats, including network eavesdropping, message tampering, and communications middleware masquerading. NASA Glenn Research Center, and its industry partners, has taken an active role in mitigating the security threats associated with developing and operating their proprietary aerospace propulsion simulations. In particular, they are developing a collaborative Common Object Request Broker Architecture (CORBA) Security (CORBASec) test bed to secure their distributed aerospace propulsion simulations. Glenn has been working with its aerospace propulsion industry partners to deploy the Numerical Propulsion System Simulation (NPSS) object-based technology. NPSS is a program focused on reducing the cost and time in developing aerospace propulsion engines

  11. Experimental research control software system

    International Nuclear Information System (INIS)

    Cohn, I A; Kovalenko, A G; Vystavkin, A N

    2014-01-01

    A software system, intended for the automation of small-scale research, has been developed. The software allows one to control equipment, acquire and process data by means of simple scripts. The main purpose of the development is to increase the ease of experiment automation, thus significantly reducing experimental setup automation efforts. In particular, minimal programming skills are required and supervisors have no trouble reviewing the scripts. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, the control is performed by an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library performs interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework is provided for fast implementation of new software and hardware interfaces. While the software is in continuous development with new features being implemented, it is already used in our laboratory for automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU General Public License.

  12. Experimental research control software system

    Science.gov (United States)

    Cohn, I. A.; Kovalenko, A. G.; Vystavkin, A. N.

    2014-05-01

    A software system, intended for the automation of small-scale research, has been developed. The software allows one to control equipment, acquire and process data by means of simple scripts. The main purpose of the development is to increase the ease of experiment automation, thus significantly reducing experimental setup automation efforts. In particular, minimal programming skills are required and supervisors have no trouble reviewing the scripts. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, the control is performed by an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library performs interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework is provided for fast implementation of new software and hardware interfaces. While the software is in continuous development with new features being implemented, it is already used in our laboratory for automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU General Public License.

  13. Scalable and fail-safe deployment of the ATLAS Distributed Data Management system Rucio

    CERN Document Server

    Lassnig, Mario; The ATLAS collaboration; Barisits, Martin-Stefan; Beermann, Thomas Alfons; Serfon, Cedric; Garonne, Vincent

    2015-01-01

    This contribution details the deployment of Rucio, the ATLAS Distributed Data Management system. The main complication is that Rucio interacts with a wide variety of external services, and connects globally distributed data centres under different technological and administrative control, at an unprecedented data volume. It is therefore not possible to create a duplicate instance of Rucio for testing or integration. Every software upgrade or configuration change is thus potentially disruptive and requires fail-safe software and automatic error recovery. Rucio uses a three-layer scaling and mitigation strategy based on quasi-realtime monitoring. This strategy mainly employs independent stateless services, automatic failover, and service migration. The technologies used for deployment and mitigation include OpenStack, Puppet, Graphite, HAProxy, Apache, and nginx. In this contribution, the reasons and design decisions for the deployment, the actual implementation, and an evaluation of all involved services and c...

  14. 14th ACIS/IEEE International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing

    CERN Document Server

    Studies in Computational Intelligence : Volume 492

    2013-01-01

    This edited book presents scientific results of the 14th ACIS/IEEE International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD 2013), held in Honolulu, Hawaii, USA on July 1-3, 2013. The aim of this conference was to bring together scientists, engineers, computer users, and students to share their experiences and exchange new ideas, research results about all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them. The conference organizers selected the 17 outstanding papers from those papers accepted for presentation at the conference.  

  15. 15th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing

    CERN Document Server

    2015-01-01

    This edited book presents scientific results of 15th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD 2014) held on June 30 – July 2, 2014 in Las Vegas Nevada, USA. The aim of this conference was to bring together scientists, engineers, computer users, and students to share their experiences and exchange new ideas, research results about all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them. The conference organizers selected the 13 outstanding papers from those papers accepted for presentation at the conference.

  16. Cross-border software development of health information system: A case study on project between India and Pakistan based on open source software

    OpenAIRE

    Sabir, Uzma

    2017-01-01

    Global software development is a phenomenon that has received considerable interest from researchers during the past two decades. Several challenges have been identified and approaches to deal with these challenges have been developed. Typically, western companies outsource their projects to countries where costs are lower and skilled professionals are easily available. The majority of these projects are developed for commercial purposes. However, software development projects between India and Pakis...

  17. Design and implement of BESIII online histogramming software

    International Nuclear Information System (INIS)

    Li Fei; Wang Liang; Liu Yingjie; Chinese Academy of Sciences, Beijing; Zhu Kejun; Zhao Jingwei

    2007-01-01

    The online histogramming software is an important part of the BESIII DAQ (Data Acquisition) system. This article introduces the main requirements and design of the online histogramming software and presents how histograms are produced, transmitted and gathered in the distributed environment in the current software implementation. The article also illustrates a simple and easily extensible way of configuring the setup with an XML configuration database. (authors)

  18. Global Ionosphere Mapping and Differential Code Bias Estimation during Low and High Solar Activity Periods with GIMAS Software

    Directory of Open Access Journals (Sweden)

    Qiang Zhang

    2018-05-01

    Full Text Available Ionosphere research using Global Navigation Satellite System (GNSS) techniques is a hot topic, given their unprecedentedly high temporal and spatial sampling rate. We introduced a new GNSS Ionosphere Monitoring and Analysis Software (GIMAS) in order to model global ionosphere vertical total electron content (VTEC) maps and to estimate the GPS and GLObalnaya NAvigatsionnaya Sputnikovaya Sistema (GLONASS) satellite and receiver differential code biases (DCBs). The GIMAS-based Global Ionosphere Map (GIM) products during low (day of year 202 to 231, in 2008) and high (day of year 050 to 079, in 2014) solar activity periods were investigated and assessed. The results showed that the biases of the GIMAS-based VTEC maps relative to the International GNSS Service (IGS) Ionosphere Associate Analysis Centers (IAACs) VTEC maps ranged from −3.0 to 1.0 TECU (TEC unit; 1 TECU = 1 × 10^16 electrons/m^2). The standard deviations (STDs) ranged from 0.7 to 1.9 TECU in 2008, and from 2.0 to 8.0 TECU in 2014. The STDs at low latitudes were significantly larger than those at middle and high latitudes, as a result of the ionospheric latitudinal gradients. When compared with the Jason-2 VTEC measurements, the GIMAS-based VTEC maps showed a negative systematic bias of about −1.8 TECU in 2008, and a positive systematic bias of about +2.2 TECU in 2014. The STDs were about 2.0 TECU in 2008, and ranged from 2.2 to 8.5 TECU in 2014. Furthermore, the aforementioned characteristics were strongly related to the conditions of ionosphere variation and to geographic latitude. The GPS and GLONASS satellite and receiver P1-P2 DCBs were compared with the IAACs DCBs. The root mean squares (RMSs) were 0.16–0.20 ns in 2008 and 0.13–0.25 ns in 2014 for the GPS satellites and 0.26–0.31 ns in 2014 for the GLONASS satellites. The RMSs of receiver DCBs were 0.21–0.42 ns in 2008 and 0.33–1.47 ns in 2014 for GPS and 0.67–0.96 ns in 2014 for GLONASS. The monthly

  19. Social Software: A Powerful Paradigm for Building Technology for Global Learning

    Science.gov (United States)

    Wooding, Amy; Wooding, Kjell

    2018-01-01

    It is not difficult to imagine a world where internet-connected mobile devices are accessible to everyone. Can these technologies be used to help solve the challenges of global education? This was the challenge posed by the Global Learning XPRIZE--a $15 million grand challenge competition aimed at addressing this global teaching shortfall. In…

  20. Global asymptotic stability analysis of bidirectional associative memory neural networks with distributed delays and impulse

    International Nuclear Information System (INIS)

    Huang Zaitang; Luo Xiaoshu; Yang Qigui

    2007-01-01

    Many systems existing in physics, chemistry, biology, engineering and information science can be characterized by impulsive dynamics caused by abrupt jumps at certain instants during the process. These complex dynamical behaviors can be modeled by impulsive differential systems or impulsive neural networks. This paper formulates and studies a new model of impulsive bidirectional associative memory (BAM) networks with finite distributed delays. Several fundamental issues, such as global asymptotic stability and the existence and uniqueness of solutions of such BAM neural networks with impulses and distributed delays, are established

  1. Co-sourcing in software development offshoring

    DEFF Research Database (Denmark)

    Schlichter, Bjarne Rerup; Persson, John Stouby

    2013-01-01

    Software development projects are increasingly geographically distributed through offshoring, which introduces complex risks that can lead to project failure. Co-sourcing is a highly integrative and cohesive approach to software development offshoring that has been seen to be successful. However, research of how co-sour......-taking by high attention to of the closely interrelated structure and technology components in terms of CMMI and the actors’ cohesion and integration in terms of Scrum.

  2. Automated tools and techniques for distributed Grid Software Development of the testbed infrastructure

    CERN Document Server

    Aguado Sanchez, C

    2007-01-01

    Grid technology is becoming more and more important as the new paradigm for sharing computational resources across different organizations in a secure way. The great power of this solution requires the definition of a generic stack of services and protocols, and this is the scope of the different Grid initiatives. As a result of international collaborations for its development, the Open Grid Forum created the Open Grid Services Architecture (OGSA), which aims to define the common set of services that will enable interoperability across the different implementations. This master thesis has been developed in this framework, as part of the two European-funded projects ETICS and OMII-Europe. The main objective is to contribute to the design and maintenance of large distributed development projects with automated tools that enable the implementation of Software Engineering techniques oriented towards achieving an acceptable level of quality in the release process. Specifically, this thesis develops the testbed concept a...

  3. Beyond Open Source: Evaluating the Community Availability of Software

    Directory of Open Access Journals (Sweden)

    Bret Davidson

    2016-01-01

    Full Text Available The Code4Lib community has produced an increasingly impressive collection of open source software over the last decade, but much of this creative work remains out of reach for large portions of the library community. Do the relatively privileged institutions represented by a majority of Code4Lib participants have a professional responsibility to support the adoption of their innovations? Drawing from old and new software packaging and distribution approaches (from freeware to Docker), we propose extending the open source software values of collaboration and transparency to include the wide and affordable distribution of software. We believe this will not only simplify the process of sharing our applications within the library community, but also make it possible for less well-resourced institutions to actually use our software. We identify areas of need, present our experiences with the users of our own open source projects, discuss our attempts to go beyond open source, propose a preliminary set of technology availability performance indicators for evaluating software availability, and make an argument for the internal value of supporting and encouraging a vibrant library software ecosystem.

  4. New results on global exponential dissipativity analysis of memristive inertial neural networks with distributed time-varying delays.

    Science.gov (United States)

    Zhang, Guodong; Zeng, Zhigang; Hu, Junhao

    2018-01-01

    This paper is concerned with the global exponential dissipativity of memristive inertial neural networks with discrete and distributed time-varying delays. By constructing appropriate Lyapunov-Krasovskii functionals, some new sufficient conditions ensuring global exponential dissipativity of memristive inertial neural networks are derived. Moreover, the globally exponentially attractive sets and positive invariant sets are also presented. In addition, the newly proposed results complement and extend earlier publications on conventional and memristive neural network dynamical systems. Finally, numerical simulations are given to illustrate the effectiveness of the obtained results. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Temperature drives global patterns in forest biomass distribution in leaves, stems, and roots.

    Science.gov (United States)

    Reich, Peter B; Luo, Yunjian; Bradford, John B; Poorter, Hendrik; Perry, Charles H; Oleksyn, Jacek

    2014-09-23

    Whether the fraction of total forest biomass distributed in roots, stems, or leaves varies systematically across geographic gradients remains unknown despite its importance for understanding forest ecology and modeling global carbon cycles. It has been hypothesized that plants should maintain proportionally more biomass in the organ that acquires the most limiting resource. Accordingly, we hypothesize greater biomass distribution in roots and less in stems and foliage in increasingly arid climates and in colder environments at high latitudes. Such a strategy would increase uptake of soil water in dry conditions and of soil nutrients in cold soils, where they are at low supply and are less mobile. We use a large global biomass dataset (>6,200 forests from 61 countries, across a 40 °C gradient in mean annual temperature) to address these questions. Climate metrics involving temperature were better predictors of biomass partitioning than those involving moisture availability, because, surprisingly, fractional distribution of biomass to roots or foliage was unrelated to aridity. In contrast, in increasingly cold climates, the proportion of total forest biomass in roots was greater and in foliage was smaller for both angiosperm and gymnosperm forests. These findings support hypotheses about adaptive strategies of forest trees to temperature and provide biogeographically explicit relationships to improve ecosystem and earth system models. They also will allow, for the first time to our knowledge, representations of root carbon pools that consider biogeographic differences, which are useful for quantifying whole-ecosystem carbon stocks and cycles and for assessing the impact of climate change on forest carbon dynamics.

  6. A controlled experiment on the impact of software structure on maintainability

    Science.gov (United States)

    Rombach, Dieter H.

    1987-01-01

    The impact of software structure on maintainability aspects including comprehensibility, locality, modifiability, and reusability in a distributed system environment is studied in a controlled maintenance experiment involving six medium-size distributed software systems implemented in LADY (language for distributed systems) and six in an extended version of sequential PASCAL. For all maintenance aspects except reusability, the results were quantitatively given in terms of complexity metrics which could be automated. The results showed LADY to be better suited to the development of maintainable software than the extension of sequential PASCAL. The strong typing combined with high parametrization of units is suggested to improve the reusability of units in LADY.

  7. EDUCATIONAL SOFTWARE PROMOTION AND DISTRIBUTION ON THE UKRAINIAN MARKET

    Directory of Open Access Journals (Sweden)

    Y.B. Samchinska

    2013-03-01

    Full Text Available The article considers the legislative requirements for and the features of distributing educational software, the state of the market for such products in Ukraine, and the main measures for sales promotion, advertising, and optimization of marketing communications for educational software developers.

  8. Software for modelling groundwater transport and contaminant migration

    International Nuclear Information System (INIS)

    Gishkelyuk, I.A.

    2008-01-01

    The capabilities of modern software for modelling groundwater transport and contaminant distribution are considered. The advantages of their application are discussed. A comparative analysis of the mathematical modelling software 'Groundwater Modeling System' and the 'Earth Science Module' from 'COMSOL Multiphysics' is carried out. (authors)

  9. Globally reasoning about localised security policies in distributed systems

    DEFF Research Database (Denmark)

    Hernandez, Alejandro Mario

    In this report, we aim at establishing proper ways of model checking the global security of distributed systems, which are designed as a set of localised security policies that enforce specific requirements on the expected security. The systems are formally specified following a syntax, defined in detail in this report, and their behaviour is clearly established by the Semantics, also defined in detail in this report. The systems include the formal attachment of security policies to their locations, whose intended interactions are trapped by the policies, aiming at taking access control decisions for the system, and the Semantics also takes care of this. Using the Semantics, a Labelled Transition System (LTS) can be induced for every particular system, and over this LTS some model checking tasks can be done. We identify how this LTS is obtained, and propose an alternative

  10. The NLC Software Requirements Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shoaee, Hamid

    2002-08-20

    We describe the software requirements and development methodology developed for the NLC control system. Given the longevity of that project, and the likely geographical distribution of the collaborating engineers, the planned requirements management process is somewhat more formal than the norm in high energy physics projects. The short term goals of the requirements process are to accurately estimate costs, to decompose the problem, and to determine likely technologies. The long term goal is to enable a smooth transition from high level functional requirements to specific subsystem and component requirements for individual programmers, and to support distributed development. The methodology covers both ends of that life cycle. It covers both the analytical and documentary tools for software engineering, and project management support. This paper introduces the methodology, which is fully described in [1].

  11. ALMA software architecture

    Science.gov (United States)

    Schwarz, Joseph; Raffi, Gianni

    2002-12-01

    The Atacama Large Millimeter Array (ALMA) is a joint project involving astronomical organizations in Europe and North America. ALMA will consist of at least 64 12-meter antennas operating in the millimeter and sub-millimeter range. It will be located at an altitude of about 5000m in the Chilean Atacama desert. The primary challenge to the development of the software architecture is the fact that both its development and runtime environments will be distributed. Groups at different institutes will develop the key elements such as Proposal Preparation tools, Instrument operation, On-line calibration and reduction, and Archiving. The Proposal Preparation software will be used primarily at scientists' home institutions (or on their laptops), while Instrument Operations will execute on a set of networked computers at the ALMA Operations Support Facility. The ALMA Science Archive, itself to be replicated at several sites, will serve astronomers worldwide. Building upon the existing ALMA Common Software (ACS), the system architects will prepare a robust framework that will use XML-encoded entity objects to provide an effective solution to the persistence needs of this system, while remaining largely independent of any underlying DBMS technology. Independence of distributed subsystems will be facilitated by an XML- and CORBA-based pass-by-value mechanism for exchange of objects. Proof of concept (as well as a guide to subsystem developers) will come from a prototype whose details will be presented.

  12. A global probe into dental student perceptions about philanthropy, global dentistry and international student exchanges.

    Science.gov (United States)

    Ivanoff, Chris S; Yaneva, Krassimira; Luan, Diana; Andonov, Bogomil; Kumar, Reena R; Agnihotry, Anirudha; Ivanoff, Athena E; Emmanouil, Dimitrios; Volpato, Luiz Evaristo Ricci; Koneski, Filip; Muratovska, Ilijana; Al-Shehri, Huda A; Al-Taweel, Sara M; Daly, Michele

    2017-04-01

    Training culturally competent graduates who can practice effectively in a multicultural environment is a goal of contemporary dental education. The Global Oral Health Initiative is a network of dental schools seeking to promote global dentistry as a component of cultural competency training. Before initiating international student exchanges, a survey was conducted to assess students' awareness of global dentistry and interest in cross-national clerkships. A 22-question, YES/NO survey was distributed to 3,487 dental students at eight schools in seven countries. The questions probed students about their school's commitment to enhance their education by promoting global dentistry, volunteerism and philanthropy. The data were analysed using Vassarstats statistical software. In total, 2,371 students (67.9%) completed the survey. Cultural diversity was seen as an important component of dental education by 72.8% of the students, with two-thirds (66.9%) acknowledging that their training provided preparation for understanding the oral health care needs of disparate peoples. A high proportion (87.9%) agreed that volunteerism and philanthropy are important qualities of a well-rounded dentist, but only about one-third felt that their school supported these behaviours (36.2%) or demonstrated a commitment to promote global dentistry (35.5%). In addition, 87.4% felt that dental schools are morally bound to improve oral health care in marginalised global communities and should provide students with international exchange missions (91%), which would enhance their cultural competency (88.9%) and encourage their participation in charitable missions after graduation (67.6%). The study suggests that dental students would value international exchanges, which may enhance students' knowledge and self-awareness related to cultural competence. © 2016 FDI World Dental Federation.

  13. Global distribution of Chelonid fibropapilloma-associated herpesvirus among clinically healthy sea turtles

    DEFF Research Database (Denmark)

    Alfaro Nuñez, Luis Alonso; Bertelsen, Mads Frost; Bojesen, Anders Miki

    2014-01-01

    Background: Fibropapillomatosis (FP) is a neoplastic disease characterized by cutaneous tumours that has been documented to infect all sea turtle species. Chelonid fibropapilloma-associated herpesvirus (CFPHV) is believed to be the aetiological agent of FP, based principally on consistent PCR-based detection of herpesvirus DNA sequences from FP tumours. We used a recently described PCR-based assay that targets 3 conserved CFPHV genes, to survey 208 green turtles (Chelonia mydas). This included both FP tumour-exhibiting and clinically healthy individuals. An additional 129 globally distributed

  14. Improving Data Catalogs with Free and Open Source Software

    Science.gov (United States)

    Schweitzer, R.; Hankin, S.; O'Brien, K.

    2013-12-01

    The Global Earth Observation Integrated Data Environment (GEO-IDE) is NOAA's effort to successfully integrate data and information with partners in the national US-Global Earth Observation System (US-GEO) and the international Global Earth Observation System of Systems (GEOSS). As part of the GEO-IDE, the Unified Access Framework (UAF) is working to build momentum towards the goal of increased data integration and interoperability. The UAF project is moving towards this goal with an approach that includes leveraging well known and widely used standards, as well as free and open source software. The UAF project shares the widely held conviction that the use of data standards is a key ingredient necessary to achieve interoperability. Many community-based consensus standards fail, though, due to poor compliance. Compliance problems emerge for many reasons: because the standards evolve through versions, because documentation is ambiguous or because individual data providers find the standard inadequate as-is to meet their special needs. In addition, minimalist use of standards will lead to a compliant service, but one which is of low quality. In this presentation, we will be discussing the UAF effort to build a catalog cleaning tool which is designed to crawl THREDDS catalogs, analyze the data available, and then build a 'clean' catalog of data which is standards compliant and has a uniform set of data access services available. These data services include, among others, OPeNDAP, Web Coverage Service (WCS) and Web Mapping Service (WMS). We will also discuss how we are utilizing free and open source software and services to both crawl, analyze and build the clean data catalog, as well as our efforts to help data providers improve their data catalogs. We'll discuss the use of open source software such as DataNucleus, Thematic Realtime Environmental Distributed Data Services (THREDDS), ncISO and the netCDF Java Common Data Model (CDM). We'll also demonstrate how we are
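
    A minimal sketch of the kind of catalog crawl described above, assuming a THREDDS catalog.xml reachable over HTTP: the URL is a placeholder, the namespace string is the commonly used THREDDS InvCatalog 1.0 namespace (verify against the target server), and the check is only a stand-in for the fuller compliance analysis the UAF tool performs.

        import urllib.request
        import xml.etree.ElementTree as ET

        THREDDS_NS = "{http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0}"
        CATALOG_URL = "http://example.org/thredds/catalog.xml"  # placeholder

        def crawl(url):
            """Yield (name, urlPath) for every dataset element in a THREDDS catalog."""
            with urllib.request.urlopen(url) as resp:
                root = ET.parse(resp).getroot()
            for ds in root.iter(THREDDS_NS + "dataset"):
                yield ds.get("name"), ds.get("urlPath")

        for name, url_path in crawl(CATALOG_URL):
            print(name, "ok" if url_path else "container / no access path")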

  15. PAnalyzer: A software tool for protein inference in shotgun proteomics

    Directory of Open Access Journals (Sweden)

    Prieto Gorka

    2012-11-01

    Full Text Available Abstract Background Protein inference from peptide identifications in shotgun proteomics must deal with ambiguities that arise due to the presence of peptides shared between different proteins, which is common in higher eukaryotes. Recently data independent acquisition (DIA approaches have emerged as an alternative to the traditional data dependent acquisition (DDA in shotgun proteomics experiments. MSE is the term used to name one of the DIA approaches used in QTOF instruments. MSE data require specialized software to process acquired spectra and to perform peptide and protein identifications. However the software available at the moment does not group the identified proteins in a transparent way by taking into account peptide evidence categories. Furthermore the inspection, comparison and report of the obtained results require tedious manual intervention. Here we report a software tool to address these limitations for MSE data. Results In this paper we present PAnalyzer, a software tool focused on the protein inference process of shotgun proteomics. Our approach considers all the identified proteins and groups them when necessary indicating their confidence using different evidence categories. PAnalyzer can read protein identification files in the XML output format of the ProteinLynx Global Server (PLGS software provided by Waters Corporation for their MSE data, and also in the mzIdentML format recently standardized by HUPO-PSI. Multiple files can also be read simultaneously and are considered as technical replicates. Results are saved to CSV, HTML and mzIdentML (in the case of a single mzIdentML input file files. An MSE analysis of a real sample is presented to compare the results of PAnalyzer and ProteinLynx Global Server. Conclusions We present a software tool to deal with the ambiguities that arise in the protein inference process. Key contributions are support for MSE data analysis by ProteinLynx Global Server and technical replicates
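
    A minimal sketch of evidence-based protein grouping in the spirit described above (not PAnalyzer's actual algorithm): proteins whose identified peptide sets are indistinguishable are merged into one ambiguity group, and groups whose peptides are a proper subset of another group's are marked non-conclusive. The identifications are invented for the example.

        from collections import defaultdict

        # Hypothetical peptide-level identifications: protein -> set of peptide sequences.
        evidence = {
            "P1": {"AAK", "LLR", "GGF"},
            "P2": {"AAK", "LLR", "GGF"},  # indistinguishable from P1
            "P3": {"AAK"},                # subsumed by the P1/P2 group
            "P4": {"TTY", "QQW"},
        }

        def group_proteins(evidence):
            groups = defaultdict(list)  # identical peptide sets collapse into one group
            for protein, peptides in evidence.items():
                groups[frozenset(peptides)].append(protein)
            report = []
            for peptides, members in groups.items():
                subsumed = any(peptides < other for other in groups if other != peptides)
                category = "non-conclusive" if subsumed else "conclusive or indistinguishable"
                report.append((sorted(members), sorted(peptides), category))
            return report

        for members, peptides, category in group_proteins(evidence):
            print(members, peptides, category)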

  16. Managing globally distributed expertise with new competence management solutions a big-science collaboration as a pilot case.

    CERN Document Server

    Ferguson, J; Livan, M; Nordberg, M; Salmia, T; Vuola, O

    2003-01-01

    In today's global organisations and networks, a critical factor for effective innovation and project execution is appropriate competence and skills management. The challenges include selection of strategic competences, competence development, and leveraging the competences and skills to drive innovation and collaboration for shared goals. This paper presents a new industrial web-enabled competence management and networking solution and its implementation and piloting in a complex big-science environment of globally distributed competences.

  17. Estimates of global, regional, and national incidence, prevalence, and mortality of HIV, 1980–2015: the Global Burden of Disease Study 2015

    DEFF Research Database (Denmark)

    Moesgaard Iburg, Kim

    2016-01-01

    and sex on initial CD4 distribution at infection, CD4 progression rates (probability of progression from higher to lower CD4 cell-count category), on and off antiretroviral therapy (ART) mortality, and mortality from all other causes. Our estimation strategy links the GBD 2015 assessment of all......Summary Background Timely assessment of the burden of HIV/AIDS is essential for policy setting and programme evaluation. In this report from the Global Burden of Disease Study 2015 (GBD 2015), we provide national estimates of levels and trends of HIV/AIDS incidence, prevalence, coverage......-cause mortality and estimation of incidence and prevalence so that for each draw from the uncertainty distribution all assumptions used in each step are internally consistent. We estimated incidence, prevalence, and death with GBD versions of the Estimation and Projection Package (EPP) and Spectrum software...

  18. I-Structure software cache for distributed applications

    Directory of Open Access Journals (Sweden)

    Alfredo Cristóbal Salas

    2004-01-01

    Full Text Available In this article we describe the I-Structure software cache for distributed-memory environments (D-ISSC), which takes advantage of data locality while preserving the latency-tolerance capability of I-Structure memory systems. The programming facilities of MPI programs hide synchronization problems from the programmer. Our experimental evaluation using a benchmark suite indicates that PC clusters equipped with I-Structures and the D-ISSC caching mechanism are more robust. The system can speed up both regular and irregular communication-intensive applications.

  19. Statistical approach to software reliability certification

    NARCIS (Netherlands)

    Corro Ramos, I.; Di Bucchianico, A.; Hee, van K.M.

    2009-01-01

    We present a sequential software release procedure that certifies with some confidence level that the next error is not occurring within a certain time interval. Our procedure is defined in such a way that the release time is optimal for single stages and the global risk can be controlled. We assume
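
    A hedged numerical illustration of certifying that the next error is unlikely within a given interval (an exponential-model toy calculation, not the authors' sequential procedure): with zero failures observed during t hours of testing, a one-sided upper confidence bound on the failure rate is -ln(1 - confidence)/t, and release is accepted only if the survival probability over the horizon under that bound meets the target.

        import math

        def certifiable(failure_free_hours, horizon_hours, confidence=0.95):
            """Illustrative release check under an assumed exponential failure model."""
            rate_upper = -math.log(1.0 - confidence) / failure_free_hours
            p_no_error = math.exp(-rate_upper * horizon_hours)
            return p_no_error >= confidence, rate_upper, p_no_error

        ok, rate, p = certifiable(failure_free_hours=2000.0, horizon_hours=100.0)
        print(f"release: {ok}, rate bound: {rate:.5f}/h, P(no error in horizon): {p:.3f}")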

  20. The STARLINK software collection

    Science.gov (United States)

    Penny, A. J.; Wallace, P. T.; Sherman, J. C.; Terret, D. L.

    1993-12-01

    A demonstration will be given of some recent Starlink software. STARLINK is: a network of computers used by UK astronomers; a collection of programs for the calibration and analysis of astronomical data; a team of people giving hardware, software and administrative support. The Starlink Project has been in operation since 1980 to provide UK astronomers with interactive image processing and data reduction facilities. There are now Starlink computer systems at 25 UK locations, serving about 1500 registered users. The Starlink software collection now has about 25 major packages covering a wide range of astronomical data reduction and analysis techniques, as well as many smaller programs and utilities. At the core of most of the packages is a common `software environment', which provides many of the functions which applications need and offers standardized methods of structuring and accessing data. The software environment simplifies programming and support, and makes it easy to use different packages for different stages of the data reduction. Users see a consistent style, and can mix applications without hitting problems of differing data formats. The Project group coordinates the writing and distribution of this software collection, which is Unix based. Outside the UK, Starlink is used at a large number of places, which range from installations at major UK telescopes, which are Starlink-compatible and managed like Starlink sites, to individuals who run only small parts of the Starlink software collection.

  1. OntoSoft: A Software Registry for Geosciences

    Science.gov (United States)

    Garijo, D.; Gil, Y.

    2017-12-01

    The goal of the EarthCube OntoSoft project is to enable the creation of an ecosystem for software stewardship in geosciences that will empower scientists to manage their software as valuable scientific assets. By sharing software metadata in OntoSoft, scientists enable broader access to that software by other scientists, software professionals, students, and decision makers. Our work to date includes: 1) an ontology for describing scientific software metadata, 2) a distributed scientific software repository that contains more than 750 entries that can be searched and compared across metadata fields, 3) an intelligent user interface that guides scientists to publish software and allows them to crowdsource its corresponding metadata. We have also developed a training program where scientists learn to describe and cite software in their papers in addition to data and provenance, and we are using OntoSoft to show them the benefits of publishing their software metadata. This training program is part of a Geoscience Papers of the Future Initiative, where scientists are reflecting on their current practices, benefits and effort for sharing software and data. This journal paper can be submitted to a Special Section of the AGU Earth and Space Science Journal.
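
    To make the notion of software metadata concrete, a minimal sketch of the kind of structured record such a registry collects is shown below; the field names and values are illustrative placeholders, not the OntoSoft ontology's actual property names.

        import json

        record = {
            "name": "HydroFlow",  # hypothetical model name
            "description": "Routes surface runoff over a digital elevation model.",
            "creators": ["A. Scientist"],
            "license": "Apache-2.0",
            "programming_language": "Python",
            "repository": "https://example.org/hydroflow",  # placeholder URL
            "citation": "Scientist, A. (2017). HydroFlow v1.2.",
        }

        print(json.dumps(record, indent=2))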

  2. Global Distribution of Net Electron Acceptance in Subseafloor Sediment

    Science.gov (United States)

    Fulfer, V. M.; Pockalny, R. A.; D'Hondt, S.

    2017-12-01

    We quantified the global distribution of net electron acceptance rates (e-/m2/year) in subseafloor sediment (>1.5 meters below seafloor [mbsf]) using (i) a modified version of the chemical-reaction-rate algorithm by Wang et al. (2008), (ii) physical properties and dissolved oxygen and sulfate data from interstitial waters of sediment cores collected by the Ocean Drilling Program, Integrated Ocean Drilling Program, International Ocean Discovery Program, and U.S. coring expeditions, and (iii) correlation of net electron acceptance rates to global oceanographic properties. Calculated net rates vary from 4.8 x 10^19 e-/m2/year for slowly accumulating abyssal clay to 1.2 x 10^23 e-/m2/year for regions of high sedimentation rate. Net electron acceptance rate correlates strongly with mean sedimentation rate. Where sedimentation rate is very low (e.g., 1 m/Myr), dissolved oxygen penetrates more than 70 mbsf and is the primary terminal electron acceptor. Where sedimentation rate is moderate (e.g., 3 to 60 m/Myr), dissolved sulfate penetrates as far as 700 mbsf and is the principal terminal electron acceptor. Where sedimentation rate is high (e.g., > 60 m/Myr), dissolved sulfate penetrates only meters, but is the principal terminal electron acceptor in subseafloor sediment to the depth of sulfate penetration. Because microbial metabolism continues at greater depths than the depth of sulfate penetration in fast-accumulating sediment, complete quantification of subseafloor metabolic rates will require consideration of other chemical species.

  3. Global Distribution of Planetary Boundary Layer Height Derived from CALIPSO

    Science.gov (United States)

    Huang, J.

    2015-12-01

    The global distribution of planetary boundary layer (PBL) height, which was estimated from the attenuated back-scatter observations of Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO), is presented. In general, the PBL is capped by a temperature inversion that tends to trap moisture and aerosols. The gradient of back-scatter observed by lidar is almost always associated with this temperature inversion and the simultaneous decrease of moisture content. Thus, the PBL top is defined as the location of the maximum aerosol scattering gradient, which is analogous to the more conventional thermodynamic definition. The maximum standard deviation method, developed by Jordan et al. (2010), is modified and used to derive the global PBL heights. The derived PBL heights are not only consistent with the results of McGrath-Spangler and Denning (2012) but also agree well with the ground-based lidar measurements. It is found that the correlation between CALIPSO and the ground-based lidar was 0.73. Global seasonal mean patterns of 4 years of mid-day PBL heights are demonstrated. It is also found that the largest PBL heights occur over the Tibetan Plateau and the coastal areas. The smallest PBL heights appear in the Tarim Basin and the northeast of China during the local winter. The comparison of PBL heights from CALIPSO and ECMWF under different land-cover conditions showed that, over ocean and forest surfaces, the PBL heights estimated from the CALIPSO back-scatter climatology are larger than those estimated from ECMWF data. However, over grassland and bare land surfaces in spring and summer, the PBL heights from ECMWF are larger than those from CALIPSO.

  4. Income Disparities and the Global Distribution of Intensively Farmed Chicken and Pigs.

    Directory of Open Access Journals (Sweden)

    Marius Gilbert

    Full Text Available The rapid transformation of the livestock sector in recent decades brought concerns on its impact on greenhouse gas emissions, disruptions to nitrogen and phosphorous cycles and on land use change, particularly deforestation for production of feed crops. Animal and human health are increasingly interlinked through emerging infectious diseases, zoonoses, and antimicrobial resistance. In many developing countries, the rapidity of change has also had social impacts with increased risk of marginalisation of smallholder farmers. However, both the impacts and benefits of livestock farming often differ between extensive (backyard farming mostly for home-consumption) and intensive, commercial production systems (larger herd or flock size, higher investments in inputs, a tendency towards market-orientation). A density of 10,000 chickens per km2 has different environmental, epidemiological and societal implications if these birds are raised by 1,000 individual households or in a single industrial unit. Here, we introduce a novel relationship that links the national proportion of extensively raised animals to the gross domestic product (GDP) per capita (in purchasing power parity). This relationship is modelled and used together with the global distribution of rural population to disaggregate existing 10 km resolution global maps of chicken and pig distributions into extensive and intensive systems. Our results highlight countries and regions where extensive and intensive chicken and pig production systems are most important. We discuss the sources of uncertainties, the modelling assumptions and ways in which this approach could be developed to forecast future trajectories of intensification.

  5. Income Disparities and the Global Distribution of Intensively Farmed Chicken and Pigs.

    Science.gov (United States)

    Gilbert, Marius; Conchedda, Giulia; Van Boeckel, Thomas P; Cinardi, Giuseppina; Linard, Catherine; Nicolas, Gaëlle; Thanapongtharm, Weerapong; D'Aietti, Laura; Wint, William; Newman, Scott H; Robinson, Timothy P

    2015-01-01

    The rapid transformation of the livestock sector in recent decades brought concerns on its impact on greenhouse gas emissions, disruptions to nitrogen and phosphorous cycles and on land use change, particularly deforestation for production of feed crops. Animal and human health are increasingly interlinked through emerging infectious diseases, zoonoses, and antimicrobial resistance. In many developing countries, the rapidity of change has also had social impacts with increased risk of marginalisation of smallholder farmers. However, both the impacts and benefits of livestock farming often differ between extensive (backyard farming mostly for home-consumption) and intensive, commercial production systems (larger herd or flock size, higher investments in inputs, a tendency towards market-orientation). A density of 10,000 chickens per km2 has different environmental, epidemiological and societal implications if these birds are raised by 1,000 individual households or in a single industrial unit. Here, we introduce a novel relationship that links the national proportion of extensively raised animals to the gross domestic product (GDP) per capita (in purchasing power parity). This relationship is modelled and used together with the global distribution of rural population to disaggregate existing 10 km resolution global maps of chicken and pig distributions into extensive and intensive systems. Our results highlight countries and regions where extensive and intensive chicken and pig production systems are most important. We discuss the sources of uncertainties, the modelling assumptions and ways in which this approach could be developed to forecast future trajectories of intensification.
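
    A toy sketch of the disaggregation idea described above (not the authors' fitted model, and ignoring the rural-population weighting): an assumed decreasing relationship between GDP per capita and the share of extensively raised animals is applied to a total stock density to split it into extensive and intensive components.

        def extensive_fraction(gdp_ppp_per_capita, midpoint=8000.0, steepness=1.5):
            """Illustrative logistic-style decline of the extensive share with GDP.
            Functional form and parameters are assumptions for the sketch only."""
            return 1.0 / (1.0 + (gdp_ppp_per_capita / midpoint) ** steepness)

        def disaggregate(total_density, gdp_ppp_per_capita):
            """Split a total density (animals per km2) into extensive and intensive parts."""
            p_ext = extensive_fraction(gdp_ppp_per_capita)
            return total_density * p_ext, total_density * (1.0 - p_ext)

        ext, intens = disaggregate(total_density=10000.0, gdp_ppp_per_capita=25000.0)
        print(f"extensive: {ext:.0f}/km2, intensive: {intens:.0f}/km2")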

  6. Income Disparities and the Global Distribution of Intensively Farmed Chicken and Pigs

    Science.gov (United States)

    Gilbert, Marius; Conchedda, Giulia; Van Boeckel, Thomas P.; Cinardi, Giuseppina; Linard, Catherine; Nicolas, Gaëlle; Thanapongtharm, Weerapong; D'Aietti, Laura; Wint, William; Newman, Scott H.; Robinson, Timothy P.

    2015-01-01

    The rapid transformation of the livestock sector in recent decades brought concerns on its impact on greenhouse gas emissions, disruptions to nitrogen and phosphorous cycles and on land use change, particularly deforestation for production of feed crops. Animal and human health are increasingly interlinked through emerging infectious diseases, zoonoses, and antimicrobial resistance. In many developing countries, the rapidity of change has also had social impacts with increased risk of marginalisation of smallholder farmers. However, both the impacts and benefits of livestock farming often differ between extensive (backyard farming mostly for home-consumption) and intensive, commercial production systems (larger herd or flock size, higher investments in inputs, a tendency towards market-orientation). A density of 10,000 chickens per km2 has different environmental, epidemiological and societal implications if these birds are raised by 1,000 individual households or in a single industrial unit. Here, we introduce a novel relationship that links the national proportion of extensively raised animals to the gross domestic product (GDP) per capita (in purchasing power parity). This relationship is modelled and used together with the global distribution of rural population to disaggregate existing 10 km resolution global maps of chicken and pig distributions into extensive and intensive systems. Our results highlight countries and regions where extensive and intensive chicken and pig production systems are most important. We discuss the sources of uncertainties, the modelling assumptions and ways in which this approach could be developed to forecast future trajectories of intensification. PMID:26230336

  7. Handbook of distribution

    International Nuclear Information System (INIS)

    Mo, In Gyu

    1992-01-01

    This book covers business strategy and distribution innovation, the purpose of intelligent distribution, intelligent supply distribution, intelligent production distribution, intelligent sales distribution, software for intelligent distribution, and the future of distribution. It also introduces component technologies supporting intelligent distribution, such as bar codes, OCR, packaging and intelligent automated warehouses, as well as system technology and cases from America, Japan and other countries.

  8. 16th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing

    CERN Document Server

    2016-01-01

    This edited book presents scientific results of the 16th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD 2015), which was held on June 1 – 3, 2015 in Takamatsu, Japan. The aim of this conference was to bring together researchers and scientists, businessmen and entrepreneurs, teachers, engineers, computer users, and students to discuss the numerous fields of computer science, to share their experiences and exchange new ideas and information in a meaningful way, to present research results on all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them.

  9. 17th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing

    CERN Document Server

    SNPD 2016

    2016-01-01

    This edited book presents scientific results of the 17th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD 2016), which was held on May 30 - June 1, 2016 in Shanghai, China. The aim of this conference was to bring together researchers and scientists, businessmen and entrepreneurs, teachers, engineers, computer users, and students to discuss the numerous fields of computer science, to share their experiences and exchange new ideas and information in a meaningful way, to present research results on all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them.

  10. Closing gaps between open software and public data in a hackathon setting: User-centered software prototyping.

    Science.gov (United States)

    Busby, Ben; Lesko, Matthew; Federer, Lisa

    2016-01-01

    In genomics, bioinformatics and other areas of data science, gaps exist between extant public datasets and the open-source software tools built by the community to analyze similar data types.  The purpose of biological data science hackathons is to assemble groups of genomics or bioinformatics professionals and software developers to rapidly prototype software to address these gaps.  The only two rules for the NCBI-assisted hackathons run so far are that 1) data either must be housed in public data repositories or be deposited to such repositories shortly after the hackathon's conclusion, and 2) all software comprising the final pipeline must be open-source or open-use.  Proposed topics, as well as suggested tools and approaches, are distributed to participants at the beginning of each hackathon and refined during the event.  Software, scripts, and pipelines are developed and published on GitHub, a web service providing publicly available, free-usage tiers for collaborative software development. The code resulting from each hackathon is published at https://github.com/NCBI-Hackathons/ with separate directories or repositories for each team.

  11. A software product line approach to enhance a meta-scheduler middleware

    International Nuclear Information System (INIS)

    Scheidt, Rafael F; Schmidt, Katreen; Pessoa, Gabriel M; Viera, Matheus A; Dantas, Mario

    2012-01-01

    Software projects in general tend toward greater software reuse and componentization in order to reduce the time, cost and resources needed for new products. The need for techniques and tools to organize higher-quality projects in less time is one of the greatest challenges of Software Engineering. The Software Product Line approach is proposed to organize and systematically assist the development of series of new products within the same domain. In this context, this paper applies the Software Product Line approach to Distributed Computing Environments. In projects that involve distributed environments, each version of the same product can repeatedly generate the same artifacts as the product's characteristics evolve; there is, however, a principal architecture with variations of components. The goal of the proposed approach is to analyze the current process and to propose a new one in which new projects reuse the whole architecture, components and documents, starting from a solid base and creating new products focused on new functionalities. We expect that applying this approach will support the development of projects in Distributed Computing Environments.

  12. Software for virtual accelerator designing

    International Nuclear Information System (INIS)

    Kulabukhova, N.; Ivanov, A.; Korkhov, V.; Lazarev, A.

    2012-01-01

    The article discusses appropriate technologies for software implementation of the Virtual Accelerator. The Virtual Accelerator is considered as a set of services and tools enabling transparent execution of computational software for modeling beam dynamics in accelerators on distributed computing resources. Distributed storage and information processing facilities utilized by the Virtual Accelerator make use of the Service-Oriented Architecture (SOA) according to a cloud computing paradigm. Control system tool-kits (such as EPICS, TANGO), computing modules (including high-performance computing), realization of the GUI with existing frameworks and visualization of the data are discussed in the paper. The presented research consists of software analysis for realization of interaction between all levels of the Virtual Accelerator and some samples of middle-ware implementation. A set of the servers and clusters at St.-Petersburg State University form the infrastructure of the computing environment for Virtual Accelerator design. Usage of component-oriented technology for realization of Virtual Accelerator levels interaction is proposed. The article concludes with an overview and substantiation of a choice of technologies that will be used for design and implementation of the Virtual Accelerator. (authors)

  13. Software architecture considerations for ion source control systems

    International Nuclear Information System (INIS)

    Sinclair, J.W.

    1997-09-01

    General characteristics of distributed control system software tools are examined from the perspective of ion source control system requirements. Emphasis is placed on strategies for building extensible, distributed systems in which the ion source element is one component of a larger system. Vsystem, a commercial software tool kit from Vista Control Systems was utilized extensively in the control system upgrade of the Holifield Radioactive Ion Beam Facility. Part of the control system is described and the characteristics of Vsystem are examined and compared with those of EPICS, the Experimental Physics and Industrial Control System

  14. Global exponential stability analysis on impulsive BAM neural networks with distributed delays

    Science.gov (United States)

    Li, Yao-Tang; Yang, Chang-Bo

    2006-12-01

    Using the M-matrix and topological degree tools and by constructing a suitable Lyapunov functional, sufficient conditions are obtained for the existence, uniqueness and global exponential stability of the equilibrium point of bidirectional associative memory (BAM) neural networks with distributed delays subjected to impulsive state displacements at fixed instants of time. The results remove the usual assumptions of boundedness, monotonicity, and differentiability of the activation functions. It is shown that in some cases the stability criteria can be easily checked. Finally, an illustrative example is given to show the effectiveness of the presented criteria.
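
    For orientation, one common textbook formulation of an impulsive BAM network with distributed delays (not necessarily the exact system analysed in the paper) reads, in LaTeX notation:

        \begin{aligned}
        x_i'(t) &= -a_i x_i(t) + \sum_{j=1}^{m} b_{ji}\, f_j\Big(\int_0^{\infty} K_{ji}(s)\, y_j(t-s)\, ds\Big) + I_i, && t \neq t_k,\\
        y_j'(t) &= -c_j y_j(t) + \sum_{i=1}^{n} d_{ij}\, g_i\Big(\int_0^{\infty} \tilde{K}_{ij}(s)\, x_i(t-s)\, ds\Big) + J_j, && t \neq t_k,\\
        \Delta x_i(t_k) &= P_{ik}\big(x_i(t_k^-)\big), \qquad \Delta y_j(t_k) = Q_{jk}\big(y_j(t_k^-)\big), && k = 1, 2, \dots
        \end{aligned}

    Here the kernels K encode the distributed delays and the maps P and Q are the impulsive state displacements at the fixed instants t_k.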

  15. 3D Visualization of Global Ocean Circulation

    Science.gov (United States)

    Nelson, V. G.; Sharma, R.; Zhang, E.; Schmittner, A.; Jenny, B.

    2015-12-01

    Advanced 3D visualization techniques are seldom used to explore the dynamic behavior of ocean circulation. Streamlines are an effective method for visualization of flow, and they can be designed to clearly show the dynamic behavior of a fluidic system. We employ vector field editing and extraction software to examine the topology of velocity vector fields generated by a 3D global circulation model coupled to a one-layer atmosphere model simulating preindustrial and last glacial maximum (LGM) conditions. This results in a streamline-based visualization along multiple density isosurfaces on which we visualize points of vertical exchange and the distribution of properties such as temperature and biogeochemical tracers. Previous work involving this model examined the change in the energetics driving overturning circulation and mixing between simulations of LGM and preindustrial conditions. This visualization elucidates the relationship between locations of vertical exchange and mixing, as well as demonstrates the effects of circulation and mixing on the distribution of tracers such as carbon isotopes.

  16. Parasail: SIMD C library for global, semi-global, and local pairwise sequence alignments.

    Science.gov (United States)

    Daily, Jeff

    2016-02-10

    Sequence alignment algorithms are a key component of many bioinformatics applications. Though various fast Smith-Waterman local sequence alignment implementations have been developed for x86 CPUs, most are embedded into larger database search tools. In addition, fast implementations of Needleman-Wunsch global sequence alignment and its semi-global variants are not as widespread. This article presents the first software library for local, global, and semi-global pairwise intra-sequence alignments and improves the performance of previous intra-sequence implementations. A faster intra-sequence local pairwise alignment implementation is described and benchmarked, including new global and semi-global variants. Using a 375 residue query sequence a speed of 136 billion cell updates per second (GCUPS) was achieved on a dual Intel Xeon E5-2670 24-core processor system, the highest reported for an implementation based on Farrar's 'striped' approach. Rognes's SWIPE optimal database search application is still generally the fastest available at 1.2 to at best 2.4 times faster than Parasail for sequences shorter than 500 amino acids. However, Parasail was faster for longer sequences. For global alignments, Parasail's prefix scan implementation is generally the fastest, faster even than Farrar's 'striped' approach, however the opal library is faster for single-threaded applications. The software library is designed for 64 bit Linux, OS X, or Windows on processors with SSE2, SSE41, or AVX2. Source code is available from https://github.com/jeffdaily/parasail under the Battelle BSD-style license. Applications that require optimal alignment scores could benefit from the improved performance. For the first time, SIMD global, semi-global, and local alignments are available in a stand-alone C library.
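
    A compact pure-Python sketch of the global (Needleman-Wunsch) scoring recurrence that libraries such as Parasail accelerate with SIMD; it uses a simple match/mismatch score rather than a substitution matrix and returns only the optimal score, not the traceback.

        def nw_score(a, b, match=1, mismatch=-1, gap=-2):
            """Global (Needleman-Wunsch) alignment score via dynamic programming."""
            prev = [j * gap for j in range(len(b) + 1)]  # row for the empty prefix of a
            for i, ca in enumerate(a, start=1):
                curr = [i * gap]
                for j, cb in enumerate(b, start=1):
                    diag = prev[j - 1] + (match if ca == cb else mismatch)
                    curr.append(max(diag, prev[j] + gap, curr[j - 1] + gap))
                prev = curr
            return prev[-1]

        print(nw_score("GATTACA", "GCATGCU"))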

  17. Exoskeletons, Robots and System Software: Tools for the Warfighter

    Science.gov (United States)

    2012-04-24

    Exoskeletons, Robots and System Software: Tools for the Warfighter? Paul Flanagan, Tuesday, April 24, 2012, 11:15 am - 12:00 pm. Emerging technologies such as exoskeletons, robots, drones, and the underlying software are changing, and will continue to change, the face of the battlefield. What is an exoskeleton? An exoskeleton is a wearable robot suit that

  18. Improving Distributed Denial of Service (DDOS Detection using Entropy Method in Software Defined Network (SDN

    Directory of Open Access Journals (Sweden)

    Maman Abdurohman

    2017-12-01

    Full Text Available This research proposes a new method to enhance Distributed Denial of Service (DDoS) attack detection in a Software Defined Network (SDN) environment. It utilizes the OpenFlow controller of the SDN for DDoS attack detection with a modified, entropy-based method. The method checks whether traffic is normal or a DDoS attack by measuring the randomness of the packets, in two steps: computing the entropy and detecting the attack. The results show that the new method reduces false positives when there is a temporary and sudden increase in normal traffic: it succeeds in not flagging this as a DDoS attack. Compared to previous methods, the proposed method enhances DDoS attack detection in an SDN environment.
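
    A minimal sketch of the entropy measurement underlying this kind of detector (illustrative only, not the authors' exact algorithm): the Shannon entropy of destination addresses is computed over a window of packets, and a sudden drop below a threshold, meaning many packets converge on one target, is treated as a possible DDoS. Window size, addresses and threshold are invented for the example.

        import math
        from collections import Counter

        def shannon_entropy(items):
            """Shannon entropy (bits) of the empirical distribution of items."""
            counts = Counter(items)
            total = sum(counts.values())
            return -sum((c / total) * math.log2(c / total) for c in counts.values())

        def looks_like_ddos(dst_ips, threshold=1.0):
            """Flag a window of destination IPs whose entropy falls below the threshold."""
            return shannon_entropy(dst_ips) < threshold

        normal_window = ["10.0.0.%d" % (i % 50) for i in range(500)]
        attack_window = ["10.0.0.7"] * 480 + ["10.0.0.%d" % i for i in range(20)]
        print(looks_like_ddos(normal_window), looks_like_ddos(attack_window))  # False True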

  19. Identifying Coordination Problems in Software Development : Finding Mismatches between Software and Project Team Structures

    NARCIS (Netherlands)

    Amrit, Chintan Amrit; van Hillegersberg, Jos; Kumar, Kuldeep

    2012-01-01

    Today’s dynamic and iterative development environment brings significant challenges for software project management. In distributed project settings, “management by walking around” is no longer an option and project managers may miss out on key project insights. The TESNA (TEchnical Social Network

  20. Data acquisition software for DIRAC experiment

    International Nuclear Information System (INIS)

    Ol'shevskij, V.G.; Trusov, S.V.

    2000-01-01

    The structure and basic processes of data acquisition software of DIRAC experiment for the measurement of π + π - atom life-time are described. The experiment is running on PS accelerator of CERN. The developed software allows one to accept, record and distribute to consumers up to 3 Mbytes of data in one accelerator supercycle of 14.4 s duration. The described system is used successfully in the DIRAC experiment starting from 1998 year

  1. Data acquisition software for DIRAC experiment

    Science.gov (United States)

    Olshevsky, V.; Trusov, S.

    2001-08-01

    The structure and basic processes of data acquisition software of the DIRAC experiment for the measurement of π +π - atom lifetime are described. The experiment is running on the PS accelerator of CERN. The developed software allows one to accept, record and distribute up to 3 Mbytes of data to consumers in one accelerator supercycle of 14.4 s duration. The described system is successfully in use in the experiment since its startup in 1998.

  2. Data acquisition software for DIRAC experiment

    International Nuclear Information System (INIS)

    Olshevsky, V.; Trusov, S.

    2001-01-01

    The structure and basic processes of data acquisition software of the DIRAC experiment for the measurement of π + π - atom lifetime are described. The experiment is running on the PS accelerator of CERN. The developed software allows one to accept, record and distribute up to 3 Mbytes of data to consumers in one accelerator supercycle of 14.4 s duration. The described system is successfully in use in the experiment since its startup in 1998

  3. Software for Simulation of Hyperspectral Images

    Science.gov (United States)

    Richtsmeier, Steven C.; Singer-Berk, Alexander; Bernstein, Lawrence S.

    2002-01-01

    A package of software generates simulated hyperspectral images for use in validating algorithms that generate estimates of Earth-surface spectral reflectance from hyperspectral images acquired by airborne and spaceborne instruments. This software is based on a direct simulation Monte Carlo approach for modeling three-dimensional atmospheric radiative transport as well as surfaces characterized by spatially inhomogeneous bidirectional reflectance distribution functions. In this approach, 'ground truth' is accurately known through input specification of surface and atmospheric properties, and it is practical to consider wide variations of these properties. The software can treat both land and ocean surfaces and the effects of finite clouds with surface shadowing. The spectral/spatial data cubes computed by use of this software can serve both as a substitute for and a supplement to field validation data.

  4. Software life cycle methodologies and environments

    Science.gov (United States)

    Fridge, Ernest

    1991-01-01

    Products of this project will significantly improve the quality and productivity of Space Station Freedom Program software processes by improving software reliability and safety and by broadening the range of problems that can be solved with computational solutions. The project brings in Computer-Aided Software Engineering (CASE) technology for environments such as the Engineering Script Language/Parts Composition System (ESL/PCS) application generator, an intelligent user interface for cost avoidance in setting up operational computer runs, the Framework programmable platform for defining process and software development work flow control, a process for bringing CASE technology into an organization's culture, and the CLIPS/CLIPS Ada language for developing expert systems; and for methodologies such as a method for developing fault-tolerant, distributed systems and a method for developing systems for common sense reasoning and for solving expert system problems when only approximate truths are known.

  5. Software Engineering Issues for Cyber-Physical Systems

    DEFF Research Database (Denmark)

    Al-Jaroodi, Jameela; Mohamed, Nader; Jawhar, Imad

    2016-01-01

    step; however, designing and implementing the right software to integrate and use them effectively is essential. The software facilitates better interfaces, more control and adds smart services, high flexibility and many other added values and features to the CPS. However, software development for CPS......Cyber-Physical Systems (CPS) provide many smart features for enhancing physical processes. These systems are designed with a set of distributed hardware, software, and network components that are embedded in physical systems and environments or attached to humans. Together they function seamlessly...... to offer specific functionalities or features that help enhance human lives, operations or environments. While different CPS components play important roles in a successful CPS development, the software plays the most important role among them. Acquiring and using high quality CPS components is the first...

  6. ROSMOD: A Toolsuite for Modeling, Generating, Deploying, and Managing Distributed Real-time Component-based Software using ROS

    Directory of Open Access Journals (Sweden)

    Pranav Srinivas Kumar

    2016-09-01

    Full Text Available This paper presents the Robot Operating System Model-driven development tool suite (ROSMOD), an integrated development environment for rapid prototyping of component-based software for the Robot Operating System (ROS) middleware. ROSMOD is well suited for the design, development and deployment of large-scale distributed applications on embedded devices. We present the various features of ROSMOD including the modeling language, the graphical user interface, code generators, and deployment infrastructure. We demonstrate the utility of this tool with a real-world case study: an Autonomous Ground Support Equipment (AGSE) robot that was designed and prototyped using ROSMOD for the NASA Student Launch competition, 2014–2015.
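
    For context, a minimal ROS (rospy) node of the kind such model-driven tools scaffold is sketched below: a single component with one periodic publisher. The node and topic names are illustrative; this is not code generated by ROSMOD.

        #!/usr/bin/env python
        import rospy
        from std_msgs.msg import String

        def run():
            rospy.init_node("telemetry_component")  # hypothetical component name
            pub = rospy.Publisher("telemetry", String, queue_size=10)
            rate = rospy.Rate(1)  # 1 Hz periodic publication
            while not rospy.is_shutdown():
                pub.publish(String(data="status: nominal"))
                rate.sleep()

        if __name__ == "__main__":
            run()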

  7. Software and the Virus Threat: Providing Authenticity in Distribution

    Science.gov (United States)

    1991-03-01

    [Abstract not available in this extract; only table residue survives: fragments of a virus characteristics matrix and an anti-virus product and vendor listing (e.g., FShield, McAfee Associates).]

  8. GPS Software Packages Deliver Positioning Solutions

    Science.gov (United States)

    2010-01-01

    "To determine a spacecraft s position, the Jet Propulsion Laboratory (JPL) developed an innovative software program called the GPS (global positioning system)-Inferred Positioning System and Orbit Analysis Simulation Software, abbreviated as GIPSY-OASIS, and also developed Real-Time GIPSY (RTG) for certain time-critical applications. First featured in Spinoff 1999, JPL has released hundreds of licenses for GIPSY and RTG, including to Longmont, Colorado-based DigitalGlobe. Using the technology, DigitalGlobe produces satellite imagery with highly precise latitude and longitude coordinates and then supplies it for uses within defense and intelligence, civil agencies, mapping and analysis, environmental monitoring, oil and gas exploration, infrastructure management, Internet portals, and navigation technology."

  9. A quantitative analysis of the causes of the global climate change research distribution

    DEFF Research Database (Denmark)

    Pasgaard, Maya; Strange, Niels

    2013-01-01

    investigates whether the need for knowledge on climate changes in the most vulnerable regions of the world is met by the supply of knowledge measured by scientific research publications from the last decade. A quantitative analysis of more than 15,000 scientific publications from 197 countries investigates...... the poorer, fragile and more vulnerable regions of the world. A quantitative keywords analysis of all publications shows that different knowledge domains and research themes dominate across regions, reflecting the divergent global concerns in relation to climate change. In general, research on climate change...... the distribution of climate change research and the potential causes of this distribution. More than 13 explanatory variables representing vulnerability, geographical, demographical, economical and institutional indicators are included in the analysis. The results show that the supply of climate change knowledge...

  10. A software system for oilfield facility investment minimization

    International Nuclear Information System (INIS)

    Ding, Z.X.; Startzman, R.A.

    1996-01-01

    Minimizing investment in oilfield development is an important subject that has attracted a considerable amount of industry attention. One method to reduce investment involves the optimal placement and selection of production facilities. Because of the large amount of capital used in this process, saving a small percent of the total investment may represent a large monetary value. The literature reports algorithms using mathematical programming techniques that were designed to solve the proposed problem in a global optimal manner. Owing to the high computational complexity and the lack of user-friendly interfaces for data entry and results display, mathematical programming techniques have not been given enough attention in practice. This paper describes an interactive, graphical software system that provides a global optimal solution to the problem of placement and selection of production facilities in oil-field development processes. This software system can be used as an investment minimization tool and a scenario-study simulator. The developed software system consists of five basic modules: (1) an interactive data-input unit, (2) a cost function generator, (3) an optimization unit, (4) a graphic-output display, and (5) a sensitivity-analysis unit
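
    A toy sketch of the facility placement and selection decision as a global search (illustrative, not the paper's formulation): each well must be connected to one open facility, each candidate facility has a fixed opening cost and per-well connection costs, and exhaustive enumeration over candidate subsets finds the cheapest feasible design. All costs are invented.

        from itertools import combinations

        # Hypothetical candidates: name -> (opening cost, connection cost per well).
        facilities = {"A": (500.0, {"w1": 10, "w2": 60, "w3": 80}),
                      "B": (700.0, {"w1": 70, "w2": 15, "w3": 20}),
                      "C": (400.0, {"w1": 90, "w2": 55, "w3": 25})}
        wells = ["w1", "w2", "w3"]

        def total_cost(open_set):
            """Opening costs plus cheapest connection of each well to an open facility."""
            cost = sum(facilities[f][0] for f in open_set)
            for w in wells:
                cost += min(facilities[f][1][w] for f in open_set)
            return cost

        best = min((subset for r in range(1, len(facilities) + 1)
                    for subset in combinations(facilities, r)),
                   key=total_cost)
        print(best, total_cost(best))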

  11. Software Architecture for Distributed Real-Time Embedded Systems

    National Research Council Canada - National Science Library

    Almeida, Jose

    1998-01-01

    .... This thesis focuses on the distributed scheduling problem. It proposes a distributed scheduling algorithm to allocate and schedule a set of tasks onto a collection of processors linked by a network...
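
    A small sketch of the allocation step in such a scheduler (a generic longest-processing-time-first heuristic, not the thesis's algorithm): tasks are considered in order of decreasing execution time and each is assigned to the processor with the least accumulated load. Task names and times are invented.

        import heapq

        def greedy_allocate(task_times, n_processors):
            """Assign tasks to processors, always filling the least-loaded one first."""
            heap = [(0.0, p, []) for p in range(n_processors)]  # (load, processor, tasks)
            heapq.heapify(heap)
            for task, t in sorted(task_times.items(), key=lambda kv: -kv[1]):
                load, p, assigned = heapq.heappop(heap)
                heapq.heappush(heap, (load + t, p, assigned + [task]))
            return sorted(heap, key=lambda entry: entry[1])

        tasks = {"t1": 4.0, "t2": 3.0, "t3": 3.0, "t4": 2.0, "t5": 2.0}
        for load, proc, assigned in greedy_allocate(tasks, n_processors=2):
            print(f"P{proc}: {assigned} (load {load})")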

  12. PACMAN: PRIMA astrometric instrument software

    Science.gov (United States)

    Abuter, Roberto; Sahlmann, Johannes; Pozna, Eszter

    2010-07-01

    The dual feed astrometric instrument software of PRIMA (PACMAN) that is currently being integrated at the VLTI will use two spatially modulated fringe sensor units and a laser metrology system to carry out differential astrometry. Its software and hardware comprise a distributed system involving many real-time computers and workstations operating in a synchronized manner. Its architecture has been designed to allow the construction of efficient and flexible calibration and observation procedures. In parallel, a novel scheme of integrating M-code (MATLAB/OCTAVE) with standard VLT (Very Large Telescope) control software applications had to be devised in order to support numerically intensive operations and to have the capacity of adapting to fast varying strategies and algorithms. This paper presents the instrument software, including the current operational sequences for the laboratory calibration and sky calibration. Finally, a detailed description of the algorithms and their implementation, both under M and C code, is given together with a comparative analysis of their performance and maintainability.

  13. Using containers with ATLAS offline software

    CERN Document Server

    Vogel, Marcelo; The ATLAS collaboration

    2017-01-01

    This paper describes the deployment of ATLAS offline software in containers for software development. For this we are using Docker, which is a lightweight virtualization technology that encapsulates a piece of software inside a complete file system. The deployment of offline releases via containers removes the strict requirement of compatibility between the runtime environment needed for job execution and the configuration of worker nodes at computing sites. If these two are decoupled from each other, sites can upgrade their nodes whenever and however they see fit. In this work, ATLAS software is distributed in containers either via the CernVM File System (CVMFS) or by means of a full ATLAS offline release installation. In software development, separating the build and runtime environment from the development environment allows users to take advantage of many modern code development tools that may not be available in production runtime setups like SLC6. It also frees developers from depending on resources lik...

  14. The classification and evaluation of Computer-Aided Software Engineering tools

    OpenAIRE

    Manley, Gary W.

    1990-01-01

    Approved for public release; distribution unlimited. The use of Computer-Aided Software Engineering (CASE) tools has been viewed as a remedy for the software development crisis by achieving improved productivity and system quality via the automation of all or part of the software engineering process. The proliferation and tremendous variety of tools available have stretched the understanding of experienced practitioners and has had a profound impact on the software engineering process itse...

  15. Offering Global Collaboration Services beyond CERN and HEP

    CERN Document Server

    Fernandes, J; Baron, T

    2015-01-01

    The CERN IT department has built over the years a performant and integrated ecosystem of collaboration tools, from videoconference and webcast services to event management software. These services have been designed and evolved in very close collaboration with the various communities surrounding the laboratory and have been massively adopted by CERN users. To cope with this very heavy usage, global infrastructures have been deployed which take full advantage of CERN's international and global nature. While these services and tools are instrumental in enabling the worldwide collaboration that generates major HEP breakthroughs, they would certainly also benefit other sectors of science in which globalization has already taken place. Some of these services are driven by commercial software (Vidyo or Wowza for example), while others have been developed internally and have already been made available to the world as Open Source Software in line with CERN's spirit and mission. Indico for example is now installed in 10...

  16. Data acquisition software for DIRAC experiment

    CERN Document Server

    Olshevsky, V G

    2001-01-01

    The structure and basic processes of data acquisition software of the DIRAC experiment for the measurement of pi /sup +/ pi /sup -/ atom lifetime are described. The experiment is running on the PS accelerator of CERN. The developed software allows one to accept, record and distribute up to 3 Mbytes of data to consumers in one accelerator supercycle of 14.4 s duration. The described system is successfully in use in the experiment since its startup in 1998. (13 refs).

  17. Chinese and global distribution of H9 subtype avian influenza viruses.

    Directory of Open Access Journals (Sweden)

    Wenming Jiang

    Full Text Available H9 subtype avian influenza viruses (AIVs) are of significance in poultry and public health, but epidemiological studies about the viruses are scarce. In this study, phylogenetic relationships of the viruses were analyzed based on 1233 previously reported sequences and 745 novel sequences of the viral hemagglutinin gene. The novel sequences were obtained through large-scale surveys conducted in 2008-2011 in China. The results revealed distinct distributions of H9 subtype AIVs in different hosts, sites and regions in China and in the world: (1) the dominant lineage of H9 subtype AIVs in China in recent years is lineage h9.4.2.5 represented by A/chicken/Guangxi/55/2005; (2) the newly emerging lineage h9.4.2.6, represented by A/chicken/Guangdong/FZH/2011, has also become prevalent in China; (3) lineages h9.3.3, h9.4.1 and h9.4.2, represented by A/duck/Hokkaido/26/99, A/quail/Hong Kong/G1/97 and A/chicken/Hong Kong/G9/97, respectively, have become globally dominant in recent years; (4) lineages h9.4.1 and h9.4.2 are likely of more risk to public health than others; (5) different lineages have different transmission features and host tropisms. This study also provided novel experimental data which indicated that the Leu-234 (H9 numbering) motif in the viral hemagglutinin gene is an important but not unique determinant in receptor-binding preference. This report provides a detailed and updated panoramic view of the epidemiological distributions of H9 subtype AIVs globally and in China, and sheds new insights for the prevention of infection in poultry and preparedness for a potential pandemic caused by the viruses.

  18. Insights into global diatom distribution and diversity in the world’s ocean

    KAUST Repository

    Malviya, Shruti; Scalco, Eleonora; Audic, Sté phane; Vincent, Flora; Veluchamy, Alaguraj; Poulain, Julie; Wincker, Patrick; Iudicone, Daniele; de Vargas, Colomban; Bittner, Lucie; Zingone, Adriana; Bowler, Chris

    2016-01-01

    Diatoms (Bacillariophyta) constitute one of the most diverse and ecologically important groups of phytoplankton. They are considered to be particularly important in nutrient-rich coastal ecosystems and at high latitudes, but considerably less so in the oligotrophic open ocean. The Tara Oceans circumnavigation collected samples from a wide range of oceanic regions using a standardized sampling procedure. Here, a total of ∼12 million diatom V9-18S ribosomal DNA (rDNA) ribotypes, derived from 293 size-fractionated plankton communities collected at 46 sampling sites across the global ocean euphotic zone, have been analyzed to explore diatom global diversity and community composition. We provide a new estimate of diversity of marine planktonic diatoms at 4,748 operational taxonomic units (OTUs). Based on the total assigned ribotypes, Chaetoceros was the most abundant and diverse genus, followed by Fragilariopsis, Thalassiosira, and Corethron. We found only a few cosmopolitan ribotypes displaying an even distribution across stations and high abundance, many of which could not be assigned with confidence to any known genus. Three distinct communities from South Pacific, Mediterranean, and Southern Ocean waters were identified that share a substantial percentage of ribotypes within them. Sudden drops in diversity were observed at Cape Agulhas, which separates the Indian and Atlantic Oceans, and across the Drake Passage between the Atlantic and Southern Oceans, indicating the importance of these ocean circulation choke points in constraining diatom distribution and diversity. We also observed high diatom diversity in the open ocean, suggesting that diatoms may be more relevant in these oceanic systems than generally considered.

  19. Insights into global diatom distribution and diversity in the world’s ocean

    KAUST Repository

    Malviya, Shruti

    2016-03-01

    Diatoms (Bacillariophyta) constitute one of the most diverse and ecologically important groups of phytoplankton. They are considered to be particularly important in nutrient-rich coastal ecosystems and at high latitudes, but considerably less so in the oligotrophic open ocean. The Tara Oceans circumnavigation collected samples from a wide range of oceanic regions using a standardized sampling procedure. Here, a total of ∼12 million diatom V9-18S ribosomal DNA (rDNA) ribotypes, derived from 293 size-fractionated plankton communities collected at 46 sampling sites across the global ocean euphotic zone, have been analyzed to explore diatom global diversity and community composition. We provide a new estimate of diversity of marine planktonic diatoms at 4,748 operational taxonomic units (OTUs). Based on the total assigned ribotypes, Chaetoceros was the most abundant and diverse genus, followed by Fragilariopsis, Thalassiosira, and Corethron. We found only a few cosmopolitan ribotypes displaying an even distribution across stations and high abundance, many of which could not be assigned with confidence to any known genus. Three distinct communities from South Pacific, Mediterranean, and Southern Ocean waters were identified that share a substantial percentage of ribotypes within them. Sudden drops in diversity were observed at Cape Agulhas, which separates the Indian and Atlantic Oceans, and across the Drake Passage between the Atlantic and Southern Oceans, indicating the importance of these ocean circulation choke points in constraining diatom distribution and diversity. We also observed high diatom diversity in the open ocean, suggesting that diatoms may be more relevant in these oceanic systems than generally considered.

  20. A comparative study of software adaptation using remote method call and Web Services

    Directory of Open Access Journals (Sweden)

    AFFONSO, F. J.

    2011-06-01

    Full Text Available Over the years, the software development process has been directed toward various methodologies with specific purposes to address emerging needs. During this period it has also become clear that some processes require mechanisms for software reuse and greater speed in the development stage. An important factor in this context is mutation (adaptation), which occurs throughout the software's life cycle due to customers' needs or to technological changes. Regarding the latter factor, a significant increase has been observed in developments that use distributed applications through the World Wide Web or remote calls. Based on the adaptation idea and on the need for distributed software systems, this paper presents a technique to reconfigure software capable of acting in several development contexts (local, distributed and/or Web). To demonstrate its applicability, a case study using service orientation and remote calls was carried out to show software adaptation in the development of applications. Comparative results among the approaches used in the development of reconfigurable applications are also presented.
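
    A minimal, self-contained illustration of the remote-method-call style compared in the study, using Python's standard xmlrpc modules; it illustrates the general mechanism only and is not the framework developed in the paper.

        import threading
        from xmlrpc.server import SimpleXMLRPCServer
        from xmlrpc.client import ServerProxy

        def add(a, b):
            """A service method exposed for remote invocation."""
            return a + b

        # Server side: register the method and serve from a background thread.
        server = SimpleXMLRPCServer(("localhost", 8000), logRequests=False)
        server.register_function(add, "add")
        threading.Thread(target=server.serve_forever, daemon=True).start()

        # Client side: the remote call reads like a local method call.
        proxy = ServerProxy("http://localhost:8000")
        print(proxy.add(2, 3))  # -> 5
        server.shutdown()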