WorldWideScience

Sample records for network analysis software

  1. Network-based analysis of software change propagation.

    Science.gov (United States)

    Wang, Rongcun; Huang, Rubing; Qu, Binbin

    2014-01-01

    Object-oriented software systems frequently evolve to meet new change requirements. Understanding the characteristics of changes helps testers and system designers improve software quality, and identifying important modules becomes a key issue in the process of evolution. In this context, a novel network-based approach is proposed to comprehensively investigate change distributions and the correlation between centrality measures and the scope of change propagation. First, software dependency networks are constructed at the class level. Then, the number of co-changes among classes is mined from software repositories. From the dependency relationships and the co-change counts, the scope of change propagation is calculated, and Spearman rank correlation is used to analyze the correlation between centrality measures and the scope of change propagation. Three case studies on the Java open source projects FindBugs, Hibernate, and Spring are conducted to investigate the characteristics of change propagation. Experimental results show that (i) the change distribution is very uneven and (ii) PageRank, Degree, and CIRank are significantly correlated with the scope of change propagation. In particular, CIRank shows a higher correlation coefficient, which suggests it can be a more useful indicator for measuring the scope of change propagation of classes in object-oriented software systems.
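
    The correlation step summarized above can be illustrated with a minimal sketch (not the authors' implementation): it builds a toy class-level dependency graph, assigns hypothetical change-propagation scores, and computes Spearman correlations against degree and PageRank centralities using the networkx and scipy libraries.

```python
# Sketch: correlate centrality measures with change-propagation scope
# (illustrative only; the dependency graph and propagation scores are made up).
import networkx as nx
from scipy.stats import spearmanr

# Toy class-level dependency network (edges point from dependent to dependency).
G = nx.DiGraph([("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "B")])

# Hypothetical change-propagation scope per class, e.g. derived from co-change counts.
propagation_scope = {"A": 1.0, "B": 3.5, "C": 4.2, "D": 2.1}

centralities = {
    "degree": dict(G.degree()),     # simple degree centrality
    "pagerank": nx.pagerank(G),     # PageRank centrality
}

classes = sorted(G.nodes())
for name, values in centralities.items():
    rho, p = spearmanr([values[c] for c in classes],
                       [propagation_scope[c] for c in classes])
    print(f"{name}: Spearman rho={rho:.2f}, p={p:.3f}")
```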

  2. statnet: Software Tools for the Representation, Visualization, Analysis and Simulation of Network Data

    Directory of Open Access Journals (Sweden)

    Mark S. Handcock

    2007-12-01

    statnet is a suite of software packages for statistical network analysis. The packages implement recent advances in network modeling based on exponential-family random graph models (ERGMs). The components of the package provide a comprehensive framework for ERGM-based network modeling, including tools for model estimation, model evaluation, model-based network simulation, and network visualization. This broad functionality is powered by a central Markov chain Monte Carlo (MCMC) algorithm. The coding is optimized for speed and robustness.

  3. Software and package applicating for network meta-analysis: A usage-based comparative study.

    Science.gov (United States)

    Xu, Chang; Niu, Yuming; Wu, Junyi; Gu, Huiyun; Zhang, Chao

    2017-12-21

    To compare and analyze the characteristics and functions of software applications for network meta-analysis (NMA). PubMed, EMbase, The Cochrane Library, the official websites of Bayesian inference Using Gibbs Sampling (BUGS), Stata and R, and Google were searched to collect software and packages for performing NMA; software and packages published up to March 2016 were included. After collecting the software, packages, and their user guides, we used the software and packages to calculate a typical example. All characteristics, functions, and computed results were compared and analyzed. Ten types of software were included, covering both programming and non-programming software. They were developed mainly on the basis of Bayesian or frequentist theory. Most types of software are easy to operate and master, perform exact calculations, or produce excellent graphics. However, no single software performed accurate calculations with superior graphing; this could only be achieved by combining two or more types of software. This study suggests that users should choose the appropriate software according to their programming background, operating habits, and financial resources. The combination of BUGS and R (or Stata) can then be considered for performing the NMA. © 2017 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.

  4. Network Coded Software Defined Networking

    DEFF Research Database (Denmark)

    Hansen, Jonas; Roetter, Daniel Enrique Lucani; Krigslund, Jeppe

    2015-01-01

    Software defined networking (SDN) has garnered large attention due to its potential to virtualize services in the Internet, introducing flexibility in the buffering, scheduling, processing, and routing of data in network routers. SDN breaks the deadlock that has kept Internet network protocols stagnant for decades, while applications and physical links have evolved. This article advocates for the use of SDN to bring about 5G network services by incorporating network coding (NC) functionalities. The latter constitutes a major leap forward compared to the state-of-the-art store-and-forward Internet paradigm. The inherent flexibility of both SDN and NC provides fertile ground to envision more efficient, robust, and secure networking designs, which may also incorporate content caching and storage, all of which are key challenges of the upcoming 5G networks. This article not only proposes the fundamentals...

  5. Network Coded Software Defined Networking

    DEFF Research Database (Denmark)

    Krigslund, Jeppe; Hansen, Jonas; Roetter, Daniel Enrique Lucani

    2015-01-01

    Software Defined Networking (SDN) and Network Coding (NC) are two key concepts in networking that have garnered large attention in recent years. On the one hand, SDN's potential to virtualize services in the Internet allows a large flexibility not only for routing data, but also for managing the network... This paper advocates for the use of SDN to bring about future Internet and 5G network services by incorporating network coding (NC) functionalities. The inherent flexibility of both SDN and NC provides fertile ground to envision more efficient, robust, and secure networking designs, which may also incorporate content caching and storage, all of which are key challenges of the future Internet and the upcoming 5G networks. This paper proposes some of the keys behind this intersection and supports it with use cases as well as an implementation that integrates the Kodo library (NC) into OpenFlow (SDN)...

  6. Netlang: A software for the linguistic analysis of corpora by means of complex networks.

    Science.gov (United States)

    Barceló-Coblijn, Lluís; Serna Salazar, Diego; Isaza, Gustavo; Castillo Ossa, Luis F; Bedia, Manuel G

    2017-01-01

    To date there is no software that directly connects the linguistic analysis of a conversation to a network program. Network programs are able to extract statistical information from databases containing information about systems of interacting elements. Language has also been conceived and studied as a complex system. However, most proposals do not analyze language according to linguistic theory, but instead use computational systems that save time at the price of leaving aside many aspects that are crucial for linguistic theory. Some approaches to network studies on language do apply precise linguistic analyses, made by a linguist. The problem until now has been the lack of an interface between the analysis of a sentence and its integration into the network that could be managed by a linguist and that could accommodate the analysis of any language. Previous works have used old software that was not created for these purposes and that often produced problems with some idiosyncrasies of the target language. The desired interface should be able to deal with the syntactic peculiarities of a particular language, the options of linguistic theory preferred by the user, and the preservation of morpho-syntactic information (lexical categories and syntactic relations between items). Netlang is the first program able to do that. Recently, a new kind of linguistic analysis has been developed, which is able to extract a complexity pattern from the speaker's linguistic production, depicted as a network where words are nodes and these nodes are connected by edges or links (the information inside an edge can be syntactic, semantic, etc.). The Netlang software has become the bridge between raw linguistic data and the network program. Netlang has integrated and improved the functions of programs used in the past, namely the DGA annotator and two scripts (ToXML.pl and Xml2Pairs.py) used for transforming and pruning data. Netlang allows the researcher to make accurate...
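
    As a rough sketch of the general idea of turning linguist-annotated sentences into a network (the annotation triples below are hypothetical, and the code does not reproduce Netlang's formats or pipeline), one could store each head/dependent pair as an edge that keeps its syntactic relation:

```python
# Sketch: build a word network from (head, dependent, relation) triples.
# The triples below are hypothetical annotations, not Netlang output.
import networkx as nx

triples = [
    ("eats", "dog", "subject"),
    ("eats", "bone", "object"),
    ("dog", "the", "determiner"),
    ("bone", "a", "determiner"),
]

G = nx.MultiDiGraph()
for head, dependent, relation in triples:
    # Nodes are word types; each edge keeps its syntactic relation.
    G.add_edge(head, dependent, relation=relation)

print(G.number_of_nodes(), "words,", G.number_of_edges(), "syntactic links")
for head, dependent, data in G.edges(data=True):
    print(f"{head} -> {dependent} [{data['relation']}]")
```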

  7. Software Defined Networking

    DEFF Research Database (Denmark)

    Caba, Cosmin Marius

    ...resources are limited. Hence, to counteract this trend, current QoS mechanisms must become simpler to deploy and operate, in order to motivate NSPs to employ QoS techniques instead of overprovisioning. Software Defined Networking (SDN) represents a paradigm shift in the way telecommunication and data networks are designed and managed. This thesis argues that SDN can greatly simplify QoS provisioning in communication networks, and even improve QoS in various ways. To this end, the impact of SDN on QoS is assessed from both a network performance perspective (e.g. bandwidth, delay) and from a more generic perspective (e.g. service provisioning speed, resources availability). As a result, new mechanisms for providing QoS are proposed, solutions for SDN-specific QoS challenges are designed and tested, and new network management concepts are prototyped, all aiming to improve QoS for network services...

  8. Software Defined Networking Demands on Software Technologies

    DEFF Research Database (Denmark)

    Galinac Grbac, T.; Caba, Cosmin Marius; Soler, José

    2015-01-01

    Software Defined Networking (SDN) is a networking approach based on a centralized control plane architecture with standardised interfaces between control and data planes. SDN enables fast configuration and reconfiguration of the network to enhance resource utilization and service performances. This new approach enables a more dynamic and flexible network, which may adapt to user needs and application requirements. To this end, systemized solutions must be implemented in network software, aiming to provide secure network services that meet the required service performance levels. In this paper, we review this new approach to networking from an architectural point of view, and identify and discuss some critical quality issues that require new developments in software technologies. These issues we discuss along with use case scenarios. Here in this paper we aim to identify challenges...

  9. SBA Network Components & Software Inventory

    Data.gov (United States)

    Small Business Administration — SBA’s Network Components & Software Inventory contains a complete inventory of all devices connected to SBA’s network including workstations, servers, routers,...

  10. SPSens: a software package for stochastic parameter sensitivity analysis of biochemical reaction networks.

    Science.gov (United States)

    Sheppard, Patrick W; Rathinam, Muruhan; Khammash, Mustafa

    2013-01-01

    SPSens is a software package for the efficient computation of stochastic parameter sensitivities of biochemical reaction networks. Parameter sensitivity analysis is a valuable tool that can be used to study robustness properties, for drug targeting, and for many other purposes. However, its application to stochastic models has been limited by the extremely high computational cost when Monte Carlo methods are required. SPSens provides efficient, state-of-the-art sensitivity analysis algorithms in a single software package so that sensitivity analysis can be easily performed on stochastic models of biochemical reaction networks. SPSens implements the algorithms in C and estimates sensitivities with respect to both infinitesimal and finite perturbations to system parameters, in many cases reducing variance by orders of magnitude compared to basic methods. Included among the features of SPSens are serial and parallel command line versions, an interface with Matlab, and several example problems. SPSens is distributed freely under GPL version 3 and can be downloaded from http://sourceforge.net/projects/spsens/. The software can be run on Linux, Mac OS X and Windows platforms.
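
    A crude illustration of finite-perturbation sensitivity estimation for a stochastic model is sketched below; it uses a plain Gillespie simulation of a made-up birth-death process with common random seeds and central finite differences, and does not reproduce SPSens's algorithms or their variance-reduction techniques.

```python
# Sketch: finite-difference parameter sensitivity of a stochastic birth-death
# process via Gillespie's SSA (illustrative only; parameters are made up).
import random

def ssa_population(birth_rate, death_rate=0.1, x0=10, t_end=50.0, seed=0):
    rng = random.Random(seed)
    t, x = 0.0, x0
    while t < t_end:
        rates = [birth_rate, death_rate * x]
        total = sum(rates)
        if total == 0:
            break
        t += rng.expovariate(total)
        # Pick the birth or the death reaction proportionally to its rate.
        if rng.random() < rates[0] / total:
            x += 1
        else:
            x -= 1
    return x

def mean_population(birth_rate, runs=2000):
    # Common random seeds across both perturbed estimates reduce variance.
    return sum(ssa_population(birth_rate, seed=i) for i in range(runs)) / runs

theta, h = 1.0, 0.05
sensitivity = (mean_population(theta + h) - mean_population(theta - h)) / (2 * h)
print("d E[X(t_end)] / d birth_rate ≈", round(sensitivity, 2))
```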

  11. Efficiency of Software Testing Techniques: A Controlled Experiment Replication and Network Meta-analysis

    Directory of Open Access Journals (Sweden)

    Omar S. Gómez

    2017-07-01

    Background: Common approaches to software verification include static testing techniques, such as code reading, and dynamic testing techniques, such as black-box and white-box testing. Objective: With the aim of gaining a better understanding of software testing techniques, a controlled experiment replication and a synthesis of previous experiments which examine the efficiency of code reading, black-box and white-box testing techniques were conducted. Method: The replication reported here is composed of four experiments in which instrumented programs were used. Participants randomly applied one of the techniques to one of the instrumented programs. The outcomes were synthesized with seven experiments using the method of network meta-analysis (NMA). Results: No significant differences in the efficiency of the techniques were observed. However, it was discovered that the instrumented programs had a significant effect on efficiency. The NMA results suggest that the black-box and white-box techniques behave alike, and that the efficiency of code reading seems to be sensitive to other factors. Conclusion: Taking these findings into account, the authors suggest that prior to carrying out software verification activities, software engineers should have a clear understanding of the software product to be verified; they can apply either black-box or white-box testing techniques as they yield similar defect detection rates.

  12. Interuniversity Upper Atmosphere Global Observation Network (IUGONET) Meta-Database and Analysis Software

    Directory of Open Access Journals (Sweden)

    A Yatagai

    2014-09-01

    An overview of the Interuniversity Upper atmosphere Global Observation NETwork (IUGONET) project is presented. This Japanese program is building a meta-database for ground-based observations of the Earth’s upper atmosphere, in which metadata connected with various atmospheric radars and photometers, including those located in both polar regions, are archived. By querying the metadata database, researchers are able to access data files and information held by data facilities. Moreover, by utilizing our analysis software, users can download, visualize, and analyze upper-atmospheric data archived in or linked with the system. As a future development, we are looking to make our database interoperable with others.

  13. Software and Network Engineering

    CERN Document Server

    2012-01-01

    The series "Studies in Computational Intelligence" (SCI) publishes new developments and advances in the various areas of computational intelligence – quickly and with a high quality. The intent is to cover the theory, applications, and design methods of computational intelligence, as embedded in the fields of engineering, computer science, physics and life science, as well as the methodologies behind them. The series contains monographs, lecture notes and edited volumes in computational intelligence spanning the areas of neural networks, connectionist systems, genetic algorithms, evolutionary computation, artificial intelligence, cellular automata, self-organizing systems, soft computing, fuzzy systems, and hybrid intelligent systems. Critical to both contributors and readers are the short publication time and world-wide distribution - this permits a rapid and broad dissemination of research results.   The purpose of the first ACIS International Symposium on Software and Network Engineering held on Decembe...

  14. Quantitative Analysis of the Security of Software-Defined Network Controller Using Threat/Effort Model

    Directory of Open Access Journals (Sweden)

    Zehui Wu

    2017-01-01

    The SDN-based controller, which is responsible for the configuration and management of the network, is the core of Software-Defined Networks. Current methods, which focus on the security mechanism, use qualitative analysis to estimate the security of controllers, frequently leading to inaccurate results. In this paper, we employ a quantitative approach to overcome this shortcoming. Based on an analysis of the controller threat model, we give formal models of the APIs, the protocol interfaces, and the data items of the controller, and further provide our Threat/Effort quantitative calculation model. With the help of the Threat/Effort model, we are able to compare not only the security of different versions of the same controller but also that of different kinds of controllers, providing a basis for controller selection and secure development. We evaluated our approach on four widely used SDN-based controllers: POX, OpenDaylight, Floodlight, and Ryu. The tests, whose outcomes are similar to those of the traditional qualitative analysis, demonstrate that our approach yields specific security values for different controllers and presents more accurate results.

  15. Towards a network ecology of software ecosystems

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Manikas, Konstantinos

    2013-01-01

    "Software ecosystems'' are gaining importance in commercial software development; the iPhone iOS and Salesforce.com ecosystems are examples of this. In contrast to traditional forms of software reuse, such as common platforms or product lines, software ecosystems have a heterogeneous set of actors...... sharing and collaborating over one or more technological platforms and business model(s) that serve the actors. However, little research has investigated the properties of actual software ecosystems. In this paper, we present an exploratory study of software ecosystems using the formalizations and metrics...... of the "network ecology'' approach to the analysis of natural ecosystems. In doing so, we mine the Maven central Java repository and analyze two OSGi ecosystems: Apache Felix and Eclipse Equinox. In particular, we define the concept of an ecosystem ``neighborhood'', apply network ecology metrics...

  16. Instrumental Supporting System for Developing and Analysis of Software-Defined Networks of Mobile Objects

    Directory of Open Access Journals (Sweden)

    V. A. Sokolov

    2015-01-01

    This article describes the organization principles for wireless mesh networks (software-defined networks of mobile objects). The emphasis is on the question of obtaining effective routing algorithms for such networks. The mathematical model of the system is the standard transportation network. The key parameter of the routing system is the node reachability coefficient: a function depending on several basic and additional parameters ("mesh-factors") which characterize the route between two network nodes. Each pair (arc, node) is juxtaposed to a composite parameter which characterizes the "reachability" of the node by the route which begins with this arc. The best ("shortest") route between two nodes is the route with the maximum reachability coefficient. The rules of building and refreshing the routing tables by the network nodes are described. With the announcement from a neighbor, the node gets information about the connection energy and reliability, the time of receipt of the announcement, the absence of transitional nodes, and also about the connection capability. On the basis of this information the node applies a penalization (decreasing the reachability coefficient) or a reward (increasing the reachability coefficient) to all routes through this neighbor node. The penalization/reward scheme has several separate aspects: 1. Penalization for the actuality of information. 2. Penalization/reward for the reliability of a node. 3. Penalization for the connection energy. 4. Penalization for the present connection capability. A simulator of the wireless mesh network of mobile objects, based on the suggested heuristic algorithms, has been written. The description and characteristics of the simulator are stated in the article. The peculiarities of its program realization are also examined.
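
    The route-selection rule described here (choose the route with the maximum reachability coefficient) can be sketched as a Dijkstra-style search that multiplies per-hop coefficients; the coefficients below are hypothetical, and the article's penalization/reward rules and mesh-factors are not reproduced.

```python
# Sketch: choose the route with the maximum reachability coefficient,
# modeled here as the product of per-hop coefficients in (0, 1].
# The per-hop values are hypothetical, not the article's mesh-factors.
import heapq

def best_route(links, source, target):
    # links: {node: [(neighbor, hop_coefficient), ...]}
    best = {source: 1.0}
    heap = [(-1.0, source, [source])]          # max-heap via negated coefficient
    while heap:
        neg_coeff, node, path = heapq.heappop(heap)
        coeff = -neg_coeff
        if node == target:
            return coeff, path
        for neighbor, hop in links.get(node, []):
            cand = coeff * hop
            if cand > best.get(neighbor, 0.0):
                best[neighbor] = cand
                heapq.heappush(heap, (-cand, neighbor, path + [neighbor]))
    return 0.0, []

links = {
    "A": [("B", 0.9), ("C", 0.6)],
    "B": [("D", 0.5)],
    "C": [("D", 0.8)],
}
print(best_route(links, "A", "D"))  # -> (0.48, ['A', 'C', 'D'])
```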

  17. Network Analysis and Application Control Software based on Client-Server Architecture

    OpenAIRE

    Mohan, Ramya

    2013-01-01

    This paper outlines a comprehensive model to increase system efficiency, preserve network bandwidth, monitor incoming and outgoing packets, ensure the security of confidential files and reduce power wastage in an organization. This model illustrates the use and potential application of a Network Analysis Tool (NAT) in a multi-computer set-up of any scale. The model is designed to run in the background and not hamper any currently executing applications, while using minimum system resources. I...

  18. Social Networks in Software Process Improvement

    DEFF Research Database (Denmark)

    Nielsen, Peter Axel; Tjørnehøj, Gitte

    2010-01-01

    Software process improvement in small organisations is often problematic, and communication and knowledge sharing are more informal. To improve software processes we need to understand how developers and managers communicate and share knowledge. In this article we have studied the company SmallSoft through action research. In the action research we have applied the framework of social network analysis, and we show that this can be used to understand the underlying structures of communication and knowledge sharing between software developers and managers. We show in detail how the analysis can be done and how management can utilise the findings. From this we conclude that social network analysis was a useful framework together with accompanying tools and techniques. Copyright © 2009 John Wiley & Sons, Ltd.

  19. AcCNET (Accessory Genome Constellation Network): comparative genomics software for accessory genome analysis using bipartite networks.

    Science.gov (United States)

    Lanza, Val F; Baquero, Fernando; de la Cruz, Fernando; Coque, Teresa M

    2017-01-15

    AcCNET (Accessory genome Constellation Network) is a Perl application that aims to compare accessory genomes of a large number of genomic units, both at qualitative and quantitative levels. Using the proteomes extracted from the analysed genomes, AcCNET creates a bipartite network compatible with standard network analysis platforms. AcCNET allows merging phylogenetic and functional information about the concerned genomes, thus improving the capability of current methods of network analysis. The AcCNET bipartite network opens a new perspective to explore the pangenome of bacterial species, focusing on the accessory genome behind the idiosyncrasy of a particular strain and/or population. AcCNET is available under GNU General Public License version 3.0 (GPLv3) from http://sourceforge.net/projects/accnet. Contact: valfernandez.vf@gmail.com. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
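
    In the same spirit (though not AcCNET's Perl implementation or file formats), a bipartite genome/protein-cluster network can be sketched with networkx from a made-up presence table:

```python
# Sketch: a bipartite genome/protein-family network in the spirit of AcCNET
# (illustrative only; genome and cluster names are made up).
import networkx as nx
from networkx.algorithms import bipartite

# Which accessory protein clusters are present in which genomes (hypothetical).
presence = {
    "genome_A": {"cluster1", "cluster2"},
    "genome_B": {"cluster2", "cluster3"},
    "genome_C": {"cluster3"},
}

B = nx.Graph()
B.add_nodes_from(presence, bipartite=0)                      # genome nodes
clusters = set().union(*presence.values())
B.add_nodes_from(clusters, bipartite=1)                      # protein-cluster nodes
B.add_edges_from((g, c) for g, cs in presence.items() for c in cs)

print(bipartite.is_bipartite(B))
# Project onto genomes: genomes sharing accessory clusters become linked.
print(bipartite.weighted_projected_graph(B, list(presence)).edges(data=True))
```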

  20. Using networking and communications software in business

    CERN Document Server

    McBride, PK

    2014-01-01

    Using Networking and Communications Software in Business covers the importance of networks in a business firm, the benefits of computer communications within a firm, and the cost-benefit in putting up networks in businesses. The book is divided into six parts. Part I looks into the nature and varieties of networks, networking standards, and network software. Part II discusses the planning of a networked system, which includes analyzing the requirements for the network system, the hardware for the network, and network management. The installation of the network system and the network managemen

  1. Software development for teleroentgenogram analysis

    Science.gov (United States)

    Goshkoderov, A. A.; Khlebnikov, N. A.; Obabkov, I. N.; Serkov, K. V.; Gajniyarov, I. M.; Aliev, A. A.

    2017-09-01

    A framework for the analysis and calculation of teleroentgenograms was developed. Software development was carried out in the Department of Children's Dentistry and Orthodontics at Ural State Medical University. The software calculates teleroentgenograms by an original method developed in this department, and it also allows users to design their own methods for calculating teleroentgenograms. It is planned to use machine learning (neural networks) in the software. This will make the process of calculating teleroentgenograms easier, because methodological points will be placed automatically.

  2. Lessons Learned from Applying Social Network Analysis on an Industrial Free/Libre/Open Source Software Ecosystem

    OpenAIRE

    Teixeira, Jose; Robles, Gregorio; González-Barahona, Jesús,

    2015-01-01

    Many software projects are no longer done in-house by a single organization. Instead, we are in a new age where software is developed by a networked community of individuals and organizations, which base their relations to each other on mutual interest. Paradoxically, recent research suggests that software can actually be jointly developed by rival firms. For instance, it is known that the mobile-device makers Apple and Samsung kept collaborating in open source projects while runn...

  3. Mapping social networks in software process improvement

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte; Nielsen, Peter Axel

    2005-01-01

    ...to map social networks and suggest how it can be used in software process improvement. We applied the mapping approach in a small software company to support the realization of new ways of improving software processes. The mapping approach was found useful in improving social networks, and thus furthers...

  4. Dynamic Network Security Control Using Software Defined Networking

    Science.gov (United States)

    2016-03-24

    Thesis by Michael C. Todd, Captain, USAF (AFIT-ENG-MS-16-M-049); the work is not subject to copyright protection in the United States. Excerpt: "...software and tools vetted by industry leaders in networking and security. After considering the technologies previously discussed, the four components..."

  5. Simulating awareness in global software engineering: a comparative analysis of Scrum and Agile Service Networks

    NARCIS (Netherlands)

    Tamburri, D.A.; Razo Zapata, I.S.; Fernandez, H.; Tedeschi, C.

    2012-01-01

    Global software engineering (GSE) is a business strategy to realize a business idea (i.e. the development project) faster, through round-the-clock productivity. However, GSE creates a volatile and unstable process in which many actors interact together against unpredictable premises (e.g. ...

  6. Improving coordination in software development through social and technical network analysis

    NARCIS (Netherlands)

    Amrit, Chintan

    2008-01-01

    Today’s dynamic and distributed development environment brings significant challenges for software project management. In distributed project settings, “management by walking around” is no longer an option, and project managers may miss out on key project insights. At the same time, the high

  7. A software tool for network intrusion detection

    CSIR Research Space (South Africa)

    Van der Walt, C

    2012-10-01

    This presentation illustrates how a recently developed software tool enables operators to easily monitor a network and detect intrusions without requiring expert knowledge of network intrusion detection.

  8. Fast Recovery in Software-Defined Networks

    NARCIS (Netherlands)

    Van Adrichem, N.L.M.; Van Asten, B.J.; Kuipers, F.A.

    2014-01-01

    Although Software-Defined Networking and its implementation OpenFlow facilitate managing networks and enable dynamic network configuration, recovering from network failures in a timely manner remains non-trivial. The process of (a) detecting the failure, (b) communicating it to the controller and

  9. Spectral Graph Theory Analysis of Software-Defined Networks to Improve Performance and Security

    Science.gov (United States)

    2015-09-01

    Excerpts: ...congestion in the second figure. Node 3 is congested due to a DDoS attack, which is indicated by the shift of the phantom node to dominate λ3 and node... ...this interface [22]. Examples of business applications are distributed denial of service (DDoS) protection, intrusion detection, and usage tracking...

  10. FALCON: a software package for analysis of nestedness in bipartite networks

    OpenAIRE

    Beckett, Stephen J.; Boulton, Chris A.; Williams, Hywel T. P.

    2014-01-01

    Nestedness is a statistical measure used to interpret bipartite interaction data in several ecological and evolutionary contexts, e.g. biogeography (species-site relationships) and species interactions (plant-pollinator and host-parasite networks). Multiple methods have been used to evaluate nestedness, which differ in how the metrics for nestedness are determined. Furthermore, several different null models have been used to calculate statistical significance of nestedness scores. The profusi...
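
    As a toy illustration only (FALCON's actual metrics and null models are richer), the sketch below computes a simplified row-overlap nestedness score for a binary interaction matrix and compares it to a shuffled null model:

```python
# Sketch: a simplified nestedness-style score on a binary interaction matrix,
# compared against a shuffled null model (not FALCON's metrics or null models).
import numpy as np

def row_overlap_score(m):
    # For each ordered row pair (i, j) with fill(i) > fill(j), measure how much
    # of row j's presences are nested inside row i's presences.
    fills = m.sum(axis=1)
    scores = []
    for i in range(len(m)):
        for j in range(len(m)):
            if fills[i] > fills[j] > 0:
                scores.append((m[i] & m[j]).sum() / fills[j])
    return float(np.mean(scores)) if scores else 0.0

rng = np.random.default_rng(0)
obs = np.array([[1, 1, 1, 1],
                [1, 1, 1, 0],
                [1, 1, 0, 0],
                [1, 0, 0, 0]])

observed = row_overlap_score(obs)
null = [row_overlap_score(rng.permutation(obs.ravel()).reshape(obs.shape))
        for _ in range(500)]
z = (observed - np.mean(null)) / np.std(null)
print(f"observed={observed:.2f}, null mean={np.mean(null):.2f}, z={z:.2f}")
```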

  11. A Software Tool of Technical and Financial-Economic Analysis for Acquisition of Broadband Radio PPDR Networks

    Directory of Open Access Journals (Sweden)

    Gierszal Henryk

    2016-12-01

    In this paper, we present a software tool that allows preparing econometric analyses aimed at the selection of optimal business models for the acquisition of broadband mobile networks by PPDR organizations. Such agencies more and more often need broadband services to improve operational activities in order to increase safety, security, and their effectiveness in day-to-day and crisis situations. Upgrade or migration to broadband networks needs careful decisions so as to find a justified trade-off between CAPEX and OPEX. The network evolution can be based on different business models, but no approach may degrade the reliability, security, and resilience required by PSC.

  12. Deep space network software cost estimation model

    Science.gov (United States)

    Tausworthe, R. C.

    1981-01-01

    A parametric software cost estimation model prepared for Jet Propulsion Laboratory (JPL) Deep Space Network (DSN) Data System implementation tasks is described. The resource estimation model modifies and combines a number of existing models. The model calibrates the task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software life-cycle statistics.
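
    As a generic illustration of how such a parametric model works (the constants and multipliers below are invented and are not the calibrated JPL/DSN values), effort can be modeled as a size power law scaled by multipliers derived from the questionnaire responses:

```python
# Sketch: a generic parametric cost model of the COCOMO-like form
# effort = a * size^b * product(multipliers); constants are illustrative,
# not the JPL/DSN model's calibrated values.
def estimate_effort(ksloc, multipliers, a=2.8, b=1.1):
    effort = a * ksloc ** b
    for factor in multipliers:          # e.g., responses to the prompted questions
        effort *= factor
    return effort                        # person-months

answers = [1.15, 0.9, 1.05]             # hypothetical difficulty/environment factors
print(round(estimate_effort(32, answers), 1), "person-months")
```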

  13. Software defined networks a comprehensive approach

    CERN Document Server

    Goransson, Paul

    2014-01-01

    Software Defined Networks discusses the historical networking environment that gave rise to SDN, as well as the latest advances in SDN technology. The book gives you the state of the art knowledge needed for successful deployment of an SDN, including: how to explain to the non-technical business decision makers in your organization the potential benefits, as well as the risks, in shifting parts of a network to the SDN model; how to make intelligent decisions about when to integrate SDN technologies in a network; how to decide if your organization should be developing its own SDN applications or...

  14. Software for multistate analysis

    NARCIS (Netherlands)

    Willekens, Frans; Putter, H.

    2014-01-01

    Background: The growing interest in pathways, the increased availability of life-history data, innovations in statistical and demographic techniques, and advances in software technology have stimulated the development of software packages for multistate modeling of life histories. Objective: In the

  15. Software for multistate analysis

    NARCIS (Netherlands)

    Willekens, Frans; Putter, H.

    2014-01-01

    Background: The growing interest in pathways, the increased availability of life-history data, innovations in statistical and demographic techniques, and advances in software technology have stimulated the development of software packages for multistate modeling of life histories. Objective: In the

  16. Topics on data transmission problem in software definition network

    Science.gov (United States)

    Gao, Wei; Liang, Li; Xu, Tianwei; Gan, Jianhou

    2017-08-01

    In normal computer networks, the data transmission between two sites goes through the shortest path between the two corresponding vertices. However, in the setting of a software-defined network (SDN), the network traffic flow at each site and channel should be monitored in a timely manner, and the data transmission path between two sites in SDN should take into account the congestion in the current network. Hence, the difference in data transmission theory between a normal computer network and a software-defined network is that in SDN we should consider prohibited graph structures: these forbidden subgraphs represent the sites and channels through which data cannot pass because of serious congestion. Inspired by the theoretical analysis of available data transmission in SDN, we consider some computational problems from the perspective of graph theory. Several results determined in the paper give sufficient conditions for data transmission in SDN in various graph settings.
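
    The routing idea sketched in this abstract, avoiding congested ("forbidden") sites and channels, can be illustrated on a made-up topology with networkx; the paper's formal graph-theoretic results are not reproduced here.

```python
# Sketch: shortest-path routing that avoids congested ("forbidden") sites and
# links, on a made-up topology (not the paper's formal results).
import networkx as nx

G = nx.Graph()
G.add_edges_from([("s", "a"), ("a", "t"), ("s", "b"), ("b", "c"), ("c", "t")])

congested_nodes = {"a"}                 # sites too congested to relay data
congested_edges = set()                 # channels too congested to use

H = G.copy()
H.remove_nodes_from(congested_nodes)
H.remove_edges_from(congested_edges)

print(nx.shortest_path(G, "s", "t"))    # ignores congestion: ['s', 'a', 't']
print(nx.shortest_path(H, "s", "t"))    # avoids it: ['s', 'b', 'c', 't']
```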

  17. Distributed controller clustering in software defined networks.

    Directory of Open Access Journals (Sweden)

    Ahmed Abdelaziz

    Software Defined Networking (SDN) is an emerging and promising paradigm for network management because of its centralized network intelligence. However, the centralized control architecture of software-defined networks (SDNs) brings novel challenges of reliability, scalability, fault tolerance and interoperability. In this paper, we propose a novel clustered distributed controller architecture in a real setting of SDNs. The distributed cluster implementation comprises multiple popular SDN controllers. The proposed mechanism is evaluated using a real world network topology running on top of an emulated SDN environment. The result shows that the proposed distributed controller clustering mechanism is able to significantly reduce the average latency from 8.1% to 1.6% and the packet loss from 5.22% to 4.15%, compared to a distributed controller without clustering running on HP Virtual Application Network (VAN) SDN and Open Network Operating System (ONOS) controllers respectively. Moreover, the proposed method also shows reasonable CPU utilization results. Furthermore, the proposed mechanism makes it possible to handle unexpected load fluctuations while maintaining continuous network operation, even when there is a controller failure. The paper is a potential contribution stepping towards addressing the issues of reliability, scalability, fault tolerance, and interoperability.

  18. Distributed controller clustering in software defined networks.

    Science.gov (United States)

    Abdelaziz, Ahmed; Fong, Ang Tan; Gani, Abdullah; Garba, Usman; Khan, Suleman; Akhunzada, Adnan; Talebian, Hamid; Choo, Kim-Kwang Raymond

    2017-01-01

    Software Defined Networking (SDN) is an emerging and promising paradigm for network management because of its centralized network intelligence. However, the centralized control architecture of software-defined networks (SDNs) brings novel challenges of reliability, scalability, fault tolerance and interoperability. In this paper, we propose a novel clustered distributed controller architecture in a real setting of SDNs. The distributed cluster implementation comprises multiple popular SDN controllers. The proposed mechanism is evaluated using a real world network topology running on top of an emulated SDN environment. The result shows that the proposed distributed controller clustering mechanism is able to significantly reduce the average latency from 8.1% to 1.6% and the packet loss from 5.22% to 4.15%, compared to a distributed controller without clustering running on HP Virtual Application Network (VAN) SDN and Open Network Operating System (ONOS) controllers respectively. Moreover, the proposed method also shows reasonable CPU utilization results. Furthermore, the proposed mechanism makes it possible to handle unexpected load fluctuations while maintaining continuous network operation, even when there is a controller failure. The paper is a potential contribution stepping towards addressing the issues of reliability, scalability, fault tolerance, and interoperability.

  19. Software Health Management with Bayesian Networks

    Science.gov (United States)

    Mengshoel, Ole; Schumann, JOhann

    2011-01-01

    Most modern aircraft as well as other complex machinery are equipped with diagnostics systems for their major subsystems. During operation, sensors provide important information about the subsystem (e.g., the engine), and that information is used to detect and diagnose faults. Most of these systems focus on the monitoring of a mechanical, hydraulic, or electromechanical subsystem of the vehicle or machinery. Only recently have health management systems that monitor software been developed. In this paper, we discuss our approach of using Bayesian networks for Software Health Management (SWHM). We discuss SWHM requirements, which make advanced reasoning capabilities for detection and diagnosis important. Then we present our approach to using Bayesian networks for the construction of health models that dynamically monitor a software system and are capable of detecting and diagnosing faults.
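
    A minimal, hand-rolled illustration of the kind of probabilistic reasoning such a health model performs is given below; the probabilities and the single fault/alarm pair are made up, and a real SWHM Bayesian network contains many more variables.

```python
# Sketch: minimal Bayesian reasoning for software health monitoring
# (made-up probabilities; a real SWHM Bayesian network has many more nodes).
P_fault = 0.01                       # prior probability of a software fault
P_alarm_given_fault = 0.95           # monitor fires when the fault is present
P_alarm_given_ok = 0.02              # false-alarm rate

def posterior_fault(alarm: bool) -> float:
    """P(fault | alarm observation) via Bayes' rule."""
    likelihood_fault = P_alarm_given_fault if alarm else 1 - P_alarm_given_fault
    likelihood_ok = P_alarm_given_ok if alarm else 1 - P_alarm_given_ok
    evidence = likelihood_fault * P_fault + likelihood_ok * (1 - P_fault)
    return likelihood_fault * P_fault / evidence

print(f"P(fault | alarm)    = {posterior_fault(True):.3f}")
print(f"P(fault | no alarm) = {posterior_fault(False):.4f}")
```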

  20. Software safety analysis practice in installation phase

    Energy Technology Data Exchange (ETDEWEB)

    Huang, H. W.; Chen, M. H.; Shyu, S. S., E-mail: hwhwang@iner.gov.t [Institute of Nuclear Energy Research, No. 1000 Wenhua Road, Chiaan Village, Longtan Township, 32546 Taoyuan County, Taiwan (China)

    2010-10-15

    This work performed a software safety analysis in the installation phase of the Lungmen nuclear power plant in Taiwan, in cooperation between the Institute of Nuclear Energy Research and TPC. The US Nuclear Regulatory Commission requests licensees to perform software safety analysis and software verification and validation in each phase of the software development life cycle, per Branch Technical Position 7-14. In this work, 37 safety-grade digital instrumentation and control systems were analyzed by failure mode and effects analysis, as suggested by IEEE Standard 7-4.3.2-2003. During the installation phase, skew tests for the safety-grade network and point-to-point tests were performed. The failure mode and effects analysis showed that all single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (Author)

  1. Simulation Of Networking Protocols On Software Emulated Network Stack

    Directory of Open Access Journals (Sweden)

    Hrushikesh Nimkar

    2015-08-01

    With the increasing number and complexity of network-based applications, the need for easy configuration, development, and integration of network applications has taken high precedence. Trivial activities such as configuration can be carried out efficiently if network services are software based rather than hardware based. The project aims at enabling network engineers to easily include network functionalities into their configuration and define their own network stack without using the kernel network stack. With this in mind, we have implemented two functionalities: UPnP and mDNS. The multicast Domain Name System (mDNS) resolves host names to IP addresses within small ad-hoc networks without the need for a special DNS server and its configuration. The mDNS application provides every host with the functionality to register itself with the router, make a multicast DNS request, and resolve it. To make adding network devices and networked programs to a network as easy as plugging a piece of hardware into a PC, we make use of UPnP. The devices and programs find out about the network setup and other networked devices and programs through discovery and advertisement of services, and configure themselves accordingly. The UPnP application provides every host with the functionality of discovering the services of other hosts and serving requests on demand. To implement these applications we have used the snabbswitch framework, which is an open source virtualized Ethernet networking stack.

  2. Software defined networking applications in distributed datacenters

    CERN Document Server

    Qi, Heng

    2016-01-01

    This SpringerBrief provides essential insights on SDN application design and deployment in distributed datacenters. In this book, three key problems are discussed: SDN application design, SDN deployment, and SDN management. This book demonstrates how to design the SDN-based request allocation application in distributed datacenters. It also presents solutions for SDN controller placement to deploy SDN in distributed datacenters. Finally, an SDN management system is proposed to guarantee the performance of datacenter networks that are covered and controlled by many heterogeneous controllers. Researchers and practitioners alike will find this book a valuable resource for further study on Software Defined Networking.

  3. Flexible Edge Nodes enabled by Hybrid Software Defined Optics & Networking

    DEFF Research Database (Denmark)

    Vegas Olmos, Juan José; Mehmeri, Victor; Tafur Monroy, Idelfonso

    This paper presents our vision on flexible edge nodes for future networks and our efforts to combine software defined optics and software defined networking to optimize the overall performance and user experience.

  4. VennMaker for Historians: Sources, Social Networks and Software

    Directory of Open Access Journals (Sweden)

    Marten Düring

    2011-12-01

    This paper explores the applicability of the software VennMaker to historical research. The paper draws on two case studies from current network-oriented historical research projects, covering different time periods and sources. VennMaker’s biggest advantage is that it inverts the process of data collection. While traditional software uses pre-coded data to produce a network map, VennMaker generates data while the researcher draws nodes and creates a network map. Prefabricated data matrices are no longer necessary; therefore, the software can easily be used by historians lacking training in the social sciences. Our two cases include an analysis of a family structure in ancient history and ego-networks of Jews in hiding during National Socialism. We argue that a visual representation of social relations helps to reveal unseen patterns and characteristics of networks, therefore offering scholars new perspectives on their research subjects. The software offers a variety of tools to represent social relations and their development over time and space.

  5. Evolution of a modular software network.

    Science.gov (United States)

    Fortuna, Miguel A; Bonachela, Juan A; Levin, Simon A

    2011-12-13

    "Evolution behaves like a tinkerer" (François Jacob, Science, 1977). Software systems provide a singular opportunity to understand biological processes using concepts from network theory. The Debian GNU/Linux operating system allows us to explore the evolution of a complex network in a unique way. The modular design detected during its growth is based on the reuse of existing code in order to minimize costs during programming. The increase of modularity experienced by the system over time has not counterbalanced the increase in incompatibilities between software packages within modules. This negative effect is far from being a failure of design. A random process of package installation shows that the higher the modularity, the larger the fraction of packages working properly in a local computer. The decrease in the relative number of conflicts between packages from different modules avoids a failure in the functionality of one package spreading throughout the entire system. Some potential analogies with the evolutionary and ecological processes determining the structure of ecological networks of interacting species are discussed.

  6. Software for Graph Analysis and Visualization

    Directory of Open Access Journals (Sweden)

    M. I. Kolomeychenko

    2014-01-01

    This paper describes software for graph storage, analysis, and visualization. The article presents a comparative analysis of existing software for the analysis and visualization of graphs, and describes the overall architecture of the application and the basic principles of construction and operation of its main modules. Furthermore, a description of the developed graph store, oriented towards the storage and processing of large-scale graphs, is presented. The developed community-detection algorithm and the implemented automatic graph layout algorithms are the main functionality of the product. The main advantage of the developed software is the high-speed processing of large networks (up to millions of nodes and links). Moreover, the proposed graph storage architecture is unique and has no analogues. The developed approaches and algorithms are optimized for operating with big graphs and have high productivity.
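
    For a flavor of the two headline functions, community detection and automatic layout (using standard networkx algorithms rather than the product's own implementations), a minimal sketch is:

```python
# Sketch: community detection and a simple layout with networkx
# (standard library algorithms, not the product's own implementations).
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.karate_club_graph()                      # small example graph
communities = greedy_modularity_communities(G)  # modularity-based communities
positions = nx.spring_layout(G, seed=1)         # force-directed auto-layout

print(len(communities), "communities found")
print("node 0 placed at", positions[0])
```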

  7. A Quantum Cryptography Communication Network Based on Software Defined Network

    Directory of Open Access Journals (Sweden)

    Zhang Hongliang

    2018-01-01

    With the development of the Internet, information security has attracted great attention in today’s society, and quantum cryptography communication networks based on quantum key distribution (QKD) are a very important part of this field, since quantum key distribution combined with the one-time-pad encryption scheme can guarantee the unconditional security of the information. The secret key generated by quantum key distribution protocols is a very valuable resource, so making full use of key resources is particularly important. Software-defined networking (SDN) is a new type of network architecture; it separates the control plane and the data plane of network devices through OpenFlow technology, thus realizing flexible control of network resources. In this paper, a quantum cryptography communication network model based on SDN is proposed to realize flexible control of quantum key resources in the whole cryptography communication network. Moreover, we propose a routing algorithm which takes into account both the hops and the end-to-end available keys, so that the secret key generated by QKD can be used effectively. We also simulate this quantum cryptography communication network, and the result shows that, based on SDN and the proposed routing algorithm, the performance of this network is improved thanks to the effective use of the quantum key resources.
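
    One way to sketch a routing rule that considers both hops and end-to-end available keys is shown below; the topology, key-pool sizes, and scoring weights are invented for illustration and do not reproduce the paper's algorithm.

```python
# Sketch: pick a route by trading off hop count against the end-to-end
# available key (the minimum key pool along the path). Topology, key counts,
# and weights are made up; this is not the paper's exact algorithm.
import networkx as nx

G = nx.Graph()
# Each QKD link carries a pool of secret key bits (hypothetical values).
G.add_edge("A", "B", keys=500)
G.add_edge("B", "D", keys=120)
G.add_edge("A", "C", keys=300)
G.add_edge("C", "D", keys=400)

def route_score(path, alpha=1.0, beta=0.01):
    hops = len(path) - 1
    end_to_end_keys = min(G[u][v]["keys"] for u, v in zip(path, path[1:]))
    return -alpha * hops + beta * end_to_end_keys   # higher is better

paths = list(nx.all_simple_paths(G, "A", "D"))
best = max(paths, key=route_score)
print(best, route_score(best))   # prefers A-C-D over the B-D key bottleneck
```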

  8. Enabling software defined networking experiments in networked critical infrastructures

    Directory of Open Access Journals (Sweden)

    Béla Genge

    2014-05-01

    Nowadays, the fact that Networked Critical Infrastructures (NCI), e.g., power plants, water plants, oil and gas distribution infrastructures, and electricity grids, are targeted by significant cyber threats is well known. Nevertheless, recent research has shown that specific characteristics of NCI can be exploited to enable more efficient mitigation techniques, while novel techniques from the field of IP networks can bring significant advantages. In this paper we explore the interconnection of NCI communication infrastructures with Software Defined Networking (SDN)-enabled network topologies. SDN provides the means to create virtual networking services and to implement global networking decisions. It relies on OpenFlow to enable communication with remote devices and has recently been categorized as the “Next Big Technology”, which will revolutionize the way decisions are implemented in switches and routers. Therefore, the paper documents the first steps towards enabling an SDN-NCI and presents the impact of a Denial of Service experiment on traffic from an XBee sensor network which is routed across an emulated SDN network.

  9. Fast Program Codes Dissemination for Smart Wireless Software Defined Networks

    Directory of Open Access Journals (Sweden)

    Xiao Liu

    2016-01-01

    In smart wireless software defined networks (WSDNs), sensor nodes are deployed in the monitored area to sense data. In order to increase the flexibility of WSDN configuration, sensor nodes use programmable technology. Thus, programming and software engineering that integrate the Internet of Things (IoT) lead to a smart world. Due to the large size of program codes and the limited energy of the wireless network, only a subset of nodes is selected to spread program codes, while the remaining nodes stay in sleep status to save energy. In this paper, a fast program codes dissemination (FPCD) scheme for smart wireless software defined networking is proposed: many nodes in the areas far from the sink, which have much energy left, are selected to spread program codes, while the area near the sink chooses a smaller number of active nodes to spread program codes in order to save energy. Thus, the FPCD scheme can reduce the delay for spreading program codes while retaining network lifetime. The theoretical analysis and experimental results show that our approach can reduce transmission delay by 10.76%–105.791% while retaining network lifetime, compared with previous broadcast schemes.

  10. A modular attachment mechanism for software network evolution

    Science.gov (United States)

    Li, Hui; Zhao, Hai; Cai, Wei; Xu, Jiu-Qiang; Ai, Jun

    2013-05-01

    A modular attachment mechanism for software network evolution is presented in this paper. Compared with previous models, our treatment of an object-oriented software system as a network of modules is inherently more realistic. To determine the incoming and outgoing links acquired when new nodes attach to the existing directed network, a new definition of asymmetric probabilities is given. Based on this, modular attachment, instead of the single-node attachment of previous models, is adopted. The proposed mechanism is demonstrated to be able to generate networks with power-law, small-world, and modularity features, which represent more realistic properties of actual software networks. This work therefore contributes to a more accurate understanding of the evolutionary mechanism of software systems. What is more, explorations of the effects of various software development principles on the structure of software systems have been carried out, which are expected to be beneficial to software engineering practice.
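
    A loose sketch of directed, module-based growth with asymmetric attachment rules for outgoing and incoming links is given below; the module size, weights, and attachment rules are illustrative stand-ins, not the paper's exact model.

```python
# Sketch: grow a directed network by attaching small modules, with different
# (asymmetric) attachment rules for outgoing and incoming links.
# Parameters and rules are illustrative, not the paper's exact model.
import random
import networkx as nx

rng = random.Random(42)
G = nx.DiGraph([(0, 1), (1, 2), (2, 0)])        # small seed network

def weighted_pick(nodes, weight):
    weights = [weight(n) + 1 for n in nodes]    # +1 so new nodes can be chosen
    return rng.choices(nodes, weights=weights, k=1)[0]

for step in range(50):
    existing = list(G.nodes())
    # Attach a small module of 3 tightly linked classes instead of one node.
    module = [len(existing) + i for i in range(3)]
    G.add_edges_from((module[i], module[(i + 1) % 3]) for i in range(3))
    # Outgoing link: the module depends on a popular (high in-degree) node.
    G.add_edge(module[0], weighted_pick(existing, G.in_degree))
    # Incoming link: a node that already depends on many others links to the module.
    G.add_edge(weighted_pick(existing, G.out_degree), module[1])

print(G.number_of_nodes(), "nodes,", G.number_of_edges(), "edges")
```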

  11. Software Comparison for Renewable Energy Deployment in a Distribution Network

    Energy Technology Data Exchange (ETDEWEB)

    Gao, David Wenzhong [Alternative Power Innovations, LLC, Sharonville, OH (United States); Muljadi, Eduard [National Renewable Energy Lab. (NREL), Golden, CO (United States); Tian, Tian [National Renewable Energy Lab. (NREL), Golden, CO (United States); Miller, Mackay [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-02-22

    The main objective of this report is to evaluate different software options for performing robust distributed generation (DG) power system modeling. The features and capabilities of four simulation tools, OpenDSS, GridLAB-D, CYMDIST, and PowerWorld Simulator, are compared to analyze their effectiveness in analyzing distribution networks with DG. OpenDSS and GridLAB-D, two open-source packages, have the capability to simulate networks with fluctuating data values. These packages allow a simulation to be run at each time instant by iterating only the main script file. CYMDIST, a commercial package, allows for time-series simulation to study variations in network controls. PowerWorld Simulator, another commercial tool, has a batch-mode simulation function through the 'Time Step Simulation' tool, which obtains solutions for a list of specified time points. PowerWorld Simulator is intended for the analysis of transmission-level systems, while the other three are designed for distribution systems. CYMDIST and PowerWorld Simulator feature easy-to-use graphical user interfaces (GUIs). OpenDSS and GridLAB-D, on the other hand, are based on command-line programs, which increases the time necessary to become familiar with these software packages.

  12. Use of Time-Frequency Analysis and Neural Networks for Mode Identification in a Wireless Software-Defined Radio Approach

    Directory of Open Access Journals (Sweden)

    Matteo Gandetto

    2004-09-01

    The use of time-frequency distributions is proposed as a nonlinear signal processing technique that is combined with a pattern recognition approach to identify superimposed transmission modes in a reconfigurable wireless terminal based on software-defined radio techniques. In particular, a software-defined radio receiver is described aiming at the identification of two coexistent communication modes: frequency hopping code division multiple access and direct sequence code division multiple access. As a case study, two standards based on these modes and operating in the same band (industrial, scientific, and medical) are considered: IEEE WLAN 802.11b (direct sequence) and Bluetooth (frequency hopping). Neural classifiers are used to obtain identification results. A comparison between two different neural classifiers is made in terms of relative error frequency.

  13. Advanced information processing system: Input/output network management software

    Science.gov (United States)

    Nagle, Gail; Alger, Linda; Kemp, Alexander

    1988-01-01

    The purpose of this document is to provide the software requirements and specifications for the Input/Output Network Management Services for the Advanced Information Processing System. This introduction and overview section is provided to briefly outline the overall architecture and software requirements of the AIPS system before discussing the details of the design requirements and specifications of the AIPS I/O Network Management software. A brief overview of the AIPS architecture is followed by a more detailed description of the network architecture.

  14. Software-Programmed Optical Networking with Integrated NFV Service Provisioning

    DEFF Research Database (Denmark)

    Mehmeri, Victor; Wang, Xi; Basu, Shrutarshi

    2017-01-01

    We showcase demonstrations of “program & compile” styled optical networking as well as open platforms & standards based NFV service provisioning using a proof-of-concept implementation of the Software-Programmed Networking Operating System (SPN OS).

  15. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
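
    To make the idea of identification and mitigation trees concrete (as a hypothetical stand-in only, not AutSEC's actual rule base or data-flow-diagram format), a tiny lookup over DFD elements might look like this:

```python
# Sketch: a tiny identification/mitigation lookup in the spirit of threat
# modeling over data flow diagram (DFD) elements. The trees below are
# hypothetical stand-ins, not AutSEC's actual rule base.
DFD = [
    {"element": "web_form", "type": "external_interactor"},
    {"element": "user_db", "type": "data_store"},
    {"element": "login_flow", "type": "data_flow"},
]

IDENTIFICATION_TREE = {
    "external_interactor": ["spoofing"],
    "data_store": ["information disclosure", "tampering"],
    "data_flow": ["eavesdropping"],
}
MITIGATION_TREE = {
    "spoofing": "require strong authentication",
    "tampering": "enable integrity checks / audit logs",
    "information disclosure": "encrypt data at rest",
    "eavesdropping": "use TLS on the channel",
}

for element in DFD:
    for threat in IDENTIFICATION_TREE.get(element["type"], []):
        print(f"{element['element']}: {threat} -> {MITIGATION_TREE[threat]}")
```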

  16. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

    Full Text Available The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  17. Integrated analysis software for bulk power system stability

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, T.; Nagao, T.; Takahashi, K. [Central Research Inst. of Electric Power Industry, Tokyo (Japan)

    1994-12-31

    This paper presents three software packages developed in-house at the Central Research Institute of Electric Power Industry (CRIEPI) for bulk power network analysis, together with a user support system that arranges, easily and reliably, the large amount of data these packages require. (author) 3 refs., 7 figs., 2 tabs.

  18. Software architecture for hybrid electrical/optical data center network

    DEFF Research Database (Denmark)

    Mehmeri, Victor; Vegas Olmos, Juan José; Tafur Monroy, Idelfonso

    2016-01-01

    This paper presents hardware and software architecture based on Software-Defined Networking (SDN) paradigm and OpenFlow/NETCONF protocols for enabling topology management of hybrid electrical/optical switching data center networks. In particular, a development on top of SDN open-source controller...

  19. MAUS: MICE Analysis User Software

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The Muon Ionization Cooling Experiment (MICE) has developed the MICE Analysis User Software (MAUS) to simulate and analyse experimental data. It serves as the primary codebase for the experiment, providing for online data quality checks and offline batch simulation and reconstruction. The code is structured in a Map-Reduce framework to allow parallelization whether on a personal machine or in the control room. Various software engineering practices from industry are also used to ensure correct and maintainable physics code, which include unit, functional and integration tests, continuous integration and load testing, code reviews, and distributed version control systems. Lastly, there are various small design decisions like using JSON as the data structure, using SWIG to allow developers to write components in either Python or C++, or using the SCons python-based build system that may be of interest to other experiments.

  20. Towards Software Health Management with Bayesian Networks

    Data.gov (United States)

    National Aeronautics and Space Administration — As software and software intensive systems are becoming increasingly ubiquitous, the impact of failures can be tremendous. In some industries such as aerospace,...

  1. Spotlight-8 Image Analysis Software

    Science.gov (United States)

    Klimek, Robert; Wright, Ted

    2006-01-01

    Spotlight is a cross-platform GUI-based software package designed to perform image analysis on sequences of images generated by combustion and fluid physics experiments run in a microgravity environment. Spotlight can perform analysis on a single image in an interactive mode or perform analysis on a sequence of images in an automated fashion. Image processing operations can be employed to enhance the image before various statistics and measurement operations are performed. An arbitrarily large number of objects can be analyzed simultaneously with independent areas of interest. Spotlight saves results in a text file that can be imported into other programs for graphing or further analysis. Spotlight can be run on Microsoft Windows, Linux, and Apple OS X platforms.
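
    As a hedged illustration of the batch statistics-over-areas-of-interest workflow described above (not Spotlight's actual interface; the file names, the rectangular area, and the chosen statistics are assumptions), a minimal Python sketch with NumPy and Pillow might look like this.

```python
# Minimal sketch (not Spotlight itself): batch statistics over a rectangular
# area of interest (AOI) for a sequence of images, written to a text file.
# File names, AOI coordinates, and the chosen statistics are illustrative.
import glob
import numpy as np
from PIL import Image

AOI = (100, 80, 228, 208)  # (left, top, right, bottom), hypothetical region

with open("aoi_stats.txt", "w") as out:
    out.write("frame\tmean\tstd\tmin\tmax\n")
    for path in sorted(glob.glob("frames/*.png")):
        gray = np.asarray(Image.open(path).convert("L"), dtype=float)
        roi = gray[AOI[1]:AOI[3], AOI[0]:AOI[2]]
        out.write(f"{path}\t{roi.mean():.2f}\t{roi.std():.2f}"
                  f"\t{roi.min():.0f}\t{roi.max():.0f}\n")
```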

  2. Analysis of computer networks

    CERN Document Server

    Gebali, Fayez

    2015-01-01

    This textbook presents the mathematical theory and techniques necessary for analyzing and modeling high-performance global networks, such as the Internet. The three main building blocks of high-performance networks are links, switching equipment connecting the links together, and software employed at the end nodes and intermediate switches. This book provides the basic techniques for modeling and analyzing these last two components. Topics covered include, but are not limited to: Markov chains and queuing analysis, traffic modeling, interconnection networks and switch architectures and buffering strategies. It provides techniques for modeling and analysis of network software and switching equipment; discusses design options used to build efficient switching equipment; includes many worked examples of the application of discrete-time Markov chains to communication systems; and covers the mathematical theory and techniques necessary for ana...

  3. Pandora Operation and Analysis Software

    Science.gov (United States)

    Herman, Jay; Cede, Alexander; Abuhassan, Nader

    2012-01-01

    Pandora Operation and Analysis Software controls the Pandora Sun- and sky-pointing optical head and built-in filter wheels (neutral density, UV bandpass, polarization filters, and opaque). The software also controls the attached spectrometer exposure time and thermoelectric cooler to maintain the spectrometer temperature to within 1 °C. All functions are available through a GUI so as to be easily accessible by the user. The data are automatically stored on a miniature computer (netbook) for automatic download to a designated server at user-defined intervals (once per day, once per week, etc.), or to a USB external device. An additional software component reduces the raw data (spectrometer counts) to preliminary scientific products for quick-view purposes. The Pandora systems are built from off-the-shelf commercial parts and from mechanical parts machined using electronic machine shop drawings. The Pandora spectrometer system is designed to look at the Sun (tracking to within 0.1°), or to look at the sky at any zenith or azimuth angle, to gather information about the amount of trace gases or aerosols that are present.

  4. Computer, Network, Software, and Hardware Engineering with Applications

    CERN Document Server

    Schneidewind, Norman F

    2012-01-01

    There are many books on computers, networks, and software engineering, but none that integrate the three with applications. Integration is important because, increasingly, software dominates the performance, reliability, maintainability, and availability of complex computer systems. Books on software engineering typically portray software as if it exists in a vacuum with no relationship to the wider system. This is wrong because a system is more than software. It is comprised of people, organizations, processes, hardware, and software. All of these components must be considered in an integr

  5. A research on the application of software defined networking in satellite network architecture

    Science.gov (United States)

    Song, Huan; Chen, Jinqiang; Cao, Suzhi; Cui, Dandan; Li, Tong; Su, Yuxing

    2017-10-01

    Software defined networking is a new type of network architecture that decouples the control plane and data plane of a traditional network, offers flexible configuration, and is a direction for the development of the next-generation terrestrial Internet. The satellite network is an important part of the space-ground integrated information network, but traditional satellite networks suffer from difficult network topology maintenance and slow configuration. The application of SDN technology in satellite networks can solve the problems that traditional satellite networks face. At present, research on the application of SDN technology in satellite networks is still at a preliminary stage. In this paper, we start by introducing SDN technology and satellite network architecture. We then mainly introduce software defined satellite network architectures, as well as a comparison of different software defined satellite network architectures and satellite network virtualization. Finally, the present research status and development trend of SDN technology in satellite networks are analyzed.

  6. Software Architecture Reliability Analysis using Failure Scenarios

    NARCIS (Netherlands)

    Tekinerdogan, B.; Sözer, Hasan; Aksit, Mehmet

    2005-01-01

    We propose a Software Architecture Reliability Analysis (SARA) approach that benefits from both reliability engineering and scenario-based software architecture analysis to provide an early reliability analysis of the software architecture. SARA makes use of failure scenarios that are prioritized

  7. Development of a New VLBI Data Analysis Software

    Science.gov (United States)

    Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.

    2010-01-01

    We present an overview of a new VLBI analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software. We formulate the main goals of the software. The software should be flexible and modular to implement models and estimation techniques that currently exist or will appear in future. On the other hand it should be reliable and possess production quality for processing standard VLBI sessions. Also, it needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.

  8. Network systems security analysis

    Science.gov (United States)

    Yılmaz, İsmail

    2015-05-01

    Network systems security analysis has utmost importance in today's world. Many companies, like banks that give priority to data management, test their own data security systems with "Penetration Tests" from time to time. In this context, companies must also test their own network/server systems and take precautions, as data security draws attention. Based on this idea, this study researches cyber-attacks thoroughly and examines penetration testing techniques. With this information, cyber-attacks are classified and the security of network systems is then tested systematically. After the testing period, all data are reported and filed for future reference. Consequently, it is found that human beings are the weakest link in the chain and that simple mistakes may unintentionally cause huge problems. Thus, it is clear that some precautions must be taken to avoid such threats, like updating the security software.

  9. Malware Propagation and Prevention Model for Time-Varying Community Networks within Software Defined Networks

    Directory of Open Access Journals (Sweden)

    Lan Liu

    2017-01-01

    Full Text Available As the adoption of Software Defined Networks (SDNs) grows, the security of SDN still has several unaddressed limitations. A key network security research area is the study of malware propagation across SDN-enabled networks. To analyze the spreading processes of network malware (e.g., viruses) in SDN, we propose a dynamic model with a time-varying community network, inspired by research models on the spread of epidemics in complex networks across communities. We treat subnets of the network as communities, with links that are dense within subnets but sparse between them. Using numerical simulation and theoretical analysis, we find that the efficiency of network malware propagation in this model depends on the mobility rate q of the nodes between subnets. We also find that there exists a mobility rate threshold qc: the network malware will spread in the SDN and survive when the mobility rate q > qc, and it will perish when q < qc. These findings characterize the propagation of network malware and provide a theoretical basis to reduce and prevent network security incidents.
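
    The record above describes the model only qualitatively. The following toy simulation (an illustrative sketch, not the authors' exact model; all parameter values, the SIS-style infection rule, and the mobility rule are assumptions) shows how a mobility rate q between subnets can gate whether malware stays confined to one community or spreads network-wide.

```python
# Toy SIS-style malware spread across subnets (communities); nodes hop between
# subnets with mobility rate q. Parameters and update rules are illustrative.
import random

def simulate(q, subnets=10, nodes=400, beta=0.05, mu=0.02, steps=400, seed=1):
    rng = random.Random(seed)
    location = [i % subnets for i in range(nodes)]   # spread nodes over subnets
    infected = [False] * nodes
    infected[0] = True                               # seed one infected node
    for _ in range(steps):
        # mobility: each node hops to a random subnet with probability q
        for n in range(nodes):
            if rng.random() < q:
                location[n] = rng.randrange(subnets)
        # count infected nodes per subnet
        inf_count = [0] * subnets
        for n in range(nodes):
            if infected[n]:
                inf_count[location[n]] += 1
        # infection within a subnet: prob 1 - (1 - beta)^(#infected there);
        # infected nodes recover with probability mu (same pass, for brevity)
        for n in range(nodes):
            if not infected[n]:
                k = inf_count[location[n]]
                if k and rng.random() < 1 - (1 - beta) ** k:
                    infected[n] = True
            elif rng.random() < mu:
                infected[n] = False
    return sum(infected) / nodes

for q in (0.0, 0.01, 0.05, 0.2):
    print(f"q={q:.2f}  final infected fraction ~ {simulate(q):.2f}")
```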

  10. Capacity Extension of Software Defined Data Center Networks With Jellyfish Topology

    DEFF Research Database (Denmark)

    Mehmeri, Victor; Vegas Olmos, Juan José; Tafur Monroy, Idelfonso

    We present a performance analysis of Jellyfish topology with Software-Defined commodity switches for Data Center networks. Our results show up to a 2-fold performance gain when compared to a Spanning Tree Protocol implementation.

  11. Service-oriented Software Defined Optical Networks for Cloud Computing

    Science.gov (United States)

    Liu, Yuze; Li, Hui; Ji, Yuefeng

    2017-10-01

    With the development of big data and cloud computing technology, the traditional software-defined network is facing new challenges (e.g., ubiquitous accessibility, higher bandwidth, more flexible management and greater security). This paper proposes a new service-oriented software defined optical network architecture, including a resource layer, a service abstract layer, a control layer and an application layer. We then dwell on the corresponding service provisioning method: a different service ID is used to identify the service a device can offer. Finally, we experimentally demonstrate that the proposed service provisioning method can be applied to transmit different services based on the service ID in the service-oriented software defined optical network.

  12. Analysis of Traffic Signals on a Software-Defined Network for Detection and Classification of a Man-in-the-Middle Attack

    Science.gov (United States)

    2017-09-01

    numerous security risks. The MITM attack proposed by Hong et al. [4] and explored in this thesis takes advantage of security risks resulting from the... [Figure residue: SDN architecture diagram showing an application layer (unified network monitoring and analysis; network access control and bring-your-own-device; network virtualization; security and other business apps), a control layer (SDN controllers with northbound, eastbound and westbound APIs, e.g. REST), and an infrastructure layer.]

  13. Social Network Analysis with sna

    Directory of Open Access Journals (Sweden)

    Carter T. Butts

    2007-12-01

    Full Text Available Modern social network analysis, the analysis of relational data arising from social systems, is a computationally intensive area of research. Here, we provide an overview of a software package which provides support for a range of network analytic functionality within the R statistical computing environment. General categories of currently supported functionality are described, and brief examples of package syntax and usage are shown.
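
    The sna package itself is written for R; since the illustrative code in this document uses Python, the sketch below shows an analogous computation of basic network-analytic quantities (density, degree and betweenness centrality) with the networkx library on a tiny, invented relational data set.

```python
# Analogous computation in Python with networkx (the sna package itself is for R).
import networkx as nx

# Toy directed relational data: who advises whom in a small group (illustrative).
edges = [("ann", "bob"), ("bob", "cho"), ("cho", "ann"), ("dee", "bob")]
g = nx.DiGraph(edges)

print("density:", nx.density(g))
print("in-degree centrality:", nx.in_degree_centrality(g))
print("betweenness centrality:", nx.betweenness_centrality(g))
```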

  14. Software For Management Of A Packet-Radio Network

    Science.gov (United States)

    Smyth, Patrick J.; Chauvin, Todd H.; Oliver, Gordon P.; Statman, Joseph I.

    1994-01-01

    Network-management software assists in planning, monitoring, and controlling the resources of the Datalink network, a packet-message network featuring time-division multiple access, frequency and spatial diversity, and a dynamic tree-structured routing scheme. It was developed for communication between a central control station on the ground and instrumented aircraft flying over a test range. The aircraft derive navigational data from satellites of the Global Positioning System, and the primary function of the Datalink network is to feed GPS position data from participating aircraft into the control center in real time.

  15. Semantic Security Methods for Software-Defined Networks

    Directory of Open Access Journals (Sweden)

    Ekaterina Ju. Antoshina

    2017-01-01

    Full Text Available Software-defined networking is a promising technology for constructing communication networks in which network management is performed by software that configures the network devices. This contrasts with the traditional point of view, where network behaviour is updated by manually uploading configurations to the devices under control. The software controller allows dynamic routing configuration inside the network depending on the quality of service. However, there must be a proof that ensures that every network flow is secure; for example, we can define a security policy as follows: confidential nodes cannot send data to the public segment of the network. The paper shows how this problem can be solved by using a semantic security model. We propose a method that allows us to construct semantics that capture the necessary security properties the network must follow. This involves a specification that states allowed and forbidden network flows. The specification is then modeled as a decision tree that may be reduced. We use the decision tree for the semantic construction that captures the security requirements. The semantics can be implemented as a module of the controller software, so the correctness of the control plane of the network can be ensured on-the-fly.
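
    As a purely illustrative sketch of the flow-policy idea described above (the node classes, segment names, and rule structure are invented and do not reproduce the authors' formal semantics), a controller module could consult a small decision-tree-like check such as the following before installing a route.

```python
# Minimal sketch of the idea (not the authors' formal semantics): a tiny
# decision-tree-like policy that forbids flows from confidential nodes to the
# public network segment. Node classes and segment names are illustrative.
NODE_CLASS = {"hr-db": "confidential", "web-frontend": "public", "app-1": "internal"}
SEGMENT = {"hr-db": "secure", "web-frontend": "public", "app-1": "internal"}

def flow_allowed(src, dst):
    # Root decision: is the source confidential?
    if NODE_CLASS.get(src) == "confidential":
        # Confidential data may never reach the public segment.
        return SEGMENT.get(dst) != "public"
    return True

# A controller module could consult this check before installing a route.
print(flow_allowed("hr-db", "web-frontend"))  # False: forbidden flow
print(flow_allowed("app-1", "web-frontend"))  # True
```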

  16. Fingerprinting Software Defined Networks and Controllers

    Science.gov (United States)

    2015-03-01

    ...demonstrates the feasibility of uniquely identifying the software managing the SDN environment. With positive identification of the software

  17. Content-Centric and Software-Defined Networking with Big Data

    OpenAIRE

    Yao, Haipeng; Qiu, Chao; Fang, Chao; Chen, Xu; Yu, F. Richard

    2016-01-01

    Many communities have researched the application of novel network architectures such as Content-Centric Networking (CCN) and Software-Defined Networking (SDN) to build the future Internet. Another emerging technology, big data analytics, has also attracted considerable attention from academia and industry. Much excellent research has been done on CCN, SDN, and big data, but these topics have largely been addressed separately in the literature. In this paper, we propose a novel network paradigm to jointl...

  18. Software defined networking with OpenFlow

    CERN Document Server

    Azodolmolky, Siamak

    2013-01-01

    A step-by-step, example-based guide which will help you gain hands-on experience with the platforms and debugging tools on OpenFlow. If you are a network engineer, architect, junior researcher or an application developer, this book is ideal for you. You will need to have some level of network experience, knowledge of broad networking concepts, and some familiarity with the day-to-day operation of computer networks. Ideally, you should also be familiar with programming/scripting languages (especially Python and Java) and system virtualization.

  19. Detecting P2P Botnet in Software Defined Networks

    Directory of Open Access Journals (Sweden)

    Shang-Chiuan Su

    2018-01-01

    Full Text Available Software Defined Networking separates the control plane from the network equipment and has great advantages in network management compared with traditional approaches. With this paradigm, security issues persist and could become even worse because of the flexibility in handling packets. In this paper we propose an effective framework that integrates SDN and machine learning to detect and categorize P2P network traffic. This work provides experimental evidence showing that our approach can automatically analyze network traffic and flexibly change flow entries in OpenFlow switches through the SDN controller. This can effectively help network administrators manage related security problems.
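
    To make the machine-learning step concrete, here is a hedged sketch of flow classification with a random forest in scikit-learn; the feature set and the synthetic data are placeholders, and the OpenFlow flow-statistics collection and rule installation described in the record are not shown.

```python
# Sketch of the traffic-classification step only (features and data are synthetic;
# collecting flow statistics from OpenFlow switches is not shown here).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
# Hypothetical per-flow features: [packets/s, bytes/packet, duration, distinct peers]
X = rng.random((1000, 4))
y = (X[:, 3] > 0.7).astype(int)   # stand-in label: "P2P botnet-like" vs benign

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
# Flows classified as botnet traffic could then be blocked by pushing
# drop rules to the switches through the SDN controller.
```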

  20. Software Switching for High Throughput Data Acquisition Networks

    CERN Document Server

    AUTHOR|(CDS)2089787; Lehmann Miotto, Giovanna

    The bursty many-to-one communication pattern, typical for data acquisition systems, is particularly demanding for commodity TCP/IP and Ethernet technologies. The problem arising from this pattern is widely known in the literature as incast and can be observed as TCP throughput collapse. It is a result of overloading the switch buffers when a specific node in a network requests data from multiple sources. This will become even more demanding for future upgrades of the experiments at the Large Hadron Collider at CERN. It is questionable whether commodity TCP/IP and Ethernet technologies in their current form will still be able to effectively adapt to bursty traffic without losing packets due to the scarcity of buffers in the networking hardware. This thesis provides an analysis of TCP/IP performance in data acquisition networks and presents a novel approach to incast congestion in these networks based on software-based packet forwarding. Our first contribution lies in confirming the strong analogies bet...

  1. Software Delivery Risk Management: Application of Bayesian Networks in Agile Software Development

    Directory of Open Access Journals (Sweden)

    Ancveire Ieva

    2015-12-01

    Full Text Available The information technology industry cannot be imagined without large- or small-scale projects. They are implemented to develop systems enabling key business processes and improving performance and enterprise resource management. However, projects often experience various difficulties during their execution. These problems are usually related to the three objectives of the project – costs, quality and deadline. One way to address these challenges is project risk management. However, the main problems and their influencing factors cannot always be easily identified, and usually there is a need for a more profound analysis of the problem situation. In this paper, we propose the use of the Bayesian Network concept for quantitative risk management in agile projects. The Bayesian Network is explored using a case study focusing on a project that faces difficulties during the software delivery process. We explain why an agile risk analysis is needed and assess the potential risk factors which may occur during the project. Thereafter, we design the Bayesian Network to capture the actual problem situation and make suggestions on how to improve the delivery process, based on the measures to be taken to reduce the occurrence of project risks.
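
    For readers unfamiliar with the mechanics, the sketch below builds a miniature Bayesian network with the pgmpy library and queries the probability of a delivery delay given one risk factor; the structure, variable names, and probabilities are invented for illustration and are not taken from the case study, and the model class name varies with the pgmpy version.

```python
# Illustrative Bayesian network built with pgmpy; structure and probabilities
# are invented. Note: depending on the pgmpy version, the model class is named
# BayesianModel, BayesianNetwork, or DiscreteBayesianNetwork.
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("ScopeCreep", "Delay"), ("TeamTurnover", "Delay")])
model.add_cpds(
    TabularCPD("ScopeCreep", 2, [[0.7], [0.3]]),      # P(no)=0.7, P(yes)=0.3
    TabularCPD("TeamTurnover", 2, [[0.8], [0.2]]),
    TabularCPD(
        "Delay", 2,
        # Columns: (ScopeCreep, TeamTurnover) = (no,no), (no,yes), (yes,no), (yes,yes)
        [[0.90, 0.60, 0.50, 0.20],   # Delay = no
         [0.10, 0.40, 0.50, 0.80]],  # Delay = yes
        evidence=["ScopeCreep", "TeamTurnover"],
        evidence_card=[2, 2],
    ),
)
assert model.check_model()

# Probability of a delivery delay given that scope creep has been observed.
posterior = VariableElimination(model).query(["Delay"], evidence={"ScopeCreep": 1})
print(posterior)
```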

  2. Network coded software defined networking: enabling 5G transmission and storage networks

    DEFF Research Database (Denmark)

    Krigslund, Jeppe; Hansen, Jonas; Lucani Rötter, Daniel Enrique

    2015-01-01

    Software defined networking has garnered large attention due to its potential to virtualize services in the Internet, introducing flexibility in the buffering, scheduling, processing, and routing of data in network routers. SDN breaks the deadlock that has kept Internet network protocols stagnant for decades, while applications and physical links have evolved. This article advocates for the use of SDN to bring about 5G network services by incorporating network coding (NC) functionalities. The latter constitutes a major leap forward compared to the state-of-the-art store and forward Internet paradigm. The inherent flexibility of both SDN and NC provides fertile ground to envision more efficient, robust, and secure networking designs, which may also incorporate content caching and storage, all of which are key challenges of the upcoming 5G networks. This article not only proposes the fundamentals

  3. Software architecture analysis of usability

    NARCIS (Netherlands)

    Folmer, Eelke

    2005-01-01

    One of the qualities that has received increased attention in recent decades is usability. A software product with poor usability is likely to fail in a highly competitive market; therefore software developing organizations are paying more and more attention to ensuring the usability of their

  4. Mitigating the controller performance bottlenecks in Software Defined Networks

    DEFF Research Database (Denmark)

    Caba, Cosmin Marius; Soler, José

    2016-01-01

    The centralization of the control plane decision logic in Software Defined Networking (SDN) has raised concerns regarding the performance of the SDN Controller (SDNC) when the network scales up. A number of solutions have been proposed in the literature to address these concerns. This paper...

  5. FALCON: a software package for analysis of nestedness in bipartite networks [v1; ref status: indexed, http://f1000r.es/3z8

    Directory of Open Access Journals (Sweden)

    Stephen J. Beckett

    2014-08-01

    Full Text Available Nestedness is a statistical measure used to interpret bipartite interaction data in several ecological and evolutionary contexts, e.g. biogeography (species-site relationships) and species interactions (plant-pollinator and host-parasite networks). Multiple methods have been used to evaluate nestedness, which differ in how the metrics for nestedness are determined. Furthermore, several different null models have been used to calculate statistical significance of nestedness scores. The profusion of measures and null models, many of which give conflicting results, is problematic for comparison of nestedness across different studies. We developed the FALCON software package to allow easy and efficient comparison of nestedness scores and statistical significances for a given input network, using a selection of the more popular measures and null models from the current literature. FALCON currently includes six measures and five null models for nestedness in binary networks, and two measures and four null models for nestedness in weighted networks. The FALCON software is designed to be efficient and easy to use. FALCON code is offered in three languages (R, MATLAB, Octave) and is designed to be modular and extensible, enabling users to easily expand its functionality by adding further measures and null models. FALCON provides a robust methodology for comparing the strength and significance of nestedness in a given bipartite network using multiple measures and null models. It includes an “adaptive ensemble” method to reduce undersampling of the null distribution when calculating statistical significance. It can work with binary or weighted input networks. FALCON is a response to the proliferation of different nestedness measures and associated null models in the literature. It allows easy and efficient calculation of nestedness scores and statistical significances using different methods, enabling comparison of results from
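
    The workflow FALCON automates can be illustrated in a few lines: compute a nestedness score for a binary matrix and compare it against a null distribution. The sketch below (not FALCON code) uses a simplified NODF-style measure and the simplest possible null model, a random shuffle of cells that preserves only the total fill; both choices are assumptions made for brevity.

```python
# Simplified NODF-style nestedness score plus a basic permutation null model.
import numpy as np

def nodf(m):
    """Simplified NODF-style nestedness score (0-100) for a binary matrix."""
    def paired(mat):
        fills = mat.sum(axis=1)
        scores = []
        for i in range(len(mat)):
            for j in range(i + 1, len(mat)):
                hi, lo = (i, j) if fills[i] > fills[j] else (j, i)
                if fills[i] == fills[j] or fills[lo] == 0:
                    scores.append(0.0)          # no "decreasing fill" contribution
                else:
                    overlap = np.logical_and(mat[hi], mat[lo]).sum()
                    scores.append(100.0 * overlap / fills[lo])
        return scores
    return float(np.mean(paired(m) + paired(m.T)))  # rows and columns

rng = np.random.default_rng(1)
obs = (rng.random((12, 20)) < 0.35).astype(int)     # toy binary interaction matrix
score = nodf(obs)
# Null model: shuffle all cells, preserving total fill only (the simplest option).
null = [nodf(rng.permutation(obs.ravel()).reshape(obs.shape)) for _ in range(200)]
p = float(np.mean([n >= score for n in null]))
print(f"NODF-style score = {score:.1f}, one-sided p ~ {p:.3f}")
```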

  6. Federated software defined network operations for LHC experiments

    Science.gov (United States)

    Kim, Dongkyun; Byeon, Okhwan; Cho, Kihyeon

    2013-09-01

    The most well-known high-energy physics collaboration, the Large Hadron Collider (LHC), which is based on e-Science, has been facing several challenges presented by its extraordinary instruments in terms of the generation, distribution, and analysis of large amounts of scientific data. Currently, data distribution issues are being resolved by adopting an advanced Internet technology called software defined networking (SDN). Stability of the SDN operations and management is demanded to keep the federated LHC data distribution networks reliable. Therefore, in this paper, an SDN operation architecture based on the distributed virtual network operations center (DvNOC) is proposed to enable LHC researchers to assume full control of their own global end-to-end data dissemination. This may achieve an enhanced data delivery performance based on data traffic offloading with delay variation. The evaluation results indicate that the overall end-to-end data delivery performance can be improved over multi-domain SDN environments based on the proposed federated SDN/DvNOC operation framework.

  7. Future Scenarios for Software-Defined Metro and Access Networks and Software-Defined Photonics

    Directory of Open Access Journals (Sweden)

    Tommaso Muciaccia

    2017-01-01

    Full Text Available In recent years, architectures, devices, and components in telecommunication networks have been challenged by evolutionary and revolutionary factors which are drastically changing the traffic features. Most of these changes imply the need for major re-configurability and programmability not only in data centers and core networks, but also in the metro-access segment. In a wide variety of contexts, this necessity has been addressed by the proposed introduction of the innovative paradigm of software-defined networks (SDNs). Several solutions inspired by the SDN model have been recently proposed also for metro and access networks, where the adoption of a new generation of software-defined reconfigurable integrated photonic devices is highly desirable. In this paper, we review the possible future application scenarios for software-defined metro and access networks and software-defined photonics (SDP), on the basis of analytics, statistics, and surveys. This work describes the reasons underpinning the presented radical change of paradigm and summarizes the most significant solutions proposed in the literature, with a specific emphasis on physical-layer reconfigurable networks and a focus on both architectures and devices.

  8. Malware Propagation and Prevention Model for Time-Varying Community Networks within Software Defined Networks

    OpenAIRE

    Lan Liu; Ryan K. L. Ko; Guangming Ren; Xiaoping Xu

    2017-01-01

    As the adoption of Software Defined Networks (SDNs) grows, the security of SDN still has several unaddressed limitations. A key network security research area is in the study of malware propagation across the SDN-enabled networks. To analyze the spreading processes of network malware (e.g., viruses) in SDN, we propose a dynamic model with a time-varying community network, inspired by research models on the spread of epidemics in complex networks across communities. We assume subnets of the ne...

  9. Protocol independent transmission method in software defined optical network

    Science.gov (United States)

    Liu, Yuze; Li, Hui; Hou, Yanfang; Qiu, Yajun; Ji, Yuefeng

    2016-10-01

    With the development of big data and cloud computing technology, the traditional software-defined network is facing new challenges (e.g., ubiquitous accessibility, higher bandwidth, more flexible management and greater security). Using a proprietary protocol or encoding format is one way to improve information security; however, a flow carried by a proprietary protocol or code cannot traverse the traditional IP network. In addition, ultra-high-definition video transmission services have once again become a hot topic. Traditionally, in the IP network, the Serial Digital Interface (SDI) signal must be compressed. This approach offers additional advantages but also brings some disadvantages such as signal degradation and high latency. To some extent, HD-SDI can also be regarded as a proprietary protocol, which needs transparent transmission such as an optical channel. However, traditional optical networks cannot support flexible traffic. In response to the aforementioned challenges for the future network, one immediate solution would be to use NFV technology to abstract the network infrastructure and provide an all-optical switching topology graph for the SDN control plane. This paper proposes a new service-based software defined optical network architecture, including an infrastructure layer, a virtualization layer, a service abstract layer and an application layer. We then dwell on the corresponding service provisioning method in order to implement protocol-independent transport. Finally, we experimentally demonstrate that the proposed service provisioning method can be applied to transmit the HD-SDI signal in the software-defined optical network.

  10. Next-Generation Bioacoustic Analysis Software

    Science.gov (United States)

    2015-09-30

    ...estimates are in one dimension (bearing), two (X-Y position), or three (X-Y-Z position), analysis software is necessary. Marine mammal acoustic data is

  11. A bridge role metric model for nodes in software networks.

    Directory of Open Access Journals (Sweden)

    Bo Li

    Full Text Available A bridge role metric model is put forward in this paper. Compared with previous metric models, our treatment of a large-scale object-oriented software system as a complex network is inherently more realistic. To derive the nodes and links of an undirected network, a new model is presented that captures the crucial connectivity of a module or hub, rather than only centrality as in previous metric models. Two previous metric models are described for comparison. In addition, the relationship between the Bre results and node degrees is well fitted by a power law. The model represents many realistic characteristics of actual software structures, and a hydropower simulation system is taken as an example. This paper makes additional contributions to an accurate understanding of the module design of software systems and is expected to be beneficial to software engineering practices.

  12. Languages for Software-Defined Networks

    Science.gov (United States)

    2013-02-01

    Expressing policies: Frenetic offers a high-level policy language that makes it easy for programs to specify the packet-forwarding behavior of the network...by-side with rules installed by a different module. Frenetic’s query language allows programmers to express what they want to monitor, leaving the...Limiting traffic: A common idiom in SDN programming is to send the first packet of a traffic aggregate to the controller, and reactively install rules for

  13. On the Design of Energy Efficient Optical Networks with Software Defined Networking Control Across Core and Access Networks

    DEFF Research Database (Denmark)

    Wang, Jiayuan; Yan, Ying; Dittmann, Lars

    2013-01-01

    This paper presents a Software Defined Networking (SDN) control plane based on an overlay GMPLS control model. The SDN control platform manages optical core networks (WDM/DWDM networks) and the associated access networks (GPON networks), which makes it possible to gather global information...

  14. Designing application software in wide area network settings

    Science.gov (United States)

    Makpangou, Mesaac; Birman, Ken

    1990-01-01

    Progress in methodologies for developing robust local area network software has not been matched by similar results for wide area settings. The design of application software spanning multiple local area environments is examined. For important classes of applications, simple design techniques are presented that yield fault tolerant wide area programs. An implementation of these techniques as a set of tools for use within the ISIS system is described.

  15. Software structure for broadband wireless sensor network system

    Science.gov (United States)

    Kwon, Hyeokjun; Oh, Sechang; Yoon, Hargsoon; Varadan, Vijay K.

    2010-04-01

    Zigbee sensor network systems have been investigated for monitoring and analyzing data measured from many sensors, because Zigbee sensor networks have the advantages of low power consumption, compact size, and multi-node connection. However, they have the disadvantage of not being able to monitor sensor data from a remote location, such as a room in another city. This paper describes a software structure that compensates for this limitation by combining a Zigbee sensor network with wireless LAN technology for remote monitoring of measured sensor data; the structure combines the benefits of the Zigbee sensor network with the advantages of wireless LAN and consists of three main parts. The first part acquires the data from the sensors. The second part gathers the sensor data over wireless Zigbee and sends it to the monitoring system over wireless LAN; it consists of Linux software packages running on a Samsung 2440 CPU with an ARM9 core, including the bootloader, device drivers, kernel, and applications, where the applications are a TCP/IP server program, a program interfacing with the Zigbee RF module, and a wireless LAN program. The last part receives the sensor data through a TCP/IP client program from the wireless gateway unit and displays the measured data graphically using a MATLAB program. The sensor data are measured at a 100 Hz sampling rate with 10-bit resolution, and the wireless data transmission rate per channel is 1.6 kbps.
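
    To illustrate the gateway's role in the second part, the sketch below implements only the TCP streaming piece in Python: a small server that sends 10-bit samples at 100 Hz to a monitoring client over the LAN. The port, packet format, and synthetic data are assumptions; the Zigbee/serial interfacing and the MATLAB client are omitted.

```python
# Sketch of the gateway's TCP server role only (synthetic data stands in for
# samples received from the Zigbee RF module; port and format are illustrative).
import socket
import struct
import time
import random

HOST, PORT, SAMPLE_RATE_HZ = "0.0.0.0", 5000, 100

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((HOST, PORT))
    srv.listen(1)
    conn, _addr = srv.accept()           # wait for the monitoring client
    with conn:
        while True:
            sample = random.randint(0, 1023)          # 10-bit resolution
            conn.sendall(struct.pack("!H", sample))   # 2-byte network-order value
            time.sleep(1.0 / SAMPLE_RATE_HZ)          # ~100 Hz pacing
```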

  16. Networks at their Limits: Software, Similarity, and Continuity in Vietnam

    OpenAIRE

    Nguyen, Lilly Uyen

    2013-01-01

    This dissertation explores the social worlds of pirated software discs and free/open source software in Vietnam to describe the practices of copying, evangelizing, and translation. This dissertation also reveals the cultural logics of similarity and continuity that sustain these social worlds. Taken together, this dissertation argues that the logics of similarity and continuity are expressions of Vietnam's distance from global networks. Vietnam is currently in a period of rapid economic trans...

  17. Coordination mechanisms and software in industrial networks - a literature review

    OpenAIRE

    Mittermayer, Herwig; Rodríguez Monroy, Carlos; Pelaez Garcia, Miguel Angel

    2014-01-01

    This paper groups recent supply chain management research focused on organizational design and its software support. The classification encompasses criteria related to research methodology and content. Empirical studies from management science focus on network types and organizational fit. Novel planning algorithms and innovative coordination schemes are developed mostly in the field of operations research in order to propose new software features. Operations and production management realize...

  18. SONEP: A Software-Defined Optical Network Emulation Platform

    DEFF Research Database (Denmark)

    Azodolmolky, Siamak; Petersen, Martin Nordal; Fagertun, Anna Manolova

    2014-01-01

    Leveraging the lightweight system virtualization recently supported in modern operating systems, in this work we present the architecture of a Software-Defined Network (SDN) emulation platform for transport optical networks and investigate its usage in a use-case scenario. To the best of our knowledge, this is the first time that an SDN-based emulation platform has been proposed for modeling and performance evaluation of optical networks. Coupled with the recent trend of extending SDN towards transport (optical) networks, the presented tool can facilitate the evaluation of innovative ideas before actual implementations...

  19. Software architecture analysis of usability

    NARCIS (Netherlands)

    Folmer, E; van Gurp, J; Bosch, J; Bastide, R; Palanque, P; Roth, J

    2005-01-01

    Studies of software engineering projects show that a large number of usability related change requests are made after its deployment. Fixing usability problems during the later stages of development often proves to be costly, since many of the necessary changes require changes to the system that

  20. Considerations for Software Defined Networking (SDN): Approaches and use cases

    Science.gov (United States)

    Bakshi, K.

    Software Defined Networking (SDN) is an evolutionary approach to network design and functionality based on the ability to programmatically modify the behavior of network devices. SDN uses user-customizable and configurable software that is independent of hardware to enable networked systems to expand data flow control. SDN is in large part about understanding and managing a network as a unified abstraction. It will make networks more flexible, dynamic, and cost-efficient, while greatly reducing operational complexity. This advanced solution provides several benefits, including network and service customizability, configurability, improved operations, and increased performance. There are several approaches to SDN and its practical implementation. Among them, two have risen to prominence with differences in pedigree and implementation. This paper's main focus will be to define, review, and evaluate salient approaches and use cases of the OpenFlow and Virtual Network Overlay approaches to SDN. OpenFlow is a communication protocol that gives access to the forwarding plane of a network's switches and routers. The Virtual Network Overlay relies on a completely virtualized network infrastructure and services to abstract the underlying physical network, which allows the overlay to be mobile to other physical networks. This is an important requirement for cloud computing, where applications and associated network services are migrated to cloud service providers and remote data centers on the fly as resource demands dictate. The paper will discuss how and where SDN can be applied and implemented, including research and academia, virtual multitenant data centers, and cloud computing applications. Specific attention will be given to the cloud computing use case, where automated provisioning and a programmable overlay for scalable multi-tenancy are leveraged via the SDN approach.
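
    The match-action abstraction that OpenFlow exposes can be illustrated without any controller framework. The sketch below is a conceptual, self-contained model of a flow table that a controller programs and a switch consults per packet; field names, priorities, and actions are invented, and no actual OpenFlow protocol messages are involved.

```python
# Conceptual model of OpenFlow-style match-action forwarding (no real protocol
# messages; field names and actions are illustrative).
from dataclasses import dataclass, field

@dataclass
class FlowRule:
    match: dict          # e.g. {"dst_ip": "10.0.0.2"}; a wildcard is an absent key
    action: str          # e.g. "output:2" or "drop"
    priority: int = 0

@dataclass
class Switch:
    table: list = field(default_factory=list)

    def install(self, rule):                 # done by the controller
        self.table.append(rule)
        self.table.sort(key=lambda r: -r.priority)

    def forward(self, packet):               # done by the data plane, per packet
        for rule in self.table:
            if all(packet.get(k) == v for k, v in rule.match.items()):
                return rule.action
        return "send-to-controller"          # table miss

sw = Switch()
sw.install(FlowRule({"dst_ip": "10.0.0.2"}, "output:2", priority=10))
sw.install(FlowRule({}, "drop", priority=1))
print(sw.forward({"src_ip": "10.0.0.1", "dst_ip": "10.0.0.2"}))  # output:2
print(sw.forward({"dst_ip": "10.0.0.9"}))                        # drop
```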

  1. Minimizing communication cost among distributed controllers in software defined networks

    Science.gov (United States)

    Arlimatti, Shivaleela; Elbreiki, Walid; Hassan, Suhaidi; Habbal, Adib; Elshaikh, Mohamed

    2016-08-01

    Software Defined Networking (SDN) is a new paradigm that increases the flexibility of today's networks by promising a programmable network. The fundamental idea behind this new architecture is to simplify network complexity by decoupling the control plane and data plane of the network devices and by centralizing the control plane. Recently, controllers have been distributed to solve the problem of a single point of failure and to increase scalability and flexibility during workload distribution. Even though distributed controllers are flexible and scalable enough to accommodate more network switches, the intercommunication cost between distributed controllers is still a challenging issue in the Software Defined Network environment. This paper aims to fill the gap by proposing a new mechanism that minimizes intercommunication cost using a graph partitioning algorithm, an NP-hard problem. The proposed methodology swaps network elements between controller domains to minimize communication cost by calculating the communication gain; the swapping of elements minimizes inter- and intra-domain communication cost. We validate our work with the OMNeT++ simulation environment. Simulation results show that the proposed mechanism minimizes the inter-domain communication cost among controllers compared to traditional distributed controllers.
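
    The element-swapping idea resembles classic gain-based graph partitioning. As a hedged illustration (not the authors' algorithm or their OMNeT++ setup), the sketch below uses networkx's Kernighan-Lin bisection, which iteratively swaps nodes between two partitions to reduce the number of cut edges, here taken as a stand-in for inter-controller communication cost on a synthetic topology.

```python
# Illustrative two-controller case using networkx's Kernighan-Lin bisection,
# which swaps nodes between partitions to reduce the weight of cut edges
# (a proxy for inter-controller communication cost). Topology is synthetic.
import networkx as nx
from networkx.algorithms.community import kernighan_lin_bisection

g = nx.random_geometric_graph(40, 0.3, seed=42)
naive = (set(range(20)), set(range(20, 40)))            # arbitrary split
optimized = kernighan_lin_bisection(g, seed=42)         # gain-driven swaps

print("cut edges, naive split:   ", nx.cut_size(g, *naive))
print("cut edges, after KL swaps:", nx.cut_size(g, *optimized))
```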

  2. UPGRADE FOR HARDWARE/SOFTWARE SERVER AND NETWORK TOPOLOGY IN INFORMATION SYSTEMS

    Directory of Open Access Journals (Sweden)

    Oleksii O. Kaplun

    2011-02-01

    Full Text Available Network modernization and updates to the software and hardware of educational information systems are pressing problems given the rapid development of information technologies. The article analyses the server applications and network topology of the Institute of Information Technology and Learning Tools of the National Academy of Pedagogical Sciences of Ukraine and sets out methods for their improvement. The article presents the results of a modernization implemented to increase network efficiency and reliability and to decrease response time in the Institute's network information systems, and gives diagrams of the network topology before the upgrade and after completion of the optimization and upgrading processes.

  3. Graphs for information security control in software defined networks

    Science.gov (United States)

    Grusho, Alexander A.; Abaev, Pavel O.; Shorgin, Sergey Ya.; Timonina, Elena E.

    2017-07-01

    Information security control in software defined networks (SDN) is connected with the enforcement of security policy rules regulating information access and protection against the distribution of malicious code and other harmful influences. The paper offers a representation of a security policy in the form of a hierarchical structure which, when resources are allocated to tasks, defines graphs of admissible interactions in a network. These graphs define the commutation tables of the switches via the SDN controller.
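
    As a hedged illustration of the last step (the mapping and data layout below are invented, not the paper's formalism), a controller could translate a graph of admissible interactions into per-switch allow entries, with everything not listed being dropped.

```python
# Illustrative mapping from an admissible-interaction graph to per-switch
# allow rules (attachment points and rule format are invented for the sketch).
import networkx as nx

allowed = nx.DiGraph([("hr-db", "hr-app"), ("hr-app", "report-srv")])
attached_to = {"hr-db": "s1", "hr-app": "s1", "report-srv": "s2"}  # host -> switch

rules = {}  # switch -> list of (src, dst) allow entries; everything else dropped
for src, dst in allowed.edges():
    for sw in {attached_to[src], attached_to[dst]}:
        rules.setdefault(sw, []).append((src, dst))

for sw, entries in rules.items():
    print(sw, "allow:", entries)
```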

  4. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R

    1992-01-01

    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem: there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm

  5. The advent of failure analysis software technology

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, C.L. [Sandia National Labs., Albuquerque, NM (United States); Barnard, R.D. [Schlumberger Technologies, San Jose, CA (United States)

    1994-02-01

    The increasing complexity of integrated circuits demands that software tools, in addition to hardware tools, be used for successful diagnosis of failure. A series of customizable software tools have been developed that organize failure analysis information and provide expert level help to failure analysts to increase their productivity and success.

  6. In silico Biochemical Reaction Network Analysis (IBRENA): a package for simulation and analysis of reaction networks.

    Science.gov (United States)

    Liu, Gang; Neelamegham, Sriram

    2008-04-15

    We present In silico Biochemical Reaction Network Analysis (IBRENA), a software package which facilitates multiple functions including cellular reaction network simulation and sensitivity analysis (both forward and adjoint methods), coupled with principal component analysis, singular-value decomposition and model reduction. The software features a graphical user interface that aids simulation and plotting of in silico results. While the primary focus is to aid formulation, testing and reduction of theoretical biochemical reaction networks, the program can also be used for analysis of high-throughput genomic and proteomic data. The software package, manual and examples are available at http://www.eng.buffalo.edu/~neel/ibrena
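
    To show the kind of computation such packages carry out, here is a hedged sketch (not IBRENA code, with an invented two-step reaction network): integrating mass-action kinetics with SciPy and estimating a forward sensitivity by central finite differences rather than the package's own forward/adjoint machinery.

```python
# Toy reaction network A -> B -> C (mass action), integrated with SciPy, plus a
# finite-difference sensitivity of [C](t_end) to k1. Values are illustrative.
from scipy.integrate import solve_ivp

def rhs(t, y, k1, k2):
    a, b, c = y
    return [-k1 * a, k1 * a - k2 * b, k2 * b]

def c_final(k1, k2=0.3, y0=(1.0, 0.0, 0.0), t_end=10.0):
    sol = solve_ivp(rhs, (0.0, t_end), y0, args=(k1, k2), rtol=1e-8)
    return sol.y[2, -1]                       # concentration of C at t_end

k1 = 0.5
h = 1e-4
sens = (c_final(k1 + h) - c_final(k1 - h)) / (2 * h)   # d[C]/dk1 at t_end
print(f"[C](10) = {c_final(k1):.4f}, d[C]/dk1 ~ {sens:.4f}")
```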

  7. Software Updating in Wireless Sensor Networks: A Survey and Lacunae

    Directory of Open Access Journals (Sweden)

    Cormac J. Sreenan

    2013-11-01

    Full Text Available Wireless Sensor Networks are moving out of the laboratory and into the field. For a number of reasons there is often a need to update sensor node software, or node configuration, after deployment. The need for over-the-air updates is driven both by the scale of deployments, and by the remoteness and inaccessibility of sensor nodes. This need has been recognized since the early days of sensor networks, and research results from the related areas of mobile networking and distributed systems have been applied to this area. In order to avoid any manual intervention, the update process needs to be autonomous. This paper presents a comprehensive survey of software updating in Wireless Sensor Networks, and analyses the features required to make these updates autonomous. A new taxonomy of software update features and a new model for fault detection and recovery are presented. The paper concludes by identifying the lacunae relating to autonomous software updates, providing direction for future research.

  8. Software-Defined Networks as a Stage of the Network Technology Evolution

    Directory of Open Access Journals (Sweden)

    A. A. Krasotin

    2013-01-01

    Full Text Available The authors of the article focus on the concept of a software defined network. The article begins with a brief historical account of software defined networks as a scientific concept, its formation, and its technological and scientific meaning. The software defined network concept is treated in the article not as the final state-of-the-art in networking, but rather as a possible step and direction in the development of a networking paradigm. The article also touches on the pros and cons of software defined networking and gives an account of possible stages of development of this technology in the context of other technologies, considering its hybrid with MPLS as an example. The OpenFlow protocol constitutes the main part of the article. The authors further discuss various existing libraries realizing programmable management routines for a software defined network using OpenFlow. All of these libraries provide APIs for building modular applications for software defined network management. Touching on the practical side of implementation, the results of comparative throughput and latency tests achieved with these libraries are shown.

  9. CADDIS Volume 4. Data Analysis: Download Software

    Science.gov (United States)

    Overview of the data analysis tools available for download on CADDIS. Provides instructions for downloading and installing CADStat, access to a Microsoft Excel macro for computing SSDs, and a brief overview of command-line use of R, a statistical software package.

  10. Computer-assisted qualitative data analysis software.

    Science.gov (United States)

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually, and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  11. State of the Art and Recent Research Advances in Software Defined Networking

    Directory of Open Access Journals (Sweden)

    Taimur Bakhshi

    2017-01-01

    Full Text Available Emerging network services and subsequent growth in the networking infrastructure have gained tremendous momentum in recent years. Application performance requiring rapid real-time network provisioning, optimized traffic management, and virtualization of shared resources has induced the conceptualization and adoption of new networking models. Software defined networking (SDN, one of the predominant and relatively new networking paradigms, seeks to simplify network management by decoupling network control logic from the underlying hardware and introduces real-time network programmability enabling innovation. The present work reviews the state of the art in software defined networking providing a historical perspective on complementary technologies in network programmability and the inherent shortcomings which paved the way for SDN. The SDN architecture is discussed along with popular protocols, platforms, and existing simulation and debugging solutions. Furthermore, a detailed analysis is presented around recent SDN development and deployment avenues ranging from mobile communications and data centers to campus networks and residential environments. The review concludes by highlighting implementation challenges and subsequent research directions being pursued in academia and industry to address issues related to application performance, control plane scalability and design, security, and interdomain connectivity in the context of SDN.

  12. Hybrid neural network approach for predicting maintainability of object-oriented software

    OpenAIRE

    Kumar, Lov; NIT Rourkela; Rath, Santanu Ku.; NIT Rourkela

    2014-01-01

    Estimation of different parameters for object-oriented systems development, such as effort, quality, and risk, is of major concern in the software development life cycle. The majority of the approaches available in the literature for estimation are based on regression analysis and neural network techniques. It is also observed that numerous software metrics are being used as input for estimation. In this study, object-oriented metrics have been considered to provide the requisite input data to design the mo...

  13. Software-defined Radio Based Measurement Platform for Wireless Networks.

    Science.gov (United States)

    Chao, I-Chun; Lee, Kang B; Candell, Richard; Proctor, Frederick; Shen, Chien-Chung; Lin, Shinn-Yan

    2015-10-01

    End-to-end latency is critical to many distributed applications and services that are based on computer networks. There has been a dramatic push to adopt wireless networking technologies and protocols (such as WiFi, ZigBee, WirelessHART, Bluetooth, ISA100.11a, etc.) into time-critical applications. Examples of such applications include industrial automation, telecommunications, power utility, and financial services. While performance measurement of wired networks has been extensively studied, measuring and quantifying the performance of wireless networks face new challenges and demand different approaches and techniques. In this paper, we describe the design of a measurement platform based on the technologies of software-defined radio (SDR) and IEEE 1588 Precision Time Protocol (PTP) for evaluating the performance of wireless networks.

  14. GRACAT, Software for grounding and collision analysis

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Simonsen, Bo Cerup

    2002-01-01

    From 1998 to 2001 an integrated software package for grounding and collision analysis was developed at the Technical University of Denmark within the ISESO project at the cost of six man years (0.75M US$). The software provides a toolbox for a multitude of analyses related to collision and grounding accidents. The software consists of three basic analysis modules and one risk mitigation module: 1) frequency, 2) damage, and 3) consequence. These modules can be used individually or in series and the analyses can be performed in deterministic or probabilistic mode. Finally, in the mitigation... route where the result is the probability density functions for the cost of oil outflow in a given area per year for the two vessels. In this paper we describe the basic modelling principles and the capabilities of the software package. The software package can be downloaded for research purposes from...

  15. Introducing a New Software for Geodetic Analysis

    Science.gov (United States)

    Hjelle, G. A.; Dähnn, M.; Fausk, I.; Kirkvik, A. S.; Mysen, E.

    2016-12-01

    At the Norwegian Mapping Authority, we are currently developing Where, a new software for geodetic analysis. Where is built on our experiences with the Geosat software, and will be able to analyse and combine data from VLBI, SLR, GNSS and DORIS. The software is mainly written in Python, which has proved very fruitful. The code is quick to write and the architecture is easily extendable and maintainable. The Python community provides a rich eco-system of tools for doing data analysis, including effective data storage and powerful visualization. Python interfaces well with other languages, so that we can easily reuse existing, well-tested code like the SOFA and IERS libraries. This presentation will show some of the current capabilities of Where, including benchmarks against other software packages. In addition we will report on some simple investigations we have done using the software, and outline our plans for further progress.

  16. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage the large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analysis core photos and images; waveforms and NMR; and external files documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software that features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  17. Software cost/resource modeling: Deep space network software cost estimation model

    Science.gov (United States)

    Tausworthe, R. J.

    1980-01-01

    A parametric software cost estimation model prepared for JPL deep space network (DSN) data systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models, such as those of the General Research Corporation, Doty Associates, IBM (Walston-Felix), Rome Air Force Development Center, University of Maryland, and Rayleigh-Norden-Putnam. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software lifecycle statistics. The estimation model output scales a standard DSN work breakdown structure skeleton, which is then input to a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.
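
    The report's own parameters and questionnaire are not reproduced here, but a generic parametric cost model of the same family can be sketched as effort = a * size^b * (product of effort multipliers), with the multipliers standing in for the prompted responses; the constants and multiplier names below are illustrative.

```python
# Generic parametric cost-model sketch (COCOMO-style), not the JPL/DSN model:
# effort = a * size^b * product(effort multipliers), where each multiplier
# would be derived from a prompted questionnaire response.
def estimate_effort(ksloc, multipliers, a=2.8, b=1.20):
    """Return estimated effort in person-months for a task of `ksloc` KSLOC."""
    eaf = 1.0
    for m in multipliers.values():
        eaf *= m                     # effort adjustment factor
    return a * ksloc ** b * eaf

answers = {                          # illustrative environment/difficulty adjustments
    "required_reliability": 1.15,
    "team_experience": 0.91,
    "tool_support": 0.95,
}
print(round(estimate_effort(32.0, answers), 1), "person-months")
```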

  18. A Formal Model and Verification Problems for Software Defined Networks

    Directory of Open Access Journals (Sweden)

    V. A. Zakharov

    2013-01-01

    Full Text Available Software-defined networking (SDN) is an approach to building computer networks that separates and abstracts the data and control planes of these systems. In an SDN, a centralized controller manages a distributed set of switches. A set of open commands for packet forwarding and flow-table updating was defined in the form of a protocol known as OpenFlow. In this paper we describe an abstract formal model of SDN, introduce a tentative language for the specification of SDN forwarding policies, and formally set up model-checking problems for SDN.

  19. Employing Deceptive Dynamic Network Topology Through Software-Defined Networking

    Science.gov (United States)

    2014-03-01

    [Excerpt consists of extraction fragments from the thesis: Table 3.1 lists common vendor and open-source SDN controllers, including onePK, Beacon, Floodlight, IRIS, Jaxon, Maestro, OpenDaylight (Java), HP VAN SDN and NodeFlow (JavaScript); the remaining fragments are reference-list entries such as Project Floodlight (http://www.projectfloodlight.org/) and the OpenFlow Switch Specification.]

  20. Enhancing the Understanding of Computer Networking Courses through Software Tools

    OpenAIRE

    Dafalla, Z. I.; Balaji, R. D.

    2015-01-01

    Computer networking is an important specialization in Information and Communication Technologies. However, imparting the right knowledge to students can be challenging because there is not enough time to deliver lengthy labs during normal lecture hours. Augmenting the use of physical machines with software tools helps students learn beyond the limited lab sessions within higher institutions of learning throughout the world. The Institutions focus mo...

  1. Introducing a New Software for Geodetic Analysis

    Science.gov (United States)

    Hjelle, Geir Arne; Dähnn, Michael; Fausk, Ingrid; Kirkvik, Ann-Silje; Mysen, Eirik

    2017-04-01

    At the Norwegian Mapping Authority, we are currently developing Where, a new software for geodetic analysis. Where is built on our experiences with the Geosat software, and will be able to analyse and combine data from VLBI, SLR, GNSS and DORIS. The software is mainly written in Python which has proved very fruitful. The code is quick to write and the architecture is easily extendable and maintainable, while at the same time taking advantage of well-tested code like the SOFA and IERS libraries. This presentation will show some of the current capabilities of Where, including benchmarks against other software packages, and outline our plans for further progress. In addition we will report on some investigations we have done experimenting with alternative weighting strategies for VLBI.

  2. Software Defined Coded Networking: Benefits of the PlayNCool protocol in wireless mesh networks

    DEFF Research Database (Denmark)

    Di Paola, Carla; Roetter, Daniel Enrique Lucani; Palazzo, Sergio

    2017-01-01

    the quality of each link and even across neighbouring links and using simulations to show that an additional reduction of packet transmission in the order of 40% is possible. Second, to advocate for the use of network coding (NC) jointly with software defined networking (SDN) providing an implementation...

  3. Cognitive Routing in Software-Defined Underwater Acoustic Networks

    Directory of Open Access Journals (Sweden)

    Huma Ghafoor

    2017-12-01

    Full Text Available There are two different types of primary users (natural acoustic and artificial acoustic), and there is a long propagation delay on acoustic links in underwater cognitive acoustic networks (UCANs). Thus, the selection of a stable route is one of the key design factors for improving overall network stability and thereby reducing end-to-end delay. Software-defined networking (SDN) is a novel approach that improves network intelligence. To this end, we propose a novel SDN-based routing protocol for UCANs in order to find a stable route between source and destination. A main controller is placed in a surface buoy and is responsible for the global view of the network, whereas local controllers are placed in different autonomous underwater vehicles (AUVs) and are responsible for a localized view of the network. The AUVs have fixed trajectories, and sensor nodes within transmission range of the AUVs serve as gateways to relay the gathered information to the controllers. This is an SDN-based underwater communications scheme whereby two nodes can only communicate when they have a consensus about a common idle channel. To evaluate our proposed scheme, we perform extensive simulations, which show improved network performance in terms of end-to-end delay, delivery ratio, and overhead.

  4. An Empirical Study of Social Networks Metrics in Object-Oriented Software

    Directory of Open Access Journals (Sweden)

    Giulio Concas

    2010-01-01

    Full Text Available We study the application to object-oriented software of new metrics derived from Social Network Analysis. Social network metrics, such as the EGO metrics, make it possible to identify the role of each node in the information flow through the network, here related to software modules and their dependencies. These metrics are compared with traditional software metrics, like the Chidamber-Kemerer suite, and with software graph metrics. We examine the empirical distributions of all the metrics, bugs included, across the software modules of several releases of two large Java systems, Eclipse and Netbeans. We provide analytical distribution functions suitable for describing and studying the observed distributions, and we also study correlations among metrics and bugs. We find that the empirical distributions systematically show fat tails for all the metrics. Moreover, the various metric distributions look very similar and consistent across all system releases and are also very similar in both of the studied systems. These features appear to be typical properties of these software metrics.
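
    A minimal sketch of computing such network metrics on a class-dependency graph is shown below, using networkx on an invented set of classes; degree, EGO-network size and PageRank stand in for the metric suites analysed in the paper.

```python
# Sketch: graph metrics on a toy class-dependency network with networkx.
# The classes and dependency edges are invented for illustration.
import networkx as nx

deps = [("OrderService", "Repository"), ("OrderService", "Logger"),
        ("BillingService", "Repository"), ("BillingService", "Logger"),
        ("Repository", "Connection"), ("Controller", "OrderService")]
g = nx.DiGraph(deps)

pagerank = nx.pagerank(g)                        # global importance of each class
for cls in g.nodes:
    ego = nx.ego_graph(g.to_undirected(), cls)   # EGO network of the class
    print(f"{cls:15s} in={g.in_degree(cls)} out={g.out_degree(cls)} "
          f"ego_size={ego.number_of_nodes()} pagerank={pagerank[cls]:.3f}")
```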

  5. Optical Network Models and Their Application to Software-Defined Network Management

    Directory of Open Access Journals (Sweden)

    Thomas Szyrkowiec

    2017-01-01

    Full Text Available Software-defined networking is finding its way into optical networks. Here, it promises a simplification and unification of network management for optical networks allowing automation of operational tasks despite the highly diverse and vendor-specific commercial systems and the complexity and analog nature of optical transmission. Common abstractions and interfaces are a fundamental component for software-defined optical networking. Currently, a number of models for optical networks are available. They all claim to provide open and vendor agnostic management of optical equipment. In this work, we survey and compare the most important models and propose an intent interface for creating virtual topologies which is integrated in the existing model ecosystem.

  6. Power and performance software analysis and optimization

    CERN Document Server

    Kukunas, Jim

    2015-01-01

    Power and Performance: Software Analysis and Optimization is a guide to solving performance problems in modern Linux systems. Power-efficient chips are no help if the software those chips run on is inefficient. Starting with the necessary architectural background as a foundation, the book demonstrates the proper usage of performance analysis tools in order to pinpoint the cause of performance problems, and includes best practices for handling common performance issues those tools identify. Provides expert perspective from a key member of Intel's optimization team on how processors and memory

  7. Software Tool for Real-Time Power Quality Analysis

    Directory of Open Access Journals (Sweden)

    CZIKER, A. C.

    2013-11-01

    Full Text Available A software tool dedicated to the analysis of power signals containing harmonic and interharmonic components, unbalance, voltage dips and voltage swells is presented. The software tool is a virtual instrument that uses innovative algorithms based on time- and frequency-domain analysis to process power signals. To detect temporary disturbances, edge detection is proposed, whereas for the harmonic analysis Gaussian filter banks are implemented. Because a signal recovery algorithm is applied, the harmonic analysis can be performed even if voltage dips or swells appear. The virtual instrument accepts either recorded or online signals as input, the latter acquired through a data acquisition board. The virtual instrument was tested using both synthetic signals and real signals from measurements performed in distribution networks. The paper contains a numeric example on a synthetic digital signal and an analysis made in real time.
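
    As a simplified illustration of the harmonic-analysis step (using a plain FFT rather than the Gaussian filter banks described in the paper), the sketch below estimates harmonic magnitudes of a synthetic 50 Hz signal.

```python
# Simplified illustration of harmonic analysis on a power signal using an FFT,
# not the Gaussian filter-bank method of the paper. Signal values are synthetic.
import numpy as np

fs = 10_000                                   # sampling rate (Hz)
f0 = 50.0                                     # fundamental frequency (Hz)
t = np.arange(0, 0.2, 1 / fs)
signal = (230 * np.sqrt(2) * np.sin(2 * np.pi * f0 * t)
          + 15 * np.sin(2 * np.pi * 3 * f0 * t)       # 3rd harmonic
          + 8 * np.sin(2 * np.pi * 5 * f0 * t))       # 5th harmonic

spectrum = np.fft.rfft(signal) / len(signal) * 2      # peak amplitudes
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
for h in (1, 3, 5, 7):
    idx = np.argmin(np.abs(freqs - h * f0))
    print(f"harmonic {h}: {np.abs(spectrum[idx]):7.2f} V peak")
```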

  8. Resilient Disaster Network Based on Software Defined Cognitive Wireless Network Technology

    Directory of Open Access Journals (Sweden)

    Goshi Sato

    2015-01-01

    Full Text Available In order to temporally recover the information network infrastructure in disaster areas from the Great East Japan Earthquake in 2011, various wireless network technologies such as satellite IP network, 3G, and Wi-Fi were effectively used. However, since those wireless networks are individually introduced and installed but not totally integrated, some of networks were congested due to the sudden network traffic generation and unbalanced traffic distribution, and eventually the total network could not effectively function. In this paper, we propose a disaster resilient network which integrates various wireless networks into a cognitive wireless network that users can use as an access network to the Internet at the serious disaster occurrence. We designed and developed the disaster resilient network based on software defined network (SDN technology to automatically select the best network link and route among the possible access networks to the Internet by periodically monitoring their network states and evaluate those using extended AHP method. In order to verify the usefulness of our proposed system, a prototype system is constructed and its performance is evaluated.
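
    The paper's extended AHP variant is not specified here, but the standard AHP step it builds on, deriving weights for candidate access networks from a pairwise-comparison matrix via its principal eigenvector, can be sketched as follows; the networks and comparison values are illustrative.

```python
# Standard AHP sketch (the paper uses an extended AHP variant): weights for
# candidate access networks from a pairwise-comparison matrix via its
# principal eigenvector. Comparison values are illustrative (Saaty scale).
import numpy as np

networks = ["satellite", "3G", "Wi-Fi"]
# A[i, j] = how strongly network i is preferred over network j
A = np.array([[1.0, 1 / 3, 1 / 5],
              [3.0, 1.0, 1 / 2],
              [5.0, 2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = principal / principal.sum()          # normalise to sum to 1
for name, w in zip(networks, weights):
    print(f"{name:10s} weight = {w:.3f}")
```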

  9. Google matrix analysis of directed networks

    Science.gov (United States)

    Ermann, Leonardo; Frahm, Klaus M.; Shepelyansky, Dima L.

    2015-10-01

    In the past decade modern societies have developed enormous communication and social networks. Their classification and information retrieval processing has become a formidable task for the society. Because of the rapid growth of the World Wide Web, and social and communication networks, new mathematical methods have been invented to characterize the properties of these networks in a more detailed and precise way. Various search engines extensively use such methods. It is highly important to develop new tools to classify and rank a massive amount of network information in a way that is adapted to internal network structures and characteristics. This review describes the Google matrix analysis of directed complex networks demonstrating its efficiency using various examples including the World Wide Web, Wikipedia, software architectures, world trade, social and citation networks, brain neural networks, DNA sequences, and Ulam networks. The analytical and numerical matrix methods used in this analysis originate from the fields of Markov chains, quantum chaos, and random matrix theory.
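
    A minimal sketch of the central construction, the Google matrix G = alpha*S + (1 - alpha)/N and its leading eigenvector obtained by power iteration, is shown below for an invented four-node directed network.

```python
# Sketch: build the Google matrix G = alpha * S + (1 - alpha) / N and rank
# nodes by the leading eigenvector (PageRank) via power iteration.
# The small directed network is invented for illustration.
import numpy as np

edges = [(0, 1), (0, 2), (1, 2), (2, 0), (3, 2)]     # i -> j links
N, alpha = 4, 0.85

A = np.zeros((N, N))
for i, j in edges:
    A[j, i] = 1.0                                    # column j-of-origin layout

S = A.copy()
for col in range(N):
    s = S[:, col].sum()
    S[:, col] = S[:, col] / s if s > 0 else 1.0 / N  # dangling-node fix

G = alpha * S + (1 - alpha) / N
p = np.full(N, 1.0 / N)
for _ in range(100):                                 # power iteration
    p = G @ p
print("PageRank vector:", np.round(p / p.sum(), 3))
```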

  10. Distributed Sensor Network Software Development Testing through Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Brennan, Sean M. [Univ. of New Mexico, Albuquerque, NM (United States)

    2003-12-01

    The distributed sensor network (DSN) presents a novel and highly complex computing platform with difficulties and opportunities that are just beginning to be explored. The potential of sensor networks extends from monitoring for threat reduction, to conducting instant and remote inventories, to ecological surveys. Developing and testing for robust and scalable applications is currently practiced almost exclusively in hardware. The Distributed Sensors Simulator (DSS) is an infrastructure that allows the user to debug and test software for DSNs independent of hardware constraints. The flexibility of DSS allows developers and researchers to investigate topological, phenomenological, networking, robustness and scaling issues, to explore arbitrary algorithms for distributed sensors, and to defeat those algorithms through simulated failure. The user specifies the topology, the environment, the application, and any number of arbitrary failures; DSS provides the virtual environmental embedding.

  11. Enhancing Availability of Services Using Software-Defined Networking

    Directory of Open Access Journals (Sweden)

    Martin Klepac

    2015-01-01

    Full Text Available The immense growth of client requirements imposed on data centre and cloud providers results in a conflict with traditional networking concepts lacking the required agility. In order to promote flexibility, which data centre providers promise to their clients, this discrepancy needs to be resolved, for instance by employing the novel concept of Software-Defined Networking (SDN. This paper utilises this concept in order to minimise service downtime while performing live virtual machine migration. The work is aimed at small/medium-sized data centres and hence the findings are based on real communication patterns found in such environments. Results show that packet loss is slightly diminished while available throughput is increased thanks to the proactive approach taken during network topology changes when compared to the traditional approach based on L2 forwarding.

  12. GraphCrunch 2: Software tool for network modeling, alignment and clustering.

    Science.gov (United States)

    Kuchaiev, Oleksii; Stevanović, Aleksandar; Hayes, Wayne; Pržulj, Nataša

    2011-01-19

    Recent advancements in experimental biotechnology have produced large amounts of protein-protein interaction (PPI) data. The topology of PPI networks is believed to have a strong link to their function. Hence, the abundance of PPI data for many organisms stimulates the development of computational techniques for the modeling, comparison, alignment, and clustering of networks. In addition, finding representative models for PPI networks will improve our understanding of the cell just as a model of gravity has helped us understand planetary motion. To decide if a model is representative, we need quantitative comparisons of model networks to real ones. However, exact network comparison is computationally intractable and therefore several heuristics have been used instead. Some of these heuristics are easily computable "network properties," such as the degree distribution, or the clustering coefficient. An important special case of network comparison is the network alignment problem. Analogous to sequence alignment, this problem asks to find the "best" mapping between regions in two networks. It is expected that network alignment might have as strong an impact on our understanding of biology as sequence alignment has had. Topology-based clustering of nodes in PPI networks is another example of an important network analysis problem that can uncover relationships between interaction patterns and phenotype. We introduce the GraphCrunch 2 software tool, which addresses these problems. It is a significant extension of GraphCrunch which implements the most popular random network models and compares them with the data networks with respect to many network properties. Also, GraphCrunch 2 implements the GRAph ALigner algorithm ("GRAAL") for purely topological network alignment. GRAAL can align any pair of networks and exposes large, dense, contiguous regions of topological and functional similarities far larger than any other existing tool. Finally, GraphCruch 2 implements an

  13. GraphCrunch 2: Software tool for network modeling, alignment and clustering

    Directory of Open Access Journals (Sweden)

    Hayes Wayne

    2011-01-01

    Full Text Available Abstract Background Recent advancements in experimental biotechnology have produced large amounts of protein-protein interaction (PPI data. The topology of PPI networks is believed to have a strong link to their function. Hence, the abundance of PPI data for many organisms stimulates the development of computational techniques for the modeling, comparison, alignment, and clustering of networks. In addition, finding representative models for PPI networks will improve our understanding of the cell just as a model of gravity has helped us understand planetary motion. To decide if a model is representative, we need quantitative comparisons of model networks to real ones. However, exact network comparison is computationally intractable and therefore several heuristics have been used instead. Some of these heuristics are easily computable "network properties," such as the degree distribution, or the clustering coefficient. An important special case of network comparison is the network alignment problem. Analogous to sequence alignment, this problem asks to find the "best" mapping between regions in two networks. It is expected that network alignment might have as strong an impact on our understanding of biology as sequence alignment has had. Topology-based clustering of nodes in PPI networks is another example of an important network analysis problem that can uncover relationships between interaction patterns and phenotype. Results We introduce the GraphCrunch 2 software tool, which addresses these problems. It is a significant extension of GraphCrunch which implements the most popular random network models and compares them with the data networks with respect to many network properties. Also, GraphCrunch 2 implements the GRAph ALigner algorithm ("GRAAL" for purely topological network alignment. GRAAL can align any pair of networks and exposes large, dense, contiguous regions of topological and functional similarities far larger than any other

  14. Modelling and Evaluating Software Project Risks with Quantitative Analysis Techniques in Planning Software Development

    OpenAIRE

    Elzamly, Abdelrafe; Hussin, Burairah

    2015-01-01

    Risk is not always avoidable, but it is controllable. The aim of this paper is to present new techniques which use stepwise regression analysis to model and evaluate the risks in planning software development and to reduce risk through software process improvement. The top ten software risk factors in the planning phase of software development and thirty control factors were presented to respondents. This study incorporates a risk management approach and planning of software development to mitigate software p...

  15. Software for computerised analysis of cardiotocographic traces.

    Science.gov (United States)

    Romano, M; Bifulco, P; Ruffo, M; Improta, G; Clemente, F; Cesarelli, M

    2016-02-01

    Despite the widespread use of cardiotocography in foetal monitoring, the evaluation of foetal status suffers from considerable inter- and intra-observer variability. In order to overcome the main limitations of visual cardiotocographic assessment, computerised methods to analyse cardiotocographic recordings have recently been developed. In this study, new software for automated analysis of foetal heart rate is presented. It provides an automatic procedure for measuring the most relevant parameters derivable from cardiotocographic traces. Simulated and real cardiotocographic traces were analysed to test software reliability. In artificial traces, we simulated a set number of events (accelerations, decelerations and contractions) to be recognised. In the case of real signals, results of the computerised analysis were compared with the visual assessment performed by 18 expert clinicians, and three performance indexes were computed to gain information about the performance of the proposed software. The software showed preliminary performance that we judged satisfactory, in that the results completely matched the requirements, as proved by tests on artificial signals in which all simulated events were detected by the software. Performance indexes computed against the obstetricians' evaluations are, on the contrary, less satisfactory, yielding the following values of the statistical parameters: sensitivity equal to 93%, positive predictive value equal to 82% and accuracy equal to 77%. Very probably this arises from the high variability of trace annotation carried out by clinicians. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
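
    The three performance indexes reported above are standard confusion-matrix quantities; the sketch below shows how they are computed, using arbitrary event counts that do not reproduce the study's figures.

```python
# Sketch of the three performance indexes reported in the study, computed from
# a confusion matrix. The event counts below are invented for illustration.
def performance_indexes(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)                 # detected events / true events
    ppv = tp / (tp + fp)                         # true detections / all detections
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, ppv, accuracy

se, ppv, acc = performance_indexes(tp=93, fp=20, fn=7, tn=40)
print(f"sensitivity={se:.2f}  PPV={ppv:.2f}  accuracy={acc:.2f}")
```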

  16. Objective facial photograph analysis using imaging software.

    Science.gov (United States)

    Pham, Annette M; Tollefson, Travis T

    2010-05-01

    Facial analysis is an integral part of the surgical planning process. Clinical photography has long been an invaluable tool in the surgeon's practice not only for accurate facial analysis but also for enhancing communication between the patient and surgeon, for evaluating postoperative results, for medicolegal documentation, and for educational and teaching opportunities. From 35-mm slide film to the digital technology of today, clinical photography has benefited greatly from technological advances. With the development of computer imaging software, objective facial analysis becomes easier to perform and less time consuming. Thus, while the original purpose of facial analysis remains the same, the process becomes much more efficient and allows for some objectivity. Although clinical judgment and artistry of technique is never compromised, the ability to perform objective facial photograph analysis using imaging software may become the standard in facial plastic surgery practices in the future. Copyright 2010 Elsevier Inc. All rights reserved.

  17. Effect of anti-virus software on infectious nodes in computer network: A mathematical model

    Science.gov (United States)

    Mishra, Bimal Kumar; Pandey, Samir Kumar

    2012-07-01

    An e-epidemic model of malicious codes in the computer network through vertical transmission is formulated. We have observed that if the basic reproduction number is less than unity, the infected proportion of computer nodes disappear and malicious codes die out and also the malicious codes-free equilibrium is globally asymptotically stable which leads to its eradication. Effect of anti-virus software on the removal of the malicious codes from the computer network is critically analyzed. Analysis and simulation results show some managerial insights that are helpful for the practice of anti-virus in information sharing networks.
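
    The paper's exact vertical-transmission equations are not reproduced here, but a generic SIR-style sketch conveys the threshold behaviour described: the infection dies out when the basic reproduction number is below one. Parameter values below are invented.

```python
# Generic SIR-style sketch of malicious-code spread in a network (not the
# paper's exact vertical-transmission model). Parameter values are invented.
import numpy as np
from scipy.integrate import odeint

beta, gamma = 0.4, 0.25          # infection and cleanup (anti-virus) rates
R0 = beta / gamma                # basic reproduction number
print("R0 =", round(R0, 2), "->", "spreads" if R0 > 1 else "dies out")

def sir(y, t):
    s, i, r = y                  # susceptible, infected, recovered fractions
    return [-beta * s * i, beta * s * i - gamma * i, gamma * i]

t = np.linspace(0, 100, 500)
s, i, r = odeint(sir, [0.99, 0.01, 0.0], t).T
print("peak infected fraction:", round(i.max(), 3))
```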

  18. Intraprocedural Dataflow Analysis for Software Product Lines

    DEFF Research Database (Denmark)

    Brabrand, Claus; Ribeiro, Márcio; Tolêdo, Társis

    2013-01-01

    Software product lines (SPLs) developed using annotative approaches such as conditional compilation come with an inherent risk of constructing erroneous products. For this reason, it is essential to be able to analyze such SPLs. However, as dataflow analysis techniques are not able to deal with SPLs...

  19. Software for analysis of visual meteor data

    Science.gov (United States)

    Veljković, Kristina; Ivanović, Ilija

    2014-02-01

    In this paper, we will present new software for analysis of IMO data collected from visual observations. The software consists of a package of functions written in the statistical programming language R, as well as a Java application which uses these functions in a user friendly environment. R code contains various filters for selection of data, methods for calculation of Zenithal Hourly Rate (ZHR), solar longitude, population index and graphical representation of ZHR and distribution of observed magnitudes. The Java application allows everyone to use these functions without any knowledge of R. Both R code and the Java application are open source and free with user manuals and examples provided.
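
    Assuming the package implements the standard visual-meteor ZHR formula, ZHR = N * r^(6.5 - lm) * F / (T_eff * sin(h_R)), a minimal version looks like the following; the observation values are made up.

```python
# Sketch of the standard ZHR formula used in visual meteor work (assumed here,
# not taken from the package's R code).
import math

def zhr(n_meteors, pop_index, lim_mag, t_eff_hours, radiant_alt_deg, f_corr=1.0):
    """Zenithal hourly rate from a single observing interval."""
    return (n_meteors * pop_index ** (6.5 - lim_mag) * f_corr
            / (t_eff_hours * math.sin(math.radians(radiant_alt_deg))))

print(round(zhr(n_meteors=23, pop_index=2.2, lim_mag=6.1,
                t_eff_hours=1.5, radiant_alt_deg=55), 1))
```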

  20. Texture analysis software: integration with a radiological workstation.

    Science.gov (United States)

    Duvauferrier, Régis; Bezy, Joan; Bertaud, Valérie; Toussaint, Grégoire; Morelli, John; Lasbleiz, Jeremy

    2012-01-01

    Image analysis is the daily task of radiologists. The texture of a structure or imaging finding can be more difficult to describe than other parameters. Image processing can help the radiologist in completing this difficult task. The aim of this article is to explain how we have developed texture analysis software and integrated it into a standard radiological workstation. The texture analysis method has been divided into three steps: definition of primitive elements, counting, and statistical analysis. The software was developed in C++ and integrated into a Siemens workstation with a graphical user interface. The results of analyses may be exported in Excel format. The software allows users to perform texture analyses on any type of radiological image without the need for image transfer by simply placing a region of interest. This tool has already been used to assess the trabecular network of vertebra. The integration of such software into PACS extends the applicability of texture analysis beyond that of a mere research tool and facilitates its use in routine clinical practice.

  1. Software defined multi-spectral imaging for Arctic sensor networks

    Science.gov (United States)

    Siewert, Sam; Angoth, Vivek; Krishnamurthy, Ramnarayan; Mani, Karthikeyan; Mock, Kenrick; Singh, Surjith B.; Srivistava, Saurav; Wagner, Chris; Claus, Ryan; Vis, Matthew Demi

    2016-05-01

    Availability of off-the-shelf infrared sensors combined with high definition visible cameras has made possible the construction of a Software Defined Multi-Spectral Imager (SDMSI) combining long-wave, near-infrared and visible imaging. The SDMSI requires a real-time embedded processor to fuse images and to create real-time depth maps for opportunistic uplink in sensor networks. Researchers at Embry Riddle Aeronautical University working with University of Alaska Anchorage at the Arctic Domain Awareness Center and the University of Colorado Boulder have built several versions of a low-cost drop-in-place SDMSI to test alternatives for power efficient image fusion. The SDMSI is intended for use in field applications including marine security, search and rescue operations and environmental surveys in the Arctic region. Based on Arctic marine sensor network mission goals, the team has designed the SDMSI to include features to rank images based on saliency and to provide on camera fusion and depth mapping. A major challenge has been the design of the camera computing system to operate within a 10 to 20 Watt power budget. This paper presents a power analysis of three options: 1) multi-core, 2) field programmable gate array with multi-core, and 3) graphics processing units with multi-core. For each test, power consumed for common fusion workloads has been measured at a range of frame rates and resolutions. Detailed analyses from our power efficiency comparison for workloads specific to stereo depth mapping and sensor fusion are summarized. Preliminary mission feasibility results from testing with off-the-shelf long-wave infrared and visible cameras in Alaska and Arizona are also summarized to demonstrate the value of the SDMSI for applications such as ice tracking, ocean color, soil moisture, animal and marine vessel detection and tracking. The goal is to select the most power efficient solution for the SDMSI for use on UAVs (Unoccupied Aerial Vehicles) and other drop

  2. An Initial Load-Based Green Software Defined Network

    Directory of Open Access Journals (Sweden)

    Ying Hu

    2017-05-01

    Full Text Available Software defined network (SDN) is a new network architecture in which the control function is decoupled from the data forwarding plane and that is attracting wide attention from both the research and industry sectors. However, SDN still faces the same energy waste problem as traditional networks. At present, research on energy saving in SDN is mainly focused on static optimization of the network from a zero-load state: when new traffic arrives, such optimization changes the transmission paths of uncompleted flows that arrived before the optimization, possibly resulting in route oscillation and other deleterious effects. To avoid this, a dynamic energy-saving optimization scheme is designed in which the paths of uncompleted flows are not changed when new traffic arrives. To find the optimal solution for energy saving, the problem is modeled as a mixed integer linear programming (MILP) problem. As the high complexity of the problem prohibits computing the optimal solution directly, an improved heuristic routing algorithm called the improved constant weight greedy algorithm (ICWGA) is proposed to find a sub-optimal solution. Simulation results show that the energy saving capacity of ICWGA is close to that of the optimal solution, offering a desirable improvement in the energy efficiency of the network.

  3. Cost-Sensitive Radial Basis Function Neural Network Classifier for Software Defect Prediction

    Directory of Open Access Journals (Sweden)

    P. Kumudha

    2016-01-01

    Full Text Available Effective prediction of the software modules that are prone to defects enables software developers to allocate resources efficiently and to concentrate on quality assurance activities. The software development life cycle basically includes design, analysis, implementation, testing, and release phases. Software testing is a critical task in the software development process, as it saves time and budget by detecting defects early and delivering a defect-free product to the customers. This testing phase should be operated carefully and effectively to release a defect-free (bug-free) software product to the customers. In order to improve the software testing process, fault prediction methods identify the software parts that are more likely to be defect-prone. This paper proposes a prediction approach based on a conventional radial basis function neural network (RBFNN) and the novel adaptive dimensional biogeography based optimization (ADBBO) model. The developed ADBBO-based RBFNN model is tested with five publicly available datasets from the NASA data program repository. The computed results prove the effectiveness of the proposed ADBBO-RBFNN classifier approach with respect to the considered metrics in comparison with the early predictors available in the literature for the same datasets.

  4. Cost-Sensitive Radial Basis Function Neural Network Classifier for Software Defect Prediction.

    Science.gov (United States)

    Kumudha, P; Venkatesan, R

    Effective prediction of the software modules that are prone to defects enables software developers to allocate resources efficiently and to concentrate on quality assurance activities. The software development life cycle basically includes design, analysis, implementation, testing, and release phases. Software testing is a critical task in the software development process, as it saves time and budget by detecting defects early and delivering a defect-free product to the customers. This testing phase should be operated carefully and effectively to release a defect-free (bug-free) software product to the customers. In order to improve the software testing process, fault prediction methods identify the software parts that are more likely to be defect-prone. This paper proposes a prediction approach based on a conventional radial basis function neural network (RBFNN) and the novel adaptive dimensional biogeography based optimization (ADBBO) model. The developed ADBBO-based RBFNN model is tested with five publicly available datasets from the NASA data program repository. The computed results prove the effectiveness of the proposed ADBBO-RBFNN classifier approach with respect to the considered metrics in comparison with the early predictors available in the literature for the same datasets.
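
    The ADBBO optimizer itself is not reproduced here; the sketch below shows a plain cost-sensitive RBF-style classifier (k-means centres for the RBF features, class weights to penalise missed defect-prone modules) on synthetic data, as a stand-in for the general approach.

```python
# Simplified cost-sensitive RBF-style classifier (not the ADBBO-RBFNN of the
# paper): k-means centres provide RBF features, and class weights penalise
# missed defective modules more heavily. Data below is synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 10))                     # 300 modules x 10 metrics
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 300) > 1.2).astype(int)

centres = KMeans(n_clusters=15, n_init=10, random_state=1).fit(X).cluster_centers_
Phi = rbf_kernel(X, centres, gamma=0.1)            # hidden-layer activations

clf = LogisticRegression(class_weight={0: 1, 1: 5}, max_iter=1000)
clf.fit(Phi, y)                                    # cost-sensitive output layer
print("training recall on defect-prone class:",
      round(clf.score(Phi[y == 1], y[y == 1]), 2))
```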

  5. Improving Software Systems By Flow Control Analysis

    Directory of Open Access Journals (Sweden)

    Piotr Poznanski

    2012-01-01

    Full Text Available Using agile methods during the implementation of the system that meets mission critical requirements can be a real challenge. The change in the system built of dozens or even hundreds of specialized devices with embedded software requires the cooperation of a large group of engineers. This article presents a solution that supports parallel work of groups of system analysts and software developers. Deployment of formal rules to the requirements written in natural language enables using formal analysis of artifacts being a bridge between software and system requirements. Formalism and textual form of requirements allowed the automatic generation of message flow graph for the (sub system, called the “big-picture-model”. Flow diagram analysis helped to avoid a large number of defects whose repair cost in extreme cases could undermine the legitimacy of agile methods in projects of this scale. Retrospectively, a reduction of technical debt was observed. Continuous analysis of the “big picture model” improves the control of the quality parameters of the software architecture. The article also tries to explain why the commercial platform based on UML modeling language may not be sufficient in projects of this complexity.

  6. A Benefit Analysis of Infusing Wireless into Aircraft and Fleet Operations - Report to Seedling Project Efficient Reconfigurable Cockpit Design and Fleet Operations Using Software Intensive, Network Enabled, Wireless Architecture (ECON)

    Science.gov (United States)

    Alexandrov, Natalia; Holmes, Bruce J.; Hahn, Andrew S.

    2016-01-01

    We report on an examination of potential benefits of infusing wireless technologies into various areas of aircraft and airspace operations. The analysis is done in support of a NASA seedling project Efficient Reconfigurable Cockpit Design and Fleet Operations Using Software Intensive, Network Enabled Wireless Architecture (ECON). The study has two objectives. First, we investigate one of the main benefit hypotheses of the ECON proposal: that the replacement of wired technologies with wireless would lead to significant weight reductions on an aircraft, among other benefits. Second, we advance a list of wireless technology applications and discuss their system benefits. With regard to the primary hypothesis, we conclude that the promise of weight reduction is premature. Specificity of the system domain and aircraft, criticality of components, reliability of wireless technologies, the weight of replacement or augmentation equipment, and the cost of infusion must all be taken into account among other considerations, to produce a reliable estimate of weight savings or increase.

  7. Monitoring Network and Service Availability with Open-Source Software

    Directory of Open Access Journals (Sweden)

    T. Michael Silver

    2010-03-01

    Full Text Available Silver describes the implementation of a monitoring system using an open-source software package to improve the availability of services and reduce the response time when troubles occur. He provides a brief overview of the literature available on monitoring library systems, and then describes the implementation of Nagios, an open-source network monitoring system, to monitor a regional library system’s servers and wide area network. Particular attention is paid to using the plug-in architecture to monitor library services effectively. The author includes example displays and configuration files.   Editor’s note: This article is the winner of the LITA/Ex Libris Writing Award, 2009.
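
    Nagios checks follow a simple plugin contract: print one status line (optionally with performance data after a pipe) and exit with 0, 1, 2 or 3 for OK, WARNING, CRITICAL or UNKNOWN. A minimal custom check, with a placeholder URL and thresholds, might look like this:

```python
#!/usr/bin/env python3
# Minimal custom Nagios plugin sketch (hypothetical check, standard plugin
# conventions): one status line plus exit code 0/1/2 for OK/WARNING/CRITICAL.
import sys
import time
import urllib.request

URL, WARN_S, CRIT_S = "http://catalogue.example.org/", 2.0, 5.0  # placeholders

try:
    start = time.monotonic()
    urllib.request.urlopen(URL, timeout=10)
    elapsed = time.monotonic() - start
except Exception as exc:
    print(f"CRITICAL - {URL} unreachable: {exc}")
    sys.exit(2)

if elapsed >= CRIT_S:
    print(f"CRITICAL - response took {elapsed:.1f}s|time={elapsed:.3f}s")
    sys.exit(2)
if elapsed >= WARN_S:
    print(f"WARNING - response took {elapsed:.1f}s|time={elapsed:.3f}s")
    sys.exit(1)
print(f"OK - response took {elapsed:.1f}s|time={elapsed:.3f}s")
sys.exit(0)
```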

  8. Software for Brain Network Simulations: A Comparative Study

    Science.gov (United States)

    Tikidji-Hamburyan, Ruben A.; Narayana, Vikram; Bozkus, Zeki; El-Ghazawi, Tarek A.

    2017-01-01

    Numerical simulations of brain networks are a critical part of our efforts in understanding brain functions under pathological and normal conditions. For several decades, the community has developed many software packages and simulators to accelerate research in computational neuroscience. In this article, we select the three most popular simulators, as determined by the number of models in the ModelDB database, such as NEURON, GENESIS, and BRIAN, and perform an independent evaluation of these simulators. In addition, we study NEST, one of the lead simulators of the Human Brain Project. First, we study them based on one of the most important characteristics, the range of supported models. Our investigation reveals that brain network simulators may be biased toward supporting a specific set of models. However, all simulators tend to expand the supported range of models by providing a universal environment for the computational study of individual neurons and brain networks. Next, our investigations on the characteristics of computational architecture and efficiency indicate that all simulators compile the most computationally intensive procedures into binary code, with the aim of maximizing their computational performance. However, not all simulators provide the simplest method for module development and/or guarantee efficient binary code. Third, a study of their amenability for high-performance computing reveals that NEST can almost transparently map an existing model on a cluster or multicore computer, while NEURON requires code modification if the model developed for a single computer has to be mapped on a computational cluster. Interestingly, parallelization is the weakest characteristic of BRIAN, which provides no support for cluster computations and limited support for multicore computers. Fourth, we identify the level of user support and frequency of usage for all simulators. Finally, we carry out an evaluation using two case studies: a large network with

  9. Reliable and Fault-Tolerant Software-Defined Network Operations Scheme for Remote 3D Printing

    Science.gov (United States)

    Kim, Dongkyun; Gil, Joon-Min

    2015-03-01

    The recent wide expansion of applicable three-dimensional (3D) printing and software-defined networking (SDN) technologies has led to a great deal of attention being focused on efficient remote control of manufacturing processes. SDN is a renowned paradigm for network softwarization, which has helped facilitate remote manufacturing in association with high network performance, since SDN is designed to control network paths and traffic flows, guaranteeing improved quality of services by obtaining network requests from end-applications on demand through the separated SDN controller or control plane. However, current SDN approaches are generally focused on the controls and automation of the networks, which indicates that there is a lack of management plane development designed for a reliable and fault-tolerant SDN environment. Therefore, in addition to the inherent advantage of SDN, this paper proposes a new software-defined network operations center (SD-NOC) architecture to strengthen the reliability and fault-tolerance of SDN in terms of network operations and management in particular. The cooperation and orchestration between SDN and SD-NOC are also introduced for the SDN failover processes based on four principal SDN breakdown scenarios derived from the failures of the controller, SDN nodes, and connected links. The abovementioned SDN troubles significantly reduce the network reachability to remote devices (e.g., 3D printers, super high-definition cameras, etc.) and the reliability of relevant control processes. Our performance consideration and analysis results show that the proposed scheme can shrink operations and management overheads of SDN, which leads to the enhancement of responsiveness and reliability of SDN for remote 3D printing and control processes.

  10. IUE Data Analysis Software for Personal Computers

    Science.gov (United States)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  11. In-network adaptation of SHVC video in software-defined networks

    Science.gov (United States)

    Awobuluyi, Olatunde; Nightingale, James; Wang, Qi; Alcaraz Calero, Jose Maria; Grecos, Christos

    2016-04-01

    Software Defined Networks (SDN), when combined with Network Function Virtualization (NFV) represents a paradigm shift in how future networks will behave and be managed. SDN's are expected to provide the underpinning technologies for future innovations such as 5G mobile networks and the Internet of Everything. The SDN architecture offers features that facilitate an abstracted and centralized global network view in which packet forwarding or dropping decisions are based on application flows. Software Defined Networks facilitate a wide range of network management tasks, including the adaptation of real-time video streams as they traverse the network. SHVC, the scalable extension to the recent H.265 standard is a new video encoding standard that supports ultra-high definition video streams with spatial resolutions of up to 7680×4320 and frame rates of 60fps or more. The massive increase in bandwidth required to deliver these U-HD video streams dwarfs the bandwidth requirements of current high definition (HD) video. Such large bandwidth increases pose very significant challenges for network operators. In this paper we go substantially beyond the limited number of existing implementations and proposals for video streaming in SDN's all of which have primarily focused on traffic engineering solutions such as load balancing. By implementing and empirically evaluating an SDN enabled Media Adaptation Network Entity (MANE) we provide a valuable empirical insight into the benefits and limitations of SDN enabled video adaptation for real time video applications. The SDN-MANE is the video adaptation component of our Video Quality Assurance Manager (VQAM) SDN control plane application, which also includes an SDN monitoring component to acquire network metrics and a decision making engine using algorithms to determine the optimum adaptation strategy for any real time video application flow given the current network conditions. Our proposed VQAM application has been implemented and

  12. Software For Multivariable Frequency-Domain Analysis

    Science.gov (United States)

    Armstrong, Ernest S.; Giesy, Daniel P.

    1991-01-01

    FREQ (Multivariable Frequency Domain Singular Value Analysis Package) software package of subroutines performing frequency-domain analysis of: continuous- or discrete-multivariable linear systems; any continuous system for which one calculates transfer matrix at points on imaginary axis; or any discrete system for which one calculates transfer matrix at points on unit circle. Four different versions available. Single-precision brief version LAR-14119, single-precision complete version LAR-14120, double-precision brief version LAR-14121, and double-precision complete version LAR-14122. Written in ANSI standard FORTRAN 77.
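
    The core computation is the singular-value decomposition of the transfer matrix G(jw) = C(jwI - A)^(-1)B + D on a frequency grid; a small NumPy sketch with an illustrative state-space model (not FREQ's FORTRAN code) is shown below.

```python
# Sketch of the kind of computation FREQ performs: singular values of the
# transfer matrix G(jw) = C (jwI - A)^-1 B + D of a continuous-time system,
# evaluated on a frequency grid. The state-space matrices are illustrative.
import numpy as np

A = np.array([[0.0, 1.0], [-4.0, -0.8]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])

for w in np.logspace(-1, 2, 7):                       # frequencies in rad/s
    G = C @ np.linalg.solve(1j * w * np.eye(2) - A, B) + D
    sv = np.linalg.svd(G, compute_uv=False)
    print(f"w = {w:7.2f} rad/s   sigma_max = {sv.max():.4f}")
```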

  13. Generalized Support Software: Domain Analysis and Implementation

    Science.gov (United States)

    Stark, Mike; Seidewitz, Ed

    1995-01-01

    For the past five years, the Flight Dynamics Division (FDD) at NASA's Goddard Space Flight Center has been carrying out a detailed domain analysis effort and is now beginning to implement Generalized Support Software (GSS) based on this analysis. GSS is part of the larger Flight Dynamics Distributed System (FDDS), and is designed to run under the FDDS User Interface / Executive (UIX). The FDD is transitioning from a mainframe based environment to systems running on engineering workstations. The GSS will be a library of highly reusable components that may be configured within the standard FDDS architecture to quickly produce low-cost satellite ground support systems. The estimates for the first release is that this library will contain approximately 200,000 lines of code. The main driver for developing generalized software is development cost and schedule improvement. The goal is to ultimately have at least 80 percent of all software required for a spacecraft mission (within the domain supported by the GSS) to be configured from the generalized components.

  14. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software used a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), which is an area of constant intraplate seismicity and non-orogenic active tectonics and exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing using Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
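
    A much-simplified version of the underlying idea, flagging abrupt slope breaks along a longitudinal stream profile extracted from a DEM, is sketched below on a synthetic profile; it is not the Hack/Etchebehere procedure implemented by Knickpoint Finder.

```python
# Simplified illustration of locating knickpoints as abrupt slope breaks along
# a longitudinal stream profile (not the exact procedure implemented by
# Knickpoint Finder). Profile values are synthetic.
import numpy as np

distance = np.linspace(0, 10_000, 201)                 # metres downstream
elevation = 500 - 0.02 * distance                      # gentle background slope
elevation[120:] -= 15                                  # artificial 15 m step

slope = np.gradient(elevation, distance)               # local channel slope
slope_change = np.abs(np.diff(slope))
threshold = slope_change.mean() + 3 * slope_change.std()
for i in np.where(slope_change > threshold)[0]:
    print(f"possible knickpoint at {distance[i]:.0f} m, elev {elevation[i]:.1f} m")
```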

  15. Software Defined Networking (SDN) controlled all optical switching networks with multi-dimensional switching architecture

    Science.gov (United States)

    Zhao, Yongli; Ji, Yuefeng; Zhang, Jie; Li, Hui; Xiong, Qianjin; Qiu, Shaofeng

    2014-08-01

    Ultrahigh throughput capacity requirements are challenging current optical switching nodes, given the fast development of data center networks. Pbit/s-level all-optical switching networks will need to be deployed soon, which will greatly increase the complexity of node architectures. How to control the future network and node equipment together will become a new problem. An enhanced Software Defined Networking (eSDN) control architecture is proposed in the paper, which consists of a Provider NOX (P-NOX) and Node NOX (N-NOX). With the cooperation of the P-NOX and N-NOX, flexible control of the entire network can be achieved. An all-optical switching network testbed has been experimentally demonstrated with efficient control by the enhanced Software Defined Networking (eSDN). Pbit/s-level all-optical switching nodes in the testbed are implemented based on a multi-dimensional switching architecture, i.e. multi-level and multi-planar. Due to space and cost limitations, each optical switching node is only equipped with four input line boxes and four output line boxes. Experimental results are given to verify the performance of our proposed control and switching architecture.

  16. ImpNet: Programming Software-Defied Networks Using Imperative Techniques

    OpenAIRE

    El-Zawawy, Mohamed A.; AlSalem, Adel I.

    2014-01-01

    Software and hardware components are basic parts of modern networks. However, the software component is typically sealed and function-oriented, and it is therefore very difficult to modify these components. This has hindered networking innovation. Moreover, it has resulted in network policies having complex interfaces that are not user-friendly, and hence in huge and complicated flow tables on the physical switches of networks, which greatly degrades network performance in many cases. Soft...

  17. Computer imaging software for profile photograph analysis.

    Science.gov (United States)

    Tollefson, Travis T; Sykes, Jonathan M

    2007-01-01

    To describe a novel calibration technique for photographs of different sizes and to test a new method of chin evaluation in relation to established analysis measurements. A photograph analysis and medical record review of 14 patients who underwent combined rhinoplasty and chin correction at an academic center. Patients undergoing concurrent orthognathic surgery, rhytidectomy, or submental liposuction were excluded. Preoperative and postoperative digital photographs were analyzed using computer imaging software with a new method, the soft tissue porion to pogonion distance, and with established measurements, including the cervicomental angle, the mentocervical angle, and the facial convexity angle. The porion to pogonion distance consistently increased after the chin correction procedure (more in the osseous group). All photograph angle measurements changed toward the established normal range postoperatively. Surgery for facial disharmony requires artistic judgment and objective evaluation. Although 3-dimensional video analysis of the face seems promising, its clinical use is limited by cost. For surgeons who use computer imaging software, analysis of profile photographs is the most valuable tool. Even when preoperative and postoperative photographs are of different sizes, relative distance comparisons are possible with a new calibration technique using the constant facial landmarks, the porion and the pupil. The porion-pogonion distance is a simple reproducible measurement that can be used along with established soft tissue measurements as a guide for profile facial analysis.

  18. Investigation, Analysis and Implementation of Open Source Mobile Communication Software

    OpenAIRE

    Paudel, Suresh

    2016-01-01

    Over the past few years, open source software has transformed the mobile communication networks. The development of VoIP technologies has enabled the migration of telco protocols and services to the IP network with help of open source software. This allows for deployment of mobile networks in rural areas with lower cost. The usage of open source GSM is very useful for developing countries which do not yet have full mobile coverage. Open source GSM allows very rapid and economical deployment o...

  19. Specdata: Automated Analysis Software for Broadband Spectra

    Science.gov (United States)

    Oliveira, Jasmine N.; Martin-Drumel, Marie-Aline; McCarthy, Michael C.

    2017-06-01

    With the advancement of chirped-pulse techniques, broadband rotational spectra with a few tens to several hundred GHz of spectral coverage are now routinely recorded. When studying multi-component mixtures that might result, for example, with the use of an electrical discharge, lines of new chemical species are often obscured by those of known compounds, and analysis can be laborious. To address this issue, we have developed SPECdata, an open source, interactive tool which is designed to simplify and greatly accelerate the spectral analysis and discovery. Our software tool combines both automated and manual components that free the user from computation, while giving him/her considerable flexibility to assign, manipulate, interpret and export their analysis. The automated - and key - component of the new software is a database query system that rapidly assigns transitions of known species in an experimental spectrum. For each experiment, the software identifies spectral features, and subsequently assigns them to known molecules within an in-house database (Pickett .cat files, list of frequencies...), or those catalogued in Splatalogue (using automatic on-line queries). With suggested assignments, the control is then handed over to the user who can choose to accept, decline or add additional species. Data visualization, statistical information, and interactive widgets assist the user in making decisions about their data. SPECdata has several other useful features intended to improve the user experience. Exporting a full report of the analysis, or a peak file in which assigned lines are removed are among several options. A user may also save their progress to continue at another time. Additional features of SPECdata help the user to maintain and expand their database for future use. A user-friendly interface allows one to search, upload, edit or update catalog or experiment entries.
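
    The core of the automated assignment step, matching measured peak frequencies against catalogued transition frequencies within a tolerance, can be sketched as follows; the catalogue entries, peak list and tolerance are invented, and the real tool works with .cat files and Splatalogue queries.

```python
# Sketch of the core SPECdata idea as described: match measured peak
# frequencies against a catalogue of known transitions within a tolerance.
# The catalogue entries and peak list are invented; real file formats differ.
import numpy as np

catalog = {                       # frequency (MHz) -> species (illustrative)
    9098.3: "HC3N", 9172.6: "OCS", 10278.9: "HC5N", 11015.4: "SO2"}
peaks_mhz = np.array([9098.28, 9755.10, 11015.45])
tolerance = 0.10                  # MHz

cat_freqs = np.array(sorted(catalog))
for peak in peaks_mhz:
    i = np.searchsorted(cat_freqs, peak)
    candidates = cat_freqs[max(i - 1, 0):i + 1]       # nearest neighbours
    hits = [f for f in candidates if abs(f - peak) <= tolerance]
    label = ", ".join(catalog[float(f)] for f in hits) if hits else "unassigned"
    print(f"{peak:10.3f} MHz -> {label}")
```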

  20. CMS Computing Software and Analysis Challenge 2006

    Science.gov (United States)

    De Filippis, N.; CMS Collaboration

    2007-10-01

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with its data handling model. With this purpose the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain, such as prompt reconstruction, data streaming, iterative calibration and alignment executions, and data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project are also exercised to give access to the data and the resources, providing a user-friendly interface to the physicists submitting production and analysis jobs. An overview of the status and results of CSA06 is presented in this work.

  1. Using Cognitive Control in Software Defined Networking for Port Scan Detection

    Science.gov (United States)

    2017-07-01

    ARL-TR-8059, July 2017, US Army Research Laboratory. Using Cognitive Control in Software-Defined Networking for Port Scan Detection, by Vinod K Mishra, Computational and Information Sciences Directorate, ARL. Technical report covering 15 June-31 July 2016.

  2. A software platform for the analysis of dermatology images

    Science.gov (United States)

    Vlassi, Maria; Mavraganis, Vlasios; Asvestas, Panteleimon

    2017-11-01

    The purpose of this paper is to present a software platform developed in the Python programming environment that can be used for the processing and analysis of dermatology images. The platform provides the capability of reading a file that contains a dermatology image. The platform supports image formats such as Windows bitmaps, JPEG, JPEG2000, portable network graphics, and TIFF. Furthermore, it provides suitable tools for selecting, either manually or automatically, a region of interest (ROI) on the image. The automated selection of a ROI includes filtering for smoothing the image and thresholding. The proposed software platform has a friendly and clear graphical user interface and could be a useful second-opinion tool for a dermatologist. Furthermore, it could be used to classify images from other anatomical parts, such as the breast or lung, after proper re-training of the classification algorithms.
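
    The automated ROI selection described above (smoothing followed by thresholding) can be sketched as follows; the Gaussian sigma, the use of Otsu's threshold and the synthetic test image are assumptions made for illustration, not necessarily the platform's exact choices.

    # Minimal sketch of automated ROI selection: smooth the image, threshold it,
    # and keep the resulting mask. Sigma and Otsu's method are illustrative assumptions.
    import numpy as np
    from skimage.filters import gaussian, threshold_otsu

    def automatic_roi(gray_image, sigma=2.0):
        """gray_image: 2-D float array in [0, 1]. Returns a boolean ROI mask."""
        smoothed = gaussian(gray_image, sigma=sigma)   # filtering to reduce noise
        level = threshold_otsu(smoothed)               # global threshold
        return smoothed > level                        # True inside the candidate ROI

    if __name__ == "__main__":
        # synthetic "lesion": a bright disc on a dark background
        y, x = np.mgrid[0:128, 0:128]
        image = 0.2 + 0.6 * ((x - 64) ** 2 + (y - 64) ** 2 < 20 ** 2)
        mask = automatic_roi(image)
        print("ROI pixels:", int(mask.sum()))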

  3. Towards Effective Intra-flow Network Coding in Software Defined Wireless Mesh Networks

    Directory of Open Access Journals (Sweden)

    Donghai Zhu

    2016-01-01

    Full Text Available Wireless Mesh Networks (WMNs) have the potential to provide convenient broadband wireless Internet access to mobile users. With the support of the Software-Defined Networking (SDN) paradigm, which separates the control plane and the data plane, WMNs can be easily deployed and managed. In addition, by exploiting the broadcast nature of the wireless medium and the spatial diversity of multi-hop wireless networks, intra-flow network coding has shown greater benefit than traditional routing paradigms for data transmission in WMNs. In this paper, we develop a novel OpenCoding protocol, which combines the SDN technique with intra-flow network coding for WMNs. Our protocol simplifies the deployment and management of the network and improves network performance. In OpenCoding, a controller that works on the control plane makes routing decisions for mesh routers, and the hop-by-hop forwarding function is replaced by network coding functions in the data plane. We analyze the overhead of OpenCoding. Through a simulation study, we show the effectiveness of the OpenCoding protocol in comparison with existing schemes. Our data show that OpenCoding outperforms both traditional routing and intra-flow network coding schemes.
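
    For readers unfamiliar with intra-flow network coding, the sketch below shows the basic idea such protocols rest on: coded packets are random linear combinations (here over GF(2), i.e. XOR) of one flow's native packets, and a receiver can decode once the collected coefficient vectors reach full rank. The packet sizes, field choice and helper names are illustrative assumptions; OpenCoding itself is not reproduced here.

    # Minimal sketch of intra-flow network coding over GF(2): each coded packet is
    # an XOR of a random subset of the flow's native packets, tagged with its
    # coefficient vector; decodability is checked via the rank of those vectors.
    import numpy as np

    rng = np.random.default_rng(0)

    def encode(packets):
        """packets: (k, n) uint8 array of k native packets. Returns (coeffs, coded)."""
        k = packets.shape[0]
        coeffs = rng.integers(0, 2, size=k, dtype=np.uint8)
        if not coeffs.any():
            coeffs[rng.integers(k)] = 1          # avoid the all-zero combination
        coded = np.zeros(packets.shape[1], dtype=np.uint8)
        for i, c in enumerate(coeffs):
            if c:
                coded ^= packets[i]              # XOR = addition in GF(2)
        return coeffs, coded

    def rank_gf2(rows):
        """Rank of a set of GF(2) coefficient vectors (Gaussian elimination mod 2)."""
        m = np.array(rows, dtype=np.uint8) % 2
        rank = 0
        for col in range(m.shape[1]):
            pivot = next((r for r in range(rank, m.shape[0]) if m[r, col]), None)
            if pivot is None:
                continue
            m[[rank, pivot]] = m[[pivot, rank]]  # move pivot row up
            for r in range(m.shape[0]):
                if r != rank and m[r, col]:
                    m[r] ^= m[rank]              # eliminate the column elsewhere
            rank += 1
        return rank

    if __name__ == "__main__":
        native = rng.integers(0, 256, size=(4, 8), dtype=np.uint8)  # 4 packets, 8 bytes each
        received = [encode(native) for _ in range(6)]               # overhear 6 coded packets
        print("decodable:", rank_gf2([c for c, _ in received]) == native.shape[0])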

  4. Dynamic optical resource allocation for mobile core networks with software defined elastic optical networking.

    Science.gov (United States)

    Zhao, Yongli; Chen, Zhendong; Zhang, Jie; Wang, Xinbo

    2016-07-25

    Driven by the forthcoming 5G mobile communications, the all-IP architecture of mobile core networks, i.e. the evolved packet core (EPC) proposed by 3GPP, has been greatly challenged by users' demands for higher data rates and more reliable end-to-end connections, as well as by operators' demands for low operational cost. These challenges can potentially be met by software defined optical networking (SDON), which enables dynamic resource allocation according to user requirements. In this article, a novel network architecture for the mobile core network is proposed based on SDON. A software defined network (SDN) controller is designed to realize coordinated control over the different entities in EPC networks. We analyze the requirements of the EPC-lightpath (EPCL) in the data plane and propose an optical switch load balancing (OSLB) algorithm for resource allocation in the optical layer. The procedure for establishment and adjustment of EPCLs is demonstrated on an SDON-based EPC testbed with an extended OpenFlow protocol. We also evaluate the OSLB algorithm through simulation in terms of bandwidth blocking ratio, traffic load distribution, and resource utilization ratio, compared with link-based load balancing (LLB) and MinHops algorithms.

  5. Software safety analysis activities during software development phases of the Microwave Limb Sounder (MLS)

    Science.gov (United States)

    Shaw, Hui-Yin; Sherif, Joseph S.

    2004-01-01

    This paper describes the MLS software safety analysis (SSA) activities and documents the SSA results. The scope of this software safety effort is consistent with the MLS system safety definition and concentrates on the software faults and hazards that may have an impact on personnel safety and environmental safety.

  6. Static analysis of software the abstract interpretation

    CERN Document Server

    Boulanger, Jean-Louis

    2013-01-01

    The existing literature currently available to students and researchers is very general, covering only the formal techniques of static analysis. This book presents real examples of the formal techniques called "abstract interpretation" currently being used in various industrial fields: railway, aeronautics, space, automotive, etc. The purpose of this book is to present students and researchers, in a single book, with the wealth of experience of people who are intrinsically involved in the realization and evaluation of software-based safety critical systems. As the authors are people curr

  7. Design, Implementation and Optimization of Innovative Internet Access Networks, based on Fog Computing and Software Defined Networking

    OpenAIRE

    Iotti, Nicola

    2017-01-01

    1. DESIGN In this dissertation we introduce a new approach to Internet access networks in public spaces, such as the Wi-Fi networks commonly known as hotspots, based on Fog Computing (or Edge Computing), Software Defined Networking (SDN) and the deployment of Virtual Machines (VMs) and Linux containers on the edge of the network. In this vision we deploy specialized network elements, called Fog Nodes, on the edge of the network, able to virtualize the physical infrastructure and expose APIs to e...

  8. Performance verification of network function virtualization in software defined optical transport networks

    Science.gov (United States)

    Zhao, Yongli; Hu, Liyazhou; Wang, Wei; Li, Yajie; Zhang, Jie

    2017-01-01

    With the continuous opening of resource acquisition and application, a large variety of network hardware appliances is deployed as communication infrastructure. Launching a new network application typically implies replacing obsolete devices and providing the space and power to accommodate the new ones, which increases energy and capital investment. Network function virtualization (NFV) aims to address these problems by consolidating many kinds of network equipment onto industry-standard elements such as servers, switches and storage. Many types of IT resources have been deployed to run Virtual Network Functions (vNFs), such as virtual switches and routers. How to deploy NFV in optical transport networks is therefore a problem of great importance. This paper focuses on this problem and gives an implementation architecture for NFV-enabled optical transport networks based on Software Defined Optical Networking (SDON), with the procedure of vNF call and return. In particular, an implementation solution for an NFV-enabled optical transport node is designed, and a parallel processing method for NFV-enabled OTN nodes is proposed. To verify the performance of NFV-enabled SDON, the protocol interaction procedures of control function virtualization and node function virtualization are demonstrated on an SDON testbed. Finally, the benefits and challenges of the parallel processing method for NFV-enabled OTN nodes are simulated and analyzed.

  9. UPVapor: Cofrentes nuclear power plant production results analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Curiel, M. [Logistica y Acondicionamientos Industriales SAU, Sorolla Center, local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain); Palomo, M. J. [ISIRYM, Universidad Politecnica de Valencia, Camino de Vera s/n, Valencia (Spain); Baraza, A. [Iberdrola Generacion S. A., Central Nuclear Cofrentes, Carretera Almansa Requena s/n, 04662 Cofrentes, Valencia (Spain); Vaquer, J., E-mail: m.curiel@lainsa.co [TITANIA Servicios Tecnologicos SL, Sorolla Center, local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain)

    2010-10-15

    UPVapor software version 02 has been developed for the Data Analysis Department of the Cofrentes nuclear power plant (Spain). It is a graphical analysis environment in which users have access to all the plant variables registered in the process computer system (SIEC). To this end, UPVapor offers many advanced graphic tools that simplify the work, as well as a friendly environment that is easy to use and highly configurable. Plant variables are classified in the same way as in the SIEC computer, and their values are taken from it through the Iberdrola network. UPVapor can generate two different types of graphics: evolution graphs and X-Y graphs. The first analyse the evolution of up to twenty plant variables over a user-defined time period, according to historic plant files. Many tools are available: cursors, graphic configuration, moving averages, visualization of invalid data, and more. Moreover, a particular analysis configuration can be saved as a preselection, making it possible to load the preselection directly and to quickly monitor a group of preselected plant variables. In X-Y graphs, it is possible to analyse the value of one variable against another over a defined time. Optionally, users can filter the data by a given range of a variable, with the possibility of programming up to five filters. Like the evolution graphs, X-Y graphs offer many configuration, saving and printing options. With UPVapor, data analysts can save valuable time in their daily work and, since the software is easy to use, other users can perform their own analyses without asking the analysts to develop them. In addition, it can be used from any work centre with access to the corporate network. (Author)

  10. Software Defined Networking for Next Generation Converged Metro-Access Networks

    Science.gov (United States)

    Ruffini, M.; Slyne, F.; Bluemm, C.; Kitsuwan, N.; McGettrick, S.

    2015-12-01

    While the concept of Software Defined Networking (SDN) has seen rapid deployment within the data center community, its adoption in telecommunications networks has progressed slowly, although the concept has been swiftly adopted by all major telecoms vendors. This paper presents a control plane architecture for SDN-driven converged metro-access networks, developed through the DISCUS European FP7 project. The SDN-based controller architecture was developed in a testbed implementation targeting two main scenarios: fast feeder fiber protection over dual-homed Passive Optical Networks (PONs) and dynamic service provisioning over a multi-wavelength PON. Implementation details and results of the experiment carried out over the second scenario are reported in the paper, showing the potential of SDN in providing assured on-demand services to end-users.

  11. 3rd International Conference on Network Analysis

    CERN Document Server

    Kalyagin, Valery; Pardalos, Panos

    2014-01-01

    This volume compiles the major results of conference participants from the "Third International Conference in Network Analysis" held at the Higher School of Economics, Nizhny Novgorod in May 2013, with the aim to initiate further joint research among different groups. The contributions in this book cover a broad range of topics relevant to the theory and practice of network analysis, including the reliability of complex networks, software, theory, methodology, and applications.  Network analysis has become a major research topic over the last several years. The broad range of applications that can be described and analyzed by means of a network has brought together researchers, practitioners from numerous fields such as operations research, computer science, transportation, energy, biomedicine, computational neuroscience and social sciences. In addition, new approaches and computer environments such as parallel computing, grid computing, cloud computing, and quantum computing have helped to solve large scale...

  12. Social network analysis

    NARCIS (Netherlands)

    de Nooy, W.; Crothers, C.

    2009-01-01

    Social network analysis (SNA) focuses on the structure of ties within a set of social actors, e.g., persons, groups, organizations, and nations, or the products of human activity or cognition such as web sites, semantic concepts, and so on. It is linked to structuralism in sociology stressing the

  13. The ESA's Space Trajectory Analysis software suite

    Science.gov (United States)

    Ortega, Guillermo

    The European Space Agency (ESA) initiated in 2005 an internal activity to develop an open source software suite involving university science departments and research institutions all over the world. This project is called the "Space Trajectory Analysis", or STA. This article describes the birth of STA and its present configuration. One of the STA aims is to promote the exchange of technical ideas and raise knowledge and competence in the areas of applied mathematics, space engineering, and informatics at the university level. Conceived as a research and education tool to support the analysis phase of a space mission, STA is able to visualize a wide range of space trajectories. These include, among others, ascent, re-entry, descent and landing trajectories, orbits around planets and moons, interplanetary trajectories, rendezvous trajectories, etc. The article explains that the STA project is an original idea of the Technical Directorate of ESA. It was born in August 2005 to provide a framework for astrodynamics research at the university level. As research and education software applicable to academia, its development is supported by a number of universities that have joined ESA in leading it. The partnership between ESA and the universities is expressed in the STA Steering Board. Together with ESA, each university has a chair on the board, whose tasks are to develop, control, promote, maintain, and expand the software suite. The article describes that STA provides calculations in the fields of spacecraft tracking, attitude analysis, coverage and visibility analysis, orbit determination, position and velocity of solar system bodies, etc. STA implements the concept of a "space scenario" composed of Solar system bodies, spacecraft, ground stations, pads, etc. It is able to propagate the orbit of a spacecraft where orbital propagators are included. STA is able to compute communication links between objects of a scenario (coverage, line of sight), and to represent the trajectory computations and

  14. An Optimal Path Computation Architecture for the Cloud-Network on Software-Defined Networking

    Directory of Open Access Journals (Sweden)

    Hyunhun Cho

    2015-05-01

    Full Text Available Legacy networks do not expose precise information about their network domains, for scalability, management and commercial reasons, so it is very hard to compute an optimal path to a destination. Given today's changing ICT environment, and in order to meet new network requirements, the concept of software-defined networking (SDN) has been developed as a technological alternative to overcome the limitations of the legacy network structure and to introduce innovative concepts. The purpose of this paper is to propose an application that calculates the optimal paths for general data transmission and real-time audio/video transmission, which constitute the major services of the National Research & Education Network (NREN), in the SDN environment. The proposed SDN routing computation (SRC) application is designed and applied in a multi-domain network for the efficient use of resources, selection of the optimal path between multiple domains, and optimal establishment of end-to-end connections.
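
    A minimal sketch of the shortest-path core that an SRC-like application could run on the controller is given below, using Dijkstra's algorithm over a weighted multi-domain topology. The topology, node names and weights are illustrative assumptions, not the paper's implementation.

    # Minimal sketch: Dijkstra's algorithm over a weighted graph whose edge weights
    # stand for whatever metric the controller exposes (delay, cost, ...).
    import heapq

    def dijkstra(graph, source, target):
        """graph: {node: {neighbor: weight}}. Returns (cost, path) or (inf, [])."""
        queue = [(0, source, [source])]
        visited = set()
        while queue:
            cost, node, path = heapq.heappop(queue)
            if node == target:
                return cost, path
            if node in visited:
                continue
            visited.add(node)
            for neighbor, weight in graph.get(node, {}).items():
                if neighbor not in visited:
                    heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
        return float("inf"), []

    if __name__ == "__main__":
        # two domains "A*" and "B*" joined by inter-domain links A2-B1 and A3-B1
        topology = {
            "A1": {"A2": 1, "A3": 4},
            "A2": {"A3": 1, "B1": 5},
            "A3": {"B1": 3},
            "B1": {"B2": 2},
            "B2": {},
        }
        print(dijkstra(topology, "A1", "B2"))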

  15. Software Architecture Reliability Analysis using Failure Scenarios

    NARCIS (Netherlands)

    Tekinerdogan, B.; Sözer, Hasan; Aksit, Mehmet

    With the increasing size and complexity of software in embedded systems, software has now become a primary threat for the reliability. Several mature conventional reliability engineering techniques exist in literature but traditionally these have primarily addressed failures in hardware components

  16. Biochemical Network Stochastic Simulator (BioNetS: software for stochastic modeling of biochemical networks

    Directory of Open Access Journals (Sweden)

    Elston Timothy C

    2004-03-01

    Full Text Available Abstract Background Intrinsic fluctuations due to the stochastic nature of biochemical reactions can have large effects on the response of biochemical networks. This is particularly true for pathways that involve transcriptional regulation, where generally there are two copies of each gene and the number of messenger RNA (mRNA) molecules can be small. Therefore, there is a need for computational tools for developing and investigating stochastic models of biochemical networks. Results We have developed the software package Biochemical Network Stochastic Simulator (BioNetS) for efficiently and accurately simulating stochastic models of biochemical networks. BioNetS has a graphical user interface that allows models to be entered in a straightforward manner, and allows the user to specify the type of random variable (discrete or continuous) for each chemical species in the network. The discrete variables are simulated using an efficient implementation of the Gillespie algorithm. For the continuous random variables, BioNetS constructs and numerically solves the appropriate chemical Langevin equations. The software package has been developed to scale efficiently with network size, thereby allowing large systems to be studied. BioNetS runs as a BioSpice agent and can be downloaded from http://www.biospice.org. BioNetS can also be run as a stand-alone package. All the required files are accessible from http://x.amath.unc.edu/BioNetS. Conclusions We have developed BioNetS to be a reliable tool for studying the stochastic dynamics of large biochemical networks. Important features of BioNetS are its ability to handle hybrid models that consist of both continuous and discrete random variables and its ability to model cell growth and division. We have verified the accuracy and efficiency of the numerical methods by considering several test systems.
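
    The Gillespie algorithm mentioned above can be sketched in a few lines; the example simulates a single mRNA species with constant production and first-order degradation. The rate constants and the model itself are illustrative assumptions, not a BioNetS model file.

    # Minimal sketch of the Gillespie stochastic simulation algorithm for
    # mRNA production (rate k) and degradation (rate g * m).
    import numpy as np

    def gillespie_mrna(k=10.0, g=1.0, t_end=10.0, seed=0):
        rng = np.random.default_rng(seed)
        t, m = 0.0, 0
        times, counts = [t], [m]
        while t < t_end:
            rates = np.array([k, g * m])          # propensities: production, degradation
            total = rates.sum()
            if total == 0:
                break
            t += rng.exponential(1.0 / total)     # waiting time to the next reaction
            if rng.random() < rates[0] / total:   # choose which reaction fires
                m += 1
            else:
                m -= 1
            times.append(t)
            counts.append(m)
        return times, counts

    if __name__ == "__main__":
        times, counts = gillespie_mrna()
        print("final time %.2f, final mRNA copy number %d" % (times[-1], counts[-1]))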

  17. A Preliminary Survey on the Security of Software-Defined Networks

    OpenAIRE

    Akbaş, Muhammet Fatih; KARAARSLAN, Enis; GÜNGÖR, Cengiz

    2016-01-01

    The number of devices connected to the Internet is increasing, data centers are growing continuously and computer networks are getting more complex. The traditional network management approach is becoming more difficult and insufficient. Software-Defined Networks (SDN) is a new-generation networking approach which is expected to take the place of traditional computer networks. The SDN architecture provides effective management of large and complex networks. Although SDN have benefits from the network s...

  18. Network performance analysis

    CERN Document Server

    Bonald, Thomas

    2013-01-01

    The book presents some key mathematical tools for the performance analysis of communication networks and computer systems. Communication networks and computer systems have become extremely complex. The statistical resource sharing induced by the random behavior of users and the underlying protocols and algorithms may affect Quality of Service. This book introduces the main results of queuing theory that are useful for analyzing the performance of these systems. These mathematical tools are key to the development of robust dimensioning rules and engineering methods. A number of examples i

  19. Mapping Pedagogical Opportunities Provided by Mathematics Analysis Software

    Science.gov (United States)

    Pierce, Robyn; Stacey, Kaye

    2010-01-01

    This paper proposes a taxonomy of the pedagogical opportunities that are offered by mathematics analysis software such as computer algebra systems, graphics calculators, dynamic geometry or statistical packages. Mathematics analysis software is software for purposes such as calculating, drawing graphs and making accurate diagrams. However, its…

  20. Software Process Improvement Using Force Field Analysis ...

    African Journals Online (AJOL)

    Software process improvement is a necessity especially since the dynamic nature of today's hardware demands reciprocal improvements in the underlying software systems. Several process improvement models exist where organizations perform an introspective study of the current software development process and ...

  1. SUBSONIC WIND TUNNEL PERFORMANCE ANALYSIS SOFTWARE

    Science.gov (United States)

    Eckert, W. T.

    1994-01-01

    This program was developed as an aid in the design and analysis of subsonic wind tunnels. It brings together and refines previously scattered and over-simplified techniques used for the design and loss prediction of the components of subsonic wind tunnels. It implements a system of equations for determining the total pressure losses and provides general guidelines for the design of diffusers, contractions, corners and the inlets and exits of non-return tunnels. The algorithms used in the program are applicable to compressible flow through most closed- or open-throated, single-, double- or non-return wind tunnels or ducts. A comparison between calculated performance and that actually achieved by several existing facilities produced generally good agreement. Any system through which air is flowing which involves turns, fans, contractions etc. (e.g., an HVAC system) may benefit from analysis using this software. This program is an update of ARC-11138 which includes PC compatibility and an improved user interface. The method of loss analysis used by the program is a synthesis of theoretical and empirical techniques. Generally, the algorithms used are those which have been substantiated by experimental test. The basic flow-state parameters used by the program are determined from input information about the reference control section and the test section. These parameters were derived from standard relationships for compressible flow. The local flow conditions, including Mach number, Reynolds number and friction coefficient are determined for each end of each component or section. The loss in total pressure caused by each section is calculated in a form non-dimensionalized by local dynamic pressure. The individual losses are based on the nature of the section, local flow conditions and input geometry and parameter information. The loss forms for typical wind tunnel sections considered by the program include: constant area ducts, open throat ducts, contractions, constant
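
    The bookkeeping at the heart of the loss calculation can be sketched as follows: each section contributes a loss coefficient referenced to its local dynamic pressure, and the section losses are summed. The section list, loss coefficients and sea-level air density are illustrative assumptions and stand in for the program's empirical correlations.

    # Minimal sketch: total-pressure loss as the sum of K_i * q_i over sections,
    # where q_i = 0.5 * rho * V_i**2 is the local dynamic pressure.
    RHO = 1.225  # air density, kg/m^3 (sea-level assumption)

    def dynamic_pressure(velocity_ms):
        return 0.5 * RHO * velocity_ms ** 2

    def total_pressure_loss(sections):
        """sections: list of (loss coefficient K, local velocity in m/s) pairs."""
        return sum(K * dynamic_pressure(v) for K, v in sections)

    if __name__ == "__main__":
        sections = [(0.02, 60.0),   # contraction
                    (0.15, 25.0),   # diffuser
                    (0.20, 20.0)]   # corner with turning vanes
        print("total pressure loss: %.1f Pa" % total_pressure_loss(sections))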

  2. Visual querying and analysis of large software repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    We present a software framework for mining software repositories. Our extensible framework enables the integration of data extraction from repositories with data analysis and interactive visualization. We demonstrate the applicability of the framework by presenting several case studies performed on

  3. Formal assessment instrument for ensuring the security of NASA's networks, systems and software

    Science.gov (United States)

    Gilliam, D. P.; Powell, J. D.; Sherif, J.

    2002-01-01

    To address the problem of security for NASA's networks, systems and software, NASA has funded the Jet Propulsion Lab in conjunction with UC Davis to begin work on developing a software security assessment instrument for use in the software development and maintenance life cycle.

  4. Security Policy Scheme for an Efficient Security Architecture in Software-Defined Networking

    Directory of Open Access Journals (Sweden)

    Woosik Lee

    2017-06-01

    Full Text Available In order to build an efficient security architecture, previous studies have attempted to understand complex system architectures and message flows to detect various attack packets. However, the existing hardware-based single security architecture cannot efficiently handle a complex system structure. To solve this problem, we propose a software-defined networking (SDN) policy-based scheme for an efficient security architecture. The proposed scheme considers four policy functions: separating, chaining, merging, and reordering. If SDN network functions virtualization (NFV) system managers use these policy functions to deploy a security architecture, they only submit some of the requirement documents to the SDN policy-based architecture. After that, the entire security network can be easily built. This paper presents information about the design of a new policy functions model, and it discusses the performance of this model using theoretical analysis.

  5. Development of a Software Based Firewall System for Computer Network Traffic Control

    Directory of Open Access Journals (Sweden)

    Ikhajamgbe OYAKHILOME

    2009-12-01

    Full Text Available The connection of an internal network to an external network such as the Internet has made it vulnerable to attacks. One class of network attack is unauthorized penetration into the network due to the openness of networks. It is possible for hackers to gain access to an internal network, and this poses great danger to the network and network resources. Our objective and major concern in the network design was to build a secured network, based on a software firewall, that ensures the integrity and confidentiality of information on the network. We studied several mechanisms to achieve this; one such mechanism is the implementation of a firewall system as a network defence. Our developed firewall has the ability to determine which network traffic should be allowed in or out of the network. Part of our work was also channelled towards a comprehensive study of hardware firewall security systems, with the aim of developing this software-based firewall system. Our software firewall goes a long way in protecting an internal network from external unauthorized traffic penetration. We also included antivirus software, which is lacking in most firewalls.
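
    The core allow/deny decision of such a software firewall can be sketched as an ordered rule match with a default-deny fallback; the rule format, networks and ports below are illustrative assumptions, not the system described in the paper.

    # Minimal sketch of packet filtering: first matching rule wins, default deny.
    import ipaddress

    RULES = [
        # (action, source network, destination port or None for "any")
        ("allow", ipaddress.ip_network("192.168.1.0/24"), 80),
        ("allow", ipaddress.ip_network("192.168.1.0/24"), 443),
        ("deny",  ipaddress.ip_network("0.0.0.0/0"),      None),   # catch-all deny
    ]

    def decide(src_ip, dst_port):
        src = ipaddress.ip_address(src_ip)
        for action, net, port in RULES:          # evaluate rules in order
            if src in net and (port is None or port == dst_port):
                return action
        return "deny"                            # deny if nothing matched

    if __name__ == "__main__":
        print(decide("192.168.1.10", 80))   # allow
        print(decide("10.0.0.5", 22))       # deny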

  6. Cross-layer restoration with software defined networking based on IP over optical transport networks

    Science.gov (United States)

    Yang, Hui; Cheng, Lei; Deng, Junni; Zhao, Yongli; Zhang, Jie; Lee, Young

    2015-10-01

    The IP over optical transport network is a very promising networking architecture for the interconnection of geographically distributed data centers, due to its performance guarantees of low delay, huge bandwidth and high reliability at low cost. It can enable efficient resource utilization and support heterogeneous bandwidth demands in a highly available, cost-effective and energy-efficient manner. In the case of a cross-layer link failure, ensuring a high level of quality of service (QoS) for user requests after the failure has become a research focus. In this paper, we propose a novel cross-layer restoration scheme for data center services with software defined networking based on IP over optical networks. The cross-layer restoration scheme enables joint optimization of IP network and optical network resources, and enhances the responsiveness of data center service restoration to dynamic end-to-end service demands. We quantitatively evaluate the feasibility and performance through simulation under a heavy traffic load scenario, in terms of path blocking probability and path restoration latency. Numerical results show that the cross-layer restoration scheme improves the recovery success rate and minimizes the overall recovery time.

  7. DETECTION OF MALICIOUS SOFTWARE USING CLASSICAL AND NEURAL NETWORK CLASSIFICATION METHODS

    Directory of Open Access Journals (Sweden)

    S. V. Zhernakov

    2015-01-01

    Full Text Available Formulation of the problem: the spectrum of problems solved by modern mobile systems such as Android is constantly growing. This is driven, on the one hand, by the potential capabilities implemented in hardware and, on the other, by their integration with modern information technologies, which harmoniously complement each other to create powerful hardware and software information systems capable of performing many functions, including the protection of information. The increasing flow of information and the growing complexity of the processes and of the hardware and software components of devices such as Android force developers to create new means of protection that perform this process efficiently and with high quality. This is especially important in the development of automated tools that perform classification (clustering) of existing software into two classes: safe and malicious software. The aim is to increase the reliability and quality of recognition achieved by modern built-in information security tools, and to justify and select the methods for carrying out these functions. Methods used: to accomplish the goal, classical classification methods, neural network methods based on standard architectures, and the support vector machine (SVM) are analyzed and applied. Novelty: the paper presents the concept of using support vector machines to identify malicious software, and develops the methodological, algorithmic and software support that implements this concept for mobile communication devices. Result: qualitative and quantitative characteristics of the security software are obtained. Practical value: a technique for developing advanced information security systems in mobile environments such as Android. An approach to describing malware behavior is presented (based on the following virus life cycle: dormant - wake-up - analysis of weaknesses - action: normal operation or attack (threat)).
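
    The SVM classification step can be sketched as follows with scikit-learn; the synthetic feature vectors and labels stand in for real application features (e.g. permission counts or API-call statistics) and are illustrative assumptions, not the paper's dataset or architecture.

    # Minimal sketch: train an SVM to separate "safe" (0) from "malicious" (1) samples.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    # synthetic data: 200 apps, 5 numeric features; malicious apps come from a shifted distribution
    safe = rng.normal(0.0, 1.0, size=(100, 5))
    malicious = rng.normal(1.5, 1.0, size=(100, 5))
    X = np.vstack([safe, malicious])
    y = np.array([0] * 100 + [1] * 100)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)
    print("test accuracy: %.2f" % clf.score(X_test, y_test))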

  8. Customer configuration updating in a software supply network

    NARCIS (Netherlands)

    Jansen, S.R.L.

    2007-01-01

    Product software development is the activity of development, modification, reuse, re-engineering, maintenance, or any other activities that result in packaged configurations of software components or software-based services that are released for and traded in a specific market (Xu and Brinkkemper).

  9. ICCE Policy Statement on Network and Multiple Machine Software.

    Science.gov (United States)

    International Council for Computers in Education, Eugene, OR.

    Designed to provide educators with guidance for the lawful reproduction of computer software, this document contains suggested guidelines, sample forms, and several short articles concerning software copyright and license agreements. The initial policy statement calls for educators to provide software developers (or their agents) with a…

  10. Software patterns, knowledge maps, and domain analysis

    CERN Document Server

    Fayad, Mohamed E; Hegde, Srikanth GK; Basia, Anshu; Vakil, Ashka

    2014-01-01

    Preface; Acknowledgments; Authors; Introduction: An Overview of Knowledge Maps; Key Concepts: Software Stable Models, Knowledge Maps, Pattern Language, Goals, Capabilities (Enduring Business Themes + Business Objects); The Motivation; The Problem; The Objectives; Overview of Software Stability Concepts; Overview of Knowledge Maps; Pattern Languages versus Knowledge Maps: A Brief Comparison; The Solution; Knowledge Maps Methodology or Concurrent Software Development Model; Why Knowledge Maps?; Research Methodology Undertaken; Research Verification and Validation; The Stratification of This Book; Summary

  11. Software Reliability Analysis of NASA Space Flight Software: A Practical Experience.

    Science.gov (United States)

    Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S; Mcginnis, Issac

    2016-01-01

    In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning across all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, spanning from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes for the same. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Also such releases seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. Such releases exhibit poor reliability growth, and hence exhibit high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to the management to improve the software development process. As NASA has moved towards a product line engineering for its flight software development, software for future space missions will be developed in a similar manner and hence the analysis results for this mission can be considered as a baseline for future flight software missions.
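
    As an illustration of the model fitting involved, the sketch below fits the delayed S-shaped NHPP mean-value function m(t) = a(1 - (1 + bt)e^(-bt)) to cumulative defect counts with non-linear least squares. The defect numbers are synthetic and illustrative; they are not the mission data analysed in the paper.

    # Minimal sketch of fitting an S-shaped software reliability growth model.
    import numpy as np
    from scipy.optimize import curve_fit

    def s_shaped(t, a, b):
        # delayed S-shaped NHPP mean value function
        return a * (1.0 - (1.0 + b * t) * np.exp(-b * t))

    # weeks since the start of testing and cumulative defects found (synthetic)
    weeks = np.arange(1, 13, dtype=float)
    defects = np.array([2, 6, 13, 22, 32, 41, 48, 53, 57, 59, 61, 62], dtype=float)

    (a_hat, b_hat), _ = curve_fit(s_shaped, weeks, defects, p0=[70.0, 0.3])
    print("estimated total defects a = %.1f, detection rate b = %.2f" % (a_hat, b_hat))
    print("predicted residual defects: %.1f" % (a_hat - defects[-1]))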

  12. Simulasi Virtual Local Area Network (VLAN Berbasis Software Defined Network (SDN Menggunakan POX Controller

    Directory of Open Access Journals (Sweden)

    Rohmat Tulloh

    2015-11-01

    Full Text Available VLAN (Virtual LAN merupakan sebuah teknologi yang dapat mengkonfigurasi jaringan logis independen dari struktur jaringan fisik. Hasil dari penelitian sebelumnya sudah diprediksi bahwa dibutuhkan Virtual Network yang akhirnya terciptalah VLAN. Namun paradigma jaringan saat ini tidak flexible, ketergantungan terhadap vendor sangat besar karena fungsi data plane dan control plane berada dalam satu paket device. SDN (Software defined network yang merupakan salahsatu evolusi teknologi jaringan sesuai dengan tuntutan yang berkembang dimana memisahkan fungsi data plane dan control plane pada suatu perangkat. POX Controller digunakan untuk men-simulasikan dan menguji Platform SDN (Software defined network. Pada penelitian ini menggunakan Openflow versi 1.0 untuk memasang header VLAN sehingga penelitian ini difokuskan untuk mengevaluasi performa forwarding VLAN yang memanfaatkan Openflow sebagai control plane dapat berfungsi dengan baik. Hasil penelitian ini mengusulkan penerapan karakteristik teknologi VLAN pada SDN karena telah berjalan dengan benar sesuai hasil pengujian konektifitas, verifikasi dan keamanan. Kemudian hasil pengujian lanjutan untuk melihat pengaruh SDN dengan skenario penambahan jumlah VLAN ID didapatkan bahwa set-up time akan bertambah seiring meningkatnya jumlah host dan dengan menggunakan protokol OpenFlow, latency yang terjadi di jaringan dapat dipantau dengan parameter round trip time (RTT yang stabil direntang 0,2 sampai 6 second walaupun jumlah vlan_id dan background traffic bertambah.

  13. Software Defined Networking for Improved Wireless Sensor Network Management: A Survey

    Science.gov (United States)

    Ndiaye, Musa; Hancke, Gerhard P.; Abu-Mahfouz, Adnan M.

    2017-01-01

    Wireless sensor networks (WSNs) are becoming increasingly popular with the advent of the Internet of things (IoT). Various real-world applications of WSNs such as in smart grids, smart farming and smart health would require a potential deployment of thousands or maybe hundreds of thousands of sensor nodes/actuators. To ensure proper working order and network efficiency of such a network of sensor nodes, an effective WSN management system has to be integrated. However, the inherent challenges of WSNs such as sensor/actuator heterogeneity, application dependency and resource constraints have led to challenges in implementing effective traditional WSN management. This difficulty in management increases as the WSN becomes larger. Software Defined Networking (SDN) provides a promising solution for flexible management of WSNs by allowing the separation of the control logic from the sensor nodes/actuators. The advantage with this SDN-based management in WSNs is that it enables centralized control of the entire WSN, making it simpler to deploy network-wide management protocols and applications on demand. This paper highlights some of the recent work on traditional WSN management in brief and reviews SDN-based management techniques for WSNs in greater detail while drawing attention to the advantages that SDN brings to traditional WSN management. This paper also investigates open research challenges in coming up with mechanisms for flexible and easier SDN-based WSN configuration and management. PMID:28471390

  14. A Security Assessment Mechanism for Software-Defined Networking-Based Mobile Networks

    Directory of Open Access Journals (Sweden)

    Shibo Luo

    2015-12-01

    Full Text Available Software-Defined Networking-based Mobile Networks (SDN-MNs) are considered the future of 5G mobile network architecture. With the evolving cyber-attack threat, security assessments need to be performed in the network management. Due to the distinctive features of SDN-MNs, such as their dynamic nature and complexity, traditional network security assessment methodologies cannot be applied directly to SDN-MNs, and a novel security assessment methodology is needed. In this paper, an effective security assessment mechanism based on attack graphs and an Analytic Hierarchy Process (AHP) is proposed for SDN-MNs. Firstly, this paper discusses the security assessment problem of SDN-MNs and proposes a methodology using attack graphs and AHP. Secondly, to address the diversity and complexity of SDN-MNs, a novel attack graph definition and attack graph generation algorithm are proposed. In order to quantify security levels, the Node Minimal Effort (NME) is defined to quantify attack cost and derive system security levels based on NME. Thirdly, to calculate the NME of an attack graph that takes the dynamic factors of SDN-MN into consideration, we use AHP integrated with the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) as the methodology. Finally, we offer a case study to validate the proposed methodology. The case study and evaluation show the advantages of the proposed security assessment mechanism.
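
    The AHP step used to weight the dynamic factors can be sketched as computing the principal eigenvector of a reciprocal pairwise comparison matrix; the 3x3 matrix and the criteria named in the comments are illustrative assumptions, not the paper's data.

    # Minimal sketch of deriving AHP priority weights from a pairwise comparison matrix.
    import numpy as np

    def ahp_weights(pairwise):
        """pairwise: reciprocal comparison matrix (a_ij = importance of i over j)."""
        values, vectors = np.linalg.eig(pairwise)
        principal = np.argmax(values.real)                 # principal eigenvalue
        w = np.abs(vectors[:, principal].real)
        return w / w.sum()                                  # normalised priority vector

    if __name__ == "__main__":
        # hypothetical criteria: attack cost, exploit availability, asset value
        A = np.array([[1.0,   3.0, 5.0],
                      [1/3.0, 1.0, 2.0],
                      [1/5.0, 1/2.0, 1.0]])
        print("priority weights:", np.round(ahp_weights(A), 3))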

  15. Software Defined Networking for Improved Wireless Sensor Network Management: A Survey.

    Science.gov (United States)

    Ndiaye, Musa; Hancke, Gerhard P; Abu-Mahfouz, Adnan M

    2017-05-04

    Wireless sensor networks (WSNs) are becoming increasingly popular with the advent of the Internet of things (IoT). Various real-world applications of WSNs such as in smart grids, smart farming and smart health would require a potential deployment of thousands or maybe hundreds of thousands of sensor nodes/actuators. To ensure proper working order and network efficiency of such a network of sensor nodes, an effective WSN management system has to be integrated. However, the inherent challenges of WSNs such as sensor/actuator heterogeneity, application dependency and resource constraints have led to challenges in implementing effective traditional WSN management. This difficulty in management increases as the WSN becomes larger. Software Defined Networking (SDN) provides a promising solution for flexible management of WSNs by allowing the separation of the control logic from the sensor nodes/actuators. The advantage with this SDN-based management in WSNs is that it enables centralized control of the entire WSN, making it simpler to deploy network-wide management protocols and applications on demand. This paper highlights some of the recent work on traditional WSN management in brief and reviews SDN-based management techniques for WSNs in greater detail while drawing attention to the advantages that SDN brings to traditional WSN management. This paper also investigates open research challenges in coming up with mechanisms for flexible and easier SDN-based WSN configuration and management.

  16. A Security Assessment Mechanism for Software-Defined Networking-Based Mobile Networks.

    Science.gov (United States)

    Luo, Shibo; Dong, Mianxiong; Ota, Kaoru; Wu, Jun; Li, Jianhua

    2015-12-17

    Software-Defined Networking-based Mobile Networks (SDN-MNs) are considered the future of 5G mobile network architecture. With the evolving cyber-attack threat, security assessments need to be performed in the network management. Due to the distinctive features of SDN-MNs, such as their dynamic nature and complexity, traditional network security assessment methodologies cannot be applied directly to SDN-MNs, and a novel security assessment methodology is needed. In this paper, an effective security assessment mechanism based on attack graphs and an Analytic Hierarchy Process (AHP) is proposed for SDN-MNs. Firstly, this paper discusses the security assessment problem of SDN-MNs and proposes a methodology using attack graphs and AHP. Secondly, to address the diversity and complexity of SDN-MNs, a novel attack graph definition and attack graph generation algorithm are proposed. In order to quantify security levels, the Node Minimal Effort (NME) is defined to quantify attack cost and derive system security levels based on NME. Thirdly, to calculate the NME of an attack graph that takes the dynamic factors of SDN-MN into consideration, we use AHP integrated with the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) as the methodology. Finally, we offer a case study to validate the proposed methodology. The case study and evaluation show the advantages of the proposed security assessment mechanism.

  17. Software Defined Networking for Improved Wireless Sensor Network Management: A Survey

    Directory of Open Access Journals (Sweden)

    Musa Ndiaye

    2017-05-01

    Full Text Available Wireless sensor networks (WSNs) are becoming increasingly popular with the advent of the Internet of things (IoT). Various real-world applications of WSNs such as in smart grids, smart farming and smart health would require a potential deployment of thousands or maybe hundreds of thousands of sensor nodes/actuators. To ensure proper working order and network efficiency of such a network of sensor nodes, an effective WSN management system has to be integrated. However, the inherent challenges of WSNs such as sensor/actuator heterogeneity, application dependency and resource constraints have led to challenges in implementing effective traditional WSN management. This difficulty in management increases as the WSN becomes larger. Software Defined Networking (SDN) provides a promising solution for flexible management of WSNs by allowing the separation of the control logic from the sensor nodes/actuators. The advantage with this SDN-based management in WSNs is that it enables centralized control of the entire WSN, making it simpler to deploy network-wide management protocols and applications on demand. This paper highlights some of the recent work on traditional WSN management in brief and reviews SDN-based management techniques for WSNs in greater detail while drawing attention to the advantages that SDN brings to traditional WSN management. This paper also investigates open research challenges in coming up with mechanisms for flexible and easier SDN-based WSN configuration and management.

  18. Exploratory social network analysis with Pajek. - 2nd ed.

    NARCIS (Netherlands)

    de Nooy, W.; Mrvar, A.; Batagelj, V.

    2011-01-01

    This is an extensively revised and expanded second edition of the successful textbook on social network analysis integrating theory, applications, and network analysis using Pajek. The main structural concepts and their applications in social research are introduced with exercises. Pajek software

  19. Software ecosystems analyzing and managing business networks in the software industry

    CERN Document Server

    Jansen, S; Cusumano, MA

    2013-01-01

    This book describes the state-of-the-art of software ecosystems. It constitutes a fundamental step towards an empirically based, nuanced understanding of the implications for management, governance, and control of software ecosystems. This is the first book of its kind dedicated to this emerging field and offers guidelines on how to analyze software ecosystems; methods for managing and growing; methods on transitioning from a closed software organization to an open one; and instruments for dealing with open source, licensing issues, product management and app stores. It is unique in bringing t

  20. An Analysis of Software Design Methodologies

    Science.gov (United States)

    1979-08-01

    the second can be characterized as "mechanical" or "algorithmic". Duncker (1945) extended this aspect of problem solving by demonstrating that a... 1972. Davis, C. G., & Vick, C. R. The software development system. IEEE Transactions on Software Engineering, 1977, SE-3, 69-84. Duncker, K. On

  1. Linear Programming Approaches for Power Savings in Software-defined Networks

    NARCIS (Netherlands)

    Moghaddam, F.A.; Grosso, P.

    2016-01-01

    Software-defined networks have been proposed as a viable solution to decrease the power consumption of the networking component in data center networks. Still the question remains on which scheduling algorithms are most suited to achieve this goal. We propose 4 different linear programming
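
    One possible linear-programming formulation in this spirit is sketched below: split a traffic demand across candidate links so that a linear power cost is minimised subject to link capacities. The link set, capacities and power coefficients are illustrative assumptions and do not reproduce any of the paper's formulations.

    # Minimal sketch of power-aware traffic splitting as a linear program.
    import numpy as np
    from scipy.optimize import linprog

    demand = 120.0                               # Gb/s to be carried
    capacity = np.array([100.0, 80.0, 60.0])     # per-link capacity, Gb/s
    power_per_gbps = np.array([1.0, 1.5, 3.0])   # W per Gb/s carried (assumed linear model)

    # variables: f_i = traffic on link i; minimise sum(power_per_gbps * f)
    result = linprog(c=power_per_gbps,
                     A_eq=np.ones((1, 3)), b_eq=[demand],     # all demand must be carried
                     bounds=[(0, cap) for cap in capacity])   # respect link capacities
    print("traffic split (Gb/s):", np.round(result.x, 1))
    print("total power (W): %.1f" % result.fun)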

  2. Design of a stateless low-latency router architecture for green software-defined networking

    DEFF Research Database (Denmark)

    Saldaña Cercos, Silvia; Ramos, Ramon M.; Eller, Ana C. Ewald

    2015-01-01

    Expanding software defined networking (SDN) to transport networks requires new strategies to deal with the large number of flows that future core networks will have to face. New south-bound protocols within SDN have been proposed to benefit from having control plane detached from the data plane...

  3. Reference Architecture for Multi-Layer Software Defined Optical Data Center Networks

    Directory of Open Access Journals (Sweden)

    Casimer DeCusatis

    2015-09-01

    Full Text Available As cloud computing data centers grow larger and networking devices proliferate, many complex issues arise in the network management architecture. We propose a framework for multi-layer, multi-vendor optical network management using open standards-based software defined networking (SDN). Experimental results are demonstrated in a test bed consisting of three data centers interconnected by a 125 km metropolitan area network, running OpenStack with KVM and VMware components. Use cases include inter-data center connectivity via a packet-optical metropolitan area network, intra-data center connectivity using an optical mesh network, and SDN coordination of networking equipment within and between multiple data centers. We create and demonstrate original software to implement virtual network slicing and affinity policy-as-a-service offerings. Enhancements to synchronous storage backup, cloud exchanges, and Fibre Channel over Ethernet topologies are also discussed.

  4. Monitoring and Discovery for Self-Organized Network Management in Virtualized and Software Defined Networks

    Directory of Open Access Journals (Sweden)

    Ángel Leonardo Valdivieso Caraguay

    2017-03-01

    Full Text Available This paper presents the Monitoring and Discovery Framework of the Self-Organized Network Management in Virtualized and Software Defined Networks SELFNET project. This design takes into account the scalability and flexibility requirements needed by 5G infrastructures. In this context, the present framework focuses on gathering and storing the information (low-level metrics) related to physical and virtual devices, cloud environments, flow metrics, SDN traffic and sensors. Similarly, it provides the monitoring data as a generic information source in order to allow the correlation and aggregation tasks. Our design enables the collection and storing of information provided by all the underlying SELFNET sublayers, including the dynamically onboarded and instantiated SDN/NFV Apps, also known as SELFNET sensors.

  5. Monitoring and Discovery for Self-Organized Network Management in Virtualized and Software Defined Networks.

    Science.gov (United States)

    Caraguay, Ángel Leonardo Valdivieso; Villalba, Luis Javier García

    2017-03-31

    This paper presents the Monitoring and Discovery Framework of the Self-Organized Network Management in Virtualized and Software Defined Networks SELFNET project. This design takes into account the scalability and flexibility requirements needed by 5G infrastructures. In this context, the present framework focuses on gathering and storing the information (low-level metrics) related to physical and virtual devices, cloud environments, flow metrics, SDN traffic and sensors. Similarly, it provides the monitoring data as a generic information source in order to allow the correlation and aggregation tasks. Our design enables the collection and storing of information provided by all the underlying SELFNET sublayers, including the dynamically onboarded and instantiated SDN/NFV Apps, also known as SELFNET sensors.

  6. Monitoring and Discovery for Self-Organized Network Management in Virtualized and Software Defined Networks

    Science.gov (United States)

    Valdivieso Caraguay, Ángel Leonardo; García Villalba, Luis Javier

    2017-01-01

    This paper presents the Monitoring and Discovery Framework of the Self-Organized Network Management in Virtualized and Software Defined Networks SELFNET project. This design takes into account the scalability and flexibility requirements needed by 5G infrastructures. In this context, the present framework focuses on gathering and storing the information (low-level metrics) related to physical and virtual devices, cloud environments, flow metrics, SDN traffic and sensors. Similarly, it provides the monitoring data as a generic information source in order to allow the correlation and aggregation tasks. Our design enables the collection and storing of information provided by all the underlying SELFNET sublayers, including the dynamically onboarded and instantiated SDN/NFV Apps, also known as SELFNET sensors. PMID:28362346

  7. Cloud-Centric and Logically Isolated Virtual Network Environment Based on Software-Defined Wide Area Network

    Directory of Open Access Journals (Sweden)

    Dongkyun Kim

    2017-12-01

    Full Text Available Recent development of distributed cloud environments requires advanced network infrastructure in order to facilitate network automation, virtualization, high performance data transfer, and secured access of end-to-end resources across regional boundaries. In order to meet these innovative cloud networking requirements, software-defined wide area network (SD-WAN) is primarily demanded to converge distributed cloud resources (e.g., virtual machines (VMs)) in a programmable and intelligent manner over distant networks. Therefore, this paper proposes a logically isolated networking scheme designed to integrate distributed cloud resources to dynamic and on-demand virtual networking over SD-WAN. The performance evaluation and experimental results of the proposed scheme indicate that virtual network convergence time is minimized in two different network models: (1) an operating OpenFlow-oriented SD-WAN infrastructure (KREONET-S), which is deployed on the advanced national research network in Korea, and (2) Mininet-based experimental and emulated networks.

  8. Risk Analysis and Decision-Making Software Package (1997 Version) User Manual

    Energy Technology Data Exchange (ETDEWEB)

    Chung, F.T.H.

    1999-02-11

    This manual provides instructions for using the U.S. Department of Energy's (DOE) risk analysis and decision making software (1997 version) developed at BDM Petroleum Technologies by BDM-Oklahoma, Inc. for DOE, under contract No. DE-AC22-94PC91OO8. This software provides petroleum producers with a simple, handy tool for exploration and production risk analysis and decision-making. It collects useful risk analysis tools in one package so that users do not have to use several programs separately. The software is simple to use, but still provides many functions. The 1997 version of the software package includes the following tools: (1) Investment risk (Gambler's ruin) analysis; (2) Monte Carlo simulation; (3) Best fit for distribution functions; (4) Sample and rank correlation; (5) Enhanced oil recovery method screening; and (6) artificial neural network. This software package is subject to change. Suggestions and comments from users are welcome and will be considered for future modifications and enhancements of the software. Please check the opening screen of the software for the current contact information. In the future, more tools will be added to this software package. This manual includes instructions on how to use the software but does not attempt to fully explain the theory and algorithms used to create it.
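
    The gambler's-ruin style of analysis offered by the package can be sketched as a Monte Carlo simulation of an exploration program that drills wells until it either goes broke or reaches a target capital; all the numbers below are illustrative assumptions, not defaults of the DOE software.

    # Minimal sketch of a Monte Carlo "gambler's ruin" estimate for drilling programs.
    import numpy as np

    def ruin_probability(budget=10.0, target=20.0, well_cost=1.0,
                         p_success=0.15, payoff=8.0, n_trials=10_000, seed=0):
        """Fraction of simulated programs that hit 0 before reaching `target`."""
        rng = np.random.default_rng(seed)
        ruined = 0
        for _ in range(n_trials):
            capital = budget
            while 0 < capital < target:
                capital -= well_cost                  # pay to drill one well
                if rng.random() < p_success:
                    capital += payoff                 # discovery pays off
            ruined += capital <= 0
        return ruined / n_trials

    if __name__ == "__main__":
        print("estimated ruin probability: %.3f" % ruin_probability())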

  9. A Service-Oriented Approach for Dynamic Chaining of Virtual Network Functions over Multi-Provider Software-Defined Networks

    Directory of Open Access Journals (Sweden)

    Barbara Martini

    2016-06-01

    Full Text Available Emerging technologies such as Software-Defined Networks (SDN) and Network Function Virtualization (NFV) promise to address cost reduction and flexibility in network operation while enabling innovative network service delivery models. However, operational network service delivery solutions still need to be developed that actually exploit these technologies, especially at the multi-provider level. Indeed, the implementation of network functions as software running over a virtualized infrastructure and provisioned on a service basis lets one envisage an ecosystem of network services that are dynamically and flexibly assembled by orchestrating Virtual Network Functions even across different provider domains, thereby coping with changeable user and service requirements and context conditions. In this paper we propose an approach that adopts Service-Oriented Architecture (SOA) technology-agnostic architectural guidelines in the design of a solution for orchestrating and dynamically chaining Virtual Network Functions. We discuss how SOA, NFV, and SDN may complement each other in realizing dynamic network function chaining through service composition specification, service selection, service delivery, and placement tasks. Then, we describe the architecture of a SOA-inspired NFV orchestrator, which leverages SDN-based network control capabilities to address an effective delivery of elastic chains of Virtual Network Functions. Preliminary results of prototype implementation and testing activities are also presented. The benefits that Network Service Providers derive from adaptive network service provisioning in a multi-provider environment, through the orchestration of computing and networking services to provide end users with an enhanced service experience, are also described.

  10. Networks at Their Limits: Software, Similarity, and Continuity in Vietnam

    Science.gov (United States)

    Nguyen, Lilly Uyen

    2013-01-01

    This dissertation explores the social worlds of pirated software discs and free/open source software in Vietnam to describe the practices of copying, evangelizing, and translation. This dissertation also reveals the cultural logics of similarity and continuity that sustain these social worlds. Taken together, this dissertation argues that the…

  11. Software-defined networking model for smart transformers with ISO/IEC/IEEE 21451 sensors

    Directory of Open Access Journals (Sweden)

    Longhua Guo

    2017-06-01

    Full Text Available The advanced IEC 61850 smart transformer has shown improved performance in monitoring, controlling, and protecting the equipment in smart substations. However, heterogeneity, feasibility, and network control problems have limited the smart transformer's performance in networks. To address these issues, a software-defined networking model was proposed using ISO/IEC/IEEE 21451 networks. An IEC-61850-based network controller was designed as a new kind of intelligent electrical device (IED). The proposed data and information models enhanced the network awareness ability and facilitated the access of smart sensors in the transformer to communication networks. The performance evaluation results showed improved efficiency.

  12. Operationalization of Software-Defined Networks (SDN) Program Review

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — U.S. federal investments in networking research and technologies deployment have fostered and accelerated the development of the Internet from its inception. It is...

  13. CAX a software for automated spectrum analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zahn, Guilherme S.; Genezini, Frederico A., E-mail: gzahn@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (CRPq/IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro do Reator de Pesquisas

    2017-11-01

    In this work, the scripting capabilities of Genie-2000 were used to develop software that automatically analyses all spectrum files in either Ortec's CHN or Canberra's MCA or CNF formats in a folder, generating two output files: a print-ready text file (.DAT) and a Comma-Separated Values (.CSV) file which can be easily imported into any major spreadsheet software. This software, named CAX ('Convert and Analyse for eXcel'), uses Genie-2000's functions to import spectrum files into Genie's native CNF format and analyze the converted spectra. The software can also, if requested, import energy and FWHM calibrations from a stored calibrated spectrum. The print-ready output file (.DAT) is generated by Genie-2000 using a customized script, and the CSV file is generated by a custom-built DAT2CSV software which produces a CSV file that complies with the Brazilian standards, with commas as the decimal indicator and semicolons as field separators. This software is already used in the daily routines of IPEN's Neutron Activation Laboratory, greatly reducing the time required for sample analyses, as well as reducing the possibility of transcription errors. (author)
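
    The Brazilian CSV conventions mentioned above (comma as decimal indicator, semicolon as field separator) can be illustrated with a short sketch; the column names and values are hypothetical and this is not the DAT2CSV source code.

      # Write analysis results using comma as decimal mark and ';' as separator.
      # Column names and values are hypothetical.
      import csv

      def write_brazilian_csv(path, header, rows):
          with open(path, 'w', newline='', encoding='utf-8') as f:
              writer = csv.writer(f, delimiter=';')
              writer.writerow(header)
              for row in rows:
                  writer.writerow([('%g' % v).replace('.', ',')
                                   if isinstance(v, float) else v for v in row])

      write_brazilian_csv('results.csv',
                          ['peak', 'energy_keV', 'net_area'],
                          [[1, 1173.2, 45210.0], [2, 1332.5, 40012.5]])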

  14. Implementing Resource-aware Multicast Forwarding in Software Defined Networks

    DEFF Research Database (Denmark)

    Poderys, Justas; Sunny, Anjusha; Soler, José

    2018-01-01

    Defined Networks (SDN), all this information is available in a centralized entity - the SDN network. This work proposes to utilize the SDN paradigm to perform network-resource-aware multicast data routing in the SDN controller. In a prototype implementation, multicast data is routed using a modified Edmonds...

  15. Next Generation Static Software Analysis Tools (Dagstuhl Seminar 14352)

    OpenAIRE

    Cousot, Patrick; Kroening, Daniel; Sinz, Daniel

    2014-01-01

    There has been tremendous progress in static software analysis over the last years with, for example, refined abstract interpretation methods, the advent of fast decision procedures like SAT and SMT solvers, new approaches like software (bounded) model checking or CEGAR, or new problem encodings. We are now close to integrating these techniques into every programmer's toolbox. The aim of the seminar was to bring together developers of software analysis tools and algorithms, including ...

  16. Continuous software quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The software for the ATLAS experiment on the Large Hadron Collider at CERN has evolved over many years to meet the demands of Monte Carlo simulation, particle detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by an active worldwide developer community. In order to run the experiment software efficiently at hundreds of computing centres it is essential to maintain a high level of software quality standards. Methods are proposed to improve software quality practices by incorporating checks into the new ATLAS software build infrastructure.

  17. Mapping Social Network to Software Architecture to Detect Structure Clashes in Agile Software Development

    NARCIS (Netherlands)

    Amrit, Chintan Amrit; van Hillegersberg, Jos

    2007-01-01

    Software development is rarely an individual effort and generally involves teams of developers collaborating together in order to generate reliable code. Such collaborations require proper communication and regular coordination among the team members. In addition, coordination is required to sort

  18. Software Design Challenges in Time Series Prediction Systems Using Parallel Implementation of Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Narayanan Manikandan

    2016-01-01

    Full Text Available The software development life cycle has been characterized by destructive disconnects between activities like planning, analysis, design, and programming. Software developed for prediction-based results is a particularly big challenge for designers. Time series data forecasting, such as currency exchange, stock prices, and weather reports, are some of the areas where extensive research has been going on for the last three decades. In the early days, problems of financial analysis and prediction were solved with statistical models and methods. For the last two decades, a large number of Artificial Neural Network based learning models have been proposed to solve the problems of financial data and obtain accurate results in predicting future trends and prices. This paper addresses some architectural design related issues for performance improvement through vectorising the strengths of multivariate econometric time series models and Artificial Neural Networks. It provides an adaptive approach for predicting exchange rates, which can be called a hybrid methodology for predicting exchange rates. This framework is tested for the accuracy and performance of the parallel algorithms used.
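
    As a rough illustration of the sliding-window neural-network approach to exchange-rate forecasting described above, the sketch below trains a small multilayer perceptron on a synthetic rate series; the window length, network size, and data are assumptions, not the paper's parallel hybrid model.

      # Sliding-window forecast of a synthetic exchange-rate series.
      # Window length, network size and data are illustrative assumptions.
      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)
      rate = 1.10 + np.cumsum(rng.normal(0, 0.002, 1000))     # fake daily rates

      window = 5                                              # past rates as input
      X = np.array([rate[i:i + window] for i in range(len(rate) - window)])
      y = rate[window:]

      split = 800
      model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000,
                           random_state=0)
      model.fit(X[:split], y[:split])
      mae = np.abs(model.predict(X[split:]) - y[split:]).mean()
      print('test MAE:', mae)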

  19. Managing Software Project Risks (Analysis Phase) with Proposed Fuzzy Regression Analysis Modelling Techniques with Fuzzy Concepts

    OpenAIRE

    Elzamly, Abdelrafe; Hussin, Burairah

    2014-01-01

    The aim of this paper is to propose new mining techniques by which we can study the impact of different risk management techniques and different software risk factors on software analysis development projects. The new mining technique uses the fuzzy multiple regression analysis techniques with fuzzy concepts to manage the software risks in a software project and mitigating risk with software process improvement. Top ten software risk factors in analysis phase and thirty risk management techni...

  20. Software Piracy in Research: A Moral Analysis.

    Science.gov (United States)

    Santillanes, Gary; Felder, Ryan Marshall

    2015-08-01

    Researchers in virtually every discipline rely on sophisticated proprietary software for their work. However, some researchers are unable to afford the licenses and instead procure the software illegally. We discuss the prohibition of software piracy by intellectual property laws, and argue that the moral basis for the copyright law offers the possibility of cases where software piracy may be morally justified. The ethics codes that scientific institutions abide by are informed by a rule-consequentialist logic: by preserving personal rights to authored works, people able to do so will be incentivized to create. By showing that the law has this rule-consequentialist grounding, we suggest that scientists who blindly adopt their institutional ethics codes will commit themselves to accepting that software piracy could be morally justified, in some cases. We hope that this conclusion will spark debate over important tensions between ethics codes, copyright law, and the underlying moral basis for these regulations. We conclude by offering practical solutions (other than piracy) for researchers.

  1. Computer software for process hazards analysis.

    Science.gov (United States)

    Hyatt, N

    2000-10-01

    Computerized software tools are assuming major significance in conducting HAZOPs. This is because they have the potential to offer better online presentations and performance to HAZOP teams, as well as better documentation and downstream tracking. The chances of something being "missed" are greatly reduced. We know, only too well, that HAZOP sessions can be like the industrial equivalent of a trip to the dentist. Sessions can (and usually do) become arduous and painstaking. To make the process easier for all those involved, we need all the help computerized software can provide. In this paper I have outlined the challenges addressed in the production of Windows software for performing HAZOP and other forms of PHA. The object is to produce more "intelligent", more user-friendly software for performing HAZOP where technical interaction between team members is of key significance. HAZOP techniques, having already proven themselves, are extending into the field of computer control and human error. This makes further demands on HAZOP software and emphasizes its importance.

  2. Multidimensional Analysis of Linguistic Networks

    Science.gov (United States)

    Araújo, Tanya; Banisch, Sven

    Network-based approaches play an increasingly important role in the analysis of data even in systems in which a network representation is not immediately apparent. This is particularly true for linguistic networks, which are typically induced from a linguistic data set for which a network perspective is only one of several options for representation. Here we introduce a multidimensional framework for network construction and analysis with special focus on linguistic networks. This framework is used to show that the higher the abstraction level of network induction, the harder the interpretation of the topological indicators used in network analysis. Several examples are provided, allowing for the comparison of different linguistic networks as well as with networks in other fields of application of network theory. The computation and the intelligibility of some statistical indicators frequently used in linguistic networks are discussed. This suggests that the field of linguistic networks, by applying statistical tools inspired by network studies in other domains, may, in its current state, have only a limited contribution to the development of linguistic theory.

  3. Software for Allocating Resources in the Deep Space Network

    Science.gov (United States)

    Wang, Yeou-Fang; Borden, Chester; Zendejas, Silvino; Baldwin, John

    2003-01-01

    TIGRAS 2.0 is a computer program designed to satisfy a need for improved means for analyzing the tracking demands of interplanetary space-flight missions upon the set of ground antenna resources of the Deep Space Network (DSN) and for allocating those resources. Written in Microsoft Visual C++, TIGRAS 2.0 provides a single rich graphical analysis environment for use by diverse DSN personnel, by connecting to various data sources (relational databases or files) based on the stages of the analyses being performed. Notable among the algorithms implemented by TIGRAS 2.0 are a DSN antenna-load-forecasting algorithm and a conflict-aware DSN schedule-generating algorithm. Computers running TIGRAS 2.0 can also be connected using SOAP/XML to a Web services server that provides analysis services via the World Wide Web. TIGRAS 2.0 supports multiple windows and multiple panes in each window for users to view and use information, all in the same environment, to eliminate repeated switching among various application programs and Web pages. TIGRAS 2.0 enables the use of multiple windows for various requirements, trajectory-based time intervals during which spacecraft are viewable, ground resources, forecasts, and schedules. Each window includes a time navigation pane, a selection pane, a graphical display pane, a list pane, and a statistics pane.

  4. Automated Analysis of Security in Networking Systems

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2004-01-01

    It has for a long time been a challenge to build secure networking systems. One way to counter this problem is to provide developers of software applications for networking systems with easy-to-use tools that can check security properties before the applications ever reach the market. These tools...... will both help raise the general level of awareness of the problems and prevent the most basic flaws from occurring. This thesis contributes to the development of such tools. Networking systems typically try to attain secure communication by applying standard cryptographic techniques. In this thesis...... attacks, and attacks launched by insiders. Finally, the perspectives for the application of the analysis techniques are discussed, thereby coming a small step closer to providing developers with easy-to-use tools for validating the security of networking applications....

  5. Generation of Source Data for Experiments with Network Attack Detection Software

    Science.gov (United States)

    Kotenko, Igor; Chechulin, Andrey; Branitskiy, Alexander

    2017-03-01

    The paper suggests a new approach for traffic generation and the architecture of the software tool for evaluation of attack detection and response mechanisms. To assess the proposed approach the automatic network attack detection and response mechanism Threshold Random Walk (TRW) was chosen and implemented. The results of evaluation of this mechanism by the proposed software tool are presented.
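
    Threshold Random Walk is a sequential hypothesis test over connection outcomes; the sketch below illustrates the idea using Wald's classical SPRT threshold approximations. The success probabilities and error rates are illustrative assumptions, not the values used in the evaluated implementation.

      # Sequential hypothesis test over connection outcomes (TRW-style).
      # Error rates and success probabilities below are illustrative only.
      import math

      ALPHA, BETA = 0.01, 0.01                 # false-positive / false-negative
      UPPER = math.log((1 - BETA) / ALPHA)     # cross upward  -> declare "scanner"
      LOWER = math.log(BETA / (1 - ALPHA))     # cross downward -> declare "benign"
      THETA0, THETA1 = 0.8, 0.2                # assumed success prob.: benign / scanner

      def classify(outcomes):
          """outcomes: booleans, True = connection attempt succeeded."""
          llr = 0.0
          for ok in outcomes:
              p1 = THETA1 if ok else 1 - THETA1
              p0 = THETA0 if ok else 1 - THETA0
              llr += math.log(p1 / p0)
              if llr >= UPPER:
                  return 'scanner'
              if llr <= LOWER:
                  return 'benign'
          return 'undecided'

      print(classify([False] * 6))             # repeated failures -> 'scanner'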

  6. Using software security analysis to verify the secure socket layer (SSL) protocol

    Science.gov (United States)

    Powell, John D.

    2004-01-01

    The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers, among its capabilities, formal verification of software security properties through the use of model based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that provides automation of the mechanical portions of the analysis process. This paper will discuss: the need for formal analysis to assure software systems with respect to software security and why testing alone cannot provide it; the means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task; and an example of FMF-style MBV in the verification of properties over the Secure Socket Layer (SSL) communication protocol as a demonstration.

  7. Transport network extensions for accessibility analysis in geographic information systems

    NARCIS (Netherlands)

    Jong, Tom de; Tillema, T.

    2005-01-01

    In many developed countries high quality digital transport networks are available for GIS based analysis. This is partly due to the requirements of route planning software for internet and car navigation systems. The properties of these networks consist, among others, of road quality attributes,

  8. Combining Static Analysis and Model Checking for Software Analysis

    Science.gov (United States)

    Brat, Guillaume; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2003-01-01

    We present an iterative technique in which model checking and static analysis are combined to verify large software systems. The role of the static analysis is to compute partial order information which the model checker uses to reduce the state space. During exploration, the model checker also computes aliasing information that it gives to the static analyzer which can then refine its analysis. The result of this refined analysis is then fed back to the model checker which updates its partial order reduction. At each step of this iterative process, the static analysis computes optimistic information which results in an unsafe reduction of the state space. However we show that the process converges to a fixed point at which time the partial order information is safe and the whole state space is explored.

  9. Designing a Softwarized Network Deployed on a Fleet of Drones for Rural Zone Monitoring

    Directory of Open Access Journals (Sweden)

    Corrado Rametta

    2017-03-01

    Full Text Available In the last decade, the differences in the information communication technology (ICT) infrastructures between urban and rural areas have registered a tremendous increase. ICT infrastructures could strongly help rural communities where many operations are time consuming, labor-intensive and expensive due to limited access and large distances to cover. One of the most attractive solutions, widely recognized as promising for filling this gap, is the use of drone fleets. In this context, this paper proposes a video monitoring platform as a service (VMPaaS) for wide rural areas not covered by Internet access. The platform is realized with a Software-Defined Network (SDN)/Network Functions Virtualization (NFV)-based flying ad-hoc network (FANET), whose target is providing a flexible and dynamic connectivity backbone, and a set of drones equipped with high-resolution cameras, each transmitting a video stream of a portion of the considered area. After describing the architecture of the proposed platform, service chains to realize the video delivery service are described, and an analytical model is defined to evaluate the computational load of the platform nodes in such a way as to allow the network orchestrator to decide the backbone drones on which to run the virtual functions, and the relative resources to be allocated. A numerical analysis is carried out in a case study.

  10. Fast network centrality analysis using GPUs

    Directory of Open Access Journals (Sweden)

    Shi Zhiao

    2011-05-01

    Full Text Available Background With the exploding volume of data generated by continuously evolving high-throughput technologies, biological network analysis problems are growing larger in scale and demanding more computational power. General Purpose computation on Graphics Processing Units (GPGPU) provides a cost-effective technology for the study of large-scale biological networks. Designing algorithms that maximize data parallelism is the key to leveraging the power of GPUs. Results We proposed an efficient data parallel formulation of the All-Pairs Shortest Path problem, which is the key component for shortest path-based centrality computation. A betweenness centrality algorithm built upon this formulation was developed and benchmarked against the most recent GPU-based algorithm. Speedups of 11% to 19% were observed in various simulated scale-free networks. We further designed three algorithms based on this core component to compute closeness centrality, eccentricity centrality and stress centrality. To make all these algorithms available to the research community, we developed a software package gpu-fan (GPU-based Fast Analysis of Networks) for CUDA-enabled GPUs. Speedups of 10-50× compared with CPU implementations were observed for simulated scale-free networks and real-world biological networks. Conclusions gpu-fan provides a significant performance improvement for centrality computation in large-scale networks. Source code is available under the GNU Public License (GPL) at http://bioinfo.vanderbilt.edu/gpu-fan/.
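
    For reference, the sketch below computes shortest-path-based centralities on a simulated scale-free network with NetworkX; it illustrates the metrics that gpu-fan accelerates (here on the CPU), not the GPU data-parallel formulation itself.

      # CPU reference computation of shortest-path-based centralities.
      import networkx as nx

      G = nx.barabasi_albert_graph(1000, 3, seed=1)   # simulated scale-free network
      betweenness = nx.betweenness_centrality(G)      # Brandes' algorithm
      closeness = nx.closeness_centrality(G)

      hubs = sorted(betweenness, key=betweenness.get, reverse=True)[:5]
      print('top-5 betweenness hubs:', hubs)
      print('their closeness:', [round(closeness[n], 3) for n in hubs])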

  11. Software implementation of artificial neural networks in automated intelligent systems

    Directory of Open Access Journals (Sweden)

    В.П. Харченко

    2009-02-01

    Full Text Available The application of neural network technologies effectively solves the task of synthesizing the origins of accident risk and produces a vector of network control signals from incomplete and distorted information about the phenomena, events and processes that influence flight safety.

  12. Social Software: Participants' Experience Using Social Networking for Learning

    Science.gov (United States)

    Batchelder, Cecil W.

    2010-01-01

    Social networking tools used in learning provides instructional design with tools for transformative change in education. This study focused on defining the meanings and essences of social networking through the lived common experiences of 7 college students. The problem of the study was a lack of learner voice in understanding the value of social…

  13. Network Analysis, Architecture, and Design

    CERN Document Server

    McCabe, James D

    2007-01-01

    Traditionally, networking has had little or no basis in analysis or architectural development, with designers relying on technologies they are most familiar with or being influenced by vendors or consultants. However, the landscape of networking has changed so that network services have now become one of the most important factors to the success of many third generation networks. It has become an important feature of the designer's job to define the problems that exist in his network, choose and analyze several optimization parameters during the analysis process, and then prioritize and evalua

  14. ERP Software Selection Model using Analytic Network Process

    OpenAIRE

    Lesmana , Andre Surya; Astanti, Ririn Diar; Ai, The Jin

    2014-01-01

    During the implementation of Enterprise Resource Planning (ERP) in any company, one of the most important issues is the selection of ERP software that can satisfy the needs and objectives of the company. This issue is crucial since it may affect the duration of ERP implementation and the costs incurred for the ERP implementation. This research tries to construct a model of the selection of ERP software that is beneficial to the company in order to carry out the selection of the right ERP sof...

  15. Cross-instrument Analysis Correlation Software

    Energy Technology Data Exchange (ETDEWEB)

    2017-06-28

    This program has been designed to assist with the tracking of a sample from one analytical instrument to another, such as SEM, microscopes, micro x-ray diffraction and other instruments where particular positions/locations on the sample are examined, photographed, etc. The software is designed to easily enter the position of fiducials and locations of interest such that, in a future session in the same or a different instrument, the positions of interest can be re-found by using the known fiducial locations in the current and reference sessions to transform the points into the current session's coordinate system. The software is dialog box driven, guiding the user through the necessary data entry and program choices. Information is stored in a series of text based extensible markup language (XML) files.
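
    The session-to-session re-location described above amounts to fitting a coordinate transform from matched fiducial positions; a minimal sketch, assuming a 2-D affine model and at least three non-collinear fiducials, is given below (illustrative only, not the tool's actual code).

      # Least-squares 2-D affine transform from matched fiducial positions.
      import numpy as np

      def fit_affine(ref_pts, cur_pts):
          """Map reference-session coordinates onto the current session.
          Requires at least three non-collinear fiducials."""
          ref = np.asarray(ref_pts, float)
          cur = np.asarray(cur_pts, float)
          A = np.hstack([ref, np.ones((len(ref), 1))])   # rows [x, y, 1]
          M, *_ = np.linalg.lstsq(A, cur, rcond=None)    # 3x2 transform matrix
          return M

      def transform(points, M):
          pts = np.asarray(points, float)
          return np.hstack([pts, np.ones((len(pts), 1))]) @ M

      M = fit_affine([(0, 0), (10, 0), (0, 10)],
                     [(2.0, 1.0), (11.8, 1.5), (1.5, 10.9)])
      print(transform([(5, 5)], M))   # location of interest in new coordinates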

  16. Network topology analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Kalb, Jeffrey L.; Lee, David S.

    2008-01-01

    Emerging high-bandwidth, low-latency network technology has made network-based architectures both feasible and potentially desirable for use in satellite payload architectures. The selection of network topology is a critical component when developing these multi-node or multi-point architectures. This study examines network topologies and their effect on overall network performance. Numerous topologies were reviewed against a number of performance, reliability, and cost metrics. This document identifies a handful of good network topologies for satellite applications and the metrics used to justify them as such. Since often multiple topologies will meet the requirements of the satellite payload architecture under development, the choice of network topology is not easy, and in the end the choice of topology is influenced by both the design characteristics and requirements of the overall system and the experience of the developer.

  17. Performance evaluation of a Software-Defined Network (SDN) controller

    OpenAIRE

    Casas Moreno, Xavier

    2016-01-01

    The 5th generation of mobile networks (5G) will enable access to information anywhere and anytime to anyone and anything, i.e., the so-called Networked Society. The details of 5G are still the subject of ongoing research and debate, mostly focused on understanding the radio technologies that can enable the 5G vision. On the other hand, less work has been dedicated so far to address the challenges that 5G will pose to the transport network infrastructure. These challenges include: (i) the capa...

  18. Software-Defined Radio for Wireless Local-Area Networks

    NARCIS (Netherlands)

    Schiphorst, Roelof

    2004-01-01

    New wireless communications standards do not replace old ones, instead the number of standards keeps on increasing and by now an abundance of standards exists. Moreover there is no reason to assume that this trend will ever stop. Therefore, the software-radio concept is emerging as a potential

  19. Change Impact Analysis of Crosscutting in Software Architectural Design

    NARCIS (Netherlands)

    van den Berg, Klaas

    2006-01-01

    Software architectures should be amenable to changes in user requirements and implementation technology. The analysis of the impact of these changes can be based on traceability of architectural design elements. Design elements have dependencies with other software artifacts but also evolve in time.

  20. APIs for QoS configuration in Software Defined Networks

    DEFF Research Database (Denmark)

    Caba, Cosmin Marius; Soler, José

    2015-01-01

    such as configuration of devices, ports, queues, etc. An Application Programming Interface (API) for dynamic configuration of QoS resources in the network devices is implemented herein, by using the capabilities of OVSDB. Further, the paper demonstrates the possibility to create network services with coarse granularity...... on top of the fine granular services exposed by the QoS configuration API at the SDNC. A series of tests emphasize the capabilities and the performance of the implemented QoS configuration API....

  1. Software Tool for Real-Time Power Quality Analysis

    OpenAIRE

    CZIKER, A. C.; CHINDRIS, M. D.; Miron, A

    2013-01-01

    A software tool dedicated for the analysis of power signals containing harmonic and interharmonic components, unbalance, voltage dips and voltage swells is presented. The software tool is a virtual instrument, which uses innovative algorithms based on time and frequency domains analysis to process power signals. In order to detect the temporary disturbances, edge detection is proposed, whereas for the harmonic analysis Gaussian filter banks are implemented. Considering that a signal recov...

  2. Continuous Software Quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The regular application of software quality tools in large collaborative projects is required to reduce code defects to an acceptable level. If left unchecked the accumulation of defects invariably results in performance degradation at scale and problems with the long-term maintainability of the code. Although software quality tools are effective for identification there remains a non-trivial sociological challenge to resolve defects in a timely manner. This is an ongoing concern for the ATLAS software which has evolved over many years to meet the demands of Monte Carlo simulation, detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by a community of hundreds of developers worldwide. It is therefore preferable to address code defects before they are introduced into a widely used software release. Recent wholesale changes to the ATLAS software infrastructure have provided an ideal opportunity to apply software quali...

  3. Dynamic Construction Scheme for Virtualization Security Service in Software-Defined Networks

    OpenAIRE

    Lin, Zhaowen; Tao, Dan; Wang, Zhenji

    2017-01-01

    For a Software Defined Network (SDN), security is an important factor affecting its large-scale deployment. The existing security solutions for SDN mainly focus on the controller itself, which has to handle all the security protection tasks by using the programmability of the network. This will undoubtedly involve a heavy burden for the controller. More devastatingly, once the controller itself is attacked, the entire network will be paralyzed. Motivated by this, this paper proposes a novel s...

  4. CMIP: a software package capable of reconstructing genome-wide regulatory networks using gene expression data.

    Science.gov (United States)

    Zheng, Guangyong; Xu, Yaochen; Zhang, Xiujun; Liu, Zhi-Ping; Wang, Zhuo; Chen, Luonan; Zhu, Xin-Guang

    2016-12-23

    A gene regulatory network (GRN) represents the interactions of genes inside a cell or tissue, in which vertexes and edges stand for genes and their regulatory interactions respectively. Reconstruction of gene regulatory networks, in particular genome-scale networks, is essential for the comparative exploration of different species and the mechanistic investigation of biological processes. Currently, most network inference methods are computationally intensive; they are usually effective for small-scale tasks (e.g., networks with a few hundred genes) but struggle to construct GRNs at genome scale. Here, we present a software package for gene regulatory network reconstruction at a genomic level, in which gene interaction is measured by conditional mutual information within a parallel computing framework (hence the package name CMIP). The package is a greatly improved implementation of our previous PCA-CMI algorithm. In CMIP, we provide not only an automatic threshold determination method but also an effective parallel computing framework for network inference. Performance tests on benchmark datasets show that the accuracy of CMIP is comparable to most current network inference methods. Moreover, running tests on synthetic datasets demonstrate that CMIP can handle large datasets, especially genome-wide datasets, within an acceptable time period. In addition, successful application on a real genomic dataset confirms the practical applicability of the package. This new software package provides a powerful tool for genomic network reconstruction to the biological community. The software can be accessed at http://www.picb.ac.cn/CMIP/ .
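
    Under a Gaussian assumption, conditional mutual information can be computed from covariance determinants, which is the style of estimator used in PCA-CMI-like methods; the sketch below is illustrative only and is not taken from the CMIP source.

      # Conditional mutual information I(X;Y|Z) from covariance determinants
      # (Gaussian assumption). Illustrative; not the CMIP source code.
      import numpy as np

      def _logdet_cov(cols):
          if not cols:
              return 0.0
          m = np.column_stack(cols)
          if m.shape[1] == 1:
              return float(np.log(np.var(m, ddof=1)))
          return float(np.linalg.slogdet(np.cov(m, rowvar=False, ddof=1))[1])

      def gaussian_cmi(x, y, z=()):
          z = list(z)
          return 0.5 * (_logdet_cov([x] + z) + _logdet_cov([y] + z)
                        - _logdet_cov(z) - _logdet_cov([x, y] + z))

      rng = np.random.default_rng(0)
      x = rng.normal(size=500)
      y = 0.8 * x + rng.normal(scale=0.5, size=500)   # x "regulates" y
      z = rng.normal(size=500)                        # unrelated gene
      print(gaussian_cmi(x, y, [z]))                  # clearly positive
      print(gaussian_cmi(x, z, [y]))                  # close to zero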

  5. End-to-end Information Flow Security Model for Software-Defined Networks

    Directory of Open Access Journals (Sweden)

    D. Ju. Chaly

    2015-01-01

    Full Text Available Software-defined networks (SDN) are a novel networking paradigm which has become an enabler technology for many modern applications such as network virtualization, policy-based access control and many others. Software can provide flexibility and fast-paced innovation in networking; however, it has a complex nature. In this connection there is an increasing need for means of assuring its correctness and security. Abstract models for SDN can tackle these challenges. This paper addresses confidentiality and some integrity properties of SDNs. These are critical properties for multi-tenant SDN environments, since the network management software must ensure that no confidential data of one tenant are leaked to other tenants in spite of using the same physical infrastructure. We define a notion of end-to-end security in the context of software-defined networks and propose a semantic model in which reasoning about confidentiality is possible and in which we can check that confidential information flows do not interfere with non-confidential ones. We show that the model can be extended in order to reason about networks with secure and insecure links, which can arise, for example, in wireless environments. The article is published in the authors' wording.

  6. GWAMA: software for genome-wide association meta-analysis

    Directory of Open Access Journals (Sweden)

    Mägi Reedik

    2010-05-01

    Full Text Available Background Despite the recent success of genome-wide association studies in identifying novel loci contributing effects to complex human traits, such as type 2 diabetes and obesity, much of the genetic component of variation in these phenotypes remains unexplained. One way of improving power to detect further novel loci is through meta-analysis of studies from the same population, increasing the sample size beyond that of any individual study. Although statistical analysis software packages incorporate routines for meta-analysis, they are ill-equipped to meet the challenges of the scale and complexity of data generated in genome-wide association studies. Results We have developed flexible, open-source software for the meta-analysis of genome-wide association studies. The software incorporates a variety of error trapping facilities, and provides a range of meta-analysis summary statistics. The software is distributed with scripts that allow simple formatting of files containing the results of each association study and generate graphical summaries of genome-wide meta-analysis results. Conclusions The GWAMA (Genome-Wide Association Meta-Analysis) software has been developed to perform meta-analysis of summary statistics generated from genome-wide association studies of dichotomous phenotypes or quantitative traits. The software, with source files, documentation and example data files, is freely available online at http://www.well.ox.ac.uk/GWAMA.
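
    The core per-SNP combination step in this kind of meta-analysis is inverse-variance weighting of study-level effect estimates; a minimal fixed-effects sketch is shown below (illustrative of the statistic, not GWAMA's implementation, and with made-up numbers).

      # Inverse-variance-weighted fixed-effects combination of study estimates.
      # Effect sizes and standard errors below are made up for illustration.
      import math

      def fixed_effect_meta(betas, ses):
          weights = [1.0 / se ** 2 for se in ses]
          beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
          se = math.sqrt(1.0 / sum(weights))
          return beta, se, beta / se      # pooled effect, its SE, Z statistic

      print(fixed_effect_meta([0.10, 0.14, 0.08], [0.04, 0.05, 0.03]))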

  7. Project Report: Automatic Sequence Processor Software Analysis

    Science.gov (United States)

    Benjamin, Brandon

    2011-01-01

    The Mission Planning and Sequencing (MPS) element of Multi-Mission Ground System and Services (MGSS) provides space missions with multi-purpose software to plan spacecraft activities, sequence spacecraft commands, and then integrate these products and execute them on spacecraft. The Jet Propulsion Laboratory (JPL) is currently flying many missions. The processes for building, integrating, and testing the multi-mission uplink software need to be improved to meet the needs of the missions and the operations teams that command the spacecraft. The Multi-Mission Sequencing Team is responsible for collecting and processing the observations, experiments and engineering activities that are to be performed on a selected spacecraft. The collection of these activities is called a sequence and ultimately a sequence becomes a sequence of spacecraft commands. The operations teams check the sequence to make sure that no constraints are violated. The workflow process involves sending a program start command, which activates the Automatic Sequence Processor (ASP). The ASP is currently a file-based system that comprises scripts written in perl, c-shell and awk. Once this start process is complete, the system checks for errors and aborts if there are any; otherwise the system converts the commands to binary, and then sends the resultant information to be radiated to the spacecraft.

  8. Enhanced method of fast re-routing with load balancing in software-defined networks

    Science.gov (United States)

    Lemeshko, Oleksandr; Yeremenko, Oleksandra

    2017-11-01

    A two-level method of fast re-routing with load balancing in a software-defined network (SDN) is proposed. The novelty of the method consists, firstly, in the introduction of a two-level hierarchy of calculating the routing variables responsible for the formation of the primary and backup paths, and secondly, in ensuring a balanced load of the communication links of the network, which meets the requirements of the traffic engineering concept. The method provides implementation of link, node, path, and bandwidth protection schemes for fast re-routing in SDN. Separating the calculation of the primary (lower level) and backup (upper level) routes across two hierarchical levels, in accordance with the interaction prediction principle, makes it possible to abandon the initial large, nonlinear optimization problem in favour of iteratively solving linear optimization problems of half the dimension. The analysis of the proposed method confirmed its efficiency and effectiveness in terms of obtaining optimal solutions for ensuring balanced load of communication links and implementing the required network element protection schemes for fast re-routing in SDN.

  9. Open architecture software platform for biomedical signal analysis.

    Science.gov (United States)

    Duque, Juliano J; Silva, Luiz E V; Murta, Luiz O

    2013-01-01

    Biomedical signals are very important reporters of the physiological status of the human body. Therefore, great attention is devoted to the study of analysis methods that help extract the greatest amount of relevant information from these signals. There are several free-of-charge software packages which can process biomedical data, but they usually have a closed architecture, not allowing the addition of new functionalities by users. This paper presents a proposal for a free, open-architecture software platform for biomedical signal analysis, named JBioS. Implemented in Java, the platform offers some basic functionalities to load and display signals, and allows the integration of new software components through plugins. JBioS facilitates validation of new analysis methods and provides an environment for multi-method analysis. Plugins can be developed for preprocessing, analyzing and simulating signals. Some applications have been developed using this platform, suggesting that, with these features, JBioS presents itself as software with potential applications in both research and clinical areas.

  10. Computer-assisted qualitative data analysis software: a review.

    Science.gov (United States)

    Banner, Davina J; Albarrran, John W

    2009-01-01

    Over recent decades, qualitative research has become accepted as a uniquely valuable methodological approach for generating knowledge, particularly in relation to promoting understanding of patients' experiences and responses to illness. Within cardiovascular nursing such qualitative approaches have been widely adopted to systematically investigate a number of phenomena. Contemporary qualitative research practice comprises a diverse range of disciplines and approaches. Computer-aided qualitative data analysis software represents an important facet of this increasingly sophisticated movement. Such software offers an efficient means through which to manage and organize data while supporting rigorous data analysis. The increasing use of qualitative data analysis software has stimulated wide discussion. This research column includes a review of some of the advantages and debates related to the use and integration of qualitative data analysis software.

  11. Software Defined Optics and Networking for Large Scale Data Centers

    DEFF Research Database (Denmark)

    Mehmeri, Victor; Andrus, Bogdan-Mihai; Tafur Monroy, Idelfonso

    Big data imposes correlations of large amounts of information between numerous systems and databases. This leads to large dynamically changing flows and traffic patterns between clusters and server racks that result in a decrease of the quality of transmission and degraded application performance....... Highly interconnected topologies combined with flexible, on demand network configuration can become a solution to the ever-increasing dynamic traffic...

  12. Software defined networking to improve mobility management performance

    NARCIS (Netherlands)

    Karimzadeh Motallebi Azar, Morteza; Sperotto, Anna; Pras, Aiko; Sperotto, Anna; Doyen, Guillaume; Latré, Steven; Charalambides, Marinos; Stiller, Burkhard

    2014-01-01

    In mobile networks, efficient IP mobility management is a crucial issue for the mobile users changing their mobility anchor points during handover. In this regard several mobility management methods have been proposed. However, those are insufficient for the future mobile Internet in terms of

  13. Experimental demonstration of software defined data center optical networks with Tbps end-to-end tunability

    Science.gov (United States)

    Zhao, Yongli; Zhang, Jie; Ji, Yuefeng; Li, Hui; Wang, Huitao; Ge, Chao

    2015-10-01

    The end-to-end tunability is important for provisioning elastic channels for the bursty traffic of data center optical networks. How, then, can end-to-end tunability be achieved over elastic optical networks? A software defined networking (SDN) based end-to-end tunability solution is proposed for software defined data center optical networks, and the protocol extension and implementation procedure are designed accordingly. For the first time, flexible grid all-optical networks with a Tbps end-to-end tunable transport and switch system have been demonstrated online for data center interconnection, controlled by an OpenDayLight (ODL) based controller. The performance of the end-to-end tunable transport and switch system has been evaluated through wavelength-number, bit-rate, and transmit-power tuning procedures.

  14. Software Development Cost and Time Forecasting Using a High Performance Artificial Neural Network Model

    Science.gov (United States)

    Attarzadeh, Iman; Ow, Siew Hock

    Nowadays, mature software companies are increasingly interested in having a precise estimation of software metrics such as project time, cost, quality, and risk at the early stages of the software development process. The ability to precisely estimate project time and costs is one of the essential tasks of project managers in software development activities, known as software effort estimation. The effort estimated at the early stage of the project development process is uncertain, vague, and often the least accurate, because very little information is available at the beginning of a project. Therefore, a reliable and precise effort estimation model is an ongoing challenge for project managers and software engineers. This research work proposes a novel soft computing model incorporating the Constructive Cost Model (COCOMO) to improve the precision of software time and cost estimation. The proposed artificial neural network model has good generalisation and adaptation capability, and it can be interpreted and validated by software engineers. The experimental results show that applying the desirable features of artificial neural networks to the algorithmic estimation model improves the accuracy of time and cost estimation, and the estimated effort can be very close to the actual effort.
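
    The algorithmic baseline behind such hybrid models is COCOMO; a minimal sketch of the Basic COCOMO effort and duration equations with their standard nominal coefficients is given below, purely to show the form of the estimate that a neural model would refine (the 32 KLOC example is hypothetical).

      # Basic COCOMO nominal coefficients (organic / semi-detached / embedded).
      COEFFS = {
          'organic':       (2.4, 1.05, 2.5, 0.38),
          'semi-detached': (3.0, 1.12, 2.5, 0.35),
          'embedded':      (3.6, 1.20, 2.5, 0.32),
      }

      def basic_cocomo(kloc, mode='organic'):
          a, b, c, d = COEFFS[mode]
          effort = a * kloc ** b          # person-months
          duration = c * effort ** d      # calendar months
          return effort, duration

      print(basic_cocomo(32, 'semi-detached'))   # e.g. a 32 KLOC project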

  15. Tourism Destinations Network Analysis, Social Network Analysis Approach

    Directory of Open Access Journals (Sweden)

    2015-09-01

    Full Text Available The tourism industry is becoming one of the world's largest economic sources and is expected to become the world's first industry by 2020. Previous studies have focused on several aspects of this industry, including sociology, geography, and tourism management and development, but have paid less attention to analytical and quantitative approaches. This study introduces some network analysis techniques and measures aimed at studying the structural characteristics of tourism networks. More specifically, it presents a methodology to analyze tourism destination networks. We apply the methodology to analyze Mazandaran's tourism destination network, one of the most famous tourism areas of Iran.

  16. Introduction to Social Network Analysis

    Science.gov (United States)

    Zaphiris, Panayiotis; Ang, Chee Siang

    Social Network analysis focuses on patterns of relations between and among people, organizations, states, etc. It aims to describe networks of relations as fully as possible, identify prominent patterns in such networks, trace the flow of information through them, and discover what effects these relations and networks have on people and organizations. Social network analysis offers a very promising potential for analyzing human-human interactions in online communities (discussion boards, newsgroups, virtual organizations). This Tutorial provides an overview of this analytic technique and demonstrates how it can be used in Human Computer Interaction (HCI) research and practice, focusing especially on Computer Mediated Communication (CMC). This topic acquires particular importance these days, with the increasing popularity of social networking websites (e.g., youtube, myspace, MMORPGs etc.) and the research interest in studying them.

  17. Higher-order neural network software for distortion invariant object recognition

    Science.gov (United States)

    Reid, Max B.; Spirkovska, Lilly

    1991-01-01

    The state-of-the-art in pattern recognition for such applications as automatic target recognition and industrial robotic vision relies on digital image processing. We present a higher-order neural network model and software which performs the complete feature extraction-pattern classification paradigm required for automatic pattern recognition. Using a third-order neural network, we demonstrate complete, 100 percent accurate invariance to distortions of scale, position, and in-plane rotation. In a higher-order neural network, feature extraction is built into the network, and does not have to be learned. Only the relatively simple classification step must be learned. This is key to achieving very rapid training. The training set is much smaller than with standard neural network software because the higher-order network only has to be shown one view of each object to be learned, not every possible view. The software and graphical user interface run on any Sun workstation. Results of the use of the neural software in autonomous robotic vision systems are presented. Such a system could have extensive application in robotic manufacturing.

  18. Method and computer product to increase accuracy of time-based software verification for sensor networks

    Science.gov (United States)

    Foo Kune, Denis [Saint Paul, MN; Mahadevan, Karthikeyan [Mountain View, CA

    2011-01-25

    A recursive verification protocol to reduce the time variance due to delays in the network by putting the subject node at most one hop from the verifier node provides for an efficient manner to test wireless sensor nodes. Since the software signatures are time based, recursive testing will give a much cleaner signal for positive verification of the software running on any one node in the sensor network. In this protocol, the main verifier checks its neighbor, who in turn checks its neighbor, and continuing this process until all nodes have been verified. This ensures minimum time delays for the software verification. Should a node fail the test, the software verification downstream is halted until an alternative path (one not including the failed node) is found. Utilizing techniques well known in the art, having a node tested twice, or not at all, can be avoided.

  19. Failure mitigation in software defined networking employing load type prediction

    KAUST Repository

    Bouacida, Nader

    2017-07-31

    The controller is a critical piece of the SDN architecture and is considered the mastermind of SDN networks. Thus, its failure will cause a significant portion of the network to fail. Overload is one of the common causes of failure, since the controller is frequently invoked by new flows. Even though SDN controllers are often replicated, the significant recovery time can still undermine the availability of the entire network. In order to overcome the problem of overloaded controller failure in SDN, this paper proposes a novel controller offload solution for failure mitigation based on a prediction module that anticipates the presence of a harmful long-term load. In fact, a long-standing load would eventually overwhelm the controller, leading to possible failure. To predict whether the load on the controller is a short-term or long-term load, we used three different classification algorithms: Support Vector Machine, k-Nearest Neighbors, and Naive Bayes. Our evaluation results demonstrate that the Support Vector Machine algorithm can detect the type of load with an accuracy of 97.93% in a real-time scenario. Besides, our scheme succeeded in offloading the controller by switching between the reactive and proactive modes in response to the prediction module output.
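
    A minimal sketch of the load-type classification step is shown below, assuming hypothetical window-level features (mean request rate, trend slope, variance) and synthetic training data rather than the paper's dataset or feature set.

      # Classify controller load as short-term burst (0) or long-term load (1).
      # Features (mean rate, trend slope, variance) and data are synthetic.
      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(42)
      burst = np.column_stack([rng.normal(800, 100, 200),
                               rng.normal(0.0, 0.5, 200),
                               rng.normal(50, 10, 200)])
      sustained = np.column_stack([rng.normal(900, 100, 200),
                                   rng.normal(2.0, 0.5, 200),
                                   rng.normal(20, 10, 200)])
      X = np.vstack([burst, sustained])
      y = np.array([0] * 200 + [1] * 200)

      clf = make_pipeline(StandardScaler(), SVC(kernel='rbf'))
      clf.fit(X, y)
      print(clf.predict([[950, 1.8, 25]]))   # [1] -> long-term load: offload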

  20. Competing Compatibility Standards and Network Externalities in the PC Software Market.

    OpenAIRE

    Gandal, Neil

    1995-01-01

    This paper is an empirical study of the value of four file compatibility standards for transferring data in the personal computer software market. The results are that only the LOTUS file compatibility standard is significant in explaining price variations and it is significant in both the spreadsheet and database management system markets. This supports the hypothesis that the personal computer software market exhibits complementary network externalities. Copyright 1995 by MIT Press.

  1. Predicting Software Test Effort in Iterative Development Using a Dynamic Bayesian Network

    OpenAIRE

    Torkar, Richard; Awan, Nasir Majeed; Alvi, Adnan Khadem; Afzal, Wasif

    2010-01-01

    Projects following iterative software development methodologies must still be managed in such a way as to maximize quality and minimize costs. However, there are indications that predicting test effort in iterative development is challenging and currently there seem to be no models for test effort prediction. This paper introduces and validates a dynamic Bayesian network for predicting test effort in iterative software development. The proposed model is validated by the use of data from two indu...

  2. Change impact analysis for software product lines

    Directory of Open Access Journals (Sweden)

    Jihen Maâzoun

    2016-10-01

    Full Text Available A software product line (SPL) represents a family of products in a given application domain. Each SPL is constructed to provide for the derivation of new products by covering a wide range of features in its domain. Nevertheless, over time, some domain features may become obsolete with the appearance of new features, while others may become refined. Accordingly, the SPL must be maintained to account for the domain evolution. Such evolution requires a means for managing the impact of changes on the SPL models, including the feature model and design. This paper presents an automated method that analyzes feature model evolution, traces the impact of changes on the SPL design, and offers a set of recommendations to ensure the consistency of both models. The proposed method defines a set of new metrics adapted to SPL evolution to identify the effort needed to maintain the SPL models consistently and with a quality as good as that of the original models. The method and its tool are illustrated through an example of an SPL in the Text Editing domain. In addition, they are experimentally evaluated in terms of both the quality of the maintained SPL models and the precision of the change impact management.

  3. TopoGen: A Network Topology Generation Architecture with application to automating simulations of Software Defined Networks

    CERN Document Server

    Laurito, Andres; The ATLAS collaboration

    2017-01-01

    Simulation is an important tool to validate the performance impact of control decisions in Software Defined Networks (SDN). Yet, the manual modeling of complex topologies that may change often during a design process can be a tedious error-prone task. We present TopoGen, a general purpose architecture and tool for systematic translation and generation of network topologies. TopoGen can be used to generate network simulation models automatically by querying information available at diverse sources, notably SDN controllers. The DEVS modeling and simulation framework facilitates a systematic translation of structured knowledge about a network topology into a formal modular and hierarchical coupling of preexisting or new models of network entities (physical or logical). TopoGen can be flexibly extended with new parsers and generators to grow its scope of applicability. This permits to design arbitrary workflows of topology transformations. We tested TopoGen in a network engineering project for the ATLAS detector ...

  4. TopoGen: A Network Topology Generation Architecture with application to automating simulations of Software Defined Networks

    CERN Document Server

    Laurito, Andres; The ATLAS collaboration

    2018-01-01

    Simulation is an important tool to validate the performance impact of control decisions in Software Defined Networks (SDN). Yet, the manual modeling of complex topologies that may change often during a design process can be a tedious error-prone task. We present TopoGen, a general purpose architecture and tool for systematic translation and generation of network topologies. TopoGen can be used to generate network simulation models automatically by querying information available at diverse sources, notably SDN controllers. The DEVS modeling and simulation framework facilitates a systematic translation of structured knowledge about a network topology into a formal modular and hierarchical coupling of preexisting or new models of network entities (physical or logical). TopoGen can be flexibly extended with new parsers and generators to grow its scope of applicability. This permits to design arbitrary workflows of topology transformations. We tested TopoGen in a network engineering project for the ATLAS detector ...

  5. JEM-X science analysis software

    DEFF Research Database (Denmark)

    Westergaard, Niels Jørgen Stenfeldt; Kretschmar, P.; Oxborrow, Carol Anne

    2003-01-01

    The science analysis of the data from JEM-X on INTEGRAL is performed through a number of levels including corrections, good time selection, imaging and source finding, spectrum and light-curve extraction. These levels consist of individual executables and the running of the complete analysis is c...

  6. Power Analysis Software for Educational Researchers

    Science.gov (United States)

    Peng, Chao-Ying Joanne; Long, Haiying; Abaci, Serdar

    2012-01-01

    Given the importance of statistical power analysis in quantitative research and the repeated emphasis on it by American Educational Research Association/American Psychological Association journals, the authors examined the reporting practice of power analysis by the quantitative studies published in 12 education/psychology journals between 2005…

  7. Multipath protection for data center services in OpenFlow-based software defined elastic optical networks

    Science.gov (United States)

    Yang, Hui; Cheng, Lei; Yuan, Jian; Zhang, Jie; Zhao, Yongli; Lee, Young

    2015-06-01

    With the rapid growth of data center services, the elastic optical network is a very promising networking architecture for interconnecting data centers because it can elastically allocate spectrum tailored to various bandwidth requirements. In case of a link failure, how to ensure a high level of quality of service (QoS) for user requests after the failure has become a research focus. In light of this, in this paper we propose and experimentally demonstrate multipath protection for data center services in an OpenFlow-based software defined elastic optical network testbed, aiming at improving network reliability. We first propose an OpenFlow-based software defined elastic optical network architecture for data center service protection. Then, based on the proposed architecture, a multipath protection scheme is designed according to the importance level of the service. To implement the proposed scheme in the architecture, the OpenFlow protocol is extended to support multipath protection in elastic optical networks. The performance of our proposed multipath protection scheme is evaluated by means of experiments on our OpenFlow-based testbed. The feasibility of our proposed scheme is also demonstrated in software defined elastic optical networks.

  8. Development of Cell Analysis Software for Cultivated Corneal Endothelial Cells.

    Science.gov (United States)

    Okumura, Naoki; Ishida, Naoya; Kakutani, Kazuya; Hongo, Akane; Hiwa, Satoru; Hiroyasu, Tomoyuki; Koizumi, Noriko

    2017-11-01

    To develop analysis software for cultured human corneal endothelial cells (HCECs). Software was designed to recognize cell borders and to provide parameters such as cell density, coefficient of variation, and polygonality of cultured HCECs based on phase contrast images. Cultured HCECs with high or low cell density were incubated with Ca-free and Mg-free phosphate-buffered saline for 10 minutes to reveal the cell borders and were then analyzed with software (n = 50). Phase contrast images showed that cell borders were not distinctly outlined, but these borders became more distinctly outlined after phosphate-buffered saline treatment and were recognized by cell analysis software. The cell density value provided by software was similar to that obtained using manual cell counting by an experienced researcher. Morphometric parameters, such as the coefficient of variation and polygonality, were also produced by software, and these values were significantly correlated with cell density (Pearson correlation coefficients -0.62 and 0.63, respectively). The software described here provides morphometric information from phase contrast images, and it enables objective and noninvasive quality assessment for tissue engineering therapy of the corneal endothelium.
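
    As a minimal sketch of the parameters named above (cell density, coefficient of variation, polygonality, and their correlation), assuming per-cell areas and side counts have already been segmented from a phase contrast image; the arrays are hypothetical, not the authors' data.

```python
# Illustrative computation of the morphometric parameters described above,
# given hypothetical per-cell areas (um^2) and per-cell side counts.
import numpy as np
from scipy import stats

cell_areas = np.array([310.0, 295.5, 402.1, 288.7, 350.2, 331.9])  # hypothetical
cell_sides = np.array([6, 6, 5, 6, 7, 6])                          # hypothetical

density = 1e6 / cell_areas.mean()                # approx. cells per mm^2 (confluent layer)
cv = cell_areas.std(ddof=1) / cell_areas.mean()  # coefficient of variation of cell area
hexagonality = np.mean(cell_sides == 6) * 100    # % six-sided (hexagonal) cells

# Correlation between two morphometric series, as reported in the abstract.
r, p = stats.pearsonr(cell_areas, cell_sides)

print(f"density ~{density:.0f} cells/mm^2, CV {cv:.2f}, "
      f"hexagonality {hexagonality:.0f}%, Pearson r = {r:.2f}")
```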

  9. Enterprise network with software Asterisk PBX based on the PLC technology

    Directory of Open Access Journals (Sweden)

    Michal Maar

    2017-01-01

    Full Text Available This article presents the design of a software Asterisk PBX solution in an enterprise PLC (Power Line Communication) network. The design includes a description of the installation and configuration of the software Asterisk PBX. The secure interconnection of two enterprise PLC networks is implemented via a telecommunication tunnel secured using Cisco routers. The connection between the two Asterisk PBXs is designed in the context of establishing the tunnel. The article also covers the cross-connection of the software Asterisk PBX with a hardware IP PBX, the Panasonic KX-NS500.

  10. DEVELOPMENT OF EMITTANCE ANALYSIS SOFTWARE FOR ION BEAM CHARACTERIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Padilla, M. J.; Liu, Y.

    2007-01-01

    Transverse beam emittance is a crucial property of charged particle beams that describes their angular and spatial spread. It is a figure of merit frequently used to determine the quality of ion beams, the compatibility of an ion beam with a given beam transport system, and the ability to suppress neighboring isotopes at on-line mass separator facilities. Generally a high quality beam is characterized by a small emittance. In order to determine and improve the quality of ion beams used at the Holifield Radioactive Ion Beam Facility (HRIBF) for nuclear physics and nuclear astrophysics research, the emittances of the ion beams are measured at the off-line Ion Source Test Facilities. In this project, emittance analysis software was developed to perform various data processing tasks for noise reduction, to evaluate root-mean-square emittance, Twiss parameters, and area emittance of different beam fractions. The software also provides 2D and 3D graphical views of the emittance data, beam profiles, emittance contours, and RMS. Noise exclusion is essential for accurate determination of beam emittance values. A Self-Consistent, Unbiased Elliptical Exclusion (SCUBEEx) method is employed. Numerical data analysis techniques such as interpolation and nonlinear fitting are also incorporated into the software. The software will provide a simplified, fast tool for comprehensive emittance analysis. The main functions of the software package have been completed. In preliminary tests with experimental emittance data, the analysis results using the software were shown to be accurate.
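
    The core quantities mentioned above can be illustrated with a short sketch: assuming noise has already been excluded and using synthetic (x, x') samples, the RMS emittance and Twiss parameters follow directly from the second moments of the distribution.

```python
# Minimal sketch of RMS emittance and Twiss parameters from sampled (x, x')
# coordinates; the data here are synthetic, not HRIBF measurements.
import numpy as np

rng = np.random.default_rng(0)
x  = rng.normal(0.0, 2.0, 5000)            # position (mm), hypothetical
xp = 0.3 * x + rng.normal(0.0, 1.0, 5000)  # divergence (mrad), hypothetical

x_c, xp_c = x - x.mean(), xp - xp.mean()
sig_xx = np.mean(x_c * x_c)
sig_pp = np.mean(xp_c * xp_c)
sig_xp = np.mean(x_c * xp_c)

eps_rms = np.sqrt(sig_xx * sig_pp - sig_xp**2)   # RMS emittance (mm*mrad)
beta  = sig_xx / eps_rms                         # Twiss beta
gamma = sig_pp / eps_rms                         # Twiss gamma
alpha = -sig_xp / eps_rms                        # Twiss alpha

print(f"eps_rms = {eps_rms:.2f} mm*mrad, alpha = {alpha:.2f}, beta = {beta:.2f}")
```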

  11. Adapted wavelet analysis from theory to software

    CERN Document Server

    Wickerhauser, Mladen Victor

    1994-01-01

    This detail-oriented text is intended for engineers and applied mathematicians who must write computer programs to perform wavelet and related analysis on real data. It contains an overview of mathematical prerequisites and proceeds to describe hands-on programming techniques to implement special programs for signal analysis and other applications. From the table of contents: - Mathematical Preliminaries - Programming Techniques - The Discrete Fourier Transform - Local Trigonometric Transforms - Quadrature Filters - The Discrete Wavelet Transform - Wavelet Packets - The Best Basis Algorithm - Multidimensional Library Trees - Time-Frequency Analysis - Some Applications - Solutions to Some of the Exercises - List of Symbols - Quadrature Filter Coefficients
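
    A small example in the spirit of the book's discrete wavelet transform chapters, assuming the PyWavelets library is available; this is an illustration for orientation, not code from the book.

```python
# Multilevel discrete wavelet transform and simple soft-threshold denoising,
# using PyWavelets (assumed installed) on a synthetic noisy signal.
import numpy as np
import pywt

t = np.linspace(0.0, 1.0, 1024)
signal = np.sin(2 * np.pi * 8 * t) + 0.2 * np.random.randn(t.size)

# Three-level DWT with a Daubechies-4 filter; coeffs = [cA3, cD3, cD2, cD1].
coeffs = pywt.wavedec(signal, "db4", level=3)

# Soft-threshold the detail coefficients and reconstruct the signal.
threshold = 0.3
denoised = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
reconstructed = pywt.waverec(denoised, "db4")
print(reconstructed.shape)
```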

  12. Development of high performance casting analysis software by coupled parallel computation

    Directory of Open Access Journals (Sweden)

    Sang Hyun CHO

    2007-08-01

    Full Text Available Much casting analysis software has been developed to provide new ways of approaching real casting processes. These include melt flow analysis, heat transfer analysis for solidification calculation, mechanical property prediction and microstructure prediction. These efforts have been successful in obtaining results close to real situations, so that CAE technologies have become indispensable for designing or developing new casting processes. In manufacturing fields, however, CAE technologies are not used so frequently because the software is difficult to use or the computing performance is insufficient. To introduce CAE technologies to the manufacturing field, high performance analysis is essential to shorten the gap between product design time and prototyping time. Software code optimization can be helpful, but it is not enough, because the codes developed by software experts are already well optimized. As an alternative for high performance computation, parallel computation technologies are being applied to CAE technologies to shorten the analysis time. In this research, SMP (Shared Memory Processing) and MPI (Message Passing Interface) methods for parallelization were applied to the commercial software "Z-Cast" to calculate casting processes. In the code parallelization process, network stabilization and core optimization were also carried out under the Microsoft Windows platform, and their performance and results were compared with those of conventional serial analysis codes.

  13. Software-defined optical network for metro-scale geographically distributed data centers.

    Science.gov (United States)

    Samadi, Payman; Wen, Ke; Xu, Junjie; Bergman, Keren

    2016-05-30

    The emergence of cloud computing and big data has rapidly increased the deployment of small and mid-sized data centers. Enterprises and cloud providers require an agile network among these data centers to empower application reliability and flexible scalability. We present a software-defined inter data center network to enable on-demand scale out of data centers on a metro-scale optical network. The architecture consists of a combined space/wavelength switching platform and a Software-Defined Networking (SDN) control plane equipped with a wavelength and routing assignment module. It enables establishing transparent and bandwidth-selective connections from L2/L3 switches, on-demand. The architecture is evaluated in a testbed consisting of 3 data centers, 5-25 km apart. We successfully demonstrated end-to-end bulk data transfer and Virtual Machine (VM) migrations across data centers with less than 100 ms connection setup time and close to full link capacity utilization.

  14. Computational Social Network Analysis

    CERN Document Server

    Hassanien, Aboul-Ella

    2010-01-01

    Presents insight into the social behaviour of animals (including the study of animal tracks and learning by members of the same species). This book provides web-based evidence of social interaction, perceptual learning, information granulation and the behaviour of humans and affinities between web-based social networks

  15. Network analysis applications in hydrology

    Science.gov (United States)

    Price, Katie

    2017-04-01

    Applied network theory has seen pronounced expansion in recent years, in fields such as epidemiology, computer science, and sociology. Concurrent development of analytical methods and frameworks has increased the possibilities and tools available to researchers seeking to apply network theory to a variety of problems. While water and nutrient fluxes through stream systems clearly demonstrate a directional network structure, the hydrological applications of network theory remain underexplored. This presentation covers a review of network applications in hydrology, followed by an overview of promising network analytical tools that potentially offer new insights into conceptual modeling of hydrologic systems, identifying behavioral transition zones in stream networks and thresholds of dynamical system response. Network applications were tested along an urbanization gradient in Atlanta, Georgia, USA, in two watersheds: Peachtree Creek and Proctor Creek. Peachtree Creek contains a nest of five long-term USGS streamflow and water quality gages, allowing network application of long-term flow statistics. The watershed spans a range of suburban and heavily urbanized conditions. Summary flow statistics and water quality metrics were analyzed using a suite of network analysis techniques to test the conceptual modeling and predictive potential of the methodologies. Storm events and low flow dynamics during Summer 2016 were analyzed using multiple network approaches, with an emphasis on tomogravity methods. Results indicate that network theory approaches offer novel perspectives for understanding long-term and event-based hydrological data. Key future directions for network applications include 1) optimizing data collection, 2) identifying "hotspots" of contaminant and overland flow influx to stream systems, 3) defining process domains, and 4) analyzing dynamic connectivity of various system components, including groundwater-surface water interactions.
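
    As a hedged sketch of the directional-network framing described above, a stream network can be modeled as a directed graph in which flow accumulates from upstream nodes; the node names and gage values below are hypothetical.

```python
# A stream network as a directed graph: edges point downstream, and the flow
# at a node is its local contribution plus everything drained from upstream.
import networkx as nx

G = nx.DiGraph()
reaches = [("trib1", "confluence"), ("trib2", "confluence"),
           ("confluence", "outlet")]
G.add_edges_from(reaches)

# Hypothetical local (incremental) flows at each gage, in cfs.
local_flow = {"trib1": 12.0, "trib2": 8.5, "confluence": 3.0, "outlet": 1.0}

def accumulated_flow(graph, node):
    """Local flow plus the contributions of all upstream (ancestor) nodes."""
    upstream = nx.ancestors(graph, node)
    return local_flow[node] + sum(local_flow[u] for u in upstream)

for n in nx.topological_sort(G):
    print(n, accumulated_flow(G, n))
```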

  16. Biana: a software framework for compiling biological interactions and analyzing networks.

    Science.gov (United States)

    Garcia-Garcia, Javier; Guney, Emre; Aragues, Ramon; Planas-Iglesias, Joan; Oliva, Baldo

    2010-01-27

    The analysis and usage of biological data is hindered by the spread of information across multiple repositories and the difficulties posed by different nomenclature systems and storage formats. In particular, there is an important need for data unification in the study and use of protein-protein interactions. Without good integration strategies, it is difficult to analyze the whole set of available data and its properties. We introduce BIANA (Biologic Interactions and Network Analysis), a tool for biological information integration and network management. BIANA is a Python framework designed to achieve two major goals: i) the integration of multiple sources of biological information, including biological entities and their relationships, and ii) the management of biological information as a network where entities are nodes and relationships are edges. Moreover, BIANA uses properties of proteins and genes to infer latent biomolecular relationships by transferring edges to entities sharing similar properties. BIANA is also provided as a plugin for Cytoscape, which allows users to visualize and interactively manage the data. A web interface to BIANA providing basic functionalities is also available. The software can be downloaded under GNU GPL license from http://sbi.imim.es/web/BIANA.php. BIANA's approach to data unification solves many of the nomenclature issues common to systems dealing with biological data. BIANA can easily be extended to handle new specific data repositories and new specific data types. The unification protocol allows BIANA to be a flexible tool suitable for different user requirements: non-expert users can use a suggested unification protocol while expert users can define their own specific unification rules.
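
    The sketch below is not BIANA's actual API; it only illustrates the unification idea with networkx. Records from different repositories (hypothetical identifiers) collapse into one node when they share an accession, and relationships become edges of the unified network.

```python
# Illustrative unification of multi-source records into one network;
# identifiers and databases are hypothetical, and this is not BIANA code.
import networkx as nx

# (source database, local id, shared accession) for each imported entity.
records = [("DB1", "p1", "P12345"), ("DB2", "x9", "P12345"), ("DB1", "p2", "Q67890")]
interactions = [("P12345", "Q67890")]   # relationships keyed by shared accession

G = nx.Graph()
for db, local_id, accession in records:
    if accession not in G:
        G.add_node(accession, members=[])
    # Entities sharing an accession are merged into the same unified node.
    G.nodes[accession]["members"].append(f"{db}:{local_id}")

G.add_edges_from(interactions)
print(G.nodes(data=True))
print(list(G.edges()))
```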

  17. Software Process Models and Analysis on Failure of Software Development Projects

    OpenAIRE

    Kaur, Rupinder; Sengupta, Jyotsna

    2013-01-01

    The software process model consists of a set of activities undertaken to design, develop and maintain software systems. A variety of software process models have been designed to structure, describe and prescribe the software development process. The software process models play a very important role in software development, so it forms the core of the software product. Software project failure is often devastating to an organization. Schedule slips, buggy releases and missing features can me...

  18. Conceptual Considerations for Reducing the Computational Complexity in Software Defined Radio using Cooperative Wireless Networks

    DEFF Research Database (Denmark)

    Kristensen, Jesper Michael; Fitzek, Frank H. P.; Koch, Peter

    2005-01-01

    This paper motivates the application of Software defined radio as the enabling technology in the implementation of future wireless terminals for 4G. It outlines the advantages and disadvantages of SDR in terms of Flexibility and reconfigurability versus computational complexity. To mitigate...... the expected increase in complexity leading to a decrease in energy efficiency, cooperative wireless networks are introduced. Cooperative wireless networks enables the concept of resource sharing. Resource sharing is interpreted as collaborative signal processing. This interpretation leads to the concept...

  19. Software defined radio for cognitive wireless sensor networks: a reconfigurable IEEE 802.15.4 physical layer

    OpenAIRE

    Zitouni, Rafik

    2015-01-01

    The increasing number of Wireless Sensor Network (WSN) applications has led industries to design the physical layer (PHY) of these networks following the IEEE 802.15.4 standard. The traditional hardware design of that layer suffers from a lack of flexibility in radio parameters, such as changing both frequency bands and modulations. This problem is emphasized by the scarcity of the radio-frequency spectrum. Software Defined Radio (SDR) is an attractive solution to easily reconfigure...

  20. Solving Problems in Software Applications through Data Synchronization in Case of Absence of the Network

    OpenAIRE

    Isak Shabani; Betim Cico; Agni Dika

    2012-01-01

    In this paper, we present an algorithm for data synchronization based on Web Services (WS), which allows software applications to work well in both Online and Offline configurations, i.e. in the absence of the network. For this purpose, the Electronic Student Management System (ESMS) at the University of Prishtina (UP) is in use with the appropriate module. Since the use of ESMS, because of an uncertain supply of electricity, disconnection of the network and other reasons which are not under the c...

  1. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the code and standards layer and the design methodology and documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures, high quality, defense-in-depth, and diversity are considered to be key elements in digital I&C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines.

  2. Topological analysis of telecommunications networks

    Directory of Open Access Journals (Sweden)

    Milojko V. Jevtović

    2011-01-01

    Full Text Available A topological analysis of the structure of telecommunications networks is a very interesting topic in network research, but also a key issue in their design and planning. Satisfying multiple criteria in terms of the locations of switching nodes as well as their connectivity with respect to the requirements for capacity, transmission speed, reliability, availability and cost are the main research objectives. There are three ways of presenting the topology of telecommunications networks: the table, matrix or graph method. The table method is suitable for a network with a relatively small number of nodes in relation to the number of links. The matrix method involves the formation of a connection matrix in which columns represent source traffic nodes and rows are the switching systems that belong to the destination. The graph method means that the network nodes are connected via directional or unidirectional links. We can thus easily analyze the structural parameters of telecommunications networks. This paper presents a mathematical analysis of the star-, ring-, fully connected loop- and grid (matrix)-shaped topologies as well as the topology based on the shortest path tree. For each of these topologies, the expressions for determining the number of branches, the mean level of reliability, the average path length and the average link length are given in tables. For the fully connected loop network with five nodes, the values of all topological parameters are calculated. Based on the topological parameters, the relationships that represent integral and distributed indicators of reliability are given in this work, as well as the values for the particular network. The main objectives of topology optimization of telecommunications networks are: achieving minimum complexity, maximum capacity, the shortest message transfer path, maximum communication speed and maximum economy. The performance of telecommunications networks is
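
    For orientation, the tabulated parameters for the fully connected five-node case can be reproduced with a few lines of networkx; this is an illustration, not the paper's derivation.

```python
# Topological parameters of a fully connected (complete) five-node network.
import networkx as nx

n = 5
G = nx.complete_graph(n)

branches = G.number_of_edges()                  # n(n-1)/2 = 10 links
avg_path = nx.average_shortest_path_length(G)   # 1.0 hop in a complete graph
node_degree = dict(G.degree())                  # every node connects to n-1 others

print(f"links: {branches}, average path length: {avg_path}, degrees: {node_degree}")
```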

  3. Propensity Score Analysis in R: A Software Review

    Science.gov (United States)

    Keller, Bryan; Tipton, Elizabeth

    2016-01-01

    In this article, we review four software packages for implementing propensity score analysis in R: "Matching, MatchIt, PSAgraphics," and "twang." After briefly discussing essential elements for propensity score analysis, we apply each package to a data set from the Early Childhood Longitudinal Study in order to estimate the…

  4. Software for Data Analysis Programming with R

    CERN Document Server

    Chambers, John

    2008-01-01

    Although statistical design is one of the oldest branches of statistics, its importance is ever increasing, especially in the face of the data flood that often faces statisticians. It is important to recognize the appropriate design, and to understand how to effectively implement it, being aware that the default settings from a computer package can easily provide an incorrect analysis. The goal of this book is to describe the principles that drive good design, paying attention to both the theoretical background and the problems arising from real experimental situations. Designs are motivated t

  5. Confirmatory Factor Analysis Alternative : Free, Accessible CBID Software.

    Science.gov (United States)

    Bott, Marjorie; Karanevich, Alex G; Garrard, Lili; Price, Larry R; Mudaranthakam, Dinesh Pal; Gajewski, Byron

    2016-12-01

    New software that performs Classical and Bayesian Instrument Development (CBID) is reported that seamlessly integrates expert (content validity) and participant data (construct validity) to produce entire reliability estimates with smaller sample requirements. The free CBID software can be accessed through a website and used by clinical investigators in new instrument development. Demonstrations are presented of the three approaches using the CBID software: (a) traditional confirmatory factor analysis (CFA), (b) Bayesian CFA using flat uninformative prior, and (c) Bayesian CFA using content expert data (informative prior). Outcomes of usability testing demonstrate the need to make the user-friendly, free CBID software available to interdisciplinary researchers. CBID has the potential to be a new and expeditious method for instrument development, adding to our current measurement toolbox. This allows for the development of new instruments for measuring determinants of health in smaller diverse populations or populations of rare diseases.

  6. Software Construction and Analysis Tools for Future Space Missions

    Science.gov (United States)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting them: 1) program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification; 2) model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

  7. COSIGN – developing an optical software controlled data plane for future large-scale datacenter networks

    DEFF Research Database (Denmark)

    Galili, Michael; Kamchevska, Valerija; Fagertun, Anna Manolova

    2015-01-01

    This talk will present the work of the EU project COSIGN targeting the development of optical data plane solutions for future high-capacity datacenter networks (DCNs). Optical data planes with high capacity and high flexibility through software control are developed in order to enable a coherent...

  8. Dynamic Flow Migration for Delay Constrained Traffic in Software-Defined Networks

    NARCIS (Netherlands)

    Berger, Andre; Gross, James; Danielis, Peter; Dán, György

    2017-01-01

    Various industrial control applications have stringent end-to-end latency requirements in the order of a few milliseconds. Software-defined networking (SDN) is a promising solution in order to meet these stringent requirements under varying traffic patterns, as it enables the flexible management of

  9. Equipment Obsolescence Analysis and Management Software

    Energy Technology Data Exchange (ETDEWEB)

    Redmond, J.; Carret, L.; Shaon, S.; Schultz, C.

    2015-07-01

    The procurement engineering resources at Nuclear Power Plants (NPPs) are experiencing an increasing backlog of procurement items, primarily due to the inability to order the original replacement parts. The level of effort and time required to prepare procurement packages is increasing since the number of obsolete parts is increasing exponentially. Procurement packages for obsolete components and parts are much more complex and take more time to prepare because of the need to perform equivalency evaluations, to develop testing requirements and test acceptance criteria, to carry out commercial grade dedication or equipment qualification, and to make increasing efforts to verify that no fraudulent or counterfeit parts are procured. This problem will be further compounded when NPPs pursue license renewal and approval for plant-life extension. Advanced planning and advanced knowledge of equipment obsolescence are required to allow sufficient time to properly procure replacement parts for obsolete items. The uncertain supply chain capability due to obsolescence is a real problem and can pose a risk to reliable plant operations due to the potential lack of available spare parts and replacement components to support outages and unplanned component failures. Advanced notification of obsolescence is increasingly important to ensure that adequate time and planning are scheduled to procure the proper replacement parts. A thorough analysis of Original Equipment Manufacturer (OEM) availability and inventory, as well as an analysis of failure rates and usage rates, is required to predict critical part needs and to allow early identification of obsolescence issues so that a planned and controlled strategy to qualify replacement equipment can be implemented. (Author)

  10. The software analysis project for the Office of Human Resources

    Science.gov (United States)

    Tureman, Robert L., Jr.

    1994-01-01

    There were two major sections of the project for the Office of Human Resources (OHR). The first section was a planning study to analyze software use with the goal of recommending software purchases and determining whether the need exists for a file server. The second section was analysis and distribution planning for a retirement planning computer program entitled VISION provided by NASA Headquarters. The software planning study was developed to help OHR analyze the current administrative desktop computing environment and make decisions regarding software acquisition and implementation. There were three major areas addressed by the study: the current environment, new software requirements, and strategies regarding the implementation of a server in the Office. To gather data on the current environment, employees were surveyed and an inventory of computers was produced. The surveys were compiled and analyzed by the ASEE fellow with interpretation help from OHR staff. New software requirements represented a compilation and analysis of the surveyed requests of OHR personnel. Finally, the information on the use of a server represents research done by the ASEE fellow and analysis of survey data to determine software requirements for a server. This included selection of a methodology to estimate the number of copies of each software program required given current use and estimated growth. The report presents the results of the computing survey, a description of the current computing environment, recommendations for changes in the computing environment, current software needs, management advantages of using a server, and management considerations in the implementation of a server. In addition, detailed specifications were presented for the hardware and software recommendations to offer a complete picture to OHR management. The retirement planning computer program available to NASA employees will aid in long-range retirement planning. The intended audience is the NASA civil

  11. Design of energy efficient optical networks with software enabled integrated control plane

    DEFF Research Database (Denmark)

    Wang, Jiayuan; Yan, Ying; Dittmann, Lars

    2015-01-01

    energy consumption by proposing a new integrated control plane structure utilising Software Defined Networking technologies. The integrated control plane increases the efficiencies of exchanging control information across different network domains, while introducing new possibilities to the routing...... methods and the control over quality of service (QoS). The structure is defined as an overlay generalised multi-protocol label switching (GMPLS) control model. With the defined structure, the integrated control plane is able to gather information from different domains (i.e. optical core network...

  12. First field trial of Virtual Network Operator oriented network on demand (NoD) service provisioning over software defined multi-vendor OTN networks

    Science.gov (United States)

    Li, Yajie; Zhao, Yongli; Zhang, Jie; Yu, Xiaosong; Chen, Haoran; Zhu, Ruijie; Zhou, Quanwei; Yu, Chenbei; Cui, Rui

    2017-01-01

    A Virtual Network Operator (VNO) is a provider and reseller of network services from other telecommunications suppliers. These network providers are categorized as virtual because they do not own the underlying telecommunication infrastructure. In terms of business operation, a VNO can provide customers with personalized services by leasing network infrastructure from traditional network providers. The unique business modes of VNOs lead to the emergence of network on demand (NoD) services. Conventional network provisioning involves a series of manual operations and configurations, which leads to a high cost in time. Considering the advantages of Software Defined Networking (SDN), this paper proposes a novel NoD service provisioning solution to satisfy the private network needs of VNOs. The solution is first verified in a real software defined multi-domain optical network with multi-vendor OTN equipment. With the proposed solution, NoD services can be deployed via online web portals in near-real time. This reinvents the customer experience and redefines how network services are delivered to customers via an online self-service portal. Ultimately, this means a customer will be able to simply go online, click a few buttons and have new services almost instantaneously.

  13. Hardware-and-software-based collective communication on the Quadrics network.

    Energy Technology Data Exchange (ETDEWEB)

    Petrini, F. (Fabrizio); Coll, S. (Salvador); Frachtemberg, E. (Eitan); Hoisie, A. (Adolfy)

    2001-01-01

    The efficient implementation of collective communication patterns in a parallel machine is a challenging design effort that requires the solution of many problems. In this paper we present an in-depth description of how the Quadrics network supports both hardware- and software-based collectives. We describe the main features of the two building blocks of this network, a network interface that can perform zero-copy user-level communication and a wormhole switch. We also focus our attention on the routing and flow control algorithms, deadlock avoidance, and on how the processing nodes are integrated in a global, virtual shared memory. Experimental results conducted on a 64-node AlphaServer cluster indicate that the time to complete the hardware-based barrier synchronization on the whole network is as low as 6 μs, with very good scalability. Good latency and scalability are also achieved with the software-based synchronization, which takes about 15 μs. With the broadcast, similar performance is achieved by the hardware- and software-based implementations, which can deliver messages of up to 256 bytes in 13 μs and can sustain a bandwidth of 288 Mbytes/sec on all the nodes with messages larger than 64 KB. The hardware-based barrier is almost insensitive to network congestion, with 93% of the synchronizations taking less than 20 μs. On the other hand, the software-based implementation suffers from a significant performance degradation. In high load environments the hardware broadcast maintains reasonably good performance, delivering messages up to 2 KB in 200 μs, while the software broadcast suffers from slightly higher latencies inherited from the synchronization mechanism.
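
    A software-level analogue of the barrier and broadcast measurements described above can be sketched with mpi4py; the message size and repetition count are illustrative choices, not the Quadrics experiment.

```python
# Timing a barrier and a broadcast with mpi4py; run e.g. with
#   mpirun -n 4 python bench.py
from mpi4py import MPI
import numpy as np
import time

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Average barrier latency over many repetitions.
reps = 1000
comm.Barrier()
t0 = time.perf_counter()
for _ in range(reps):
    comm.Barrier()
per_barrier = (time.perf_counter() - t0) / reps

# Broadcast of a 256-byte payload from rank 0.
payload = np.zeros(256, dtype=np.uint8) if rank == 0 else np.empty(256, dtype=np.uint8)
comm.Barrier()
t0 = time.perf_counter()
comm.Bcast(payload, root=0)
bcast_time = time.perf_counter() - t0

if rank == 0:
    print(f"barrier: {per_barrier * 1e6:.1f} us, 256-byte bcast: {bcast_time * 1e6:.1f} us")
```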

  14. Analysis of neural networks

    CERN Document Server

    Heiden, Uwe

    1980-01-01

    The purpose of this work is a unified and general treatment of activity in neural networks from a mathematical point of view. Possible applications of the theory presented are indicated throughout the text. However, they are not explored in detail for two reasons: first, the universal character of neural activity in nearly all animals requires some type of a general approach; secondly, the mathematical perspicuity would suffer if too many experimental details and empirical peculiarities were interspersed among the mathematical investigation. A guide to many applications is supplied by the references concerning a variety of specific issues. Of course the theory does not aim at covering all individual problems. Moreover there are other approaches to neural network theory (see e.g. Poggio-Torre, 1978) based on the different levels at which the nervous system may be viewed. The theory is a deterministic one reflecting the average behavior of neurons or neuron pools. In this respect the essay is writt...

  15. Energy-Saving Traffic Scheduling in Hybrid Software Defined Wireless Rechargeable Sensor Networks.

    Science.gov (United States)

    Wei, Yunkai; Ma, Xiaohui; Yang, Ning; Chen, Yijin

    2017-09-15

    Software Defined Wireless Rechargeable Sensor Networks (SDWRSNs) are an inexorable trend for Wireless Sensor Networks (WSNs), including Wireless Rechargeable Sensor Networks (WRSNs). However, traditional network devices cannot be completely substituted in the short term. Hybrid SDWRSNs, where software defined devices and traditional devices coexist, will last for a long time. Hybrid SDWRSNs bring new challenges as well as opportunities for energy saving, which is still a key problem considering that the wireless chargers are also exhaustible, especially in harsh environments beyond the reach of the mains supply. Numerous energy saving schemes for WSNs, or even some works for WRSNs, are no longer suitable for the new features of hybrid SDWRSNs. To solve this problem, this paper puts forward an Energy-saving Traffic Scheduling (ETS) algorithm. The ETS algorithm adequately considers the new characteristics of hybrid SDWRSNs, and takes advantage of the Software Defined Networking (SDN) controller's direct control over SDN nodes and indirect control over normal nodes. The simulation results show that, compared with the traditional Minimum Transmission Energy (MTE) protocol, ETS can substantially improve the energy efficiency in hybrid SDWRSNs by up to 20-40% while ensuring feasible data delay.

  16. Energy-Saving Traffic Scheduling in Hybrid Software Defined Wireless Rechargeable Sensor Networks

    Directory of Open Access Journals (Sweden)

    Yunkai Wei

    2017-09-01

    Full Text Available Software Defined Wireless Rechargeable Sensor Networks (SDWRSNs) are an inexorable trend for Wireless Sensor Networks (WSNs), including Wireless Rechargeable Sensor Networks (WRSNs). However, traditional network devices cannot be completely substituted in the short term. Hybrid SDWRSNs, where software defined devices and traditional devices coexist, will last for a long time. Hybrid SDWRSNs bring new challenges as well as opportunities for energy saving, which is still a key problem considering that the wireless chargers are also exhaustible, especially in harsh environments beyond the reach of the mains supply. Numerous energy saving schemes for WSNs, or even some works for WRSNs, are no longer suitable for the new features of hybrid SDWRSNs. To solve this problem, this paper puts forward an Energy-saving Traffic Scheduling (ETS) algorithm. The ETS algorithm adequately considers the new characteristics of hybrid SDWRSNs, and takes advantage of the Software Defined Networking (SDN) controller's direct control over SDN nodes and indirect control over normal nodes. The simulation results show that, compared with the traditional Minimum Transmission Energy (MTE) protocol, ETS can substantially improve the energy efficiency in hybrid SDWRSNs by up to 20-40% while ensuring feasible data delay.

  17. Energy-Saving Traffic Scheduling in Hybrid Software Defined Wireless Rechargeable Sensor Networks

    Science.gov (United States)

    Wei, Yunkai; Ma, Xiaohui; Yang, Ning; Chen, Yijin

    2017-01-01

    Software Defined Wireless Rechargeable Sensor Networks (SDWRSNs) are an inexorable trend for Wireless Sensor Networks (WSNs), including Wireless Rechargeable Sensor Networks (WRSNs). However, traditional network devices cannot be completely substituted in the short term. Hybrid SDWRSNs, where software defined devices and traditional devices coexist, will last for a long time. Hybrid SDWRSNs bring new challenges as well as opportunities for energy saving, which is still a key problem considering that the wireless chargers are also exhaustible, especially in harsh environments beyond the reach of the mains supply. Numerous energy saving schemes for WSNs, or even some works for WRSNs, are no longer suitable for the new features of hybrid SDWRSNs. To solve this problem, this paper puts forward an Energy-saving Traffic Scheduling (ETS) algorithm. The ETS algorithm adequately considers the new characteristics of hybrid SDWRSNs, and takes advantage of the Software Defined Networking (SDN) controller's direct control over SDN nodes and indirect control over normal nodes. The simulation results show that, compared with the traditional Minimum Transmission Energy (MTE) protocol, ETS can substantially improve the energy efficiency in hybrid SDWRSNs by up to 20-40% while ensuring feasible data delay. PMID:28914816

  18. Applications of the BEam Cross section Analysis Software (BECAS)

    DEFF Research Database (Denmark)

    Blasques, José Pedro Albergaria Amaral; Bitsche, Robert; Fedorov, Vladimir

    2013-01-01

    A newly developed framework is presented for structural design and analysis of long slender beam-like structures, e.g., wind turbine blades. The framework is based on the BEam Cross section Analysis Software – BECAS – a finite element based cross section analysis tool. BECAS is used for the generation of beam finite element models which correctly account for effects stemming from material anisotropy and inhomogeneity in cross sections of arbitrary geometry. This type of modelling approach allows for an accurate yet computationally inexpensive representation of a general class of three...

  19. Statistical network analysis for analyzing policy networks

    DEFF Research Database (Denmark)

    Robins, Garry; Lewis, Jenny; Wang, Peng

    2012-01-01

    To analyze social network data using standard statistical approaches is to risk incorrect inference. The dependencies among observations implied in a network conceptualization undermine standard assumptions of the usual general linear models. One of the most quickly expanding areas of social...... and policy network methodology is the development of statistical modeling approaches that can accommodate such dependent data. In this article, we review three network statistical methods commonly used in the current literature: quadratic assignment procedures, exponential random graph models (ERGMs...
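
    One of the reviewed methods, the quadratic assignment procedure (QAP), can be sketched in a few lines: correlate two hypothetical network matrices and compare the observed correlation against a node-relabeling permutation distribution.

```python
# A minimal QAP permutation test on two hypothetical adjacency matrices.
import numpy as np

rng = np.random.default_rng(1)
n = 20
policy_ties = rng.integers(0, 2, size=(n, n))                       # hypothetical
resource_ties = (policy_ties + rng.integers(0, 2, size=(n, n))) % 2 # hypothetical

mask = ~np.eye(n, dtype=bool)   # ignore the diagonal (self-ties)

def matrix_corr(a, b):
    return np.corrcoef(a[mask], b[mask])[0, 1]

observed = matrix_corr(policy_ties, resource_ties)

# Build the reference distribution by permuting rows and columns together,
# which preserves the structure of each network while breaking the pairing.
perm_stats = []
for _ in range(2000):
    p = rng.permutation(n)
    perm_stats.append(matrix_corr(policy_ties[p][:, p], resource_ties))

p_value = np.mean(np.abs(perm_stats) >= abs(observed))
print(f"QAP correlation {observed:.3f}, permutation p = {p_value:.3f}")
```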

  20. HANSIS software tool for the automated analysis of HOLZ lines

    Energy Technology Data Exchange (ETDEWEB)

    Holec, D., E-mail: david.holec@unileoben.ac.at [Department of Materials Science and Metallurgy, University of Cambridge, Pembroke Street, Cambridge CB2 3QZ (United Kingdom); Sridhara Rao, D.V.; Humphreys, C.J. [Department of Materials Science and Metallurgy, University of Cambridge, Pembroke Street, Cambridge CB2 3QZ (United Kingdom)

    2009-06-15

    A software tool, named HANSIS (HOLZ analysis), has been developed for the automated analysis of higher-order Laue zone (HOLZ) lines in convergent beam electron diffraction (CBED) patterns. With this tool, the angles and distances between HOLZ intersections can be measured and the data can be presented graphically through a user-friendly interface. It is capable of simultaneous analysis of several HOLZ patterns and thus provides a tool for systematic studies of CBED patterns.

  1. First statistical analysis of Geant4 quality software metrics

    Science.gov (United States)

    Ronchieri, Elisabetta; Grazia Pia, Maria; Giacomini, Francesco

    2015-12-01

    Geant4 is a simulation system of particle transport through matter, widely used in several experimental areas from high energy physics and nuclear experiments to medical studies. Some of its applications may involve critical use cases; therefore they would benefit from an objective assessment of the software quality of Geant4. In this paper, we provide a first statistical evaluation of software metrics data related to a set of Geant4 physics packages. The analysis aims at identifying risks for Geant4 maintainability, which would benefit from being addressed at an early stage. The findings of this pilot study set the grounds for further extensions of the analysis to the whole of Geant4 and to other high energy physics software systems.

  2. Software-defined network abstractions and configuration interfaces for building programmable quantum networks

    Energy Technology Data Exchange (ETDEWEB)

    Dasari, Venkat [U.S. Army Research Laboratory, Aberdeen Proving Ground, MD; Sadlier, Ronald J [ORNL; Geerhart, Mr. Billy [U.S. Army Research Laboratory, Aberdeen Proving Ground, MD; Snow, Nikolai [U.S. Army Research Laboratory, Aberdeen Proving Ground, MD; Williams, Brian P [ORNL; Humble, Travis S [ORNL

    2017-01-01

    Well-defined and stable quantum networks are essential to realize functional quantum applications. Quantum networks are complex and must use both quantum and classical channels to support quantum applications like QKD, teleportation, and superdense coding. In particular, the no-cloning theorem prevents the reliable copying of quantum signals such that the quantum and classical channels must be highly coordinated using robust and extensible methods. We develop new network abstractions and interfaces for building programmable quantum networks. Our approach leverages new OpenFlow data structures and table type patterns to build programmable quantum networks and to support quantum applications.

  3. Technology Infusion of CodeSonar into the Space Network Ground Segment (RII07): Software Assurance Symposium Technical Summary

    Science.gov (United States)

    Benson, Markland J.

    2008-01-01

    Presents a source code analysis tool (CodeSonar) for use in the Space Network Ground Segment. The Space Network requires 99.9% proficiency and 97.0% availability of systems. Software has historically accounted for an annual average of 28% of the Space Network loss of availability and proficiency. CSCI A and CSCI B account for 42% of the previous eight months of software data loss. The technology infusion of CodeSonar into the Space Network Ground segment is meant to aid in determining the impact of the technology on the project both in the expenditure of effort and the technical results of the technology. Running a CodeSonar analysis and performing a preliminary review of the results averaged 3.5 minutes per finding (approximately 20 hours total). An additional 40 hours is estimated to analyze the 37 findings deemed too complex for the initial review. Using CodeSonar's tools to suppress known non-problems, delta tool runs will not repeat findings that have been marked as non-problems, further reducing the time needed for review. The 'non-interesting' finding rate of 70% is a large number, but filtering, search, and detailed contextual features of CodeSonar reduce the time per finding. Integration of the tool into the build process may also provide further savings by preventing developers from having to configure and operate the tool separately. These preliminary results show the tool to be easy to use and incorporate into the engineering process. These findings also provide significant potential improvements in proficiency and availability on the part of the software. As time-to-fix data become available a better cost trade can be made on person hours saved versus tool cost. Selective factors may be necessary to determine where best to apply CodeSonar to balance cost and benefits.

  4. New Results in Software Model Checking and Analysis

    Science.gov (United States)

    Pasareanu, Corina S.

    2010-01-01

    This introductory article surveys new techniques, supported by automated tools, for the analysis of software to ensure reliability and safety. Special focus is on model checking techniques. The article also introduces the five papers that are enclosed in this special journal volume.

  5. The Implication of Using NVivo Software in Qualitative Data Analysis ...

    African Journals Online (AJOL)

    2015-03-15

    However, Computer Assisted Qualitative Data Analysis Software (CAQDAS) packages are increasingly being developed. .... approval was granted by Malawi's National Health Science Research Committee. Data .... assumption of the tutorial is that the researcher has the basic understanding of the computer, which ...

  6. A Pedagogical Software for the Analysis of Loudspeaker Systems

    Science.gov (United States)

    Pueo, B.; Roma, M.; Escolano, J.; Lopez, J. J.

    2009-01-01

    In this paper, pedagogical software for the design and analysis of loudspeaker systems is presented, with emphasis on training students in the interaction between system parameters. Loudspeakers are complex electromechanical systems whose behavior is neither intuitive nor easy for inexperienced students to understand. Although commercial…

  7. UTOOLS: microcomputer software for spatial analysis and landscape visualization.

    Science.gov (United States)

    Alan A. Ager; Robert J. McGaughey

    1997-01-01

    UTOOLS is a collection of programs designed to integrate various spatial data in a way that allows versatile spatial analysis and visualization. The programs were designed for watershed-scale assessments in which a wide array of resource data must be integrated, analyzed, and interpreted. UTOOLS software combines raster, attribute, and vector data into "spatial...

  8. Software Product "Equilibrium" for Preparation and Analysis of Aquatic Solutions

    CERN Document Server

    Bontchev, G D; Ivanov, P I; Maslov, O D; Milanov, M V; Dmitriev, S N

    2003-01-01

    Software product "Equilibrium" for preparation and analysis of aquatic solutions is developed. The program allows determining analytical parameters of a solution, such as ionic force and pH. "Equilibrium" is able to calculate the ratio of existing ion forms in the solution, with respect to the hydrolysis and complexation in the presence of one or more ligands.

  9. Using Business Analysis Software in a Business Intelligence Course

    Science.gov (United States)

    Elizondo, Juan; Parzinger, Monica J.; Welch, Orion J.

    2011-01-01

    This paper presents an example of a project used in an undergraduate business intelligence class which integrates concepts from statistics, marketing, and information systems disciplines. SAS Enterprise Miner software is used as the foundation for predictive analysis and data mining. The course culminates with a competition and the project is used…

  10. Softwarization of Mobile Network Functions towards Agile and Energy Efficient 5G Architectures: A Survey

    Directory of Open Access Journals (Sweden)

    Dlamini Thembelihle

    2017-01-01

    Full Text Available Future mobile networks (MNs) are required to be flexible with minimal infrastructure complexity, unlike current ones that rely on proprietary network elements to offer their services. Moreover, they are expected to make use of renewable energy to decrease their carbon footprint and of virtualization technologies for improved adaptability and flexibility, thus resulting in green and self-organized systems. In this article, we discuss the application of software defined networking (SDN) and network function virtualization (NFV) technologies towards softwarization of the mobile network functions, taking into account different architectural proposals. In addition, we elaborate on whether mobile edge computing (MEC), a new architectural concept that uses NFV techniques, can enhance communication in 5G cellular networks, reducing latency due to its proximity deployment. Besides discussing existing techniques, expounding their pros and cons and comparing state-of-the-art architectural proposals, we examine the role of machine learning and data mining tools, analyzing their use within fully SDN- and NFV-enabled mobile systems. Finally, we outline the challenges and the open issues related to evolved packet core (EPC) and MEC architectures.

  11. Open Source software and social networks: disruptive alternatives for medical imaging.

    Science.gov (United States)

    Ratib, Osman; Rosset, Antoine; Heuberger, Joris

    2011-05-01

    In recent decades several major changes in computer and communication technology have pushed the limits of imaging informatics and PACS beyond the traditional system architecture providing new perspectives and innovative approach to a traditionally conservative medical community. Disruptive technologies such as the world-wide-web, wireless networking, Open Source software and recent emergence of cyber communities and social networks have imposed an accelerated pace and major quantum leaps in the progress of computer and technology infrastructure applicable to medical imaging applications. This paper reviews the impact and potential benefits of two major trends in consumer market software development and how they will influence the future of medical imaging informatics. Open Source software is emerging as an attractive and cost effective alternative to traditional commercial software developments and collaborative social networks provide a new model of communication that is better suited to the needs of the medical community. Evidence shows that successful Open Source software tools have penetrated the medical market and have proven to be more robust and cost effective than their commercial counterparts. Developed by developers that are themselves part of the user community, these tools are usually better adapted to the user's need and are more robust than traditional software programs being developed and tested by a large number of contributing users. This context allows a much faster and more appropriate development and evolution of the software platforms. Similarly, communication technology has opened up to the general public in a way that has changed the social behavior and habits adding a new dimension to the way people communicate and interact with each other. The new paradigms have also slowly penetrated the professional market and ultimately the medical community. Secure social networks allowing groups of people to easily communicate and exchange information

  12. Orbiter subsystem hardware/software interaction analysis. Volume 8: AFT reaction control system, part 2

    Science.gov (United States)

    Becker, D. D.

    1980-01-01

    The orbiter subsystems and interfacing program elements which interact with the orbiter computer flight software are analyzed. The failure modes identified in the subsystem/element failure mode and effects analysis are examined. Potential interaction with the software is examined through an evaluation of the software requirements. The analysis is restricted to flight software requirements and excludes utility/checkout software. The results of the hardware/software interaction analysis for the forward reaction control system are presented.

  13. [Finite Element Analysis of Intravascular Stent Based on ANSYS Software].

    Science.gov (United States)

    Shi, Gengqiang; Song, Xiaobing

    2015-10-01

    This paper adopted UG8.0 to build the stent and blood vessel models. The models were then imported into the finite element analysis software ANSYS. The ANSYS simulation results showed that after endothelial stent implantation, the velocity of the blood was slow and the fluctuation in velocity was small, which meant the flow was relatively stable. When blood flowed through the endothelial stent, the pressure gradually became smaller, and the range of the pressure was not wide. The endothelial shear stress remained basically unchanged. In general, it can be concluded that endothelial stents have little impact on the flow of blood and can fully realize their function.

  14. NEAT : an efficient network enrichment analysis test

    NARCIS (Netherlands)

    Signorelli, Mirko; Vinciotti, Veronica; Wit, Ernst C

    2016-01-01

    BACKGROUND: Network enrichment analysis is a powerful method which allows gene enrichment analysis to be integrated with the information on relationships between genes that is provided by gene networks. Existing tests for network enrichment analysis deal only with undirected networks; they can be
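
    The sketch below is not the NEAT test itself; it shows the simpler hypergeometric over-representation baseline that network enrichment methods extend, using hypothetical gene counts.

```python
# Classic over-representation of a gene set in a functional category,
# via the hypergeometric distribution (counts are hypothetical).
from scipy.stats import hypergeom

M = 20000   # genes in the background
n = 300     # genes annotated to the category of interest
N = 150     # genes in the experimentally derived set
k = 12      # overlap between the two sets

# P(overlap >= k) under random sampling without replacement.
p_value = hypergeom.sf(k - 1, M, n, N)
print(f"enrichment p-value: {p_value:.3g}")
```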

  15. Robust flux balance analysis of multiscale biochemical reaction networks.

    Science.gov (United States)

    Sun, Yuekai; Fleming, Ronan M T; Thiele, Ines; Saunders, Michael A

    2013-07-30

    Biological processes such as metabolism, signaling, and macromolecular synthesis can be modeled as large networks of biochemical reactions. Large and comprehensive networks, like integrated networks that represent metabolism and macromolecular synthesis, are inherently multiscale because reaction rates can vary over many orders of magnitude. They require special methods for accurate analysis because naive use of standard optimization systems can produce inaccurate or erroneously infeasible results. We describe techniques enabling off-the-shelf optimization software to compute accurate solutions to the poorly scaled optimization problems arising from flux balance analysis of multiscale biochemical reaction networks. We implement lifting techniques for flux balance analysis within the openCOBRA toolbox and demonstrate our techniques using the first integrated reconstruction of metabolism and macromolecular synthesis for E. coli. Our techniques enable accurate flux balance analysis of multiscale networks using off-the-shelf optimization software. Although we describe lifting techniques in the context of flux balance analysis, our methods can be used to handle a variety of optimization problems arising from analysis of multiscale network reconstructions.
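
    Flux balance analysis reduces to a linear program, maximize c^T v subject to S v = 0 and lb <= v <= ub; the toy sketch below uses SciPy on a made-up two-metabolite network and does not reproduce the lifting techniques or the openCOBRA implementation described above.

```python
# Toy flux balance analysis posed as a linear program and solved with SciPy.
import numpy as np
from scipy.optimize import linprog

# Stoichiometric matrix (metabolites x reactions), hypothetical network:
#             v_uptake  v_conv  v_biomass
S = np.array([[ 1,       -1,      0],    # metabolite A
              [ 0,        1,     -1]])   # metabolite B

c = np.array([0, 0, 1])                  # objective: maximize biomass flux
bounds = [(0, 10), (0, 10), (0, None)]   # (lb, ub) per reaction flux

# linprog minimizes, so negate the objective to maximize c^T v.
res = linprog(-c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
print("optimal fluxes:", res.x, "max biomass flux:", -res.fun)
```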

  16. Design Criteria For Networked Image Analysis System

    Science.gov (United States)

    Reader, Cliff; Nitteberg, Alan

    1982-01-01

    Image systems design is currently undergoing a metamorphosis from the conventional computing systems of the past into a new generation of special purpose designs. This change is motivated by several factors, notable among which is the increased opportunity for high performance with low cost offered by advances in semiconductor technology. Another key issue is a maturing understanding of problems and the applicability of digital processing techniques. These factors allow the design of cost-effective systems that are functionally dedicated to specific applications and used in a utilitarian fashion. Following an overview of the above-stated issues, the paper presents a top-down approach to the design of networked image analysis systems. The requirements for such a system are presented, with orientation toward the hospital environment. The three main areas are image data base management, viewing of image data and image data processing. This is followed by a survey of the current state of the art, covering image display systems, data base techniques, communications networks and software systems control. The paper concludes with a description of the functional subsystems and architectural framework for networked image analysis in a production environment.

  17. One-Click Data Analysis Software for Science Operations

    Science.gov (United States)

    Navarro, Vicente

    2015-12-01

    One of the important activities of the ESA Science Operations Centre is to provide Data Analysis Software (DAS) to enable users and scientists to process data further to higher levels. During operations and post-operations, Data Analysis Software (DAS) is fully maintained and updated for new OS and library releases. Nonetheless, once a mission goes into the "legacy" phase, there are very limited funds and long-term preservation becomes more and more difficult. Building on Virtual Machine (VM), cloud computing and Software as a Service (SaaS) technologies, this project has aimed at providing long-term preservation of Data Analysis Software for the following missions: PIA for ISO (1995), SAS for XMM-Newton (1999), HIPE for Herschel (2009), and EXIA for EXOSAT (1983). The following goals have guided the architecture: support for all operations, post-operations and archive/legacy phases; support for local (user's computer) and cloud environments (ESAC Cloud, Amazon AWS); support for expert users, requiring full capabilities; and provision of a simple web-based interface. This talk describes the architecture, challenges, results and lessons learnt in this project.

  18. SIMA: Python software for analysis of dynamic fluorescence imaging data

    OpenAIRE

    Patrick Kaifosh; Jeffrey Zaremba; Nathan Danielson; Attila Losonczy

    2014-01-01

    Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scannin...

  19. Application of econometric and ecology analysis methods in physics software

    Science.gov (United States)

    Han, Min Cheol; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Sung Hun; Grazia Pia, Maria; Ronchieri, Elisabetta; Saracco, Paolo

    2017-10-01

    Some data analysis methods typically used in econometric studies and in ecology have been evaluated and applied in physics software environments. They concern the evolution of observables through objective identification of change points and trends, and measurements of inequality, diversity and evenness across a data set. Within each analysis area, various statistical tests and measures have been examined. This conference paper gives a brief overview of some of these methods.

  20. Scalable software-defined optical networking with high-performance routing and wavelength assignment algorithms.

    Science.gov (United States)

    Lee, Chankyun; Cao, Xiaoyuan; Yoshikane, Noboru; Tsuritani, Takehiro; Rhee, June-Koo Kevin

    2015-10-19

    The feasibility of software-defined optical networking (SDON) for a practical application critically depends on the scalability of centralized control performance. In this paper, highly scalable routing and wavelength assignment (RWA) algorithms are investigated on an OpenFlow-based SDON testbed for a proof-of-concept demonstration. Efficient RWA algorithms are proposed to achieve high network capacity with reduced computation cost, which is a significant attribute in a scalable centralized-control SDON. The proposed heuristic RWA algorithms differ in the order in which requests are processed and in the procedures for routing table updates. Combined with a shortest-path-based routing algorithm, a hottest-request-first processing policy that considers demand intensity and end-to-end distance information offers both the highest network throughput and acceptable computation scalability. We further investigate the trade-off between network throughput and computation complexity in the routing table update procedure through a simulation study.
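
    As a rough sketch of this class of heuristic (a hypothetical illustration, not the authors' implementation), the fragment below orders lightpath requests by a hotness score combining demand intensity and end-to-end distance, routes each request on a shortest path with networkx, and assigns the first wavelength that is free on every link of that path.

        import networkx as nx

        def first_fit_rwa(graph, requests, num_wavelengths):
            """requests: list of (src, dst, intensity); returns {request: (path, wavelength) or None}."""
            used = {}                                # (u, v, wavelength) -> occupied
            assignments = {}
            def hotness(req):                        # hottest-request-first ordering
                src, dst, intensity = req
                return intensity * nx.shortest_path_length(graph, src, dst, weight="weight")
            for src, dst, intensity in sorted(requests, key=hotness, reverse=True):
                path = nx.shortest_path(graph, src, dst, weight="weight")
                links = [tuple(sorted(edge)) for edge in zip(path, path[1:])]
                for w in range(num_wavelengths):     # wavelength continuity constraint
                    if all((u, v, w) not in used for u, v in links):
                        for u, v in links:
                            used[(u, v, w)] = True
                        assignments[(src, dst, intensity)] = (path, w)
                        break
                else:
                    assignments[(src, dst, intensity)] = None   # request blocked
            return assignments

        g = nx.cycle_graph(6)
        nx.set_edge_attributes(g, 1.0, "weight")
        print(first_fit_rwa(g, [(0, 3, 5.0), (1, 4, 2.0)], num_wavelengths=4))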

  1. Design of a stateless low-latency router architecture for green software-defined networking

    Science.gov (United States)

    Saldaña Cercós, Silvia; Ramos, Ramon M.; Ewald Eller, Ana C.; Martinello, Magnos; Ribeiro, Moisés. R. N.; Manolova Fagertun, Anna; Tafur Monroy, Idelfonso

    2015-01-01

    Expanding software defined networking (SDN) to transport networks requires new strategies to deal with the large number of flows that future core networks will have to face. New south-bound protocols within SDN have been proposed to benefit from having the control plane detached from the data plane, offering a cost- and energy-efficient forwarding engine. This paper presents an overview of a new approach named KeyFlow to simultaneously reduce latency, jitter, and power consumption in core network nodes. Results on an emulation platform indicate that round trip time (RTT) can be reduced by more than 50% compared to the reference protocol OpenFlow, especially when flow tables are densely populated. Jitter reduction has been demonstrated experimentally on a NetFPGA-based platform, and 57.3% power consumption reduction has been achieved.

  2. Analysis of Layered Social Networks

    Science.gov (United States)

    2006-09-01

    Only front matter is available for this record: a list of abbreviations (JP, Joint Publication; JTC, Joint Targeting Cycle; KPP, Key Player Problem; MCDM, Multi-Criteria Decision Making; MP, Mathematical Programming; MST, ...) and the epigraph of the introduction, "To know them means to eliminate them" (Colonel Mathieu in the movie Battle of Algiers).

  3. Novel software package for cross-platform transcriptome analysis (CPTRA).

    Science.gov (United States)

    Zhou, Xin; Su, Zhen; Sammons, R Douglas; Peng, Yanhui; Tranel, Patrick J; Stewart, C Neal; Yuan, Joshua S

    2009-10-08

    Next-generation sequencing techniques enable several novel transcriptome profiling approaches. Recent studies indicated that digital gene expression profiling based on short sequence tags has superior performance as compared to other transcriptome analysis platforms, including microarrays. However, transcriptomic analysis with tag-based methods often depends on an available genome sequence. The use of tag-based methods in species without a genome sequence should be complemented by other methods such as cDNA library sequencing. The combination of different next-generation sequencing techniques like 454 pyrosequencing and the Illumina Genome Analyzer (Solexa) will enable high-throughput and accurate global gene expression profiling in species with limited genome information. The combination of transcriptome data acquisition methods requires cross-platform transcriptome data analysis platforms, including a new software package for data processing. Here we present a software package, CPTRA: Cross-Platform TRanscriptome Analysis, to analyze transcriptome profiling data from separate methods. The software package is available at http://people.tamu.edu/~syuan/cptra/cptra.html. It was applied to the case study of non-target-site glyphosate resistance in horseweed, and the data were mined to discover resistance target gene(s). For the software, the input data included a long-read sequence dataset with proper annotation, and a short-read sequence tag dataset for the quantification of transcripts. By combining the two datasets, the software carries out the unique sequence tag identification, tag counting for transcript quantification, and cross-platform sequence matching functions, whereby the short sequence tags can be annotated with a function, level of expression, and Gene Ontology (GO) classification. Multiple sequence search algorithms were implemented and compared. The analysis highlighted the importance of transport genes in glyphosate resistance and identified

  4. Statistical analysis of network data with R

    CERN Document Server

    Kolaczyk, Eric D

    2014-01-01

    Networks have permeated everyday life through everyday realities like the Internet, social networks, and viral marketing. As such, network analysis is an important growth area in the quantitative sciences, with roots in social network analysis going back to the 1930s and graph theory going back centuries. Measurement and analysis are integral components of network research. As a result, statistical methods play a critical role in network analysis. This book is the first of its kind in network research. It can be used as a stand-alone resource in which multiple R packages are used to illustrate how to conduct a wide range of network analyses, from basic manipulation and visualization, to summary and characterization, to modeling of network data. The central package is igraph, which provides extensive capabilities for studying network graphs in R. This text builds on Eric D. Kolaczyk’s book Statistical Analysis of Network Data (Springer, 2009).

  5. Application of DIgSILENT POWERFACTORY software for modeling of industrial network relay protection system

    Directory of Open Access Journals (Sweden)

    Sučević Nikola

    2016-01-01

    This paper presents the modeling of an industrial network relay protection system using DIgSILENT PowerFactory software. The basis for the model of the protection system is a model of a single substation in an industrial network. The paper presents the procedure for modeling the protective devices of 6 kV asynchronous motors and 6/0.4 kV transformers, as well as the bus coupler protection and busbar protection. The protective relay system response for the simulated disturbances is shown in the paper.

  6. IT Career JumpStart An Introduction to PC Hardware, Software, and Networking

    CERN Document Server

    Alpern, Naomi J; Muller, Randy

    2011-01-01

    A practical approach for anyone looking to enter the IT workforce. Before candidates can begin to prepare for any kind of certification, they need a basic understanding of the various hardware and software components used in a computer network. Aimed at aspiring IT professionals, this invaluable book strips down a network to its bare basics, and discusses this complex topic in a clear and concise manner so that IT beginners can confidently gain an understanding of fundamental IT concepts. In addition, a base knowledge has been established so that more advanced topics and technologies can be learned

  7. Software Defined Networking to support IP address mobility in future LTE network

    NARCIS (Netherlands)

    Karimzadeh Motallebi Azar, Morteza; Valtulina, Luca; van den Berg, Hans Leo; Pras, Aiko; Liebsch, Marco; Taleb, Tarik

    2017-01-01

    The existing LTE network architecture does not scale well to increasing demands due to its highly centralized and hierarchical composition. In this paper we discuss the major modifications required in the current LTE network to realize a decentralized LTE architecture. Next, we develop two IP

  8. GNU Oflox: an academic software for the minimal cost network flow problem

    Directory of Open Access Journals (Sweden)

    Andrés M. Sajo-Castelli

    2013-07-01

    We present an open-source software package written for GNU Octave. The software is an implementation of the Simplex algorithm for the minimal cost network flow problem oriented towards the academic environment. The implementation supports the use of Big-M and Phase I/Phase II methods and it can also start from a given feasible solution. Flexibility of the package's output configuration provides many attractive possibilities. The outputs are plain editable LaTeX files that can be modified and orchestrated to fit most academic needs. It can be used in examination materials, homework assignments or even form part of a project. The format used to describe the network is the DIMACS min file format, to which a simple extension was added in order to support the description of feasible trees in the file.
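
    For readers who want to experiment with the same problem outside Octave, the minimal cost network flow problem that Oflox targets can be set up in a few lines with the networkx package; the example below illustrates the problem only and is unrelated to Oflox's Simplex code or its DIMACS input format.

        import networkx as nx

        # Node 'demand' < 0 marks a supply node, > 0 a demand node; edges carry
        # 'capacity' and a per-unit 'weight' (cost).
        g = nx.DiGraph()
        g.add_node("s", demand=-4)
        g.add_node("t", demand=4)
        g.add_edge("s", "a", capacity=3, weight=2)
        g.add_edge("s", "b", capacity=3, weight=4)
        g.add_edge("a", "t", capacity=3, weight=1)
        g.add_edge("b", "t", capacity=3, weight=1)

        flow = nx.min_cost_flow(g)               # solved with the network simplex method
        print(flow)                              # e.g. {'s': {'a': 3, 'b': 1}, ...}
        print("total cost:", nx.cost_of_flow(g, flow))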

  9. A Framework and Comparative Analysis of Control Plane Security of SDN and Conventional Networks

    OpenAIRE

    Abdou, AbdelRahman; van Oorschot, Paul C.; Wan, Tao

    2017-01-01

    Software defined networking implements the network control plane in an external entity, rather than in each individual device as in conventional networks. This architectural difference implies a different design for control functions necessary for essential network properties, e.g., loop prevention and link redundancy. We explore how such differences redefine the security weaknesses in the SDN control plane and provide a framework for comparative analysis which focuses on essential network pr...

  10. Quantitative software analysis of ultrasonographic textures in experimental testicular torsion.

    Science.gov (United States)

    Aslan, Mustafa; Kucukaslan, Ibrahim; Mulazimoglu, Serkan; Soyer, Tutku; Şenyücel, Mine; Çakmak, Murat; Scholbach, Jakob; Aslan, Selim

    2013-04-01

    Ultrasonography (US) has high diagnostic value in testicular torsion but is vulnerable to several potential errors, especially in the early period. Echotexture (ETX) analysis software provides a numerical expression of B-mode images and allows quantitative evaluation of blood flow due to ischemic damage using power Doppler US (PDUS) analysis. Our aim in this study was to determine the diagnostic value and effective parameters of ETX analysis software in the early period of torsion using B-mode and PDUS images. In this study, eight rats were used. Following anesthesia, the right testis was rotated to a 1080-degree counterclockwise position whereas the left testis was left in place to serve as a control. B-mode and PDUS images of both sides were recorded with a portable US device immediately (0 hour) and 1 and 2 hours after torsion. The B-mode images were analyzed in terms of gradient, homogeneity, and contrast using the BS200pro software (BAB Digital Imaging System 2007, Ankara, Turkey). Intensity (I)-red and area (A)-red values were measured on PDUS images with the Pixelflux software (Version 1.0, Chameleon-Software, Leipzig, Germany). The data were evaluated by the Mann-Whitney U and Wilcoxon tests. Data from B-mode US image ETX analysis showed no significant difference between the right and left testicles at 0 to 2 hours (p > 0.05). The values obtained from PDUS analysis (I-red and A-red) decreased significantly on the testicular torsion side at the end of the second hour: the difference was not significant at 0 and 1 hours (p > 0.05), whereas the flow was significantly lower at 2 hours (p < 0.05) in the PDUS analysis. Georg Thieme Verlag KG Stuttgart · New York.

  11. STOMP: A Software Architecture for the Design and Simulation UAV-Based Sensor Networks

    Energy Technology Data Exchange (ETDEWEB)

    Jones, E D; Roberts, R S; Hsia, T C S

    2002-10-28

    This paper presents the Simulation, Tactical Operations and Mission Planning (STOMP) software architecture and framework for simulating, controlling and communicating with unmanned air vehicles (UAVs) servicing large distributed sensor networks. STOMP provides hardware-in-the-loop capability, enabling real UAVs and sensors to feed back state information, route data and receive command and control requests while interacting with other real or virtual objects, thereby enhancing support for simulation of dynamic and complex events.

  12. A Software-Defined Radio System for Intravehicular Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Xiangming Kong

    2010-01-01

    An intra-vehicular wireless sensor network is designed and implemented on a software-defined radio system. An IUWB signal is chosen to carry the data packets. The MAC layer of the system follows the specification of the IEEE 802.15.4 standard. The transceiver design, especially the receiver design, is detailed in the paper. The system design is validated through a lab test setup.

  13. Software-Defined Networking Using OpenFlow: Protocols, Applications and Architectural Design Choices

    Directory of Open Access Journals (Sweden)

    Wolfgang Braun

    2014-05-01

    We explain the notion of software-defined networking (SDN), whose southbound interface may be implemented by the OpenFlow protocol. We describe the operation of OpenFlow and summarize the features of specification versions 1.0–1.4. We give an overview of existing SDN-based applications grouped by topic areas. Finally, we point out architectural design choices for SDN using OpenFlow and discuss their performance implications.

  14. Extension of the IsaViz software for the representation of metabolic and regulatory networks

    OpenAIRE

    Diogo Fernando Veiga; Pedro de Stege Cecconello; José Eduardo De Lucca; Luismar Marques Porto

    2005-01-01

    In this work we developed an extension of the IsaViz software, an RDF (Resource Description Framework) authoring tool, designed to be a graphical environment for building models of metabolic and regulatory networks. This environment, called Metabolic IsaViz, was linked to a genomic library of types and was modeled on the basis of ontologies. Biochemical pathways included data at the sequence level (e.g., the amino acid sequence of enzymes), besides kinetic and thermodynamic parameters for the reactions. M...

  15. An Approach to Data Analysis in 5G Networks

    Directory of Open Access Journals (Sweden)

    Lorena Isabel Barona López

    2017-02-01

    5G networks are expected to provide significant advances in network management compared to traditional mobile infrastructures by leveraging intelligence capabilities such as data analysis, prediction, pattern recognition and artificial intelligence. The key idea behind these actions is to facilitate the decision-making process in order to solve or mitigate common network problems in a dynamic and proactive way. In this context, this paper presents the design of the Self-Organized Network Management in Virtualized and Software Defined Networks (SELFNET) Analyzer Module, whose main objective is to identify suspicious or unexpected situations based on metrics provided by different network components and sensors. The SELFNET Analyzer Module provides a modular architecture driven by use cases where analytic functions can be easily extended. This paper also proposes the data specification to define the data inputs to be taken into account in the diagnosis process. This data specification has been implemented with different use cases within the SELFNET Project, proving its effectiveness.

  16. New Algorithm and Software (BNOmics) for Inferring and Visualizing Bayesian Networks from Heterogeneous Big Biological and Genetic Data.

    Science.gov (United States)

    Gogoshin, Grigoriy; Boerwinkle, Eric; Rodin, Andrei S

    2017-04-01

    Bayesian network (BN) reconstruction is a prototypical systems biology data analysis approach that has been successfully used to reverse engineer and model networks reflecting different layers of biological organization (ranging from genetic to epigenetic to cellular pathway to metabolomic). It is especially relevant in the context of modern (ongoing and prospective) studies that generate heterogeneous high-throughput omics datasets. However, there are both theoretical and practical obstacles to the seamless application of BN modeling to such big data, including computational inefficiency of optimal BN structure search algorithms, ambiguity in data discretization, mixing data types, imputation and validation, and, in general, limited scalability in both reconstruction and visualization of BNs. To overcome these and other obstacles, we present BNOmics, an improved algorithm and software toolkit for inferring and analyzing BNs from omics datasets. BNOmics aims at comprehensive systems biology-type data exploration, including both generating new biological hypotheses and testing and validating existing ones. Novel aspects of the algorithm center around increasing scalability and applicability to varying data types (with different explicit and implicit distributional assumptions) within the same analysis framework. An output and visualization interface to widely available graph-rendering software is also included. Three diverse applications are detailed. BNOmics was originally developed in the context of genetic epidemiology data and is being continuously optimized to keep pace with the ever-increasing inflow of available large-scale omics datasets. As such, the software's scalability and usability on less-than-exotic computer hardware are a priority, as well as the applicability of the algorithm and software to heterogeneous datasets containing many data types: single-nucleotide polymorphisms and other genetic/epigenetic/transcriptome variables, metabolite

  17. Analysis and monitoring design for networks

    Energy Technology Data Exchange (ETDEWEB)

    Fedorov, V.; Flanagan, D.; Rowan, T.; Batsell, S.

    1998-06-01

    The idea of applying experimental design methodologies to develop monitoring systems for computer networks is relatively novel, even though it has been applied in other areas such as meteorology, seismology, and transportation. One objective of a monitoring system should always be to collect as little data as necessary to be able to monitor specific parameters of the system with respect to assigned targets and objectives. This implies purposeful monitoring, where each piece of data has a reason to be collected and stored for future use. When a computer network system as large and complex as the Internet is the monitoring subject, providing an optimal and parsimonious observing system becomes even more important. Many data collection decisions must be made by the developers of a monitoring system. These decisions include but are not limited to the following: (1) The type of data collection hardware and software instruments to be used; (2) How to minimize interruption of regular network activities during data collection; (3) Quantification of the objectives and the formulation of optimality criteria; (4) The placement of data collection hardware and software devices; (5) The amount of data to be collected in a given time period, how large a subset of the available data to collect during the period, the length of the period, and the frequency of data collection; (6) The determination of the data to be collected (for instance, selection of response and explanatory variables); (7) Which data will be retained and how long (i.e., data storage and retention issues); and (8) The cost analysis of experiments. Mathematical statistics, and, in particular, optimal experimental design methods, may be used to address the majority of problems generated by 3--7. In this study, the authors focus their efforts on topics 3--5.

  18. Transmission analysis in WDM networks

    DEFF Research Database (Denmark)

    Rasmussen, Christian Jørgen

    1999-01-01

    This thesis describes the development of a computer-based simulator for transmission analysis in optical wavelength division multiplexing networks. A great part of the work concerns fundamental optical network simulator issues. Among these issues are identification of the versatility and user-friendliness demands which such a simulator must meet, development of the "spectral window representation" for representation of the optical signals, and finding an effective way of handling the optical signals in the computer memory. One more important issue is the rules for determining the order in which the different component models are invoked during the simulation of a system. A simple set of rules which makes it possible to simulate any network architecture is laid down. The modelling of the nonlinear fibre and the optical receiver is also treated. The work on the fibre concerns the numerical solution

  19. Advance reservation access control using software-defined networking and tokens

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Joaquin; Jung, Eun-Sung; Kettimuthu, Rajkumar; Rao, Nageswara S. V.; Foster, Ian T.; Clark, Russ; Owen, Henry

    2018-02-01

    Advance reservation systems allow users to reserve dedicated bandwidth connection resources from advanced high-speed networks. A common use case for such systems is data transfers in distributed science environments in which a user wants exclusive access to the reservation. However, current advance network reservation methods cannot ensure exclusive access of a network reservation to the specific flow for which the user made the reservation. We present here a novel network architecture that addresses this limitation and ensures that a reservation is used only by the intended flow. We achieve this by leveraging software-defined networking (SDN) and token-based authorization. We use SDN to orchestrate and automate the reservation of networking resources, end-to-end and across multiple administrative domains, and tokens to create a strong binding between the user or application that requested the reservation and the flows provisioned by SDN. We conducted experiments on the ESNet 100G SDN testbed, and demonstrated that our system effectively protects authorized flows from competing traffic in the network. (C) 2017 Elsevier B.V. All rights reserved.

  20. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of software developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 markers' positions) were manually tracked to determine the markers' center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software package (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor when the distance between the calculated marker's coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the automatic tracking software presented here can be used as a valid and useful tool for underwater motion analysis.
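
    The Kanade-Lucas-Tomasi tracking on which DVP is reported to be based can be prototyped with OpenCV; the sketch below is a generic illustration with an assumed input file name, not the DVP or COM software, and it simply detects corner-like features in the first frame and follows them through the video with pyramidal Lucas-Kanade optical flow.

        import cv2

        cap = cv2.VideoCapture("underwater_exercise.avi")       # assumed input file
        ok, frame = cap.read()
        prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # Detect trackable features; high-contrast markers show up as strong corners.
        points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50,
                                         qualityLevel=0.01, minDistance=10)
        trajectories = [points.reshape(-1, 2)]

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            # Pyramidal Kanade-Lucas-Tomasi step: propagate each point to the new frame.
            points, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None,
                                                           winSize=(21, 21), maxLevel=3)
            trajectories.append(points.reshape(-1, 2))
            prev_gray = gray

        cap.release()
        print("frames tracked:", len(trajectories))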

  1. ForceAtlas2, a continuous graph layout algorithm for handy network visualization designed for the Gephi software.

    Science.gov (United States)

    Jacomy, Mathieu; Venturini, Tommaso; Heymann, Sebastien; Bastian, Mathieu

    2014-01-01

    Gephi is a network visualization software used in various disciplines (social network analysis, biology, genomics...). One of its key features is the ability to display the spatialization process, aiming at transforming the network into a map, and ForceAtlas2 is its default layout algorithm. The latter is developed by the Gephi team as an all-around solution to Gephi users' typical networks (scale-free, 10 to 10,000 nodes). We present here for the first time its functioning and settings. ForceAtlas2 is a force-directed layout close to other algorithms used for network spatialization. We do not claim a theoretical advance but an attempt to integrate different techniques such as the Barnes Hut simulation, degree-dependent repulsive force, and local and global adaptive temperatures. It is designed for the Gephi user experience (it is a continuous algorithm), and we explain which constraints it implies. The algorithm benefits from much feedback and is developed in order to provide many possibilities through its settings. We lay out its complete functioning for the users who need a precise understanding of its behaviour, from the formulas to graphic illustration of the result. We propose a benchmark for our compromise between performance and quality. We also explain why we integrated its various features and discuss our design choices.
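
    A bare-bones spatialization step with the degree-dependent repulsion described above can be written in a few lines of numpy; this is a simplified sketch for intuition only, without the Barnes-Hut approximation or the adaptive temperatures of the actual ForceAtlas2 implementation.

        import numpy as np

        def force_layout_step(pos, edges, degree, step=0.01, k_repulse=1.0):
            """One spatialization step: linear attraction on edges, degree-scaled repulsion."""
            n = pos.shape[0]
            disp = np.zeros_like(pos)
            # Repulsion ~ (deg_i + 1)(deg_j + 1) / distance, pushing all node pairs apart.
            for i in range(n):
                delta = pos[i] - pos                      # vectors from every node towards i
                dist = np.linalg.norm(delta, axis=1) + 1e-9
                coeff = k_repulse * (degree[i] + 1) * (degree + 1) / dist ** 2
                coeff[i] = 0.0                            # no self-repulsion
                disp[i] += (delta * coeff[:, None]).sum(axis=0)
            # Attraction proportional to distance, pulling connected nodes together.
            for u, v in edges:
                delta = pos[u] - pos[v]
                disp[u] -= delta
                disp[v] += delta
            return pos + step * disp

        rng = np.random.default_rng(0)
        edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
        degree = np.array([2.0, 2.0, 3.0, 1.0])
        pos = rng.normal(size=(4, 2))
        for _ in range(200):
            pos = force_layout_step(pos, edges, degree)
        print(pos)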

  2. Calibration Analysis Software for the ATLAS Pixel Detector

    CERN Document Server

    Stramaglia, Maria Elena; The ATLAS collaboration

    2015-01-01

    The calibration of the Pixel detector fulfills two main purposes: to tune front-end registers for establishing the best operational settings and to measure the tuning performance through a subset of scans. An analysis framework has been set up in order to take actions on the detector given the outcome of a calibration scan (e.g. to create a mask for disabling noisy pixels). The software framework to control all aspects of the Pixel detector scans and analyses is called Calibration Console. The introduction of a new layer, equipped with new Front End-I4 chips, required an update of the Console architecture. It now handles scans and scan analyses applied together to chips with different characteristics. An overview of the newly developed Calibration Analysis Software will be presented, together with some preliminary results.

  3. Spectrum analysis on quality requirements consideration in software design documents.

    Science.gov (United States)

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the considerations of quality requirements in a requirements document are adapted to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.

  4. Stromatoporoid biometrics using image analysis software: A first order approach

    Science.gov (United States)

    Wolniewicz, Pawel

    2010-04-01

    Strommetric is a new image analysis computer program that performs morphometric measurements of stromatoporoid sponges. The program measures 15 features of skeletal elements (pillars and laminae) visible in both longitudinal and transverse thin sections. The software is implemented in C++, using the Open Computer Vision (OpenCV) library. The image analysis system distinguishes skeletal elements from sparry calcite using Otsu's method for image thresholding. More than 150 photos of thin sections were used as a test set, from which 36,159 measurements were obtained. The software provided about one hundred times more data than the methods applied until now. The data obtained are reproducible, even if the work is repeated by different workers. Thus the method makes the biometric studies of stromatoporoids objective.
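
    Otsu's thresholding, which Strommetric uses to separate skeletal elements from sparry calcite, is easy to reproduce with numpy; the sketch below is an illustration of the method itself, not of the Strommetric or OpenCV code, and it picks the gray level that maximizes the between-class variance of a synthetic two-phase image.

        import numpy as np

        def otsu_threshold(gray):
            """Return the 0-255 threshold maximizing between-class variance (Otsu's method)."""
            hist = np.bincount(gray.ravel(), minlength=256).astype(float)
            prob = hist / hist.sum()
            best_t, best_var = 0, -1.0
            for t in range(1, 256):
                w0, w1 = prob[:t].sum(), prob[t:].sum()
                if w0 == 0.0 or w1 == 0.0:
                    continue
                mu0 = (np.arange(t) * prob[:t]).sum() / w0
                mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
                between_var = w0 * w1 * (mu0 - mu1) ** 2
                if between_var > best_var:
                    best_t, best_var = t, between_var
            return best_t

        # Synthetic two-phase image: a brighter square region embedded in a darker matrix.
        rng = np.random.default_rng(1)
        img = np.clip(rng.normal(60, 10, (128, 128)), 0, 255).astype(np.uint8)
        img[32:96, 32:96] = np.clip(rng.normal(180, 10, (64, 64)), 0, 255).astype(np.uint8)
        t = otsu_threshold(img)
        print("Otsu threshold:", t, "bright-phase fraction:", float((img > t).mean()))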

  5. Using Statistical Analysis Software to Advance Nitro Plasticizer Wettability

    Energy Technology Data Exchange (ETDEWEB)

    Shear, Trevor Allan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-29

    Statistical analysis in science is an extremely powerful tool that is often underutilized. Additionally, it is frequently the case that data is misinterpreted or not used to its fullest extent. Utilizing the advanced software JMP®, many aspects of experimental design and data analysis can be evaluated and improved. This overview will detail the features of JMP® and how they were used to advance a project, resulting in time and cost savings, as well as the collection of scientifically sound data. The project analyzed in this report addresses the inability of a nitro plasticizer to coat a gold coated quartz crystal sensor used in a quartz crystal microbalance. Through the use of the JMP® software, the wettability of the nitro plasticizer was increased by over 200% using an atmospheric plasma pen, ensuring good sample preparation and reliable results.

  6. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    Science.gov (United States)

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pairs long) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software packages (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capability of assigning quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulties of the task arise from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software assigns higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic software tool to this "genome length bias." Therefore, we have made a simple benchmark for the evaluation of taxon counting in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads with an average length of 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software tool fails on that simple task, it will surely fail on most real metagenomes. We applied the three software tools to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We have found that AMPHORA2/AmphoraNet gave the most accurate results and the other two software were under
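
    The simple benchmark described here, equal copy numbers of a few genomes fragmented into reads of about 150 bp, is easy to regenerate; the sketch below is a hypothetical reconstruction with made-up genome lengths, not the authors' actual benchmark data.

        import random

        def random_genome(length, seed):
            rng = random.Random(seed)
            return "".join(rng.choice("ACGT") for _ in range(length))

        def make_benchmark(genomes, copies=10, read_len=150, seed=42):
            """Fragment equal copy numbers of each genome into shuffled short reads."""
            rng = random.Random(seed)
            reads = []
            for name, seq in genomes.items():
                for _ in range(copies):
                    pos = 0
                    while pos < len(seq):
                        step = max(50, int(rng.gauss(read_len, 20)))   # ~150 bp on average
                        reads.append((name, seq[pos:pos + step]))
                        pos += step
            rng.shuffle(reads)
            return reads

        # Toy "genomes" of different lengths stand in for the three bacterial genomes.
        genomes = {
            "short_genome": random_genome(2000, seed=1),
            "medium_genome": random_genome(5000, seed=2),
            "long_genome": random_genome(10000, seed=3),
        }
        reads = make_benchmark(genomes)
        # Longer genomes yield proportionally more reads -- the "genome length bias"
        # that an ideal taxon-counting tool must correct for.
        for name in genomes:
            print(name, sum(1 for src, _ in reads if src == name))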

  7. Nonlinear data reconciliation in material flow analysis with software STAN

    Directory of Open Access Journals (Sweden)

    Oliver Cencic

    2016-11-01

    STAN is freely available software that supports Material/Substance Flow Analysis (MFA/SFA) under the consideration of data uncertainties. It is capable of performing nonlinear data reconciliation based on the conventional weighted least-squares minimization approach, and error propagation. This paper summarizes the mathematical foundation of the calculation algorithm implemented in STAN and demonstrates its use on a hypothetical example from MFA.
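
    For the linear case, the weighted least-squares reconciliation that STAN performs has a closed-form solution; the numpy sketch below reconciles three measured flows around a single mixing node so that they satisfy the mass balance exactly (a toy linear example for illustration, whereas STAN also handles nonlinear constraints and full error propagation).

        import numpy as np

        # Measured flows [in1, in2, out] of a mixing node, with standard uncertainties.
        y = np.array([10.2, 5.1, 14.4])
        sigma = np.array([0.3, 0.2, 0.5])
        Q = np.diag(sigma ** 2)

        # Linear mass balance A x = 0: in1 + in2 - out = 0 (violated by the raw data).
        A = np.array([[1.0, 1.0, -1.0]])

        # Closed-form weighted least-squares reconciliation (Lagrange multipliers):
        # x_hat = y - Q A^T (A Q A^T)^(-1) A y
        correction = Q @ A.T @ np.linalg.solve(A @ Q @ A.T, A @ y)
        x_hat = y - correction

        print("raw imbalance:       ", (A @ y).item())                  # 0.9
        print("reconciled flows:    ", x_hat)
        print("reconciled imbalance:", round((A @ x_hat).item(), 12))   # ~0.0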

  8. PLAGIARISM DETECTION PROBLEMS AND ANALYSIS SOFTWARE TOOLS FOR ITS SOLVE

    Directory of Open Access Journals (Sweden)

    V. I. Shynkarenko

    2017-02-01

    Purpose. This study aims to: (1) define plagiarism in texts written in formal and natural languages and build a taxonomy of plagiarism; (2) identify the major problems of plagiarism detection when automated tools are used to solve them; and (3) analyze and systematize the information obtained during the review, testing and analysis of existing detection systems. Methodology. To identify the requirements for plagiarism detection software, methods of analyzing normative documentation (the legislative base) and competing tools are applied. The requirements are checked through testing and review of the graphical user interfaces. Findings. The paper considers the concept of plagiarism, its proliferation and its classification. Existing systems for identifying plagiarism, both desktop applications and online resources, are reviewed; their functional characteristics, input and output data formats and constraints, customization features and access options are highlighted, and detailed system requirements are derived. Originality. The schemes proposed by the authors complement the existing hierarchical taxonomy of plagiarism. Existing systems are analyzed in terms of functionality and the possibilities for handling large amounts of data. Practical value. The practical significance is determined by the breadth of the plagiarism problem in various fields. Ukraine is developing its legal framework for the fight against plagiarism, which requires active development, improvement and delivery of relevant software. This work contributes to the solution of these problems. Reviewing existing anti-plagiarism programs, studying research experience in the field and updating the concept of plagiarism make it possible to articulate the functional and performance requirements and the input and output of the software to be developed more fully, as well as to identify the features of such software. The article focuses on the features of solving the

  9. Unified Multi-Layer among Software Defined Multi-Domain Optical Networks (Invited

    Directory of Open Access Journals (Sweden)

    Hui Yang

    2015-06-01

    Software-defined networking (SDN), enabled by the OpenFlow protocol, has gained popularity because it makes the network programmable and able to accommodate both fixed and flexible bandwidth services. In this paper, we present a unified multi-layer (UML) architecture with multiple controllers and a dynamic orchestra plane (DOP) for software defined multi-domain optical networks. The proposed architecture can shield the differences among various optical devices from multiple vendors and the details of connecting heterogeneous networks. Cross-domain services with on-demand bandwidth can be deployed via unified interfaces provided by the dynamic orchestra plane. Additionally, a globalization strategy and a practical capture of signal processing are presented based on the architecture. The overall feasibility and efficiency of the proposed architecture are experimentally verified on the control plane of our OpenFlow-based testbed. The performance of the globalization strategy under a heavy traffic load scenario is also quantitatively evaluated based on the UML architecture and compared with other strategies in terms of blocking probability, average hops, and average resource consumption.

  10. PAX: A mixed hardware/software simulation platform for spiking neural networks.

    Science.gov (United States)

    Renaud, S; Tomas, J; Lewis, N; Bornat, Y; Daouzli, A; Rudolph, M; Destexhe, A; Saïghi, S

    2010-09-01

    Many hardware-based solutions now exist for the simulation of bio-like neural networks. Less conventional than software-based systems, these types of simulators generally combine digital and analog forms of computation. In this paper we present a mixed hardware-software platform, specifically designed for the simulation of spiking neural networks, using conductance-based models of neurons and synaptic connections with dynamic adaptation rules (Spike-Timing-Dependent Plasticity). The neurons and networks are configurable, and are computed in 'biological real time', by which we mean that the difference between simulated time and simulation time is guaranteed to be lower than 50 μs. After presenting the issues and context involved in the design and use of hardware-based spiking neural networks, we describe the analog neuromimetic integrated circuits which form the core of the platform. We then explain the organization and computation principles of the modules within the platform, and present experimental results which validate the system. Designed as a tool for computational neuroscience, the platform is exploited in collaborative research projects together with neurobiology and computer science partners. Copyright 2010 Elsevier Ltd. All rights reserved.

  11. SIMA: Python software for analysis of dynamic fluorescence imaging data

    Directory of Open Access Journals (Sweden)

    Patrick Kaifosh

    2014-09-01

    Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.

  12. SIMA: Python software for analysis of dynamic fluorescence imaging data

    Science.gov (United States)

    Kaifosh, Patrick; Zaremba, Jeffrey D.; Danielson, Nathan B.; Losonczy, Attila

    2014-01-01

    Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/. PMID:25295002

  13. SIMA: Python software for analysis of dynamic fluorescence imaging data.

    Science.gov (United States)

    Kaifosh, Patrick; Zaremba, Jeffrey D; Danielson, Nathan B; Losonczy, Attila

    2014-01-01

    Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.

  14. Free software, business capital, and institutional change: a veblenian analysis of the software industry

    OpenAIRE

    Koloğlugil, Serhat

    2012-01-01

    Free software, unlike proprietary software under exclusive copyright control, exemplifies a form of productive and innovative activity that is based upon mutual sharing of technological knowledge. Free software engineers, who get connected through various software-development projects, voluntarily contribute their time and skills to produce computer programs which, they insist, should be free for anyone to use, modify, and distribute. This paper argues that Thorstein Veblen's socio-economic t...

  15. Spectral Analysis of Rich Network Topology in Social Networks

    Science.gov (United States)

    Wu, Leting

    2013-01-01

    Social networks have received much attention these days. Researchers have developed different methods to study the structure and characteristics of the network topology. Our focus is on spectral analysis of the adjacency matrix of the underlying network. Recent work showed good properties in the adjacency spectral space but there are few…
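
    As a minimal illustration of what spectral analysis of the adjacency matrix means in practice (a generic example, not the method developed in the dissertation), the snippet below builds a small undirected network, computes the eigendecomposition of its adjacency matrix, and projects the nodes into the space spanned by the leading eigenvectors.

        import numpy as np

        # Adjacency matrix of a small undirected graph with two loosely connected groups.
        A = np.array([
            [0, 1, 1, 0, 0, 0],
            [1, 0, 1, 0, 0, 0],
            [1, 1, 0, 1, 0, 0],
            [0, 0, 1, 0, 1, 1],
            [0, 0, 0, 1, 0, 1],
            [0, 0, 0, 1, 1, 0],
        ], dtype=float)

        # eigh returns eigenvalues in ascending order for a symmetric matrix.
        eigenvalues, eigenvectors = np.linalg.eigh(A)
        leading = eigenvectors[:, -2:]       # node coordinates in the adjacency spectral space

        print("adjacency spectrum:", np.round(eigenvalues, 3))
        for node, coords in enumerate(leading):
            print(f"node {node}: {np.round(coords, 3)}")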

  16. Freud: a software suite for high-throughput simulation analysis

    Science.gov (United States)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.

  17. EDENetworks: a user-friendly software to build and analyse networks in biogeography, ecology and population genetics.

    Science.gov (United States)

    Kivelä, Mikko; Arnaud-Haond, Sophie; Saramäki, Jari

    2015-01-01

    The recent application of graph-based network theory analysis to biogeography, community ecology and population genetics has created a need for user-friendly software, which would allow a wider accessibility to and adaptation of these methods. EDENetworks aims to fill this void by providing an easy-to-use interface for the whole analysis pipeline of ecological and evolutionary networks starting from matrices of species distributions, genotypes, bacterial OTUs or populations characterized genetically. The user can choose between several different ecological distance metrics, such as Bray-Curtis or Sorensen distance, or population genetic metrics such as FST or Goldstein distances, to turn the raw data into a distance/dissimilarity matrix. This matrix is then transformed into a network by manual or automatic thresholding based on percolation theory or by building the minimum spanning tree. The networks can be visualized along with auxiliary data and analysed with various metrics such as degree, clustering coefficient, assortativity and betweenness centrality. The statistical significance of the results can be estimated either by resampling the original biological data or by null models based on permutations of the data. © 2014 John Wiley & Sons Ltd.
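
    The thresholding-plus-metrics pipeline described above can be approximated with networkx on any dissimilarity matrix; the sketch below is a generic stand-in rather than EDENetworks itself, keeping edges whose dissimilarity falls below a manual threshold, extracting the minimum spanning tree, and reporting degree, clustering and betweenness.

        import numpy as np
        import networkx as nx

        rng = np.random.default_rng(3)
        n = 8
        D = rng.uniform(0.1, 1.0, (n, n))
        D = (D + D.T) / 2.0                       # symmetric dissimilarity matrix
        np.fill_diagonal(D, 0.0)

        # Manual thresholding: keep an edge only if the dissimilarity is small enough.
        threshold = 0.5
        G = nx.Graph()
        G.add_nodes_from(range(n))
        for i in range(n):
            for j in range(i + 1, n):
                if D[i, j] <= threshold:
                    G.add_edge(i, j, weight=D[i, j])

        # Minimum spanning tree of the fully connected dissimilarity-weighted graph.
        mst = nx.minimum_spanning_tree(nx.from_numpy_array(D), weight="weight")

        print("edges kept by threshold:", G.number_of_edges())
        print("degree:", dict(G.degree()))
        print("clustering:", nx.clustering(G))
        print("betweenness:", nx.betweenness_centrality(G))
        print("MST edges:", sorted(mst.edges()))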

  18. Assessment of Software Modeling Techniques for Wireless Sensor Networks: A Survey

    Directory of Open Access Journals (Sweden)

    John Khalil Jacoub

    2012-03-01

    Wireless Sensor Networks (WSNs) monitor environmental phenomena and in some cases react in response to the observed phenomena. The distributed nature of WSNs and the interaction between software and hardware components make it difficult to correctly design and develop WSN systems. One solution to the WSN design challenges is system modeling. In this paper we present a survey of 9 WSN modeling techniques and show how each technique models different parts of the system, such as sensor behavior, sensor data and hardware. Furthermore, we consider how each modeling technique represents the network behavior and network topology. We also consider the available supporting tools for each of the modeling techniques. Based on the survey, we classify the modeling techniques and derive examples of the surveyed modeling techniques by using the SensIV system.

  19. Energy-Aware Routing in Multiple Domains Software-Defined Networks

    Directory of Open Access Journals (Sweden)

    Adriana FERNÁNDEZ-FERNÁNDEZ

    2016-12-01

    The growing energy consumption of communication networks has attracted the attention of networking researchers in the last decade. In this context, the new architecture of Software-Defined Networks (SDN) allows flexible programmability, suitable for the power-consumption optimization problem. In this paper we address the issue of designing a novel distributed routing algorithm that optimizes the power consumption in large-scale SDNs with multiple domains. The solution proposed, called DEAR (Distributed Energy-Aware Routing), tackles the problem of minimizing the number of links that can be used to satisfy a given data traffic demand under performance constraints such as control traffic delay and link utilization. To this end, we present a complete formulation of the optimization problem that considers routing requirements for control and data plane communications. Simulation results confirm that the proposed solution enables the achievement of significant energy savings.

  20. Robust and Agile System against Fault and Anomaly Traffic in Software Defined Networks

    Directory of Open Access Journals (Sweden)

    Mihui Kim

    2017-03-01

    The main advantage of software defined networking (SDN) is that it allows intelligent control and management of networking through programmability in real time. It enables efficient utilization of network resources through traffic engineering, and offers potential attack defense methods when abnormalities arise. However, previous studies have only identified individual solutions for respective problems, instead of finding a more global solution in real time that is capable of addressing multiple situations in network status. To cover diverse network conditions, this paper presents a comprehensive reactive system for simultaneously monitoring failures, anomalies, and attacks for high availability and reliability. We design three main modules in the SDN controller for a robust and agile defense (RAD) system against network anomalies: a traffic analyzer, a traffic engineer, and a rule manager. RAD provides reactive flow rule generation to control traffic while detecting network failures, anomalies, high traffic volume (elephant flows), and attacks. The traffic analyzer identifies elephant flows, traffic anomalies, and attacks based on attack signatures and network monitoring. The traffic engineer module measures network utilization and delay in order to determine the best path for multi-dimensional routing and load balancing under any circumstances. Finally, the rule manager generates and installs a flow rule for the selected best path to control traffic. We implement the proposed RAD system based on Floodlight, an open source project for the SDN controller. We evaluate our system using simulation with and without the aforementioned RAD modules. Experimental results show that our approach is both practical and feasible, and can successfully augment an existing SDN controller in terms of agility, robustness, and efficiency, even in the face of link failures, attacks, and elephant flows.

  1. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated to the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
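
    Information gain, one of the supervised criteria listed above, is straightforward to compute by hand; the snippet below is a generic illustration rather than IMMAN code, ranking two discretized descriptors against a class label by the reduction in Shannon entropy.

        import numpy as np

        def entropy(labels):
            """Shannon entropy (in bits) of a discrete label array."""
            _, counts = np.unique(labels, return_counts=True)
            p = counts / counts.sum()
            return float(-(p * np.log2(p)).sum())

        def information_gain(feature, labels):
            """H(labels) minus the conditional entropy H(labels | discretized feature)."""
            gain = entropy(labels)
            for value in np.unique(feature):
                mask = feature == value
                gain -= mask.mean() * entropy(labels[mask])
            return gain

        # Toy dataset: two discretized molecular descriptors and a binary activity class.
        y = np.array([1, 1, 1, 0, 0, 0, 1, 0])
        descriptor_a = np.array([0, 0, 1, 1, 2, 2, 0, 2])   # tracks the class fairly well
        descriptor_b = np.array([0, 1, 0, 1, 0, 1, 0, 1])   # nearly independent of the class

        ranking = sorted(
            {"descriptor_a": information_gain(descriptor_a, y),
             "descriptor_b": information_gain(descriptor_b, y)}.items(),
            key=lambda kv: kv[1], reverse=True)
        print(ranking)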

  2. Software defined multi-OLT passive optical network for flexible traffic allocation

    Science.gov (United States)

    Zhang, Shizong; Gu, Rentao; Ji, Yuefeng; Zhang, Jiawei; Li, Hui

    2016-10-01

    With the rapid growth of 4G mobile networks and vehicular network services, mobile terminal users have an increasing demand for data sharing among different radio remote units (RRUs) and roadside units (RSUs). Meanwhile, commercial video-streaming and video/voice conference applications delivered through peer-to-peer (P2P) technology keep stimulating a sharp increase in bandwidth demand from both business and residential subscribers. However, a significant issue is that, although wavelength division multiplexing (WDM) and orthogonal frequency division multiplexing (OFDM) technology have been proposed to fulfil the ever-increasing bandwidth demand in the access network, the bandwidth of optical fiber is not unlimited, owing to the restrictions of optical component properties and modulation/demodulation technology, and blindly increasing the number of wavelengths cannot meet the cost-sensitive character of the access network. In this paper, we propose a software defined multi-OLT PON architecture to support efficient scheduling of access network traffic. By introducing software defined networking technology and a wavelength selective switch into the TWDM PON system in the central office, multiple OLTs can be considered as a bandwidth resource pool and support flexible traffic allocation for optical network units (ONUs). Moreover, under the configuration of the control plane, ONUs have the capability of changing affiliation between different OLTs under different traffic situations, so that inter-OLT traffic can be localized and the data exchange pressure on the core network can be relieved. Because this architecture is designed to follow the TWDM PON specification as closely as possible, existing optical distribution network (ODN) investment can be preserved and conventional EPON/GPON equipment remains compatible with the proposed architecture. Furthermore, based on this architecture, we propose a dynamic wavelength scheduling algorithm, which can be deployed as an application on the control plane

  3. Development of an Adaptive Routing Mechanism in Software-Defined Networks

    Directory of Open Access Journals (Sweden)

    A. N. Noskov

    2015-01-01

    The purpose of this work is to develop a unified mechanism of adaptive routing of different kinds, based on the current quality-of-service requirements. Software configuration of networks is the technology of the future; the trend in communication systems constantly confirms this fact. However, the application of this technology in its current form is justified only in the large networks of technology giants and telecom operators. Today we have a large number of dynamic routing protocols for routing high-volume traffic in communication networks. Our task is to create a solution that can use the capabilities of each node to make a decision on the transmission of information by all possible means for each type of traffic. Achieving this goal is possible by solving the problem of developing generalized metrics that describe the links between devices in the network, and the problem of establishing a framework for an adaptive logical network topology (route management) to ensure the quality of the whole network and meet the current requirements on the quality of a particular type of service.

  4. Open-source hardware and software and web application for gamma dose rate network operation.

    Science.gov (United States)

    Luff, R; Zähringer, M; Harms, W; Bleher, M; Prommer, B; Stöhlker, U

    2014-08-01

    The German Federal Office for Radiation Protection operates a network of about 1800 gamma dose rate stations as part of the national emergency preparedness plan. Each of the six network centres is capable of operating the network alone. Most of the hardware and software used have been developed in-house under an open-source license. Short development cycles and close cooperation between developers and users ensure robustness, transparency and fast maintenance procedures, thus avoiding unnecessarily complex solutions. This also reduces the overall cost of network operation. An easy-to-expand web interface has been developed to make the complete system available to other interested network operators in order to increase cooperation between different countries. The interface is also regularly used for education during traineeships supported, e.g., by the International Atomic Energy Agency, to operate a local area dose rate monitoring test network. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Analysis of Semantic Networks using Complex Networks Concepts

    DEFF Research Database (Denmark)

    Ortiz-Arroyo, Daniel

    2013-01-01

    In this paper we perform a preliminary analysis of semantic networks to determine the most important terms that could be used to optimize a summarization task. In our experiments, we measure how the properties of a semantic network change when terms are removed from the network. Our preliminary results indicate that this approach provides good results on the semantic network analyzed in this paper.
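
    A minimal sketch of this style of experiment (a toy co-occurrence graph, not the paper's data): each term is scored by how much a global network property drops when the term is removed:

```python
import networkx as nx

def term_importance(g, prop=nx.global_efficiency):
    """Drop in a global network property caused by removing each term."""
    base = prop(g)
    scores = {}
    for term in list(g.nodes):
        h = g.copy()
        h.remove_node(term)
        scores[term] = base - prop(h)          # large drop => important term
    return scores

# Toy co-occurrence network of terms
g = nx.Graph([("network", "analysis"), ("network", "graph"),
              ("graph", "theory"), ("analysis", "summary"),
              ("summary", "text"), ("text", "network")])
for term, score in sorted(term_importance(g).items(), key=lambda kv: -kv[1]):
    print(f"{term:10s} {score:+.3f}")
```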

  6. Forecast and restoration of geomagnetic activity indices by using the software-computational neural network complex

    Science.gov (United States)

    Barkhatov, Nikolay; Revunov, Sergey

    2010-05-01

    It is known that the currently used indices of geomagnetic activity reflect, to some extent, the physical processes occurring in the interaction of the perturbed solar wind with Earth's magnetosphere. Therefore, they are connected to each other and to the parameters of near-Earth space. The establishment of such nonlinear connections is of interest. For such purposes, when the physical problem is complex or has many parameters, the technology of artificial neural networks is applied. This approach is used to develop an automated method for the forecast and restoration of geomagnetic activity indices, implemented as a software-computational neural network complex. Each neural network experiment carried out with this complex aims to find a specific nonlinear relation between the analyzed indices and parameters. At the core of the program is a scheme combining artificial neural networks (ANN) of different types: a back-propagation Elman network, a feed-forward network, a fuzzy logic network and a Kohonen classification layer. The settings available in the main window of the application allow the user to change the number of hidden layers, the number of neurons per layer, the input and target data, and the number of training cycles. The training process and its quality are shown as a dynamic plot of the training error. The result of training is a plot comparing the network response with the test sequence. The last trained neural network, with its established nonlinear connection, can be run again in repeated numerical experiments. In this case no additional training is executed; the input parameters are passed through the previously trained network as a filter, and the outputs are compared with the test event. For setting up a large number of different experiments, the ability to run the program in a "batch" mode is provided. For this purpose the user

  7. Tracing Eurosceptic Party Networks via Hyperlink Network Analysis and Failing: Can Web Crawlers Keep up with Web Design?

    DEFF Research Database (Denmark)

    Bossetta, Michael; Dutceac Segesten, Anamaria

    This #FAIL! paper is the result of our experience with the hyperlink analysis software ‘Issuecrawler’ (www.govcom.org) whilst writing the paper “The Europeanization of Eurosceptics? A Hyperlink Network Analysis of the Sweden Democrats” for the 2015 European Consortium for Political Research (ECPR...... programming languages? Or, is its inability to read JavaScript (a limitation shared by all current web crawling software) a reason to fundamentally question its utility as a digital method?...

  8. COalitions in COOperation Networks (COCOON): Social Network Analysis and Game Theory to Enhance Cooperation Networks

    NARCIS (Netherlands)

    Sie, Rory

    2012-01-01

    Sie, R. L. L. (2012). COalitions in COOperation Networks (COCOON): Social Network Analysis and Game Theory to Enhance Cooperation Networks (Unpublished doctoral dissertation). September, 28, 2012, Open Universiteit in the Netherlands (CELSTEC), Heerlen, The Netherlands.

  9. Analysis of signal acquisition in GPS receiver software

    Directory of Open Access Journals (Sweden)

    Vlada S. Sokolović

    2011-01-01

    Full Text Available This paper presents a critical analysis of the signal processing flow carried out in GPS receiver software, which served as a basis for a critical comparison of different signal processing architectures within the GPS receiver. Increased flexibility and a reduction of the commercial costs of GPS devices, including mobile devices, can be achieved by using software defined radio (SDR) technology. The SDR application can be realized when certain hardware components in a GPS receiver are replaced. Signal processing in the SDR is implemented using a programmable DSP (Digital Signal Processing) or FPGA (Field Programmable Gate Array) circuit, which allows a simple change of the digital signal processing algorithms and a simple change of the receiver parameters. The starting point of the research is the signal generated on the satellite, whose structure is shown in the paper. Based on the GPS signal structure, a receiver is realized with the task of extracting the appropriate signal from the spectrum and detecting it. Based on the collected navigation data, the receiver calculates the position of the end user. The signal coming from the satellite may be at the carrier frequencies L1 and L2. Since the Standard Positioning Service (SPS) is used for civilian purposes, all the tests shown in this work were performed on the L1 signal. The signal arriving at the receiver is generated using spread spectrum technology and lies below the noise level. Such signals often interfere with signals from the environment, which makes it difficult for the receiver to perform proper detection and signal processing. Therefore, signal processing technology is continually being improved, aiming at more accurate and faster signal processing. All tests were carried out on a signal acquired from the satellite using the SE4110 input circuit used for filtering, amplification and signal selection. The samples of the received signal were forwarded to a computer for data post-processing, i.e.
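
    As a rough illustration of the acquisition stage discussed above, the following minimal numpy sketch (with a random toy spreading code instead of a real C/A sequence, and not taken from the receiver software analyzed in the paper) performs parallel code-phase search by FFT-based circular correlation for each trial Doppler frequency:

```python
import numpy as np

def acquire(samples, code, fs, doppler_bins):
    """Return (best_doppler_Hz, best_code_phase_samples, peak_magnitude)."""
    n = len(code)
    t = np.arange(n) / fs
    code_fft = np.conj(np.fft.fft(code))
    best = (None, None, 0.0)
    for fd in doppler_bins:
        baseband = samples[:n] * np.exp(-2j * np.pi * fd * t)   # carrier wipe-off
        corr = np.abs(np.fft.ifft(np.fft.fft(baseband) * code_fft))
        k = int(np.argmax(corr))
        if corr[k] > best[2]:
            best = (fd, k, float(corr[k]))
    return best

# Toy example: a random +/-1 sequence stands in for a real C/A code.
rng = np.random.default_rng(1)
fs, n = 1.023e6, 1023                        # one sample per chip for simplicity
code = rng.choice([-1.0, 1.0], size=n)
true_fd, true_phase = 2500.0, 300
t = np.arange(n) / fs
rx = np.roll(code, true_phase) * np.exp(2j * np.pi * true_fd * t)
rx += 0.5 * (rng.normal(size=n) + 1j * rng.normal(size=n))     # additive noise
print(acquire(rx, code, fs, np.arange(-5000.0, 5001.0, 500.0)))
```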

  10. JPL multipolarization workstation - Hardware, software and examples of data analysis

    Science.gov (United States)

    Burnette, Fred; Norikane, Lynne

    1987-01-01

    A low-cost stand-alone interactive image processing workstation has been developed for operations on multipolarization JPL aircraft SAR data, as well as data from future spaceborne imaging radars. A recently developed data compression technique is used to reduce the data volume to 10 Mbytes, for a typical data set, so that interactive analysis may be accomplished in a timely and efficient manner on a supermicrocomputer. In addition to presenting a hardware description of the work station, attention is given to the software that has been developed. Three illustrative examples of data analysis are presented.

  11. Integrating a flexible modeling framework (FMF) with the network security assessment instrument to reduce software security risk

    Science.gov (United States)

    Gilliam, D. P.; Powell, J. D.

    2002-01-01

    This paper presents a portion of an overall research project on the generation of the network security assessment instrument to aid developers in assessing and assuring the security of software in the development and maintenance lifecycles.

  12. OVERVIEW OF THE SAPHIRE PROBABILISTIC RISK ANALYSIS SOFTWARE

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L.; Wood, Ted; Knudsen, James; Ma, Zhegang

    2016-10-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. In this paper, we provide an overview of the current technical capabilities found in SAPHIRE Version 8, including the user interface and enhanced solving algorithms.

  13. Networks and network analysis for defence and security

    CERN Document Server

    Masys, Anthony J

    2014-01-01

    Networks and Network Analysis for Defence and Security discusses relevant theoretical frameworks and applications of network analysis in support of the defence and security domains. This book details real world applications of network analysis to support defence and security. Shocks to regional, national and global systems stemming from natural hazards, acts of armed violence, terrorism and serious and organized crime have significant defence and security implications. Today, nations face an uncertain and complex security landscape in which threats impact/target the physical, social, economic

  14. VESGEN 2D: Automated, User-Interactive Software for Vascular Quantification and Mapping of Angiogenic and Lymphangiogenic Trees and Networks

    Science.gov (United States)

    Vickerman, Mary B.; Keith, Patricia A.; McKay, Terri L.; Gedeon, Dan J.; Watanabe, Michiko; Montano, Monica; Karunamuni, Ganga; Kaiser, Peter K.; Sears, Jonathan E.; Ebrahem, Quteba; Ribita, Daniela; Hylton, Alan G.; Parsons-Wingerter, Patricia

    2010-01-01

    Quantification of microvascular remodeling as a meaningful discovery tool requires mapping and measurement of site-specific changes within vascular trees and networks. Vessel density and other critical vascular parameters are often modulated by molecular regulators as determined by local vascular architecture. For example, enlargement of vessel diameter by vascular endothelial growth factor (VEGF) is restricted to specific generations of vessel branching (Microvascular Research 72(3):91, 2006). The averaging of vessel diameter over many successively smaller generations is therefore not particularly useful. The newly automated, user-interactive software VESGEN (VESsel GENeration Analysis) quantifies major vessel parameters within two-dimensional (2D) vascular trees, networks, and tree-network composites. This report reviews application of VESGEN 2D to angiogenic and lymphangiogenic tissues that includes the human and murine retina, embryonic coronary vessels, and avian chorioallantoic membrane (CAM). Software output includes colorized image maps with quantification of local vessel diameter, fractal dimension, tortuosity and avascular spacing. The density of parameters such as vessel area, length, number and branch point are quantified according to site-specific generational branching within vascular trees. The sole user input requirement is a binary (black/white) vascular image. Future applications of VESGEN will include analysis of 3D vascular architecture and bioinformatic dimensions such as blood flow and receptor localization. Branching analysis by VESGEN has demonstrated that numerous regulators including VEGF165, basic fibroblast growth factor (bFGF), transforming growth factor β-1 (TGFβ-1), angiostatin and the clinical steroid triamcinolone acetonide induce ‘fingerprint’ or ‘signature’ changes in vascular patterning that provide unique readouts of dominant molecular signaling. PMID:19248164

  15. A Unified Robotic Software Architecture for Service Robotics and Networks of Smart Sensors

    Science.gov (United States)

    Westhoff, Daniel; Zhang, Jianwei

    This paper proposes a novel architecture for the programming of multi-modal service robots and networked sensors. The presented software framework eases the development of high-level applications for distributed systems. The software architecture is based upon the Roblet-Technology, which is an exceptionally powerful medium in robotics. The possibility to develop, compile and execute an application on one workstation and to distribute parts of the program based on the idea of mobile code is pointed out. Since the Roblet-Technology uses Java, development is independent of the operating system. The framework hides the network communication and therefore greatly improves the programming and testing of applications in service robotics. The concept is evaluated in the context of the service robot TASER of the TAMS Institute at the University of Hamburg. This robot consists of a mobile platform with two manipulators equipped with artificial hands. Several multimodal input and output devices for interaction round off the robot. Networked cameras in the working environment of TASER provide additional information to the robot. The integration of these smart sensors shows the extendability of the proposed concept to general distributed systems.

  16. Exploratory Social Network Analysis of Management Faculty students on social media Facebook

    OpenAIRE

    Loušová, Petra

    2013-01-01

    The aim of this bachelor thesis is to introduce modern approaches to social network analysis. The theoretical part characterizes various social media used both in the Czech Republic and worldwide. It also discusses the fundamental elements of social networking and the possibilities of analysing them using software. In the practical part, these basic approaches are applied to an exploratory analysis of the social network of students of the University of Economics, Faculty of Management in s...

  17. OpenFLUX: efficient modelling software for 13C-based metabolic flux analysis

    Directory of Open Access Journals (Sweden)

    Nielsen Lars K

    2009-05-01

    Full Text Available Abstract Background The quantitative analysis of metabolic fluxes, i.e., in vivo activities of intracellular enzymes and pathways, provides key information on biological systems in systems biology and metabolic engineering. It is based on a comprehensive approach combining (i) tracer cultivation on 13C substrates, (ii) 13C labelling analysis by mass spectrometry and (iii) mathematical modelling for experimental design, data processing, flux calculation and statistics. Whereas the cultivation and the analytical part is fairly advanced, a lack of appropriate modelling software solutions for all modelling aspects in flux studies is limiting the application of metabolic flux analysis. Results We have developed OpenFLUX as a user friendly, yet flexible software application for small and large scale 13C metabolic flux analysis. The application is based on the new Elementary Metabolite Unit (EMU) framework, significantly enhancing computation speed for flux calculation. From simple notation of metabolic reaction networks defined in a spreadsheet, the OpenFLUX parser automatically generates MATLAB-readable metabolite and isotopomer balances, thus strongly facilitating model creation. The model can be used to perform experimental design, parameter estimation and sensitivity analysis either using the built-in gradient-based search or Monte Carlo algorithms or in user-defined algorithms. Exemplified for a microbial flux study with 71 reactions, 8 free flux parameters and mass isotopomer distributions of 10 metabolites, OpenFLUX allowed the EMU-based model to be compiled automatically from an Excel file containing metabolic reactions and carbon transfer mechanisms, showing its user-friendliness. It reliably reproduced the published data, and optimum flux distributions for the network under study were found quickly. Conclusion We have developed a fast, accurate application to perform steady-state 13C metabolic flux analysis. OpenFLUX will strongly facilitate and

  18. Unraveling protein networks with power graph analysis.

    Science.gov (United States)

    Royer, Loïc; Reimann, Matthias; Andreopoulos, Bill; Schroeder, Michael

    2008-07-11

    Networks play a crucial role in computational biology, yet their analysis and representation is still an open problem. Power Graph Analysis is a lossless transformation of biological networks into a compact, less redundant representation, exploiting the abundance of cliques and bicliques as elementary topological motifs. We demonstrate with five examples the advantages of Power Graph Analysis. Investigating protein-protein interaction networks, we show how the catalytic subunits of the casein kinase II complex are distinguishable from the regulatory subunits, how interaction profiles and sequence phylogeny of SH3 domains correlate, and how false positive interactions among high-throughput interactions are spotted. Additionally, we demonstrate the generality of Power Graph Analysis by applying it to two other types of networks. We show how power graphs induce a clustering of both transcription factors and target genes in bipartite transcription networks, and how the erosion of a phosphatase domain in type 22 non-receptor tyrosine phosphatases is detected. We apply Power Graph Analysis to high-throughput protein interaction networks and show that up to 85% (56% on average) of the information is redundant. Experimental networks are more compressible than rewired ones of same degree distribution, indicating that experimental networks are rich in cliques and bicliques. Power Graphs are a novel representation of networks, which reduces network complexity by explicitly representing re-occurring network motifs. Power Graphs compress up to 85% of the edges in protein interaction networks and are applicable to all types of networks such as protein interactions, regulatory networks, or homology networks.
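
    A minimal sketch of the compression idea (a greedy clique grouping with networkx, not the authors' Power Graph algorithm): estimate how many edges could be saved if each large clique were represented by a single power node:

```python
import networkx as nx

def clique_edge_savings(g, min_size=3):
    """Greedy estimate: largest maximal cliques first, each node grouped at most once."""
    saved, used = 0, set()
    for clique in sorted(nx.find_cliques(g), key=len, reverse=True):
        clique = [v for v in clique if v not in used]
        k = len(clique)
        if k >= min_size:
            saved += k * (k - 1) // 2 - k     # k*(k-1)/2 edges become k memberships
            used.update(clique)
    return saved

g = nx.karate_club_graph()
s = clique_edge_savings(g)
print(f"{s} of {g.number_of_edges()} edges "
      f"({100 * s / g.number_of_edges():.0f}%) could be grouped away")
```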

  19. Navigating freely-available software tools for metabolomics analysis.

    Science.gov (United States)

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas, as well as in regards to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. To compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review by either ≥ 50 citations on Web of Science (as of 08/09/16) or the use of the tool being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks e.g. peak picking.

  20. Software-based microwave CT system consisting of antennas and vector network analyzer.

    Science.gov (United States)

    Ogawa, Takahiro; Miyakawa, Michio

    2011-01-01

    We have developed a software-based microwave CT (SMCT) system that consists of antennas and a vector network analyzer. Regardless of the scanner type, SMCT collects the S-parameters at each measurement position in the frequency range of interest. After collecting all the S-parameters, it calculates the shortest path to obtain the projection data for CP-MCT. Because of the redundant data in SMCT, the calculation of the projection is easily optimized. Therefore, the system can improve the accuracy and stability of the measurement. Furthermore, the experimental system can be constructed at a reasonable cost. Hence, SMCT is useful for imaging experiments with CP-MCT and particularly for basic studies. This paper describes the software-based microwave imaging system, and experimental results show the usefulness of the system.

  1. Collaboration, learning and innovation across outsourced services value networks software services outsourcing in China

    CERN Document Server

    Abbott, Pamela; Du, Rong

    2015-01-01

    This book collects and reports on the results of a study conducted on the Chinese Software and Services Outsourcing (SSO) industry, focusing on one of its main players as a key case study. Two sets of research findings are presented: first, the knowledge management and communication processes inherent within a highly collaborative software development project between the case study company and one of its long-term UK clients are explored and distilled into specific practices; second, at the organizational level, the strategies used by the company to build and exploit capabilities and to dynamically configure resources to promote specific value positions along its outsourced services value networks are identified and discussed. The significance of these findings for similar China-based global high-tech firms and the value of this organizational form in moving closer to the goals of the 2020 enterprise vision are both discussed, along with the implications of the findings for EU/UK businesses operating in simil...

  2. Mechanism of asymmetric software structures: A complex network perspective from behaviors of new nodes

    Science.gov (United States)

    Wang, Lei; Wang, Yu; Zhao, Yulong

    2014-11-01

    Studying the function call graphs in complex software can provide significant insights into the software evolution process. We found node in- and out-degree distributions asymmetric in the call graphs of 223 versions of Linux kernel modules (V1.1.0 to V2.4.35). Nodes newly introduced in these modules tended to attach to themselves (clustering) and existing high in-degree nodes. We proposed the αβ Model to generate call graphs for different kernel modules. The model preserved asymmetry in the degree distributions and simulated the new node behaviors. Last, we discussed how the αβ Model could be used effectively to study the robustness of complex networks.
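
    A minimal generative sketch in the spirit of, but not identical to, the αβ Model: each new node calls existing nodes chosen preferentially by in-degree with probability alpha, and otherwise calls a node in its own recently added cluster; the resulting in-degree distribution becomes heavy-tailed while out-degree stays bounded:

```python
import random
import networkx as nx

def generate_call_graph(n=500, m=3, alpha=0.7, cluster=10, seed=0):
    """Toy directed call-graph generator with preferential attachment on in-degree."""
    rng = random.Random(seed)
    g = nx.DiGraph([(0, 1), (1, 2), (2, 0)])                # small seed graph
    for v in range(3, n):
        g.add_node(v)
        recent = list(range(max(0, v - cluster), v))         # the node's "own" cluster
        targets = set()
        while len(targets) < m:
            if rng.random() < alpha:
                nodes = list(g.nodes)
                weights = [g.in_degree(u) + 1 for u in nodes]   # +1 avoids zero weights
                u = rng.choices(nodes, weights=weights, k=1)[0]
            else:
                u = rng.choice(recent)                        # local clustering behaviour
            if u != v:
                targets.add(u)
        g.add_edges_from((v, u) for u in targets)
    return g

g = generate_call_graph()
print("max in-degree :", max(d for _, d in g.in_degree()))    # heavy-tailed
print("max out-degree:", max(d for _, d in g.out_degree()))   # stays close to m
```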

  3. Delay Bounded Multi-Source Multicast in Software-Defined Networking

    Directory of Open Access Journals (Sweden)

    Thabo Semong

    2018-01-01

    Full Text Available Software-Defined Networking (SDN) is the next generation network architecture with exciting application prospects. The control function in SDN is decoupled from the data forwarding plane, hence it provides a new centralized architecture with flexible network resource management. Although SDN is attracting much attention from both industry and research, its advantage over traditional networks has not been fully utilized. Multicast is designed to deliver content to multiple destinations. Current traffic engineering in SDN focuses mainly on unicast; however, multicast can effectively reduce network resource consumption by serving multiple clients. This paper studies a novel delay-bounded multi-source multicast SDN problem, in which, among the set of potential sources, we select a source to build the multicast tree, under the constraint that the transmission delay for every destination is bounded. This problem is more difficult than the traditional Steiner minimum tree (SMT) problem, since it also needs to find a source from the set of all potential sources. We model the problem as a mixed-integer linear program (MILP) and prove its NP-hardness. To solve the problem, a delay-bounded multi-source (DBMS) scheme is proposed, which includes a DBMS algorithm to build a minimum delay cost DBMS-Forest. Through a MATLAB experiment, we demonstrate that DBMS is significantly more efficient and outperforms other existing algorithms in the literature.
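
    A minimal heuristic sketch of the source-selection step (plain networkx, not the paper's DBMS algorithm): candidate sources whose shortest-delay paths miss the bound for any destination are discarded, and the cheapest remaining tree wins:

```python
import networkx as nx

def pick_source(g, sources, destinations, delay_bound, delay="delay", cost="cost"):
    """Pick the candidate source whose delay-feasible shortest-path tree is cheapest."""
    best = (None, float("inf"), None)
    for s in sources:
        dist, paths = nx.single_source_dijkstra(g, s, weight=delay)
        if any(d not in dist or dist[d] > delay_bound for d in destinations):
            continue                              # some destination misses the bound
        tree_edges = set()
        for d in destinations:
            tree_edges.update(zip(paths[d], paths[d][1:]))
        total = sum(g[u][v][cost] for u, v in tree_edges)
        if total < best[1]:
            best = (s, total, tree_edges)
    return best

g = nx.Graph()
g.add_edge("s1", "a", delay=2, cost=1); g.add_edge("s2", "a", delay=1, cost=3)
g.add_edge("a", "d1", delay=2, cost=1); g.add_edge("a", "d2", delay=3, cost=2)
print(pick_source(g, ["s1", "s2"], ["d1", "d2"], delay_bound=5))
```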

  4. Impact of Nodal Centrality Measures to Robustness in Software-Defined Networking

    Directory of Open Access Journals (Sweden)

    Tomas Hegr

    2014-01-01

    Full Text Available The paper deals with network robustness from the perspective of nodal centrality measures and its applicability in Software-Defined Networking (SDN). Traditional graph characteristics have been evolving over the last century, and numerous less-conventional metrics were introduced in an attempt to bring a new view to particular graph attributes. New control technologies can finally utilize these metrics but simultaneously pose new challenges. SDN brings a fine-grained and nearly online view of the underlying network state, which allows advanced routing and forwarding to be implemented. In such a situation, sophisticated algorithms can be applied utilizing pre-computed network measures. Since a recent version of the SDN protocol OpenFlow (OF) has revived the idea of fast link failover, the authors in this paper introduce a novel metric, the Quality of Alternative Paths centrality (QAP). The QAP value quantifies a node's surroundings and can be utilized to advantage in algorithms to indicate more robust paths. The centrality is evaluated using node-failure simulation on different network topologies in combination with the Quality of Backup centrality measure.
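
    A minimal sketch of this kind of evaluation (standard centralities on a toy topology, not the proposed QAP metric itself): a node's centrality is related to the network-wide efficiency lost when that node fails:

```python
import networkx as nx

g = nx.barabasi_albert_graph(200, 2, seed=42)     # toy scale-free topology
base = nx.global_efficiency(g)
betweenness = nx.betweenness_centrality(g)

def failure_impact(node):
    """Drop in global efficiency when `node` fails."""
    h = g.copy()
    h.remove_node(node)
    return base - nx.global_efficiency(h)

impact = {n: failure_impact(n) for n in list(g)[:30]}        # sample of nodes
for n in sorted(impact, key=impact.get, reverse=True)[:5]:
    print(f"node {n:3d}  betweenness={betweenness[n]:.3f}  "
          f"efficiency drop={impact[n]:.4f}")
```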

  5. CÆLIS: software for assimilation, management and processing data of an atmospheric measurement network

    Science.gov (United States)

    Fuertes, David; Toledano, Carlos; González, Ramiro; Berjón, Alberto; Torres, Benjamín; Cachorro, Victoria E.; de Frutos, Ángel M.

    2018-02-01

    Given the importance of atmospheric aerosol, the number of instruments and measurement networks focusing on its characterization is growing. Many challenges arise from the standardization of protocols, the monitoring of instrument status to evaluate network data quality, and the manipulation and distribution of large volumes of data (raw and processed). CÆLIS is a software system which aims to simplify the management of a network, providing tools to monitor the instruments, process the data in real time and offer the scientific community a new tool to work with the data. Since 2008 CÆLIS has been successfully applied to the photometer calibration facility managed by the University of Valladolid, Spain, in the framework of the Aerosol Robotic Network (AERONET). Thanks to the use of advanced tools, this facility has been able to analyze a growing number of stations and data in real time, which greatly benefits network management and data quality control. The present work describes the system architecture of CÆLIS and gives some examples of applications and data processing.

  6. Signed Link Analysis in Social Media Networks

    OpenAIRE

    Beigi, Ghazaleh; Tang, Jiliang; Liu, Huan

    2016-01-01

    Numerous real-world relations can be represented by signed networks with positive links (e.g., trust) and negative links (e.g., distrust). Link analysis plays a crucial role in understanding the link formation and can advance various tasks in social network analysis such as link prediction. The majority of existing works on link analysis have focused on unsigned social networks. The existence of negative links determines that properties and principles of signed networks are substantially dist...

  7. Social network analysis in medical education

    OpenAIRE

    Isba, Rachel; Woolf, Katherine; Hanneman, Robert

    2016-01-01

    Content\\ud Humans are fundamentally social beings. The social systems within which we live our lives (families, schools, workplaces, professions, friendship groups) have a significant influence on our health, success and well-being. These groups can be characterised as networks and analysed using social network analysis.\\ud \\ud Social Network Analysis\\ud Social network analysis is a mainly quantitative method for analysing how relationships between individuals form and affect those individual...

  8. Introduction to the KWALON Experiment: Discussions on Qualitative Data Analysis Software by Developers and Users

    Directory of Open Access Journals (Sweden)

    Jeanine C. Evers

    2010-11-01

    Full Text Available In this introduction to the KWALON Experiment and related conference, we describe the motivations of the collaborating European networks in organising this joint endeavour. The KWALON Experiment consisted of five developers of Qualitative Data Analysis (QDA software analysing a dataset regarding the financial crisis in the time period 2008-2009, provided by the conference organisers. Besides this experiment, researchers were invited to present their reflective papers on the use of QDA software. This introduction gives a description of the experiment, the "rules", research questions and reflective points, as well as a full description of the dataset and search rules used, and our reflection on the lessons learned. The related conference is described, as are the papers which are included in this FQS issue. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1101405

  9. Prediction Model for Object Oriented Software Development Effort Estimation Using One Hidden Layer Feed Forward Neural Network with Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Chandra Shekhar Yadav

    2014-01-01

    Full Text Available The budget computation for software development is affected by the prediction of software development effort and schedule. Software development effort and schedule can be predicted precisely on the basis of past software project data sets. In this paper, a model for object-oriented software development effort estimation using a one hidden layer feed forward neural network (OHFNN) has been developed. The model has been further optimized with the help of a genetic algorithm, by taking the weight vector obtained from the OHFNN as the initial population for the genetic algorithm. Convergence has been obtained by minimizing the sum of squared errors over each input vector, and the optimal weight vector has been determined to predict the software development effort. The model has been empirically validated on the PROMISE software engineering repository dataset. The performance of the model is more accurate than that of the well-established constructive cost model (COCOMO).

  10. [Applications of elementary mode analysis in biological network and pathway analysis].

    Science.gov (United States)

    Zhao, Quanyu; Yu, Shuiyan; Shi, Jiping

    2013-06-01

    Elementary mode analysis is a widely applied tool in metabolic pathway analysis. Studies based on elementary mode analysis (EMA) have been performed for both metabolic networks and signal transduction networks. Its objects of analysis range from cells to bioreactors, and even ecological systems. EMA can describe biological behaviors with steady-state and dynamic models. Not only microorganism metabolism but also human health can be evaluated by EMA. The algorithms and software for calculating elementary modes (EMs) are analyzed. The applications of EMA are reviewed, such as special metabolic pathways and the robustness of metabolic networks, metabolic flux decomposition, metabolic flux analysis at steady state, dynamic models and bioprocess simulation, network structure and regulation, strain design, and signal transduction networks. Solving the combinatorial explosion, exploring the relations between EMs and metabolic regulation, and improving the algorithmic efficiency of strain design are important issues for EMA in the future.

  11. Okayama optical polarimetry and spectroscopy system (OOPS) II. Network-transparent control software.

    Science.gov (United States)

    Sasaki, T.; Kurakami, T.; Shimizu, Y.; Yutani, M.

    The control system of the OOPS (Okayama Optical Polarimetry and Spectroscopy system) is designed to integrate several instruments whose controllers are distributed over a network: the OOPS instrument, a CCD camera and data acquisition unit, the 91 cm telescope, an autoguider, a weather monitor, and the image display tool SAOimage. With the help of message-based communication, the control processes cooperate with related processes to perform an astronomical observation under the supervising control of a scheduler process. A logger process collects status data from all the instruments and distributes them to related processes upon request. The software structure of each process is described.

  12. An FPGA hardware/software co-design towards evolvable spiking neural networks for robotics application.

    Science.gov (United States)

    Johnston, S P; Prasad, G; Maguire, L; McGinnity, T M

    2010-12-01

    This paper presents an approach that permits the effective hardware realization of a novel Evolvable Spiking Neural Network (ESNN) paradigm on Field Programmable Gate Arrays (FPGAs). The ESNN possesses a hybrid learning algorithm that consists of a Spike Timing Dependent Plasticity (STDP) mechanism fused with a Genetic Algorithm (GA). The design and implementation direction utilizes the latest advancements in FPGA technology to provide a partitioned hardware/software co-design solution. The approach achieves the maximum FPGA flexibility obtainable for the ESNN paradigm. The algorithm was applied as an embedded intelligent system robotic controller to solve an autonomous navigation and obstacle avoidance problem.

  13. Exploring machine-learning-based control plane intrusion detection techniques in software defined optical networks

    Science.gov (United States)

    Zhang, Huibin; Wang, Yuqiao; Chen, Haoran; Zhao, Yongli; Zhang, Jie

    2017-12-01

    In software defined optical networks (SDON), the centralized control plane may encounter numerous intrusion threats which compromise the security level of provisioned services. In this paper, the issue of control plane security is studied and two machine-learning-based control plane intrusion detection techniques are proposed for SDON with properly selected features such as bandwidth, route length, etc. We validate the feasibility and efficiency of the proposed techniques by simulations. Results show that an accuracy of 83% for intrusion detection can be achieved with the proposed machine-learning-based control plane intrusion detection techniques.
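
    A minimal sketch of such a detector (synthetic flow features and an assumed labelling rule, not the paper's dataset or models): a classifier is trained on features such as requested bandwidth and route length to flag suspicious control-plane requests:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(7)
n = 2000
bandwidth  = rng.gamma(2.0, 50.0, n)      # requested bandwidth (Mb/s)
route_len  = rng.integers(1, 10, n)       # route length (hops)
setup_rate = rng.poisson(3, n)            # connection requests per second per client
# Assumed labelling rule for the toy data: flag aggressive, long-route, high-rate flows.
y = ((bandwidth > 150) & (route_len > 5) | (setup_rate > 7)).astype(int)

X = np.column_stack([bandwidth, route_len, setup_rate])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("detection accuracy:", round(accuracy_score(y_te, clf.predict(X_te)), 3))
```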

  14. Sensor data validation and reconstruction in water networks : a methodology and software implementation

    OpenAIRE

    García Valverde, Diego; Quevedo Casín, Joseba Jokin; Puig Cayuela, Vicenç; Cugueró Escofet, Miquel Àngel

    2014-01-01

    In this paper, a data validation and reconstruction methodology that can be applied to the sensors used for real-time monitoring in water networks is presented. On the one hand, a validation approach based on quality levels is described to detect potential invalid and missing data. On the other hand, the reconstruction strategy is based on a set of temporal and spatial models used to estimate missing/invalid data with the model estimation providing the best fit. A software tool implementing t...

  15. OpenFLUX: efficient modelling software for 13C-based metabolic flux analysis.

    Science.gov (United States)

    Quek, Lake-Ee; Wittmann, Christoph; Nielsen, Lars K; Krömer, Jens O

    2009-05-01

    The quantitative analysis of metabolic fluxes, i.e., in vivo activities of intracellular enzymes and pathways, provides key information on biological systems in systems biology and metabolic engineering. It is based on a comprehensive approach combining (i) tracer cultivation on 13C substrates, (ii) 13C labelling analysis by mass spectrometry and (iii) mathematical modelling for experimental design, data processing, flux calculation and statistics. Whereas the cultivation and the analytical part is fairly advanced, a lack of appropriate modelling software solutions for all modelling aspects in flux studies is limiting the application of metabolic flux analysis. We have developed OpenFLUX as a user friendly, yet flexible software application for small and large scale 13C metabolic flux analysis. The application is based on the new Elementary Metabolite Unit (EMU) framework, significantly enhancing computation speed for flux calculation. From simple notation of metabolic reaction networks defined in a spreadsheet, the OpenFLUX parser automatically generates MATLAB-readable metabolite and isotopomer balances, thus strongly facilitating model creation. The model can be used to perform experimental design, parameter estimation and sensitivity analysis either using the built-in gradient-based search or Monte Carlo algorithms or in user-defined algorithms. Exemplified for a microbial flux study with 71 reactions, 8 free flux parameters and mass isotopomer distributions of 10 metabolites, OpenFLUX allowed the EMU-based model to be compiled automatically from an Excel file containing metabolic reactions and carbon transfer mechanisms, showing its user-friendliness. It reliably reproduced the published data, and optimum flux distributions for the network under study were found quickly. By providing the software open source, we hope it will evolve with the rapidly growing field of fluxomics.

  16. Development of RCM analysis software for Korean nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Ho; Choi, Kwang Hee; Jeong, Hyeong Jong [Korea Electric Power Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    A software package called the KEPCO RCM workstation (KRCM) has been developed to optimize the maintenance strategies of Korean nuclear power plants. The program modules of the KRCM were designed in a manner that combines EPRI methodologies and KEPRI analysis techniques. The KRCM is being applied to three pilot systems: the chemical and volume control system, the main steam system, and the compressed air system of Yonggwang Units 1 and 2. In addition, the KRCM can be utilized as a tool to meet part of the requirements of the maintenance rule (MR) imposed by the U.S. NRC. 3 refs., 4 figs. (Author)

  17. Discriminant Analysis of the Effects of Software Cost Drivers on ...

    African Journals Online (AJOL)

    The paper investigates the effect of software cost drivers on the project schedule estimation of software development projects in Nigeria. Specifically, the paper determines the extent to which software cost variables affect the software project time schedule in our environment. Such studies are lacking in the recent ...

  18. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
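
    A minimal sketch of combining n-factor combinatorial variation with Monte Carlo sampling (hypothetical parameter names, not the NASA tool itself): every pair of discrete settings is covered at least once, while the continuous parameters are drawn randomly:

```python
import itertools
import random

discrete = {                                   # hypothetical simulation settings
    "controller_mode": ["nominal", "safe_hold", "detumble"],
    "sensor_suite":    ["full", "degraded"],
    "thruster_set":    ["A", "B"],
}
continuous = {                                 # hypothetical (low, high) ranges
    "initial_spin_dps": (0.0, 5.0),
    "mass_kg":          (95.0, 105.0),
}

def required_pairs(params):
    """All value pairs for every pair of discrete parameters (2-factor coverage)."""
    pairs = set()
    for (p1, v1s), (p2, v2s) in itertools.combinations(params.items(), 2):
        pairs.update((p1, v1, p2, v2) for v1 in v1s for v2 in v2s)
    return pairs

def build_cases(seed=0):
    """One test case per required pair (guaranteed coverage, not a minimal array),
    with remaining discrete settings and all continuous parameters drawn randomly."""
    rng = random.Random(seed)
    cases = []
    for p1, v1, p2, v2 in sorted(required_pairs(discrete)):
        case = {k: rng.choice(vals) for k, vals in discrete.items()}
        case[p1], case[p2] = v1, v2
        case.update({k: rng.uniform(lo, hi) for k, (lo, hi) in continuous.items()})
        cases.append(case)
    return cases

cases = build_cases()
print(len(cases), "cases, e.g.", cases[0])
```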

  19. Integrating software architectures for distributed simulations and simulation analysis communities.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael; Moore, Patrick Curtis; Sa, Timothy J.; Hawley, Marilyn F.

    2005-10-01

    The one-year Software Architecture LDRD (No.79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld™ collaboration client, which uses the GroupMeld™ synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.

  20. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    CERN Document Server

    Donges, Jonathan F; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V; Marwan, Norbert; Dijkstra, Henk A; Kurths, Jürgen

    2015-01-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence qua...
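
    A minimal sketch of functional network construction (plain numpy/networkx rather than pyunicorn's own API): the correlation matrix of a set of time series is thresholded into a graph, which can then be analyzed with standard complex-network measures:

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(3)
n_nodes, n_steps = 30, 500
common = rng.normal(size=n_steps)                    # shared driver creates structure
series = 0.6 * common + rng.normal(size=(n_nodes, n_steps))

corr = np.corrcoef(series)
np.fill_diagonal(corr, 0.0)
threshold = np.quantile(np.abs(corr), 0.90)          # keep the strongest 10% of links
adj = (np.abs(corr) >= threshold).astype(int)

g = nx.from_numpy_array(adj)
print("edges:", g.number_of_edges())
print("average clustering:", round(nx.average_clustering(g), 3))
print("transitivity:", round(nx.transitivity(g), 3))
```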

  1. Structural Analysis of Complex Networks

    CERN Document Server

    Dehmer, Matthias

    2011-01-01

    Filling a gap in literature, this self-contained book presents theoretical and application-oriented results that allow for a structural exploration of complex networks. The work focuses not only on classical graph-theoretic methods, but also demonstrates the usefulness of structural graph theory as a tool for solving interdisciplinary problems. Applications to biology, chemistry, linguistics, and data analysis are emphasized. The book is suitable for a broad, interdisciplinary readership of researchers, practitioners, and graduate students in discrete mathematics, statistics, computer science,

  2. A Scheme to Optimize Flow Routing and Polling Switch Selection of Software Defined Networks

    Science.gov (United States)

    Chen, Huan; Li, Lemin; Ren, Jing; Wang, Yang; Zhao, Yangming; Wang, Xiong; Wang, Sheng; Xu, Shizhong

    2015-01-01

    This paper aims at minimizing the communication cost for collecting flow information in Software Defined Networks (SDN). Since the flow-based information collecting method requires too much communication cost, and the recently proposed switch-based method cannot benefit from controlling flow routing, jointly optimizing flow routing and polling switch selection is proposed to reduce the communication cost. To this end, the joint optimization problem is first formulated as an Integer Linear Programming (ILP) model. Since the ILP model is intractable in large networks, we also design an optimal algorithm for the multi-rooted tree topology and an efficient heuristic algorithm for general topologies. According to extensive simulations, it is found that our method can save up to 55.76% communication cost compared with the state-of-the-art switch-based scheme. PMID:26690571

  3. A Scheme to Optimize Flow Routing and Polling Switch Selection of Software Defined Networks.

    Directory of Open Access Journals (Sweden)

    Huan Chen

    Full Text Available This paper aims at minimizing the communication cost for collecting flow information in Software Defined Networks (SDN). Since the flow-based information collecting method requires too much communication cost, and the recently proposed switch-based method cannot benefit from controlling flow routing, jointly optimizing flow routing and polling switch selection is proposed to reduce the communication cost. To this end, the joint optimization problem is first formulated as an Integer Linear Programming (ILP) model. Since the ILP model is intractable in large networks, we also design an optimal algorithm for the multi-rooted tree topology and an efficient heuristic algorithm for general topologies. According to extensive simulations, it is found that our method can save up to 55.76% communication cost compared with the state-of-the-art switch-based scheme.

  4. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    DEFF Research Database (Denmark)

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck

    2013-01-01

    In large-scale proteomics studies there is a temptation, after months of experimental work, to plug resulting data into a convenient—if poorly implemented—set of tools, which may neither do the data justice nor help answer the scientific question. In this paper we have captured key concerns, including arguments for community-wide open source software development and "big data" compatible solutions for the future. However there is clearly a real need for robust tools, standard operating procedures and general acceptance of best practises. Thus we submit to the proteomics community a call for a community-wide open set of proteomics analysis challenges—PROTEINCHALLENGE—that directly target and compare data analysis workflows. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate.

  5. Analysis of Test Efficiency during Software Development Process

    OpenAIRE

    nair, T. R. GopalaKrishnan; Suma, V.; Tiwari, Pranesh Kumar

    2012-01-01

    One of the prerequisites of any organization is unvarying sustainability in a dynamic and competitive industrial environment. The development of high quality software is therefore an inevitable constraint of any software industry. Defect management being one of the most influential factors in the production of high quality software, it is obligatory for software organizations to orient themselves towards effective defect management. Since the time of software evolution, testing is deemed a

  6. Hazard Analysis of Software Requirements Specification for Process Module of FPGA-based Controllers in NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Sejin; Kim, Eui-Sub; Yoo, Junbeom [Konkuk University, Seoul (Korea, Republic of); Keum, Jong Yong; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Software in PLCs and FPGAs used to develop I&C systems should also be analyzed for hazards and risks before use. NUREG/CR-6430 proposes a method for performing software hazard analysis. It suggests an analysis technique for software-affected hazards and states that software hazard analysis should be performed across the phases of the software life cycle, such as requirements analysis, design, detailed design, and implementation. It also provides guide phrases for applying software hazard analysis. HAZOP (hazard and operability analysis) is one of the analysis techniques introduced in NUREG/CR-6430, and it is a useful technique for applying guide phrases. HAZOP is sometimes used to analyze the safety of software. The analysis method of NUREG/CR-6430 has been used for PLC development in Korean nuclear power plant software. In those studies, appropriate guide phrases and analysis processes were selected for efficient application, and NUREG/CR-6430 was identified as providing applicable methods for software hazard analysis. We perform software hazard analysis of an FPGA software requirements specification with two approaches, NUREG/CR-6430 and HAZOP with general guide words (GW). We also perform a comparative analysis of the two. The NUREG/CR-6430 approach has several pros and cons compared with the HAZOP approach using general guide words. It is sufficiently applicable to analyze the software requirements specification of an FPGA.

  7. BRAPH: A graph theory software for the analysis of brain connectivity.

    Directory of Open Access Journals (Sweden)

    Mite Mijalkov

    Full Text Available The brain is a large-scale complex network whose workings rely on the interaction between its various regions. In the past few years, the organization of the human brain network has been studied extensively using concepts from graph theory, where the brain is represented as a set of nodes connected by edges. This representation of the brain as a connectome can be used to assess important measures that reflect its topological architecture. We have developed a freeware MatLab-based software (BRAPH-BRain Analysis using graPH theory) for connectivity analysis of brain networks derived from structural magnetic resonance imaging (MRI), functional MRI (fMRI), positron emission tomography (PET) and electroencephalogram (EEG) data. BRAPH allows building connectivity matrices, calculating global and local network measures, performing non-parametric permutations for group comparisons, assessing the modules in the network, and comparing the results to random networks. By contrast to other toolboxes, it allows performing longitudinal comparisons of the same patients across different points in time. Furthermore, even though a user-friendly interface is provided, the architecture of the program is modular (object-oriented) so that it can be easily expanded and customized. To demonstrate the abilities of BRAPH, we performed structural and functional graph theory analyses in two separate studies. In the first study, using MRI data, we assessed the differences in global and nodal network topology in healthy controls, patients with amnestic mild cognitive impairment, and patients with Alzheimer's disease. In the second study, using resting-state fMRI data, we compared healthy controls and Parkinson's patients with mild cognitive impairment.

  8. BRAPH: A graph theory software for the analysis of brain connectivity.

    Science.gov (United States)

    Mijalkov, Mite; Kakaei, Ehsan; Pereira, Joana B; Westman, Eric; Volpe, Giovanni

    2017-01-01

    The brain is a large-scale complex network whose workings rely on the interaction between its various regions. In the past few years, the organization of the human brain network has been studied extensively using concepts from graph theory, where the brain is represented as a set of nodes connected by edges. This representation of the brain as a connectome can be used to assess important measures that reflect its topological architecture. We have developed a freeware MatLab-based software (BRAPH-BRain Analysis using graPH theory) for connectivity analysis of brain networks derived from structural magnetic resonance imaging (MRI), functional MRI (fMRI), positron emission tomography (PET) and electroencephalogram (EEG) data. BRAPH allows building connectivity matrices, calculating global and local network measures, performing non-parametric permutations for group comparisons, assessing the modules in the network, and comparing the results to random networks. By contrast to other toolboxes, it allows performing longitudinal comparisons of the same patients across different points in time. Furthermore, even though a user-friendly interface is provided, the architecture of the program is modular (object-oriented) so that it can be easily expanded and customized. To demonstrate the abilities of BRAPH, we performed structural and functional graph theory analyses in two separate studies. In the first study, using MRI data, we assessed the differences in global and nodal network topology in healthy controls, patients with amnestic mild cognitive impairment, and patients with Alzheimer's disease. In the second study, using resting-state fMRI data, we compared healthy controls and Parkinson's patients with mild cognitive impairment.
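
    A minimal sketch of this style of analysis (random toy connectivity matrices, not BRAPH itself or real imaging data): each subject's matrix is thresholded into a graph, a global measure is computed, and the groups are compared with a non-parametric permutation test:

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(11)

def global_efficiency(mat, threshold=0.3):
    """Binarize a connectivity matrix and compute global efficiency of the graph."""
    g = nx.from_numpy_array((mat > threshold).astype(int))
    return nx.global_efficiency(g)

def random_group(n_subjects, strength, n_regions=20):
    """Random symmetric 'connectivity' matrices standing in for one subject group."""
    mats = []
    for _ in range(n_subjects):
        a = rng.uniform(0, strength, size=(n_regions, n_regions))
        a = (a + a.T) / 2
        np.fill_diagonal(a, 0.0)
        mats.append(a)
    return mats

controls = [global_efficiency(m) for m in random_group(15, 1.0)]
patients = [global_efficiency(m) for m in random_group(15, 0.8)]

observed = np.mean(controls) - np.mean(patients)
pooled = np.array(controls + patients)
exceed = 0
for _ in range(5000):                      # non-parametric permutation test
    rng.shuffle(pooled)
    diff = pooled[:15].mean() - pooled[15:].mean()
    exceed += abs(diff) >= abs(observed)
print(f"group difference = {observed:.3f}, permutation p = {exceed / 5000:.3f}")
```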

  9. Scalable Node-Centric Route Mutation for Defense of Large-Scale Software-Defined Networks

    Directory of Open Access Journals (Sweden)

    Yang Zhou

    2017-01-01

    Full Text Available Exploiting software-defined networking techniques, randomly and instantly mutating routes can disguise strategically important infrastructure and protect the integrity of data networks. Route mutation has to date been formulated as an NP-complete constraint satisfaction problem, where feasible sets of routes need to be generated with exponential computational complexity, limiting algorithmic scalability to large-scale networks. In this paper, we propose a novel node-centric route mutation method which interprets route mutation as a signature matching problem. We formulate the route mutation problem as a three-dimensional earth mover’s distance (EMD) model and solve it by using a binary branch and bound method. Considering scalability, we further propose a heuristic method that yields significantly lower computational complexity with marginal loss of robustness against eavesdropping. Simulation results show that our proposed methods can effectively disguise key infrastructure by reducing the difference of historically accumulated traffic among different switches. With significantly reduced complexity, our algorithms are of particular interest for safeguarding large-scale networks.

  10. Combining Topological Hardware and Topological Software: Color-Code Quantum Computing with Topological Superconductor Networks

    Directory of Open Access Journals (Sweden)

    Daniel Litinski

    2017-09-01

    Full Text Available We present a scalable architecture for fault-tolerant topological quantum computation using networks of voltage-controlled Majorana Cooper pair boxes and topological color codes for error correction. Color codes have a set of transversal gates which coincides with the set of topologically protected gates in Majorana-based systems, namely, the Clifford gates. In this way, we establish color codes as providing a natural setting in which advantages offered by topological hardware can be combined with those arising from topological error-correcting software for full-fledged fault-tolerant quantum computing. We provide a complete description of our architecture, including the underlying physical ingredients. We start by showing that in topological superconductor networks, hexagonal cells can be employed to serve as physical qubits for universal quantum computation, and we present protocols for realizing topologically protected Clifford gates. These hexagonal-cell qubits allow for a direct implementation of open-boundary color codes with ancilla-free syndrome read-out and logical T gates via magic-state distillation. For concreteness, we describe how the necessary operations can be implemented using networks of Majorana Cooper pair boxes, and we give a feasibility estimate for error correction in this architecture. Our approach is motivated by nanowire-based networks of topological superconductors, but it could also be realized in alternative settings such as quantum-Hall–superconductor hybrids.

  11. Topological Analysis of Wireless Networks (TAWN)

    Science.gov (United States)

    2016-05-31

    Final report (12 February 2015 to 31 May 2016) on the Topological Analysis of Wireless Networks (TAWN) project; Principal Investigator: Prof. Michael Robinson. The work draws on the mathematical literature on sheaves, which describes how to draw global (network-wide) inferences from them. Keywords: wireless network, local homology, sheaf topology.

  12. Don't Blame the Software: Using Qualitative Data Analysis Software Successfully in Doctoral Research

    Directory of Open Access Journals (Sweden)

    Michelle Salmona

    2016-07-01

    Full Text Available In this article, we explore the learning experiences of doctoral candidates as they use qualitative data analysis software (QDAS. Of particular interest is the process of adopting technology during the development of research methodology. Using an action research approach, data was gathered over five years from advanced doctoral research candidates and supervisors. The technology acceptance model (TAM was then applied as a theoretical analytic lens for better understanding how students interact with new technology. Findings relate to two significant barriers which doctoral students confront: 1. aligning perceptions of ease of use and usefulness is essential in overcoming resistance to technological change; 2. transparency into the research process through technology promotes insights into methodological challenges. Transitioning through both barriers requires a competent foundation in qualitative research. The study acknowledges the importance of higher degree research, curriculum reform and doctoral supervision in post-graduate research training together with their interconnected relationships in support of high-quality inquiry. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1603117

  13. Sample EP Flow Analysis of Severely Damaged Networks

    Energy Technology Data Exchange (ETDEWEB)

    Werley, Kenneth Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); McCown, Andrew William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-10-12

    These are slides for a presentation at the working group meeting of the WESC SREMP Software Product Integration Team on sample EP flow analysis of severely damaged networks. The following topics are covered: ERCOT EP Transmission Model; Zoomed in to Houston and Overlaying StreetAtlas; EMPACT Solve/Dispatch/Shedding Options; QACS BaseCase Power Flow Solution; 3 Substation Contingency; Gen. & Load/100 Optimal Dispatch; Dispatch Results; Shed Load for Low V; Network Damage Summary; Estimated Service Areas (Potential); Estimated Outage Areas (potential).

  14. Real-time video streaming of sonographic clips using domestic internet networks and free videoconferencing software.

    Science.gov (United States)

    Liteplo, Andrew S; Noble, Vicki E; Attwood, Ben H C

    2011-11-01

    As the use of point-of-care sonography spreads, so too does the need for remote expert over-reading via telesonography. We sought to assess the feasibility of using familiar, widespread, and cost-effective existing technology to allow remote over-reading of sonograms in real time and to compare 4 different methods of transmission and communication for both feasibility of transmission and image quality. Sonographic video clips were transmitted using 2 different connections (WiFi and 3G) and via 2 different videoconferencing modalities (iChat [Apple Inc, Cupertino, CA] and Skype [Skype Software Sàrl, Luxembourg]), for a total of 4 different permutations. The clips were received at a remote location, recorded, and then scored by expert reviewers for image quality, resolution, and detail. Wireless transmission of sonographic clips was feasible in all cases when WiFi was used and when Skype was used over a 3G connection. Images transmitted via a WiFi connection were statistically superior to those transmitted via 3G in all parameters of quality (average P = .031), and those sent by iChat were superior to those sent by Skype but not statistically so (average P = .057). Wireless transmission of sonographic video clips using inexpensive hardware, free videoconferencing software, and domestic Internet networks is feasible with retention of image quality sufficient for interpretation. WiFi transmission results in greater image quality than transmission by a 3G network.

  15. Adaptive Multiclient Network-on-Chip Memory Core: Hardware Architecture, Software Abstraction Layer, and Application Exploration

    Directory of Open Access Journals (Sweden)

    Diana Göhringer

    2012-01-01

    Full Text Available This paper presents the hardware architecture and the software abstraction layer of an adaptive multiclient Network-on-Chip (NoC) memory core. The memory core supports the flexibility of a heterogeneous FPGA-based runtime adaptive multiprocessor system called RAMPSoC. The processing elements, also called clients, can access the memory core via the Network-on-Chip (NoC). The memory core supports dynamic mapping of an address space for the different clients as well as different data transfer modes, such as variable burst sizes. In this way, two main limitations of FPGA-based multiprocessor systems are mitigated: the restricted on-chip memory resources and the fact that usually only one physical channel to an off-chip memory exists. Furthermore, a software abstraction layer is introduced, which hides the complexity of the memory core architecture and provides an easy-to-use interface for the application programmer. Finally, the advantages of the novel memory core in terms of performance, flexibility, and user friendliness are shown using a real-world image processing application.

  16. An Object-Oriented Network-Centric Software Architecture for Physical Computing

    Science.gov (United States)

    Palmer, Richard

    1997-08-01

    Recent developments in object-oriented computer languages and infrastructure such as the Internet, Web browsers, and the like provide an opportunity to define a more productive computational environment for scientific programming that is based more closely on the underlying mathematics describing physics than traditional programming languages such as FORTRAN or C++. In this talk I describe an object-oriented software architecture for representing physical problems that includes classes for such common mathematical objects as geometry, boundary conditions, partial differential and integral equations, discretization and numerical solution methods, etc. In practice, a scientific program written using this architecture looks remarkably like the mathematics used to understand the problem, is typically an order of magnitude smaller than traditional FORTRAN or C++ codes, and hence is easier to understand, debug, describe, etc. All objects in this architecture are ``network-enabled,'' which means that components of a software solution to a physical problem can be transparently loaded from anywhere on the Internet or other global network. The architecture is expressed as an ``API,'' or application programmer's interface specification, with reference embeddings in Java, Python, and C++. A C++ class library for an early version of this API has been implemented for machines ranging from PCs to the IBM SP2, meaning that identical codes run on all architectures.
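
    The abstract does not spell out the API, so the Python sketch below is purely hypothetical: it shows the kind of class structure described, where geometry, boundary conditions, an equation and a solution method are separate objects and the resulting program reads close to the mathematics. Every class and method name is invented for illustration.

        # Hypothetical sketch of an object-oriented "physics-near" API of the
        # kind described above; all names here are invented for illustration.
        from dataclasses import dataclass
        import numpy as np

        @dataclass
        class Interval:                      # geometry
            left: float
            right: float

        @dataclass
        class Dirichlet:                     # boundary condition u(x) = value
            value: float

        class PoissonEquation:               # -u''(x) = f(x) on an interval
            def __init__(self, geometry, f, left_bc, right_bc):
                self.geometry, self.f = geometry, f
                self.left_bc, self.right_bc = left_bc, right_bc

        class FiniteDifferenceSolver:        # discretization + numerical method
            def __init__(self, n_points=101):
                self.n = n_points

            def solve(self, problem):
                x = np.linspace(problem.geometry.left, problem.geometry.right, self.n)
                h = x[1] - x[0]
                a = np.zeros((self.n, self.n)); b = np.zeros(self.n)
                a[0, 0] = a[-1, -1] = 1.0
                b[0], b[-1] = problem.left_bc.value, problem.right_bc.value
                for i in range(1, self.n - 1):
                    a[i, i - 1] = a[i, i + 1] = -1.0 / h**2
                    a[i, i] = 2.0 / h**2
                    b[i] = problem.f(x[i])
                return x, np.linalg.solve(a, b)

        # The "program" reads close to the mathematics of the problem:
        problem = PoissonEquation(Interval(0.0, 1.0), lambda x: 1.0,
                                  Dirichlet(0.0), Dirichlet(0.0))
        x, u = FiniteDifferenceSolver().solve(problem)
        print("max displacement:", round(float(u.max()), 4), "(analytic value 0.125)")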

  17. CRYSNET manual. Informal report. [Hardware and software of crystallographic computing network

    Energy Technology Data Exchange (ETDEWEB)

    None,

    1976-07-01

    This manual describes the hardware and software which together make up the crystallographic computing network (CRYSNET). The manual is intended as a users' guide and also provides general information for persons without any experience with the system. CRYSNET is a network of intelligent remote graphics terminals that are used to communicate with the CDC Cyber 70/76 computing system at the Brookhaven National Laboratory (BNL) Central Scientific Computing Facility. Terminals are in active use by four research groups in the field of crystallography. A protein data bank has been established at BNL to store in machine-readable form atomic coordinates and other crystallographic data for macromolecules. The bank currently includes data for more than 20 proteins. This structural information can be accessed at BNL directly by the CRYSNET graphics terminals. More than two years of experience has been accumulated with CRYSNET. During this period, it has been demonstrated that the terminals, which provide access to a large, fast third-generation computer, plus stand-alone interactive graphics capability, are useful for computations in crystallography, and in a variety of other applications as well. The terminal hardware, the actual operations of the terminals, and the operations of the BNL Central Facility are described in some detail, and documentation of the terminal and central-site software is given. (RWR)

  18. ADFNE: Open source software for discrete fracture network engineering, two and three dimensional applications

    Science.gov (United States)

    Fadakar Alghalandis, Younes

    2017-05-01

    A rapidly growing topic, discrete fracture network engineering (DFNE) has already attracted many talents from diverse disciplines in academia and industry around the world to challenge difficult problems related to mining, geothermal, civil, oil and gas, water and many other projects. Although a few commercial software packages provide some useful functionality fundamental to DFNE, their cost, closed-code (black box) distribution, and hence limited programmability and tractability, encouraged us to respond to this rising demand with a new solution. This paper introduces an open source, comprehensive software package for stochastic modeling of fracture networks in two and three dimensions in a discrete formulation. Functionalities included are geometric modeling (e.g., complex polygonal fracture faces, and utilizing directional statistics), simulations, characterizations (e.g., intersection, clustering and connectivity analyses) and applications (e.g., fluid flow). The package is completely written in the Matlab scripting language. Significant effort has been made to bring maximum flexibility to the functions in order to solve problems in both two and three dimensions in an easy and unified way that is suitable for beginner, advanced and experienced users.
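
    ADFNE itself is a MATLAB package; as a language-neutral, hedged illustration of two of the listed functionalities (stochastic generation and connectivity/clustering analysis), the Python sketch below generates random two-dimensional fracture segments and counts connected clusters of intersecting fractures. It is not ADFNE code, and all parameters are invented.

        # Sketch (not ADFNE): generate random 2-D fracture segments and group
        # them into connected clusters based on pairwise intersections.
        import numpy as np

        def random_fractures(n, length=0.3, seed=1):
            rng = np.random.default_rng(seed)
            centers = rng.random((n, 2))
            angles = rng.uniform(0, np.pi, n)
            d = 0.5 * length * np.column_stack([np.cos(angles), np.sin(angles)])
            return centers - d, centers + d          # segment endpoints (p, q)

        def segments_intersect(p1, q1, p2, q2):
            def cross(o, a, b):
                return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
            d1, d2 = cross(p2, q2, p1), cross(p2, q2, q1)
            d3, d4 = cross(p1, q1, p2), cross(p1, q1, q2)
            return (d1 * d2 < 0) and (d3 * d4 < 0)    # proper crossings only

        def count_clusters(p, q):
            n = len(p)
            parent = list(range(n))
            def find(i):
                while parent[i] != i:
                    parent[i] = parent[parent[i]]
                    i = parent[i]
                return i
            for i in range(n):
                for j in range(i + 1, n):
                    if segments_intersect(p[i], q[i], p[j], q[j]):
                        parent[find(i)] = find(j)
            return len({find(i) for i in range(n)})

        p, q = random_fractures(50)
        print("number of connected fracture clusters:", count_clusters(p, q))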

  19. Review Essay: Does Qualitative Network Analysis Exist?

    Directory of Open Access Journals (Sweden)

    Rainer Diaz-Bone

    2007-01-01

    Full Text Available Social network analysis was formed and established in the 1970s as a way of analyzing systems of social relations. In this review the theoretical-methodological standpoint of social network analysis ("structural analysis") is introduced and the different forms of social network analysis are presented. Structural analysis argues that social actors and social relations are embedded in social networks, meaning that action and perception of actors as well as the performance of social relations are influenced by the network structure. Since the 1990s structural analysis has integrated concepts such as agency, discourse and symbolic orientation and in this way structural analysis has opened itself. Since then there has been increasing use of qualitative methods in network analysis. They are used to include the perspective of the analyzed actors, to explore networks, and to understand network dynamics. In the reviewed book, edited by Betina HOLLSTEIN and Florian STRAUS, the twenty predominantly empirically orientated contributions demonstrate the possibilities of combining quantitative and qualitative methods in network analyses in different research fields. In this review we examine how the contributions succeed in applying and developing the structural analysis perspective, and the self-positioning of "qualitative network analysis" is evaluated. URN: urn:nbn:de:0114-fqs0701287

  20. A networked modular hardware and software system for MRI-guided robotic prostate interventions

    Science.gov (United States)

    Su, Hao; Shang, Weijian; Harrington, Kevin; Camilo, Alex; Cole, Gregory; Tokuda, Junichi; Hata, Nobuhiko; Tempany, Clare; Fischer, Gregory S.

    2012-02-01

    Magnetic resonance imaging (MRI) provides high-resolution multi-parametric imaging, large soft tissue contrast, and interactive image updates, making it an ideal modality for diagnosing prostate cancer and guiding surgical tools. Although a substantial armamentarium of apparatuses and systems has been developed to assist surgical diagnosis and therapy for MRI-guided procedures over the last decade, a unified method to develop high-fidelity robotic systems, in terms of accuracy, dynamic performance, size, robustness and modularity, that work inside a closed-bore MRI scanner still remains a challenge. In this work, we develop and evaluate an integrated modular hardware and software system to support the surgical workflow of intra-operative MRI, with percutaneous prostate intervention as an illustrative case. Specifically, the distinct apparatuses and methods include: 1) a robot controller system for precision closed-loop control of piezoelectric motors, 2) a robot control interface software that connects the 3D Slicer navigation software and the robot controller to exchange robot commands and coordinates using the OpenIGTLink open network communication protocol, and 3) MRI scan plane alignment to the planned path and imaging of the needle as it is inserted into the target location. A preliminary experiment with an ex-vivo phantom validates the system workflow and MRI-compatibility and shows that the robotic system has a better than 0.01 mm positioning accuracy.

  1. HistFitter software framework for statistical data analysis

    CERN Document Server

    Baak, M.; Côte, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with mu...

  2. Palpebral fissure length measurement: accuracy of the FAS facial photographic analysis software and inaccuracy of the ruler.

    Science.gov (United States)

    Astley, Susan J

    2015-01-01

    Accurate fetal alcohol spectrum disorder diagnoses require accurate facial measurement. The Fetal Alcohol Syndrome (FAS) Facial Photographic Analysis Software was developed to overcome measurement error known to occur with ruler measurement of the palpebral fissure length (PFL). Recent publications have queried the Software's accuracy. 1) Demonstrate the Software's ability to accurately measure a PFL from a 2-dimensional digital facial photograph. 2) Demonstrate the frequency and magnitude of error when the PFL is measured directly by clinicians using a ruler. Objective 1: PFLs of mannequins were measured using the Software and a sliding digital caliper, with the latter serving as the gold-standard accurate measure. Mannequins allowed the caliper prongs to be placed directly on the landmarks that define the PFL. Objective 2: PFLs of 1,027 patients evaluated at the University of Washington FAS Diagnostic & Prevention Network were measured with the Software and directly by one or two clinicians using a ruler. Objective 1: The Software derived PFLs that were identical to or within 0.2 mm of the caliper measures. Objective 2: There was tremendous inter-rater variability in PFLs measured by clinicians using a hand held ruler. Seventy-seven percent of patients had their PFLs measured incorrectly (greater than 1 mm error) by at least one of the two clinicians using a ruler. The FAS Facial Photographic Analysis Software measures the PFL with the same accuracy as a sliding digital caliper, as it was programmed to do. Direct measurement of the PFL with a ruler is very prone to error.

  3. Visualization techniques for the analysis of software behavior and related structures

    OpenAIRE

    Trümper, Jonas

    2014-01-01

    Software maintenance encompasses any changes made to a software system after its initial deployment and is thereby one of the key phases in the typical software-engineering lifecycle. In software maintenance, we primarily need to understand structural and behavioral aspects, which are difficult to obtain, e.g., by code reading. Software analysis is therefore a vital tool for maintaining these systems: It provides - the preferably automated - means to extract and evaluate information from thei...

  4. Social network analysis community detection and evolution

    CERN Document Server

    Missaoui, Rokia

    2015-01-01

    This book is devoted to recent progress in social network analysis with a high focus on community detection and evolution. The eleven chapters cover the identification of cohesive groups, core components and key players either in static or dynamic networks of different kinds and levels of heterogeneity. Other important topics in social network analysis such as influential detection and maximization, information propagation, user behavior analysis, as well as network modeling and visualization are also presented. Many studies are validated through real social networks such as Twitter. This edit

  5. Network analysis literacy a practical approach to the analysis of networks

    CERN Document Server

    Zweig, Katharina A

    2014-01-01

    Network Analysis Literacy focuses on design principles for network analytics projects. The text enables readers to: pose a defined network analytic question; build a network to answer the question; choose or design the right network analytic methods for a particular purpose, and more.

  6. Social network analysis and dual rover communications

    Science.gov (United States)

    Litaker, Harry L.; Howard, Robert L.

    2013-10-01

    Social network analysis (SNA) refers to the collection of techniques, tools, and methods used in sociometry aiming at the analysis of social networks to investigate decision making, group communication, and the distribution of information. Human factors engineers at the National Aeronautics and Space Administration (NASA) conducted a social network analysis on communication data collected during a 14-day field study operating a dual rover exploration mission to better understand the relationships between certain network groups such as ground control, flight teams, and planetary science. The analysis identified two communication network structures for the continuous communication and Twice-a-Day Communication scenarios as a split network and a negotiated network, respectively. The major nodes or groups for the networks' architecture, transmittal status, and information were identified using graphical network mapping, quantitative analysis of subjective impressions, and quantified statistical analysis using Sociometric Status and Centrality. Post-questionnaire analysis along with interviews revealed advantages and disadvantages of each network structure, with team members identifying the need for a more stable continuous communication network, improved robustness of voice loops, and better systems training/capabilities for scientific imagery data and operational data during Twice-a-Day Communications.

  7. Center of attention: A network text analysis of American Sniper

    Directory of Open Access Journals (Sweden)

    Starling Hunter

    2016-06-01

    Full Text Available Network Text Analysis (NTA) is a term used to describe a variety of software-supported methods for modeling texts as networks of concepts. In this study we apply NTA to the screenplay of American Sniper, an Academy Award nominee for Best Adapted Screenplay in 2014. Specifically, we establish prior expectations as to the key themes associated with war films. We then empirically test whether words associated with the most influentially positioned nodes in the network signify themes common to the war-film genre. As predicted, we find that words and concepts associated with the least constrained nodes in the text network were significantly more likely to be associated with the war genre and significantly less likely to be associated with genres to which the film did not belong.
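
    As a hedged sketch of the network-text-analysis step (not the authors' actual pipeline), the Python code below builds a word co-occurrence network from a few invented sentences and ranks concepts by Burt's constraint with networkx, so that the least constrained nodes can be inspected.

        # Sketch of a network text analysis step: build a word co-occurrence
        # graph and rank concepts by Burt's constraint (least constrained first).
        # The toy text below is invented; the original study used a screenplay.
        import itertools
        import networkx as nx

        text = """the sniper watches the street from the rooftop
        his spotter calls the target and the sniper waits
        the convoy moves and the soldiers clear the street"""

        stopwords = {"the", "and", "his", "from"}
        graph = nx.Graph()
        for line in text.splitlines():
            words = [w for w in line.split() if w not in stopwords]
            for a, b in itertools.combinations(set(words), 2):
                graph.add_edge(a, b)

        constraint = nx.constraint(graph)                 # Burt's structural constraint
        ranked = sorted(constraint, key=constraint.get)   # least constrained first
        print("least constrained concepts:", ranked[:5])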

  8. A New Wavelength Optimization and Energy-Saving Scheme Based on Network Coding in Software-Defined WDM-PON Networks

    Science.gov (United States)

    Ren, Danping; Wu, Shanshan; Zhang, Lijing

    2016-09-01

    In view of the global control and flexible monitoring characteristics of software-defined networks (SDN), we propose a new optical access network architecture dedicated to Wavelength Division Multiplexing-Passive Optical Network (WDM-PON) systems based on SDN. Network coding (NC) technology is also applied in this architecture to enhance the utilization of wavelength resources and reduce the cost of light sources. Simulation results show that this scheme can optimize the throughput of the WDM-PON network and greatly reduce the system time delay and energy consumption.

  9. Software for the analysis and simulations of measurements; Software para analise e simulacao de medicoes

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Augusto Cesar Assis; Sarmento, Christiana Lauar; Mota, Geraldo Cesar; Domingos, Marileide Mourao; Belo, Noema Sant`Anna; Alves, Tulio Marcus Machado [Companhia Energetica de Minas Gerais (CEMIG), Belo Horizonte, MG (Brazil)

    1992-12-31

    This paper describes the development of graphic software that analyzes the behaviour of electric power measurements and permits the calculation of `percent errors` derived from measurement inexactness. The software shows, for each situation, the correct link diagram, the measurement diagram, the `percent error` and the graphic behaviour of this error as a function of the power charge factor. 14 figs., 4 refs.

  10. Dataworks for GNSS: Software for Supporting Data Sharing and Federation of Geodetic Networks

    Science.gov (United States)

    Boler, F. M.; Meertens, C. M.; Miller, M. M.; Wier, S.; Rost, M.; Matykiewicz, J.

    2015-12-01

    Continuously-operating Global Navigation Satellite System (GNSS) networks are increasingly being installed globally for a wide variety of science and societal applications. GNSS enables Earth science research in areas including tectonic plate interactions, crustal deformation in response to loading by tectonics, magmatism, water and ice, and the dynamics of water - and thereby energy transfer - in the atmosphere at regional scale. The many individual scientists and organizations that set up GNSS stations globally are often open to sharing data, but lack the resources or expertise to deploy systems and software to manage and curate data and metadata and provide user tools that would support data sharing. UNAVCO previously gained experience in facilitating data sharing through the NASA-supported development of the Geodesy Seamless Archive Centers (GSAC) open source software. GSAC provides web interfaces and simple web services for data and metadata discovery and access, supports federation of multiple data centers, and simplifies transfer of data and metadata to long-term archives. The NSF supported the dissemination of GSAC to multiple European data centers forming the European Plate Observing System. To expand upon GSAC to provide end-to-end, instrument-to-distribution capability, UNAVCO developed Dataworks for GNSS with NSF funding to the COCONet project, and deployed this software on systems that are now operating as Regional GNSS Data Centers as part of the NSF-funded TLALOCNet and COCONet projects. Dataworks consists of software modules written in Python and Java for data acquisition, management and sharing. There are modules for GNSS receiver control and data download, a database schema for metadata, tools for metadata handling, ingest software to manage file metadata, data file management scripts, GSAC, scripts for mirroring station data and metadata from partner GSACs, and extensive software and operator documentation. UNAVCO plans to provide a cloud VM

  11. Applications of Social Network Analysis

    Science.gov (United States)

    Thilagam, P. Santhi

    A social network [2] is a description of the social structure between actors, mostly persons, groups or organizations. It indicates the ways in which they are connected with each other by some relationship such as friendship, kinship, financial exchange, etc. In a nutshell, when a person uses already known or unknown people to create new contacts, social networking is formed. The social network is not a new concept; rather, it is formed whenever similar people interact with each other, directly or indirectly, to perform a particular task. Examples of social networks include friendship networks, collaboration networks, co-authorship networks, and co-worker networks, which depict direct interaction among people. There are also other forms of social networks, such as entertainment networks, business networks, citation networks, and hyperlink networks, in which interaction among people is indirect. Generally, social networks operate on many levels, from families up to the level of nations, and assist in improving interactive knowledge sharing, interoperability and collaboration.

  12. Differential dependency network analysis to identify condition-specific topological changes in biological networks.

    Science.gov (United States)

    Zhang, Bai; Li, Huai; Riggins, Rebecca B; Zhan, Ming; Xuan, Jianhua; Zhang, Zhen; Hoffman, Eric P; Clarke, Robert; Wang, Yue

    2009-02-15

    Significant efforts have been made to acquire data under different conditions and to construct static networks that can explain various gene regulation mechanisms. However, gene regulatory networks are dynamic and condition-specific; under different conditions, networks exhibit different regulation patterns accompanied by different transcriptional network topologies. Thus, an investigation on the topological changes in transcriptional networks can facilitate the understanding of cell development or provide novel insights into the pathophysiology of certain diseases, and help identify the key genetic players that could serve as biomarkers or drug targets. Here, we report a differential dependency network (DDN) analysis to detect statistically significant topological changes in the transcriptional networks between two biological conditions. We propose a local dependency model to represent the local structures of a network by a set of conditional probabilities. We develop an efficient learning algorithm to learn the local dependency model using the Lasso technique. A permutation test is subsequently performed to estimate the statistical significance of each learned local structure. In testing on a simulation dataset, the proposed algorithm accurately detected all the genes with network topological changes. The method was then applied to the estrogen-dependent T-47D estrogen receptor-positive (ER+) breast cancer cell line datasets and human and mouse embryonic stem cell datasets. In both experiments using real microarray datasets, the proposed method produced biologically meaningful results. We expect DDN to emerge as an important bioinformatics tool in transcriptional network analyses. While we focus specifically on transcriptional networks, the DDN method we introduce here is generally applicable to other biological networks with similar characteristics. The DDN MATLAB toolbox and experiment data are available at http://www.cbil.ece.vt.edu/software.htm.
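
    The DDN toolbox referenced above is MATLAB-based; the Python sketch below is a simplified, hedged illustration of the two ingredients named in the abstract: Lasso-based learning of one gene's local dependency structure under each condition, and a permutation test on the difference. The data are synthetic and the regularization strength is arbitrary.

        # Simplified illustration (not the DDN toolbox) of condition-specific
        # neighborhood selection with the Lasso plus a permutation test.
        import numpy as np
        from sklearn.linear_model import Lasso

        rng = np.random.default_rng(0)
        n, genes, target = 60, 8, 0      # 60 samples per condition, 8 genes, test gene 0

        def neighborhood(x, target, alpha=0.1):
            """Lasso coefficients of all other genes predicting the target gene."""
            mask = np.arange(x.shape[1]) != target
            model = Lasso(alpha=alpha).fit(x[:, mask], x[:, target])
            return model.coef_

        # Synthetic data: gene 1 drives gene 0 only in condition B.
        x_a = rng.normal(size=(n, genes))
        x_b = rng.normal(size=(n, genes))
        x_b[:, 0] += 0.9 * x_b[:, 1]

        observed = np.abs(neighborhood(x_a, target) - neighborhood(x_b, target)).max()

        # Permutation test: shuffle condition labels and recompute the difference.
        pooled = np.vstack([x_a, x_b])
        count = 0
        for _ in range(200):
            idx = rng.permutation(2 * n)
            perm = np.abs(neighborhood(pooled[idx[:n]], target)
                          - neighborhood(pooled[idx[n:]], target)).max()
            count += perm >= observed
        print(f"max coefficient change = {observed:.2f}, p = {count / 200:.3f}")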

  13. Understanding complex interactions using social network analysis.

    Science.gov (United States)

    Pow, Janette; Gayen, Kaberi; Elliott, Lawrie; Raeside, Robert

    2012-10-01

    The aim of this paper is to raise awareness of social network analysis as a method to facilitate nursing research. The application of social network analysis in assessing network properties has allowed greater insight to be gained in many areas including sociology, politics, business organisation and health care. However, the use of social networks in nursing has not received sufficient attention. Review of literature and illustration of the application of the method of social network analysis using research examples. First, the value of social networks will be discussed. Then, by using illustrative examples, the value of social network analysis to nursing will be demonstrated. The method of social network analysis is found to give greater insight into social situations involving interactions between individuals and has particular application to the study of interactions between nurses and between nurses and patients and other actors. Social networks are systems in which people interact. Two quantitative techniques help our understanding of these networks. The first is visualisation of the network. The second is centrality. Individuals with high centrality are key communicators in a network. Applying social network analysis to nursing provides a simple method that helps gain an understanding of human interaction and how this might influence various health outcomes. It allows influential individuals (actors) to be identified. Their influence on the formation of social norms and communication can determine the extent to which new interventions or ways of thinking are accepted by a group. Thus, working with key individuals in a network could be critical to the success and sustainability of an intervention. Social network analysis can also help to assess the effectiveness of such interventions for the recipient and the service provider. © 2012 Blackwell Publishing Ltd.
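
    As a hedged, toy illustration of the two quantitative techniques highlighted (visualisation and centrality), the Python sketch below builds a small invented communication network and identifies the key communicator by betweenness centrality using networkx; it is not based on any real nursing dataset.

        # Toy illustration of the two techniques discussed: visualising a network
        # and identifying key communicators by centrality (data are invented).
        import networkx as nx

        edges = [("nurse_A", "nurse_B"), ("nurse_A", "patient_1"),
                 ("nurse_B", "patient_1"), ("nurse_B", "physician"),
                 ("nurse_C", "physician"), ("nurse_C", "patient_2"),
                 ("nurse_A", "nurse_C")]
        g = nx.Graph(edges)

        degree = nx.degree_centrality(g)
        betweenness = nx.betweenness_centrality(g)
        key_actor = max(betweenness, key=betweenness.get)
        print("degree centrality:", {k: round(v, 2) for k, v in degree.items()})
        print("key communicator (highest betweenness):", key_actor)

        # Visualisation (writes a PNG if matplotlib is available).
        try:
            import matplotlib
            matplotlib.use("Agg")
            import matplotlib.pyplot as plt
            nx.draw_networkx(g, with_labels=True)
            plt.savefig("network.png")
        except ImportError:
            pass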

  14. Network meta-analysis, electrical networks and graph theory.

    Science.gov (United States)

    Rücker, Gerta

    2012-12-01

    Network meta-analysis is an active field of research in clinical biostatistics. It aims to combine information from all randomized comparisons among a set of treatments for a given medical condition. We show how graph-theoretical methods can be applied to network meta-analysis. A meta-analytic graph consists of vertices (treatments) and edges (randomized comparisons). We illustrate the correspondence between meta-analytic networks and electrical networks, where variance corresponds to resistance, treatment effects to voltage, and weighted treatment effects to current flows. Based thereon, we then show that graph-theoretical methods that have been routinely applied to electrical networks also work well in network meta-analysis. In more detail, the resulting consistent treatment effects induced in the edges can be estimated via the Moore-Penrose pseudoinverse of the Laplacian matrix. Moreover, the variances of the treatment effects are estimated in analogy to electrical effective resistances. It is shown that this method, being computationally simple, leads to the usual fixed effect model estimate when applied to pairwise meta-analysis and is consistent with published results when applied to network meta-analysis examples from the literature. Moreover, problems of heterogeneity and inconsistency, random effects modeling and including multi-armed trials are addressed. Copyright © 2012 John Wiley & Sons, Ltd.
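
    A minimal numeric sketch of the electrical analogy described, assuming a toy three-treatment network with invented effects and variances: inverse variances act as conductances, the Moore-Penrose pseudoinverse of the Laplacian yields consistent treatment effects, and effective resistances give their variances.

        # Toy network meta-analysis via the electrical-network analogy:
        # inverse variances as conductances, treatment effects via the
        # Moore-Penrose pseudoinverse of the Laplacian. Data are invented.
        import numpy as np

        treatments = ["A", "B", "C"]
        # Direct comparisons (from, to), observed effect, variance.
        edges = [(0, 1, 0.5, 0.04), (1, 2, 0.3, 0.09), (0, 2, 0.9, 0.06)]

        m, n = len(edges), len(treatments)
        b = np.zeros((m, n))                      # edge-vertex incidence matrix
        w = np.zeros(m)                           # weights = inverse variances
        d = np.zeros(m)                           # observed effects
        for k, (i, j, effect, var) in enumerate(edges):
            b[k, i], b[k, j] = -1.0, 1.0
            w[k], d[k] = 1.0 / var, effect

        laplacian = b.T @ np.diag(w) @ b
        l_plus = np.linalg.pinv(laplacian)        # Moore-Penrose pseudoinverse

        potentials = l_plus @ b.T @ np.diag(w) @ d     # "voltages" of treatments
        consistent = b @ potentials                    # consistent edge effects
        resistance = [l_plus[i, i] + l_plus[j, j] - 2 * l_plus[i, j]
                      for i, j, *_ in edges]           # variances of comparisons

        for (i, j, *_), eff, var in zip(edges, consistent, resistance):
            print(f"{treatments[i]} -> {treatments[j]}: "
                  f"consistent effect {eff:+.3f}, variance {var:.3f}")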

  15. Performance analysis of software for identification of intestinal parasites

    Directory of Open Access Journals (Sweden)

    Andressa P. Gomes

    2015-08-01

    Full Text Available Introduction: Intestinal parasites are among the most frequent diagnoses worldwide. An accurate clinical diagnosis of human parasitic infections depends on laboratory confirmation for specific differentiation of the infectious agent. Objectives: To create technological solutions to help parasitological diagnosis, through the construction and use of specific software. Material and method: From the images obtained from the sediment, the software compares the morphometry (area, perimeter and circularity) and uses information on the specific morphological and staining characteristics of parasites, allowing their potential identification. Results: Our results demonstrate satisfactory performance: of a total of 204 images analyzed, 81.86% had the parasite correctly identified by the computer system, and 18.13% could not be identified due to the large amount of fecal debris in the sample evaluated. Discussion: Currently the techniques used in the parasitology area are predominantly manual, and are probably affected by variables such as the attention and experience of the professional. Therefore, computerization in this sector can improve the performance of parasitological analysis. Conclusions: This work contributes to the computerization of the healthcare area and benefits both health professionals and their patients, in addition to providing a more efficient, accurate and secure diagnosis.
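
    The abstract mentions morphometry based on area, perimeter and circularity; as a hedged illustration (not the authors' software), the Python sketch below measures these features for objects in a synthetic binary image with scikit-image, using circularity = 4*pi*area/perimeter^2.

        # Sketch of the morphometric step: area, perimeter and circularity
        # of segmented objects, here on a synthetic binary image.
        import numpy as np
        from skimage.draw import disk
        from skimage.measure import label, regionprops

        image = np.zeros((200, 200), dtype=bool)
        rr, cc = disk((60, 60), 30)              # a round object, circularity near 1
        image[rr, cc] = True
        image[120:140, 30:150] = True            # an elongated rectangular object

        for region in regionprops(label(image)):
            area, perimeter = region.area, region.perimeter
            circularity = 4 * np.pi * area / perimeter ** 2 if perimeter else 0.0
            print(f"area={area:7.0f}  perimeter={perimeter:8.1f}  "
                  f"circularity={circularity:.2f}")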

  16. Phenomenology and Qualitative Data Analysis Software (QDAS: A Careful Reconciliation

    Directory of Open Access Journals (Sweden)

    Brian Kelleher Sohn

    2017-01-01

    Full Text Available An oft-cited phenomenological methodologist, Max VAN MANEN (2014), claims that qualitative data analysis software (QDAS) is not an appropriate tool for phenomenological research. Yet phenomenologists rarely describe how phenomenology is to be done: pencil, paper, computer? DAVIDSON and DI GREGORIO (2011) urge QDAS contrarians such as VAN MANEN to get over their methodological loyalties and join the digital world, claiming that all qualitative researchers, whatever their methodology, perform processes aided by QDAS: disaggregation and recontextualization of texts. Other phenomenologists exemplify DAVIDSON and DI GREGORIO's observation that arguments against QDAS often identify problems more closely related to the researchers than to QDAS. But the concerns about technology of McLUHAN (2003 [1964]), HEIDEGGER (2008 [1977]), and FLUSSER (2013) cannot be ignored. In this conceptual article I answer the questions of phenomenologists and the call of QDAS methodologists to describe how I used QDAS to carry out a phenomenological study in order to guide others who choose to reconcile the use of software to assist their research. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1701142

  17. Topographic analysis of eyelid position using digital image processing software.

    Science.gov (United States)

    Chun, Yeoun Sook; Park, Hong Hyun; Park, In Ki; Moon, Nam Ju; Park, Sang Joon; Lee, Jeong Kyu

    2017-11-01

    To propose a novel analysis technique for objective quantification of topographic eyelid position with an algorithmically calculated scheme and to determine its feasibility. One hundred normal eyelids from 100 patients were segmented using a graph cut algorithm, and 11 shape features of the eyelids were semi-automatically quantified using in-house software. To evaluate the intra- and inter-examiner reliability of this software, intra-class correlation coefficients (ICCs) were used. To evaluate the diagnostic value of this scheme, the correlations between semi-automatic and manual measurements of margin reflex distance 1 (MRD1) and margin reflex distance 2 (MRD2) were analysed using a Bland-Altman analysis. To determine the degree of agreement according to manual MRD length, the relationship between the variance of the semi-automatic measurements and the manual measurements was evaluated using linear regression. Intra- and inter-examiner reliability were excellent, with ICCs ranging from 0.913 to 0.980 for the 11 shape features, including MRD1, MRD2, palpebral fissure, lid perimeter, upper and lower lid lengths, roundness, total area, and medial, central, and lateral areas. The correlations between semi-automatic and manual MRDs were also excellent, with a better correlation for MRD1 than for MRD2 (R = 0.893 and 0.823, respectively). In addition, significant positive relationships were observed between the variance and the length of MRD1 and MRD2; the longer the MRD, the greater the variance. The proposed novel optimized integrative scheme, which is shown to have high repeatability and reproducibility, is useful for topographic analysis of eyelid position. © 2017 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
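
    A small numeric sketch of the Bland-Altman agreement computation referred to above, with invented paired MRD1 values in millimetres (not the study's data): it reports the mean bias and the 95% limits of agreement.

        # Bland-Altman agreement between semi-automatic and manual MRD1 values
        # (all measurements below are invented, in millimetres).
        import numpy as np

        semi_automatic = np.array([3.1, 2.8, 4.0, 3.5, 2.9, 3.7, 4.2, 3.3])
        manual         = np.array([3.3, 2.6, 4.3, 3.4, 3.1, 3.6, 4.6, 3.1])

        diff = semi_automatic - manual
        mean = (semi_automatic + manual) / 2
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)             # 95% limits of agreement

        print(f"bias = {bias:+.2f} mm")
        print(f"limits of agreement: {bias - loa:+.2f} to {bias + loa:+.2f} mm")
        print("mean of pairs (x-axis of the Bland-Altman plot):", mean.round(2))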

  18. Visual data mining and analysis of software repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    2007-01-01

    In this article we describe an ongoing effort to integrate information visualization techniques into the process of configuration management for software systems. Our focus is to help software engineers manage the evolution of large and complex software systems by offering them effective and

  19. Multi-user software of radio therapeutical calculation using a computational network; Software multiusuario de calculo radioterapeutico usando una red de computo

    Energy Technology Data Exchange (ETDEWEB)

    Allaucca P, J.J.; Picon C, C.; Zaharia B, M. [Departamento de Radioterapia, Instituto de Enfermedades Neoplasicas, Av. Angamos Este 2520, Lima 34 (Peru)

    1998-12-31

    A hardware and software system has been designed for a radiotherapy department. It runs on a Novell Network operating system platform, sharing the existing resources of the server; it is centralized, multi-user and offers greater safety. It addresses a variety of problems and calculation needs, patient workflow and administration; it is very fast and versatile, and contains a set of menus and options which may be selected with the mouse, arrow keys or shortcut keys. (Author)

  20. THE NEED AND KEYS FOR A NEW GENERATION NETWORK ADJUSTMENT SOFTWARE

    Directory of Open Access Journals (Sweden)

    I. Colomina

    2012-07-01

    Full Text Available Orientation and calibration of photogrammetric and remote sensing instruments is a fundamental capacity of current mapping systems and a fundamental research topic. Neither digital remote sensing acquisition systems nor direct orientation gear, like INS and GNSS technologies, has made block adjustment obsolete. On the contrary, the continuous flow of new primary data acquisition systems has challenged the capacity of legacy block adjustment systems – in general, network adjustment systems – in many aspects: extensibility, genericity, portability, large data set capacity, metadata support and many others. In this article, we concentrate on the extensibility and genericity challenges that current and future network systems shall face. For this purpose we propose a number of software design strategies, with emphasis on rigorous abstract modeling, that help in achieving simplicity, genericity and extensibility together with the protection of intellectual property rights in a flexible manner. We illustrate our suggestions with the general design approach of GENA, the generic extensible network adjustment system of GeoNumerics.

  1. Spectrum-space-divided spectrum allocation approaches in software-defined elastic optical networks

    Science.gov (United States)

    Chen, Bowen; Yu, Xiaosong; Zhao, Yongli

    2017-08-01

    Recently, the architecture of elastic optical network (EON) has been proposed as a candidate solution to accommodate both huge bandwidth requirements and flexible connections in next generation optical networks. In order to improve the spectrum efficiency, we propose different spectrum-space-divided approaches and develop two integer linear programming (ILP) models and several spectrum-space-divided spectrum allocation approaches with and without dedicated-path protection in software-defined elastic optical networks (SD-EONs). Simulation results show that the ILP models achieve better performance in terms of the number of frequency slots and hop counts than the proposed spectrum-space-divided spectrum allocation approaches with and without dedicated-path protection under the static scenario of connection requests. Furthermore, we apply the spectrum-space-divided spectrum allocation approaches with and without dedicated-path protection to reduce the blocking probability and to improve spectrum efficiency under the dynamic connection requests compared to the traditional first-fit spectrum allocation approach in SD-EONs.
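
    The ILP models themselves are not reproduced here; as a hedged baseline illustration, the Python sketch below implements the traditional first-fit spectrum allocation used as the comparison point: each request takes the lowest-indexed block of contiguous slots that is free on every link of its route. The topology, slot count and demands are invented.

        # First-fit spectrum allocation baseline (the comparison point in the
        # abstract): find the lowest-indexed block of contiguous slots that is
        # free on every link of the route. Topology and demands are invented.
        NUM_SLOTS = 16

        def first_fit(route_links, demand_slots, occupancy):
            """Return the starting slot index, or None if the request is blocked."""
            for start in range(NUM_SLOTS - demand_slots + 1):
                block = range(start, start + demand_slots)
                if all(not occupancy[link][s] for link in route_links for s in block):
                    for link in route_links:                 # allocate the block
                        for s in block:
                            occupancy[link][s] = True
                    return start
            return None                                      # blocked request

        links = ["A-B", "B-C", "C-D"]
        occupancy = {link: [False] * NUM_SLOTS for link in links}

        requests = [(["A-B", "B-C"], 4), (["B-C", "C-D"], 3), (["A-B", "B-C", "C-D"], 6)]
        for route, size in requests:
            start = first_fit(route, size, occupancy)
            status = f"slots {start}-{start + size - 1}" if start is not None else "blocked"
            print(f"route {route}, {size} slots -> {status}")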

  2. Dynamic Construction Scheme for Virtualization Security Service in Software-Defined Networks.

    Science.gov (United States)

    Lin, Zhaowen; Tao, Dan; Wang, Zhenji

    2017-04-21

    For a Software Defined Network (SDN), security is an important factor affecting its large-scale deployment. The existing security solutions for SDN mainly focus on the controller itself, which has to handle all the security protection tasks by using the programmability of the network. This will undoubtedly involve a heavy burden for the controller. More devastatingly, once the controller itself is attacked, the entire network will be paralyzed. Motivated by this, this paper proposes a novel security protection architecture for SDN. We design a security service orchestration center in the control plane of SDN, and this center physically decouples from the SDN controller and constructs SDN security services. We adopt virtualization technology to construct a security meta-function library, and propose a dynamic security service composition construction algorithm based on web service composition technology. The rule-combining method is used to combine security meta-functions to construct security services which meet the requirements of users. Moreover, the RETE algorithm is introduced to improve the efficiency of the rule-combining method. We evaluate our solutions in a realistic scenario based on OpenStack. Substantial experimental results demonstrate the effectiveness of our solutions that contribute to achieve the effective security protection with a small burden of the SDN controller.

  3. Capturing complexity: Mixing methods in the analysis of a European tobacco control policy network.

    Science.gov (United States)

    Weishaar, Heide; Amos, Amanda; Collin, Jeff

    Social network analysis (SNA), a method which can be used to explore networks in various contexts, has received increasing attention. Drawing on the development of European smoke-free policy, this paper explores how a mixed method approach to SNA can be utilised to investigate a complex policy network. Textual data from public documents, consultation submissions and websites were extracted, converted and analysed using plagiarism detection software and quantitative network analysis, and qualitative data from public documents and 35 interviews were thematically analysed. While the quantitative analysis enabled understanding of the network's structure and components, the qualitative analysis provided in-depth information about specific actors' positions, relationships and interactions. The paper establishes that SNA is suited to empirically testing and analysing networks in EU policymaking. It contributes to methodological debates about the antagonism between qualitative and quantitative approaches and demonstrates that qualitative and quantitative network analysis can offer a powerful tool for policy analysis.

  4. ViPAR: a software platform for the Virtual Pooling and Analysis of Research Data.

    Science.gov (United States)

    Carter, Kim W; Francis, Richard W; Carter, K W; Francis, R W; Bresnahan, M; Gissler, M; Grønborg, T K; Gross, R; Gunnes, N; Hammond, G; Hornig, M; Hultman, C M; Huttunen, J; Langridge, A; Leonard, H; Newman, S; Parner, E T; Petersson, G; Reichenberg, A; Sandin, S; Schendel, D E; Schalkwyk, L; Sourander, A; Steadman, C; Stoltenberg, C; Suominen, A; Surén, P; Susser, E; Sylvester Vethanayagam, A; Yusof, Z

    2015-10-08

    Research studies exploring the determinants of disease require sufficient statistical power to detect meaningful effects. Sample size is often increased through centralized pooling of disparately located datasets, though ethical, privacy and data ownership issues can often hamper this process. Methods that facilitate the sharing of research data that are sympathetic with these issues and which allow flexible and detailed statistical analyses are therefore in critical need. We have created a software platform for the Virtual Pooling and Analysis of Research data (ViPAR), which employs free and open source methods to provide researchers with a web-based platform to analyse datasets housed in disparate locations. Database federation permits controlled access to remotely located datasets from a central location. The Secure Shell protocol allows data to be securely exchanged between devices over an insecure network. ViPAR combines these free technologies into a solution that facilitates 'virtual pooling' where data can be temporarily pooled into computer memory and made available for analysis without the need for permanent central storage. Within the ViPAR infrastructure, remote sites manage their own harmonized research dataset in a database hosted at their site, while a central server hosts the data federation component and a secure analysis portal. When an analysis is initiated, requested data are retrieved from each remote site and virtually pooled at the central site. The data are then analysed by statistical software and, on completion, results of the analysis are returned to the user and the virtually pooled data are removed from memory. ViPAR is a secure, flexible and powerful analysis platform built on open source technology that is currently in use by large international consortia, and is made publicly available at [http://bioinformatics.childhealthresearch.org.au/software/vipar/]. © The Author 2015. Published by Oxford University Press on behalf of the

  5. Statistical Analysis of Bus Networks in India

    CERN Document Server

    Chatterjee, Atanu; Ramadurai, Gitakrishnan

    2015-01-01

    Over the past decade, the field of network science has established itself as a common ground for the cross-fertilization of exciting interdisciplinary studies, which has motivated researchers to model almost every physical system as an interacting network consisting of nodes and links. Although public transport networks such as airline and railway networks have been extensively studied, the status of bus networks still remains in obscurity. In developing countries like India, where bus networks play an important role in day-to-day commuting, it is of significant interest to analyze their topological structure and answer some of the basic questions on their evolution, growth, robustness and resiliency. In this paper, we model the bus networks of major Indian cities as graphs in L-space, and evaluate their various statistical properties using concepts from network science. Our analysis reveals a wide spectrum of network topology with the common underlying feature of the small-world property. We observe tha...

  6. Integrated Software Environment for Pressurized Thermal Shock Analysis

    Directory of Open Access Journals (Sweden)

    Dino Araneo

    2011-01-01

    Full Text Available The present paper describes the main features, and an application to a real Nuclear Power Plant (NPP), of an Integrated Software Environment (in the following referred to as “platform”) developed at the University of Pisa (UNIPI) to perform Pressurized Thermal Shock (PTS) analysis. The platform is written in Java for portability, and it implements all the steps foreseen in the methodology developed at UNIPI for the deterministic analysis of PTS scenarios. The methodology starts with the thermal hydraulic analysis of the NPP with a system code (such as Relap5-3D or Cathare2) during a selected transient scenario. The results so obtained are then processed to provide boundary conditions for the next step, that is, a CFD calculation. Once the system pressure and the RPV wall temperature are known, the stresses inside the RPV wall can be calculated by means of a Finite Element (FE) code. The last step of the methodology is the Fracture Mechanics (FM) analysis, using weight functions, aimed at evaluating the stress intensity factor (KI) at the crack tip to be compared with the critical stress intensity factor KIc. The platform automates all the steps foreseen in the methodology once the user specifies a number of boundary conditions at the beginning of the simulation.

  7. Image analysis software for following progression of peripheral neuropathy

    Science.gov (United States)

    Epplin-Zapf, Thomas; Miller, Clayton; Larkin, Sean; Hermesmeyer, Eduardo; Macy, Jenny; Pellegrini, Marco; Luccarelli, Saverio; Staurenghi, Giovanni; Holmes, Timothy

    2009-02-01

    A relationship has been reported by several research groups [1 - 4] between the density and shapes of nerve fibers in the cornea and the existence and severity of peripheral neuropathy. Peripheral neuropathy is a complication of several prevalent diseases or conditions, which include diabetes, HIV, prolonged alcohol overconsumption and aging. A common clinical technique for confirming the condition is intramuscular electromyography (EMG), which is invasive, so a noninvasive technique like the one proposed here carries important potential advantages for the physician and patient. A software program that automatically detects the nerve fibers, counts them and measures their shapes is being developed and tested. Tests were carried out with a database of subjects with levels of severity of diabetic neuropathy as determined by EMG testing. Results from this testing, that include a linear regression analysis are shown.

  8. Modular Open-Source Software for Item Factor Analysis.

    Science.gov (United States)

    Pritikin, Joshua N; Hunter, Micheal D; Boker, Steven

    2015-06-01

    This paper introduces an Item Factor Analysis (IFA) module for OpenMx, a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation and manipulation of models. Modular organization of the source code facilitates the easy addition of item models, item parameter estimation algorithms, optimizers, test scoring algorithms, and fit diagnostics all within an integrated framework. Three short example scripts are presented for fitting item parameters, latent distribution parameters, and a multiple group model. The availability of both IFA and structural equation modeling in the same software is a step toward the unification of these two methodologies.

  9. Cost Analysis of Poor Quality Using a Software Simulation

    Directory of Open Access Journals (Sweden)

    Jana Fabianová

    2017-02-01

    Full Text Available The issues of quality, the cost of poor quality and the factors affecting quality are crucial to maintaining competitiveness in business activities. The use of software applications and computer simulation enables more effective quality management. Simulation tools allow the variability of multiple variables to be incorporated in experiments and their joint impact on the final output to be evaluated. The article presents a case study focused on the possibility of using Monte Carlo computer simulation in the field of quality management. Two approaches for determining the cost of poor quality are introduced here. One takes a retrospective view, where the cost of poor quality and the production process are calculated based on historical data. The second approach uses the probabilistic characteristics of the input variables by means of simulation and provides a prospective view of the costs of poor quality. Simulation output in the form of tornado and sensitivity charts complements the risk analysis.
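
    A minimal sketch of the prospective, simulation-based approach described, assuming invented distributions for the defect rate, rework cost and scrap share: Monte Carlo sampling with numpy yields the distribution of the annual cost of poor quality.

        # Prospective (Monte Carlo) estimate of the cost of poor quality.
        # All distribution parameters below are invented for illustration.
        import numpy as np

        rng = np.random.default_rng(42)
        n_runs = 100_000

        annual_volume = 50_000                               # units produced per year
        defect_rate = rng.beta(2, 50, n_runs)                # share of defective units
        rework_cost = rng.triangular(8, 12, 20, n_runs)      # cost per reworked unit
        scrap_share = rng.uniform(0.05, 0.15, n_runs)        # defects that are scrapped
        scrap_cost = 35.0                                    # cost per scrapped unit

        defects = annual_volume * defect_rate
        cost = defects * ((1 - scrap_share) * rework_cost + scrap_share * scrap_cost)

        print(f"mean annual cost of poor quality: {cost.mean():,.0f}")
        print(f"5th-95th percentile: {np.percentile(cost, 5):,.0f} "
              f"- {np.percentile(cost, 95):,.0f}")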

  10. Knowledge-based requirements analysis for automating software development

    Science.gov (United States)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.

  11. Study of co-authorship network of papers in the Journal of Research in Medical Sciences using social network analysis

    Directory of Open Access Journals (Sweden)

    Firoozeh Zare-Farashbandi

    2014-01-01

    Full Text Available Background: Co-authorship is one of the most tangible forms of research collaboration. A co-authorship network is a social network in which authors are linked to each other, directly or through an indirect path, by participation in one or more publications. The present research used social network analysis to study the co-authorship network of 681 articles published in the Journal of Research in Medical Sciences (JRMS) during 2008-2012. Materials and Methods: The study was carried out with a scientometric approach using co-authorship network analysis. The topology of the co-authorship network of the 681 articles published in JRMS between 2008 and 2012 was analyzed using macro-level network metrics such as density, clustering coefficient, components and mean distance. In addition, in order to evaluate the performance of each author and country in the network, micro-level indicators such as degree centrality, closeness centrality and betweenness centrality, as well as a productivity index, were used. The UCINET and NetDraw software packages were used to draw and analyze the co-authorship network of the papers. Results: The assessment of the authors' productivity in this journal showed that the first ranks belonged to only five authors. Furthermore, analysis of co-authorship in the network demonstrated that, on the betweenness centrality index, three of these authors held strong positions in the network. They can be considered network leaders, able to control the flow of information in the network compared with the other members based on the shortest paths. On the other hand, the key roles in the network, according to the productivity and centrality indexes, belonged to Iran, Malaysia and the United States of America. Conclusion: The co-authorship network of JRMS has the characteristics of a small-world network. In addition, the theory of six degrees of separation also holds in this network.
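
    The study used UCINET and NetDraw; the Python sketch below shows the same kind of computation on an invented set of papers: a weighted co-authorship graph is built from author lists, and the degree, closeness and betweenness centrality leaders are reported with networkx.

        # Building a co-authorship network from author lists and computing the
        # centrality indices used in the study (the papers below are invented).
        import itertools
        import networkx as nx

        papers = [["Author1", "Author2"], ["Author1", "Author3", "Author4"],
                  ["Author2", "Author3"], ["Author4", "Author5"],
                  ["Author1", "Author5"]]

        g = nx.Graph()
        for authors in papers:
            for a, b in itertools.combinations(authors, 2):
                if g.has_edge(a, b):
                    g[a][b]["weight"] += 1            # repeated collaboration
                else:
                    g.add_edge(a, b, weight=1)

        for name, metric in [("degree", nx.degree_centrality(g)),
                             ("closeness", nx.closeness_centrality(g)),
                             ("betweenness", nx.betweenness_centrality(g))]:
            top = max(metric, key=metric.get)
            print(f"{name:11s} centrality leader: {top} ({metric[top]:.2f})")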

  12. Egocentric social network analysis of pathological gambling.

    Science.gov (United States)

    Meisel, Matthew K; Clifton, Allan D; Mackillop, James; Miller, Joshua D; Campbell, W Keith; Goodie, Adam S

    2013-03-01

    We applied social network analysis (SNA) to investigate whether the frequency and severity of gambling problems were associated with different network characteristics among friends, family and co-workers. SNA is an innovative way to look at relationships among individuals, and the current study was, to our knowledge, the first to apply it to gambling behaviors. Egocentric social network analysis was used to formally characterize the relationships between social network characteristics and gambling pathology. Data were collected through laboratory-based questionnaire and interview administration. Forty frequent gamblers (22 non-pathological gamblers, 18 pathological gamblers) were recruited from the community. The SNA revealed significant social network compositional differences between the two groups: pathological gamblers (PGs) had more gamblers, smokers and drinkers in their social networks than did non-pathological gamblers (NPGs). PGs also had more individuals in their network with whom they personally gambled, smoked and drank than did NPGs. Network ties were closer to individuals in their networks who gambled, smoked and drank more frequently. Associations between gambling severity and structural network characteristics were not significant. Pathological gambling is associated with compositional but not structural differences in social networks. Pathological gamblers differ from non-pathological gamblers in the number of gamblers, smokers and drinkers in their social networks. Homophily within the networks also indicates that gamblers tend to be closer to other gamblers. This homophily may serve to reinforce addictive behaviors and may suggest avenues for future study or intervention. © 2012 The Authors, Addiction © 2012 Society for the Study of Addiction.

  13. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    Science.gov (United States)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience in building software for satellite applications is just beginning. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies such as TSP (Team Software Process) and SCRUM in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II, supported by the ESA PSS-05-0 standard, in particular ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and on how these methodologies could be used together with the ESA PSS-05-0 standards. In general, our outcomes may be used by teams that need to build small satellites; in particular, they will be used when we build the on-board software applications for the SATEX-II.

  14. ABS-TrustSDN: An Agent-Based Simulator of Trust Strategies in Software-Defined Networks

    Directory of Open Access Journals (Sweden)

    Iván García-Magariño

    2017-01-01

    Full Text Available Software-defined networks (SDNs) have become a mechanism to separate the control plane and the data plane in network communication. SDNs involve several challenges around their security and confidentiality. Ideally, SDNs should incorporate autonomous and adaptive systems for controlling the routing, so as to be able to isolate network resources that may be malfunctioning or whose security has been compromised by malware. The current work introduces a novel agent-based framework that simulates SDN isolation protocols by means of trust and reputation models. This way, SDN programmers may estimate the repercussions of certain isolation protocols based on trust models before actually deploying the protocol into the network.
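
    The trust-and-reputation idea behind such isolation protocols can be illustrated with a toy controller-side loop. The threshold, penalty/reward values and data structures below are illustrative assumptions only; they are not taken from the ABS-TrustSDN implementation.

```python
# Toy illustration of trust-based isolation of SDN switches: each switch's trust
# score drops on anomalous behaviour and the switch is isolated below a threshold.
# All parameters and structures are illustrative, not taken from ABS-TrustSDN.
TRUST_THRESHOLD = 0.3
PENALTY = 0.2
REWARD = 0.05

trust = {"s1": 1.0, "s2": 1.0, "s3": 1.0}
isolated = set()

def report(switch, anomalous):
    """Update a switch's trust score from one monitoring report."""
    if anomalous:
        trust[switch] = max(0.0, trust[switch] - PENALTY)
    else:
        trust[switch] = min(1.0, trust[switch] + REWARD)
    if trust[switch] < TRUST_THRESHOLD and switch not in isolated:
        isolated.add(switch)   # e.g. remove its flow rules / reroute around it

# Simulated monitoring reports: s2 repeatedly misbehaves, s1 behaves normally.
for _ in range(5):
    report("s2", anomalous=True)
    report("s1", anomalous=False)
print(trust, "isolated:", isolated)
```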

  15. GENES - a software package for analysis in experimental statistics and quantitative genetics

    Directory of Open Access Journals (Sweden)

    Cosme Damião Cruz

    2013-06-01

    Full Text Available GENES is a software package used for data analysis and processing with different biometric models and is essential in genetic studies applied to plant and animal breeding. It allows parameter estimation to analyze biological phenomena and is fundamental for the decision-making process and predictions of success and viability of selection strategies. The program can be downloaded from the Internet (http://www.ufv.br/dbg/genes/genes.htm or http://www.ufv.br/dbg/biodata.htm) and is available in Portuguese, English and Spanish. Specific literature (http://www.livraria.ufv.br/) and a set of sample files are also provided, making GENES easy to use. The software is integrated into the programs MS Word, MS Excel and Paint, ensuring simplicity and effectiveness in data import and export of results, figures and data. It is also compatible with the free software R and Matlab, through the supply of useful scripts available for complementary analyses in different areas, including genome-wide selection, prediction of breeding values and use of neural networks in genetic improvement.

  16. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
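
    The reduced-form step described above can be sketched as follows: synthetic consequence estimates produced by many CGE-style runs are regressed on the threat characteristics, so that a user only has to evaluate a single equation. The variables, coefficients and data below are invented for illustration; this is not E-CAT's actual specification.

```python
# Sketch: fitting a "reduced form" regression on synthetic CGE-style output.
# The explanatory variables and the data-generating rule are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_runs = 500

# Pretend threat characteristics sampled for each simulated run.
duration = rng.uniform(1, 30, n_runs)        # days of disruption
severity = rng.uniform(0.1, 1.0, n_runs)     # fraction of capacity lost
resilience = rng.uniform(0.0, 0.8, n_runs)   # behavioural/resilience adjustment

# Synthetic "GDP loss" standing in for the complex model's output (with noise).
gdp_loss = 2.0 * duration * severity * (1 - resilience) + rng.normal(0, 1, n_runs)

# Reduced-form model: ordinary least squares on the key explanatory variables.
X = np.column_stack([np.ones(n_runs), duration, severity, resilience])
beta, *_ = np.linalg.lstsq(X, gdp_loss, rcond=None)
print("fitted coefficients:", beta)

# A user can now score a new scenario without rerunning the complex model.
new_scenario = np.array([1.0, 10.0, 0.5, 0.4])
print("approximate GDP loss:", new_scenario @ beta)
```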

  17. The Connectome Visualization Utility: Software for Visualization of Human Brain Networks

    NARCIS (Netherlands)

    LaPlante, R.A.; Douw, L.; Tang, W.; Stufflebeam, S.M.

    2014-01-01

    In analysis of the human connectome, the connectivity of the human brain is collected from multiple imaging modalities and analyzed using graph theoretical techniques. The dimensionality of human connectivity data is high, and making sense of the complex networks in connectomics requires

  18. Satellite image analysis using neural networks

    Science.gov (United States)

    Sheldon, Roger A.

    1990-01-01

    The tremendous backlog of unanalyzed satellite data necessitates the development of improved methods for data cataloging and analysis. Ford Aerospace has developed an image analysis system, SIANN (Satellite Image Analysis using Neural Networks) that integrates the technologies necessary to satisfy NASA's science data analysis requirements for the next generation of satellites. SIANN will enable scientists to train a neural network to recognize image data containing scenes of interest and then rapidly search data archives for all such images. The approach combines conventional image processing technology with recent advances in neural networks to provide improved classification capabilities. SIANN allows users to proceed through a four step process of image classification: filtering and enhancement, creation of neural network training data via application of feature extraction algorithms, configuring and training a neural network model, and classification of images by application of the trained neural network. A prototype experimentation testbed was completed and applied to climatological data.
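
    The four-step workflow described above (filtering/enhancement, feature extraction, neural network training, classification) can be sketched generically with scikit-learn. The enhancement, features and classifier below are simple stand-ins, not the SIANN implementation.

```python
# Generic sketch of the four-step workflow: enhance images, extract features,
# train a neural network, then classify. Toy features/model, not SIANN itself.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)

def enhance(image):
    # Step 1: filtering/enhancement (here: simple intensity clipping).
    return np.clip(image, -3.0, 3.0)

def extract_features(image):
    # Step 2: feature extraction (here: coarse statistics as a toy feature vector).
    return np.array([image.mean(), image.std(), image.max(), image.min()])

# Toy training set: 'scene of interest' images are brighter on average.
images = [rng.normal(loc=lbl, scale=1.0, size=(32, 32)) for lbl in (0, 1) for _ in range(50)]
labels = [lbl for lbl in (0, 1) for _ in range(50)]
X = np.array([extract_features(enhance(img)) for img in images])

# Step 3: configure and train the neural network.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, labels)

# Step 4: classify a new image.
new_image = rng.normal(loc=1, scale=1.0, size=(32, 32))
print("scene of interest?", clf.predict([extract_features(enhance(new_image))])[0])
```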

  19. Social Network Analysis and informal trade

    DEFF Research Database (Denmark)

    Walther, Olivier

    networks can be applied to better understand informal trade in developing countries, with a particular focus on Africa. The paper starts by discussing some of the fundamental concepts developed by social network analysis. Through a number of case studies, we show how social network analysis can...... illuminate the relevant causes of social patterns, the impact of social ties on economic performance, the diffusion of resources and information, and the exercise of power. The paper then examines some of the methodological challenges of social network analysis and how it can be combined with other...

  20. Software use cases to elicit the software requirements analysis within the ASTRI project

    Science.gov (United States)

    Conforti, Vito; Antolini, Elisa; Bonnoli, Giacomo; Bruno, Pietro; Bulgarelli, Andrea; Capalbi, Milvia; Fioretti, Valentina; Fugazza, Dino; Gardiol, Daniele; Grillo, Alessandro; Leto, Giuseppe; Lombardi, Saverio; Lucarelli, Fabrizio; Maccarone, Maria Concetta; Malaguti, Giuseppe; Pareschi, Giovanni; Russo, Federico; Sangiorgi, Pierluca; Schwarz, Joseph; Scuderi, Salvatore; Tanci, Claudio; Tosti, Gino; Trifoglio, Massimo; Vercellone, Stefano; Zanmar Sanchez, Ricardo

    2016-07-01

    The Italian National Institute for Astrophysics (INAF) is leading the Astrofisica con Specchi a Tecnologia Replicante Italiana (ASTRI) project whose main purpose is the realization of small size telescopes (SST) for the Cherenkov Telescope Array (CTA). The first goal of the ASTRI project has been the development and operation of an innovative end-to-end telescope prototype using a dual-mirror optical configuration (SST-2M) equipped with a camera based on silicon photo-multipliers and very fast read-out electronics. The ASTRI SST-2M prototype has been installed in Italy at the INAF "M.G. Fracastoro" Astronomical Station located at Serra La Nave, on Mount Etna, Sicily. This prototype will be used to test several mechanical, optical, control hardware and software solutions which will be used in the ASTRI mini-array, comprising nine telescopes proposed to be placed at the CTA southern site. The ASTRI mini-array is a collaborative and international effort led by INAF and carried out by Italy, Brazil and South-Africa. We present here the use cases, through UML (Unified Modeling Language) diagrams and text details, that describe the functional requirements of the software that will manage the ASTRI SST-2M prototype, and the lessons learned thanks to these activities. We intend to adopt the same approach for the Mini Array Software System that will manage the ASTRI miniarray operations. Use cases are of importance for the whole software life cycle; in particular they provide valuable support to the validation and verification activities. Following the iterative development approach, which breaks down the software development into smaller chunks, we have analysed the requirements, developed, and then tested the code in repeated cycles. The use case technique allowed us to formalize the problem through user stories that describe how the user procedurally interacts with the software system. Through the use cases we improved the communication among team members, fostered

  1. Easy Handling of Sensors and Actuators over TCP/IP Networks by Open Source Hardware/Software.

    Science.gov (United States)

    Mejías, Andrés; Herrera, Reyes S; Márquez, Marco A; Calderón, Antonio José; González, Isaías; Andújar, José Manuel

    2017-01-05

    There are several specific solutions for accessing sensors and actuators present in any process or system through a TCP/IP network, either local or a wide area type like the Internet. The usage of sensors and actuators of different nature and diverse interfaces (SPI, I2C, analogue, etc.) makes access to them from a network in a homogeneous and secure way more complex. A framework, including both software and hardware resources, is necessary to simplify and unify networked access to these devices. In this paper, a set of open-source software tools, specifically designed to cover the different issues concerning the access to sensors and actuators, and two proposed low-cost hardware architectures to operate with the abovementioned software tools are presented. They allow integrated and easy access to local or remote sensors and actuators. The software tools, integrated in the free authoring tool Easy Java and Javascript Simulations (EJS) solve the interaction issues between the subsystem that integrates sensors and actuators into the network, called convergence subsystem in this paper, and the Human Machine Interface (HMI)-this one designed using the intuitive graphical system of EJS-located on the user's computer. The proposed hardware architectures and software tools are described and experimental implementations with the proposed tools are presented.

  2. Easy Handling of Sensors and Actuators over TCP/IP Networks by Open Source Hardware/Software

    Directory of Open Access Journals (Sweden)

    Andrés Mejías

    2017-01-01

    Full Text Available There are several specific solutions for accessing sensors and actuators present in any process or system through a TCP/IP network, either local or a wide area type like the Internet. The usage of sensors and actuators of different nature and diverse interfaces (SPI, I2C, analogue, etc.) makes access to them from a network in a homogeneous and secure way more complex. A framework, including both software and hardware resources, is necessary to simplify and unify networked access to these devices. In this paper, a set of open-source software tools, specifically designed to cover the different issues concerning the access to sensors and actuators, and two proposed low-cost hardware architectures to operate with the abovementioned software tools are presented. They allow integrated and easy access to local or remote sensors and actuators. The software tools, integrated in the free authoring tool Easy Java and Javascript Simulations (EJS), solve the interaction issues between the subsystem that integrates sensors and actuators into the network, called convergence subsystem in this paper, and the Human Machine Interface (HMI), this one designed using the intuitive graphical system of EJS and located on the user's computer. The proposed hardware architectures and software tools are described and experimental implementations with the proposed tools are presented.
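
    A minimal client/server exchange of the kind such a convergence subsystem mediates (reading one sensor value over TCP/IP) might look like the sketch below. The port number and the one-line text protocol are invented for illustration and are unrelated to the EJS tools described in the record.

```python
# Minimal sketch of reading a sensor value over TCP/IP. The port number and the
# one-line text protocol are invented; they are not part of EJS or this framework.
import socket
import threading

HOST, PORT = "127.0.0.1", 5050
srv = socket.create_server((HOST, PORT))   # listening socket, created up front

def sensor_server():
    """Pretend convergence subsystem: answers one connection with one reading."""
    conn, _ = srv.accept()
    with conn:
        conn.sendall(b"temperature=21.5\n")

threading.Thread(target=sensor_server, daemon=True).start()

# HMI side: connect, read one line, parse the value.
with socket.create_connection((HOST, PORT)) as client:
    line = client.makefile().readline().strip()
name, value = line.split("=")
print(name, float(value))
srv.close()
```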

  3. Social network analysis and supply chain management

    Directory of Open Access Journals (Sweden)

    Raúl Rodríguez Rodríguez

    2016-01-01

    Full Text Available This paper deals with social network analysis and how it could be integrated within supply chain management from a decision-making point of view. Even though the benefits of using social network analysis are widely accepted in both academic and industry/services contexts, there is still a lack of solid frameworks that allow decision-makers to connect the usage and results of social network analysis (mainly information and knowledge flows and derived results) with supply chain management objectives and goals. This paper gives an overview of social network analysis and its main metrics, discusses supply chain performance and, finally, identifies how future frameworks could close the gap and link the results of social network analysis with the supply chain management decision-making processes.

  4. Analisis Simulasi Penerapan Algoritma OSPF Menggunakan RouteFlow pada Jaringan Software Defined Network (SDN

    Directory of Open Access Journals (Sweden)

    Ridha Muldina Negara

    2017-02-01

    Full Text Available In conventional networks, routing protocol configuration is inflexible and inefficient, and must be performed on each device individually. This obviously cannot meet today's operational demands, where networks are typically large and network devices have different specifications. Software Defined Networking (SDN) has emerged as a response to the complexity of conventional networks. The new SDN paradigm separates the control plane from the forwarding plane. RouteFlow is a software-based component that can apply conventional routing protocols to SDN networks. Open Shortest Path First (OSPF) is a conventional routing protocol capable of quickly detecting topology changes in a large network. In this work, the OSPF routing protocol was applied to SDN using RouteFlow, with the aim of simplifying network control through a centralized system. Convergence time and Quality of Service parameters (throughput, delay, jitter and packet loss) were measured under scenarios of link failure, increasing numbers of switches, and added background traffic. The convergence-time measurements show that adding switches increases the convergence time, while the Quality of Service (QoS) results for the larger switch topologies still comply with the ITU-T G.1010 standard; however, when background traffic consuming 50% of the network bandwidth is added, QoS deteriorates.

  5. SNAP: A General Purpose Network Analysis and Graph Mining Library.

    Science.gov (United States)

    Leskovec, Jure; Sosič, Rok

    2016-10-01

    Large networks are becoming a widely used abstraction for studying complex systems in a broad set of disciplines, ranging from social network analysis to molecular biology and neuroscience. Despite an increasing need to analyze and manipulate large networks, only a limited number of tools are available for this task. Here, we describe Stanford Network Analysis Platform (SNAP), a general-purpose, high-performance system that provides easy to use, high-level operations for analysis and manipulation of large networks. We present SNAP functionality, describe its implementational details, and give performance benchmarks. SNAP has been developed for single big-memory machines and it balances the trade-off between maximum performance, compact in-memory graph representation, and the ability to handle dynamic graphs where nodes and edges are being added or removed over time. SNAP can process massive networks with hundreds of millions of nodes and billions of edges. SNAP offers over 140 different graph algorithms that can efficiently manipulate large graphs, calculate structural properties, generate regular and random graphs, and handle attributes and meta-data on nodes and edges. Besides being able to handle large graphs, an additional strength of SNAP is that networks and their attributes are fully dynamic, they can be modified during the computation at low cost. SNAP is provided as an open source library in C++ as well as a module in Python. We also describe the Stanford Large Network Dataset, a set of social and information real-world networks and datasets, which we make publicly available. The collection is a complementary resource to our SNAP software and is widely used for development and benchmarking of graph analytics algorithms.
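
    A few lines against the Snap.py Python module illustrate the kind of structural properties SNAP computes. The calls follow the published Snap.py documentation as I understand it and should be treated as an assumption; names may differ between SNAP releases.

```python
# Sketch of basic structural analysis with Snap.py; call names follow the
# published Snap.py documentation and may vary between SNAP releases.
import snap

# Random directed Erdos-Renyi graph with 1000 nodes and 5000 edges.
G = snap.GenRndGnm(snap.PNGraph, 1000, 5000)

print("nodes:", G.GetNodes(), "edges:", G.GetEdges())
print("average clustering coefficient:", snap.GetClustCf(G, -1))
print("approx. diameter:", snap.GetBfsFullDiam(G, 100))

# Size of the largest weakly connected component.
wcc = snap.GetMxWcc(G)
print("largest WCC size:", wcc.GetNodes())
```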

  6. 4th International Conference in Network Analysis

    CERN Document Server

    Koldanov, Petr; Pardalos, Panos

    2016-01-01

    The contributions in this volume cover a broad range of topics including maximum cliques, graph coloring, data mining, brain networks, Steiner forest, logistic and supply chain networks. Network algorithms and their applications to market graphs, manufacturing problems, internet networks and social networks are highlighted. The "Fourth International Conference in Network Analysis," held at the Higher School of Economics, Nizhny Novgorod in May 2014, initiated joint research between scientists, engineers and researchers from academia, industry and government; the major results of conference participants have been reviewed and collected in this Work. Researchers and students in mathematics, economics, statistics, computer science and engineering will find this collection a valuable resource filled with the latest research in network analysis.

  7. Slicing techniques applied to architectural analysis of legacy software

    OpenAIRE

    Rodrigues, Nuno F.

    2009-01-01

    Doctoral thesis in Informatics (field of Foundations of Computing). Program understanding is emerging as a key concern in software engineering. In a situation in which the only quality certificate of the running software artifact is still life-cycle endurance, customers and software producers are little prepared to modify or improve running code. However, faced with so risky a dependence on legacy software, managers are more and more prepared to spend resources to in...

  8. The Software Therapist: Usability Problem Diagnosis Through Latent Semantic Analysis

    National Research Council Canada - National Science Library

    Sparks, Randall; Hartson, Rex

    2006-01-01

    The work we report on here addresses the problem of low return on investment in software usability engineering and offers support for usability practitioners in identifying, understanding, documenting...

  9. METHODOLOGY OF MATHEMATICAL ANALYSIS IN POWER NETWORK

    OpenAIRE

    Jerzy Szkutnik; Mariusz Kawecki

    2008-01-01

    Power distribution network analysis is considered. Based on the correlation coefficient, the authors establish a methodology of mathematical analysis useful in identifying the substations responsible for power stoppages. A methodology for risk assessment is also carried out.

  10. WESTPA: an interoperable, highly scalable software package for weighted ensemble simulation and analysis.

    Science.gov (United States)

    Zwier, Matthew C; Adelman, Joshua L; Kaus, Joseph W; Pratt, Adam J; Wong, Kim F; Rego, Nicholas B; Suárez, Ernesto; Lettieri, Steven; Wang, David W; Grabe, Michael; Zuckerman, Daniel M; Chong, Lillian T

    2015-02-10

    The weighted ensemble (WE) path sampling approach orchestrates an ensemble of parallel calculations with intermittent communication to enhance the sampling of rare events, such as molecular associations or conformational changes in proteins or peptides. Trajectories are replicated and pruned in a way that focuses computational effort on underexplored regions of configuration space while maintaining rigorous kinetics. To enable the simulation of rare events at any scale (e.g., atomistic, cellular), we have developed an open-source, interoperable, and highly scalable software package for the execution and analysis of WE simulations: WESTPA (The Weighted Ensemble Simulation Toolkit with Parallelization and Analysis). WESTPA scales to thousands of CPU cores and includes a suite of analysis tools that have been implemented in a massively parallel fashion. The software has been designed to interface conveniently with any dynamics engine and has already been used with a variety of molecular dynamics (e.g., GROMACS, NAMD, OpenMM, AMBER) and cell-modeling packages (e.g., BioNetGen, MCell). WESTPA has been in production use for over a year, and its utility has been demonstrated for a broad set of problems, ranging from atomically detailed host–guest associations to nonspatial chemical kinetics of cellular signaling networks. The following describes the design and features of WESTPA, including the facilities it provides for running WE simulations and storing and analyzing WE simulation data, as well as examples of input and output.
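
    The replicate-and-prune bookkeeping at the heart of the weighted ensemble idea can be illustrated with a toy one-dimensional walker. This is a didactic sketch of the WE resampling rule (split walkers in underpopulated bins, merge low-weight walkers in crowded bins, conserving total weight); it is not WESTPA's API, and the bin scheme and target count are invented.

```python
# Toy sketch of weighted-ensemble resampling: walkers carry statistical weights,
# and each occupied bin is resampled to a target count by splitting/merging.
# Didactic only; WESTPA's actual machinery and API are far more general.
import random
random.seed(0)

TARGET_PER_BIN = 4
walkers = [{"x": 0.0, "w": 1.0 / 8} for _ in range(8)]

def propagate(walker):
    walker["x"] += random.gauss(0, 1)   # stand-in for a dynamics-engine segment

def resample(walkers):
    bins = {}
    for w in walkers:
        bins.setdefault(int(w["x"]), []).append(w)
    new = []
    for members in bins.values():
        # Split: duplicate the heaviest walker (half weight each) until target reached.
        while len(members) < TARGET_PER_BIN and members:
            heavy = max(members, key=lambda w: w["w"])
            heavy["w"] /= 2
            members.append(dict(heavy))
        # Merge: combine the two lightest walkers, keeping one with the summed weight.
        while len(members) > TARGET_PER_BIN:
            members.sort(key=lambda w: w["w"])
            a, b = members.pop(0), members.pop(0)
            keep = a if random.random() < a["w"] / (a["w"] + b["w"]) else b
            members.append(dict(keep, w=a["w"] + b["w"]))
        new.extend(members)
    return new

for _ in range(10):
    for w in walkers:
        propagate(w)
    walkers = resample(walkers)
print("walkers:", len(walkers), "total weight:", round(sum(w["w"] for w in walkers), 6))
```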

  11. [Statistical analysis using freely-available "EZR (Easy R)" software].

    Science.gov (United States)

    Kanda, Yoshinobu

    2015-10-01

    Clinicians must often perform statistical analyses for purposes such as evaluating preexisting evidence and designing or executing clinical studies. R is a free software environment for statistical computing. R supports many statistical analysis functions, but does not incorporate a statistical graphical user interface (GUI). The R commander provides an easy-to-use basic-statistics GUI for R. However, the statistical functionality of the R commander is limited, especially in the field of biostatistics. Therefore, the author added several important statistical functions to the R commander and named it "EZR (Easy R)", which is now being distributed on the following website: http://www.jichi.ac.jp/saitama-sct/. EZR allows the application of statistical functions that are frequently used in clinical studies, such as survival analyses, including competing risk analyses and the use of time-dependent covariates, by point-and-click access. In addition, by saving the script automatically created by EZR, users can learn R script writing, maintain the traceability of the analysis, and ensure that the statistical process is overseen by a supervisor.

  12. Pattern recognition software and techniques for biological image analysis.

    Directory of Open Access Journals (Sweden)

    Lior Shamir

    2010-11-01

    Full Text Available The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.

  13. Semiautomatic Software For Quantitative Analysis Of Cardiac Positron Tomography Studies

    Science.gov (United States)

    Ratib, Osman; Bidaut, Luc; Nienaber, Christoph; Krivokapich, Janine; Schelbert, Heinrich R.; Phelps, Michael E.

    1988-06-01

    In order to derive accurate values for true tissue radiotracer concentrations from gated positron emission tomography (PET) images of the heart, which are critical for noninvasively quantifying regional myocardial blood flow and metabolism, appropriate corrections for the partial volume effect (PVE) and for contamination from adjacent anatomical structures are required. We therefore developed an integrated software package for quantitative analysis of tomographic images which provides such corrections. A semiautomatic edge detection technique outlines and partitions the myocardium into sectors. Myocardial wall thickness is measured on the images perpendicularly to the detected edges and used to correct for PVE. The programs automatically correct for radioactive decay, activity calibration and cross contamination for both static and dynamic studies. Parameters derived with these programs include tracer concentrations and their changes over time. They are used for calculating regional metabolic rates and can be further displayed as color-coded parametric images. The approach was validated for PET imaging in 11 dog experiments. 2D echocardiograms (Echo) were recorded simultaneously to validate the edge detection and wall thickness measurement techniques. After correction for PVE using the automatic wall thickness measurement, regional tissue tracer concentrations derived from PET images correlated well with true tissue concentrations as determined by well counting (r=0.98). These preliminary studies indicate that the developed automatic image analysis technique allows accurate and convenient evaluation of cardiac PET images for the measurement of both regional tracer tissue concentrations and regional myocardial function.

  14. Mutation Analysis Approach to Develop Reliable Object-Oriented Software

    Directory of Open Access Journals (Sweden)

    Monalisa Sarma

    2014-01-01

    Full Text Available In general, modern programs are large and complex, and it is essential that they be highly reliable in applications. In order to develop highly reliable software, the Java programming language provides a rich set of exceptions and exception handling mechanisms. Exception handling mechanisms are intended to help developers build robust programs. Given a program with exception handling constructs, for effective testing we need to detect whether all possible exceptions are raised and caught. However, complex exception handling constructs make it tedious to trace which exceptions are handled and where, and which exceptions are passed on. In this paper, we address this problem and propose a mutation analysis approach to develop reliable object-oriented programs. We have applied a number of mutation operators to create a large set of mutant programs with different types of faults. We then generate test cases and test data to uncover exception-related faults. The test suite so obtained is applied to the mutant programs, measuring the mutation score and hence verifying whether the test cases are effective or not. We have tested our approach with a number of case studies to substantiate the efficacy of the proposed mutation analysis technique.
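
    The mutation-score bookkeeping itself is simple and can be sketched independently of Java: generate mutants by small syntactic changes, run the test suite on each, and report killed/total. The function, mutants and tests below are toy stand-ins, not the operators used in the paper.

```python
# Toy sketch of mutation analysis: apply small faults ("mutants") to a function,
# run the test suite against each mutant, and compute the mutation score.
def original(a, b):
    return a + b

# Hand-written mutants standing in for operator-level mutation of the source.
mutants = [
    lambda a, b: a - b,      # arithmetic operator replacement
    lambda a, b: a + b + 1,  # constant perturbation
    lambda a, b: b + a,      # equivalent mutant: cannot be killed
]

tests = [
    lambda f: f(2, 3) == 5,
    lambda f: f(0, 0) == 0,
]

killed = sum(
    1 for m in mutants if not all(t(m) for t in tests)   # any failing test kills the mutant
)
score = killed / len(mutants)
print(f"mutation score: {killed}/{len(mutants)} = {score:.2f}")
```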

  15. How qualitative data analysis software may support the qualitative analysis process

    NARCIS (Netherlands)

    Peters, V.A.M.; Wester, F.P.J.

    2007-01-01

    The last decades have shown large progress in the elaboration of procedures for qualitative data analysis and in the development of computer programs to support this kind of analysis. We believe, however, that the link between methodology and computer software tools is too loose, especially for a

  16. Software selection based on analysis and forecasting methods, practised in 1C

    Science.gov (United States)

    Vazhdaev, A. N.; Chernysheva, T. Y.; Lisacheva, E. I.

    2015-09-01

    The research focuses on the built-in mechanisms of the "1C: Enterprise 8" platform for data analysis and forecasting. It is important to evaluate and select proper software in order to develop effective strategies for customer relationship management in terms of sales, as well as for the implementation and further maintenance of software. The research data allow the creation of new forecast models to plan further software distribution.

  17. Measuring Road Network Vulnerability with Sensitivity Analysis

    Science.gov (United States)

    Jun-qiang, Leng; Long-hai, Yang; Liu, Wei-yi; Zhao, Lin

    2017-01-01

    This paper focuses on the development of a method for road network vulnerability analysis, from the perspective of capacity degradation, which seeks to identify the critical infrastructures in the road network and the operational performance of the whole traffic system. This research involves defining the traffic utility index and modeling the vulnerability of road segments, routes, OD (Origin-Destination) pairs and the road network. Meanwhile, a sensitivity analysis method is utilized to calculate the change in the traffic utility index due to capacity degradation. This method, compared to traditional traffic assignment, can improve calculation efficiency and make the application of vulnerability analysis to large actual road networks possible. Finally, all of the above models and the calculation method are applied to an actual road network evaluation to verify their efficiency and utility. This approach can be used as a decision-supporting tool for evaluating the performance of a road network and identifying critical infrastructures in transportation planning and management, especially in the resource allocation for mitigation and recovery. PMID:28125706
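
    A drastically simplified version of the capacity-degradation idea can be sketched with networkx: degrade one road segment at a time and measure the change in total origin-destination travel cost. The index below is a stand-in, not the paper's traffic utility model or its sensitivity-analysis formulas, and the network is invented.

```python
# Simplified sketch of capacity-degradation vulnerability: degrade one road
# segment at a time and measure the change in total OD travel cost. This is a
# stand-in index, not the paper's traffic utility model or sensitivity analysis.
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([
    ("A", "B", 4), ("B", "C", 3), ("A", "D", 6), ("D", "C", 2), ("B", "D", 1),
], weight="time")
od_pairs = [("A", "C"), ("A", "D")]

def total_cost(graph):
    return sum(nx.shortest_path_length(graph, s, t, weight="time") for s, t in od_pairs)

base = total_cost(G)
for u, v, data in G.edges(data=True):
    degraded = G.copy()
    degraded[u][v]["time"] = data["time"] * 3   # capacity loss modelled as higher travel time
    print(f"degrading {u}-{v}: total OD cost rises by {total_cost(degraded) - base}")
```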

  18. Analisis Performansi Perutingan Link State Menggunakan Algoritma Djikstra Pada Platform Software Defined Network (SDN

    Directory of Open Access Journals (Sweden)

    Abu Riza Sudiyatmoko

    2016-05-01

    Full Text Available Software Defined Networking (SDN) is a new paradigm in network systems. The basic concept behind SDN is the separation of the control and forwarding layers into different devices, which is what distinguishes SDN from conventional networks. SDN also introduces network topology virtualization and allows administrators to customize the control plane. With the OpenFlow protocol applied in SDN, there is an opportunity to implement flow-based routing in SDN networks for distributing data from source to destination. Link-state IS-IS is a routing protocol that uses Dijkstra's algorithm to determine the best path for packet delivery. In this study, the implementation of link-state IS-IS on the SDN platform using the RouteFlow architecture is analyzed. The parameters used are throughput, delay, jitter and packet loss, as well as the performance of the controller device. Under overload conditions, with 125 Mb of background traffic, packet loss reached 1.23%, throughput was 47.6 Mbp/s and jitter was 2.012 ms. The largest delay, around 553 ms, occurred for the topology with 11 switches and 11 hosts. The controller's memory consumption while controlling the network ranged from 25.638% to 39.04%.
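
    For reference, the Dijkstra computation that a link-state protocol such as IS-IS runs on its topology database can be reproduced in a few lines, here with networkx on an invented topology that has nothing to do with RouteFlow itself.

```python
# Dijkstra's shortest-path computation of the kind a link-state protocol runs
# on its topology database; the switches and link costs here are invented.
import networkx as nx

topo = nx.Graph()
topo.add_weighted_edges_from([
    ("s1", "s2", 10), ("s2", "s3", 10), ("s1", "s4", 5), ("s4", "s3", 5),
], weight="cost")

print(nx.dijkstra_path(topo, "s1", "s3", weight="cost"))          # ['s1', 's4', 's3']
print(nx.dijkstra_path_length(topo, "s1", "s3", weight="cost"))   # 10
```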

  19. Constructing an Intelligent Patent Network Analysis Method

    OpenAIRE

    Chao-Chan Wu; Ching-Bang Yao

    2012-01-01

    Patent network analysis, an advanced method of patent analysis, is a useful tool for technology management. This method visually displays all the relationships among the patents and enables the analysts to intuitively comprehend the overview of a set of patents in the field of the technology being studied. Although patent network analysis possesses relative advantages different from traditional methods of patent analysis, it is subject to several crucial limitations. To overcome the drawbacks...

  20. TweezPal - Optical tweezers analysis and calibration software

    Science.gov (United States)

    Osterman, Natan

    2010-11-01

    Optical tweezers, a powerful tool for optical trapping, micromanipulation and force transduction, have in recent years become a standard technique commonly used in many research laboratories and university courses. Knowledge about the optical force acting on a trapped object can be gained only after a calibration procedure which has to be performed (by an expert) for each type of trapped object. In this paper we present TweezPal, a user-friendly, standalone Windows software tool for optical tweezers analysis and calibration. Using TweezPal, the procedure can be performed in a matter of minutes even by non-expert users. The calibration is based on the Brownian motion of a particle trapped in a stationary optical trap, which is being monitored using video or photodiode detection. The particle trajectory is imported into the software which instantly calculates position histogram, trapping potential, stiffness and anisotropy. Program summary. Program title: TweezPal Catalogue identifier: AEGR_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGR_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 44 891 No. of bytes in distributed program, including test data, etc.: 792 653 Distribution format: tar.gz Programming language: Borland Delphi Computer: Any PC running Microsoft Windows Operating system: Windows 95, 98, 2000, XP, Vista, 7 RAM: 12 Mbytes Classification: 3, 4.14, 18, 23 Nature of problem: Quick, robust and user-friendly calibration and analysis of optical tweezers. The optical trap is calibrated from the trajectory of a trapped particle undergoing Brownian motion in a stationary optical trap (input data) using two methods. Solution method: Elimination of the experimental drift in position data. Direct calculation of the trap stiffness from the positional
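
    One standard route from a trapped bead's positional trace to the trap stiffness is the equipartition relation k = k_B T / Var(x). The sketch below applies it to a synthetic Brownian trace; it illustrates the physics only and is not TweezPal's Delphi implementation.

```python
# Minimal sketch: trap stiffness from the positional variance of a trapped bead
# via the equipartition theorem, k = k_B * T / var(x). Synthetic data, not TweezPal.
import numpy as np

kB = 1.380649e-23          # Boltzmann constant, J/K
T = 293.0                  # temperature, K
k_true = 1e-6              # N/m, stiffness used to generate the synthetic trace

rng = np.random.default_rng(0)
sigma = np.sqrt(kB * T / k_true)               # expected positional spread (m)
x = rng.normal(0.0, sigma, size=100_000)       # stand-in for a detected trajectory

k_est = kB * T / np.var(x)
print(f"estimated stiffness: {k_est:.3e} N/m (true {k_true:.1e})")
```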

  1. Multi-criteria decision analysis methods and software

    CERN Document Server

    Ishizaka, Alessio

    2013-01-01

    This book presents an introduction to MCDA followed by more detailed chapters about each of the leading methods used in this field. Comparison of methods and software is also featured to enable readers to choose the most appropriate method needed in their research. Worked examples as well as the software featured in the book are available on an accompanying website.

  2. A pattern framework for software quality assessment and tradeoff analysis

    NARCIS (Netherlands)

    Folmer, Eelke; Boscht, Jan

    The earliest design decisions often have a significant impact on software quality and are the most costly to revoke. One of the challenges in architecture design is to reduce the frequency of retrofit problems in software designs; not being able to improve the quality of a system cost effectively, a

  3. An Analysis of Open Source Security Software Products Downloads

    Science.gov (United States)

    Barta, Brian J.

    2014-01-01

    Despite the continued demand for open source security software, a gap in the identification of success factors related to the success of open source security software persists. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…

  4. An Initial Quality Analysis of the Ohloh Software Evolution Data

    NARCIS (Netherlands)

    Bruntink, M.

    2014-01-01

    Large public data sets on software evolution promise great value to both researchers and practitioners, in particular for software (development) analytics. To realise this value, the data quality of such data sets needs to be studied and improved. Despite these data sets being of a secondary nature,

  5. MicroScope: ChIP-seq and RNA-seq software analysis suite for gene expression heatmaps.

    Science.gov (United States)

    Khomtchouk, Bohdan B; Hennessy, James R; Wahlestedt, Claes

    2016-09-22

    Heatmaps are an indispensible visualization tool for examining large-scale snapshots of genomic activity across various types of next-generation sequencing datasets. However, traditional heatmap software do not typically offer multi-scale insight across multiple layers of genomic analysis (e.g., differential expression analysis, principal component analysis, gene ontology analysis, and network analysis) or multiple types of next-generation sequencing datasets (e.g., ChIP-seq and RNA-seq). As such, it is natural to want to interact with a heatmap's contents using an extensive set of integrated analysis tools applicable to a broad array of genomic data types. We propose a user-friendly ChIP-seq and RNA-seq software suite for the interactive visualization and analysis of genomic data, including integrated features to support differential expression analysis, interactive heatmap production, principal component analysis, gene ontology analysis, and dynamic network analysis. MicroScope is hosted online as an R Shiny web application based on the D3 JavaScript library: http://microscopebioinformatics.org/ . The methods are implemented in R, and are available as part of the MicroScope project at: https://github.com/Bohdan-Khomtchouk/Microscope .

  6. NEXCADE: perturbation analysis for complex networks.

    Directory of Open Access Journals (Sweden)

    Gitanjali Yadav

    Full Text Available Recent advances in network theory have led to considerable progress in our understanding of complex real world systems and their behavior in response to external threats or fluctuations. Much of this research has been invigorated by demonstration of the 'robust, yet fragile' nature of cellular and large-scale systems transcending biology, sociology, and ecology, through application of the network theory to diverse interactions observed in nature such as plant-pollinator, seed-dispersal agent and host-parasite relationships. In this work, we report the development of NEXCADE, an automated and interactive program for inducing disturbances into complex systems defined by networks, focusing on the changes in global network topology and connectivity as a function of the perturbation. NEXCADE uses a graph theoretical approach to simulate perturbations in a user-defined manner, singly, in clusters, or sequentially. To demonstrate the promise it holds for broader adoption by the research community, we provide pre-simulated examples from diverse real-world networks including eukaryotic protein-protein interaction networks, fungal biochemical networks, a variety of ecological food webs in nature as well as social networks. NEXCADE not only enables network visualization at every step of the targeted attacks, but also allows risk assessment, i.e. identification of nodes critical for the robustness of the system of interest, in order to devise and implement context-based strategies for restructuring a network, or to achieve resilience against link or node failures. Source code and license for the software, designed to work on a Linux-based operating system (OS), can be downloaded at http://www.nipgr.res.in/nexcade_download.html. In addition, we have developed NEXCADE as an OS-independent online web server freely available to the scientific community without any login requirement at http://www.nipgr.res.in/nexcade.html.
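
    The kind of targeted-attack simulation NEXCADE automates can be approximated in a few lines of networkx: remove nodes in decreasing order of their initial degree and track how the largest connected component shrinks. The graph below is random and purely illustrative.

```python
# Illustration of targeted-attack perturbation: remove the nodes with the highest
# initial degree one by one and track the size of the largest connected component.
# Random toy graph; NEXCADE applies the same idea to real interaction networks.
import networkx as nx

G = nx.barabasi_albert_graph(200, 2, seed=1)

def giant_component_size(graph):
    return max((len(c) for c in nx.connected_components(graph)), default=0)

attack_order = sorted(G.nodes, key=G.degree, reverse=True)
for step, node in enumerate(attack_order[:10], start=1):
    G.remove_node(node)
    print(f"after removing {step} hub(s): giant component = {giant_component_size(G)}")
```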

  7. Software para análise quantitativa da deglutição Swallowing quantitative analysis software

    Directory of Open Access Journals (Sweden)

    André Augusto Spadotto

    2008-02-01

    Full Text Available OBJECTIVE: To present software that allows a detailed analysis of the swallowing dynamics. MATERIALS AND METHODS: The sample included ten (six male and four female) stroke patients, with a mean age of 57.6 years. Swallowing videofluoroscopy was performed and the images were digitized for posterior analysis of the pharyngeal transit time with the aid of a chronometer and the software. RESULTS: Differences were observed in the average pharyngeal swallowing transit time as a result of measurements with the chronometer and with the software. CONCLUSION: This software is a useful tool for the analysis of parameters such as swallowing time and speed, allowing a better understanding of the swallowing dynamics, both in the clinical approach of patients with oropharyngeal dysphagia and for scientific research purposes.

  8. Development of Network Analysis and Visualization System for KEGG Pathways

    Directory of Open Access Journals (Sweden)

    Dongmin Seo

    2015-07-01

    Full Text Available Big data refers to informatization technology for extracting valuable information through the use and analysis of large-scale data and, based on that data, deriving plans for response or predicting changes. With the development of software and devices for next-generation sequencing, a vast amount of bioinformatics data has been generated recently. Big-data technology based on bioinformatics data is also rising rapidly as a core technology for bioinformaticians, biologists and big-data scientists. KEGG pathways are bioinformatics data for understanding the high-level functions and utilities of a biological system. However, KEGG pathway analysis requires a lot of time and effort because KEGG pathways are numerous and very diverse. In this paper, we propose a network analysis and visualization system that crawls KEGG pathways of interest to the user, constructs a pathway network based on the hierarchical structure of the pathways, and visualizes the relations and interactions of pathways by clustering and selecting core pathways from the network. Finally, we construct a pathway network starting from an Alzheimer's disease pathway and show the results of clustering and selecting core pathways from this network.

  9. Analysis of Software Binaries for Reengineering-Driven Product Line Architecture—An Industrial Case Study

    Directory of Open Access Journals (Sweden)

    Ian D. Peake

    2015-04-01

    Full Text Available This paper describes a method for the recovery of software architectures from a set of similar (but unrelated) software products in binary form. One intention is to drive refactoring into software product lines and to combine architecture recovery with runtime binary analysis and existing clustering methods. Using our runtime binary analysis, we create graphs that capture the dependencies between different software parts. These are clustered into smaller component graphs that group software parts with high interaction into larger entities. The component graphs serve as a basis for further software product line work. In this paper, we concentrate on the analysis part of the method and on the graph clustering. We apply the graph clustering method to a real application in the context of automation/robot configuration software tools.
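
    The clustering step described above, grouping strongly interacting software parts into component graphs, can be illustrated with modularity-based community detection on a toy dependency graph. The function names and call graph are invented; the paper's own clustering method may differ.

```python
# Sketch: clustering a dependency graph into "component graphs" by grouping
# strongly interacting parts with modularity-based community detection.
# The call graph below is a toy; the function names are invented.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

calls = [
    ("ui_draw", "ui_layout"), ("ui_layout", "ui_theme"), ("ui_draw", "ui_theme"),
    ("io_read", "io_parse"), ("io_parse", "io_validate"), ("io_read", "io_validate"),
    ("ui_draw", "io_read"),   # weak link between the two candidate components
]
G = nx.Graph(calls)

for i, community in enumerate(greedy_modularity_communities(G), start=1):
    print(f"component {i}: {sorted(community)}")
```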

  10. Weighted Complex Network Analysis of Pakistan Highways

    Directory of Open Access Journals (Sweden)

    Yasir Tariq Mohmand

    2013-01-01

    Full Text Available The structure and properties of public transportation networks have great implications in urban planning, public policies, and infectious disease control. This study contributes a weighted complex network analysis of travel routes on the national highway network of Pakistan. The network is responsible for handling 75 percent of the road traffic yet is largely inadequate, poor, and unreliable. The highway network displays small world properties and is assortative in nature. Based on the betweenness centrality of the nodes, the most important cities are identified, which could help in locating the potential congestion points in the network. Keeping in view the strategic location of Pakistan, such a study is of practical importance and could provide opportunities for policy makers to improve the performance of the highway network.

  11. Predictive structural dynamic network analysis.

    Science.gov (United States)

    Chen, Rong; Herskovits, Edward H

    2015-04-30

    Classifying individuals based on magnetic resonance data is an important task in neuroscience. Existing brain network-based methods to classify subjects analyze data from a cross-sectional study and these methods cannot classify subjects based on longitudinal data. We propose a network-based predictive modeling method to classify subjects based on longitudinal magnetic resonance data. Our method generates a dynamic Bayesian network model for each group which represents complex spatiotemporal interactions among brain regions, and then calculates a score representing that subject's deviation from expected network patterns. This network-derived score, along with other candidate predictors, is used to construct predictive models. We validated the proposed method based on simulated data and the Alzheimer's Disease Neuroimaging Initiative study. For the Alzheimer's Disease Neuroimaging Initiative study, we built a predictive model based on the baseline biomarker characterizing the baseline state and the network-based score which was constructed based on the state transition probability matrix. We found that this combined model achieved 0.86 accuracy, 0.85 sensitivity, and 0.87 specificity. For the Alzheimer's Disease Neuroimaging Initiative study, the model based on the baseline biomarkers achieved 0.77 accuracy. The accuracy of our model is significantly better than the model based on the baseline biomarkers (p-value=0.002). We have presented a method to classify subjects based on structural dynamic network model-based scores. This method is of great importance to distinguish subjects based on structural network dynamics and the understanding of the network architecture of brain processes and disorders. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Quantitative analysis of histopathological findings using image processing software.

    Science.gov (United States)

    Horai, Yasushi; Kakimoto, Tetsuhiro; Takemoto, Kana; Tanaka, Masaharu

    2017-10-01

    In evaluating pathological changes in drug efficacy and toxicity studies, morphometric analysis can be quite robust. In this experiment, we examined whether morphometric changes of major pathological findings in various tissue specimens stained with hematoxylin and eosin could be recognized and quantified using image processing software. Using Tissue Studio, hypertrophy of hepatocytes and adrenocortical cells could be quantified based on the method of a previous report, but the regions of red pulp, white pulp, and marginal zones in the spleen could not be recognized when using one setting condition. Using Image-Pro Plus, lipid-derived vacuoles in the liver and mucin-derived vacuoles in the intestinal mucosa could be quantified using two criteria (area and/or roundness). Vacuoles derived from phospholipid could not be quantified when small lipid deposition coexisted in the liver and adrenal cortex. Mononuclear inflammatory cell infiltration in the liver could be quantified to some extent, except for specimens with many clustered infiltrating cells. Adipocyte size and the mean linear intercept could be quantified easily and efficiently using morphological processing and the macro tool equipped in Image-Pro Plus. These methodologies are expected to form a base system that can recognize morphometric features and analyze quantitatively pathological findings through the use of information technology.

  13. RNAstructure: software for RNA secondary structure prediction and analysis

    Directory of Open Access Journals (Sweden)

    Mathews David H

    2010-03-01

    Full Text Available Abstract. Background: To understand an RNA sequence's mechanism of action, the structure must be known. Furthermore, target RNA structure is an important consideration in the design of small interfering RNAs and antisense DNA oligonucleotides. RNA secondary structure prediction, using thermodynamics, can be used to develop hypotheses about the structure of an RNA sequence. Results: RNAstructure is a software package for RNA secondary structure prediction and analysis. It uses thermodynamics and utilizes the most recent set of nearest neighbor parameters from the Turner group. It includes methods for secondary structure prediction (using several algorithms), prediction of base pair probabilities, bimolecular structure prediction, and prediction of a structure common to two sequences. This contribution describes new extensions to the package, including a library of C++ classes for incorporation into other programs, a user-friendly graphical user interface written in JAVA, and new Unix-style text interfaces. The original graphical user interface for Microsoft Windows is still maintained. Conclusion: The extensions to RNAstructure serve to make RNA secondary structure prediction user-friendly. The package is available for download from the Mathews lab homepage at http://rna.urmc.rochester.edu/RNAstructure.html.

  14. Integrating R and Java for Enhancing Interactivity of Algorithmic Data Analysis Software Solutions

    National Research Council Canada - National Science Library

    Titus Felix FURTUNĂ; Claudiu VINȚE

    2016-01-01

    Conceiving software solutions for statistical processing and algorithmic data analysis involves handling diverse data, fetched from various sources and in different formats, and presenting the results...

  15. The Feasibility Study of Implementing a Fiber Optic Local Area Network in Software Metrics Laboratory in Ingersoll 158

    National Research Council Canada - National Science Library

    Be, Chai

    2004-01-01

    ... fiber components compared to the increased electronic costs of carrying Gigabit Ethernet over Cat 5 or Cat 5E UTP copper cabling has also accelerated the migration to optical fiber LAN. The thesis conducts a feasibility study of implementing a Fiber Optic Local Area Network in the Software Metrics Laboratory in Ingersoll 158.

  16. NEAT: an efficient network enrichment analysis test.

    Science.gov (United States)

    Signorelli, Mirko; Vinciotti, Veronica; Wit, Ernst C

    2016-09-05

    Network enrichment analysis is a powerful method, which allows the integration of gene enrichment analysis with the information on relationships between genes provided by gene networks. Existing tests for network enrichment analysis deal only with undirected networks; they can be computationally slow and are based on normality assumptions. We propose NEAT, a test for network enrichment analysis. The test is based on the hypergeometric distribution, which naturally arises as the null distribution in this context. NEAT can be applied not only to undirected, but also to directed and partially directed networks. Our simulations indicate that NEAT is considerably faster than alternative resampling-based methods, and that its capacity to detect enrichments is at least as good as that of alternative tests. We discuss applications of NEAT to network analyses in yeast by testing for enrichment of the Environmental Stress Response target gene set with GO Slim and KEGG functional gene sets, and also by inspecting associations between functional sets themselves. NEAT is a flexible and efficient test for network enrichment analysis that aims to overcome some limitations of existing resampling-based tests. The method is implemented in the R package neat, which can be freely downloaded from CRAN (https://cran.r-project.org/package=neat).
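
    The hypergeometric null that NEAT builds on is the same one used in classical over-representation testing: given a universe of N objects of which K belong to set B, and a set A of n objects, the observed overlap k is compared with a hypergeometric draw (NEAT generalizes this from shared genes to links between sets). The toy calculation below uses scipy with invented numbers; it is not the neat R package.

```python
# Toy hypergeometric enrichment test of the kind NEAT generalises to network
# links: N objects in the universe, K of them in set B, n drawn into set A,
# k observed in both. Numbers are invented; this is not the neat R package.
from scipy.stats import hypergeom

N, K, n, k = 6000, 300, 120, 15   # universe, |B|, |A|, observed overlap

p_value = hypergeom.sf(k - 1, N, K, n)   # P(X >= k) under the hypergeometric null
expected = n * K / N
print(f"expected overlap {expected:.1f}, observed {k}, p = {p_value:.3g}")
```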

  17. Reaction network analysis in biochemical signaling pathways

    OpenAIRE

    Martinez-Forero, I. (Iván); Pelaez, A. (Antonio); Villoslada, P. (Pablo)

    2010-01-01

    The aim of this thesis is to improve the understanding of signaling pathways through a theoretical study of chemical reaction networks. The equilibrium solution to the equations derived from chemical networks will be analytically resolved using tools from algebraic geometry. The chapters are organized as follows: 1. An introduction to chemical dynamics in biological systems with a special emphasis on steady state analysis; 2. Complete description of the chemical reaction network theor...

  18. A Software for the Analysis of Scripted Dialogs Based on Surface Markers

    Directory of Open Access Journals (Sweden)

    Sylvain Delisle

    2003-04-01

    Full Text Available Most information systems that deal with natural language texts do not tolerate much deviation from their idealized and simplified model of language. Spoken dialog, however, is notoriously ungrammatical. Because the MAREDI project focuses in particular on the automatic analysis of scripted dialogs, we needed to develop a robust capacity to analyze transcribed spoken language. This paper presents the main elements of our approach, which is based on exploiting surface markers as the best route to the semantics of the conversation modelled. We highlight the foundations of our particular conversational model and give an overview of the MAREDI system. The latter consists of three key modules: (1) a connectionist network to recognise speech acts, (2) a robust syntactic parser, and (3) a semantic analyzer. These three modules are fully implemented in Prolog and C++ and have been packaged into an integrated software system.

  19. Industrial entrepreneurial network: Structural and functional analysis

    Science.gov (United States)

    Medvedeva, M. A.; Davletbaev, R. H.; Berg, D. B.; Nazarova, J. J.; Parusheva, S. S.

    2016-12-01

    The structure and functioning of two model industrial entrepreneurial networks are investigated in the present paper. One of these networks forms during the implementation of an integrated project and consists of eight agents which interact with each other and with the external environment. The other is obtained from the municipal economy and is based on a set of 12 real business entities. The analysis of the networks is carried out on the basis of a matrix of mutual payments aggregated over a certain time period; the matrix is created by the methods of experimental economics. Social Network Analysis (SNA) methods and instruments were used in the present research. A set of basic structural characteristics was investigated: quantitative parameters such as density, diameter, clustering coefficient and different kinds of centrality. They were compared with random Bernoulli graphs of the corresponding size and density. The discovered differences between the structures of the random and entrepreneurial networks are explained by the peculiarities of how agents function in a production network. Separately, the closed exchange circuits (cyclically closed contours of the graph) forming an autopoietic (self-replicating) network pattern were identified. The purpose of the functional analysis was to identify the contribution of the autopoietic network pattern to the network's gross product; it was found that the magnitude of this contribution is more than 20%. Such a value supports the use of a complementary currency to stimulate the economic activity of the network agents.

  20. Multi-channel software defined radio experimental evaluation and analysis

    CSIR Research Space (South Africa)

    Van der Merwe, JR

    2014-09-01

    Full Text Available Multi-channel software-defined radios (SDRs) can be utilised as inexpensive prototyping platforms for transceiver arrays. The application for multi-channel prototyping is discussed and measured results of coherent channels for both receiver...