WorldWideScience

Sample records for network analysis computer

  1. Analysis of computer networks

    CERN Document Server

    Gebali, Fayez

    2015-01-01

This textbook presents the mathematical theory and techniques necessary for analyzing and modeling high-performance global networks, such as the Internet. The three main building blocks of high-performance networks are links, switching equipment connecting the links together, and software employed at the end nodes and intermediate switches. This book provides the basic techniques for modeling and analyzing these last two components. Topics covered include, but are not limited to: Markov chains and queuing analysis, traffic modeling, interconnection networks, and switch architectures and buffering strategies. Provides techniques for modeling and analysis of network software and switching equipment; discusses design options used to build efficient switching equipment; includes many worked examples of the application of discrete-time Markov chains to communication systems; covers the mathematical theory and techniques necessary for ana...
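
As an illustration of the discrete-time Markov chain and queuing analysis the book covers, the sketch below computes the steady-state occupancy of a small packet buffer; the transition probabilities and buffer size are illustrative assumptions, not values taken from the text.

```python
# Steady-state analysis of a small discrete-time Markov chain modelling a
# packet buffer that holds at most three packets (state = queue length).
# The transition probabilities below are illustrative assumptions.
import numpy as np

# P[i, j] = probability of moving from i queued packets to j in one time slot.
P = np.array([
    [0.7, 0.3, 0.0, 0.0],
    [0.4, 0.4, 0.2, 0.0],
    [0.0, 0.4, 0.4, 0.2],
    [0.0, 0.0, 0.5, 0.5],
])

# The stationary distribution pi satisfies pi P = pi and sum(pi) = 1,
# i.e. it is the left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()

print("steady-state occupancy distribution:", np.round(pi, 4))
print("mean buffer occupancy:", float(pi @ np.arange(len(pi))))
```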

  2. Computational Social Network Analysis

    CERN Document Server

    Hassanien, Aboul-Ella

    2010-01-01

Presents insight into the social behaviour of animals (including the study of animal tracks and learning by members of the same species). This book provides web-based evidence of social interaction, perceptual learning, information granulation, the behaviour of humans, and affinities between web-based social networks.

  3. Computer methods in electric network analysis

    Energy Technology Data Exchange (ETDEWEB)

    Saver, P.; Hajj, I.; Pai, M.; Trick, T.

    1983-06-01

    The computational algorithms utilized in power system analysis have more than just a minor overlap with those used in electronic circuit computer aided design. This paper describes the computer methods that are common to both areas and highlights the differences in application through brief examples. Recognizing this commonality has stimulated the exchange of useful techniques in both areas and has the potential of fostering new approaches to electric network analysis through the interchange of ideas.

  4. Social sciences via network analysis and computation

    CERN Document Server

    Kanduc, Tadej

    2015-01-01

In recent years information and communication technologies have gained significant importance in the social sciences. Because there is such rapid growth of knowledge, methods and computer infrastructure, research can now seamlessly connect interdisciplinary fields such as business process management, data processing and mathematics. This study presents some of the latest results, practices and state-of-the-art approaches in network analysis, machine learning, data mining, data clustering and classification in the context of the social sciences. It also covers various real-life examples such as t...

  5. Analysis of Computer Network Information Based on "Big Data"

    Science.gov (United States)

    Li, Tianli

    2017-11-01

With the development of the current era, computer networks and big data have gradually become part of people's lives. People use computers to bring convenience to their own lives, but at the same time there are many network information security problems that demand attention. This paper analyzes the information security of computer networks based on "big data" and puts forward some solutions.

  6. Classification and Analysis of Computer Network Traffic

    DEFF Research Database (Denmark)

    Bujlow, Tomasz

    2014-01-01

Traffic monitoring and analysis can be done for multiple different reasons: to investigate the usage of network resources, assess the performance of network applications, adjust Quality of Service (QoS) policies in the network, log the traffic to comply with the law, or create realistic models ... for traffic classification, which can be used for nearly real-time processing of big amounts of data using affordable CPU and memory resources. Other questions are related to methods for real-time estimation of the application Quality of Service (QoS) level based on the results obtained by the traffic classifier. This thesis is focused on topics connected with traffic classification and analysis, while the work on methods for QoS assessment is limited to defining the connections with the traffic classification and proposing a general algorithm. We introduce the already known methods for traffic...

  7. Development of Computer Science Disciplines - A Social Network Analysis Approach

    CERN Document Server

    Pham, Manh Cuong; Jarke, Matthias

    2011-01-01

In contrast to many other scientific disciplines, computer science places considerable weight on conference publications. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss the paper with peers. Previous work on knowledge mapping focused on the map of all sciences or a particular domain based on the ISI-published JCR (Journal Citation Report). Although this data covers most of the important journals, it lacks computer science conference and workshop proceedings. That results in an imprecise and incomplete analysis of computer science knowledge. This paper presents an analysis of the computer science knowledge network constructed from all types of publications, aiming at providing a complete view of computer science research. Based on the combination of two important digital libraries (DBLP and CiteSeerX), we study the knowledge network created at journal/conference level using citation linkage, to identify the development of sub-disciplines. We investiga...

  8. computer networks

    Directory of Open Access Journals (Sweden)

    N. U. Ahmed

    2002-01-01

In this paper, we construct a new dynamic model for the Token Bucket (TB) algorithm used in computer networks and use a systems approach for its analysis. This model is then augmented by adding a dynamic model for a multiplexor at an access node where the TB exercises a policing function. In the model, traffic policing, multiplexing and network utilization are formally defined. Based on the model, we study such issues as quality of service (QoS), traffic sizing and network dimensioning. We also propose an algorithm using feedback control to improve QoS and network utilization. Applying MPEG video traces as the input traffic to the model, we verify the usefulness and effectiveness of our model.
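
A minimal sketch of the policing function the paper models, with illustrative rate, bucket-depth and arrival values (the paper itself develops a dynamic systems model and feedback control on top of this basic mechanism):

```python
# Minimal token-bucket policer: tokens accumulate at rate r up to depth b;
# a packet conforms if enough tokens are available, otherwise it is policed.
# Rate, depth, and the synthetic arrivals are illustrative assumptions.

def token_bucket(arrivals, rate, depth):
    """arrivals: list of (time, packet_size); returns (time, size, conforms) tuples."""
    tokens, last_t, out = depth, 0.0, []
    for t, size in arrivals:
        tokens = min(depth, tokens + rate * (t - last_t))  # refill since last arrival
        last_t = t
        if size <= tokens:
            tokens -= size
            out.append((t, size, True))
        else:
            out.append((t, size, False))   # non-conforming: drop or mark
    return out

if __name__ == "__main__":
    arrivals = [(0.0, 500), (0.1, 1500), (0.2, 1500), (1.5, 1500), (1.6, 500)]
    for t, size, ok in token_bucket(arrivals, rate=1000.0, depth=2000.0):
        print(f"t={t:4.1f}s size={size:5d}B -> {'conform' if ok else 'police'}")
```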

  9. Spatial Analysis Along Networks Statistical and Computational Methods

    CERN Document Server

    Okabe, Atsuyuki

    2012-01-01

    In the real world, there are numerous and various events that occur on and alongside networks, including the occurrence of traffic accidents on highways, the location of stores alongside roads, the incidence of crime on streets and the contamination along rivers. In order to carry out analyses of those events, the researcher needs to be familiar with a range of specific techniques. Spatial Analysis Along Networks provides a practical guide to the necessary statistical techniques and their computational implementation. Each chapter illustrates a specific technique, from Stochastic Point Process

  10. Computational tools for large-scale biological network analysis

    OpenAIRE

    Pinto, José Pedro Basto Gouveia Pereira

    2012-01-01

Doctoral thesis in Informatics. The surge of the field of Bioinformatics, among other contributions, provided biological researchers with powerful computational methods for processing and analysing the large amount of data coming from recent biological experimental techniques such as genome sequencing and other omics. Naturally, this led to the opening of new avenues of biological research among which is included the analysis of large-scale biological networks. The an...

  11. Applying DNA computation to intractable problems in social network analysis.

    Science.gov (United States)

    Chen, Rick C S; Yang, Stephen J H

    2010-09-01

    From ancient times to the present day, social networks have played an important role in the formation of various organizations for a range of social behaviors. As such, social networks inherently describe the complicated relationships between elements around the world. Based on mathematical graph theory, social network analysis (SNA) has been developed in and applied to various fields such as Web 2.0 for Web applications and product developments in industries, etc. However, some definitions of SNA, such as finding a clique, N-clique, N-clan, N-club and K-plex, are NP-complete problems, which are not easily solved via traditional computer architecture. These challenges have restricted the uses of SNA. This paper provides DNA-computing-based approaches with inherently high information density and massive parallelism. Using these approaches, we aim to solve the three primary problems of social networks: N-clique, N-clan, and N-club. Their accuracy and feasible time complexities discussed in the paper will demonstrate that DNA computing can be used to facilitate the development of SNA. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
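
For contrast with the DNA-computing approach, the sketch below shows the brute-force clique search a conventional computer would perform; its combinatorial cost is exactly what motivates the paper. The example graph and clique size are assumptions.

```python
# Brute-force search for cliques of size k in a small social graph -- the
# exponential-time baseline whose cost motivates DNA-computing approaches.
# The example graph is an illustrative assumption.
from itertools import combinations

import networkx as nx

G = nx.Graph([("a", "b"), ("a", "c"), ("b", "c"), ("b", "d"), ("c", "d"), ("d", "e")])

def cliques_of_size(G, k):
    """Return every k-subset of nodes whose members are all pairwise connected."""
    return [
        nodes for nodes in combinations(G.nodes, k)
        if all(G.has_edge(u, v) for u, v in combinations(nodes, 2))
    ]

print("3-cliques:", cliques_of_size(G, 3))   # [('a', 'b', 'c'), ('b', 'c', 'd')]
# The number of candidate subsets, C(n, k), explodes with network size -- the
# NP-completeness that the paper's massively parallel DNA computation targets.
```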

  12. Computational analysis of protein interaction networks for infectious diseases.

    Science.gov (United States)

    Pan, Archana; Lahiri, Chandrajit; Rajendiran, Anjana; Shanmugham, Buvaneswari

    2016-05-01

Infectious diseases caused by pathogens, including viruses, bacteria and parasites, pose a serious threat to human health worldwide. Frequent changes in the pattern of infection mechanisms and the emergence of multidrug-resistant strains among pathogens have weakened the current treatment regimen. This necessitates the development of new therapeutic interventions to prevent and control such diseases. To cater to the need, analysis of protein interaction networks (PINs) has gained importance as one of the promising strategies. The present review aims to discuss various computational approaches to analyse the PINs in the context of infectious diseases. Topology and modularity analysis of the network with their biological relevance, and the scenario to date of host-pathogen and intra-pathogenic protein interaction studies, are delineated. This would provide useful insights to the research community, thereby enabling them to design novel biomedicine against such infectious diseases. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  13. Applied and computational harmonic analysis on graphs and networks

    Science.gov (United States)

    Irion, Jeff; Saito, Naoki

    2015-09-01

    In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.
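
A minimal sketch of the graph Fourier viewpoint reviewed here: the Laplacian eigenpairs of an unweighted path graph (the special case where the frequency interpretation is exact) used to transform a signal. The graph and signal are illustrative assumptions.

```python
# Graph Laplacian eigenpairs as a graph "Fourier" basis. On an unweighted
# path graph the eigenvectors are DCT-like cosines, the special case in which
# the frequency interpretation discussed in the article is exact.
import numpy as np
import networkx as nx

G = nx.path_graph(8)                      # unweighted path with 8 nodes
L = nx.laplacian_matrix(G).toarray().astype(float)

eigvals, eigvecs = np.linalg.eigh(L)      # eigh: L is symmetric
print("graph frequencies (eigenvalues):", np.round(eigvals, 3))

# Expand a signal living on the graph in this basis (a graph Fourier transform).
signal = np.sin(np.linspace(0, np.pi, G.number_of_nodes()))
coeffs = eigvecs.T @ signal
print("graph Fourier coefficients:", np.round(coeffs, 3))
```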

  14. Spacelab data analysis using the space plasma computer analysis network (SCAN) system

    Science.gov (United States)

    Green, J. L.

    1984-01-01

The Space-plasma Computer Analysis Network (SCAN) currently connects a large number of U.S. Spacelab investigators into a common computer network. Used primarily by plasma physics researchers at present, SCAN provides access to Spacelab investigators in other areas of space science, to Spacelab and non-Spacelab correlative data bases, and to large Class VI computational facilities for modeling. SCAN links computers together at remote institutions used by space researchers, utilizing commercially available software for computer-to-computer communications. Started by NASA's Office of Space Science in mid-1980, SCAN presently contains ten system nodes located at major universities and space research laboratories, with fourteen new nodes projected for the near future. The Stanford University computer gateways allow SCAN users to connect onto the ARPANET and TELENET overseas networks.

  15. An Analysis of Attitudes toward Computer Networks and Internet Addiction.

    Science.gov (United States)

    Tsai, Chin-Chung; Lin, Sunny S. J.

    The purpose of this study was to explore the interplay between young people's attitudes toward computer networks and Internet addiction. After analyzing questionnaire responses of an initial sample of 615 Taiwanese high school students, 78 subjects, viewed as possible Internet addicts, were selected for further explorations. It was found that…

  16. Analysis of Intrusion Detection and Attack Proliferation in Computer Networks

    Science.gov (United States)

    Rangan, Prahalad; Knuth, Kevin H.

    2007-11-01

    One of the popular models to describe computer worm propagation is the Susceptible-Infected (SI) model [1]. This model of worm propagation has been implemented on the simulation toolkit Network Simulator v2 (ns-2) [2]. The ns-2 toolkit has the capability to simulate networks of different topologies. The topology studied in this work, however, is that of a simple star-topology. This work introduces our initial efforts to learn the relevant quantities describing an infection given synthetic data obtained from running the ns-2 worm model. We aim to use Bayesian methods to gain a predictive understanding of how computer infections spread in real world network topologies. This understanding would greatly reinforce dissemination of targeted immunization strategies, which may prevent real-world epidemics. The data consist of reports of infection from a subset of nodes in a large network during an attack. The infection equation obtained from [1] enables us to derive a likelihood function for the infection reports. This prior information can be used in the Bayesian framework to obtain the posterior probabilities for network properties of interest, such as the rate at which nodes contact one another (also referred to as contact rate or scan rate). Our preliminary analyses indicate an effective spread rate of only 1/5th the actual scan rate used for a star-type of topology. This implies that as the population becomes saturated with infected nodes the actual spread rate will become much less than the scan rate used in the simulation.
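
A minimal sketch of the Susceptible-Infected dynamics referenced as [1], integrated with forward Euler; the population size, spread rate and initial infection count are illustrative assumptions rather than the ns-2 simulation parameters.

```python
# Susceptible-Infected (SI) worm propagation: dI/dt = beta * I * (N - I) / N.
# Population size, rate (beta) and initial infection are illustrative
# assumptions, not the ns-2 parameters used in the paper.
import numpy as np

def si_curve(N=1000, beta=0.8, I0=1, t_end=30.0, dt=0.01):
    """Integrate the SI model with forward Euler; returns times and infected counts."""
    steps = int(t_end / dt)
    t = np.linspace(0.0, t_end, steps + 1)
    I = np.empty(steps + 1)
    I[0] = I0
    for k in range(steps):
        I[k + 1] = I[k] + dt * beta * I[k] * (N - I[k]) / N
    return t, I

N = 1000
t, I = si_curve(N=N)
for frac in (0.1, 0.5, 0.9):
    k = int(np.searchsorted(I, frac * N))   # I is monotonically increasing
    print(f"{int(frac * 100):2d}% of hosts infected after ~{t[k]:.1f} time units")
```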

  17. Computer-communication networks

    CERN Document Server

    Meditch, James S

    1983-01-01

Computer-Communication Networks presents a collection of articles whose focus is on modeling, analysis, design, and performance optimization. It discusses the problem of modeling the performance of local area networks under file transfer. It addresses the design of multi-hop, mobile-user radio networks. Some of the topics covered in the book are distributed packet switching queuing network design, investigations of communication switching techniques in computer networks, and minimum-hop flow assignment and routing subject to an average message delay constraint.

  18. Syntactic computations in the language network: Characterising dynamic network properties using representational similarity analysis

    Directory of Open Access Journals (Sweden)

    Lorraine Komisarjevsky Tyler

    2013-05-01

The core human capacity of syntactic analysis involves a left hemisphere network involving the left inferior frontal gyrus (LIFG) and the left posterior middle temporal gyrus (LpMTG) and the anatomical connections between them. Here we use MEG to determine the spatio-temporal properties of syntactic computations in this network. Listeners heard spoken sentences containing a local syntactic ambiguity (e.g. "…landing planes…"), at the offset of which they heard a disambiguating verb and decided whether it was an acceptable/unacceptable continuation of the sentence. We charted the time-course of processing and resolving syntactic ambiguity by measuring MEG responses from the onset of each word in the ambiguous phrase and the disambiguating word. We used representational similarity analysis (RSA) to characterize syntactic information represented in the LIFG and the LpMTG over time and to investigate their relationship to each other. Testing a variety of lexico-syntactic and ambiguity models against the MEG data, our results suggest early lexico-syntactic responses in the LpMTG and later effects of ambiguity in the LIFG, pointing to a clear differentiation in the functional roles of these two regions. Our results suggest the LpMTG represents and transmits lexical information to the LIFG, which responds to and resolves the ambiguity.

  19. Computer networks monitoring

    OpenAIRE

Antončič, Polona

    2012-01-01

The present thesis, entitled Computer Networks Monitoring, introduces the basics of computer networks, the purpose of retrieving data from networking devices, and software for system monitoring, together with the case of monitoring a real network with tens of network devices. Networks represent an important part of modern information technology and serve for the exchange of data and resources, which makes their flawless operation of crucial importance. Correct and efficient sys...

  20. Introduction to computer networking

    CERN Document Server

    Robertazzi, Thomas G

    2017-01-01

This book gives a broad look at both fundamental networking technology and new areas that support it and use it. It is a concise introduction to the most prominent, recent technological topics in computer networking. Topics include network technology such as wired and wireless networks, enabling technologies such as data centers, software defined networking, cloud and grid computing, and applications such as networks on chips, space networking and network security. The accessible writing style and non-mathematical treatment make this a useful book for the student, network and communications engineer, computer scientist and IT professional. • Features a concise, accessible treatment of computer networking, focusing on new technological topics; • Provides a non-mathematical introduction to networks in their most common forms today; • Includes new developments in switching, optical networks, WiFi, Bluetooth, LTE, 5G, and quantum cryptography.

  1. An Analysis of the Computer and Network Attack Taxonomy

    Science.gov (United States)

    2001-03-01

... method that the Air Force can use to help it classify Internet security attacks and incidents. This researcher concluded that the computer and ... organizations, responsible for the collection and distribution of Internet security information, do explicitly collect some, but not all, of the information useful as input into the taxonomy.

  2. The Use of Computer Networks in Data Gathering and Data Analysis.

    Science.gov (United States)

    Yost, Michael; Bremner, Fred

This document describes the review, analysis, and decision-making process that Trinity University, Texas, went through to develop the three-part computer network that they use to gather and analyze EEG (electroencephalography) and EKG (electrocardiogram) data. The data are gathered in the laboratory on a PDP-11/24 minicomputer. Once…

  3. Basics of Computer Networking

    CERN Document Server

    Robertazzi, Thomas

    2012-01-01

This Springer Brief, Basics of Computer Networking, provides a non-mathematical introduction to the world of networks. The book covers technology for both wired and wireless networks. Coverage includes transmission media, local area networks, wide area networks, and network security. It is written in a very accessible style for the interested layman by the author of a widely used textbook, drawing on many years of experience explaining concepts to the beginner.

  4. Computation and analysis of temporal betweenness in a knowledge mobilization network.

    Science.gov (United States)

    Afrasiabi Rad, Amir; Flocchini, Paola; Gaudet, Joanne

    2017-01-01

Highly dynamic social networks, where connectivity continuously changes in time, are becoming more and more pervasive. Knowledge mobilization, which refers to the use of knowledge toward the achievement of goals, is one of the many examples of dynamic social networks. Despite the wide use and extensive study of dynamic networks, their temporal component is often neglected in social network analysis, and statistical measures are usually performed on static network representations. As a result, measures of importance (like betweenness centrality) typically do not reveal the temporal role of the entities involved. Our goal is to help address this limitation by proposing a form of temporal betweenness measure (foremost betweenness). Our method is analytical as well as experimental: we design an algorithm to compute foremost betweenness, and we apply it to a case study to analyze a knowledge mobilization network. We propose a form of temporal betweenness measure (foremost betweenness) to analyze a knowledge mobilization network and we introduce, for the first time, an algorithm to compute exact foremost betweenness. We then show that this measure, which explicitly takes time into account, allows us to detect centrality roles that were completely hidden in the classical statistical analysis. In particular, we uncover nodes whose static centrality was negligible, but whose temporal role might instead be important to accelerate mobilization flow in the network. We also observe the reverse behavior by detecting nodes with high static centrality, whose role as temporal bridges is instead very low. In this paper, we focus on a form of temporal betweenness designed to detect accelerators in dynamic networks. By revealing potentially important temporal roles, this study is a first step toward a better understanding of the impact of time in social networks and opens the road to further investigation.
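
A minimal sketch of the earliest-arrival ("foremost") journey computation that foremost betweenness builds on; the time-stamped contact list is an assumption, and the paper's full measure additionally counts foremost journeys passing through each node.

```python
# Earliest-arrival ("foremost") journeys in a time-varying network: the
# building block of foremost betweenness. Contacts are (u, v, time) triples;
# the tiny contact list below is an illustrative assumption.
def foremost_arrival_times(contacts, source, t_start=0):
    """Single pass over time-ordered contacts; returns earliest arrival time per node."""
    arrival = {source: t_start}
    for u, v, t in sorted(contacts, key=lambda c: c[2]):
        for a, b in ((u, v), (v, u)):                 # undirected contact
            if a in arrival and arrival[a] <= t and t < arrival.get(b, float("inf")):
                arrival[b] = t
    return arrival

contacts = [("A", "B", 1), ("B", "C", 2), ("A", "C", 5), ("C", "D", 3), ("B", "D", 6)]
print(foremost_arrival_times(contacts, "A"))
# {'A': 0, 'B': 1, 'C': 2, 'D': 3}: D is first reached at t=3, and only via
# A->B->C->D, because the contacts (A,C,5) and (B,D,6) occur too late --
# a temporal bridging role that a static analysis of the same edges cannot see.
```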

  5. Computer network defense system

    Science.gov (United States)

    Urias, Vincent; Stout, William M. S.; Loverro, Caleb

    2017-08-22

A method and apparatus for protecting virtual machines. A computer system creates a copy of a group of the virtual machines in an operating network in a deception network to form a group of cloned virtual machines in the deception network when the group of the virtual machines is accessed by an adversary. The computer system creates an emulation of components from the operating network in the deception network. The components are accessible by the group of the cloned virtual machines as if the group of the cloned virtual machines was in the operating network. The computer system moves network connections for the group of the virtual machines in the operating network used by the adversary from the group of the virtual machines in the operating network to the group of the cloned virtual machines, enabling protection of the group of the virtual machines from actions performed by the adversary.

  6. Steady state analysis of Boolean molecular network models via model reduction and computational algebra.

    Science.gov (United States)

    Veliz-Cuba, Alan; Aguilar, Boris; Hinkelmann, Franziska; Laubenbacher, Reinhard

    2014-06-26

    A key problem in the analysis of mathematical models of molecular networks is the determination of their steady states. The present paper addresses this problem for Boolean network models, an increasingly popular modeling paradigm for networks lacking detailed kinetic information. For small models, the problem can be solved by exhaustive enumeration of all state transitions. But for larger models this is not feasible, since the size of the phase space grows exponentially with the dimension of the network. The dimension of published models is growing to over 100, so that efficient methods for steady state determination are essential. Several methods have been proposed for large networks, some of them heuristic. While these methods represent a substantial improvement in scalability over exhaustive enumeration, the problem for large networks is still unsolved in general. This paper presents an algorithm that consists of two main parts. The first is a graph theoretic reduction of the wiring diagram of the network, while preserving all information about steady states. The second part formulates the determination of all steady states of a Boolean network as a problem of finding all solutions to a system of polynomial equations over the finite number system with two elements. This problem can be solved with existing computer algebra software. This algorithm compares favorably with several existing algorithms for steady state determination. One advantage is that it is not heuristic or reliant on sampling, but rather determines algorithmically and exactly all steady states of a Boolean network. The code for the algorithm, as well as the test suite of benchmark networks, is available upon request from the corresponding author. The algorithm presented in this paper reliably determines all steady states of sparse Boolean networks with up to 1000 nodes. The algorithm is effective at analyzing virtually all published models even those of moderate connectivity. The problem for
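
A minimal sketch of the steady-state condition x = F(x) over the two-element number system for a toy three-node Boolean network, solved here by exhaustive enumeration; the update rules are assumptions, and the 2^n cost of enumeration is precisely the limitation the paper's reduction and computer-algebra approach addresses.

```python
# Steady states of a toy Boolean network: a state x is steady iff x = F(x).
# Exhaustive enumeration works here but scales as 2^n, which is the limitation
# the reduction + computer-algebra approach in the paper targets. The three
# update rules below are illustrative assumptions.
from itertools import product

def F(x):
    """One synchronous update of a 3-node Boolean network (AND/OR/NOT rules)."""
    x1, x2, x3 = x
    return (x1 and not x3,   # f1 = x1 AND (NOT x3)
            x1 or x3,        # f2 = x1 OR x3
            x2)              # f3 = x2

steady = [x for x in product([False, True], repeat=3) if F(x) == x]
print("steady states:", [tuple(int(b) for b in s) for s in steady])
# -> [(0, 0, 0), (0, 1, 1)]
```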

  7. Hyperswitch Communication Network Computer

    Science.gov (United States)

    Peterson, John C.; Chow, Edward T.; Priel, Moshe; Upchurch, Edwin T.

    1993-01-01

Hyperswitch Communications Network (HCN) computer is prototype multiple-processor computer being developed. Incorporates improved version of hyperswitch communication network described in "Hyperswitch Network For Hypercube Computer" (NPO-16905). Designed to support high-level software and expansion of itself. HCN computer is message-passing, multiple-instruction/multiple-data computer offering significant advantages over older single-processor and bus-based multiple-processor computers, with respect to price/performance ratio, reliability, availability, and manufacturing. Design of HCN operating-system software provides flexible computing environment accommodating both parallel and distributed processing. Also achieves balance among the following competing factors: performance in processing and communications, ease of use, and tolerance of (and recovery from) faults.

  8. Mapping university students’ epistemic framing of computational physics using network analysis

    Directory of Open Access Journals (Sweden)

    Madelen Bodin

    2012-04-01

Solving physics problems in university physics education using a computational approach requires knowledge and skills in several domains, for example, physics, mathematics, programming, and modeling. These competences are in turn related to students’ beliefs about the domains as well as about learning. These knowledge and belief components are referred to here as epistemic elements, which together represent the students’ epistemic framing of the situation. The purpose of this study was to investigate university physics students’ epistemic framing when solving and visualizing a physics problem using a particle-spring model system. Students’ epistemic framings are analyzed before and after the task using a network analysis approach on interview transcripts, producing visual representations as epistemic networks. The results show that students change their epistemic framing from a modeling task, with expectancies about learning programming, to a physics task, in which they are challenged to use physics principles and conservation laws in order to troubleshoot and understand their simulations. This implies that the task, even though it does not introduce any new physics, helps the students to develop a more coherent view of the importance of using physics principles in problem solving. The network analysis method used in this study is shown to give intelligible representations of the students’ epistemic framing and is proposed as a useful method of analysis of textual data.
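
One common way such epistemic networks are constructed (a generic sketch under assumed coding data, not necessarily the paper's exact procedure) is to link epistemic elements that are coded in the same interview excerpt and weight each edge by its co-occurrence count:

```python
# Generic epistemic-network construction from coded interview excerpts:
# elements coded together in one excerpt become linked nodes, edge weight =
# number of co-occurrences. The coded excerpts below are made-up placeholders.
from itertools import combinations

import networkx as nx

coded_excerpts = [
    ["programming", "modeling"],
    ["physics principles", "conservation laws", "modeling"],
    ["physics principles", "modeling"],
    ["programming", "learning beliefs"],
]

G = nx.Graph()
for codes in coded_excerpts:
    for a, b in combinations(sorted(set(codes)), 2):
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1
        else:
            G.add_edge(a, b, weight=1)

for a, b, d in G.edges(data=True):
    print(f"{a} -- {b}: co-occurs {d['weight']}x")
print("weighted degree per element:", dict(G.degree(weight="weight")))
```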

  9. A Multilayer Model of Computer Networks

    OpenAIRE

    Shchurov, Andrey A.

    2015-01-01

    The fundamental concept of applying the system methodology to network analysis declares that network architecture should take into account services and applications which this network provides and supports. This work introduces a formal model of computer networks on the basis of the hierarchical multilayer networks. In turn, individual layers are represented as multiplex networks. The concept of layered networks provides conditions of top-down consistency of the model. Next, we determined the...

  10. Network analysis and data mining in food science: the emergence of computational gastronomy

    Directory of Open Access Journals (Sweden)

    Ahnert Sebastian E

    2013-01-01

The rapidly growing body of publicly available data on food chemistry and food usage can be analysed using data mining and network analysis methods. Here we discuss how these approaches can yield new insights both into the sensory perception of food and the anthropology of culinary practice. We also show that this development is part of a larger trend. Over the past two decades large-scale data analysis has revolutionized the biological sciences, which have experienced an explosion of experimental data as a result of the advent of high-throughput technology. Large datasets are also changing research methodologies in the social sciences due to the data generated by mobile communication technology and online social networks. Even the arts and humanities are seeing the establishment of ‘digital humanities’ research centres in order to cope with the increasing digitization of literary and historical sources. We argue that food science is likely to be one of the next beneficiaries of large-scale data analysis, perhaps resulting in fields such as ‘computational gastronomy’.

  11. SCinet Architecture: Featured at the International Conference for High Performance Computing, Networking, Storage and Analysis 2016

    Energy Technology Data Exchange (ETDEWEB)

    Lyonnais, Marc; Smith, Matt; Mace, Kate P.

    2017-02-06

SCinet is the purpose-built network that operates during the International Conference for High Performance Computing, Networking, Storage and Analysis (Supercomputing, or SC). Created each year for the conference, SCinet brings to life a high-capacity network that supports applications and experiments that are a hallmark of the SC conference. The network links the convention center to research and commercial networks around the world. This resource serves as a platform for exhibitors to demonstrate the advanced computing resources of their home institutions and elsewhere by supporting a wide variety of applications. Volunteers from academia, government and industry work together to design and deliver the SCinet infrastructure. Industry vendors and carriers donate millions of dollars in equipment and services needed to build and support the local and wide area networks. Planning begins more than a year in advance of each SC conference and culminates in a high-intensity installation in the days leading up to the conference. The SCinet architecture for SC16 illustrates a dramatic increase in participation from the vendor community, particularly those that focus on network equipment. Software-Defined Networking (SDN) and Data Center Networking (DCN) are present in nearly all aspects of the design.

  12. Computer Networks and Globalization

    Directory of Open Access Journals (Sweden)

    J. Magliaro

    2007-07-01

Communication and information computer networks connect the world in ways that make globalization more natural and inequity more subtle. As educators, we look at these phenomena holistically, analyzing them from the realist’s view, thus exploring tensions, (in)equity and (in)justice, and from the idealist’s view, thus embracing connectivity, convergence and the development of a collective consciousness. In an increasingly market-driven world we find examples of openness and human generosity that are based on networks, specifically the Internet. After addressing open movements in publishing, the software industry and education, we describe the possibility of a dialectic equilibrium between globalization and indigenousness in view of ecologically designed future smart networks.

  13. Simulation and Noise Analysis of Multimedia Transmission in Optical CDMA Computer Networks

    Directory of Open Access Journals (Sweden)

    Nasaruddin Nasaruddin

    2013-09-01

This paper simulates and analyzes noise of multimedia transmission in a flexible optical code division multiple access (OCDMA) computer network with different quality of service (QoS) requirements. To achieve multimedia transmission in OCDMA, we have proposed strict variable-weight optical orthogonal codes (VW-OOCs), which can guarantee the smallest correlation value of one by the optimal design. In developing multimedia transmission for a computer network, a simulation tool is essential in analyzing the effectiveness of various transmissions of services. In this paper, implementation models are proposed to analyze the multimedia transmission in representative OCDMA computer networks by using MATLAB Simulink tools. Simulation results of the models are discussed, including spectrum outputs of transmitted signals, superimposed signals, received signals, and eye diagrams with and without noise. Using the proposed models, a multimedia OCDMA computer network using the strict VW-OOC is practically evaluated. Furthermore, system performance is also evaluated by considering avalanche photodiode (APD) noise and thermal noise. The results show that the system performance depends on code weight, received laser power, APD noise, and thermal noise, which should be considered as important parameters to design and implement multimedia transmission in OCDMA computer networks.

  14. Simulation and Noise Analysis of Multimedia Transmission in Optical CDMA Computer Networks

    Directory of Open Access Journals (Sweden)

    Nasaruddin

    2009-11-01

This paper simulates and analyzes noise of multimedia transmission in a flexible optical code division multiple access (OCDMA) computer network with different quality of service (QoS) requirements. To achieve multimedia transmission in OCDMA, we have proposed strict variable-weight optical orthogonal codes (VW-OOCs), which can guarantee the smallest correlation value of one by the optimal design. In developing multimedia transmission for a computer network, a simulation tool is essential in analyzing the effectiveness of various transmissions of services. In this paper, implementation models are proposed to analyze the multimedia transmission in representative OCDMA computer networks by using MATLAB Simulink tools. Simulation results of the models are discussed, including spectrum outputs of transmitted signals, superimposed signals, received signals, and eye diagrams with and without noise. Using the proposed models, a multimedia OCDMA computer network using the strict VW-OOC is practically evaluated. Furthermore, system performance is also evaluated by considering avalanche photodiode (APD) noise and thermal noise. The results show that the system performance depends on code weight, received laser power, APD noise, and thermal noise, which should be considered as important parameters to design and implement multimedia transmission in OCDMA computer networks.

  15. A comparative analysis on computational methods for fitting an ERGM to biological network data

    Directory of Open Access Journals (Sweden)

    Sudipta Saha

    2015-03-01

Exponential random graph models (ERGM) based on graph theory are useful in studying global biological network structure using its local properties. However, computational methods for fitting such models are sensitive to the type, structure and number of the local features of a network under study. In this paper, we compared computational methods for fitting an ERGM with local features of different types and structures. Two commonly used methods, the Markov Chain Monte Carlo Maximum Likelihood Estimation and the Maximum Pseudo Likelihood Estimation, are considered for estimating the coefficients of network attributes. We compared the estimates of the observed network to our randomly simulated network using both methods under the ERGM. The motivation was to ascertain the extent to which an observed network would deviate from a randomly simulated network if the physical numbers of attributes were approximately the same. Cut-off points of some common attributes of interest for different orders of nodes were determined through simulations. We applied our method to a known regulatory network database of Escherichia coli (E. coli).
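
A minimal sketch of the Maximum Pseudo Likelihood Estimation idea compared in the paper: each dyad's presence is regressed on its change statistics (here an edges term and a triangle term) with plain Newton iterations. The example graph and the two statistics are illustrative assumptions, not the paper's E. coli network or model terms.

```python
# Maximum Pseudo-Likelihood Estimation (MPLE) for a toy ERGM with "edges" and
# "triangles" terms: regress each dyad on its change statistics via logistic
# regression fitted with Newton-Raphson. Graph and terms are illustrative.
from itertools import combinations

import numpy as np
import networkx as nx

G = nx.karate_club_graph()
nodes = list(G.nodes)

X, y = [], []
for i, j in combinations(nodes, 2):
    shared = len(list(nx.common_neighbors(G, i, j)))
    X.append([1.0, shared])          # change stats: d(edges)=1, d(triangles)=shared
    y.append(1.0 if G.has_edge(i, j) else 0.0)
X, y = np.array(X), np.array(y)

theta = np.zeros(X.shape[1])
for _ in range(25):                  # Newton-Raphson on the pseudo-likelihood
    p = 1.0 / (1.0 + np.exp(-X @ theta))
    grad = X.T @ (y - p)
    hess = -(X * (p * (1 - p))[:, None]).T @ X
    theta -= np.linalg.solve(hess, grad)

print("MPLE estimates  edges: %.3f  triangles: %.3f" % tuple(theta))
```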

  16. Analysis of Various Computer System Monitoring and LCD Projector through the Network TCP/IP

    Directory of Open Access Journals (Sweden)

    Santoso Budijono

    2015-09-01

Many electronic devices have a network connection facility. Projectors today have network facilities to bolster customer satisfaction in everyday use. By using a device that can be controlled, the expected availability and reliability of the presentation system (computer and projector) can be maintained to keep its condition ready for presentation. Nevertheless, there are still projector devices that have no network facilities, so that additional equipment is necessary at an expensive price. Besides, controlling equipment in large quantities poses problems in timing and in the number of technicians needed to perform the controls. This study began with a study of the literature, searching for projectors that have LAN facilities and software to control them, and finding a number of computer control software packages with a focus on ease of use and affordability. The result of this research is a system which contains suggestions for the procurement of computer hardware, projector hardware and software, each of which can be controlled centrally from a distance.

  17. Classification of dried vegetables using computer image analysis and artificial neural networks

    Science.gov (United States)

    Koszela, K.; Łukomski, M.; Mueller, W.; Górna, K.; Okoń, P.; Boniecki, P.; Zaborowicz, M.; Wojcieszak, D.

    2017-07-01

    In the recent years, there has been a continuously increasing demand for vegetables and dried vegetables. This trend affects the growth of the dehydration industry in Poland helping to exploit excess production. More and more often dried vegetables are used in various sectors of the food industry, both due to their high nutritional qualities and changes in consumers' food preferences. As we observe an increase in consumer awareness regarding a healthy lifestyle and a boom in health food, there is also an increase in the consumption of such food, which means that the production and crop area can increase further. Among the dried vegetables, dried carrots play a strategic role due to their wide application range and high nutritional value. They contain high concentrations of carotene and sugar which is present in the form of crystals. Carrots are also the vegetables which are most often subjected to a wide range of dehydration processes; this makes it difficult to perform a reliable qualitative assessment and classification of this dried product. The many qualitative properties of dried carrots determining their positive or negative quality assessment include colour and shape. The aim of the research was to develop and implement the model of a computer system for the recognition and classification of freeze-dried, convection-dried and microwave vacuum dried products using the methods of computer image analysis and artificial neural networks.
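
A minimal sketch of the feature-based ANN classification pipeline described (colour and shape descriptors feeding a small multilayer perceptron); the synthetic features stand in for real image measurements and are assumptions, as is the use of scikit-learn's MLPClassifier.

```python
# Sketch of the image-features -> ANN classification pipeline: colour and
# shape descriptors per dried-carrot particle feed a small multilayer
# perceptron. The synthetic features below are assumptions standing in for
# real measurements of freeze-, convection- and microwave-vacuum-dried samples.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
classes = {0: (0.08, 1.2), 1: (0.05, 1.8), 2: (0.03, 1.5)}  # (mean hue, elongation)

X, y = [], []
for label, (hue, elong) in classes.items():
    n = 200
    feats = np.column_stack([
        rng.normal(hue, 0.01, n),     # colour feature: mean hue of the particle
        rng.normal(elong, 0.15, n),   # shape feature: length / width ratio
    ])
    X.append(feats)
    y.append(np.full(n, label))
X, y = np.vstack(X), np.concatenate(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", round(clf.score(X_te, y_te), 3))
```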

  18. Consensus computational network analysis for identifying candidate outer membrane proteins from Borrelia spirochetes.

    Science.gov (United States)

    Kenedy, Melisha R; Scott, Edgar J; Shrestha, Binu; Anand, Arvind; Iqbal, Henna; Radolf, Justin D; Dyer, David W; Akins, Darrin R

    2016-07-11

    Similar to Gram-negative organisms, Borrelia spirochetes are dual-membrane organisms with both an inner and outer membrane. Although the outer membrane contains integral membrane proteins, few of the borrelial outer membrane proteins (OMPs) have been identified and characterized to date. Therefore, we utilized a consensus computational network analysis to identify novel borrelial OMPs. Using a series of computer-based algorithms, we selected all protein-encoding sequences predicted to be OM-localized and/or to form β-barrels in the borrelial OM. Using this system, we identified 41 potential OMPs from B. burgdorferi and characterized three (BB0838, BB0405, and BB0406) to confirm that our computer-based methodology did, in fact, identify borrelial OMPs. Triton X-114 phase partitioning revealed that BB0838 is found in the detergent phase, which would be expected of a membrane protein. Proteolysis assays indicate that BB0838 is partially sensitive to both proteinase K and trypsin, further indicating that BB0838 is surface-exposed. Consistent with a prior study, we also confirmed that BB0405 is surface-exposed and associates with the borrelial OM. Furthermore, we have shown that BB0406, the product of a co-transcribed downstream gene, also encodes a novel, previously uncharacterized borrelial OMP. Interestingly, while BB0406 has several physicochemical properties consistent with it being an OMP, it was found to be resistant to surface proteolysis. Consistent with BB0405 and BB0406 being OMPs, both were found to be capable of incorporating into liposomes and exhibit pore-forming activity, suggesting that both proteins are porins. Lastly, we expanded our computational analysis to identify OMPs from other borrelial organisms, including both Lyme disease and relapsing fever spirochetes. Using a consensus computer algorithm, we generated a list of candidate OMPs for both Lyme disease and relapsing fever spirochetes and determined that three of the predicted B. burgdorferi

  19. Evolutionary Game Analysis of Competitive Information Dissemination on Social Networks: An Agent-Based Computational Approach

    Directory of Open Access Journals (Sweden)

    Qing Sun

    2015-01-01

Social networks are formed by individuals, in which personalities, utility functions, and interaction rules are made as close to reality as possible. Taking competitive product-related information as a case, we proposed a game-theoretic model for competitive information dissemination in social networks. The model is presented to explain how human factors impact competitive information dissemination, which is described as the dynamic of a coordination game, and players’ payoff is defined by a utility function. Then we design a computational system that integrates the agent, the evolutionary game, and the social network. The approach can help to visualize the evolution of the percentage of competitive information adoption and diffusion, grasp the dynamic evolution features of the information adoption game over time, and explore microlevel interactions among users in different network structures under various scenarios. We discuss several scenarios to analyze the influence of several factors on the dissemination of competitive information, ranging from the personality of individuals to the structure of networks.
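
A minimal sketch of the agent-based core of such a model: agents on a network repeatedly best-respond in a two-strategy coordination game. The payoff matrix, network generator, and synchronous update rule are illustrative assumptions, not the paper's exact specification.

```python
# Agent-based core of competitive information diffusion as a coordination
# game on a network: each agent adopts the product (A or B) that currently
# yields the higher payoff against its neighbours. Payoffs, network and
# synchronous best-response updating are illustrative assumptions.
import random

import networkx as nx

random.seed(1)
G = nx.watts_strogatz_graph(n=100, k=6, p=0.1, seed=1)
payoff = {("A", "A"): 1.0, ("B", "B"): 0.6, ("A", "B"): 0.0, ("B", "A"): 0.0}
state = {v: random.choice(["A", "B"]) for v in G}

for step in range(20):
    new_state = {}
    for v in G:
        gains = {s: sum(payoff[(s, state[u])] for u in G[v]) for s in ("A", "B")}
        new_state[v] = max(gains, key=gains.get)   # best response to neighbours
    state = new_state
    if step % 5 == 0:
        share_a = sum(1 for s in state.values() if s == "A") / G.number_of_nodes()
        print(f"step {step:2d}: share adopting A = {share_a:.2f}")
```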

  20. Computational analysis of cartilage implants based on an interpenetrated polymer network for tissue repairing.

    Science.gov (United States)

    Manzano, Sara; Poveda-Reyes, Sara; Ferrer, Gloria Gallego; Ochoa, Ignacio; Hamdy Doweidar, Mohamed

    2014-10-01

    Interpenetrated polymer networks (IPNs), composed by two independent polymeric networks that spatially interpenetrate, are considered as valuable systems to control permeability and mechanical properties of hydrogels for biomedical applications. Specifically, poly(ethyl acrylate) (PEA)-poly(2-hydroxyethyl acrylate) (PHEA) IPNs have been explored as good hydrogels for mimicking articular cartilage. These lattices are proposed as matrix implants in cartilage damaged areas to avoid the discontinuity in flow uptake preventing its deterioration. The permeability of these implants is a key parameter that influences their success, by affecting oxygen and nutrient transport and removing cellular waste products to healthy cartilage. Experimental try-and-error approaches are mostly used to optimize the composition of such structures. However, computational simulation may offer a more exhaustive tool to test and screen out biomaterials mimicking cartilage, avoiding expensive and time-consuming experimental tests. An accurate and efficient prediction of material's permeability and internal directionality and magnitude of the fluid flow could be highly useful when optimizing biomaterials design processes. Here we present a 3D computational model based on Sussman-Bathe hyperelastic material behaviour. A fluid structure analysis is performed with ADINA software, considering these materials as two phases composites where the solid part is saturated by the fluid. The model is able to simulate the behaviour of three non-biodegradable hydrogel compositions, where percentages of PEA and PHEA are varied. Specifically, the aim of this study is (i) to verify the validity of the Sussman-Bathe material model to simulate the response of the PEA-PHEA biomaterials; (ii) to predict the fluid flux and the permeability of the proposed IPN hydrogels and (iii) to study the material domains where the passage of nutrients and cellular waste products is reduced leading to an inadequate flux

  1. Computing networks from cluster to cloud computing

    CERN Document Server

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  2. Performance analysis and acceleration of explicit integration for large kinetic networks using batched GPU computations

    Energy Technology Data Exchange (ETDEWEB)

Shyles, Daniel [University of Tennessee (UT)]; Dongarra, Jack J. [University of Tennessee, Knoxville (UTK)]; Guidry, Mike W. [ORNL]; Tomov, Stanimire Z. [ORNL]; Billings, Jay Jay [ORNL]; Brock, Benjamin A. [ORNL]; Haidar Ahmad, Azzam A. [ORNL]

    2016-09-01

We demonstrate the systematic implementation of recently-developed fast explicit kinetic integration algorithms that efficiently solve N coupled ordinary differential equations (subject to initial conditions) on modern GPUs. We take representative test cases (Type Ia supernova explosions) and demonstrate two or more orders of magnitude increase in efficiency for solving such systems (of realistic thermonuclear networks coupled to fluid dynamics). This implies that important coupled, multiphysics problems in various scientific and technical disciplines that were intractable, or could be simulated only with highly schematic kinetic networks, are now computationally feasible. As examples of such applications we present the computational techniques developed for our ongoing deployment of these new methods on modern GPU accelerators. We show that, similarly to many other scientific applications ranging from national security to medical advances, the computation can be split into many independent computational tasks, each of relatively small size. As the size of each individual task does not provide sufficient parallelism for the underlying hardware, especially for accelerators, these tasks must be computed concurrently as a single routine, which we call a batched routine, in order to saturate the hardware with enough work.
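
A CPU-side sketch of the batching idea with NumPy: many small, independent kinetic systems advanced together in one vectorized explicit step, the structure that maps onto batched GPU routines. The toy two-species reversible reaction and step size are assumptions.

```python
# Batching idea behind the paper, sketched on CPU with NumPy: thousands of
# small, independent kinetic systems are advanced together as one vectorized
# explicit step -- the structure that maps onto batched GPU routines. The toy
# 2-species reversible reaction (A <-> B) is an illustrative assumption.
import numpy as np

batch, species = 10_000, 2
k_fwd = np.random.default_rng(0).uniform(0.5, 2.0, size=batch)   # per-network rates
k_rev = 0.3 * np.ones(batch)

y = np.zeros((batch, species))
y[:, 0] = 1.0                         # all mass starts in species A

dt, steps = 1e-3, 5000
for _ in range(steps):                # explicit Euler, one batched update per step
    flux = k_fwd * y[:, 0] - k_rev * y[:, 1]
    y[:, 0] -= dt * flux
    y[:, 1] += dt * flux

# Sanity check: at equilibrium B/A should approach k_fwd/k_rev in every network.
ratio = y[:, 1] / y[:, 0]
print("max relative deviation from analytic equilibrium:",
      float(np.max(np.abs(ratio - k_fwd / k_rev) / (k_fwd / k_rev))))
```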

  3. Markov Networks in Evolutionary Computation

    CERN Document Server

    Shakya, Siddhartha

    2012-01-01

Markov networks and other probabilistic graphical models have recently received an upsurge in attention from the evolutionary computation community, particularly in the area of Estimation of Distribution Algorithms (EDAs). EDAs have arisen as one of the most successful experiences in the application of machine learning methods in optimization, mainly due to their efficiency in solving complex real-world optimization problems and their suitability for theoretical analysis. This book focuses on the different steps involved in the conception, implementation and application of EDAs that use Markov networks, and undirected models in general. It can serve as a general introduction to EDAs but also covers an important current void in the study of these algorithms by explaining the specificities and benefits of modeling optimization problems by means of undirected probabilistic models. All major developments to date in the progressive introduction of Markov network-based EDAs are reviewed in the book. Hot current researc...

  4. An analysis of the structure and evolution of the scientific collaboration network of computer intelligence in games

    Science.gov (United States)

    Lara-Cabrera, R.; Cotta, C.; Fernández-Leiva, A. J.

    2014-02-01

    Games constitute a research domain that is attracting the interest of scientists from numerous disciplines. This is particularly true from the perspective of computational intelligence. In order to examine the growing importance of this area in the gaming domain, we present an analysis of the scientific collaboration network of researchers working on computational intelligence in games (CIG). This network has been constructed from bibliographical data obtained from the Digital Bibliography & Library Project (DBLP). We have analyzed from a temporal perspective several properties of the CIG network at the macroscopic, mesoscopic and microscopic levels, studying the large-scale structure, the growth mechanics, and collaboration patterns among other features. Overall, computational intelligence in games exhibits similarities with other collaboration networks such as for example a log-normal degree distribution and sub-linear preferential attachment for new authors. It also has distinctive features, e.g. the number of papers co-authored is exponentially distributed, the internal preferential attachment (new collaborations among existing authors) is linear, and fidelity rates (measured as the relative preference for publishing with previous collaborators) grow super-linearly. The macroscopic and mesoscopic evolution of the network indicates the field is very active and vibrant, but it is still at an early developmental stage. We have also analyzed communities and central nodes and how these are reflected in research topics, thus identifying active research subareas.
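
A minimal sketch of the kind of macroscopic measurement such a study performs: build the co-authorship graph from paper-author records and inspect its degree distribution. The records below are made-up placeholders for the DBLP data used in the paper.

```python
# Macroscopic measurement of a co-authorship network: build the graph from
# (paper -> authors) records and inspect the degree distribution, the kind of
# statistic the study fits (e.g. log-normal) on bibliographic data. The
# records below are made-up placeholders for real DBLP entries.
from collections import Counter
from itertools import combinations

import networkx as nx

papers = [
    ["Ana", "Bo", "Chen"],
    ["Ana", "Dee"],
    ["Bo", "Chen", "Dee", "Eli"],
    ["Eli", "Fay"],
    ["Ana", "Chen"],
]

G = nx.Graph()
for authors in papers:
    G.add_edges_from(combinations(authors, 2))   # every co-author pair is linked

degrees = [d for _, d in G.degree()]
print("degree histogram:", sorted(Counter(degrees).items()))
print("mean collaborators per author:", round(sum(degrees) / len(degrees), 2))
print("largest connected component size:",
      len(max(nx.connected_components(G), key=len)))
```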

  5. A Comparative Experimental Design and Performance Analysis of Snort-Based Intrusion Detection System in Practical Computer Networks

    Directory of Open Access Journals (Sweden)

    Imdadul Karim

    2017-02-01

As one of the most reliable technologies, a network intrusion detection system (NIDS) allows the monitoring of incoming and outgoing traffic to identify unauthorised usage and mishandling by attackers in computer network systems. To this end, this paper investigates the experimental performance of Snort-based NIDS (S-NIDS) in a practical network with the latest technology in various network scenarios, including high data speed and/or heavy traffic and/or large packet size. An effective testbed is designed based on Snort using different multi-core processors, e.g., i5 and i7, with different operating systems, e.g., Windows 7, Windows Server and Linux. Furthermore, considering an enterprise network consisting of multiple virtual local area networks (VLANs), a centralised parallel S-NIDS (CPS-NIDS) is proposed with the support of a centralised database server to deal with high data speed and heavy traffic. Experimental evaluation is carried out for each network configuration to evaluate the performance of the S-NIDS in different network scenarios as well as validating the effectiveness of the proposed CPS-NIDS. In particular, by analysing packet analysis efficiency, an improved performance of up to 10% is shown to be achieved with Linux over other operating systems, while up to 8% improved performance can be achieved with i7 over i5 processors.

  6. Conceptual metaphors in computer networking terminology ...

    African Journals Online (AJOL)

Conceptual metaphor theory (Lakoff & Johnson, 1980) is used as a basic framework for analysing and explaining the occurrence of metaphor in the terminology used by computer networking professionals in the information technology (IT) industry. An analysis of linguistic ...

  7. Computing and networking at JINR

    CERN Document Server

    Zaikin, N S; Strizh, T A

    2001-01-01

This paper describes the computing and networking facilities at the Joint Institute for Nuclear Research. The Joint Institute for Nuclear Research (JINR) is an international intergovernmental organization located in Dubna, a small town on the bank of the Volga river 120 km north of Moscow. At present JINR has 18 Member States. The Institute consists of 7 scientific Laboratories and some subdivisions. JINR has scientific cooperation with such scientific centres as CERN, FNAL, DESY, etc., and is equipped with powerful and fast computing facilities integrated into worldwide computer networks. The Laboratory of Information Technologies (LIT) is responsible for Computing and Networking at JINR. (5 refs).

  8. Genome-wide identification of specific oligonucleotides using artificial neural network and computational genomic analysis

    Directory of Open Access Journals (Sweden)

    Chen Jiun-Ching

    2007-05-01

Background: Genome-wide identification of specific oligonucleotides (oligos) is a computationally-intensive task and is a requirement for designing microarray probes, primers, and siRNAs. An artificial neural network (ANN) is a machine learning technique that can effectively process complex and high noise data. Here, ANNs are applied to process the unique subsequence distribution for prediction of specific oligos. Results: We present a novel and efficient algorithm, named the integration of ANN and BLAST (IAB) algorithm, to identify specific oligos. We establish the unique marker database for human and rat gene index databases using the hash table algorithm. We then create the input vectors, via the unique marker database, to train and test the ANN. The trained ANN predicted the specific oligos with high efficiency, and these oligos were subsequently verified by BLAST. To improve the prediction performance, the ANN over-fitting issue was avoided by early stopping with the best observed error and a k-fold validation was also applied. The performance of the IAB algorithm was about 5.2, 7.1, and 6.7 times faster than the BLAST search without ANN for experimental results of 70-mer, 50-mer, and 25-mer specific oligos, respectively. In addition, the results of polymerase chain reactions showed that the primers predicted by the IAB algorithm could specifically amplify the corresponding genes. The IAB algorithm has been integrated into a previously published comprehensive web server to support microarray analysis and genome-wide iterative enrichment analysis, through which users can identify a group of desired genes and then discover the specific oligos of these genes. Conclusion: The IAB algorithm has been developed to construct SpecificDB, a web server that provides a specific and valid oligo database of the probe, siRNA, and primer design for the human genome. We also demonstrate the ability of the IAB algorithm to predict specific oligos through

  9. Genome-wide identification of specific oligonucleotides using artificial neural network and computational genomic analysis.

    Science.gov (United States)

    Liu, Chun-Chi; Lin, Chin-Chung; Li, Ker-Chau; Chen, Wen-Shyen E; Chen, Jiun-Ching; Yang, Ming-Te; Yang, Pan-Chyr; Chang, Pei-Chun; Chen, Jeremy J W

    2007-05-22

    Genome-wide identification of specific oligonucleotides (oligos) is a computationally-intensive task and is a requirement for designing microarray probes, primers, and siRNAs. An artificial neural network (ANN) is a machine learning technique that can effectively process complex and high noise data. Here, ANNs are applied to process the unique subsequence distribution for prediction of specific oligos. We present a novel and efficient algorithm, named the integration of ANN and BLAST (IAB) algorithm, to identify specific oligos. We establish the unique marker database for human and rat gene index databases using the hash table algorithm. We then create the input vectors, via the unique marker database, to train and test the ANN. The trained ANN predicted the specific oligos with high efficiency, and these oligos were subsequently verified by BLAST. To improve the prediction performance, the ANN over-fitting issue was avoided by early stopping with the best observed error and a k-fold validation was also applied. The performance of the IAB algorithm was about 5.2, 7.1, and 6.7 times faster than the BLAST search without ANN for experimental results of 70-mer, 50-mer, and 25-mer specific oligos, respectively. In addition, the results of polymerase chain reactions showed that the primers predicted by the IAB algorithm could specifically amplify the corresponding genes. The IAB algorithm has been integrated into a previously published comprehensive web server to support microarray analysis and genome-wide iterative enrichment analysis, through which users can identify a group of desired genes and then discover the specific oligos of these genes. The IAB algorithm has been developed to construct SpecificDB, a web server that provides a specific and valid oligo database of the probe, siRNA, and primer design for the human genome. We also demonstrate the ability of the IAB algorithm to predict specific oligos through polymerase chain reaction experiments. Specific
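
    The ANN filtering step of the IAB algorithm can be approximated with off-the-shelf tooling. The sketch below trains a small multilayer perceptron with early stopping on synthetic feature vectors; the real pipeline derives its input vectors from the unique-marker database and verifies predictions with BLAST, neither of which is reproduced here, so every array and parameter in the snippet is an assumption.

```python
# Small MLP with early stopping as a stand-in for the ANN filtering step of the
# IAB algorithm; feature vectors and labels are synthetic, and the BLAST
# verification stage is not reproduced here.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_oligos, n_features = 2000, 16
X = rng.random((n_oligos, n_features))          # e.g., unique-marker-derived features
y = (X[:, :4].sum(axis=1) > 2.0).astype(int)    # synthetic "specific oligo" label

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(32,),
                    early_stopping=True,         # hold out part of the training data
                    validation_fraction=0.2,     # and stop at the best observed error
                    max_iter=500,
                    random_state=0)
clf.fit(X_train, y_train)

candidates = clf.predict(X_test)
print(f"test accuracy: {clf.score(X_test, y_test):.3f}")
print(f"{candidates.sum()} of {len(candidates)} oligos flagged for BLAST-style verification")
```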

  10. Experimental and computational methods for the analysis and modeling of signaling networks.

    Science.gov (United States)

    Gherardini, Pier Federico; Helmer-Citterich, Manuela

    2013-03-25

    External cues are processed and integrated by signal transduction networks that drive appropriate cellular responses. Characterizing these programs, as well as how their deregulation leads to disease, is crucial for our understanding of cell biology. The past ten years have witnessed a gradual increase in the number of molecular parameters that can be simultaneously measured in a sample. Moreover, our capacity to handle multiple samples in parallel has expanded, thus allowing a deeper profiling of cellular states under diverse experimental conditions. These technological advances have been complemented by the development of computational methods aimed at mining, analyzing and modeling these data. In this review we give a general overview of the most important experimental and computational techniques used in the field and describe several interesting applications of these methodologies. We conclude by highlighting the issues that we think will keep researchers in the field busy in the next few years. Copyright © 2012 Elsevier B.V. All rights reserved.

  11. Integrating network awareness in ATLAS distributed computing

    CERN Document Server

    De, K; The ATLAS collaboration; Klimentov, A; Maeno, T; Mckee, S; Nilsson, P; Petrosyan, A; Vukotic, I; Wenaus, T

    2014-01-01

    A crucial contributor to the success of the massively scaled global computing system that delivers the analysis needs of the LHC experiments is the networking infrastructure upon which the system is built. The experiments have been able to exploit excellent high-bandwidth networking in adapting their computing models for the most efficient utilization of resources. New advanced networking technologies now becoming available such as software defined networks hold the potential of further leveraging the network to optimize workflows and dataflows, through proactive control of the network fabric on the part of high level applications such as experiment workload management and data management systems. End to end monitoring of networking and data flow performance further allows applications to adapt based on real time conditions. We will describe efforts underway in ATLAS on integrating network awareness at the application level, particularly in workload management.

  12. Administration of remote computer networks

    OpenAIRE

    Fjeldbo, Stig Jarle

    2005-01-01

    Master's programme in network and system administration. Today's computer networks have gone from typically being small local area networks to wide area networks, where users and servers are interconnected with each other from all over the world. This development has gradually expanded as bandwidth has become higher and cheaper. But when dealing with network traffic, bandwidth is only one of the important properties. Delay, jitter and reliability are also important properties for t...

  13. Understanding and designing computer networks

    CERN Document Server

    King, Graham

    1995-01-01

    Understanding and Designing Computer Networks considers the ubiquitous nature of data networks, with particular reference to internetworking and the efficient management of all aspects of networked integrated data systems. In addition it looks at the next phase of networking developments; efficiency and security are covered in the sections dealing with data compression and data encryption; and future examples of network operations, such as network parallelism, are introduced.A comprehensive case study is used throughout the text to apply and illustrate new techniques and concepts as th

  14. Computer Networks A Systems Approach

    CERN Document Server

    Peterson, Larry L

    2011-01-01

    This best-selling and classic book teaches you the key principles of computer networks with examples drawn from the real world of network and protocol design. Using the Internet as the primary example, the authors explain various protocols and networking technologies. Their systems-oriented approach encourages you to think about how individual network components fit into a larger, complex system of interactions. Whatever your perspective, whether it be that of an application developer, network administrator, or a designer of network equipment or protocols, you will come away with a "big pictur

  15. Architecture of the rat nephron-arterial network: analysis with micro-computed tomography.

    Science.gov (United States)

    Marsh, Donald J; Postnov, Dmitry D; Rowland, Douglas J; Wexler, Anthony S; Sosnovtseva, Olga V; Holstein-Rathlou, Niels-Henrik

    2017-08-01

    Among solid organs, the kidney's vascular network stands out, because each nephron has two distinct capillary structures in series and because tubuloglomerular feedback, one of the mechanisms responsible for blood flow autoregulation, is specific to renal tubules. Tubuloglomerular feedback and the myogenic mechanism, acting jointly, autoregulate single-nephron blood flow. Each generates a self-sustained periodic oscillation and an oscillating electrical signal that propagates upstream along arterioles. Similar electrical signals from other nephrons interact, allowing nephron synchronization. Experimental measurements show synchronization over fields of a few nephrons; simulations based on a simplified network structure that could obscure complex interactions predict more widespread synchronization. To permit more realistic simulations, we made a cast of blood vessels in a rat kidney, performed micro-computed tomography at 2.5-μm resolution, and recorded three-dimensional coordinates of arteries, afferent arterioles, and glomeruli. Nonterminal branches of arcuate arteries form treelike structures requiring two to six bifurcations to reach terminal branches at the tree tops. Terminal arterial structures were either paired branches at the tops of the arterial trees, from which 52.6% of all afferent arterioles originated, or unpaired arteries not at the tree tops, yielding the other 22.9%; the other 24.5% originated directly from nonterminal arteries. Afferent arterioles near the corticomedullary boundary were longer than those farther away, suggesting that juxtamedullary nephrons have longer afferent arterioles. The distance separating origins of pairs of afferent arterioles varied randomly. The results suggest an irregular-network tree structure with vascular nodes, where arteriolar activity and local blood pressure interact. Copyright © 2017 the American Physiological Society.

  16. Risks in Networked Computer Systems

    OpenAIRE

    Klingsheim, André N.

    2008-01-01

    Networked computer systems yield great value to businesses and governments, but also create risks. The eight papers in this thesis highlight vulnerabilities in computer systems that lead to security and privacy risks. A broad range of systems is discussed in this thesis: Norwegian online banking systems, the Norwegian Automated Teller Machine (ATM) system during the 90's, mobile phones, web applications, and wireless networks. One paper also comments on legal risks to bank cust...

  17. Wireless Computational Networking Architectures

    Science.gov (United States)

    2013-12-01


  18. Computing with Spiking Neuron Networks

    NARCIS (Netherlands)

    H. Paugam-Moisy; S.M. Bohte (Sander); G. Rozenberg; T.H.W. Baeck (Thomas); J.N. Kok (Joost)

    2012-01-01

    Spiking Neuron Networks (SNNs) are often referred to as the 3rd generation of neural networks. Highly inspired by natural computing in the brain and recent advances in neuroscience, they derive their strength and interest from an accurate modeling of synaptic interactions

  19. Social networks a framework of computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2014-01-01

    This volume provides the audience with updated, in-depth and highly coherent material on the conceptually appealing and practically sound information technology of Computational Intelligence applied to the analysis, synthesis and evaluation of social networks. The volume involves studies devoted to key issues of social networks, including community structure detection in networks, online social networks, knowledge growth and evaluation, and diversity of collaboration mechanisms. The book engages a wealth of methods of Computational Intelligence along with well-known techniques of linear programming, Formal Concept Analysis, machine learning, and agent modeling. Human-centricity is of paramount relevance, and this facet manifests itself in many ways, including personalized semantics, trust metrics, and personal knowledge management, to highlight just a few of these aspects. The contributors to this volume report on various essential applications including cyber attack detection, building enterprise social network...

  20. Analysis of multiuser mixed RF/FSO relay networks for performance improvements in Cloud Computing-Based Radio Access Networks (CC-RANs)

    Science.gov (United States)

    Alimi, Isiaka A.; Monteiro, Paulo P.; Teixeira, António L.

    2017-11-01

    The key paths toward meeting the fifth generation (5G) network requirements are centralized processing and small-cell densification, implemented on cloud computing-based radio access networks (CC-RANs). The increasing recognition of CC-RANs can be attributed to their valuable features regarding system performance optimization and cost-effectiveness. Nevertheless, realizing the stringent requirements of the fronthaul that connects the network elements is highly demanding. In this paper, considering small-cell network architectures, we present multiuser mixed radio-frequency/free-space optical (RF/FSO) relay networks as feasible technologies for alleviating the stringent fronthaul requirements in CC-RANs. In this study, we use the end-to-end (e2e) outage probability, average symbol error probability (ASEP), and ergodic channel capacity as the performance metrics in our analysis. Simulation results show the suitability of deploying mixed RF/FSO schemes in real-life scenarios.

  1. Network performance analysis

    CERN Document Server

    Bonald, Thomas

    2013-01-01

    The book presents some key mathematical tools for the performance analysis of communication networks and computer systems. Communication networks and computer systems have become extremely complex. The statistical resource sharing induced by the random behavior of users and the underlying protocols and algorithms may affect Quality of Service. This book introduces the main results of queuing theory that are useful for analyzing the performance of these systems. These mathematical tools are key to the development of robust dimensioning rules and engineering methods. A number of examples i
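
    As a concrete instance of the queuing results such a text builds on, the snippet below evaluates the closed-form performance metrics of an M/M/1 queue for an assumed arrival and service rate; it is an illustrative calculation, not material taken from the book.

```python
# Illustrative M/M/1 queue metrics for an assumed arrival rate (lambda)
# and service rate (mu); the queue is stable only when lambda < mu.
def mm1_metrics(arrival_rate: float, service_rate: float) -> dict:
    if arrival_rate >= service_rate:
        raise ValueError("M/M/1 queue is unstable: arrival rate must be below service rate")
    rho = arrival_rate / service_rate          # server utilization
    l_q = rho ** 2 / (1 - rho)                 # mean number waiting in queue
    l_sys = rho / (1 - rho)                    # mean number in system
    w_sys = 1 / (service_rate - arrival_rate)  # mean time in system (Little's law)
    w_q = w_sys - 1 / service_rate             # mean waiting time in queue
    return {"utilization": rho, "Lq": l_q, "L": l_sys, "W": w_sys, "Wq": w_q}

if __name__ == "__main__":
    # e.g., 80 packets/s arriving at a link that serves 100 packets/s
    for name, value in mm1_metrics(80.0, 100.0).items():
        print(f"{name}: {value:.4f}")
```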

  2. Computer Network Security- The Challenges of Securing a Computer Network

    Science.gov (United States)

    Scotti, Vincent, Jr.

    2011-01-01

    This article is intended to give the reader an overall perspective on what it takes to design, implement, enforce and secure a computer network in the federal and corporate world in order to ensure the confidentiality, integrity and availability of information. While we will be giving you an overview of network design and security, this article will concentrate on the technology and human factors of securing a network and the challenges faced by those doing so. It will cover the large number of policies and the limits of technology and physical efforts to enforce such policies.

  3. Data Logistics in Network Computing

    CERN Multimedia

    CERN. Geneva; Marquina, Miguel Angel

    2005-01-01

    In distributed computing environments, performance is often dominated by the time that it takes to move data over a network. In the case of data-centric applications, or Data Grids, this problem of data movement becomes one of the overriding concerns. This talk describes techniques for improving data movement in Grid environments that we refer to as 'logistics.' We demonstrate that by using storage and cooperative forwarding 'in' the network, we can improve end to end throughput in many cases. Our approach offers clear performance benefits for high-bandwidth, high-latency networks. This talk will introduce the Logistical Session Layer (LSL) and provide experimental results from that system.

  4. Collective network for computer structures

    Energy Technology Data Exchange (ETDEWEB)

    Blumrich, Matthias A [Ridgefield, CT; Coteus, Paul W [Yorktown Heights, NY; Chen, Dong [Croton On Hudson, NY; Gara, Alan [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Hoenicke, Dirk [Ossining, NY; Takken, Todd E [Brewster, NY; Steinmacher-Burow, Burkhard D [Wernau, DE; Vranas, Pavlos M [Bedford Hills, NY

    2011-08-16

    A system and method for enabling high-speed, low-latency global collective communications among interconnected processing nodes. The global collective network optimally enables collective reduction operations to be performed during parallel algorithm operations executing in a computer structure having a plurality of the interconnected processing nodes. Router devices are included that interconnect the nodes of the network via links to facilitate performance of low-latency global processing operations at nodes of the virtual network and class structures. The global collective network may be configured to provide global barrier and interrupt functionality in an asynchronous or synchronized manner. When implemented in a massively-parallel supercomputing structure, the global collective network is physically and logically partitionable according to the needs of a processing algorithm.

  5. Optimal monitoring of computer networks

    Energy Technology Data Exchange (ETDEWEB)

    Fedorov, V.V.; Flanagan, D.

    1997-08-01

    The authors apply ideas from optimal design theory to the very specific area of monitoring large computer networks. The behavior of these networks is so complex and uncertain that it is quite natural to use the statistical methods of experimental design, which originated in such areas as biology, behavioral sciences and agriculture, where the random character of phenomena is a crucial component and systems are too complicated to be described by sophisticated deterministic models. They emphasize that only the first steps have been completed, and that relatively simple underlying concepts about network functions have been used. Their immediate goal is to initiate studies focused on developing efficient experimental design techniques which can be used by practitioners working with large networks operating and evolving in a random environment.

  6. Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan for DOE Office of Advanced Scientific Computing Research

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L

    2009-05-01

    Many complex systems of importance to the U.S. Department of Energy consist of networks of discrete components. Examples are cyber networks, such as the internet and local area networks over which nearly all DOE scientific, technical and administrative data must travel, the electric power grid, social networks whose behavior can drive energy demand, and biological networks such as genetic regulatory networks and metabolic networks. In spite of the importance of these complex networked systems to all aspects of DOE's operations, the scientific basis for understanding these systems lags seriously behind the strong foundations that exist for the 'physically-based' systems usually associated with DOE research programs that focus on such areas as climate modeling, fusion energy, high-energy and nuclear physics, nano-science, combustion, and astrophysics. DOE has a clear opportunity to develop a similarly strong scientific basis for understanding the structure and dynamics of networked systems by supporting a strong basic research program in this area. Such knowledge will provide a broad basis for, e.g., understanding and quantifying the efficacy of new security approaches for computer networks, improving the design of computer or communication networks to be more robust against failures or attacks, detecting potential catastrophic failure on the power grid and preventing or mitigating its effects, understanding how populations will respond to the availability of new energy sources or changes in energy policy, and detecting subtle vulnerabilities in large software systems to intentional attack. This white paper outlines plans for an aggressive new research program designed to accelerate the advancement of the scientific basis for complex networked systems of importance to the DOE. It will focus principally on four research areas: (1) understanding network structure, (2) understanding network dynamics, (3) predictive modeling and simulation for complex

  7. Computational analysis of complex systems: Applications to population dynamics and networks

    Science.gov (United States)

    Molnar, Ferenc

    In most complex evolving systems, we can often find a critical subset of the constituents that can initiate a global change in the entire system. For example, in complex networks, a critical subset of nodes can efficiently spread information, influence, or control dynamical processes over the entire network. Similarly, in nonlinear dynamics, we can locate key variables, or find the necessary parameters, to reach the attraction basin of a desired global state. In both cases, a fundamental goal is finding the ability to efficiently control these systems. We study two distinct complex systems in this dissertation, exploring these topics. First, we analyze a population dynamics model describing interactions of sex-structured population groups. Specifically, we analyze how a sex-linked genetic trait's ecological consequence (population survival or extinction) can be influenced by the presence of sex-specific cultural mortality traits, motivated by the desire to expand the theoretical understanding of the role of biased sex ratios in organisms. We analyze dynamics within a single population group, as well as between competing groups. We find that there is a finite range of sex ratio bias that can be maintained in stable equilibrium by sex-specific mortalities. We also find that the outcome of an invasion and the ensuing between-group competition depends not on larger equilibrium group densities, but on the higher allocation of sex-ratio genes. When we extend the model with diffusive dispersal, we find that a critical patch size for achieving positive growth only exists if the population expands into an empty environment. If a resident population is already present that can be exploited by the invading group, then any small seed of invader can advance from rarity, in the mean-field approximation, as long as the local competition dynamics favors the invader's survival. Most spatial models assume initial populations with a uniform distribution inside a finite patch; a

  8. Social Network Analysis with sna

    Directory of Open Access Journals (Sweden)

    Carter T. Butts

    2007-12-01

    Full Text Available Modern social network analysis---the analysis of relational data arising from social systems---is a computationally intensive area of research. Here, we provide an overview of a software package which provides support for a range of network analytic functionality within the R statistical computing environment. General categories of currently supported functionality are described, and brief examples of package syntax and usage are shown.
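
    The package described here is the R library sna; as a rough Python analogue of the same kind of functionality (an assumption for illustration, not the package itself), basic indices such as density and centrality can be computed with networkx on a tiny invented relational data set:

```python
# Rough Python analogue of basic social network analysis functionality
# (illustrative only; the package described in the record is the R "sna" library).
import networkx as nx

# A tiny hypothetical advice network among five actors.
edges = [
    ("ana", "bo"), ("bo", "cy"), ("cy", "ana"),
    ("cy", "dee"), ("dee", "ed"),
]
g = nx.Graph(edges)

print("density:", nx.density(g))
print("degree centrality:", nx.degree_centrality(g))
print("betweenness centrality:", nx.betweenness_centrality(g))
```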

  9. COMPUTATIONAL ANALYSIS BASED ON ARTIFICIAL NEURAL NETWORKS FOR AIDING IN DIAGNOSING OSTEOARTHRITIS OF THE LUMBAR SPINE.

    Science.gov (United States)

    Veronezi, Carlos Cassiano Denipotti; de Azevedo Simões, Priscyla Waleska Targino; Dos Santos, Robson Luiz; da Rocha, Edroaldo Lummertz; Meláo, Suelen; de Mattos, Merisandra Côrtes; Cechinel, Cristian

    2011-01-01

    To ascertain the advantages of applying artificial neural networks to recognize patterns in lumbar spine radiographs in order to aid in the process of diagnosing primary osteoarthritis. This was a cross-sectional descriptive analytical study with a quantitative approach and an emphasis on diagnosis. The training set was composed of images collected between January and July 2009 from patients who had undergone lateral-view digital radiography of the lumbar spine, which were provided by a radiology clinic located in the municipality of Criciúma (SC). Out of the total of 260 images gathered, those with distortions, those presenting pathological conditions that altered the architecture of the lumbar spine and those with patterns that were difficult to characterize were discarded, resulting in 206 images. The image database (n = 206) was then subdivided, resulting in 68 radiographs for the training stage, 68 images for tests and 70 for validation. A hybrid neural network based on Kohonen self-organizing maps and on Multilayer Perceptron networks was used. After 90 cycles, validation was carried out on the best results, achieving accuracy of 62.85%, sensitivity of 65.71% and specificity of 60%. Even though the effectiveness shown was moderate, this study is still innovative. The values show that the technique used has a promising future, pointing towards further studies on image and cycle processing methodology with a larger quantity of radiographs.
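
    The reported figures (accuracy 62.85%, sensitivity 65.71%, specificity 60%) follow from standard confusion-matrix definitions. The snippet below is a generic illustration of that calculation with invented counts, not the study's data or code.

```python
# Generic confusion-matrix metrics as used to evaluate a binary classifier
# (illustrative counts; not the study's data).
def classification_metrics(tp: int, tn: int, fp: int, fn: int) -> dict:
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "sensitivity": tp / (tp + fn),  # true positive rate
        "specificity": tn / (tn + fp),  # true negative rate
    }

print(classification_metrics(tp=23, tn=21, fp=14, fn=12))
```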

  10. Experimental and computational tools for analysis of signaling networks in primary cells

    DEFF Research Database (Denmark)

    Schoof, Erwin M; Linding, Rune

    2014-01-01

    , or differentiation. Protein phosphorylation events play a major role in this process and are often involved in fundamental biological and cellular processes such as protein-protein interactions, enzyme activity, and immune responses. Determining which kinases phosphorylate specific phospho sites poses a challenge......; this information is critical when trying to elucidate key proteins involved in specific cellular responses. Here, methods to generate high-quality quantitative phosphorylation data from cell lysates originating from primary cells, and how to analyze the generated data to construct quantitative signaling network...

  11. Computational Analysis of Photovoltaic Power Generation System for Optical Network Unit

    Science.gov (United States)

    Wakao, Shinji; Kusakabe, Taketoshi; Ohashi, Yoshiyuki; Suzuki, Naoyuki; Horii, Hirofumi

    We have developed a novel computational simulator for a photovoltaic (PV) power generation system as an efficient design tool. The developed simulator enables us to predict the complicated performance of the whole PV system with high accuracy at short time intervals and evaluate the system quantitatively. In this paper, using the developed simulator, we quantitatively estimate the changes of PV current, battery voltage, battery discharging current, commercial supply current, and the period of practicable stand-alone operation, etc., according to the reduction of PV/battery capacity. To confirm the validity of the developed simulator, we compare the calculated values with measured ones. The computational results precisely agree with the measured values under various PV/battery capacity conditions. Furthermore, altering PV/battery capacity in the simulation, we successfully predict the system performance, e.g., the period of practicable stand-alone operation, and evaluate the PV system parameters.
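
    In the same spirit, stand-alone operation of a PV/battery-fed unit can be explored with a toy hourly energy balance; all capacities and profiles below are invented for illustration, whereas the simulator described above models the system in far more detail and at shorter time intervals.

```python
# Toy hourly energy balance for a small PV + battery system feeding a constant load.
# All numbers are invented for illustration only.
PV_PEAK_W = 60.0          # assumed PV capacity
BATTERY_WH = 300.0        # assumed usable battery capacity
LOAD_W = 15.0             # constant load of the optical network unit

def pv_output(hour: int) -> float:
    # Crude daylight profile: generation only between 06:00 and 18:00.
    if 6 <= hour < 18:
        return PV_PEAK_W * (1 - abs(hour - 12) / 6)
    return 0.0

def simulate(hours: int = 72) -> None:
    soc = BATTERY_WH  # start fully charged
    for h in range(hours):
        balance = pv_output(h % 24) - LOAD_W          # Wh over one hour
        soc = min(BATTERY_WH, max(0.0, soc + balance))
        if soc == 0.0:
            print(f"battery depleted after {h + 1} h of stand-alone operation")
            return
    print(f"battery state of charge after {hours} h: {soc:.1f} Wh")

simulate()
```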

  12. AUTOMATIC CONTROL OF INTELLECTUAL RIGHTS IN THE GLOBAL COMPUTER NETWORKS

    OpenAIRE

    Anatoly P. Yakimaho; Victoriya V. Bessarabova

    2013-01-01

    The problems of using intellectual property objects in global computer networks are stated. The main attention is focused on ways of solving the problems that arise when working in computer networks. Legal problems of the information society are considered. An analysis of global computer networks as venues for organizing collective management of copyrights on a worldwide scale is carried out. Issues of creating a system for the automatic control of authors' property rights and ...

  13. Reverse engineering and analysis of genome-wide gene regulatory networks from gene expression profiles using high-performance computing.

    Science.gov (United States)

    Belcastro, Vincenzo; Gregoretti, Francesco; Siciliano, Velia; Santoro, Michele; D'Angelo, Giovanni; Oliva, Gennaro; di Bernardo, Diego

    2012-01-01

    Gene expression is a carefully regulated phenomenon in the cell. “Reverse-engineering” algorithms try to reconstruct the regulatory interactions among genes from genome-scale measurements of gene expression profiles (microarrays). Mammalian cells express tens of thousands of genes; hence, hundreds of gene expression profiles are necessary in order to have acceptable statistical evidence of interactions between genes. As the number of profiles to be analyzed increases, so do computational costs and memory requirements. In this work, we designed and developed a parallel computing algorithm to reverse-engineer genome-scale gene regulatory networks from thousands of gene expression profiles. The algorithm is based on computing pairwise mutual information between each gene pair. We successfully tested it to reverse-engineer the Mus musculus (mouse) gene regulatory network in liver from gene expression profiles collected from a public repository. A parallel hierarchical clustering algorithm was implemented to discover “communities” within the gene network. Network communities are enriched for genes involved in the same biological functions. The inferred network was used to identify two mitochondrial proteins.
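
    The core computation, pairwise mutual information between discretized expression profiles, can be illustrated in a few lines. The sketch below uses random data and a standard library routine; it is not the parallel implementation described in the record.

```python
# Toy pairwise mutual information between discretized expression profiles
# (random data for illustration; the work described uses a parallel
# implementation over thousands of real profiles).
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(0)
n_genes, n_profiles, n_bins = 5, 200, 8
expression = rng.normal(size=(n_genes, n_profiles))

# Discretize each gene's profile into equal-width bins.
binned = np.stack([np.digitize(x, np.histogram_bin_edges(x, bins=n_bins)[1:-1])
                   for x in expression])

mi = np.zeros((n_genes, n_genes))
for i in range(n_genes):
    for j in range(i + 1, n_genes):
        mi[i, j] = mi[j, i] = mutual_info_score(binned[i], binned[j])

print(np.round(mi, 3))
```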

  14. Multidimensional Analysis of Linguistic Networks

    Science.gov (United States)

    Araújo, Tanya; Banisch, Sven

    Network-based approaches play an increasingly important role in the analysis of data, even in systems in which a network representation is not immediately apparent. This is particularly true for linguistic networks, which are typically induced from a linguistic data set for which a network perspective is only one of several options for representation. Here we introduce a multidimensional framework for network construction and analysis with a special focus on linguistic networks. This framework is used to show that the higher the abstraction level of network induction, the harder the interpretation of the topological indicators used in network analysis. Several examples are provided, allowing for the comparison of different linguistic networks as well as with networks in other fields of application of network theory. The computation and the intelligibility of some statistical indicators frequently used in linguistic networks are discussed. This suggests that the field of linguistic networks, by applying statistical tools inspired by network studies in other domains, may, in its current state, make only a limited contribution to the development of linguistic theory.

  15. Introduction to Social Network Analysis

    Science.gov (United States)

    Zaphiris, Panayiotis; Ang, Chee Siang

    Social Network analysis focuses on patterns of relations between and among people, organizations, states, etc. It aims to describe networks of relations as fully as possible, identify prominent patterns in such networks, trace the flow of information through them, and discover what effects these relations and networks have on people and organizations. Social network analysis offers a very promising potential for analyzing human-human interactions in online communities (discussion boards, newsgroups, virtual organizations). This Tutorial provides an overview of this analytic technique and demonstrates how it can be used in Human Computer Interaction (HCI) research and practice, focusing especially on Computer Mediated Communication (CMC). This topic acquires particular importance these days, with the increasing popularity of social networking websites (e.g., youtube, myspace, MMORPGs etc.) and the research interest in studying them.

  16. Positioning of Tacrolimus for the Treatment of Diabetic Nephropathy Based on Computational Network Analysis.

    Science.gov (United States)

    Aschauer, Constantin; Perco, Paul; Heinzel, Andreas; Sunzenauer, Judith; Oberbauer, Rainer

    2017-01-01

    To evaluate tacrolimus as a therapeutic option for diabetic nephropathy (DN) based on molecular profile and network-based molecular model comparisons. We generated molecular models representing pathophysiological mechanisms of DN and the tacrolimus mechanism of action (MoA) based on literature-derived data and transcriptomics datasets. Shared enriched molecular pathways were identified based on both model datasets. A newly generated transcriptomics dataset studying the effect of tacrolimus on mesangial cells in vitro was added to identify mechanisms in DN pathophysiology. We searched the interference between the DN molecular model and the tacrolimus MoA molecular model for features already holding annotation evidence as diagnostic or prognostic biomarkers in the context of DN. Thirty-nine molecular features were shared between the DN molecular model, holding 252 molecular features, and the tacrolimus MoA molecular model, holding 209 molecular features, with six additional molecular features affected by tacrolimus in mesangial cells. Molecular pathways significantly affected by both molecular model sets included cytokine-cytokine receptor interactions, adherens junctions, TGF-beta signaling, MAPK signaling, and calcium signaling. Molecular features involved in inflammation and immune response contributing to DN progression were significantly downregulated by tacrolimus (e.g. tumor necrosis factor alpha (TNF), interleukin 4, or interleukin 10). On the other hand, pro-fibrotic stimuli detrimental to renal function were induced by tacrolimus, such as transforming growth factor beta 1 (TGFB1), endothelin 1 (EDN1), and type IV collagen alpha 1 (COL4A1). Patients with DN and elevated TNF levels might benefit from tacrolimus treatment regarding maintaining GFR and reducing inflammation. TGFB1 and EDN1 are proposed as monitoring markers to assess the degree of renal damage. Next to this stratification approach, the use of drug combinations consisting of tacrolimus in addition
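
    Pathway over-representation of the kind used to flag cytokine-cytokine receptor interaction, TGF-beta or MAPK signaling is commonly assessed with a hypergeometric test. The snippet below is a generic illustration of such a test; apart from the 252-feature model size quoted above, all counts are invented and it is not the authors' pipeline.

```python
# Generic pathway over-representation test (hypergeometric); counts other than
# the 252-feature model size are invented and do not come from the study.
from scipy.stats import hypergeom

background_genes = 20000   # genes in the annotation background (assumed)
pathway_genes = 150        # genes annotated to the pathway (assumed)
model_genes = 252          # features in the molecular model (e.g., the DN model)
overlap = 12               # model features falling in the pathway (assumed)

# P(X >= overlap) under sampling without replacement.
p_value = hypergeom.sf(overlap - 1, background_genes, pathway_genes, model_genes)
print(f"enrichment p-value: {p_value:.3e}")
```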

  17. Computational identification of surrogate genes for prostate cancer phases using machine learning and molecular network analysis.

    Science.gov (United States)

    Li, Rudong; Dong, Xiao; Ma, Chengcheng; Liu, Lei

    2014-08-23

    Prostate cancer is one of the most common malignant diseases and is characterized by heterogeneity in the clinical course. To date, there are no efficient morphologic features or genomic biomarkers that can characterize the phenotypes of the cancer, especially with regard to metastasis--the most adverse outcome. Searching for effective surrogate genes out of large quantities of gene expression data is a key to cancer phenotyping and/or understanding the molecular mechanisms underlying prostate cancer development. Using the maximum relevance minimum redundancy (mRMR) method on microarray data from normal tissues, primary tumors and metastatic tumors, we identified four genes that can optimally classify samples of different prostate cancer phases. Moreover, we constructed a molecular interaction network with existing bioinformatic resources and co-identified eight genes on the shortest paths among the mRMR-identified genes, which are potential co-acting factors of prostate cancer. Functional analyses show that molecular functions involved in cell communication, hormone-receptor mediated signaling, and transcription regulation play important roles in the development of prostate cancer. We conclude that the surrogate genes we have selected compose an effective classifier of prostate cancer phases, which corresponds to a minimum characterization of cancer phenotypes on the molecular level. Along with their molecular interaction partners, it is fair to assume that these genes may have important roles in prostate cancer development; in particular, the unreported genes may bring new insights for the understanding of the molecular mechanisms. Thus our results may serve as a candidate gene set for further functional studies.
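
    The shortest-path step, collecting genes that lie on paths between the mRMR-selected markers in an interaction network, can be sketched as follows; the toy network and gene names are invented and do not come from the study.

```python
# Toy illustration of collecting "co-acting" genes that lie on shortest paths
# between selected marker genes in a molecular interaction network.
# The network and gene names are invented; the study used real interaction data.
import itertools
import networkx as nx

interactions = [
    ("G1", "A"), ("A", "B"), ("B", "G2"),
    ("G2", "C"), ("C", "G3"), ("G1", "D"), ("D", "G4"),
]
network = nx.Graph(interactions)
selected = ["G1", "G2", "G3", "G4"]   # e.g., the mRMR-selected genes

co_acting = set()
for u, v in itertools.combinations(selected, 2):
    if nx.has_path(network, u, v):
        path = nx.shortest_path(network, u, v)
        co_acting.update(path[1:-1])  # keep only intermediate nodes

print("candidate co-acting genes:", sorted(co_acting - set(selected)))
```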

  18. Personal computer local networks report

    CERN Document Server

    1991-01-01

    Please note this is a Short Discount publication. Since the first microcomputer local networks of the late 1970's and early 80's, personal computer LANs have expanded in popularity, especially since the introduction of IBMs first PC in 1981. The late 1980s has seen a maturing in the industry with only a few vendors maintaining a large share of the market. This report is intended to give the reader a thorough understanding of the technology used to build these systems ... from cable to chips ... to ... protocols to servers. The report also fully defines PC LANs and the marketplace, with in-

  19. Terminal-oriented computer-communication networks.

    Science.gov (United States)

    Schwartz, M.; Boorstyn, R. R.; Pickholtz, R. L.

    1972-01-01

    Four examples of currently operating computer-communication networks are described in this tutorial paper. They include the TYMNET network, the GE Information Services network, the NASDAQ over-the-counter stock-quotation system, and the Computer Sciences Infonet. These networks all use programmable concentrators for combining a multiplicity of terminals. Included in the discussion for each network is a description of the overall network structure, the handling and transmission of messages, communication requirements, routing and reliability considerations where applicable, operating data and design specifications where available, and unique design features in the area of computer communications.

  20. Analysis of a Model for Computer Virus Transmission

    OpenAIRE

    Qin, Peng

    2015-01-01

    Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers into the network and the removal of old computers from the network are considered. Meanwhile, the computers on the network are equipped with antivirus software. The computer virus model is established. Through the analysis of the model, disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our t...

  1. Computer networks ISE a systems approach

    CERN Document Server

    Peterson, Larry L

    2007-01-01

    Computer Networks, 4E is the only introductory computer networking book written by authors who have had first-hand experience with many of the protocols discussed in the book, who have actually designed some of them as well, and who are still actively designing the computer networks today. This newly revised edition continues to provide an enduring, practical understanding of networks and their building blocks through rich, example-based instruction. The authors' focus is on the why of network design, not just the specifications comprising today's systems but how key technologies and p

  2. Computer Network Defense Through Radial Wave Functions

    OpenAIRE

    Malloy, Ian

    2016-01-01

    The purpose of this research was to synthesize basic and fundamental findings in quantum computing, as applied to the attack and defense of conventional computer networks. The concept focuses on the use of radio waves as a shield for, and an attack against, traditional computers. A logic bomb is analogous to a landmine in a computer network, and if one were implemented as a non-trivial mitigation, it would aid computer network defense. As has been seen in kinetic warfare, the use of landmines has be...

  3. Statistical analysis and definition of blockages-prediction formulae for the wastewater network of Oslo by evolutionary computing.

    Science.gov (United States)

    Ugarelli, Rita; Kristensen, Stig Morten; Røstum, Jon; Saegrov, Sveinung; Di Federico, Vittorio

    2009-01-01

    Oslo Vann og Avløpsetaten (Oslo VAV), the water/wastewater utility in the Norwegian capital city of Oslo, is assessing future strategies for the selection of the most reliable materials for wastewater networks, taking into account not only the technical performance of the materials but also their performance with regard to the operational condition of the system. The research project undertaken by SINTEF Group, the largest research organisation in Scandinavia, NTNU (Norges Teknisk-Naturvitenskapelige Universitet) and Oslo VAV adopts several approaches to understand the reasons for failures that may impact flow capacity, by analysing historical data for blockages in Oslo. The aim of the study was to understand whether there is a relationship between the performance of the pipeline and a number of specific attributes such as age, material and diameter, to name a few. This paper presents the characteristics of the available data set and discusses the results obtained by following two different approaches: a traditional statistical analysis, segregating the pipes into classes each with the same explanatory variables, and an Evolutionary Polynomial Regression (EPR) model, developed by the Technical University of Bari and the University of Exeter, to identify the possible influence of pipe attributes on the total number of predicted blockages in a period of time. Starting from a detailed analysis of the available data for the blockage events, the most important variables are identified and a classification scheme is adopted. From the statistical analysis, it can be stated that age, size and function do seem to have a marked influence on the proneness of a pipeline to blockages, but, for the reduced sample available, it is difficult to say which variable is the most influential. Looking at the total number of blockages, the oldest class seems to be the most prone to blockages, but looking at blockage rates (number of blockages per km per year), it is the youngest class that shows the highest blockage rate
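
    The blockage-rate comparison (blockages per km per year, per pipe class) is straightforward to reproduce on any inventory of pipes and recorded blockages. The sketch below uses invented records and class names, not the Oslo data.

```python
# Blockage rate (blockages per km per year) per pipe class, on invented records;
# the actual study used Oslo VAV's historical blockage data.
from collections import defaultdict

OBSERVATION_YEARS = 10.0

# (pipe class, length in km, recorded blockages over the observation period)
pipes = [
    ("concrete_pre1940", 4.2, 31),
    ("concrete_pre1940", 3.1, 18),
    ("concrete_post1940", 6.0, 22),
    ("pvc", 5.5, 9),
]

length_km = defaultdict(float)
blockages = defaultdict(int)
for pipe_class, km, n_block in pipes:
    length_km[pipe_class] += km
    blockages[pipe_class] += n_block

for pipe_class in sorted(length_km):
    rate = blockages[pipe_class] / (length_km[pipe_class] * OBSERVATION_YEARS)
    print(f"{pipe_class}: {rate:.2f} blockages per km per year")
```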

  4. Computer network and knowledge sharing. Computer network to chishiki kyoyu

    Energy Technology Data Exchange (ETDEWEB)

    Yoshimura, S. (The University of Tokyo, Tokyo (Japan))

    1991-10-20

    The information system has changed from the online database as a simple form of knowledge sharing, used in the days when devices were expensive, to dialogue-type approaches as a result of TSS advancement. This paper describes the advantages of and methods for utilizing personal computer communications from the standpoint of a person engaged in chemistry education. Electronic mail has a number of advantages: you can reach a person as immediately as by telephone but need not interrupt the receiver's work, and it is easier than writing a letter. The electronic bulletin board in particular has the great practical benefit that ''someone who happens to know it can answer''. The Japan Chemical Society has opened the ''Square of Chemistry'' on NIFTY Serve. Although the Society provides information, it is important that the participants themselves actively make proposals and provide topics. Such a network is expanding to a worldwide scale.

  5. 3rd International Conference on Network Analysis

    CERN Document Server

    Kalyagin, Valery; Pardalos, Panos

    2014-01-01

    This volume compiles the major results of conference participants from the "Third International Conference in Network Analysis" held at the Higher School of Economics, Nizhny Novgorod in May 2013, with the aim to initiate further joint research among different groups. The contributions in this book cover a broad range of topics relevant to the theory and practice of network analysis, including the reliability of complex networks, software, theory, methodology, and applications.  Network analysis has become a major research topic over the last several years. The broad range of applications that can be described and analyzed by means of a network has brought together researchers, practitioners from numerous fields such as operations research, computer science, transportation, energy, biomedicine, computational neuroscience and social sciences. In addition, new approaches and computer environments such as parallel computing, grid computing, cloud computing, and quantum computing have helped to solve large scale...

  6. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  7. Computational Analysis of Molecular Interaction Networks Underlying Change of HIV-1 Resistance to Selected Reverse Transcriptase Inhibitors.

    Science.gov (United States)

    Kierczak, Marcin; Dramiński, Michał; Koronacki, Jacek; Komorowski, Jan

    2010-12-12

    Despite more than two decades of research, HIV resistance to drugs remains a serious obstacle in developing efficient AIDS treatments. Several computational methods have been developed to predict the resistance level from the sequence of viral proteins such as reverse transcriptase (RT) or protease. These methods, while powerful and accurate, give very little insight into the molecular interactions that underlie the acquisition of drug resistance/hypersusceptibility. Here, we attempt to fill this gap by using our Monte Carlo feature selection and interdependency discovery method (MCFS-ID) to elucidate molecular interaction networks that characterize viral strains with altered drug resistance levels. We analyzed a number of HIV-1 RT sequences annotated with drug resistance level using the MCFS-ID method. This let us derive interdependency networks that characterize the change of drug resistance to six selected RT inhibitors: Abacavir, Lamivudine, Stavudine, Zidovudine, Tenofovir and Nevirapine. The networks consider interdependencies at the level of the physicochemical properties of mutating amino acids, e.g., polarity. We mapped each network onto the 3D structure of RT in an attempt to understand the molecular meaning of the interacting pairs. The discovered interactions describe several known drug resistance mechanisms and, importantly, some previously unidentified ones. Our approach can be easily applied to a whole range of problems from the domain of protein engineering. A portable Java implementation of our MCFS-ID method is freely available for academic users and can be obtained at: http://www.ipipan.eu/staff/m.draminski/software.htm.

  8. Mobile Agents in Networking and Distributed Computing

    CERN Document Server

    Cao, Jiannong

    2012-01-01

    The book focuses on mobile agents, which are computer programs that can autonomously migrate between network sites. This text introduces the concepts and principles of mobile agents, provides an overview of mobile agent technology, and focuses on applications in networking and distributed computing.

  9. Automated classification of computer network attacks

    CSIR Research Space (South Africa)

    Van Heerden, R

    2013-11-01

    Full Text Available In this paper we demonstrate how an automated reasoner, HermiT, is used to classify instances of computer network based attacks in conjunction with a network attack ontology. The ontology describes different types of network attacks through classes...

  10. Thermodynamic analysis of computed pathways integrated into the metabolic networks of E. coli and Synechocystis reveals contrasting expansion potential.

    Science.gov (United States)

    Asplund-Samuelsson, Johannes; Janasch, Markus; Hudson, Elton P

    2017-12-23

    Introducing biosynthetic pathways into an organism is both reliant on and challenged by endogenous biochemistry. Here we compared the expansion potential of the metabolic network in the photoautotroph Synechocystis with that of the heterotroph E. coli using the novel workflow POPPY (Prospecting Optimal Pathways with PYthon). First, E. coli and Synechocystis metabolomic and fluxomic data were combined with metabolic models to identify thermodynamic constraints on metabolite concentrations (NET analysis). Then, thousands of automatically constructed pathways were placed within each network and subjected to a network-embedded variant of the max-min driving force analysis (NEM). We found that the networks had different capabilities for imparting thermodynamic driving forces toward certain compounds. Key metabolites were constrained differently in Synechocystis due to opposing flux directions in glycolysis and carbon fixation, the forked tri-carboxylic acid cycle, and photorespiration. Furthermore, the lysine biosynthesis pathway in Synechocystis was identified as thermodynamically constrained, impacting both endogenous and heterologous reactions through low 2-oxoglutarate levels. Our study also identified important yet poorly covered areas in existing metabolomics data and provides a reference for future thermodynamics-based engineering in Synechocystis and beyond. The POPPY methodology represents a step in making optimal pathway-host matches, which is likely to become important as the practical range of host organisms is diversified. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
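
    The driving force underlying both the NET and max-min driving force analyses is the transformed reaction Gibbs energy, ΔG' = ΔG°' + RT ln Q. The snippet below evaluates a best-case driving force for a single assumed reaction under assumed concentration bounds; it is a much-simplified stand-in for the network-embedded optimization performed by POPPY.

```python
# Best-case thermodynamic driving force for a single reaction A -> B under
# assumed metabolite concentration bounds (a simplified stand-in for the
# network-embedded max-min driving force optimization described above).
import math

R = 8.314e-3      # kJ/(mol*K)
T = 310.15        # K
DG0_PRIME = 5.0   # assumed standard transformed Gibbs energy, kJ/mol

# Concentration bounds (M); the driving force is maximized by keeping the
# substrate high and the product low.
A_MAX, B_MIN = 1e-2, 1e-6

ln_q = math.log(B_MIN / A_MAX)            # reaction quotient at the best case
dg_prime = DG0_PRIME + R * T * ln_q       # kJ/mol
driving_force = -dg_prime                 # positive means thermodynamically feasible

print(f"best-case dG' = {dg_prime:.2f} kJ/mol, driving force = {driving_force:.2f} kJ/mol")
```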

  11. Network Management of the SPLICE Computer Network.

    Science.gov (United States)

    1982-12-01

    ... and the Lawrence Livermore National Laboratory Octopus Network [Ref. 24]. Additionally, the ... Distributed Network Control Systems 200 and 330 ...

  12. Computation, cryptography, and network security

    CERN Document Server

    Rassias, Michael

    2015-01-01

    Analysis, assessment, and data management are core competencies for operations research analysts. This volume addresses a number of issues and developed methods for improving those skills. It is an outgrowth of a conference held in April 2013 at the Hellenic Military Academy, and brings together a broad variety of mathematical methods and theories with several applications. It discusses directions and pursuits of scientists that pertain to engineering sciences. It also presents the theoretical background required for algorithms and techniques applied to a large variety of concrete problems. A number of open questions as well as new future areas are also highlighted. This book will appeal to operations research analysts, engineers, community decision makers, academics, the military community, practitioners sharing the current “state-of-the-art,” and analysts from coalition partners. Topics covered include Operations Research, Games and Control Theory, Computational Number Theory and Information Securi...

  13. Computational network design from functional specifications

    KAUST Repository

    Peng, Chi Han

    2016-07-11

    Connectivity and layout of underlying networks largely determine agent behavior and usage in many environments. For example, transportation networks determine the flow of traffic in a neighborhood, whereas building floorplans determine the flow of people in a workspace. Designing such networks from scratch is challenging as even local network changes can have large global effects. We investigate how to computationally create networks starting from only high-level functional specifications. Such specifications can be in the form of network density, travel time versus network length, traffic type, destination location, etc. We propose an integer programming-based approach that guarantees that the resultant networks are valid by fulfilling all the specified hard constraints and that they score favorably in terms of the objective function. We evaluate our algorithm in two different design settings, street layout and floorplans to demonstrate that diverse networks can emerge purely from high-level functional specifications.
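
    A heavily reduced instance of such an integer program, choosing street segments that form a shortest route between two required locations, can be written down in a few lines; the candidate network, lengths and the use of the PuLP package with its bundled CBC solver are all assumptions for illustration, not the authors' formulation.

```python
# Toy integer program: pick street segments forming a shortest route from
# "home" to "work" out of a candidate network (a much-simplified stand-in for
# the functional-specification-driven formulation described above).
from pulp import LpProblem, LpVariable, LpMinimize, LpBinary, lpSum, value

edges = {("home", "a"): 1.0, ("a", "work"): 2.0,
         ("home", "b"): 2.0, ("b", "work"): 1.0, ("a", "b"): 1.0}
arcs = {}
for (u, v), w in edges.items():      # allow traversal in both directions
    arcs[(u, v)] = w
    arcs[(v, u)] = w
nodes = {n for arc in arcs for n in arc}
source, sink = "home", "work"

prob = LpProblem("route_design", LpMinimize)
use = {arc: LpVariable(f"use_{arc[0]}_{arc[1]}", cat=LpBinary) for arc in arcs}
prob += lpSum(w * use[arc] for arc, w in arcs.items())       # minimize total length

for n in nodes:                                              # flow conservation
    out_flow = lpSum(use[(u, v)] for (u, v) in arcs if u == n)
    in_flow = lpSum(use[(u, v)] for (u, v) in arcs if v == n)
    prob += out_flow - in_flow == (1 if n == source else -1 if n == sink else 0)

prob.solve()
chosen = [arc for arc in arcs if use[arc].value() == 1]
print("selected segments:", chosen, "total length:", value(prob.objective))
```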

  14. Reliable Interconnection Networks for Parallel Computers

    Science.gov (United States)

    1991-10-01

    Technical Report 1294: Reliable Interconnection Networks for Parallel Computers. Funding numbers: N00014-80-C-0622, N00014-85-K-0124, N00014-91-J-1698. Author: Larry ... Subject terms: networks, fault tolerance, parallel computers, reliable routers. 78 pages.

  15. Parallel computing and networking; Heiretsu keisanki to network

    Energy Technology Data Exchange (ETDEWEB)

    Asakawa, E.; Tsuru, T. [Japan National Oil Corp., Tokyo (Japan); Matsuoka, T. [Japan Petroleum Exploration Co. Ltd., Tokyo (Japan)

    1996-05-01

    This paper describes trends in the parallel computers used in geophysical exploration. Around 1993 was the early period when parallel computers began to be used for geophysical exploration. In those days these computers were classified mainly as MIMD (multiple instruction stream, multiple data stream), SIMD (single instruction stream, multiple data stream) and the like. Parallel computers were presented at the 1994 meeting of the Geophysical Exploration Society as a `high precision imaging technology`. Concerning libraries for parallel computers, there was a shift to PVM (parallel virtual machine) in 1993 and to MPI (message passing interface) in 1995. In addition, a FORTRAN90 compiler was released with support for data-parallel and vector computers. In 1993, the networks used were Ethernet, FDDI, CDDI and HIPPI. In 1995, OC-3 products based on ATM began to spread. However, ATM remains an inter-office high-speed network because ATM service has not yet spread to the public network. 1 ref.

  16. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Full Text Available Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers into the network and the removal of old computers from the network are considered. Meanwhile, the computers on the network are equipped with antivirus software. The computer virus model is established. Through the analysis of the model, disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis to control the spread of computer viruses.
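
    A compartment model of this kind can be explored numerically in a few lines. The sketch below integrates a generic SIR-style virus-spread model with made-up parameters; it is not the specific model analysed in the paper.

```python
# Generic SIR-style computer-virus spread model with made-up parameters;
# not the specific model analysed in the paper.
import numpy as np
from scipy.integrate import odeint

beta = 0.3    # infection rate per contact
gamma = 0.1   # cure rate (antivirus cleaning)
mu = 0.02     # rate of computers joining and leaving the network

def virus_model(y, t):
    s, i, r = y
    n = s + i + r
    ds = mu * n - beta * s * i / n - mu * s
    di = beta * s * i / n - gamma * i - mu * i
    dr = gamma * i - mu * r
    return [ds, di, dr]

t = np.linspace(0, 200, 2001)
solution = odeint(virus_model, y0=[990.0, 10.0, 0.0], t=t)
print("infected computers at t=200:", round(solution[-1, 1], 1))

# Basic reproduction number for this toy model; the virus dies out when R0 < 1.
r0 = beta / (gamma + mu)
print("R0 =", round(r0, 2))
```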

  17. Gene identification for risk of relapse in stage I lung adenocarcinoma patients: a combined methodology of gene expression profiling and computational gene network analysis.

    Science.gov (United States)

    Ludovini, Vienna; Bianconi, Fortunato; Siggillino, Annamaria; Piobbico, Danilo; Vannucci, Jacopo; Metro, Giulio; Chiari, Rita; Bellezza, Guido; Puma, Francesco; Della Fazia, Maria Agnese; Servillo, Giuseppe; Crinò, Lucio

    2016-05-24

    Risk assessment and treatment choice remains a challenge in early non-small-cell lung cancer (NSCLC). The aim of this study was to identify novel genes involved in the risk of early relapse (ER) compared to no relapse (NR) in resected lung adenocarcinoma (AD) patients using a combination of high-throughput technology and computational analysis. We identified 18 patients (n.13 NR and n.5 ER) with stage I AD. Frozen samples of patients in ER, NR and corresponding normal lung (NL) were subjected to microarray technology and quantitative PCR (Q-PCR). A gene network computational analysis was performed to select predictive genes. An independent set of 79 stage I AD samples was used to validate the selected genes by Q-PCR. From the microarray analysis we selected 50 genes, using the fold change ratio of ER versus NR. They were validated both in pools and individually in patient samples (ER and NR) by Q-PCR. Fourteen increased and 25 decreased genes showed concordance between the two methods. They were used to perform a computational gene network analysis that identified 4 increased (HOXA10, CLCA2, AKR1B10, FABP3) and 6 decreased (SCGB1A1, PGC, TFF1, PSCA, SPRR1B and PRSS1) genes. Moreover, in an independent dataset of stage I AD samples, we showed that both high FABP3 expression and low SCGB1A1 expression were associated with worse disease-free survival (DFS). Our results indicate that it is possible to define, through gene expression and computational analysis, a characteristic gene profile of patients with an increased risk of relapse that may become a tool for patient selection for adjuvant therapy.

  18. Computer networking a top-down approach

    CERN Document Server

    Kurose, James

    2017-01-01

    Unique among computer networking texts, the Seventh Edition of the popular Computer Networking: A Top Down Approach builds on the author’s long tradition of teaching this complex subject through a layered approach in a “top-down manner.” The text works its way from the application layer down toward the physical layer, motivating readers by exposing them to important concepts early in their study of networking. Focusing on the Internet and the fundamentally important issues of networking, this text provides an excellent foundation for readers interested in computer science and electrical engineering, without requiring extensive knowledge of programming or mathematics. The Seventh Edition has been updated to reflect the most important and exciting recent advances in networking.

  19. Computer Network Equipment for Intrusion Detection Research

    National Research Council Canada - National Science Library

    Ye, Nong

    2000-01-01

    .... To test the process model, the system-level intrusion detection techniques and the working prototype of the intrusion detection system, a set of computer and network equipment has been purchased...

  20. Computational Complexity of Bosons in Linear Networks

    Science.gov (United States)

    2017-03-01

    AFRL-AFOSR-JP-TR-2017-0020, "Computational complexity of bosons in linear networks," Andrew White, The University of Queensland; final report covering 02 Mar 2013 to 01 Mar 2016. The surviving abstract fragment describes direct exploration of the effect of partial distinguishability on the complexity class of the resulting sampling distribution. Our demultiplexed source...

  1. Network analysis applications in hydrology

    Science.gov (United States)

    Price, Katie

    2017-04-01

    Applied network theory has seen pronounced expansion in recent years, in fields such as epidemiology, computer science, and sociology. Concurrent development of analytical methods and frameworks has increased the possibilities and tools available to researchers seeking to apply network theory to a variety of problems. While water and nutrient fluxes through stream systems clearly demonstrate a directional network structure, the hydrological applications of network theory remain underexplored. This presentation covers a review of network applications in hydrology, followed by an overview of promising network analytical tools that potentially offer new insights into conceptual modeling of hydrologic systems, identifying behavioral transition zones in stream networks and thresholds of dynamical system response. Network applications were tested along an urbanization gradient in Atlanta, Georgia, USA, in two watersheds: Peachtree Creek and Proctor Creek. Peachtree Creek contains a nest of five long-term USGS streamflow and water quality gages, allowing network application of long-term flow statistics. The watershed spans a range of suburban and heavily urbanized conditions. Summary flow statistics and water quality metrics were analyzed using a suite of network analysis techniques, to test the conceptual modeling and predictive potential of the methodologies. Storm events and low flow dynamics during Summer 2016 were analyzed using multiple network approaches, with an emphasis on tomogravity methods. Results indicate that network theory approaches offer novel perspectives for understanding long-term and event-based hydrological data. Key future directions for network applications include 1) optimizing data collection, 2) identifying "hotspots" of contaminant and overland flow influx to stream systems, 3) defining process domains, and 4) analyzing dynamic connectivity of various system components, including groundwater-surface water interactions.

  2. Computer Networks and African Studies Centers.

    Science.gov (United States)

    Kuntz, Patricia S.

    The use of electronic communication in the 12 Title VI African Studies Centers is discussed, and the networks available for their use are reviewed. It is argued that the African Studies Centers should be on the cutting edge of contemporary electronic communication and that computer networks should be a fundamental aspect of their programs. An…

  3. A computer network attack taxonomy and ontology

    CSIR Research Space (South Africa)

    Van Heerden, RP

    2012-01-01

    Full Text Available of attacks, means that an attack could be mitigated accordingly. The authors extend a previous, initial taxonomy of computer network attacks which forms the basis of a proposed network attack ontology in this paper. The objective of this ontology...

  4. Virtual Network Computing Testbed for Cybersecurity Research

    Science.gov (United States)

    2015-08-17

    Final report under award W911NF-12-1-0393 (61504-CS-RIP.2); only report documentation form (SF 298) metadata and reference fragments survive, including a citation to Pullen, J. M. (2000), "The network workbench: network simulation software for academic investigation of Internet concepts."

  5. EFFICIENCY METRICS COMPUTING IN COMBINED SENSOR NETWORKS

    OpenAIRE

    Luntovskyy, Andriy; Vasyutynskyy, Volodymyr

    2014-01-01

    This paper discusses the computer-aided design of combined networks for offices and building automation systems based on diverse wired and wireless standards. The design requirements for these networks are often contradictive and have to consider performance, energy and cost efficiency together. For usual office communication, quality of service is more important. In the wireless sensor networks, the energy efficiency is a critical requirement to ensure their long life, to reduce maintenance ...

  6. Predictive Control of Networked Multiagent Systems via Cloud Computing.

    Science.gov (United States)

    Liu, Guo-Ping

    2017-01-18

    This paper studies the design and analysis of networked multiagent predictive control systems via cloud computing. A cloud predictive control scheme for networked multiagent systems (NMASs) is proposed to achieve consensus and stability simultaneously and to compensate for network delays actively. The design of the cloud predictive controller for NMASs is detailed. The analysis of the cloud predictive control scheme gives the necessary and sufficient conditions of stability and consensus of closed-loop networked multiagent control systems. The proposed scheme is verified to characterize the dynamical behavior and control performance of NMASs through simulations. The outcome provides a foundation for the development of cooperative and coordinative control of NMASs and its applications.
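
    For background, a generic discrete-time consensus iteration over a fixed communication graph is sketched below; the adjacency matrix, step size, and initial states are illustrative assumptions, and the cloud predictive controller and delay compensation of the cited work are not modeled here:

    # Generic consensus iteration: x_i += eps * sum_j a_ij (x_j - x_i); illustration only.
    import numpy as np

    A = np.array([[0, 1, 0, 1],          # adjacency matrix of a 4-agent ring (assumed topology)
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 1, 0]], dtype=float)
    eps = 0.2                             # step size, kept below 1/max degree for stability
    x = np.array([1.0, 4.0, -2.0, 3.0])   # initial agent states

    for _ in range(100):
        x = x + eps * (A @ x - A.sum(axis=1) * x)

    print("states after consensus iterations:", x)   # all close to the initial average 1.5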

  7. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    Energy Technology Data Exchange (ETDEWEB)

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. However, so many effective exploits and tools exist, and are easily accessible to anyone with an Internet connection, minimal technical skills, and a greatly reduced motivational threshold, that the field of potential adversaries cannot be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying the specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of the exploit technology and attack methodologies being developed in the Information Technology (IT) security research community, within the black and white hat communities. Once a solid understanding of the cutting-edge security research is established, emerging trends in attack methodology can be identified and the gap between

  8. Transmission analysis in WDM networks

    DEFF Research Database (Denmark)

    Rasmussen, Christian Jørgen

    1999-01-01

    This thesis describes the development of a computer-based simulator for transmission analysis in optical wavelength division multiplexing networks. A great part of the work concerns fundamental optical network simulator issues. Among these issues are the identification of the versatility and user-friendliness demands which such a simulator must meet, the development of the "spectral window representation" for representing the optical signals, and finding an effective way of handling the optical signals in the computer memory. One more important issue is the rules for determining the order in which the different component models are invoked during the simulation of a system. A simple set of rules which makes it possible to simulate any network architecture is laid down. The modelling of the nonlinear fibre and the optical receiver is also treated. The work on the fibre concerns the numerical solution...

  9. Autonomic computing enabled cooperative networked design

    CERN Document Server

    Wodczak, Michal

    2014-01-01

    This book introduces the concept of autonomic computing driven cooperative networked system design from an architectural perspective. As such it leverages and capitalises on the relevant advancements in both the realms of autonomic computing and networking by welding them closely together. In particular, a multi-faceted Autonomic Cooperative System Architectural Model is defined which incorporates the notion of Autonomic Cooperative Behaviour being orchestrated by the Autonomic Cooperative Networking Protocol of a cross-layer nature. The overall proposed solution not only advocates for the inc

  10. Chemical Reaction Networks for Computing Polynomials.

    Science.gov (United States)

    Salehi, Sayed Ahmad; Parhi, Keshab K; Riedel, Marc D

    2017-01-20

    Chemical reaction networks (CRNs) provide a fundamental model in the study of molecular systems. Widely used as formalism for the analysis of chemical and biochemical systems, CRNs have received renewed attention as a model for molecular computation. This paper demonstrates that, with a new encoding, CRNs can compute any set of polynomial functions subject only to the limitation that these functions must map the unit interval to itself. These polynomials can be expressed as linear combinations of Bernstein basis polynomials with positive coefficients less than or equal to 1. In the proposed encoding approach, each variable is represented using two molecular types: a type-0 and a type-1. The value is the ratio of the concentration of type-1 molecules to the sum of the concentrations of type-0 and type-1 molecules. The proposed encoding naturally exploits the expansion of a power-form polynomial into a Bernstein polynomial. Molecular encoders for converting any input in a standard representation to the fractional representation as well as decoders for converting the computed output from the fractional to a standard representation are presented. The method is illustrated first for generic CRNs; then chemical reactions designed for an example are mapped to DNA strand-displacement reactions.
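
    A hedged sketch of the encoding step the paper builds on, converting a power-form polynomial to Bernstein form; the CRN and DNA strand-displacement constructions themselves are not reproduced, and the example polynomial is an assumption:

    # Power-basis coefficients a[j] -> Bernstein coefficients b[k] for degree n,
    # using b_k = sum_{j<=k} C(k,j)/C(n,j) * a_j (standard basis-change identity).
    from math import comb
    import numpy as np

    def power_to_bernstein(a):
        n = len(a) - 1
        return [sum(comb(k, j) / comb(n, j) * a[j] for j in range(k + 1))
                for k in range(n + 1)]

    # Example: p(x) = 0.25 + 0.5*x - 0.25*x**2 maps [0,1] into itself.
    a = [0.25, 0.5, -0.25]
    b = power_to_bernstein(a)
    print("Bernstein coefficients:", b)   # all lie in [0, 1], as the encoding requires

    # Sanity check at x = 0.3: both forms evaluate to ~0.3775.
    x, n = 0.3, len(a) - 1
    bern = sum(b[k] * comb(n, k) * x**k * (1 - x)**(n - k) for k in range(n + 1))
    print(bern, np.polyval(a[::-1], x))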

  11. Spontaneous ad hoc mobile cloud computing network.

    Science.gov (United States)

    Lacuesta, Raquel; Lloret, Jaime; Sendra, Sandra; Peñalver, Lourdes

    2014-01-01

    Cloud computing helps users and companies to share computing resources instead of having local servers or personal devices to handle the applications. Smart devices are becoming one of the main information processing devices. Their computing features are reaching levels that let them create a mobile cloud computing network. But sometimes they are not able to create it and collaborate actively in the cloud because it is difficult for them to easily build a spontaneous network and configure its parameters. For this reason, in this paper, we are going to present the design and deployment of a spontaneous ad hoc mobile cloud computing network. In order to perform it, we have developed a trusted algorithm that is able to manage the activity of the nodes when they join and leave the network. The paper shows the network procedures and classes that have been designed. Our simulation results using Castalia show that our proposal presents good efficiency and network performance even when using a high number of nodes.

  12. Spontaneous Ad Hoc Mobile Cloud Computing Network

    Directory of Open Access Journals (Sweden)

    Raquel Lacuesta

    2014-01-01

    Full Text Available Cloud computing helps users and companies to share computing resources instead of having local servers or personal devices to handle the applications. Smart devices are becoming one of the main information processing devices. Their computing features are reaching levels that let them create a mobile cloud computing network. But sometimes they are not able to create it and collaborate actively in the cloud because it is difficult for them to easily build a spontaneous network and configure its parameters. For this reason, in this paper, we are going to present the design and deployment of a spontaneous ad hoc mobile cloud computing network. In order to perform it, we have developed a trusted algorithm that is able to manage the activity of the nodes when they join and leave the network. The paper shows the network procedures and classes that have been designed. Our simulation results using Castalia show that our proposal presents good efficiency and network performance even when using a high number of nodes.

  13. Computational Analysis of Residue Interaction Networks and Coevolutionary Relationships in the Hsp70 Chaperones: A Community-Hopping Model of Allosteric Regulation and Communication.

    Directory of Open Access Journals (Sweden)

    Gabrielle Stetz

    2017-01-01

    Full Text Available Allosteric interactions in the Hsp70 proteins are linked with their regulatory mechanisms and cellular functions. Despite significant progress in structural and functional characterization of the Hsp70 proteins fundamental questions concerning modularity of the allosteric interaction networks and hierarchy of signaling pathways in the Hsp70 chaperones remained largely unexplored and poorly understood. In this work, we proposed an integrated computational strategy that combined atomistic and coarse-grained simulations with coevolutionary analysis and network modeling of the residue interactions. A novel aspect of this work is the incorporation of dynamic residue correlations and coevolutionary residue dependencies in the construction of allosteric interaction networks and signaling pathways. We found that functional sites involved in allosteric regulation of Hsp70 may be characterized by structural stability, proximity to global hinge centers and local structural environment that is enriched by highly coevolving flexible residues. These specific characteristics may be necessary for regulation of allosteric structural transitions and could distinguish regulatory sites from nonfunctional conserved residues. The observed confluence of dynamics correlations and coevolutionary residue couplings with global networking features may determine modular organization of allosteric interactions and dictate localization of key mediating sites. Community analysis of the residue interaction networks revealed that concerted rearrangements of local interacting modules at the inter-domain interface may be responsible for global structural changes and a population shift in the DnaK chaperone. The inter-domain communities in the Hsp70 structures harbor the majority of regulatory residues involved in allosteric signaling, suggesting that these sites could be integral to the network organization and coordination of structural changes. Using a network-based formalism of

  14. Computational Analysis of Residue Interaction Networks and Coevolutionary Relationships in the Hsp70 Chaperones: A Community-Hopping Model of Allosteric Regulation and Communication.

    Science.gov (United States)

    Stetz, Gabrielle; Verkhivker, Gennady M

    2017-01-01

    Allosteric interactions in the Hsp70 proteins are linked with their regulatory mechanisms and cellular functions. Despite significant progress in structural and functional characterization of the Hsp70 proteins fundamental questions concerning modularity of the allosteric interaction networks and hierarchy of signaling pathways in the Hsp70 chaperones remained largely unexplored and poorly understood. In this work, we proposed an integrated computational strategy that combined atomistic and coarse-grained simulations with coevolutionary analysis and network modeling of the residue interactions. A novel aspect of this work is the incorporation of dynamic residue correlations and coevolutionary residue dependencies in the construction of allosteric interaction networks and signaling pathways. We found that functional sites involved in allosteric regulation of Hsp70 may be characterized by structural stability, proximity to global hinge centers and local structural environment that is enriched by highly coevolving flexible residues. These specific characteristics may be necessary for regulation of allosteric structural transitions and could distinguish regulatory sites from nonfunctional conserved residues. The observed confluence of dynamics correlations and coevolutionary residue couplings with global networking features may determine modular organization of allosteric interactions and dictate localization of key mediating sites. Community analysis of the residue interaction networks revealed that concerted rearrangements of local interacting modules at the inter-domain interface may be responsible for global structural changes and a population shift in the DnaK chaperone. The inter-domain communities in the Hsp70 structures harbor the majority of regulatory residues involved in allosteric signaling, suggesting that these sites could be integral to the network organization and coordination of structural changes. Using a network-based formalism of allostery, we
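
    As an illustration of the community analysis mentioned above, the sketch below applies greedy modularity optimization to a small hypothetical residue-contact graph; the paper's actual network construction from dynamic correlations and coevolutionary couplings is not reproduced:

    # Toy community analysis of a residue interaction graph; nodes and edges are assumptions.
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    G = nx.Graph()
    G.add_edges_from([
        ("R1", "R2"), ("R2", "R3"), ("R1", "R3"),          # module in one subdomain
        ("R10", "R11"), ("R11", "R12"), ("R10", "R12"),    # module in another subdomain
        ("R3", "R10"),                                     # inter-domain bridge
    ])

    communities = greedy_modularity_communities(G)
    for i, c in enumerate(communities):
        print(f"community {i}: {sorted(c)}")

    # Betweenness highlights the bridging residues mediating inter-community paths.
    print(nx.betweenness_centrality(G))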

  15. Algorithms and networking for computer games

    CERN Document Server

    Smed, Jouni

    2006-01-01

    Algorithms and Networking for Computer Games is an essential guide to solving the algorithmic and networking problems of modern commercial computer games, written from the perspective of a computer scientist. Combining algorithmic knowledge and game-related problems, the authors discuss all the common difficulties encountered in game programming. The first part of the book tackles algorithmic problems by presenting how they can be solved practically. As well as "classical" topics such as random numbers, tournaments and game trees, the authors focus on how to find a path in, create the terrai

  16. Computer network time synchronization the network time protocol

    CERN Document Server

    Mills, David L

    2006-01-01

    What started with the sundial has, thus far, been refined to a level of precision based on atomic resonance: Time. Our obsession with time is evident in this continued scaling down to nanosecond resolution and beyond. But this obsession is not without warrant. Precision and time synchronization are critical in many applications, such as air traffic control and stock trading, and pose complex and important challenges in modern information networks.Penned by David L. Mills, the original developer of the Network Time Protocol (NTP), Computer Network Time Synchronization: The Network Time Protocol

  17. Professional networking using computer-mediated communication.

    Science.gov (United States)

    Washer, Peter

    Traditionally, professionals have networked with others in their field through attending conferences, professional organizations, direct mailing, and via the workplace. Recently, there have been new possibilities to network with other professionals using the internet. This article looks at the possibilities that the internet offers for professional networking, particularly e-mailing lists, newsgroups and membership databases, and compares them against more traditional methods of professional networking. The different types of computer-mediated communication are discussed and their relative merits and disadvantages are examined. The benefits and potential pitfalls of internet professional networking, as it relates to the nursing profession, are examined. Practical advice is offered on how the internet can be used as a means to foster professional networks of academic, clinical or research interests.

  18. Unraveling protein networks with power graph analysis.

    Science.gov (United States)

    Royer, Loïc; Reimann, Matthias; Andreopoulos, Bill; Schroeder, Michael

    2008-07-11

    Networks play a crucial role in computational biology, yet their analysis and representation is still an open problem. Power Graph Analysis is a lossless transformation of biological networks into a compact, less redundant representation, exploiting the abundance of cliques and bicliques as elementary topological motifs. We demonstrate with five examples the advantages of Power Graph Analysis. Investigating protein-protein interaction networks, we show how the catalytic subunits of the casein kinase II complex are distinguishable from the regulatory subunits, how interaction profiles and sequence phylogeny of SH3 domains correlate, and how false positive interactions among high-throughput interactions are spotted. Additionally, we demonstrate the generality of Power Graph Analysis by applying it to two other types of networks. We show how power graphs induce a clustering of both transcription factors and target genes in bipartite transcription networks, and how the erosion of a phosphatase domain in type 22 non-receptor tyrosine phosphatases is detected. We apply Power Graph Analysis to high-throughput protein interaction networks and show that up to 85% (56% on average) of the information is redundant. Experimental networks are more compressible than rewired ones of same degree distribution, indicating that experimental networks are rich in cliques and bicliques. Power Graphs are a novel representation of networks, which reduces network complexity by explicitly representing re-occurring network motifs. Power Graphs compress up to 85% of the edges in protein interaction networks and are applicable to all types of networks such as protein interactions, regulatory networks, or homology networks.
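
    For orientation, the sketch below enumerates maximal cliques, the elementary motif that Power Graph Analysis compresses; it is not the lossless power-graph transformation itself, and the toy graph is an assumption:

    # Enumerate clique motifs that a power graph would collapse into single power edges.
    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([("a", "b"), ("a", "c"), ("a", "d"),
                      ("b", "c"), ("b", "d"), ("c", "d"),   # a 4-clique
                      ("d", "e"), ("e", "f")])              # plus a few stray edges

    cliques = [c for c in nx.find_cliques(G) if len(c) >= 3]
    print("maximal cliques of size >= 3:", cliques)

    # A k-clique can be drawn as one power node with a single self-loop power edge,
    # replacing k*(k-1)/2 plain edges -- the source of the reported compression.
    k = max(len(c) for c in cliques)
    print(f"largest clique replaces {k*(k-1)//2} edges with 1 power edge")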

  19. Computational analysis of HIV-1 resistance based on gene expression profiles and the virus-host interaction network.

    Directory of Open Access Journals (Sweden)

    Tao Huang

    Full Text Available A very small proportion of people remain negative for HIV infection after repeated HIV-1 viral exposure, which is called HIV-1 resistance. Understanding the mechanism of HIV-1 resistance is important for the development of HIV-1 vaccines and Acquired Immune Deficiency Syndrome (AIDS therapies. In this study, we analyzed the gene expression profiles of CD4+ T cells from HIV-1-resistant individuals and HIV-susceptible individuals. One hundred eighty-five discriminative HIV-1 resistance genes were identified using the Minimum Redundancy-Maximum Relevance (mRMR and Incremental Feature Selection (IFS methods. The virus protein target enrichment analysis of the 185 HIV-1 resistance genes suggested that the HIV-1 protein nef might play an important role in HIV-1 infection. Moreover, we identified 29 infection information exchanger genes from the 185 HIV-1 resistance genes based on a virus-host interaction network analysis. The infection information exchanger genes are located on the shortest paths between virus-targeted proteins and are important for the coordination of virus infection. These proteins may be useful targets for AIDS prevention or therapy, as intervention in these pathways could disrupt communication with virus-targeted proteins and HIV-1 infection.
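
    A hedged sketch of the "information exchanger" idea, collecting host proteins that lie on shortest paths between virus-targeted proteins; the graph and the target list below are hypothetical, not the study's virus-host interaction network:

    # Find host proteins lying on shortest paths between virus-targeted proteins (toy data).
    import networkx as nx
    from itertools import combinations

    host_net = nx.Graph()
    host_net.add_edges_from([("T1", "H1"), ("H1", "H2"), ("H2", "T2"),
                             ("T2", "H3"), ("H3", "T3"), ("T1", "H4"), ("H4", "T3")])
    virus_targets = ["T1", "T2", "T3"]   # proteins targeted by viral proteins (assumed)

    exchangers = set()
    for s, t in combinations(virus_targets, 2):
        for path in nx.all_shortest_paths(host_net, s, t):
            exchangers.update(p for p in path if p not in virus_targets)

    print("candidate information-exchanger proteins:", sorted(exchangers))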

  20. Natural computing for vehicular networks

    OpenAIRE

    Toutouh El Alamin, Jamal

    2016-01-01

    This thesis addresses the intelligent design of solutions for the deployment of vehicular ad hoc networks (VANETs). These are wireless communication networks formed mainly by vehicles and road-infrastructure elements. VANETs offer the opportunity to develop revolutionary applications in the field of road safety and efficiency. Since this is such a novel domain, a number of open questions remain, such as the design of the infrastruct...

  1. Measuring the Dynamics of Climate Change Communication in Mass Media and Social Networks with Computer-Assisted Content Analysis

    Science.gov (United States)

    Kirilenko, A.; Stepchenkova, S.

    2012-12-01

    To date, multiple authors have examined media representations of and public attitudes towards climate change, as well as how these representations and attitudes differ from scientific knowledge on the issue of climate change. Content analysis of newspaper publications, TV news, and, recently, Internet blogs has allowed for identification of major discussion themes within the climate change domain (e.g., newspaper trends, comparison of climate change discourse in different countries, contrasting liberal vs. conservative press). The majority of these studies, however, have processed texts manually, limiting textual population size, restricting the analysis to a relatively small number of themes, and using time-expensive coding procedures. The use of computer-assisted text analysis (CATA) software is important because the difficulties with manual processing become more severe with an increased volume of data. We developed a CATA approach that allows a large body of text materials to be surveyed in a quantifiable, objective, transparent, and time-efficient manner. While staying within the quantitative tradition of content analysis, the approach allows for an interpretation of the public discourse closer to one of more qualitatively oriented methods. The methodology used in this study contains several steps: (1) sample selection; (2) data preparation for computer processing and obtaining a matrix of keyword frequencies; (3) identification of themes in the texts using Exploratory Factor Analysis (EFA); (4) combining identified themes into higher order themes using Confirmatory Factor Analysis (CFA); (5) interpretation of obtained public discourse themes using factor scores; and (6) tracking the development of the main themes of the climate change discourse through time. In the report, we concentrate on two examples of CATA applied to study public perception of climate change. First example is an analysis of temporal change in public discourse on climate change. Applying
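
    A minimal sketch of steps (2)-(3) of the workflow described above, assuming a toy corpus and an arbitrary number of factors; the study's keyword lists, corpus, and model settings are not reproduced:

    # Keyword-frequency matrix plus exploratory factor analysis to extract latent themes.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import FactorAnalysis
    import numpy as np

    docs = [
        "sea level rise threatens coastal cities",
        "carbon emissions and energy policy debate",
        "coastal flooding and sea level projections",
        "policy debate over carbon tax and emissions",
    ]

    vec = CountVectorizer(stop_words="english")
    X = vec.fit_transform(docs).toarray().astype(float)   # documents x keywords matrix

    fa = FactorAnalysis(n_components=2, random_state=0)
    fa.fit(X)

    terms = vec.get_feature_names_out()
    for i, loadings in enumerate(fa.components_):
        top = terms[np.argsort(np.abs(loadings))[::-1][:4]]
        print(f"theme {i}: {list(top)}")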

  2. Computing chemical organizations in biological networks.

    Science.gov (United States)

    Centler, Florian; Kaleta, Christoph; di Fenizio, Pietro Speroni; Dittrich, Peter

    2008-07-15

    Novel techniques are required to analyze computational models of intracellular processes as they increase steadily in size and complexity. The theory of chemical organizations has recently been introduced as such a technique that links the topology of biochemical reaction network models to their dynamical repertoire. The network is decomposed into algebraically closed and self-maintaining subnetworks called organizations. They form a hierarchy representing all feasible system states including all steady states. We present three algorithms to compute the hierarchy of organizations for network models provided in SBML format. Two of them compute the complete organization hierarchy, while the third one uses heuristics to obtain a subset of all organizations for large models. While the constructive approach computes the hierarchy starting from the smallest organization in a bottom-up fashion, the flux-based approach employs self-maintaining flux distributions to determine organizations. A runtime comparison on 16 different network models of natural systems showed that none of the two exhaustive algorithms is superior in all cases. Studying a 'genome-scale' network model with 762 species and 1193 reactions, we demonstrate how the organization hierarchy helps to uncover the model structure and allows to evaluate the model's quality, for example by detecting components and subsystems of the model whose maintenance is not explained by the model. All data and a Java implementation that plugs into the Systems Biology Workbench is available from http://www.minet.uni-jena.de/csb/prj/ot/tools.
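
    A hedged sketch of the closure test underlying chemical organizations; the reaction set is a toy assumption, and the self-maintenance test (which additionally requires finding a suitable flux vector, e.g. by linear programming) is omitted:

    # A species set is closed if every reaction fully enabled by the set
    # produces only species already in the set.
    reactions = [                      # toy network, not from the cited models
        ({"A", "B"}, {"C"}),           # A + B -> C
        ({"C"}, {"A", "B"}),           # C -> A + B
        ({"A", "D"}, {"E"}),           # A + D -> E
    ]

    def is_closed(species, reactions):
        return all(products <= species
                   for reactants, products in reactions
                   if reactants <= species)

    print(is_closed({"A", "B", "C"}, reactions))        # True: D absent, third reaction inactive
    print(is_closed({"A", "B", "C", "D"}, reactions))   # False: E is produced but missing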

  3. International Symposium on Computing and Network Sustainability

    CERN Document Server

    Akashe, Shyam

    2017-01-01

    The book is a compilation of technical papers presented at the International Research Symposium on Computing and Network Sustainability (IRSCNS 2016), held in Goa, India, on 1st and 2nd July 2016. The areas covered in the book are sustainable computing and security, sustainable systems and technologies, sustainable methodologies and applications, sustainable networks applications and solutions, user-centered services and systems, and mobile data management. The novel and recent technologies presented in the book will be helpful to researchers and industry in their advanced work.

  4. On computer vision in wireless sensor networks.

    Energy Technology Data Exchange (ETDEWEB)

    Berry, Nina M.; Ko, Teresa H.

    2004-09-01

    Wireless sensor networks allow detailed sensing of otherwise unknown and inaccessible environments. While it would be beneficial to include cameras in a wireless sensor network because images are so rich in information, the power cost of transmitting an image across the wireless network can dramatically shorten the lifespan of the sensor nodes. This paper describes a new paradigm for the incorporation of imaging into wireless networks. Rather than focusing on transmitting images across the network, we show how an image can be processed locally for key features using simple detectors. Contrasted with traditional event detection systems that trigger an image capture, this enables a new class of sensors which uses a low-power imaging sensor to detect a variety of visual cues. Sharing these features among relevant nodes cues specific actions to better provide information about the environment. We report on various existing techniques developed for traditional computer vision research which can aid in this work.

  5. Planning and management of cloud computing networks

    Science.gov (United States)

    Larumbe, Federico

    comprehensive vision. The first question to be solved is what are the optimal data center locations. We found that the location of each data center has a big impact on cost, QoS, power consumption, and greenhouse gas emissions. An optimization problem with a multi-criteria objective function is proposed to decide jointly the optimal location of data centers and software components, link capacities, and information routing. Once the network planning has been analyzed, the problem of dynamic resource provisioning in real time is addressed. In this context, virtualization is a key technique in cloud computing because each server can be shared by multiple Virtual Machines (VMs) and the total power consumption can be reduced. In the same line of location problems, we propose a Green Cloud Broker that optimizes VM placement across multiple data centers. In fact, when multiple data centers are considered, response time can be reduced by placing VMs close to users, cost can be minimized, power consumption can be optimized by using energy efficient data centers, and CO2 emissions can be decreased by choosing data centers provided with renewable energy sources. The third stage of the analysis is the short-term management of a cloud data center. In particular, a method is proposed to assign VMs to servers by considering communication traffic among VMs. Cloud data centers receive new applications over time and these applications need on-demand resource provisioning. Each application is composed of multiple types of VMs that interact among themselves. A program called scheduler must place each new VM in a server and that impacts the QoS and power consumption. Our method places VMs that communicate among themselves in servers that are close to each other in the network topology, thus reducing communication delay and increasing the throughput available among VMs. Furthermore, the power consumption of each type of server is considered and the most efficient ones are chosen to place the VMs
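
    As a hedged illustration of the communication-aware scheduling idea described above (not the thesis' actual formulation), a greedy placement heuristic with hypothetical traffic, hop-distance, and capacity data might look like this:

    # Greedily place each VM on the feasible server closest to the VMs it talks to most.
    import numpy as np

    traffic = np.array([[0, 5, 0],     # traffic[i][j]: volume between VM i and VM j (assumed)
                        [5, 0, 2],
                        [0, 2, 0]], dtype=float)
    hops = np.array([[0, 1, 3],        # network distance between servers (assumed)
                     [1, 0, 2],
                     [3, 2, 0]], dtype=float)
    capacity = [2, 1, 1]               # VM slots per server
    placement = {}                     # vm index -> server index

    for vm in range(traffic.shape[0]):
        best_server, best_cost = None, float("inf")
        for s in range(len(capacity)):
            if sum(1 for p in placement.values() if p == s) >= capacity[s]:
                continue               # server full
            cost = sum(traffic[vm, other] * hops[s, placement[other]]
                       for other in placement)
            if cost < best_cost:
                best_server, best_cost = s, cost
        placement[vm] = best_server

    print("VM -> server placement:", placement)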

  6. Computational Music Analysis

    DEFF Research Database (Denmark)

    in this intensely interdisciplinary field. A broad range of approaches are presented, employing techniques originating in disciplines such as linguistics, information theory, information retrieval, pattern recognition, machine learning, topology, algebra and signal processing. Many of the methods described draw...... music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry....

  7. Signaling networks: information flow, computation, and decision making.

    Science.gov (United States)

    Azeloglu, Evren U; Iyengar, Ravi

    2015-04-01

    Signaling pathways come together to form networks that connect receptors to many different cellular machines. Such networks not only receive and transmit signals but also process information. The complexity of these networks requires the use of computational models to understand how information is processed and how input-output relationships are determined. Two major computational approaches used to study signaling networks are graph theory and dynamical modeling. Both approaches are useful; network analysis (application of graph theory) helps us understand how the signaling network is organized and what its information-processing capabilities are, whereas dynamical modeling helps us determine how the system changes in time and space upon receiving stimuli. Computational models have helped us identify a number of emergent properties that signaling networks possess. Such properties include ultrasensitivity, bistability, robustness, and noise-filtering capabilities. These properties endow cell-signaling networks with the ability to ignore small or transient signals and/or amplify signals to drive cellular machines that spawn numerous physiological functions associated with different cell states. Copyright © 2015 Cold Spring Harbor Laboratory Press; all rights reserved.
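
    As a hedged illustration of one emergent property named above, bistability, the sketch below integrates a single species that activates its own production through a Hill term; the parameter values are arbitrary and not tied to any specific signaling pathway:

    # Positive-feedback motif with two stable steady states ("off" and "on").
    import numpy as np
    from scipy.integrate import odeint

    def feedback(x, t, basal=0.1, vmax=1.0, K=0.5, n=4, k_deg=1.0):
        return basal + vmax * x**n / (K**n + x**n) - k_deg * x

    t = np.linspace(0, 50, 500)
    low = odeint(feedback, 0.05, t)[-1, 0]    # trajectory starting below threshold
    high = odeint(feedback, 1.50, t)[-1, 0]   # trajectory starting above threshold
    print(f"steady states: {low:.3f} (off) vs {high:.3f} (on)")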

  8. Student Motivation in Computer Networking Courses

    Directory of Open Access Journals (Sweden)

    Wen-Jung Hsin

    2007-01-01

    Full Text Available This paper introduces several hands-on projects that have been used to motivate students in learning various computer networking concepts. These projects are shown to be very useful and applicable to the learners’ daily tasks and activities such as emailing, Web browsing, and online shopping and banking, and lead to an unexpected byproduct, self-motivation.

  9. Computational Modeling of Complex Protein Activity Networks

    NARCIS (Netherlands)

    Schivo, Stefano; Leijten, Jeroen; Karperien, Marcel; Post, Janine N.; Prignet, Claude

    2017-01-01

    Because of the numerous entities interacting, the complexity of the networks that regulate cell fate makes it impossible to analyze and understand them using the human brain alone. Computational modeling is a powerful method to unravel complex systems. We recently described the development of a

  10. Student Motivation in Computer Networking Courses

    Directory of Open Access Journals (Sweden)

    Wen-Jung Hsin, PhD

    2007-08-01

    Full Text Available This paper introduces several hands-on projects that have been used to motivate students in learning various computer networking concepts. These projects are shown to be very useful and applicable to the learners’ daily tasks and activities such as emailing, Web browsing, and online shopping and banking, and lead to an unexpected byproduct, self-motivation.

  11. Dynamics of Bottlebrush Networks: A Computational Study

    Science.gov (United States)

    Dobrynin, Andrey; Cao, Zhen; Sheiko, Sergei

    We study dynamics of deformation of bottlebrush networks using molecular dynamics simulations and theoretical calculations. Analysis of our simulation results shows that the dynamics of bottlebrush network deformation can be described by a Rouse model for polydisperse networks with an effective Rouse time of the bottlebrush network strand, τ_R = τ_0 N_s^2 (N_sc + 1), where N_s is the number-average degree of polymerization of the bottlebrush backbone strands between crosslinks, N_sc is the degree of polymerization of the side chains, and τ_0 is a characteristic monomeric relaxation time. At time scales t smaller than the Rouse time (t < τ_R) the network exhibits stress relaxation; at time scales longer than the relaxation time of the strands between crosslinks, the network response is purely elastic with shear modulus G(t) = G_0, where G_0 is the equilibrium shear modulus at small deformation. The stress evolution in the bottlebrush networks can be described by a universal function of t/τ_R. NSF DMR-1409710.

  12. Scaling in Computer Network Traffic

    Science.gov (United States)

    2005-01-07

    The report covers a self-similar traffic model based on fractional Gaussian noise (fGn) and fractional Brownian motion (fBm), and a multifractal analysis via wavelet qth-order moments, E|d_X(j, k)|^q ~ C 2^{α_q j} as j → 0; the left-hand side is estimated from data with the structure function S_q(j) = (1/n_j) Σ_k |d_X(j, k)|^q, and the scaling exponents are measured from the slopes.
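
    A minimal sketch of this wavelet-based scaling estimate, assuming a synthetic trace and Haar detail coefficients in place of real traffic data:

    # Estimate q-th order wavelet structure functions S_q(j) and read the scaling
    # exponent from the log-log slope; the trace below is synthetic.
    import numpy as np

    rng = np.random.default_rng(1)
    x = np.cumsum(rng.standard_normal(2**14))       # toy correlated trace (random walk)

    def haar_details(signal, level):
        s = signal.astype(float)
        for _ in range(level):
            d = (s[0::2] - s[1::2]) / np.sqrt(2)    # detail coefficients at this scale
            s = (s[0::2] + s[1::2]) / np.sqrt(2)    # approximation passed to next level
        return d

    q = 2.0
    levels = range(1, 9)
    Sq = [np.mean(np.abs(haar_details(x, j))**q) for j in levels]

    slope = np.polyfit(list(levels), np.log2(Sq), 1)[0]
    print(f"log2 S_q(j) slope for q={q}: {slope:.2f}")   # scaling exponent estimate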

  13. Computer-aided power systems analysis

    CERN Document Server

    Kusic, George

    2008-01-01

    Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, bus reference frame, network fault and contingency calculations, power flow on transmission networks, generator base power setti

  14. Non-harmful insertion of data mimicking computer network attacks

    Energy Technology Data Exchange (ETDEWEB)

    Neil, Joshua Charles; Kent, Alexander; Hash, Jr, Curtis Lee

    2016-06-21

    Non-harmful data mimicking computer network attacks may be inserted in a computer network. Anomalous real network connections may be generated between a plurality of computing systems in the network. Data mimicking an attack may also be generated. The generated data may be transmitted between the plurality of computing systems using the real network connections and measured to determine whether an attack is detected.

  15. [Renewal of NIHS computer network system].

    Science.gov (United States)

    Segawa, Katsunori; Nakano, Tatsuya; Saito, Yoshiro

    2012-01-01

    An updated version of the National Institute of Health Sciences Computer Network System (NIHS-NET) is described. In order to reduce its electric power consumption, the main server system was rebuilt using virtual machine technology. The services that each machine provided in the previous network system had to be maintained as much as possible, so an individual server was constructed for each service, because a virtual server often shows reduced performance compared with a physical server. As a result, although the number of virtual servers increased and the network communication among the servers became more complicated, the conventional services were maintained and the security level was improved, along with saving electrical power. The updated NIHS-NET incorporates multiple security countermeasures. To make maximal use of these measures, awareness of network security by all users is expected.

  16. Biological networks 101: computational modeling for molecular biologists

    NARCIS (Netherlands)

    Scholma, Jetse; Schivo, Stefano; Urquidi Camacho, Ricardo A.; van de Pol, Jan Cornelis; Karperien, Hermanus Bernardus Johannes; Post, Janine Nicole

    2014-01-01

    Computational modeling of biological networks permits the comprehensive analysis of cells and tissues to define molecular phenotypes and novel hypotheses. Although a large number of software tools have been developed, the versatility of these tools is limited by mathematical complexities that

  17. Social network analysis

    NARCIS (Netherlands)

    de Nooy, W.; Crothers, C.

    2009-01-01

    Social network analysis (SNA) focuses on the structure of ties within a set of social actors, e.g., persons, groups, organizations, and nations, or the products of human activity or cognition such as web sites, semantic concepts, and so on. It is linked to structuralism in sociology stressing the

  18. Network systems security analysis

    Science.gov (United States)

    Yilmaz, İsmail

    2015-05-01

    Network systems security analysis has utmost importance in today's world. Many companies, like banks, which give priority to data management, test their own data security systems with penetration tests from time to time. In this context, companies must also test their own network/server systems and take precautions, as data security draws attention. Based on this idea, cyber-attacks are researched thoroughly and penetration test techniques are examined in this study. With this information, the cyber-attacks are classified and then the security of network systems is tested systematically. After the testing period, all data are reported and filed for future reference. Consequently, it is found that human beings are the weakest link in the chain and simple mistakes may unintentionally cause huge problems. Thus, it is clear that some precautions must be taken to avoid such threats, like updating the security software.

  19. Fuzzy logic, neural networks, and soft computing

    Science.gov (United States)

    Zadeh, Lotfi A.

    1994-01-01

    The past few years have witnessed a rapid growth of interest in a cluster of modes of modeling and computation which may be described collectively as soft computing. The distinguishing characteristic of soft computing is that its primary aims are to achieve tractability, robustness, low cost, and high MIQ (machine intelligence quotient) through an exploitation of the tolerance for imprecision and uncertainty. Thus, in soft computing what is usually sought is an approximate solution to a precisely formulated problem or, more typically, an approximate solution to an imprecisely formulated problem. A simple case in point is the problem of parking a car. Generally, humans can park a car rather easily because the final position of the car is not specified exactly. If it were specified to within, say, a few millimeters and a fraction of a degree, it would take hours or days of maneuvering and precise measurements of distance and angular position to solve the problem. What this simple example points to is the fact that, in general, high precision carries a high cost. The challenge, then, is to exploit the tolerance for imprecision by devising methods of computation which lead to an acceptable solution at low cost. By its nature, soft computing is much closer to human reasoning than the traditional modes of computation. At this juncture, the major components of soft computing are fuzzy logic (FL), neural network theory (NN), and probabilistic reasoning techniques (PR), including genetic algorithms, chaos theory, and part of learning theory. Increasingly, these techniques are used in combination to achieve significant improvement in performance and adaptability. Among the important application areas for soft computing are control systems, expert systems, data compression techniques, image processing, and decision support systems. It may be argued that it is soft computing, rather than the traditional hard computing, that should be viewed as the foundation for artificial

  20. Spiking network simulation code for petascale computers

    Science.gov (United States)

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M.; Plesser, Hans E.; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today. PMID:25346682

  1. Spiking network simulation code for petascale computers.

    Science.gov (United States)

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M; Plesser, Hans E; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today.

  2. International Symposium on Complex Computing-Networks

    CERN Document Server

    Sevgi, L; CCN2005; Complex computing networks: Brain-like and wave-oriented electrodynamic algorithms

    2006-01-01

    This book uniquely combines new advances in the electromagnetic and the circuits & systems theory. It integrates both fields regarding computational aspects of common interest. Emphasized subjects are those methods which mimic brain-like and electrodynamic behaviour; among these are cellular neural networks, chaos and chaotic dynamics, attractor-based computation and stream ciphers. The book contains carefully selected contributions from the Symposium CCN2005. Pictures from the bestowal of Honorary Doctorate degrees to Leon O. Chua and Leopold B. Felsen are included.

  3. Fast network centrality analysis using GPUs

    Directory of Open Access Journals (Sweden)

    Shi Zhiao

    2011-05-01

    Full Text Available Abstract Background With the exploding volume of data generated by continuously evolving high-throughput technologies, biological network analysis problems are growing larger in scale and craving more computational power. General-purpose computation on graphics processing units (GPGPU) provides a cost-effective technology for the study of large-scale biological networks. Designing algorithms that maximize data parallelism is the key to leveraging the power of GPUs. Results We proposed an efficient data-parallel formulation of the All-Pairs Shortest Path problem, which is the key component for shortest-path-based centrality computation. A betweenness centrality algorithm built upon this formulation was developed and benchmarked against the most recent GPU-based algorithm. Speedups of 11 to 19% were observed in various simulated scale-free networks. We further designed three algorithms based on this core component to compute closeness centrality, eccentricity centrality and stress centrality. To make all these algorithms available to the research community, we developed a software package gpu-fan (GPU-based Fast Analysis of Networks) for CUDA-enabled GPUs. Speedups of 10-50× compared with CPU implementations were observed for simulated scale-free networks and real-world biological networks. Conclusions gpu-fan provides a significant performance improvement for centrality computation in large-scale networks. Source code is available under the GNU Public License (GPL) at http://bioinfo.vanderbilt.edu/gpu-fan/.
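
    For reference, the same centralities can be computed on the CPU with networkx, as in the hedged sketch below on a simulated scale-free graph; this is not the gpu-fan implementation, and stress centrality is omitted:

    # CPU baseline for shortest-path-based centralities on a small scale-free network.
    import networkx as nx

    G = nx.barabasi_albert_graph(n=200, m=2, seed=42)   # simulated scale-free network

    bc = nx.betweenness_centrality(G)        # shortest-path based
    cc = nx.closeness_centrality(G)
    ecc = nx.eccentricity(G)                 # well defined here since the graph is connected

    top = sorted(bc, key=bc.get, reverse=True)[:5]
    print("highest-betweenness nodes:", top)
    print("their closeness:", [round(cc[n], 3) for n in top])
    print("their eccentricity:", [ecc[n] for n in top])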

  4. 4th International Conference in Network Analysis

    CERN Document Server

    Koldanov, Petr; Pardalos, Panos

    2016-01-01

    The contributions in this volume cover a broad range of topics including maximum cliques, graph coloring, data mining, brain networks, Steiner forest, logistic and supply chain networks. Network algorithms and their applications to market graphs, manufacturing problems, internet networks and social networks are highlighted. The "Fourth International Conference in Network Analysis," held at the Higher School of Economics, Nizhny Novgorod in May 2014, initiated joint research between scientists, engineers and researchers from academia, industry and government; the major results of conference participants have been reviewed and collected in this Work. Researchers and students in mathematics, economics, statistics, computer science and engineering will find this collection a valuable resource filled with the latest research in network analysis.

  5. Analysis of breast CT lesions using computer-aided diagnosis: an application of neural networks on extracted morphologic and texture features

    Science.gov (United States)

    Ray, Shonket; Prionas, Nicolas D.; Lindfors, Karen K.; Boone, John M.

    2012-03-01

    Dedicated cone-beam breast CT (bCT) scanners have been developed as a potential alternative imaging modality to conventional X-ray mammography in breast cancer diagnosis. As with other modalities, quantitative imaging (QI) analysis can potentially be utilized as a tool to extract useful numeric information concerning diagnosed lesions from high quality 3D tomographic data sets. In this work, preliminary QI analysis was done by designing and implementing a computer-aided diagnosis (CADx) system consisting of image preprocessing, object(s) of interest (i.e. masses, microcalcifications) segmentation, structural analysis of the segmented object(s), and finally classification into benign or malignant disease. Image sets were acquired from bCT patient scans with diagnosed lesions. Iterative watershed segmentation (IWS), a hybridization of the watershed method using observer-set markers and a gradient vector flow (GVF) approach, was used as the lesion segmentation method in 3D. Eight morphologic parameters and six texture features based on gray level co-occurrence matrix (GLCM) calculations were obtained per segmented lesion and combined into multi-dimensional feature input data vectors. Artificial neural network (ANN) classifiers were used by performing cross validation and network parameter optimization to maximize area under the curve (AUC) values of the resulting receiver-operating characteristic (ROC) curves. Within these ANNs, biopsy-proven diagnoses of malignant and benign lesions were recorded as target data while the feature vectors were saved as raw input data. With the image data separated into post-contrast (n = 55) and pre-contrast sets (n = 39), a maximum AUC of 0.70 +/- 0.02 and 0.80 +/- 0.02 were achieved, respectively, for each data set after ANN application.
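
    A hedged sketch of one ingredient of the described pipeline, a gray-level co-occurrence matrix (GLCM) with two Haralick-style texture features, computed with plain NumPy on a hypothetical patch; the offset and number of gray levels are illustrative choices:

    # GLCM for the horizontal neighbor offset (0, 1) on a toy 2-D "lesion" patch.
    import numpy as np

    patch = np.array([[0, 0, 1, 1],
                      [0, 0, 1, 1],
                      [0, 2, 2, 2],
                      [2, 2, 3, 3]])
    levels = 4
    glcm = np.zeros((levels, levels), dtype=float)

    for i in range(patch.shape[0]):
        for j in range(patch.shape[1] - 1):
            glcm[patch[i, j], patch[i, j + 1]] += 1
    glcm /= glcm.sum()                       # normalize to joint probabilities

    ii, jj = np.indices(glcm.shape)
    contrast = np.sum(glcm * (ii - jj) ** 2)
    energy = np.sum(glcm ** 2)
    print(f"contrast={contrast:.3f}, energy={energy:.3f}")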

  6. Fast computation of minimum hybridization networks.

    Science.gov (United States)

    Albrecht, Benjamin; Scornavacca, Celine; Cenci, Alberto; Huson, Daniel H

    2012-01-15

    Hybridization events in evolution may lead to incongruent gene trees. One approach to determining possible interspecific hybridization events is to compute a hybridization network that attempts to reconcile incongruent gene trees using a minimum number of hybridization events. We describe how to compute a representative set of minimum hybridization networks for two given bifurcating input trees, using a parallel algorithm and provide a user-friendly implementation. A simulation study suggests that our program performs significantly better than existing software on biologically relevant data. Finally, we demonstrate the application of such methods in the context of the evolution of the Aegilops/Triticum genera. The algorithm is implemented in the program Dendroscope 3, which is freely available from www.dendroscope.org and runs on all three major operating systems.

  7. Integrating Wireless Sensor Networks with Computational Grids

    Science.gov (United States)

    Preve, Nikolaos

    Wireless sensor networks (WSNs) have developed greatly and have emerged as significant in a wide range of important applications such as the acquisition and processing of information from the physical world. The evolution of Grid computing has been based on the coordination of distributed and shared resources. A Sensor Grid network can integrate these two leading technologies, enabling real-time sensor data collection and the sharing of computational and storage grid resources for sensor data processing and management. Several issues have arisen from this integration which challenge the modern design of sensor grids. In order to address these issues, in this paper we propose a sensor grid architecture, supported by a testbed which focuses on the design issues and on the improvement of our sensor grid architecture design.

  8. Structural Analysis of Complex Networks

    CERN Document Server

    Dehmer, Matthias

    2011-01-01

    Filling a gap in literature, this self-contained book presents theoretical and application-oriented results that allow for a structural exploration of complex networks. The work focuses not only on classical graph-theoretic methods, but also demonstrates the usefulness of structural graph theory as a tool for solving interdisciplinary problems. Applications to biology, chemistry, linguistics, and data analysis are emphasized. The book is suitable for a broad, interdisciplinary readership of researchers, practitioners, and graduate students in discrete mathematics, statistics, computer science,

  9. High-Bandwidth Tactical-Network Data Analysis in a High-Performance-Computing (HPC) Environment: Device Status Data

    Science.gov (United States)

    2015-09-01

    Hydra records its data in binary large object (BLOb) files. It can execute statements on any database table or select the entire table at a user-specified interval. The data are organized into cuts inside the BLOb files; typically each cut represents one poll on the network. The surviving front matter lists sections on status codes, request time, Hydra BLOb metadata, data processing (the Hydra data-processing framework), and Hydra configuration.

  10. Computational modeling of signal transduction networks: a pedagogical exposition.

    Science.gov (United States)

    Prasad, Ashok

    2012-01-01

    We give a pedagogical introduction to computational modeling of signal transduction networks, starting from explaining the representations of chemical reactions by differential equations via the law of mass action. We discuss elementary biochemical reactions such as Michaelis-Menten enzyme kinetics and cooperative binding, and show how these allow the representation of large networks as systems of differential equations. We discuss the importance of looking for simpler or reduced models, such as network motifs or dynamical motifs within the larger network, and describe methods to obtain qualitative behavior by bifurcation analysis, using freely available continuation software. We then discuss stochastic kinetics and show how to implement easy-to-use methods of rule-based modeling for stochastic simulations. We finally suggest some methods for comprehensive parameter sensitivity analysis, and discuss the insights that it could yield. Examples, including code to try out, are provided based on a paper that modeled Ras kinetics in thymocytes.
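
    To make the mass-action-to-ODE translation concrete, the following minimal sketch (with assumed rate constants and initial concentrations, not values from the paper) integrates the Michaelis-Menten mechanism E + S <=> ES -> E + P with SciPy.

```python
# Minimal sketch: mass-action ODEs for E + S <=> ES -> E + P.
# Rate constants and initial concentrations are assumed values, not from the paper.
from scipy.integrate import solve_ivp

k_on, k_off, k_cat = 1.0, 0.5, 0.3

def rhs(t, y):
    E, S, ES, P = y
    v_bind = k_on * E * S - k_off * ES   # net reversible binding flux
    v_cat = k_cat * ES                   # catalytic conversion to product
    return [-v_bind + v_cat,             # dE/dt
            -v_bind,                     # dS/dt
            v_bind - v_cat,              # dES/dt
            v_cat]                       # dP/dt

sol = solve_ivp(rhs, (0.0, 50.0), [1.0, 10.0, 0.0, 0.0])
print("product concentration at t = 50:", round(sol.y[3, -1], 3))
```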

  11. A computational study of routing algorithms for realistic transportation networks

    Energy Technology Data Exchange (ETDEWEB)

    Jacob, R.; Marathe, M.V.; Nagel, K.

    1998-12-01

    The authors carry out an experimental analysis of a number of shortest path (routing) algorithms investigated in the context of the TRANSIMS (Transportation Analysis and Simulation System) project. The main focus of the paper is to study how various heuristic and exact solutions and associated data structures affected the computational performance of the software developed especially for realistic transportation networks. For this purpose the authors have used the Dallas Fort-Worth road network with a very high degree of resolution. The following general results are obtained: (1) they discuss and experimentally analyze various one-one shortest path algorithms, which include classical exact algorithms studied in the literature as well as heuristic solutions that are designed to take into account the geometric structure of the input instances; (2) they describe a number of extensions to the basic shortest path algorithm. These extensions were primarily motivated by practical problems arising in TRANSIMS and ITS (Intelligent Transportation Systems) related technologies. Extensions discussed include--(i) time dependent networks, (ii) multi-modal networks, (iii) networks with public transportation and associated schedules. Computational results are provided to empirically compare the efficiency of various algorithms. The studies indicate that a modified Dijkstra's algorithm is computationally fast and an excellent candidate for use in various transportation planning applications as well as ITS related technologies.
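
    Readers who want to reproduce a one-one shortest path query of the kind benchmarked in this study can start from the sketch below, which runs Dijkstra's algorithm on a small, made-up road graph with networkx; it is a toy illustration, not the TRANSIMS code or the Dallas Fort-Worth network.

```python
# Toy one-one shortest path (Dijkstra) query on a small, made-up road graph.
import networkx as nx

G = nx.Graph()
# (from, to, travel time in minutes) -- hypothetical values
G.add_weighted_edges_from([
    ("A", "B", 4), ("A", "C", 2), ("B", "C", 1),
    ("B", "D", 5), ("C", "D", 8), ("C", "E", 10),
    ("D", "E", 2), ("D", "F", 6), ("E", "F", 3),
])

route = nx.dijkstra_path(G, "A", "F", weight="weight")
cost = nx.dijkstra_path_length(G, "A", "F", weight="weight")
print("route:", route, "total travel time:", cost)
```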

  12. Computer network defense through radial wave functions

    Science.gov (United States)

    Malloy, Ian J.

    The purpose of this research is to synthesize basic and fundamental findings in quantum computing, as applied to the attack and defense of conventional computer networks. The concept focuses on uses of radio waves as a shield for, and attack against traditional computers. A logic bomb is analogous to a landmine in a computer network, and if one was to implement it as non-trivial mitigation, it will aid computer network defense. As has been seen in kinetic warfare, the use of landmines has been devastating to geopolitical regions in that they are severely difficult for a civilian to avoid triggering given the unknown position of a landmine. Thus, the importance of understanding a logic bomb is relevant and has corollaries to quantum mechanics as well. The research synthesizes quantum logic phase shifts in certain respects using the Dynamic Data Exchange protocol in software written for this work, as well as a C-NOT gate applied to a virtual quantum circuit environment by implementing a Quantum Fourier Transform. The research focus applies the principles of coherence and entanglement from quantum physics, the concept of expert systems in artificial intelligence, principles of prime number based cryptography with trapdoor functions, and modeling radio wave propagation against an event from unknown parameters. This comes as a program relying on the artificial intelligence concept of an expert system in conjunction with trigger events for a trapdoor function relying on infinite recursion, as well as system mechanics for elliptic curve cryptography along orbital angular momenta. Here trapdoor both denotes the form of cipher, as well as the implied relationship to logic bombs.

  13. The research of computer network security and protection strategy

    Science.gov (United States)

    He, Jian

    2017-05-01

    With the widespread popularity of computer network applications, network security has also received a high degree of attention. The factors affecting network security are complex, and securing a network is systematic work that poses a considerable challenge. Addressing the safety and reliability problems of computer network systems, this paper draws on practical work experience to discuss network security threats, security technologies, and system design principles, and offers suggestions and measures intended to help the broad base of computer network users raise their security awareness and master basic network security techniques.

  14. Using satellite communications for a mobile computer network

    Science.gov (United States)

    Wyman, Douglas J.

    1993-01-01

    The topics discussed include the following: patrol car automation, mobile computer network, network requirements, network design overview, MCN mobile network software, MCN hub operation, mobile satellite software, hub satellite software, the benefits of patrol car automation, the benefits of satellite mobile computing, and national law enforcement satellite.

  15. Design and implementation of a local computer network

    Energy Technology Data Exchange (ETDEWEB)

    Fortune, P. J.; Lidinsky, W. P.; Zelle, B. R.

    1977-01-01

    An intralaboratory computer communications network was designed and is being implemented at Argonne National Laboratory. Parameters which were considered to be important in the network design are discussed; and the network, including its hardware and software components, is described. A discussion of the relationship between computer networks and distributed processing systems is also presented. The problems which the network is designed to solve and the consequent network structure represent considerations which are of general interest. 5 figures.

  16. Network meta-analysis, electrical networks and graph theory.

    Science.gov (United States)

    Rücker, Gerta

    2012-12-01

    Network meta-analysis is an active field of research in clinical biostatistics. It aims to combine information from all randomized comparisons among a set of treatments for a given medical condition. We show how graph-theoretical methods can be applied to network meta-analysis. A meta-analytic graph consists of vertices (treatments) and edges (randomized comparisons). We illustrate the correspondence between meta-analytic networks and electrical networks, where variance corresponds to resistance, treatment effects to voltage, and weighted treatment effects to current flows. Based thereon, we then show that graph-theoretical methods that have been routinely applied to electrical networks also work well in network meta-analysis. In more detail, the resulting consistent treatment effects induced in the edges can be estimated via the Moore-Penrose pseudoinverse of the Laplacian matrix. Moreover, the variances of the treatment effects are estimated in analogy to electrical effective resistances. It is shown that this method, being computationally simple, leads to the usual fixed effect model estimate when applied to pairwise meta-analysis and is consistent with published results when applied to network meta-analysis examples from the literature. Moreover, problems of heterogeneity and inconsistency, random effects modeling and including multi-armed trials are addressed. Copyright © 2012 John Wiley & Sons, Ltd.
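
    The Laplacian-pseudoinverse estimate described here amounts to a weighted least-squares problem on the comparison graph. The sketch below illustrates the linear algebra on a made-up four-treatment network (hypothetical effects and variances), following the electrical analogy with weights as inverse variances and effective resistances as comparison variances; it is not a substitute for a published implementation such as the netmeta R package.

```python
# Sketch of the graph-theoretical estimate: consistent treatment effects via the
# Moore-Penrose pseudoinverse of the weighted Laplacian. Data are hypothetical.
import numpy as np

treatments = ["A", "B", "C", "D"]
# (i, j, observed effect d_ij, variance) for each pairwise comparison
edges = [(0, 1, 0.5, 0.04), (0, 2, 0.8, 0.09),
         (1, 2, 0.2, 0.05), (1, 3, 0.7, 0.06), (2, 3, 0.4, 0.08)]

n, m = len(treatments), len(edges)
B = np.zeros((m, n))                 # edge-vertex incidence matrix
d = np.zeros(m)                      # observed pairwise effects
W = np.zeros((m, m))                 # diagonal weights = inverse variances
for k, (i, j, eff, var) in enumerate(edges):
    B[k, i], B[k, j] = 1.0, -1.0
    d[k] = eff
    W[k, k] = 1.0 / var

L = B.T @ W @ B                      # weighted graph Laplacian
Lplus = np.linalg.pinv(L)            # Moore-Penrose pseudoinverse
theta = Lplus @ B.T @ W @ d          # treatment-level effects (sum-to-zero scaling)
d_consistent = B @ theta             # consistent effects induced in the edges

# variance of the A-vs-D comparison = effective resistance between nodes 0 and 3
var_AD = Lplus[0, 0] + Lplus[3, 3] - 2 * Lplus[0, 3]
print("consistent effects relative to A:", np.round(theta - theta[0], 3))
print("variance of the A-vs-D estimate:", round(var_AD, 4))
```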

  17. Water Pipeline Network Analysis Using Simultaneous Loop Flow ...

    African Journals Online (AJOL)

    2013-03-01

    ... solving for the unknown in water network analysis. It is based on a loop iterative computation. The Newton-Raphson method is a better technique for solving the network problems; however, the method adopted here computes simultaneous flow corrections for all loops, hence, the best since the computational ...

  18. Small-world networks in neuronal populations: a computational perspective.

    Science.gov (United States)

    Zippo, Antonio G; Gelsomino, Giuliana; Van Duin, Pieter; Nencini, Sara; Caramenti, Gian Carlo; Valente, Maurizio; Biella, Gabriele E M

    2013-08-01

    The analysis of the brain in terms of integrated neural networks may offer insights on the reciprocal relation between structure and information processing. Even with inherent technical limits, many studies acknowledge neuron spatial arrangements and communication modes as key factors. In this perspective, we investigated the functional organization of neuronal networks by explicitly assuming a specific functional topology, the small-world network. We developed two different computational approaches. Firstly, we asked whether neuronal populations actually express small-world properties during a definite task, such as a learning task. For this purpose we developed the Inductive Conceptual Network (ICN), which is a hierarchical bio-inspired spiking network, capable of learning invariant patterns by using variable-order Markov models implemented in its nodes. As a result, we actually observed small-world topologies during learning in the ICN. Speculating that the expression of small-world networks is not solely related to learning tasks, we then built a de facto network assuming that the information processing in the brain may occur through functional small-world topologies. In this de facto network, synchronous spikes reflected functional small-world network dependencies. In order to verify the consistency of the assumption, we tested the null-hypothesis by replacing the small-world networks with random networks. As a result, only small world networks exhibited functional biomimetic characteristics such as timing and rate codes, conventional coding strategies and neuronal avalanches, which are cascades of bursting activities with a power-law distribution. Our results suggest that small-world functional configurations are liable to underpin brain information processing at neuronal level. Copyright © 2013 Elsevier Ltd. All rights reserved.
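
    For readers unfamiliar with the small-world topology assumed in this work, the sketch below (a generic networkx example, unrelated to the ICN model) builds a Watts-Strogatz graph and contrasts its clustering coefficient and average shortest path length with those of an edge-matched random graph.

```python
# Illustration of the small-world property: high clustering combined with short paths.
import networkx as nx

n, k, p = 1000, 10, 0.1                          # nodes, neighbours, rewiring probability
sw = nx.connected_watts_strogatz_graph(n, k, p, seed=1)
rnd = nx.gnm_random_graph(n, sw.number_of_edges(), seed=1)
if not nx.is_connected(rnd):                     # keep the giant component if needed
    rnd = rnd.subgraph(max(nx.connected_components(rnd), key=len)).copy()

for name, g in [("small-world", sw), ("random", rnd)]:
    print(name,
          "clustering:", round(nx.average_clustering(g), 3),
          "average shortest path:", round(nx.average_shortest_path_length(g), 2))
```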

  19. Propagation models for computing biochemical reaction networks

    OpenAIRE

    Henzinger, Thomas A; Mateescu, Maria

    2011-01-01

    We introduce propagation models, a formalism designed to support general and efficient data structures for the transient analysis of biochemical reaction networks. We give two use cases for propagation abstract data types: the uniformization method and numerical integration. We also sketch an implementation of a propagation abstract data type, which uses abstraction to approximate states.

  20. Functional stoichiometric analysis of metabolic networks.

    Science.gov (United States)

    Urbanczik, R; Wagner, C

    2005-11-15

    An important tool in Systems Biology is the stoichiometric modeling of metabolic networks, where the stationary states of the network are described by a high-dimensional polyhedral cone, the so-called flux cone. Exhaustive descriptions of the metabolism can be obtained by computing the elementary vectors of this cone but, owing to a combinatorial explosion of the number of elementary vectors, this approach becomes computationally intractable for genome scale networks. Hence, we propose to instead focus on the conversion cone, a projection of the flux cone, which describes the interaction of the metabolism with its external chemical environment. We present a direct method for calculating the elementary vectors of this cone and, by studying the metabolism of Saccharomyces cerevisiae, we demonstrate that such an analysis is computationally feasible even for genome scale networks.

  1. Connect the dot: Computing feed-links for network extension

    Directory of Open Access Journals (Sweden)

    Boris Aronov

    2011-12-01

    Full Text Available Road network analysis can require distance from points that are not on the network themselves. We study the algorithmic problem of connecting a point inside a face (region) of the road network to its boundary while minimizing the detour factor of that point to any point on the boundary of the face. We show that the optimal single connection (feed-link) can be computed in O(lambda_7(n) log n) time, where n is the number of vertices that bounds the face and lambda_7(n) is the slightly superlinear maximum length of a Davenport-Schinzel sequence of order 7 on n symbols. We also present approximation results for placing more feed-links, deal with the case that there are obstacles in the face of the road network that contains the point to be connected, and present various related results.

  2. Computational capabilities of graph neural networks.

    Science.gov (United States)

    Scarselli, Franco; Gori, Marco; Tsoi, Ah Chung; Hagenbuchner, Markus; Monfardini, Gabriele

    2009-01-01

    In this paper, we will consider the approximation properties of a recently introduced neural network model called graph neural network (GNN), which can be used to process structured data inputs, e.g., acyclic graphs, cyclic graphs, and directed or undirected graphs. This class of neural networks implements a function tau(G,n) in IR^m that maps a graph G and one of its nodes n onto an m-dimensional Euclidean space. We characterize the functions that can be approximated by GNNs, in probability, up to any prescribed degree of precision. This set contains the maps that satisfy a property called preservation of the unfolding equivalence, and includes most of the practically useful functions on graphs; the only known exception is when the input graph contains particular patterns of symmetries when unfolding equivalence may not be preserved. The result can be considered an extension of the universal approximation property established for the classic feedforward neural networks (FNNs). Some experimental examples are used to show the computational capabilities of the proposed model.
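
    The GNN model analyzed in the paper cannot be reproduced in a few lines, but the basic idea of a function that maps a graph and one of its nodes to a low-dimensional vector can be sketched with a single mean-aggregation message-passing layer in NumPy. The weights below are random placeholders, and the layer is only an illustrative toy, not the model studied by the authors.

```python
# Toy mean-aggregation message-passing layer in NumPy: each node state is updated
# from the average of its neighbours and mapped to IR^2. Random placeholder weights;
# this is an illustration of the idea, not the GNN model analyzed in the paper.
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[0, 1, 1, 0],          # adjacency matrix of a small undirected graph
              [1, 0, 1, 1],
              [1, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)
X = rng.normal(size=(4, 3))          # initial node features (4 nodes, 3 features)

A_hat = A / A.sum(axis=1, keepdims=True)   # row-normalised: mean over neighbours
W1 = rng.normal(size=(3, 8))               # hidden transform
W2 = rng.normal(size=(8, 2))               # output transform to IR^2 (m = 2)

H = np.tanh(A_hat @ X @ W1)          # one round of neighbour aggregation + nonlinearity
tau = np.tanh(A_hat @ H @ W2)        # tau(G, n): one output vector per node n
print(tau)
```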

  3. Computer network security and cyber ethics

    CERN Document Server

    Kizza, Joseph Migga

    2014-01-01

    In its 4th edition, this book remains focused on increasing public awareness of the nature and motives of cyber vandalism and cybercriminals, the weaknesses inherent in cyberspace infrastructure, and the means available to protect ourselves and our society. This new edition aims to integrate security education and awareness with discussions of morality and ethics. The reader will gain an understanding of how the security of information in general and of computer networks in particular, on which our national critical infrastructure and, indeed, our lives depend, is based squarely on the individ

  4. Some queuing network models of computer systems

    Science.gov (United States)

    Herndon, E. S.

    1980-01-01

    Queuing network models of a computer system operating with a single workload type are presented. Program algorithms are adapted for use on the Texas Instruments SR-52 programmable calculator. By slightly altering the algorithm to process the G and H matrices row by row instead of column by column, six devices and an unlimited job/terminal population could be handled on the SR-52. Techniques are also introduced for handling a simple load dependent server and for studying interactive systems with fixed multiprogramming limits.

  5. Computer vulnerability risk analysis.

    OpenAIRE

    2008-01-01

    The discussions presented in this dissertation have been undertaken in answer to the need for securing the intellectual assets stored on computer systems. Computer vulnerabilities and their influence on computer systems and the intellectual assets they possess are the main focus of this research. In an effort to portray the influence of vulnerabilities on a computer system, a method for assigning a measure of risk to individual vulnerabilities is proposed. This measure of risk, in turn, gives...

  6. WEB BASED LEARNING OF COMPUTER NETWORK COURSE

    Directory of Open Access Journals (Sweden)

    Hakan KAPTAN

    2004-04-01

    Full Text Available As a result of developments in the Internet and computer fields, web-based education has become one of the areas in which many research and improvement studies are carried out. In this study, web-based education materials are described for a multimedia animation- and simulation-aided Computer Networks course in Technical Education Faculties. The course content is built from university course books, web-based education materials and the technology web pages of companies. It is presented as texts, pictures and figures to increase student motivation, and to facilitate learning some topics are supported by animations. Furthermore, to illustrate the working principles of routing algorithms and congestion control algorithms, simulators were constructed to enable interactive learning.

  7. Choice Of Computer Networking Cables And Their Effect On Data ...

    African Journals Online (AJOL)

    Computer networking is the order of the day in this Information and Communication Technology (ICT) age. Although a network can be through a wireless device most local connections are done using cables. There are three main computer-networking cables namely coaxial cable, unshielded twisted pair cable and the optic ...

  8. Extending Stochastic Network Calculus to Loss Analysis

    Directory of Open Access Journals (Sweden)

    Chao Luo

    2013-01-01

    Full Text Available Loss is an important parameter of Quality of Service (QoS. Though stochastic network calculus is a very useful tool for performance evaluation of computer networks, existing studies on stochastic service guarantees mainly focused on the delay and backlog. Some efforts have been made to analyse loss by deterministic network calculus, but there are few results to extend stochastic network calculus for loss analysis. In this paper, we introduce a new parameter named loss factor into stochastic network calculus and then derive the loss bound through the existing arrival curve and service curve via this parameter. We then prove that our result is suitable for the networks with multiple input flows. Simulations show the impact of buffer size, arrival traffic, and service on the loss factor.

  9. Computational study of noise in a large signal transduction network

    Directory of Open Access Journals (Sweden)

    Ruohonen Keijo

    2011-06-01

    Full Text Available Abstract Background Biochemical systems are inherently noisy due to the discrete reaction events that occur in a random manner. Although noise is often perceived as a disturbing factor, the system might actually benefit from it. In order to understand the role of noise better, its quality must be studied in a quantitative manner. Computational analysis and modeling play an essential role in this demanding endeavor. Results We implemented a large nonlinear signal transduction network combining protein kinase C, mitogen-activated protein kinase, phospholipase A2, and β isoform of phospholipase C networks. We simulated the network in 300 different cellular volumes using the exact Gillespie stochastic simulation algorithm and analyzed the results in both the time and frequency domain. In order to perform simulations in a reasonable time, we used modern parallel computing techniques. The analysis revealed that time and frequency domain characteristics depend on the system volume. The simulation results also indicated that there are several kinds of noise processes in the network, all of them representing different kinds of low-frequency fluctuations. In the simulations, the power of noise decreased on all frequencies when the system volume was increased. Conclusions We concluded that basic frequency domain techniques can be applied to the analysis of simulation results produced by the Gillespie stochastic simulation algorithm. This approach is suited not only to the study of fluctuations but also to the study of pure noise processes. Noise seems to have an important role in biochemical systems and its properties can be numerically studied by simulating the reacting system in different cellular volumes. Parallel computing techniques make it possible to run massive simulations in hundreds of volumes and, as a result, accurate statistics can be obtained from computational studies.
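
    The study's volume-dependence results rest on the Gillespie stochastic simulation algorithm. The following self-contained sketch applies the algorithm to a simple birth-death process with assumed rates (far simpler than the PKC/MAPK/PLA2/PLCβ network) and shows how relative fluctuations shrink as the system volume grows.

```python
# Minimal Gillespie SSA for a birth-death process: 0 -> X at rate k1*V, X -> 0 at rate k2*X.
# Rates are assumed values; the point is how relative noise shrinks with volume V.
import numpy as np

def gillespie_birth_death(k1, k2, volume, x0, t_end, seed=0):
    rng = np.random.default_rng(seed)
    t, x = 0.0, x0
    counts = [x]
    while t < t_end:
        a1 = k1 * volume                          # propensity of production
        a2 = k2 * x                               # propensity of degradation
        a0 = a1 + a2
        t += rng.exponential(1.0 / a0)            # waiting time to the next reaction
        x += 1 if rng.random() < a1 / a0 else -1  # pick which reaction fires
        counts.append(x)
    return np.array(counts)

for V in (1.0, 10.0, 100.0):
    x = gillespie_birth_death(k1=10.0, k2=1.0, volume=V, x0=0, t_end=50.0, seed=1)
    print(f"V = {V:6.1f}   mean = {x.mean():8.1f}   CV = {x.std() / x.mean():.3f}")
```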

  10. NEAT: an efficient network enrichment analysis test.

    Science.gov (United States)

    Signorelli, Mirko; Vinciotti, Veronica; Wit, Ernst C

    2016-09-05

    Network enrichment analysis is a powerful method, which allows to integrate gene enrichment analysis with the information on relationships between genes that is provided by gene networks. Existing tests for network enrichment analysis deal only with undirected networks, they can be computationally slow and are based on normality assumptions. We propose NEAT, a test for network enrichment analysis. The test is based on the hypergeometric distribution, which naturally arises as the null distribution in this context. NEAT can be applied not only to undirected, but to directed and partially directed networks as well. Our simulations indicate that NEAT is considerably faster than alternative resampling-based methods, and that its capacity to detect enrichments is at least as good as the one of alternative tests. We discuss applications of NEAT to network analyses in yeast by testing for enrichment of the Environmental Stress Response target gene set with GO Slim and KEGG functional gene sets, and also by inspecting associations between functional sets themselves. NEAT is a flexible and efficient test for network enrichment analysis that aims to overcome some limitations of existing resampling-based tests. The method is implemented in the R package neat, which can be freely downloaded from CRAN ( https://cran.r-project.org/package=neat ).
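
    The hypergeometric idea behind the test can be illustrated on a toy problem. The sketch below is a simplified stand-in for the NEAT statistic, not the neat R package: it counts the links running between a gene set A and a functional set B in a random placeholder network and compares the count with a hypergeometric null in which A's edge endpoints fall on the network at random.

```python
# Simplified illustration of network enrichment with a hypergeometric null;
# a toy version of the idea, not the exact NEAT statistic or the neat R package.
import networkx as nx
from scipy.stats import hypergeom

G = nx.gnm_random_graph(200, 1000, seed=2)      # placeholder gene network
A = set(range(0, 20))                           # e.g. differentially expressed genes
B = set(range(10, 50))                          # e.g. a functional gene set

# observed number of edges linking A to B
observed = sum(1 for u, v in G.edges()
               if (u in A and v in B) or (v in A and u in B))

deg = dict(G.degree())
d_A = sum(deg[u] for u in A)                    # total degree of A
d_B = sum(deg[v] for v in B)                    # total degree of B
total = sum(deg.values())                       # total degree of the whole network

# null model: A's d_A edge endpoints land on the network's endpoints at random,
# of which d_B belong to B; p-value = P(X >= observed)
pval = hypergeom.sf(observed - 1, total, d_B, d_A)
print(f"observed A-B links: {observed}, enrichment p-value: {pval:.3g}")
```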

  11. Computational Aspects of Sensor Network Protocols (Distributed Sensor Network Simulator

    Directory of Open Access Journals (Sweden)

    Vasanth Iyer

    2009-08-01

    Full Text Available In this work, we model sensor networks as an unsupervised learning and clustering process. We classify nodes according to their static distribution to form known class densities (CCPD). These densities are chosen from specific cross-layer features which maximize the lifetime of power-aware routing algorithms. To circumvent the computational complexities of a power-aware communication STACK we introduce path-loss models at the nodes only for high density deployments. We study the cluster heads and formulate the data handling capacity for an expected deployment and use localized probability models to fuse the data with its side information before transmission. So each cluster head has a unique Pmax but not all cluster heads have the same measured value. In a lossless mode, if there are no faults in the sensor network, then we can show that the highest probability given by Pmax is ambiguous if its frequency is ≤ n/2, otherwise it can be determined by a local function. We further show that the event detection at the cluster heads can be modelled with a pattern 2m and m, the number of bits, can be a correlated pattern of 2 bits, and for a tight lower bound we use 3-bit Huffman codes which have entropy < 1. These local algorithms are further studied to optimize on power, fault detection and to maximize on the distributed routing algorithm used at the higher layers. From these bounds in large networks, it is observed that the power dissipation is network size invariant. The performance of the routing algorithms is solely based on the success of finding healthy nodes in a large distribution. It is also observed that if the network size is kept constant and the density of the nodes is kept close, then the local path-loss model affects the performance of the routing algorithms. We also obtain the maximum intensity of transmitting nodes for a given category of routing algorithms under an outage constraint, i.e., the lifetime of the sensor network.

  12. On Distributed Computation in Noisy Random Planar Networks

    OpenAIRE

    Kanoria, Y.; Manjunath, D.

    2007-01-01

    We consider distributed computation of functions of distributed data in random planar networks with noisy wireless links. We present a new algorithm for computation of the maximum value which is order optimal in the number of transmissions and computation time. We also adapt the histogram computation algorithm of Ying et al. to make the histogram computation time optimal.

  13. Application of artificial neural networks in computer-aided diagnosis.

    Science.gov (United States)

    Liu, Bei

    2015-01-01

    Computer-aided diagnosis is a diagnostic procedure in which a radiologist uses the outputs of computer analysis of medical images as a second opinion in the interpretation of medical images, either to help with lesion detection or to help determine if the lesion is benign or malignant. Artificial neural networks (ANNs) are usually employed to formulate the statistical models for computer analysis. Receiver operating characteristic curves are used to evaluate the performance of the ANN alone, as well as the diagnostic performance of radiologists who take into account the ANN output as a second opinion. In this chapter, we use mammograms to illustrate how an ANN model is trained, tested, and evaluated, and how a radiologist should use the ANN output as a second opinion in CAD.

  14. Mobile Computing and Ubiquitous Networking: Concepts, Technologies and Challenges.

    Science.gov (United States)

    Pierre, Samuel

    2001-01-01

    Analyzes concepts, technologies and challenges related to mobile computing and networking. Defines basic concepts of cellular systems. Describes the evolution of wireless technologies that constitute the foundations of mobile computing and ubiquitous networking. Presents characterization and issues of mobile computing. Analyzes economical and…

  15. Biological networks 101: computational modeling for molecular biologists.

    Science.gov (United States)

    Scholma, Jetse; Schivo, Stefano; Urquidi Camacho, Ricardo A; van de Pol, Jaco; Karperien, Marcel; Post, Janine N

    2014-01-01

    Computational modeling of biological networks permits the comprehensive analysis of cells and tissues to define molecular phenotypes and novel hypotheses. Although a large number of software tools have been developed, the versatility of these tools is limited by mathematical complexities that prevent their broad adoption and effective use by molecular biologists. This study clarifies the basic aspects of molecular modeling, how to convert data into useful input, as well as the number of time points and molecular parameters that should be considered for molecular regulatory models with both explanatory and predictive potential. We illustrate the necessary experimental preconditions for converting data into a computational model of network dynamics. This model requires neither a thorough background in mathematics nor precise data on intracellular concentrations, binding affinities or reaction kinetics. Finally, we show how an interactive model of crosstalk between signal transduction pathways in primary human articular chondrocytes allows insight into processes that regulate gene expression. © 2013 Elsevier B.V. All rights reserved.

  16. Social network analysis in medical education.

    Science.gov (United States)

    Isba, Rachel; Woolf, Katherine; Hanneman, Robert

    2017-01-01

    Humans are fundamentally social beings. The social systems within which we live our lives (families, schools, workplaces, professions, friendship groups) have a significant influence on our health, success and well-being. These groups can be characterised as networks and analysed using social network analysis. Social network analysis is a mainly quantitative method for analysing how relationships between individuals form and affect those individuals, but also how individual relationships build up into wider social structures that influence outcomes at a group level. Recent increases in computational power have increased the accessibility of social network analysis methods for application to medical education research. Social network analysis has been used to explore team-working, social influences on attitudes and behaviours, the influence of social position on individual success, and the relationship between social cohesion and power. This makes social network analysis theories and methods relevant to understanding the social processes underlying academic performance, workplace learning and policy-making and implementation in medical education contexts. Social network analysis is underused in medical education, yet it is a method that could yield significant insights that would improve experiences and outcomes for medical trainees and educators, and ultimately for patients. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  17. Network Analysis, Architecture, and Design

    CERN Document Server

    McCabe, James D

    2007-01-01

    Traditionally, networking has had little or no basis in analysis or architectural development, with designers relying on technologies they are most familiar with or being influenced by vendors or consultants. However, the landscape of networking has changed so that network services have now become one of the most important factors to the success of many third generation networks. It has become an important feature of the designer's job to define the problems that exist in his network, choose and analyze several optimization parameters during the analysis process, and then prioritize and evalua

  18. Analysis of complex networks using aggressive abstraction.

    Energy Technology Data Exchange (ETDEWEB)

    Colbaugh, Richard; Glass, Kristin.; Willard, Gerald

    2008-10-01

    This paper presents a new methodology for analyzing complex networks in which the network of interest is first abstracted to a much simpler (but equivalent) representation, the required analysis is performed using the abstraction, and analytic conclusions are then mapped back to the original network and interpreted there. We begin by identifying a broad and important class of complex networks which admit abstractions that are simultaneously dramatically simplifying and property preserving -- we call these aggressive abstractions -- and which can therefore be analyzed using the proposed approach. We then introduce and develop two forms of aggressive abstraction: 1.) finite state abstraction, in which dynamical networks with uncountable state spaces are modeled using finite state systems, and 2.) one-dimensional abstraction, whereby high dimensional network dynamics are captured in a meaningful way using a single scalar variable. In each case, the property preserving nature of the abstraction process is rigorously established and efficient algorithms are presented for computing the abstraction. The considerable potential of the proposed approach to complex network analysis is illustrated through case studies involving vulnerability analysis of technological networks and predictive analysis for social processes.

  19. Social network analysis applied to team sports analysis

    CERN Document Server

    Clemente, Filipe Manuel; Mendes, Rui Sousa

    2016-01-01

    Explaining how graph theory and social network analysis can be applied to team sports analysis, this book presents useful approaches, models and methods that can be used to characterise the overall properties of team networks and identify the prominence of each team player. Exploring the different possible network metrics that can be utilised in sports analysis, their possible applications and variances from situation to situation, the respective chapters present an array of illustrative case studies. Identifying the general concepts of social network analysis and network centrality metrics, readers are shown how to generate a methodological protocol for data collection. As such, the book provides a valuable resource for students of the sport sciences, sports engineering, applied computation and the social sciences.

  20. Applications of social media and social network analysis

    CERN Document Server

    Kazienko, Przemyslaw

    2015-01-01

    This collection of contributed chapters demonstrates a wide range of applications within two overlapping research domains: social media analysis and social network analysis. Various methodologies were utilized in the twelve individual chapters including static, dynamic and real-time approaches to graph, textual and multimedia data analysis. The topics apply to reputation computation, emotion detection, topic evolution, rumor propagation, evaluation of textual opinions, friend ranking, analysis of public transportation networks, diffusion in dynamic networks, analysis of contributors to commun

  1. 1st International Conference on Network Analysis

    CERN Document Server

    Kalyagin, Valery; Pardalos, Panos

    2013-01-01

    This volume contains a selection of contributions from the "First International Conference in Network Analysis," held at the University of Florida, Gainesville, on December 14-16, 2011. The remarkable diversity of fields that take advantage of Network Analysis makes the endeavor of gathering up-to-date material in a single compilation a useful, yet very difficult, task. The purpose of this volume is to overcome this difficulty by collecting the major results found by the participants and combining them in one easily accessible compilation. Network analysis has become a major research topic over the last several years. The broad range of applications that can be described and analyzed by means of a network is bringing together researchers, practitioners and other scientific communities from numerous fields such as Operations Research, Computer Science, Transportation, Energy, Social Sciences, and more. The contributions not only come from different fields, but also cover a broad range of topics relevant to the...

  2. 2013 International Conference on Computer Engineering and Network

    CERN Document Server

    Zhu, Tingshao

    2014-01-01

    This book aims to examine innovation in the fields of computer engineering and networking. The book covers important emerging topics in computer engineering and networking, and it will help researchers and engineers improve their knowledge of state-of-art in related areas. The book presents papers from The Proceedings of the 2013 International Conference on Computer Engineering and Network (CENet2013) which was held on July 20-21, in Shanghai, China.

  3. High Performance Networks From Supercomputing to Cloud Computing

    CERN Document Server

    Abts, Dennis

    2011-01-01

    Datacenter networks provide the communication substrate for large parallel computer systems that form the ecosystem for high performance computing (HPC) systems and modern Internet applications. The design of new datacenter networks is motivated by an array of applications ranging from communication intensive climatology, complex material simulations and molecular dynamics to such Internet applications as Web search, language translation, collaborative Internet applications, streaming video and voice-over-IP. For both Supercomputing and Cloud Computing the network enables distributed applicati

  4. Network topology analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Kalb, Jeffrey L.; Lee, David S.

    2008-01-01

    Emerging high-bandwidth, low-latency network technology has made network-based architectures both feasible and potentially desirable for use in satellite payload architectures. The selection of network topology is a critical component when developing these multi-node or multi-point architectures. This study examines network topologies and their effect on overall network performance. Numerous topologies were reviewed against a number of performance, reliability, and cost metrics. This document identifies a handful of good network topologies for satellite applications and the metrics used to justify them as such. Since often multiple topologies will meet the requirements of the satellite payload architecture under development, the choice of network topology is not easy, and in the end the choice of topology is influenced by both the design characteristics and requirements of the overall system and the experience of the developer.

  5. Medical image analysis with artificial neural networks.

    Science.gov (United States)

    Jiang, J; Trundle, P; Ren, J

    2010-12-01

    Given that neural networks have been widely reported in the research community of medical imaging, we provide a focused literature survey on recent neural network developments in computer-aided diagnosis, medical image segmentation and edge detection towards visual content analysis, and medical image registration for its pre-processing and post-processing, with the aims of increasing awareness of how neural networks can be applied to these areas and to provide a foundation for further research and practical development. Representative techniques and algorithms are explained in detail to provide inspiring examples illustrating: (i) how a known neural network with fixed structure and training procedure could be applied to resolve a medical imaging problem; (ii) how medical images could be analysed, processed, and characterised by neural networks; and (iii) how neural networks could be expanded further to resolve problems relevant to medical imaging. In the concluding section, a highlight of comparisons among many neural network applications is included to provide a global view on computational intelligence with neural networks in medical imaging. Copyright © 2010 Elsevier Ltd. All rights reserved.

  6. Network Computer Technology. Phase I: Viability and Promise within NASA's Desktop Computing Environment

    Science.gov (United States)

    Paluzzi, Peter; Miller, Rosalind; Kurihara, West; Eskey, Megan

    1998-01-01

    Over the past several months, major industry vendors have made a business case for the network computer as a win-win solution toward lowering total cost of ownership. This report provides results from Phase I of the Ames Research Center network computer evaluation project. It identifies factors to be considered for determining cost of ownership; further, it examines where, when, and how network computer technology might fit in NASA's desktop computing architecture.

  7. DETECTING NETWORK ATTACKS IN COMPUTER NETWORKS BY USING DATA MINING METHODS

    OpenAIRE

    Platonov, V. V.; Semenov, P. O.

    2016-01-01

    The article describes an approach to the development of an intrusion detection system for computer networks. It is shown that the use of several data mining methods and tools can improve the efficiency of protecting computer networks against network attacks, owing to the combination of the benefits of signature detection and anomaly detection and the opportunity to adapt the system to the hardware and software structure of the computer network.

  8. Email networks and the spread of computer viruses

    Science.gov (United States)

    Newman, M. E.; Forrest, Stephanie; Balthrop, Justin

    2002-09-01

    Many computer viruses spread via electronic mail, making use of computer users' email address books as a source for email addresses of new victims. These address books form a directed social network of connections between individuals over which the virus spreads. Here we investigate empirically the structure of this network using data drawn from a large computer installation, and discuss the implications of this structure for the understanding and prevention of computer virus epidemics.

  9. An Overview of Computer Network security and Research Technology

    OpenAIRE

    Rathore, Vandana

    2016-01-01

    The rapid development in the field of computer networks and systems brings both convenience and security threats for users. Security threats include network security and data security. Network security refers to the reliability, confidentiality, integrity and availability of the information in the system. The main objective of network security is to maintain the authenticity, integrity, confidentiality, availability of the network. This paper introduces the details of the technologies used in...

  10. Computational intelligence synergies of fuzzy logic, neural networks and evolutionary computing

    CERN Document Server

    Siddique, Nazmul

    2013-01-01

    Computational Intelligence: Synergies of Fuzzy Logic, Neural Networks and Evolutionary Computing presents an introduction to some of the cutting edge technological paradigms under the umbrella of computational intelligence. Computational intelligence schemes are investigated with the development of a suitable framework for fuzzy logic, neural networks and evolutionary computing, neuro-fuzzy systems, evolutionary-fuzzy systems and evolutionary neural systems. Applications to linear and non-linear systems are discussed with examples. Key features: Covers all the aspect

  11. Discussion on the Technology and Method of Computer Network Security Management

    Science.gov (United States)

    Zhou, Jianlei

    2017-09-01

    With the rapid development of information technology, the application of computer network technology has penetrated all aspects of society, changed the way people live and work to a certain extent, and brought great convenience to people. But computer network technology is not a panacea: it can promote social development, yet it can also cause damage to the community and the country. Because of the openness of computer networks and the ease with which resources are shared, network security is strongly affected; in particular, technical loopholes can lead to damage to network information. Based on this, this paper gives a brief analysis of computer network security management problems and security measures.

  12. Analysis and Assessment of Computer-Supported Collaborative Learning Conversations

    NARCIS (Netherlands)

    Trausan-Matu, Stefan

    2008-01-01

    Trausan-Matu, S. (2008). Analysis and Assessment of Computer-Supported Collaborative Learning Conversations. Workshop presentation at the symposium Learning networks for professional. November, 14, 2008, Heerlen, Nederland: Open Universiteit Nederland.

  13. Network Patch Cables Demystified: A Super Activity for Computer Networking Technology

    Science.gov (United States)

    Brown, Douglas L.

    2004-01-01

    This article de-mystifies network patch cable secrets so that people can connect their computers and transfer those pesky files--without screaming at the cables. It describes a network cabling activity that can offer students a great hands-on opportunity for working with the tools, techniques, and media used in computer networking. Since the…

  14. Tourism Destinations Network Analysis, Social Network Analysis Approach

    Directory of Open Access Journals (Sweden)

    2015-09-01

    Full Text Available The tourism industry is becoming one of the world's largest economic sectors, and is expected to become the world's leading industry by 2020. Previous studies have focused on several aspects of this industry including sociology, geography, tourism management and development, but have paid less attention to analytical and quantitative approaches. This study introduces some network analysis techniques and measures aimed at studying the structural characteristics of tourism networks. More specifically, it presents a methodology to analyze tourism destination networks. We apply the methodology to analyze Mazandaran's tourism destination network, one of the most famous tourism areas of Iran.

  15. Throughput capacity computation model for hybrid wireless networks

    African Journals Online (AJOL)

    ... wireless networks. We present in this paper a computational model for obtaining throughput capacity for hybrid wireless networks. For a hybrid network with n nodes and m base stations, we observe through simulation that the throughput capacity increases linearly with the base station infrastructure connected by the wired ...

  16. Novel Ethernet Based Optical Local Area Networks for Computer Interconnection

    NARCIS (Netherlands)

    Radovanovic, Igor; van Etten, Wim; Taniman, R.O.; Kleinkiskamp, Ronny

    2003-01-01

    In this paper we present new optical local area networks for fiber-to-the-desk application. Presented networks are expected to bring a solution for having optical fibers all the way to computers. To bring the overall implementation costs down we have based our networks on short-wavelength optical

  17. 4th International Conference on Computer Engineering and Networks

    CERN Document Server

    2015-01-01

    This book aims to examine innovation in the fields of computer engineering and networking. The book covers important emerging topics in computer engineering and networking, and it will help researchers and engineers improve their knowledge of the state of the art in related areas. The book presents papers from the 4th International Conference on Computer Engineering and Networks (CENet2014) held July 19-20, 2014 in Shanghai, China. The chapters cover emerging topics in computer engineering and networking and discuss how to improve productivity by using the latest advanced technologies.

  18. Constructing Precisely Computing Networks with Biophysical Spiking Neurons.

    Science.gov (United States)

    Schwemmer, Michael A; Fairhall, Adrienne L; Denéve, Sophie; Shea-Brown, Eric T

    2015-07-15

    While spike timing has been shown to carry detailed stimulus information at the sensory periphery, its possible role in network computation is less clear. Most models of computation by neural networks are based on population firing rates. In equivalent spiking implementations, firing is assumed to be random such that averaging across populations of neurons recovers the rate-based approach. Recently, however, Denéve and colleagues have suggested that the spiking behavior of neurons may be fundamental to how neuronal networks compute, with precise spike timing determined by each neuron's contribution to producing the desired output (Boerlin and Denéve, 2011; Boerlin et al., 2013). By postulating that each neuron fires to reduce the error in the network's output, it was demonstrated that linear computations can be performed by networks of integrate-and-fire neurons that communicate through instantaneous synapses. This left open, however, the possibility that realistic networks, with conductance-based neurons with subthreshold nonlinearity and the slower timescales of biophysical synapses, may not fit into this framework. Here, we show how the spike-based approach can be extended to biophysically plausible networks. We then show that our network reproduces a number of key features of cortical networks including irregular and Poisson-like spike times and a tight balance between excitation and inhibition. Lastly, we discuss how the behavior of our model scales with network size or with the number of neurons "recorded" from a larger computing network. These results significantly increase the biological plausibility of the spike-based approach to network computation. We derive a network of neurons with standard spike-generating currents and synapses with realistic timescales that computes based upon the principle that the precise timing of each spike is important for the computation. We then show that our network reproduces a number of key features of cortical networks

  19. Second International Conference on Advanced Computing, Networking and Informatics

    CERN Document Server

    Mohapatra, Durga; Konar, Amit; Chakraborty, Aruna

    2014-01-01

    Advanced Computing, Networking and Informatics are three distinct and mutually exclusive disciplines of knowledge with no apparent sharing/overlap among them. However, their convergence is observed in many real world applications, including cyber-security, internet banking, healthcare, sensor networks, cognitive radio, pervasive computing amidst many others. This two-volume proceedings explore the combined use of Advanced Computing and Informatics in the next generation wireless networks and security, signal and image processing, ontology and human-computer interfaces (HCI). The two volumes together include 148 scholarly papers, which have been accepted for presentation from over 640 submissions in the second International Conference on Advanced Computing, Networking and Informatics, 2014, held in Kolkata, India during June 24-26, 2014. The first volume includes innovative computing techniques and relevant research results in informatics with selective applications in pattern recognition, signal/image process...

  20. Integrating Network Awareness in ATLAS Distributed Computing Using the ANSE Project

    CERN Document Server

    Klimentov, Alexei; The ATLAS collaboration; Petrosyan, Artem; Batista, Jorge Horacio; Mc Kee, Shawn Patrick

    2015-01-01

    A crucial contributor to the success of the massively scaled global computing system that delivers the analysis needs of the LHC experiments is the networking infrastructure upon which the system is built. The experiments have been able to exploit excellent high-bandwidth networking in adapting their computing models for the most efficient utilization of resources. New advanced networking technologies now becoming available such as software defined networking hold the potential of further leveraging the network to optimize workflows and dataflows, through proactive control of the network fabric on the part of high level applications such as experiment workload management and data management systems. End to end monitoring of networks using perfSONAR combined with data flow performance metrics further allows applications to adapt based on real time conditions. We will describe efforts underway in ATLAS on integrating network awareness at the application level, particularly in workload management, building upon ...

  1. Computer analysis of Holter electrocardiogram.

    Science.gov (United States)

    Yanaga, T; Adachi, M; Sato, Y; Ichimaru, Y; Otsuka, K

    1994-10-01

    Computer analysis is indispensable for the interpretation of Holter ECG, because it involves a large quantity of data. Computer analysis of Holter ECG is similar to that of the conventional ECG; however, it presents some difficulties, such as abundant noise, limited analysis time and voluminous data. The main topics in computer analysis of Holter ECG are arrhythmias, ST-T changes, heart rate variability, QT interval, late potentials and the construction of databases. Although many papers have been published on the computer analysis of Holter ECG, only some of them are reviewed briefly in the present paper. We have studied the computer analysis of VPCs, ST-T changes, heart rate variability, QT interval and Cheyne-Stokes respiration during 24-hour ambulatory ECG monitoring. Further, we have studied ambulatory palmar sweating for the evaluation of mental stress during the day. In the future, the development of "the integrated Holter system", which enables the evaluation of ventricular vulnerability and modulating factors such as psychoneural hypersensitivity, may be important.

  2. Computer image analysis in obtaining characteristics of images: greenhouse tomatoes in the process of generating learning sets of artificial neural networks

    Science.gov (United States)

    Zaborowicz, M.; Przybył, J.; Koszela, K.; Boniecki, P.; Mueller, W.; Raba, B.; Lewicki, A.; Przybył, K.

    2014-04-01

    The aim of the project was to develop software that extracts the characteristics of a greenhouse tomato from its image. Data gathered during image analysis and processing were used to build learning sets for artificial neural networks. The program processes pictures in JPEG format, acquires statistical information about the picture and exports it to an external file. The software is intended for batch analysis of the collected research material, with the obtained information saved as a CSV file. The program computes 33 independent parameters that describe the tested image. The application is dedicated to the processing and image analysis of greenhouse tomatoes, but it can also be used to analyse other fruits and vegetables of spherical shape.
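
    The batch extract-and-export workflow described here can be approximated with standard Python imaging tools. The sketch below is a hypothetical example, not the authors' program: it assumes a folder of JPEG images, computes a handful of simple colour statistics (far fewer than the 33 parameters of the original software), and writes one CSV row per image.

```python
# Hypothetical batch feature extraction: JPEG images in, one CSV row per image out.
# The folder name and the (very small) feature set are assumptions for illustration.
import csv
from pathlib import Path

import numpy as np
from PIL import Image

def image_features(path):
    """A handful of simple colour and size statistics for one RGB image."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=float)
    feats = {"file": path.name, "width": rgb.shape[1], "height": rgb.shape[0]}
    for i, channel in enumerate("RGB"):
        feats[f"mean_{channel}"] = round(rgb[..., i].mean(), 2)
        feats[f"std_{channel}"] = round(rgb[..., i].std(), 2)
    return feats

image_dir = Path("tomato_images")                       # assumed input folder
rows = [image_features(p) for p in sorted(image_dir.glob("*.jpg"))]

if rows:
    with open("tomato_features.csv", "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
```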

  3. Network selection, Information filtering and Scalable computation

    Science.gov (United States)

    Ye, Changqing

    This dissertation explores two application scenarios of the sparsity pursuit method on large scale data sets. The first scenario is classification and regression in analyzing high dimensional structured data, where predictors correspond to nodes of a given directed graph. This arises in, for instance, identification of disease genes for Parkinson's disease from a network of candidate genes. In such a situation, the directed graph describes dependencies among the genes, where the directions of edges represent certain causal effects. Key to high-dimensional structured classification and regression is how to utilize dependencies among predictors as specified by directions of the graph. In this dissertation, we develop a novel method that fully takes into account such dependencies formulated through certain nonlinear constraints. We apply the proposed method to two applications, feature selection in large margin binary classification and in linear regression. We implement the proposed method through difference convex programming for the cost function and constraints. Finally, theoretical and numerical analyses suggest that the proposed method achieves the desired objectives. An application to disease gene identification is presented. The second application scenario is personalized information filtering, which extracts the information specifically relevant to a user, predicting his/her preference over a large number of items, based on the opinions of users who think alike or on its content. This problem is cast into the framework of regression and classification, where we introduce novel partial latent models to integrate additional user-specific and content-specific predictors, for higher predictive accuracy. In particular, we factorize a user-over-item preference matrix into a product of two matrices, each representing a user's preference and an item preference by users. Then we propose a likelihood method to seek a sparsest latent factorization, from a class of over

  4. Developing an intelligence analysis process through social network analysis

    Science.gov (United States)

    Waskiewicz, Todd; LaMonica, Peter

    2008-04-01

    Intelligence analysts are tasked with making sense of enormous amounts of data and gaining an awareness of a situation that can be acted upon. This process can be extremely difficult and time consuming. Trying to differentiate between important pieces of information and extraneous data only complicates the problem. When dealing with data containing entities and relationships, social network analysis (SNA) techniques can be employed to make this job easier. Applying network measures to social network graphs can identify the most significant nodes (entities) and edges (relationships) and help the analyst further focus on key areas of concern. Strange developed a model that identifies high value targets such as centers of gravity and critical vulnerabilities. SNA lends itself to the discovery of these high value targets and the Air Force Research Laboratory (AFRL) has investigated several network measures such as centrality, betweenness, and grouping to identify centers of gravity and critical vulnerabilities. Using these network measures, a process for the intelligence analyst has been developed to aid analysts in identifying points of tactical emphasis. Organizational Risk Analyzer (ORA) and Terrorist Modus Operandi Discovery System (TMODS) are the two applications used to compute the network measures and identify the points to be acted upon. Therefore, the result of leveraging social network analysis techniques and applications will provide the analyst and the intelligence community with more focused and concentrated analysis results allowing them to more easily exploit key attributes of a network, thus saving time, money, and manpower.
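
    The centrality and betweenness measures mentioned above are available in off-the-shelf libraries. The sketch below ranks the nodes of a small, made-up entity network (not ORA or TMODS output) so that an analyst could shortlist candidate centers of gravity.

```python
# Toy entity-relationship network: rank nodes by degree and betweenness centrality
# to shortlist candidate high-value targets. Edges are invented for illustration.
import networkx as nx

edges = [("leader", "financier"), ("leader", "planner"), ("planner", "courier"),
         ("courier", "cell_1"), ("courier", "cell_2"), ("financier", "cell_2"),
         ("planner", "recruiter"), ("recruiter", "cell_1")]
G = nx.Graph(edges)

degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)

# entities scoring highly on both measures are candidate centers of gravity
for node in sorted(G, key=lambda n: betweenness[n], reverse=True):
    print(f"{node:10s}  degree = {degree[node]:.2f}  betweenness = {betweenness[node]:.2f}")
```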

  5. Effect of anti-virus software on infectious nodes in computer network: A mathematical model

    Science.gov (United States)

    Mishra, Bimal Kumar; Pandey, Samir Kumar

    2012-07-01

    An e-epidemic model of malicious codes in the computer network through vertical transmission is formulated. We have observed that if the basic reproduction number is less than unity, the infected proportion of computer nodes disappears, the malicious codes die out, and the malicious-code-free equilibrium is globally asymptotically stable, which leads to their eradication. The effect of anti-virus software on the removal of the malicious codes from the computer network is critically analyzed. Analysis and simulation results show some managerial insights that are helpful for the practice of anti-virus in information-sharing networks.
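
    The following is a generic compartment-model sketch in the spirit of the record above, not the paper's exact e-epidemic equations: nodes move from susceptible to infected at rate beta·S·I/N and leave the infected class at rate gamma plus an extra anti-virus cleaning rate mu, so the basic reproduction number of this toy model is beta/(gamma + mu). All values are illustrative; scipy is assumed.

```python
# Toy SIR-style model: anti-virus software adds an extra removal rate mu to the infected class.
from scipy.integrate import solve_ivp

def rhs(t, y, beta, gamma, mu, N):
    S, I, R = y
    new_infections = beta * S * I / N
    cleaned = (gamma + mu) * I          # natural recovery plus anti-virus cleaning
    return [-new_infections, new_infections - cleaned, cleaned]

N, beta, gamma, mu = 1000.0, 0.4, 0.1, 0.15
R0 = beta / (gamma + mu)                # basic reproduction number of this toy model
sol = solve_ivp(rhs, (0.0, 120.0), [N - 1.0, 1.0, 0.0], args=(beta, gamma, mu, N))
print(f"R0 = {R0:.2f}; infected nodes remaining at t=120: {sol.y[1, -1]:.2f}")
```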

  6. 3rd International Conference on Advanced Computing, Networking and Informatics

    CERN Document Server

    Mohapatra, Durga; Chaki, Nabendu

    2016-01-01

    Advanced Computing, Networking and Informatics are three distinct disciplines of knowledge with little apparent sharing or overlap among them. However, their convergence is observed in many real-world applications, including cyber-security, internet banking, healthcare, sensor networks, cognitive radio, and pervasive computing, amidst many others. This two-volume proceedings explores the combined use of Advanced Computing and Informatics in the next generation wireless networks and security, signal and image processing, ontology and human-computer interfaces (HCI). The two volumes together include 132 scholarly articles, which have been accepted for presentation from over 550 submissions to the Third International Conference on Advanced Computing, Networking and Informatics, 2015, held in Bhubaneswar, India, during June 23–25, 2015.

  7. HeNCE: A Heterogeneous Network Computing Environment

    Directory of Open Access Journals (Sweden)

    Adam Beguelin

    1994-01-01

    Full Text Available Network computing seeks to utilize the aggregate resources of many networked computers to solve a single problem. In so doing it is often possible to obtain supercomputer performance from an inexpensive local area network. The drawback is that network computing is complicated and error-prone when done by hand, especially if the computers have different operating systems and data formats and are thus heterogeneous. The heterogeneous network computing environment (HeNCE) is an integrated graphical environment for creating and running parallel programs over a heterogeneous collection of computers. It is built on a lower level package called parallel virtual machine (PVM). The HeNCE philosophy of parallel programming is to have the programmer graphically specify the parallelism of a computation and to automate, as much as possible, the tasks of writing, compiling, executing, debugging, and tracing the network computation. Key to HeNCE is a graphical language based on directed graphs that describe the parallelism and data dependencies of an application. Nodes in the graphs represent conventional Fortran or C subroutines and the arcs represent data and control flow. This article describes the present state of HeNCE, its capabilities, limitations, and areas of future research.

  8. Risk, Privacy, and Security in Computer Networks

    OpenAIRE

    Årnes, Andre

    2006-01-01

    With an increasingly digitally connected society comes complexity, uncertainty, and risk. Network monitoring, incident management, and digital forensics is of increasing importance with the escalation of cybercrime and other network supported serious crimes. New laws and regulations governing electronic communications, cybercrime, and data retention are being proposed, continuously requiring new methods and tools. This thesis introduces a novel approach to real-time network risk assessmen...

  9. Topological analysis of telecommunications networks

    Directory of Open Access Journals (Sweden)

    Milojko V. Jevtović

    2011-01-01

    Full Text Available A topological analysis of the structure of telecommunications networks is a very interesting topic in network research, but also a key issue in their design and planning. Satisfying multiple criteria in terms of the locations of switching nodes as well as their connectivity, with respect to the requests for capacity, transmission speed, reliability, availability and cost, is the main research objective. There are three ways of presenting the topology of telecommunications networks: the table, matrix or graph method. The table method is suitable for a network with a relatively small number of nodes in relation to the number of links. The matrix method involves the formation of a connection matrix in which the columns represent source traffic nodes and the rows the switching systems that belong to the destination. The topology graph method means that the network nodes are connected via directional or unidirectional links. We can thus easily analyze the structural parameters of telecommunications networks. This paper presents the mathematical analysis of the star-, ring-, fully connected loop- and grid (matrix)-shaped topologies as well as the topology based on the shortest path tree. For each of these topologies, the expressions for determining the number of branches, the mean level of reliability, the mean path length and the average link length are given in tables. For the fully connected loop network with five nodes, the values of all topological parameters are calculated. Based on the topological parameters, the relationships that represent integral and distributed indicators of reliability are given in this work, as well as the values for the particular network. The main objectives of topology optimization of telecommunications networks are: achieving the minimum complexity, maximum capacity, the shortest path message transfer, the maximum speed of communication and maximum economy. The performance of telecommunications networks is
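
    A small sketch of the kind of topological bookkeeping tabulated in the paper: number of links and average shortest-path length for star, ring, fully connected and grid topologies, computed with networkx graph generators (assumed); the node counts are arbitrary.

```python
# Compare basic topological parameters of a few standard network topologies.
import networkx as nx

topologies = {
    "star (5 nodes)": nx.star_graph(4),          # hub plus 4 leaves
    "ring (5 nodes)": nx.cycle_graph(5),
    "fully connected (5 nodes)": nx.complete_graph(5),
    "grid (3x3)": nx.grid_2d_graph(3, 3),
}
for name, G in topologies.items():
    print(f"{name:28s} links={G.number_of_edges():2d} "
          f"avg path length={nx.average_shortest_path_length(G):.2f}")
```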

  10. Affective Computing and Sentiment Analysis

    CERN Document Server

    Ahmad, Khurshid

    2011-01-01

    This volume maps the watershed areas between two 'holy grails' of computer science: the identification and interpretation of affect -- including sentiment and mood. The expression of sentiment and mood involves the use of metaphors, especially in emotive situations. Affect computing is rooted in hermeneutics, philosophy, political science and sociology, and is now a key area of research in computer science. The 24/7 news sites and blogs facilitate the expression and shaping of opinion locally and globally. Sentiment analysis, based on text and data mining, is being used in looking at news

  11. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, cloud computing is at a disadvantage compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
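
    A minimal sketch of the discrete-event idea described above, not the authors' model: service requests with exponential inter-arrival and service times queue for a fixed pool of servers (the resource constraint), and the mean waiting time is reported. All names and parameters are illustrative.

```python
# Heap-based discrete event simulation of requests queueing for a fixed server pool.
import heapq, random

def simulate(n_servers=4, arrival_rate=3.0, service_rate=1.0, horizon=1000.0, seed=1):
    random.seed(seed)
    events = [(random.expovariate(arrival_rate), "arrival")]   # (time, event type)
    busy, queue, waits = 0, [], []
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "arrival":
            heapq.heappush(events, (t + random.expovariate(arrival_rate), "arrival"))
            if busy < n_servers:
                busy += 1
                waits.append(0.0)                              # served immediately
                heapq.heappush(events, (t + random.expovariate(service_rate), "departure"))
            else:
                queue.append(t)                                # wait for a free server
        else:                                                  # a departure frees a server
            if queue:
                arrived = queue.pop(0)
                waits.append(t - arrived)
                heapq.heappush(events, (t + random.expovariate(service_rate), "departure"))
            else:
                busy -= 1
    return sum(waits) / len(waits) if waits else 0.0

print(f"mean wait: {simulate():.3f} time units")
```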

  12. Computing properties of stable configurations of thermodynamic binding networks

    OpenAIRE

    Breik, Keenan; Prakash, Lakshmi; Thachuk, Chris; Heule, Marijn; Soloveichik, David

    2017-01-01

    Models of molecular computing generally embed computation in kinetics--the specific time evolution of a chemical system. However, if the desired output is not thermodynamically stable, basic physical chemistry dictates that thermodynamic forces will drive the system toward error throughout the computation. The Thermodynamic Binding Network (TBN) model was introduced to formally study how the thermodynamic equilibrium can be made consistent with the desired computation, and it idealizes bindin...

  13. Artificial Neural Network Metamodels of Stochastic Computer Simulations

    Science.gov (United States)

    1994-08-10


  14. Wireless Networks: New Meaning to Ubiquitous Computing.

    Science.gov (United States)

    Drew, Wilfred, Jr.

    2003-01-01

    Discusses the use of wireless technology in academic libraries. Topics include wireless networks; standards (IEEE 802.11); wired versus wireless; why libraries implement wireless technology; wireless local area networks (WLANs); WLAN security; examples of wireless use at Indiana State University and Morrisville College (New York); and useful…

  15. Methodologies and techniques for analysis of network flow data

    Energy Technology Data Exchange (ETDEWEB)

    Bobyshev, A.; Grigoriev, M.; /Fermilab

    2004-12-01

    Network flow data gathered at the border routers and core switches is used at Fermilab for statistical analysis of traffic patterns, passive network monitoring, and estimation of network performance characteristics. Flow data is also a critical tool in the investigation of computer security incidents. Development and enhancement of flow based tools is an on-going effort. This paper describes the most recent developments in flow analysis at Fermilab.

  16. CFD Optimization on Network-Based Parallel Computer System

    Science.gov (United States)

    Cheung, Samson H.; VanDalsem, William (Technical Monitor)

    1994-01-01

    Combining multiple engineering workstations into a network-based heterogeneous parallel computer allows the application of aerodynamic optimization with advanced computational fluid dynamics codes, which is computationally expensive on a mainframe supercomputer. This paper introduces a nonlinear quasi-Newton optimizer designed for this network-based heterogeneous parallel computer, built on software called Parallel Virtual Machine. It also describes the methodology behind coupling a Parabolized Navier-Stokes flow solver to the nonlinear optimizer. This parallel optimization package has been applied to reduce the wave drag of a body of revolution and a wing/body configuration, with resulting drag reductions of 5% to 6%.

  17. Phoebus: Network Middleware for Next-Generation Network Computing

    Energy Technology Data Exchange (ETDEWEB)

    Martin Swany

    2012-06-16

    The Phoebus project investigated algorithms, protocols, and middleware infrastructure to improve end-to-end performance in high speed, dynamic networks. The Phoebus system essentially serves as an adaptation point for networks with disparate capabilities or provisioning. This adaptation can take a variety of forms including acting as a provisioning agent across multiple signaling domains, providing transport protocol adaptation points, and mapping between distributed resource reservation paradigms and the optical network control plane. We have successfully developed the system and demonstrated benefits. The Phoebus system was deployed in Internet2 and in ESnet, as well as in GEANT2, RNP in Brazil and over international links to Korea and Japan. Phoebus is a system that implements a new protocol and associated forwarding infrastructure for improving throughput in high-speed dynamic networks. It was developed to serve the needs of large DOE applications on high-performance networks. The idea underlying the Phoebus model is to embed Phoebus Gateways (PGs) in the network as on-ramps to dynamic circuit networks. The gateways act as protocol translators that allow legacy applications to use dedicated paths with high performance.

  18. Analysis of neural networks

    CERN Document Server

    Heiden, Uwe

    1980-01-01

    The purpose of this work is a unified and general treatment of activity in neural networks from a mathematical point of view. Possible applications of the theory presented are indicated throughout the text. However, they are not explored in detail for two reasons: first, the universal character of neural activity in nearly all animals requires some type of a general approach; secondly, the mathematical perspicuity would suffer if too many experimental details and empirical peculiarities were interspersed among the mathematical investigation. A guide to many applications is supplied by the references concerning a variety of specific issues. Of course the theory does not aim at covering all individual problems. Moreover there are other approaches to neural network theory (see e.g. Poggio-Torre, 1978) based on the different levels at which the nervous system may be viewed. The theory is a deterministic one reflecting the average behavior of neurons or neuron pools. In this respect the essay is writt...

  19. Computationally Efficient Neural Network Intrusion Security Awareness

    Energy Technology Data Exchange (ETDEWEB)

    Todd Vollmer; Milos Manic

    2009-08-01

    An enhanced version of an algorithm to provide anomaly based intrusion detection alerts for cyber security state awareness is detailed. A unique aspect is the training of an error back-propagation neural network with intrusion detection rule features to provide a recognition basis. Network packet details are subsequently provided to the trained network to produce a classification. This leverages rule knowledge sets to produce classifications for anomaly based systems. Several test cases executed on ICMP protocol revealed a 60% identification rate of true positives. This rate matched the previous work, but 70% less memory was used and the run time was reduced to less than 1 second from 37 seconds.

  20. Statistical network analysis for analyzing policy networks

    DEFF Research Database (Denmark)

    Robins, Garry; Lewis, Jenny; Wang, Peng

    2012-01-01

    To analyze social network data using standard statistical approaches is to risk incorrect inference. The dependencies among observations implied in a network conceptualization undermine standard assumptions of the usual general linear models. One of the most quickly expanding areas of social...... and policy network methodology is the development of statistical modeling approaches that can accommodate such dependent data. In this article, we review three network statistical methods commonly used in the current literature: quadratic assignment procedures, exponential random graph models (ERGMs...

  1. Evolving ATLAS Computing For Today’s Networks

    CERN Document Server

    Campana, S; The ATLAS collaboration; Jezequel, S; Negri, G; Serfon, C; Ueda, I

    2012-01-01

    The ATLAS computing infrastructure was designed many years ago based on the assumption of rather limited network connectivity between computing centres. ATLAS sites have been organized in a hierarchical model, where only a static subset of all possible network links can be exploited and a static subset of well connected sites (CERN and the T1s) can cover important functional roles such as hosting master copies of the data. The pragmatic adoption of such simplified approach, in respect of a more relaxed scenario interconnecting all sites, was very beneficial during the commissioning of the ATLAS distributed computing system and essential in reducing the operational cost during the first two years of LHC data taking. In the mean time, networks evolved far beyond this initial scenario: while a few countries are still poorly connected with the rest of the WLCG infrastructure, most of the ATLAS computing centres are now efficiently interlinked. Our operational experience in running the computing infrastructure in ...

  2. Networks and Project Work: Alternative Pedagogies for Writing with Computers.

    Science.gov (United States)

    Susser, Bernard

    1993-01-01

    Describes three main uses of computers for writing as a social activity: networking, telecommunications, and project work. Examines advantages and disadvantages of teaching writing on a network. Argues that reports in the literature and the example of an English as a foreign language writing class show that project work shares most of the…

  3. Computer Networking Strategies for Building Collaboration among Science Educators.

    Science.gov (United States)

    Aust, Ronald

    The development and dissemination of science materials can be associated with technical delivery systems such as the Unified Network for Informatics in Teacher Education (UNITE). The UNITE project was designed to investigate ways for using computer networking to improve communications and collaboration among university schools of education and…

  4. The metabolic network of Clostridium acetobutylicum: Comparison of the approximate Bayesian computation via sequential Monte Carlo (ABC-SMC) and profile likelihood estimation (PLE) methods for determinability analysis.

    Science.gov (United States)

    Thorn, Graeme J; King, John R

    2016-01-01

    The Gram-positive bacterium Clostridium acetobutylicum is an anaerobic endospore-forming species which produces acetone, butanol and ethanol via the acetone-butanol (AB) fermentation process, leading to biofuels including butanol. In previous work we looked to estimate the parameters in an ordinary differential equation model of the glucose metabolism network using data from pH-controlled continuous culture experiments. Here we combine two approaches, namely the approximate Bayesian computation via an existing sequential Monte Carlo (ABC-SMC) method (to compute credible intervals for the parameters), and the profile likelihood estimation (PLE) (to improve the calculation of confidence intervals for the same parameters), the parameters in both cases being derived from experimental data from forward shift experiments. We also apply the ABC-SMC method to investigate which of the models introduced previously (one non-sporulation and four sporulation models) have the greatest strength of evidence. We find that the joint approximate posterior distribution of the parameters determines the same parameters as previously, including all of the basal and increased enzyme production rates and enzyme reaction activity parameters, as well as the Michaelis-Menten kinetic parameters for glucose ingestion, while other parameters are not as well-determined, particularly those connected with the internal metabolites acetyl-CoA, acetoacetyl-CoA and butyryl-CoA. We also find that the approximate posterior is strongly non-Gaussian, indicating that our previous assumption of elliptical contours of the distribution is not valid, which has the effect of reducing the numbers of pairs of parameters that are (linearly) correlated with each other. Calculations of confidence intervals using the PLE method back this up. Finally, we find that all five of our models are equally likely, given the data available at present. Copyright © 2015 Elsevier Inc. All rights reserved.
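
    As a toy illustration of the ABC idea in its simplest (rejection) form, rather than the authors' ODE model or their SMC scheduler: draw parameters from a prior, simulate, and keep the draws whose simulated output lies close to pseudo-observed data. The model and numbers are invented for illustration; numpy is assumed.

```python
# ABC rejection sampler for a one-parameter decay model (toy stand-in for an ODE fit).
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 5.0, 20)
true_k = 0.8
observed = np.exp(-true_k * t) + rng.normal(0.0, 0.02, t.size)   # pseudo-data

def simulate(k):
    return np.exp(-k * t)                     # simple first-order decay model

def distance(sim, obs):
    return np.sqrt(np.mean((sim - obs) ** 2))

prior_draws = rng.uniform(0.0, 2.0, 20000)    # uniform prior on the rate k
accepted = [k for k in prior_draws if distance(simulate(k), observed) < 0.05]
lo, hi = np.percentile(accepted, [2.5, 97.5])
print(f"accepted {len(accepted)} draws; approx 95% credible interval for k: "
      f"({lo:.2f}, {hi:.2f})")
```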

  5. Phylodynamic analysis of a viral infection network

    Directory of Open Access Journals (Sweden)

    Teiichiro eShiino

    2012-07-01

    Full Text Available Viral infections by sexual and droplet transmission routes typically spread through a complex host-to-host contact network. Clarifying the transmission network and epidemiological parameters affecting the variations and dynamics of a specific pathogen is a major issue in the control of infectious diseases. However, conventional methods such as interviews and/or classical phylogenetic analysis of viral gene sequences have inherent limitations and often fail to detect infectious clusters and transmission connections. Recent improvements in computational environments now permit the analysis of large datasets. In addition, novel analytical methods have been developed that serve to infer the evolutionary dynamics of virus genetic diversity using sample date information and sequence data. This type of framework, termed phylodynamics, helps connect some of the missing links on viral transmission networks, which are often hard to detect by conventional methods of epidemiology. With a sufficient number of sequences available, one can use this new inference method to estimate theoretical epidemiological parameters such as temporal distributions of the primary infection, fluctuation of the pathogen population size, basic reproductive number, and the mean time span of disease infectiousness. Transmission networks estimated by this framework often have the properties of a scale-free network, which are characteristic of infectious and social communication processes. Network analysis based on phylodynamics has led to various suggestions concerning the infection dynamics associated with a given community and/or risk behavior. In this review, I will summarize the current methods available for identifying the transmission network using phylogeny, and present an argument on the possibilities of applying the scale-free properties to these existing frameworks.

  6. Fundamentals of computational intelligence neural networks, fuzzy systems, and evolutionary computation

    CERN Document Server

    Keller, James M; Fogel, David B

    2016-01-01

    This book covers the three fundamental topics that form the basis of computational intelligence: neural networks, fuzzy systems, and evolutionary computation. The text focuses on inspiration, design, theory, and practical aspects of implementing procedures to solve real-world problems. While other books in the three fields that comprise computational intelligence are written by specialists in one discipline, this book is co-written by a current and former Editor-in-Chief of IEEE Transactions on Neural Networks and Learning Systems, a former Editor-in-Chief of IEEE Transactions on Fuzzy Systems, and the founding Editor-in-Chief of IEEE Transactions on Evolutionary Computation. The coverage across the three topics is both uniform and consistent in style and notation. Discusses single-layer and multilayer neural networks, radial-basis function networks, and recurrent neural networks. Covers fuzzy set theory, fuzzy relations, fuzzy logic inference, fuzzy clustering and classification, fuzzy measures and fuzz...

  7. Network analysis and synthesis a modern systems theory approach

    CERN Document Server

    Anderson, Brian D O

    2006-01-01

    Geared toward upper-level undergraduates and graduate students, this book offers a comprehensive look at linear network analysis and synthesis. It explores state-space synthesis as well as analysis, employing modern systems theory to unite the classical concepts of network theory. The authors stress passive networks but include material on active networks. They avoid topology in dealing with analysis problems and discuss computational techniques. The concepts of controllability, observability, and degree are emphasized in reviewing the state-variable description of linear systems. Explorations

  8. Neuromorphic computing applications for network intrusion detection systems

    Science.gov (United States)

    Garcia, Raymond C.; Pino, Robinson E.

    2014-05-01

    What is presented here is a sequence of evolving concepts for network intrusion detection. These concepts start with neuromorphic structures for XOR-based signature matching and conclude with computationally based network intrusion detection system with an autonomous structuring algorithm. There is evidence that neuromorphic computation for network intrusion detection is fractal in nature under certain conditions. Specifically, the neural structure can take fractal form when simple neural structuring is autonomous. A neural structure is fractal by definition when its fractal dimension exceeds the synaptic matrix dimension. The authors introduce the use of fractal dimension of the neuromorphic structure as a factor in the autonomous restructuring feedback loop.

  9. Trajectory Based Optimal Segment Computation in Road Network Databases

    DEFF Research Database (Denmark)

    Li, Xiaohui; Ceikute, Vaida; Jensen, Christian S.

    2013-01-01

    that are shown empirically to be scalable. Given a road network, a set of existing facilities, and a collection of customer route traversals, an optimal segment query returns the optimal road network segment(s) for a new facility. We propose a practical framework for computing this query, where each route...... that adopt different approaches to computing the query. Algorithm AUG uses graph augmentation, and ITE uses iterative road-network partitioning. Empirical studies with real data sets demonstrate that the algorithms are capable of offering high performance in realistic settings....

  10. An Analysis of the Structure and Evolution of Networks

    Science.gov (United States)

    Hua, Guangying

    2011-01-01

    As network research receives more and more attention from both academic researchers and practitioners, network analysis has become a fast growing field attracting many researchers from diverse fields such as physics, computer science, and sociology. This dissertation provides a review of theory and research on different real data sets from the…

  11. New computing systems, future computing environment, and their implications on structural analysis and design

    Science.gov (United States)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  12. NEAT : an efficient network enrichment analysis test

    NARCIS (Netherlands)

    Signorelli, Mirko; Vinciotti, Veronica; Wit, Ernst C

    2016-01-01

    BACKGROUND: Network enrichment analysis is a powerful method, which allows gene enrichment analysis to be integrated with the information on relationships between genes that is provided by gene networks. Existing tests for network enrichment analysis deal only with undirected networks; they can be

  13. Optical interconnection networks for high-performance computing systems.

    Science.gov (United States)

    Biberman, Aleksandr; Bergman, Keren

    2012-04-01

    Enabled by silicon photonic technology, optical interconnection networks have the potential to be a key disruptive technology in computing and communication industries. The enduring pursuit of performance gains in computing, combined with stringent power constraints, has fostered the ever-growing computational parallelism associated with chip multiprocessors, memory systems, high-performance computing systems and data centers. Sustaining these parallelism growths introduces unique challenges for on- and off-chip communications, shifting the focus toward novel and fundamentally different communication approaches. Chip-scale photonic interconnection networks, enabled by high-performance silicon photonic devices, offer unprecedented bandwidth scalability with reduced power consumption. We demonstrate that the silicon photonic platforms have already produced all the high-performance photonic devices required to realize these types of networks. Through extensive empirical characterization in much of our work, we demonstrate such feasibility of waveguides, modulators, switches and photodetectors. We also demonstrate systems that simultaneously combine many functionalities to achieve more complex building blocks. We propose novel silicon photonic devices, subsystems, network topologies and architectures to enable unprecedented performance of these photonic interconnection networks. Furthermore, the advantages of photonic interconnection networks extend far beyond the chip, offering advanced communication environments for memory systems, high-performance computing systems, and data centers.

  14. Active system area networks for data intensive computations. Final report

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-04-01

    The goal of the Active System Area Networks (ASAN) project is to develop hardware and software technologies for the implementation of active system area networks (ASANs). The use of the term "active" refers to the ability of the network interfaces to perform application-specific as well as system-level computations in addition to their traditional role of data transfer. This project adopts the view that the network infrastructure should be an active computational entity capable of supporting certain classes of computations that would otherwise be performed on the host CPUs. The result is a unique network-wide programming model where computations are dynamically placed within the host CPUs or the NIs depending upon the quality of service demands and network/CPU resource availability. The project seeks to demonstrate that such an approach is a better match for data-intensive network-based applications and that the advent of low-cost powerful embedded processors and configurable hardware makes such an approach economically viable and desirable.

  15. Console Networks for Major Computer Systems

    Energy Technology Data Exchange (ETDEWEB)

    Ophir, D; Shepherd, B; Spinrad, R J; Stonehill, D

    1966-07-22

    A concept for interactive time-sharing of a major computer system is developed in which satellite computers mediate between the central computing complex and the various individual user terminals. These techniques allow the development of a satellite system substantially independent of the details of the central computer and its operating system. Although the user terminals' roles may be rich and varied, the demands on the central facility are merely those of a tape drive or similar batched information transfer device. The particular system under development provides service for eleven visual display and communication consoles, sixteen general purpose, low rate data sources, and up to thirty-one typewriters. Each visual display provides a flicker-free image of up to 4000 alphanumeric characters or tens of thousands of points by employing a swept raster picture generating technique directly compatible with that of commercial television. Users communicate either by typewriter or a manually positioned light pointer.

  16. Electromagnetic field computation by network methods

    CERN Document Server

    Felsen, Leopold B; Russer, Peter

    2009-01-01

    This monograph proposes a systematic and rigorous treatment of electromagnetic field representations in complex structures. The book presents new strong models by combining important computational methods. This is the last book of the late Leopold Felsen.

  17. Realistic computer network simulation for network intrusion detection dataset generation

    Science.gov (United States)

    Payer, Garrett

    2015-05-01

    The KDD-99 Cup dataset is dead. While it can continue to be used as a toy example, the age of this dataset makes it all but useless for intrusion detection research and data mining. Many of the attacks used within the dataset are obsolete and do not reflect the features important for intrusion detection in today's networks. Creating a new dataset encompassing a large cross section of the attacks found on the Internet today could be useful, but would eventually fall to the same problem as the KDD-99 Cup; its usefulness would diminish after a period of time. To continue research into intrusion detection, the generation of new datasets needs to be as dynamic and as quick as the attacker. Simply examining existing network traffic and using domain experts such as intrusion analysts to label traffic is inefficient, expensive, and not scalable. The only viable methodology is simulation using technologies including virtualization, attack-toolsets such as Metasploit and Armitage, and sophisticated emulation of threat and user behavior. Simulating actual user behavior and network intrusion events dynamically not only allows researchers to vary scenarios quickly, but enables online testing of intrusion detection mechanisms by interacting with data as it is generated. As new threat behaviors are identified, they can be added to the simulation to make quicker determinations as to the effectiveness of existing and ongoing network intrusion technology, methodology and models.

  18. 1st International Conference on Signal, Networks, Computing, and Systems

    CERN Document Server

    Mohapatra, Durga; Nagar, Atulya; Sahoo, Manmath

    2016-01-01

    The book is a collection of high-quality peer-reviewed research papers presented at the first International Conference on Signal, Networks, Computing, and Systems (ICSNCS 2016) held at Jawaharlal Nehru University, New Delhi, India, during February 25–27, 2016. The book is organized into two volumes and primarily focuses on theory and applications in the broad areas of communication technology, computer science and information security. The book aims to bring together the latest scientific research works of academic scientists, professors, research scholars and students in the areas of signal, networks, computing and systems, detailing the practical challenges encountered and the solutions adopted.

  19. Dynamical Systems Theory for Transparent Symbolic Computation in Neuronal Networks

    OpenAIRE

    Carmantini, Giovanni Sirio

    2017-01-01

    In this thesis, we explore the interface between symbolic and dynamical system computation, with particular regard to dynamical system models of neuronal networks. In doing so, we adhere to a definition of computation as the physical realization of a formal system, where we say that a dynamical system performs a computation if a correspondence can be found between its dynamics on a vectorial space and the formal system’s dynamics on a symbolic space. Guided by this definition, we characterize...

  20. CX: A Scalable, Robust Network for Parallel Computing

    Directory of Open Access Journals (Sweden)

    Peter Cappello

    2002-01-01

    Full Text Available CX, a network-based computational exchange, is presented. The system's design integrates variations of ideas from other researchers, such as work stealing, non-blocking tasks, eager scheduling, and space-based coordination. The object-oriented API is simple, compact, and cleanly separates application logic from the logic that supports interprocess communication and fault tolerance. Computations, of course, run to completion in the presence of computational hosts that join and leave the ongoing computation. Such hosts, or producers, use task caching and prefetching to overlap computation with interprocessor communication. To break a potential task server bottleneck, a network of task servers is presented. Even though task servers are envisioned as reliable, the self-organizing, scalable network of n servers, described as a sibling-connected height-balanced fat tree, tolerates a sequence of n-1 server failures. Tasks are distributed throughout the server network via a simple "diffusion" process. CX is intended as a test bed for research on automated silent auctions, reputation services, authentication services, and bonding services. CX also provides a test bed for algorithm research into network-based parallel computation.

  1. Computer vision in microstructural analysis

    Science.gov (United States)

    Srinivasan, Malur N.; Massarweh, W.; Hough, C. L.

    1992-01-01

    The following is a laboratory experiment designed to be performed by advanced high school and beginning college students. It is hoped that this experiment will create an interest in and further understanding of materials science. The objective of this experiment is to demonstrate that the microstructure of engineered materials is affected by the processing conditions in manufacture, and that it is possible to characterize the microstructure using image analysis with a computer. The principle of computer vision will first be introduced, followed by the description of the system developed at Texas A&M University. This in turn will be followed by the description of the experiment to obtain differences in microstructure and the characterization of the microstructure using computer vision.

  2. Integrating Network Management for Cloud Computing Services

    Science.gov (United States)

    2015-06-01

    DeviceConfigIsControllable is calculated based on whether the device is powered up and whether the device is reachable via SSH/Telnet from the management network...lines of C# and C++ code, plus a number of internal libraries. At its core, it is a highly available RESTful web service with persistent storage. Below

  3. Method and tool for network vulnerability analysis

    Science.gov (United States)

    Swiler, Laura Painton [Albuquerque, NM; Phillips, Cynthia A [Albuquerque, NM

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
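
    A sketch of the attack-graph idea described above: nodes are attack states, edge weights stand in for attacker effort, and a low-effort (high-risk) path is found with a weighted shortest-path search. The states and weights are hypothetical, and the code uses networkx rather than the patented tool.

```python
# Toy attack graph: find the lowest-effort path from an external foothold to a goal state.
import networkx as nx

G = nx.DiGraph()
G.add_weighted_edges_from([
    ("internet", "dmz_web_shell", 2.0),       # exploit a public web server
    ("dmz_web_shell", "internal_host", 3.0),  # pivot through the firewall
    ("internet", "phished_user", 1.0),        # phishing foothold
    ("phished_user", "internal_host", 2.5),
    ("internal_host", "domain_admin", 4.0),
], weight="effort")

path = nx.shortest_path(G, "internet", "domain_admin", weight="effort")
effort = nx.shortest_path_length(G, "internet", "domain_admin", weight="effort")
print(" -> ".join(path), f"(total effort {effort})")
```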

  4. Dynamic Security Assessment Of Computer Networks In Siem-Systems

    Directory of Open Access Journals (Sweden)

    Elena Vladimirovna Doynikova

    2015-10-01

    Full Text Available The paper suggests an approach to the security assessment of computer networks. The approach is based on attack graphs and is intended for Security Information and Event Management (SIEM) systems. A key feature of the approach is the application of a multilevel security metrics taxonomy. The taxonomy allows definition of the system profile according to the input data used for the metrics calculation and the techniques of security metrics calculation. This allows specification of the security assessment in near real time, identification of previous and future attacker steps, and identification of attackers' goals and characteristics. A security assessment system prototype is implemented for the suggested approach. Analysis of its operation is conducted for several attack scenarios.

  5. Analysis of Layered Social Networks

    Science.gov (United States)

    2006-09-01


  6. Statistical analysis of network data with R

    CERN Document Server

    Kolaczyk, Eric D

    2014-01-01

    Networks have permeated everyday life through everyday realities like the Internet, social networks, and viral marketing. As such, network analysis is an important growth area in the quantitative sciences, with roots in social network analysis going back to the 1930s and graph theory going back centuries. Measurement and analysis are integral components of network research. As a result, statistical methods play a critical role in network analysis. This book is the first of its kind in network research. It can be used as a stand-alone resource in which multiple R packages are used to illustrate how to conduct a wide range of network analyses, from basic manipulation and visualization, to summary and characterization, to modeling of network data. The central package is igraph, which provides extensive capabilities for studying network graphs in R. This text builds on Eric D. Kolaczyk’s book Statistical Analysis of Network Data (Springer, 2009).

  7. Aggregation algorithm towards large-scale Boolean network analysis

    OpenAIRE

    Zhao, Y.; Kim, J.; Filippone, M.

    2013-01-01

    The analysis of large-scale Boolean network dynamics is of great importance in understanding complex phenomena where systems are characterized by a large number of components. The computational cost to reveal the number of attractors and the period of each attractor increases exponentially as the number of nodes in the networks increases. This paper presents an efficient algorithm to find attractors for medium to large-scale networks. This is achieved by analyzing subnetworks within the netwo...
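
    For contrast, the exhaustive baseline that the aggregation algorithm is designed to avoid can be written in a few lines for a tiny network: enumerate all 2^n states under synchronous updating and follow each trajectory until it revisits a state, collecting the attractors. The update rules below are made up.

```python
# Brute-force attractor enumeration for a tiny synchronous Boolean network.
from itertools import product

# Synchronous update rules for a 3-node toy network (invented):
#   x0' = x1 AND x2,  x1' = NOT x0,  x2' = x0 OR x1
rules = [lambda s: s[1] and s[2],
         lambda s: not s[0],
         lambda s: s[0] or s[1]]

def step(state):
    return tuple(int(rule(state)) for rule in rules)

attractors = set()
for start in product((0, 1), repeat=len(rules)):
    seen, state = [], start
    while state not in seen:          # follow the trajectory until a state repeats
        seen.append(state)
        state = step(state)
    cycle = seen[seen.index(state):]  # the periodic part is the attractor
    shift = cycle.index(min(cycle))   # canonical rotation so each cycle is stored once
    attractors.add(tuple(cycle[shift:] + cycle[:shift]))

for a in sorted(attractors):
    print(f"attractor of period {len(a)}: {a}")
```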

  8. PAN network analysis program: its development and use

    Energy Technology Data Exchange (ETDEWEB)

    Goldwater, M.H.; Rogers, K.; Turnbull, D.K.

    1976-01-01

    British Gas Corp.'s London Research Station describes a comprehensive, efficient, and flexible computer simulation of pressure and flows in gas networks known as PAN - Program to Analyze Networks. The program is used extensively throughout the BGC system for both design and control of gas transmission grids. Its powerful method of analysis solves network problems quickly and handles complex configurations of compressors and regulators easily.

  9. Low Computational Complexity Network Coding For Mobile Networks

    DEFF Research Database (Denmark)

    Heide, Janus

    2012-01-01

    -flow coding technique. One of the key challenges of this technique is its inherent computational complexity which can lead to high computational load and energy consumption in particular on the mobile platforms that are the target platform in this work. To increase the coding throughput several...... library and will be available for researchers and students in the future. Chapter 1 introduces motivating examples and the state of art when this work commenced. In Chapter 2 selected publications are presented and how their content is related. Chapter 3 presents the main outcome of the work and briefly...

  10. FY 1999 Blue Book: Computing, Information, and Communications: Networked Computing for the 21st Century

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — U.S.research and development R and D in computing, communications, and information technologies has enabled unprecedented scientific and engineering advances,...

  11. THE IMPROVEMENT OF COMPUTER NETWORK PERFORMANCE WITH BANDWIDTH MANAGEMENT IN KEMURNIAN II SENIOR HIGH SCHOOL

    Directory of Open Access Journals (Sweden)

    Bayu Kanigoro

    2012-05-01

    Full Text Available This research describes the improvement of computer network performance with bandwidth management at Kemurnian II Senior High School. The main issue addressed is the absence of bandwidth division among computers: when a user downloads data, the available bandwidth is absorbed by that user, leaving other users without bandwidth. In addition, IP address allocation had already been carried out per room (computer, teacher and administration rooms) to support the learning process at the school, so a wireless network was needed. The method consists of on-site observation and interviews with related parties at Kemurnian II Senior High School, analysis of the running network, and the design of a new network topology, including the wireless network, its configuration, and bandwidth separation and limiting on a MikroTik router. The result is that network traffic at Kemurnian II Senior High School can be shared evenly among users; IX and IIX traffic are separated, which improves the speed of network access at school, together with the implementation of the wireless network. Keywords: Bandwidth Management; Wireless Network

  12. WaveJava: Wavelet-based network computing

    Science.gov (United States)

    Ma, Kun; Jiao, Licheng; Shi, Zhuoer

    1997-04-01

    Wavelets are a powerful theory, but their successful application still needs suitable programming tools. Java is a simple, object-oriented, distributed, interpreted, robust, secure, architecture-neutral, portable, high-performance, multi-threaded, dynamic language. This paper addresses the design and development of a cross-platform software environment for experimenting with and applying wavelet theory. WaveJava, a wavelet class library designed with object-oriented programming, is developed to take advantage of wavelet features such as multi-resolution analysis and parallel processing in network computing. A new application architecture is designed for the net-wide distributed client-server environment. The data are transmitted as multi-resolution packets. At distributed sites around the net, these data packets undergo matching or recognition processing in parallel. The results are fed back to determine the next operation, so more robust results can be arrived at quickly. WaveJava is easy to use and to extend for special applications. The paper gives a solution for a distributed fingerprint information processing system. It also fits some other net-based multimedia information processing, such as network libraries, remote teaching, and filmless picture archiving and communications.
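
    The multi-resolution idea in miniature, in plain Python rather than Java/WaveJava: one level of the Haar wavelet transform splits a signal into a coarse approximation (pairwise averages) and details (pairwise differences), and the step is exactly invertible. The signal values are arbitrary.

```python
# One level of the orthonormal Haar wavelet transform and its inverse.
import math

def haar_step(signal):
    assert len(signal) % 2 == 0
    approx = [(a + b) / math.sqrt(2) for a, b in zip(signal[0::2], signal[1::2])]
    detail = [(a - b) / math.sqrt(2) for a, b in zip(signal[0::2], signal[1::2])]
    return approx, detail

def haar_inverse(approx, detail):
    signal = []
    for a, d in zip(approx, detail):
        signal += [(a + d) / math.sqrt(2), (a - d) / math.sqrt(2)]
    return signal

x = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
approx, detail = haar_step(x)
print("approximation:", [round(v, 3) for v in approx])
print("detail:       ", [round(v, 3) for v in detail])
print("reconstructed:", [round(v, 3) for v in haar_inverse(approx, detail)])
```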

  13. Dynamic Defensive Posture for Computer Network Defence

    Science.gov (United States)

    2006-12-01

    algorithms for ranking the severity of attacks on the network and mechanisms for assigning a value to the elements...power outages and social engineering attacks. Because it has such a large knowledge base on which to draw, it can reason very thoroughly about network...service attacks, eavesdropping and sniffing attacks on data in transit, or data tampering; more complex still would be models of social engineering

  14. Characterization and Planning for Computer Network Operations

    Science.gov (United States)

    2010-07-01

    Cell phones, personal computers, laptops, and personal digital assistants represent a small number of the technology-based devices used around the...

  15. Centrality measures in temporal networks with time series analysis

    Science.gov (United States)

    Huang, Qiangjuan; Zhao, Chengli; Zhang, Xue; Wang, Xiaojie; Yi, Dongyun

    2017-05-01

    The study of identifying important nodes in networks has wide application in different fields. However, current research is mostly based on static or aggregated networks. Recently, the increasing attention to networks with time-varying structure has promoted the study of node centrality in temporal networks. In this paper, we define a supra-evolution matrix to depict the temporal network structure. Using time series analysis, the relationships between different time layers can be learned automatically. Based on the special form of the supra-evolution matrix, the eigenvector centrality calculation is turned into the calculation of eigenvectors of several low-dimensional matrices through iteration, which effectively reduces the computational complexity. Experiments are carried out on two real-world temporal networks, the Enron email communication network and the DBLP co-authorship network, the results of which show that our method is more efficient at discovering the important nodes than the common aggregating method.
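
    Not the paper's supra-evolution construction itself, but the core numerical step it reduces to: the leading eigenvector of an adjacency-type matrix, computed here by plain power iteration, gives eigenvector centrality scores. The toy adjacency matrix is illustrative; numpy is assumed.

```python
# Eigenvector centrality via power iteration on a small adjacency matrix.
import numpy as np

def eigenvector_centrality(A, iters=200, tol=1e-10):
    x = np.ones(A.shape[0])
    for _ in range(iters):
        x_new = A @ x
        x_new /= np.linalg.norm(x_new)
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x_new

# Small undirected toy graph (symmetric adjacency matrix, illustrative only).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
print(np.round(eigenvector_centrality(A), 3))   # nodes 1 and 2 score highest
```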

  16. Wirelessly powered sensor networks and computational RFID

    CERN Document Server

    2013-01-01

    The Wireless Identification and Sensing Platform (WISP) is the first of a new class of RF-powered sensing and computing systems.  Rather than being powered by batteries, these sensor systems are powered by radio waves that are either deliberately broadcast or ambient.  Enabled by ongoing exponential improvements in the energy efficiency of microelectronics, RF-powered sensing and computing is rapidly moving along a trajectory from impossible (in the recent past), to feasible (today), toward practical and commonplace (in the near future). This book is a collection of key papers on RF-powered sensing and computing systems including the WISP.  Several of the papers grew out of the WISP Challenge, a program in which Intel Corporation donated WISPs to academic applicants who proposed compelling WISP-based projects.  The book also includes papers presented at the first WISP Summit, a workshop held in Berkeley, CA in association with the ACM Sensys conference, as well as other relevant papers. The book provides ...

  17. Six Networks on a Universal Neuromorphic Computing Substrate

    Science.gov (United States)

    Pfeil, Thomas; Grübl, Andreas; Jeltsch, Sebastian; Müller, Eric; Müller, Paul; Petrovici, Mihai A.; Schmuker, Michael; Brüderle, Daniel; Schemmel, Johannes; Meier, Karlheinz

    2013-01-01

    In this study, we present a highly configurable neuromorphic computing substrate and use it for emulating several types of neural networks. At the heart of this system lies a mixed-signal chip, with analog implementations of neurons and synapses and digital transmission of action potentials. Major advantages of this emulation device, which has been explicitly designed as a universal neural network emulator, are its inherent parallelism and high acceleration factor compared to conventional computers. Its configurability allows the realization of almost arbitrary network topologies and the use of widely varied neuronal and synaptic parameters. Fixed-pattern noise inherent to analog circuitry is reduced by calibration routines. An integrated development environment allows neuroscientists to operate the device without any prior knowledge of neuromorphic circuit design. As a showcase for the capabilities of the system, we describe the successful emulation of six different neural networks which cover a broad spectrum of both structure and functionality. PMID:23423583

  18. Computing Path Tables for Quickest Multipaths In Computer Networks

    Energy Technology Data Exchange (ETDEWEB)

    Grimmell, W.C.

    2004-12-21

    We consider the transmission of a message from a source node to a terminal node in a network with n nodes and m links where the message is divided into parts and each part is transmitted over a different path in a set of paths from the source node to the terminal node. Here each link is characterized by a bandwidth and delay. The set of paths together with their transmission rates used for the message is referred to as a multipath. We present two algorithms that produce a minimum-end-to-end message delay multipath path table that, for every message length, specifies a multipath that will achieve the minimum end-to-end delay. The algorithms also generate a function that maps the minimum end-to-end message delay to the message length. The time complexities of the algorithms are O(n²((n²/log n) + m)·min(D_max, C_max)) and O(nm(C_max + n·min(D_max, C_max))) when the link delays and bandwidths are non-negative integers. Here D_max and C_max are respectively the maximum link delay and maximum link bandwidth, and both are greater than zero.
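
    The single-path special case behind these path tables is easy to sketch: sending a message of length sigma over a path with total delay d and bottleneck bandwidth b takes d + sigma/b, and a classical way to find the quickest path is to try each distinct bandwidth value as a bottleneck threshold and run a shortest-delay search over the links that meet it. The multipath table construction in the record generalizes this to sets of paths. The graph below is hypothetical and networkx is assumed.

```python
# Quickest single path for a given message length: enumerate bandwidth thresholds,
# restrict the graph to links meeting the threshold, and minimize delay + sigma/threshold.
import networkx as nx

links = [  # (u, v, delay, bandwidth) -- hypothetical
    ("s", "a", 1.0, 10.0), ("a", "t", 1.0, 10.0),
    ("s", "b", 4.0, 100.0), ("b", "t", 4.0, 100.0),
    ("s", "t", 12.0, 1000.0),
]

def quickest_path(links, src, dst, sigma):
    best_value, best_path = float("inf"), None
    for threshold in sorted({bw for *_, bw in links}):
        G = nx.Graph()
        G.add_weighted_edges_from((u, v, d) for u, v, d, bw in links if bw >= threshold)
        if not (G.has_node(src) and G.has_node(dst)) or not nx.has_path(G, src, dst):
            continue
        delay = nx.shortest_path_length(G, src, dst, weight="weight")
        value = delay + sigma / threshold
        if value < best_value:
            best_value = value
            best_path = nx.shortest_path(G, src, dst, weight="weight")
    return best_value, best_path

for sigma in (1.0, 100.0, 10000.0):
    total, path = quickest_path(links, "s", "t", sigma)
    print(f"message length {sigma:>7}: path {path}, end-to-end delay {total:.2f}")
```

    Running the toy example shows the message-length dependence that motivates a path table: short messages favor the low-delay path, while long messages favor high-bandwidth links.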

  19. Statistical Network Analysis for Functional MRI: Mean Networks and Group Comparisons.

    Directory of Open Access Journals (Sweden)

    Cedric E Ginestet

    2014-05-01

    Full Text Available Comparing networks in neuroscience is hard, because the topological properties of a given network are necessarily dependent on the number of edges of that network. This problem arises in the analysis of both weighted and unweighted networks. The term density is often used in this context, in order to refer to the mean edge weight of a weighted network, or to the number of edges in an unweighted one. Comparing families of networks is therefore statistically difficult because differences in topology are necessarily associated with differences in density. In this review paper, we consider this problem from two different perspectives, which include (i) the construction of summary networks, such as how to compute and visualize the mean network from a sample of network-valued data points; and (ii) how to test for topological differences, when two families of networks also exhibit significant differences in density. In the first instance, we show that the issue of summarizing a family of networks can be addressed by either adopting a mass-univariate approach, which produces a statistical parametric network (SPN), or by directly computing the mean network, provided that a metric has been specified on the space of all networks with a given number of nodes. In the second part of this review, we then highlight the inherent problems associated with the comparison of topological functions of families of networks that differ in density. In particular, we show that a wide range of topological summaries, such as global efficiency and network modularity, are highly sensitive to differences in density. Moreover, these problems are not restricted to unweighted metrics, as we demonstrate that the same issues remain present when considering the weighted versions of these metrics. We conclude by encouraging caution, when reporting such statistical comparisons, and by emphasizing the importance of constructing summary networks.
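
    The summary-network idea in its simplest form can be sketched as follows (illustrative only, not the SPN procedure): element-wise averaging of a sample of adjacency matrices gives a mean weighted network, which can then be thresholded to a chosen density before topological summaries are compared. The data are synthetic; numpy is assumed.

```python
# Mean weighted network from a sample of subject-level matrices, thresholded to a target density.
import numpy as np

rng = np.random.default_rng(3)
n_subjects, n_nodes = 20, 10
sample = rng.random((n_subjects, n_nodes, n_nodes))
sample = (sample + sample.transpose(0, 2, 1)) / 2   # symmetrize each subject's matrix
for W in sample:
    np.fill_diagonal(W, 0.0)

mean_net = sample.mean(axis=0)                       # mean weighted network

def threshold_to_density(W, density):
    """Keep the strongest edges so that roughly `density` of possible edges remain."""
    iu = np.triu_indices_from(W, k=1)
    k = int(round(density * len(iu[0])))
    cutoff = np.sort(W[iu])[::-1][k - 1]
    A = (W >= cutoff).astype(int)
    np.fill_diagonal(A, 0)
    return A

A = threshold_to_density(mean_net, density=0.2)
print("edges kept:", int(A[np.triu_indices_from(A, k=1)].sum()))
```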

  20. Spectral Analysis of Rich Network Topology in Social Networks

    Science.gov (United States)

    Wu, Leting

    2013-01-01

    Social networks have received much attention these days. Researchers have developed different methods to study the structure and characteristics of the network topology. Our focus is on spectral analysis of the adjacency matrix of the underlying network. Recent work showed good properties in the adjacency spectral space but there are few…

  1. The Poor Man's Guide to Computer Networks and their Applications

    DEFF Research Database (Denmark)

    Sharp, Robin

    2003-01-01

    These notes for DTU course 02220, Concurrent Programming, give an introduction to computer networks, with focus on the modern Internet. Basic Internet protocols such as IP, TCP and UDP are presented, and two Internet application protocols, SMTP and HTTP, are described in some detail. Techniques for network programming are described, with concrete examples in Java. Techniques considered include simple socket programming, RMI, Corba, and Web services with SOAP....

  2. Large Scale Evolution of Convolutional Neural Networks Using Volunteer Computing

    OpenAIRE

    Desell, Travis

    2017-01-01

    This work presents a new algorithm called evolutionary exploration of augmenting convolutional topologies (EXACT), which is capable of evolving the structure of convolutional neural networks (CNNs). EXACT is in part modeled after the neuroevolution of augmenting topologies (NEAT) algorithm, with notable exceptions to allow it to scale to large scale distributed computing environments and evolve networks with convolutional filters. In addition to multithreaded and MPI versions, EXACT has been ...

  3. Efficient Capacity Computation and Power Optimization for Relay Networks

    CERN Document Server

    Parvaresh, Farzad

    2011-01-01

    The capacity or approximations to capacity of various single-source single-destination relay network models have been characterized in terms of the cut-set upper bound. In principle, a direct computation of this bound requires evaluating the cut capacity over exponentially many cuts. We show that the minimum cut capacity of a relay network under some special assumptions can be cast as a minimization of a submodular function, and as a result, can be computed efficiently. We use this result to show that the capacity, or an approximation to the capacity within a constant gap, for the Gaussian, wireless erasure, and Avestimehr-Diggavi-Tse deterministic relay network models can be computed in polynomial time. We present some empirical results showing that computing constant-gap approximations to the capacity of Gaussian relay networks with around 300 nodes can be done on the order of minutes. For Gaussian networks, cut-set capacities are also functions of the powers assigned to the nodes. We consider a family of power o...
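
    As a hedged aside, a plain s-t minimum cut on a small capacitated digraph (a standard max-flow computation, not the submodular-minimization procedure of the paper) can be obtained directly, e.g. with networkx:

    ```python
    import networkx as nx

    # A small hypothetical relay network; edge capacities stand in for link capacities.
    G = nx.DiGraph()
    G.add_edge("s", "r1", capacity=3.0)
    G.add_edge("s", "r2", capacity=2.0)
    G.add_edge("r1", "r2", capacity=1.0)
    G.add_edge("r1", "t", capacity=2.0)
    G.add_edge("r2", "t", capacity=3.0)

    # Max-flow min-cut duality: the cut value and the node partition realizing it.
    cut_value, (source_side, sink_side) = nx.minimum_cut(G, "s", "t")
    print("minimum cut capacity:", cut_value)
    print("source side of the cut:", source_side)
    ```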

  4. Network Reconstruction and Systems Analysis of Cardiac Myocyte Hypertrophy Signaling*

    Science.gov (United States)

    Ryall, Karen A.; Holland, David O.; Delaney, Kyle A.; Kraeutler, Matthew J.; Parker, Audrey J.; Saucerman, Jeffrey J.

    2012-01-01

    Cardiac hypertrophy is managed by a dense web of signaling pathways with many pathways influencing myocyte growth. A quantitative understanding of the contributions of individual pathways and their interactions is needed to better understand hypertrophy signaling and to develop more effective therapies for heart failure. We developed a computational model of the cardiac myocyte hypertrophy signaling network to determine how the components and network topology lead to differential regulation of transcription factors, gene expression, and myocyte size. Our computational model of the hypertrophy signaling network contains 106 species and 193 reactions, integrating 14 established pathways regulating cardiac myocyte growth. 109 of 114 model predictions were validated using published experimental data testing the effects of receptor activation on transcription factors and myocyte phenotypic outputs. Network motif analysis revealed an enrichment of bifan and biparallel cross-talk motifs. Sensitivity analysis was used to inform clustering of the network into modules and to identify species with the greatest effects on cell growth. Many species influenced hypertrophy, but only a few nodes had large positive or negative influences. Ras, a network hub, had the greatest effect on cell area and influenced more species than any other protein in the network. We validated this model prediction in cultured cardiac myocytes. With this integrative computational model, we identified the most influential species in the cardiac hypertrophy signaling network and demonstrate how different levels of network organization affect myocyte size, transcription factors, and gene expression. PMID:23091058

  5. Activity Recognition Using Complex Network Analysis.

    Science.gov (United States)

    Jalloul, Nahed; Poree, Fabienne; Viardot, Geoffrey; L'Hostis, Phillipe; Carrault, Guy

    2017-10-12

    In this paper, we perform complex network analysis on a connectivity dataset retrieved from a monitoring system in order to classify simple daily activities. The monitoring system is composed of a set of wearable sensing modules positioned on the subject's body, and the connectivity data consists of the correlation between each pair of modules. A number of network measures are then computed, followed by the application of statistical significance and feature selection methods. These methods were implemented for the purpose of reducing the total number of modules in the monitoring system required to provide accurate activity classification. The obtained results show that an overall accuracy of 84.6% for activity classification is achieved using a Random Forest (RF) classifier, when considering a monitoring system composed of only two modules, positioned at the neck and thigh of the subject's body.
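
    A hedged sketch of this kind of pipeline follows (the module count, window data and labels are hypothetical, and the network measures are merely examples of those one might compute):

    ```python
    import networkx as nx
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def window_features(signals, threshold=0.5):
        """Build a correlation graph over sensing modules and return a few network measures."""
        corr = np.corrcoef(signals)                      # modules x modules correlation
        n = corr.shape[0]
        G = nx.Graph()
        G.add_nodes_from(range(n))
        for i in range(n):
            for j in range(i + 1, n):
                if abs(corr[i, j]) >= threshold:         # keep strong correlations as edges
                    G.add_edge(i, j, weight=abs(corr[i, j]))
        return [nx.density(G), nx.average_clustering(G), nx.global_efficiency(G)]

    rng = np.random.default_rng(0)
    # Hypothetical data: 40 windows, 5 modules, 200 samples per window, 2 activity labels.
    X = np.array([window_features(rng.normal(size=(5, 200))) for _ in range(40)])
    y = rng.integers(0, 2, size=40)

    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X[:30], y[:30])
    print("held-out accuracy:", clf.score(X[30:], y[30:]))
    ```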

  6. Building Social Networks with Computer Networks: A New Deal for Teaching and Learning.

    Science.gov (United States)

    Thurston, Thomas

    2001-01-01

    Discusses the role of computer technology and Web sites in expanding social networks. Focuses on the New Deal Network using two examples: (1) uniting a Julia C. Lathrop Housing (Chicago, Illinois) resident with a university professor; and (2) saving the Hugo Gellert art murals at the Seward Park Coop Apartments (New York). (CMK)

  7. Modular analysis of gene networks by linear temporal logic.

    Science.gov (United States)

    Ito, Sohei; Ichinose, Takuma; Shimakawa, Masaya; Izumi, Naoko; Hagihara, Shigeki; Yonezaki, Naoki

    2013-03-25

    Despite many advances in biology and genomics, it is still difficult to utilise such valuable knowledge and information to understand and analyse large biological systems, due to high computational complexity. In this paper we propose a modular method in which a large network is analysed by integrating the analyses of several small networks. This method is based on the qualitative framework proposed by the authors, in which the analysis of gene networks is reduced to checking the satisfiability of linear temporal logic formulae. Checking the satisfiability of linear temporal logic requires time exponential in the size of a formula; it is therefore difficult to analyse large networks directly with this method, since the size of a formula grows linearly with the size of a network. The modular method alleviates this computational difficulty. We show some experimental results and discuss how the modular analysis method is beneficial.

  8. Analysis of Semantic Networks using Complex Networks Concepts

    DEFF Research Database (Denmark)

    Ortiz-Arroyo, Daniel

    2013-01-01

    In this paper we perform a preliminary analysis of semantic networks to determine the most important terms that could be used to optimize a summarization task. In our experiments, we measure how the properties of a semantic network change when the terms in the network are removed. Our preliminary results indicate that this approach provides good results on the semantic network analyzed in this paper....

  9. Machine learning based Intelligent cognitive network using fog computing

    Science.gov (United States)

    Lu, Jingyang; Li, Lun; Chen, Genshe; Shen, Dan; Pham, Khanh; Blasch, Erik

    2017-05-01

    In this paper, a Cognitive Radio Network (CRN) based on artificial intelligence is proposed to distribute the limited radio spectrum resources more efficiently. The CRN framework can analyze the time-sensitive signal data close to the signal source using fog computing with different types of machine learning techniques. Depending on the computational capabilities of the fog nodes, different features and machine learning techniques are chosen to optimize spectrum allocation. Also, the computing nodes send a periodic signal summary, which is much smaller than the original signal, to the cloud so that the overall system spectrum resource allocation strategies are dynamically updated. By applying fog computing, the system is more adaptive to the local environment and robust to spectrum changes. As most of the signal data is processed at the fog level, this further strengthens system security by reducing the communication burden of the communications network.

  10. COalitions in COOperation Networks (COCOON): Social Network Analysis and Game Theory to Enhance Cooperation Networks

    NARCIS (Netherlands)

    Sie, Rory

    2012-01-01

    Sie, R. L. L. (2012). COalitions in COOperation Networks (COCOON): Social Network Analysis and Game Theory to Enhance Cooperation Networks (Unpublished doctoral dissertation). September, 28, 2012, Open Universiteit in the Netherlands (CELSTEC), Heerlen, The Netherlands.

  11. Service-oriented Software Defined Optical Networks for Cloud Computing

    Science.gov (United States)

    Liu, Yuze; Li, Hui; Ji, Yuefeng

    2017-10-01

    With the development of big data and cloud computing technology, the traditional software-defined network is facing new challenges (e.g., ubiquitous accessibility, higher bandwidth, more flexible management and greater security). This paper proposes a new service-oriented software defined optical network architecture, comprising a resource layer, a service abstraction layer, a control layer and an application layer. We then describe the corresponding service-providing method. A different service ID is used to identify each service a device can offer. Finally, we experimentally demonstrate that the proposed service-providing method can be applied to transmit different services, based on the service ID, in the service-oriented software defined optical network.

  12. A local area computer network expert system framework

    Science.gov (United States)

    Dominy, Robert

    1987-01-01

    Over the past years, an expert system called LANES was developed to detect and isolate faults in the Goddard-wide Hybrid Local Area Computer Network (LACN). As a result, the need for a more generic LACN fault-isolation expert system has become apparent. An object-oriented approach was explored to create a set of generic classes, objects, rules, and methods that would be necessary to meet this need. The object classes provide a convenient mechanism for separating high-level information from low-level, network-specific information. This approach yields a framework which can be applied to different network configurations and be easily expanded to meet new needs.

  13. Test experience on an ultrareliable computer communication network

    Science.gov (United States)

    Abbott, L. W.

    1984-01-01

    The dispersed sensor processing mesh (DSPM) is an experimental, ultra-reliable, fault-tolerant computer communications network that exhibits an organic-like ability to regenerate itself after suffering damage. The regeneration is accomplished by two routines - grow and repair. This paper discusses the DSPM concept for achieving fault tolerance and provides a brief description of the mechanization of both the experiment and the six-node experimental network. The main topic of this paper is the system performance of the growth algorithm contained in the grow routine. The characteristics imparted to DSPM by the growth algorithm are also discussed. Data from an experimental DSPM network and software simulation of larger DSPM-type networks are used to examine the inherent limitation on growth time imposed by the growth algorithm and the relationship of growth time to network size and topology.

  14. Computer algebra and algebraic analysis

    OpenAIRE

    Castro Jiménez, Francisco Jesús; Lambán Pardo, Laureano (Coordinador); Romero Ibáñez, Ana (Coordinador); Rubio García, Julio (Coordinador)

    2010-01-01

    This article describes some applications of Computer Algebra to Algebraic Analysis, also known as D-module theory, that is, the algebraic study of linear systems of partial differential equations. We show how to compute different objects and invariants in D-module theory, using Groebner bases for rings of linear differential operators.

  15. Analytical Computation of the Epidemic Threshold on Temporal Networks

    Directory of Open Access Journals (Sweden)

    Eugenio Valdano

    2015-04-01

    Full Text Available The time variation of contacts in a networked system may fundamentally alter the properties of spreading processes and affect the condition for large-scale propagation, as encoded in the epidemic threshold. Despite the great interest in the problem for the physics, applied mathematics, computer science, and epidemiology communities, a full theoretical understanding is still missing and currently limited to the cases where the time-scale separation holds between spreading and network dynamics or to specific temporal network models. We consider a Markov chain description of the susceptible-infectious-susceptible process on an arbitrary temporal network. By adopting a multilayer perspective, we develop a general analytical derivation of the epidemic threshold in terms of the spectral radius of a matrix that encodes both network structure and disease dynamics. The accuracy of the approach is confirmed on a set of temporal models and empirical networks and against numerical results. In addition, we explore how the threshold changes when varying the overall time of observation of the temporal network, so as to provide insights on the optimal time window for data collection of empirical temporal networked systems. Our framework is of both fundamental and practical interest, as it offers novel understanding of the interplay between temporal networks and spreading dynamics.
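
    Following the multilayer, Markov-chain picture described above, here is a minimal hedged sketch of the threshold computation (assuming discrete time steps, per-contact transmission probability beta and recovery probability mu; the threshold is where the spectral radius of the weighted propagator reaches one):

    ```python
    import numpy as np

    def infection_propagator_radius(adjacency_snapshots, beta, mu):
        """Spectral radius of P = prod_t [(1 - mu) I + beta A_t] over the observation window.

        adjacency_snapshots: list of (N x N) arrays, one per time step.
        Values above 1 indicate that the epidemic can propagate at large scale.
        """
        N = adjacency_snapshots[0].shape[0]
        P = np.eye(N)
        for A in adjacency_snapshots:
            P = ((1.0 - mu) * np.eye(N) + beta * A) @ P
        return max(abs(np.linalg.eigvals(P)))

    # Hypothetical temporal network: 3 random snapshots of a 50-node contact network.
    rng = np.random.default_rng(0)
    raw = [(rng.random((50, 50)) < 0.05).astype(float) for _ in range(3)]
    snapshots = [np.triu(A, 1) + np.triu(A, 1).T for A in raw]   # symmetric, no self-loops

    for beta in (0.05, 0.2, 0.5):
        print(beta, round(infection_propagator_radius(snapshots, beta=beta, mu=0.5), 3))
    ```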

  16. Forensic Analysis of Compromised Computers

    Science.gov (United States)

    Wolfe, Thomas

    2004-01-01

    Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
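
    The original tool is a PERL script; as a hedged illustration of the same idea (not the script itself), here is a Python sketch that records per-file ownership and timestamps for a directory tree down to a chosen depth, in a spreadsheet-friendly form:

    ```python
    import csv
    import os
    import sys

    def dump_tree(root, out_path, max_depth):
        """Write owner/size/timestamp metadata for files under `root` to a CSV file."""
        root = os.path.abspath(root)
        with open(out_path, "w", newline="") as fh:
            writer = csv.writer(fh)
            writer.writerow(["path", "uid", "gid", "size", "ctime", "atime", "mtime"])
            for dirpath, dirnames, filenames in os.walk(root):
                depth = dirpath[len(root):].count(os.sep)
                if depth >= max_depth:
                    dirnames[:] = []          # do not descend any further
                for name in filenames:
                    full = os.path.join(dirpath, name)
                    try:
                        st = os.stat(full, follow_symlinks=False)
                    except OSError:
                        continue              # unreadable entries are skipped, not fatal
                    writer.writerow([full, st.st_uid, st.st_gid, st.st_size,
                                     st.st_ctime, st.st_atime, st.st_mtime])

    if __name__ == "__main__":
        # Hypothetical usage: python dump_tree.py /evidence/home out.csv 3
        dump_tree(sys.argv[1], sys.argv[2], int(sys.argv[3]))
    ```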

  17. Propagation of computer virus both across the Internet and external computers: A complex-network approach

    Science.gov (United States)

    Gan, Chenquan; Yang, Xiaofan; Liu, Wanping; Zhu, Qingyi; Jin, Jian; He, Li

    2014-08-01

    Based on the assumption that external computers (particularly, infected external computers) are connected to the Internet, and by considering the influence of the Internet topology on computer virus spreading, this paper establishes a novel computer virus propagation model with a complex-network approach. This model possesses a unique (viral) equilibrium which is globally attractive. Some numerical simulations are also given to illustrate this result. Further study shows that the computers with higher node degrees are more susceptible to infection than those with lower node degrees. In this regard, some appropriate protective measures are suggested.

  18. Identifying failure in a tree network of a parallel computer

    Science.gov (United States)

    Archer, Charles J.; Pinnow, Kurt W.; Wallenfelt, Brian P.

    2010-08-24

    Methods, parallel computers, and products are provided for identifying failure in a tree network of a parallel computer. The parallel computer includes one or more processing sets including an I/O node and a plurality of compute nodes. For each processing set embodiments include selecting a set of test compute nodes, the test compute nodes being a subset of the compute nodes of the processing set; measuring the performance of the I/O node of the processing set; measuring the performance of the selected set of test compute nodes; calculating a current test value in dependence upon the measured performance of the I/O node of the processing set, the measured performance of the set of test compute nodes, and a predetermined value for I/O node performance; and comparing the current test value with a predetermined tree performance threshold. If the current test value is below the predetermined tree performance threshold, embodiments include selecting another set of test compute nodes. If the current test value is not below the predetermined tree performance threshold, embodiments include selecting from the test compute nodes one or more potential problem nodes and testing individually potential problem nodes and links to potential problem nodes.

  19. Regional Computation of TEC Using a Neural Network Model

    Science.gov (United States)

    Leandro, R. F.; Santos, M. C.

    2004-05-01

    One of the main sources of error in GPS measurements is ionospheric refraction. As the ionosphere is a dispersive medium, its influence can be computed by using dual-frequency receivers. In the case of single-frequency receivers it is necessary to use models that tell us how large the ionospheric refraction is. The GPS broadcast message carries parameters of such a model, namely the Klobuchar model. Dual-frequency receivers make it possible to estimate the influence of the ionosphere on the GPS signal by computing TEC (Total Electron Content) values, which have a direct relationship with the magnitude of the delay caused by the ionosphere. One alternative is to create a regional model based on a network of dual-frequency receivers. In this case, the regional behaviour of the ionosphere is modelled in such a way that it is possible to estimate TEC values in or near this region. This regional model can be based on polynomials, for example. In this work we present a Neural Network-based model for the regional computation of TEC. The advantage of using a neural network is that it is not necessary to have great knowledge of the behaviour of the modelled surface, due to the adaptive capability of the neural network training process, which iteratively adjusts the synaptic weights as a function of the residuals, using the training parameters. Nevertheless, previous knowledge of the modelled phenomena is important to define what kind of, and how many, parameters are needed to train the neural network so that reasonable results are obtained from the estimations. We have used data from the GPS tracking network in Brazil, and we have tested the accuracy of the new model at all locations where there is a station, assessing the efficiency of the model everywhere. TEC values were computed for each station of the network. After that, the training parameters data set for the test station was formed, with the TEC values of all others (all stations, except the test one). The Neural Network was

  20. Design, Implementation and Optimization of Innovative Internet Access Networks, based on Fog Computing and Software Defined Networking

    OpenAIRE

    Iotti, Nicola

    2017-01-01

    1. DESIGN In this dissertation we introduce a new approach to Internet access networks in public spaces, such as the Wi-Fi networks commonly known as hotspots, based on Fog Computing (or Edge Computing), Software Defined Networking (SDN) and the deployment of Virtual Machines (VM) and Linux containers on the edge of the network. In this vision we deploy specialized network elements, called Fog Nodes, on the edge of the network, able to virtualize the physical infrastructure and expose APIs to e...

  1. Networks and network analysis for defence and security

    CERN Document Server

    Masys, Anthony J

    2014-01-01

    Networks and Network Analysis for Defence and Security discusses relevant theoretical frameworks and applications of network analysis in support of the defence and security domains. This book details real world applications of network analysis to support defence and security. Shocks to regional, national and global systems stemming from natural hazards, acts of armed violence, terrorism and serious and organized crime have significant defence and security implications. Today, nations face an uncertain and complex security landscape in which threats impact/target the physical, social, economic

  2. Multiscale methodology for bone remodelling simulation using coupled finite element and neural network computation.

    Science.gov (United States)

    Hambli, Ridha; Katerchi, Houda; Benhamou, Claude-Laurent

    2011-02-01

    The aim of this paper is to develop a multiscale hierarchical hybrid model based on finite element analysis and neural network computation to link mesoscopic scale (trabecular network level) and macroscopic (whole bone level) to simulate the process of bone remodelling. As whole bone simulation, including the 3D reconstruction of trabecular level bone, is time consuming, finite element calculation is only performed at the macroscopic level, whilst trained neural networks are employed as numerical substitutes for the finite element code needed for the mesoscale prediction. The bone mechanical properties are updated at the macroscopic scale depending on the morphological and mechanical adaptation at the mesoscopic scale computed by the trained neural network. The digital image-based modelling technique using μ-CT and voxel finite element analysis is used to capture volume elements representative of 2 mm³ at the mesoscale level of the femoral head. The input data for the artificial neural network are a set of bone material parameters, boundary conditions and the applied stress. The output data are the updated bone properties and some trabecular bone factors. The current approach is the first model, to our knowledge, that incorporates both finite element analysis and neural network computation to rapidly simulate multilevel bone adaptation.
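
    A hedged sketch of the general surrogate idea follows (training a neural network to stand in for an expensive mesoscale computation; the input/output dimensions and the stand-in `expensive_model` are hypothetical, not the paper's finite element code):

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def expensive_model(params):
        """Stand-in for a costly mesoscale simulation: maps material/loading parameters
        to updated properties (an arbitrary smooth function, for illustration only)."""
        x = np.asarray(params)
        return np.array([np.tanh(x).sum(), (x ** 2).mean()])

    rng = np.random.default_rng(0)
    X_train = rng.uniform(-1.0, 1.0, size=(500, 6))     # e.g. material params, BCs, applied stress
    Y_train = np.array([expensive_model(x) for x in X_train])

    surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
    surrogate.fit(X_train, Y_train)

    # The trained network now replaces the expensive call inside the macroscale update loop.
    x_new = rng.uniform(-1.0, 1.0, size=(1, 6))
    print("surrogate:", surrogate.predict(x_new)[0], "reference:", expensive_model(x_new[0]))
    ```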

  3. Computation and Communication Evaluation of an Authentication Mechanism for Time-Triggered Networked Control Systems

    Directory of Open Access Journals (Sweden)

    Goncalo Martins

    2016-07-01

    Full Text Available In modern networked control applications, confidentiality and integrity are important features to address in order to prevent against attacks. Moreover, network control systems are a fundamental part of the communication components of current cyber-physical systems (e.g., automotive communications). Many networked control systems employ Time-Triggered (TT) architectures that provide mechanisms enabling the exchange of precise and synchronous messages. TT systems have computation and communication constraints, and with the aim to enable secure communications in the network, it is important to evaluate the computational and communication overhead of implementing secure communication mechanisms. This paper presents a comprehensive analysis and evaluation of the effects of adding a Hash-based Message Authentication (HMAC) to TT networked control systems. The contributions of the paper include (1) the analysis and experimental validation of the communication overhead, as well as a scalability analysis that utilizes the experimental result for both wired and wireless platforms and (2) an experimental evaluation of the computational overhead of HMAC based on a kernel-level Linux implementation. An automotive application is used as an example, and the results show that it is feasible to implement a secure communication mechanism without interfering with the existing automotive controller execution times. The methods and results of the paper can be used for evaluating the performance impact of security mechanisms and, thus, for the design of secure wired and wireless TT networked control systems.

  4. Computation and Communication Evaluation of an Authentication Mechanism for Time-Triggered Networked Control Systems.

    Science.gov (United States)

    Martins, Goncalo; Moondra, Arul; Dubey, Abhishek; Bhattacharjee, Anirban; Koutsoukos, Xenofon D

    2016-07-25

    In modern networked control applications, confidentiality and integrity are important features to address in order to prevent against attacks. Moreover, network control systems are a fundamental part of the communication components of current cyber-physical systems (e.g., automotive communications). Many networked control systems employ Time-Triggered (TT) architectures that provide mechanisms enabling the exchange of precise and synchronous messages. TT systems have computation and communication constraints, and with the aim to enable secure communications in the network, it is important to evaluate the computational and communication overhead of implementing secure communication mechanisms. This paper presents a comprehensive analysis and evaluation of the effects of adding a Hash-based Message Authentication (HMAC) to TT networked control systems. The contributions of the paper include (1) the analysis and experimental validation of the communication overhead, as well as a scalability analysis that utilizes the experimental result for both wired and wireless platforms and (2) an experimental evaluation of the computational overhead of HMAC based on a kernel-level Linux implementation. An automotive application is used as an example, and the results show that it is feasible to implement a secure communication mechanism without interfering with the existing automotive controller execution times. The methods and results of the paper can be used for evaluating the performance impact of security mechanisms and, thus, for the design of secure wired and wireless TT networked control systems.
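
    As a hedged, generic illustration of measuring HMAC computational overhead (not the authors' kernel-level setup; the key, payload sizes and repetition count are arbitrary):

    ```python
    import hashlib
    import hmac
    import os
    import time

    def time_hmac(message_size, repetitions=10_000):
        """Average time (in microseconds) to tag one message of `message_size` bytes."""
        key = os.urandom(32)
        message = os.urandom(message_size)
        start = time.perf_counter()
        for _ in range(repetitions):
            hmac.new(key, message, hashlib.sha256).digest()
        elapsed = time.perf_counter() - start
        return 1e6 * elapsed / repetitions

    # Time-triggered frames are typically small; sweep a few payload sizes.
    for size in (16, 64, 256, 1024):
        print(f"{size:5d} bytes: {time_hmac(size):6.2f} us per HMAC-SHA256 tag")
    ```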

  5. Passive Fingerprinting Of Computer Network Reconnaissance Tools

    Science.gov (United States)

    2009-09-01

    When reviewing the different data captures for analysis, it was noted that one of the early default Nmap captures had...

  6. Improving a Computer Networks Course Using the Partov Simulation Engine

    Science.gov (United States)

    Momeni, B.; Kharrazi, M.

    2012-01-01

    Computer networks courses are hard to teach as there are many details in the protocols and techniques involved that are difficult to grasp. Employing programming assignments as part of the course helps students to obtain a better understanding and gain further insight into the theoretical lectures. In this paper, the Partov simulation engine and…

  7. System/360 Computer Assisted Network Scheduling (CANS) System

    Science.gov (United States)

    Brewer, A. C.

    1972-01-01

    Computer-assisted scheduling techniques that produce conflict-free and efficient schedules have been developed and implemented to meet the needs of the Manned Space Flight Network. The CANS system provides effective management of resources in a complex scheduling environment. The system is an automated resource scheduling, controlling, planning, and information storage and retrieval tool.

  8. Fish species recognition using computer vision and a neural network

    NARCIS (Netherlands)

    Storbeck, F.; Daan, B.

    2001-01-01

    A system is described to recognize fish species by computer vision and a neural network program. The vision system measures a number of features of fish as seen by a camera perpendicular to a conveyor belt. The features used here are the widths and heights at various locations along the fish. First

  9. Computing Nash Equilibrium in Wireless Ad Hoc Networks

    DEFF Research Database (Denmark)

    Bulychev, Peter E.; David, Alexandre; Larsen, Kim G.

    2012-01-01

    This paper studies the problem of computing Nash equilibrium in wireless networks modeled by Weighted Timed Automata. Such formalism comes together with a logic that can be used to describe complex features such as timed energy constraints. Our contribution is a method for solving this problem...

  10. High Performance Computing and Networking for Science--Background Paper.

    Science.gov (United States)

    Congress of the U.S., Washington, DC. Office of Technology Assessment.

    The Office of Technology Assessment is conducting an assessment of the effects of new information technologies--including high performance computing, data networking, and mass data archiving--on research and development. This paper offers a view of the issues and their implications for current discussions about Federal supercomputer initiatives…

  11. A Three-Dimensional Computational Model of Collagen Network Mechanics

    Science.gov (United States)

    Lee, Byoungkoo; Zhou, Xin; Riching, Kristin; Eliceiri, Kevin W.; Keely, Patricia J.; Guelcher, Scott A.; Weaver, Alissa M.; Jiang, Yi

    2014-01-01

    Extracellular matrix (ECM) strongly influences cellular behaviors, including cell proliferation, adhesion, and particularly migration. In cancer, the rigidity of the stromal collagen environment is thought to control tumor aggressiveness, and collagen alignment has been linked to tumor cell invasion. While the mechanical properties of collagen at both the single fiber scale and the bulk gel scale are quite well studied, how the fiber network responds to local stress or deformation, both structurally and mechanically, is poorly understood. This intermediate scale knowledge is important to understanding cell-ECM interactions and is the focus of this study. We have developed a three-dimensional elastic collagen fiber network model (bead-and-spring model) and studied fiber network behaviors for various biophysical conditions: collagen density, crosslinker strength, crosslinker density, and fiber orientation (random vs. prealigned). We found the best-fit crosslinker parameter values using shear simulation tests in a small strain region. Using this calibrated collagen model, we simulated both shear and tensile tests in a large linear strain region for different network geometry conditions. The results suggest that network geometry is a key determinant of the mechanical properties of the fiber network. We further demonstrated how the fiber network structure and mechanics evolve with a local deformation, mimicking the effect of pulling by a pseudopod during cell migration. Our computational fiber network model is a step toward a full biomechanical model of cellular behaviors in various ECM conditions. PMID:25386649

  12. Computer-Supported Modelling of Multi modal Transportation Networks Rationalization

    Directory of Open Access Journals (Sweden)

    Ratko Zelenika

    2007-09-01

    Full Text Available This paper deals with issues of shaping and functioning of computer programs in the modelling and solving of multimodal transportation network problems. A methodology of an integrated use of a programming language for mathematical modelling is defined, as well as spreadsheets for the solving of complex multimodal transportation network problems. The paper contains a comparison of the partial and integral methods of solving multimodal transportation networks. The basic hypothesis set forth in this paper is that the integral method results in better multimodal transportation network rationalization effects, whereas a multimodal transportation network model based on the integral method, once built, can be used as the basis for all kinds of transportation problems within multimodal transport. As opposed to linear transport problems, a multimodal transport network can assume very complex shapes. This paper contains a comparison of the partial and integral approach to transportation network solving. In the partial approach, a straightforward model of a transportation network, which can be solved through the use of the Solver computer tool within the Excel spreadsheet interface, is quite sufficient. In the solving of a multimodal transportation problem through the integral method, it is necessary to apply sophisticated mathematical modelling programming languages which support the use of complex matrix functions and the processing of a vast amount of variables and limitations. The LINGO programming language is more abstract than the Excel spreadsheet, and it requires a certain programming knowledge. The definition and presentation of a problem logic within Excel, in a manner which is acceptable to computer software, is an ideal basis for modelling in the LINGO programming language, as well as a faster and more effective implementation of the mathematical model. This paper provides proof for the fact that it is more rational to solve the problem of multimodal transportation networks by
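
    The abstract contrasts spreadsheet (Excel Solver) and mathematical-programming (LINGO) formulations; as a hedged, language-neutral illustration of the underlying linear-programming idea, here is a tiny single-mode transportation problem solved with scipy (the costs, supplies and demands are made up):

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical 2-origin x 3-destination transportation problem.
    cost = np.array([[4.0, 6.0, 9.0],
                     [5.0, 3.0, 7.0]])      # unit shipping cost per origin-destination pair
    supply = np.array([40.0, 60.0])
    demand = np.array([30.0, 50.0, 20.0])

    m, n = cost.shape
    # Equality constraints: each origin ships its whole supply, each destination receives its demand.
    A_eq = np.zeros((m + n, m * n))
    for i in range(m):
        A_eq[i, i * n:(i + 1) * n] = 1.0
    for j in range(n):
        A_eq[m + j, j::n] = 1.0
    b_eq = np.concatenate([supply, demand])

    res = linprog(cost.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
    print("total cost:", res.fun)
    print("shipments:")
    print(res.x.reshape(m, n))
    ```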

  13. Signed Link Analysis in Social Media Networks

    OpenAIRE

    Beigi, Ghazaleh; Tang, Jiliang; Liu, Huan

    2016-01-01

    Numerous real-world relations can be represented by signed networks with positive links (e.g., trust) and negative links (e.g., distrust). Link analysis plays a crucial role in understanding the link formation and can advance various tasks in social network analysis such as link prediction. The majority of existing works on link analysis have focused on unsigned social networks. The existence of negative links determines that properties and principles of signed networks are substantially dist...

  14. Social network analysis in medical education

    OpenAIRE

    Isba, Rachel; Woolf, Katherine; Hanneman, Robert

    2016-01-01

    Content\\ud Humans are fundamentally social beings. The social systems within which we live our lives (families, schools, workplaces, professions, friendship groups) have a significant influence on our health, success and well-being. These groups can be characterised as networks and analysed using social network analysis.\\ud \\ud Social Network Analysis\\ud Social network analysis is a mainly quantitative method for analysing how relationships between individuals form and affect those individual...

  15. Computer network time synchronization the network time protocol on earth and in space

    CERN Document Server

    Mills, David L

    2010-01-01

    Carefully coordinated, reliable, and accurate time synchronization is vital to a wide spectrum of fields-from air and ground traffic control, to buying and selling goods and services, to TV network programming. Ill-gotten time could even lead to the unimaginable and cause DNS caches to expire, leaving the entire Internet to implode on the root servers.Written by the original developer of the Network Time Protocol (NTP), Computer Network Time Synchronization: The Network Time Protocol on Earth and in Space, Second Edition addresses the technological infrastructure of time dissemination, distrib
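
    Purely as a hedged illustration of the protocol the book covers (a minimal SNTP-style query, not the full NTP clock-discipline algorithms; the pool server name is only an example):

    ```python
    import socket
    import struct
    import time

    NTP_EPOCH_OFFSET = 2208988800            # seconds between 1900-01-01 and 1970-01-01

    def sntp_time(server="pool.ntp.org", timeout=2.0):
        """Ask an NTP server for its transmit timestamp and return it as Unix time."""
        packet = bytearray(48)
        packet[0] = 0x1B                      # LI = 0, version = 3, mode = 3 (client)
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.settimeout(timeout)
            sock.sendto(bytes(packet), (server, 123))
            response, _ = sock.recvfrom(48)
        transmit_seconds = struct.unpack("!I", response[40:44])[0]
        return transmit_seconds - NTP_EPOCH_OFFSET

    if __name__ == "__main__":
        remote = sntp_time()
        print("server time:", time.ctime(remote),
              "| local offset (s):", round(remote - time.time(), 3))
    ```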

  16. Design Criteria For Networked Image Analysis System

    Science.gov (United States)

    Reader, Cliff; Nitteberg, Alan

    1982-01-01

    Image systems design is currently undergoing a metamorphosis from the conventional computing systems of the past into a new generation of special purpose designs. This change is motivated by several factors, notable among which is the increased opportunity for high performance with low cost offered by advances in semiconductor technology. Another key issue is a maturing understanding of the problems and the applicability of digital processing techniques. These factors allow the design of cost-effective systems that are functionally dedicated to specific applications and used in a utilitarian fashion. Following an overview of the above-stated issues, the paper presents a top-down approach to the design of networked image analysis systems. The requirements for such a system are presented, with orientation toward the hospital environment. The three main areas are image data base management, viewing of image data and image data processing. This is followed by a survey of the current state of the art, covering image display systems, data base techniques, communications networks and software systems control. The paper concludes with a description of the functional subsystems and architectural framework for networked image analysis in a production environment.

  17. Computation emerges from adaptive synchronization of networking neurons.

    Directory of Open Access Journals (Sweden)

    Massimiliano Zanin

    Full Text Available The activity of networking neurons is largely characterized by the alternation of synchronous and asynchronous spiking sequences. One of the most relevant challenges that scientists are facing today is, then, relating that evidence with the fundamental mechanisms through which the brain computes and processes information, as well as with the arousal (or progress) of a number of neurological illnesses. In other words, the problem is how to associate an organized dynamics of interacting neural assemblies to a computational task. Here we show that computation can be seen as a feature emerging from the collective dynamics of an ensemble of networking neurons, which interact by means of adaptive dynamical connections. Namely, by associating logical states with synchronous neuronal dynamics, we show how the usual Boolean logics can be fully recovered, and a universal Turing machine can be constructed. Furthermore, we show that, besides the static binary gates, a wider class of logical operations can be efficiently constructed as the fundamental computational elements interact within an adaptive network, each operation being represented by a specific motif. Our approach qualitatively differs from past attempts to encode information and compute with complex systems, where computation was instead the consequence of applying control loops that enforce a desired state into the specific system's dynamics. Being the result of an emergent process, the computation mechanism here described is not limited to a binary Boolean logic, but it can involve a much larger number of states. As such, our results can suggest new concepts for understanding the real computing processes taking place in the brain.

  18. Synchronization-based computation through networks of coupled oscillators

    Directory of Open Access Journals (Sweden)

    Daniel eMalagarriga

    2015-08-01

    Full Text Available The mesoscopic activity of the brain is strongly dynamical, while at the same time exhibiting remarkable computational capabilities. In order to examine how these two features coexist, here we show that the patterns of synchronized oscillations displayed by networks of neural mass models, representing cortical columns, can be used as substrates for Boolean computation. Our results reveal that different logical operations can be implemented by the same neural mass network at different times, following the dynamics of the input. The results are reproduced experimentally with electronic circuits of coupled Chua oscillators, showing the robustness of this kind of computation to the intrinsic noise and parameter mismatch of the oscillators responsible for the functioning of the gates. We also show that the information-processing capabilities of coupled oscillations go beyond the simple juxtaposition of logic gates.

  19. Ptychographic X-ray computed tomography of extended colloidal networks in food emulsions

    DEFF Research Database (Denmark)

    Schou Nielsen, Mikkel; Bøgelund Munk, Merete; Diaz, Ana

    2016-01-01

    of suitable non-destructive 3D imaging techniques with submicron resolution. We present results of quantitative ptychographic X-ray computed tomography applied to a palm kernel oil based oil-in-water emulsion. The measurements were carried out at ambient pressure and temperature. The 3D structure...... of the extended colloidal network of fat globules was obtained with a resolution of around 300 nm. Through image analysis of the network structure, the fat globule size distribution was computed and compared to previous findings. In further support, the reconstructed electron density values were within 4...

  20. Advances in neural networks computational and theoretical issues

    CERN Document Server

    Esposito, Anna; Morabito, Francesco

    2015-01-01

    This book collects research works that exploit neural networks and machine learning techniques from a multidisciplinary perspective. Subjects covered include theoretical, methodological and computational topics which are grouped together into chapters devoted to the discussion of novelties and innovations related to the field of Artificial Neural Networks as well as the use of neural networks for applications, pattern recognition, signal processing, and special topics such as the detection and recognition of multimodal emotional expressions and daily cognitive functions, and  bio-inspired memristor-based networks.  Providing insights into the latest research interest from a pool of international experts coming from different research fields, the volume becomes valuable to all those with any interest in a holistic approach to implement believable, autonomous, adaptive, and context-aware Information Communication Technologies.

  1. Neural dynamics as sampling: a model for stochastic computation in recurrent networks of spiking neurons.

    Directory of Open Access Journals (Sweden)

    Lars Buesing

    2011-11-01

    Full Text Available The organization of computations in networks of spiking neurons in the brain is still largely unknown, in particular in view of the inherently stochastic features of their firing activity and the experimentally observed trial-to-trial variability of neural systems in the brain. In principle there exists a powerful computational framework for stochastic computations, probabilistic inference by sampling, which can explain a large number of macroscopic experimental data in neuroscience and cognitive science. But it has turned out to be surprisingly difficult to create a link between these abstract models for stochastic computations and more detailed models of the dynamics of networks of spiking neurons. Here we create such a link and show that under some conditions the stochastic firing activity of networks of spiking neurons can be interpreted as probabilistic inference via Markov chain Monte Carlo (MCMC) sampling. Since common methods for MCMC sampling in distributed systems, such as Gibbs sampling, are inconsistent with the dynamics of spiking neurons, we introduce a different approach based on non-reversible Markov chains that is able to reflect inherent temporal processes of spiking neuronal activity through a suitable choice of random variables. We propose a neural network model and show by a rigorous theoretical analysis that its neural activity implements MCMC sampling of a given distribution, both for the case of discrete and continuous time. This provides a step towards closing the gap between abstract functional models of cortical computation and more detailed models of networks of spiking neurons.

  2. Neural dynamics as sampling: a model for stochastic computation in recurrent networks of spiking neurons.

    Science.gov (United States)

    Buesing, Lars; Bill, Johannes; Nessler, Bernhard; Maass, Wolfgang

    2011-11-01

    The organization of computations in networks of spiking neurons in the brain is still largely unknown, in particular in view of the inherently stochastic features of their firing activity and the experimentally observed trial-to-trial variability of neural systems in the brain. In principle there exists a powerful computational framework for stochastic computations, probabilistic inference by sampling, which can explain a large number of macroscopic experimental data in neuroscience and cognitive science. But it has turned out to be surprisingly difficult to create a link between these abstract models for stochastic computations and more detailed models of the dynamics of networks of spiking neurons. Here we create such a link and show that under some conditions the stochastic firing activity of networks of spiking neurons can be interpreted as probabilistic inference via Markov chain Monte Carlo (MCMC) sampling. Since common methods for MCMC sampling in distributed systems, such as Gibbs sampling, are inconsistent with the dynamics of spiking neurons, we introduce a different approach based on non-reversible Markov chains that is able to reflect inherent temporal processes of spiking neuronal activity through a suitable choice of random variables. We propose a neural network model and show by a rigorous theoretical analysis that its neural activity implements MCMC sampling of a given distribution, both for the case of discrete and continuous time. This provides a step towards closing the gap between abstract functional models of cortical computation and more detailed models of networks of spiking neurons.
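
    As a hedged illustration of "probabilistic inference by sampling" (standard Gibbs sampling in a small Boltzmann-machine-style network; note that the work above argues spiking networks realize a different, non-reversible MCMC scheme):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # A small symmetric network of binary units: weights W and biases b define
    # p(x) proportional to exp(0.5 * x.T W x + b.T x), with x in {0, 1}^n.
    n = 5
    W = rng.normal(scale=0.8, size=(n, n))
    W = np.triu(W, 1) + np.triu(W, 1).T          # symmetric, zero diagonal
    b = rng.normal(size=n)

    def gibbs_sample(W, b, steps=20_000, burn_in=2_000):
        """Gibbs sampling: repeatedly resample each unit given the current state of the rest."""
        x = rng.integers(0, 2, size=len(b)).astype(float)
        samples = []
        for t in range(steps):
            for i in range(len(b)):
                activation = W[i] @ x + b[i]             # local field of unit i
                p_on = 1.0 / (1.0 + np.exp(-activation)) # conditional p(x_i = 1 | rest)
                x[i] = float(rng.random() < p_on)
            if t >= burn_in:
                samples.append(x.copy())
        return np.array(samples)

    samples = gibbs_sample(W, b)
    print("estimated marginals p(x_i = 1):", samples.mean(axis=0).round(3))
    ```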

  3. Hybrid soft computing systems for electromyographic signals analysis: a review

    Science.gov (United States)

    2014-01-01

    The electromyographic (EMG) signal is a bio-signal collected on human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. A hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combinations of these techniques, and their other applications in EMG analysis. PMID:24490979

  4. A computational method based on CVSS for quantifying the vulnerabilities in computer network

    Directory of Open Access Journals (Sweden)

    Shahriyar Mohammadi

    2014-10-01

    Full Text Available Network vulnerability taxonomy has become increasingly important in the area of information and data exchange, not only for its potential use in the identification of vulnerabilities but also in their assessment and prioritization. Computer networks play an important role in information and communication infrastructure. However, they are constantly exposed to a variety of vulnerability risks. In their attempts to create secure information exchange systems, scientists have concentrated on understanding the nature and typology of these vulnerabilities. Their efforts aimed at establishing secure networks have led to the development of a variety of methods and techniques for quantifying vulnerability. The objective of the present paper is to develop a method based on the second edition of the Common Vulnerability Scoring System (CVSS) for the quantification of computer network vulnerabilities. It is expected that the proposed model will help in the identification and effective management of vulnerabilities through their quantification.
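
    As a hedged sketch of the quantification the paper builds on, here are the standard CVSS v2 base-score equations and metric values as given in the public v2 specification (an illustrative sketch only, not the paper's extended model):

    ```python
    def cvss2_base_score(av, ac, au, c, i, a):
        """CVSS v2 base score from the six base metrics (given as numeric metric values)."""
        impact = 10.41 * (1 - (1 - c) * (1 - i) * (1 - a))
        exploitability = 20 * av * ac * au
        f_impact = 0.0 if impact == 0 else 1.176
        return round(((0.6 * impact) + (0.4 * exploitability) - 1.5) * f_impact, 1)

    # Standard CVSS v2 metric values.
    ACCESS_VECTOR = {"local": 0.395, "adjacent": 0.646, "network": 1.0}
    ACCESS_COMPLEXITY = {"high": 0.35, "medium": 0.61, "low": 0.71}
    AUTHENTICATION = {"multiple": 0.45, "single": 0.56, "none": 0.704}
    IMPACT = {"none": 0.0, "partial": 0.275, "complete": 0.660}

    # Example: remotely exploitable, low complexity, no authentication, partial C/I/A impact
    # (vector AV:N/AC:L/Au:N/C:P/I:P/A:P), which evaluates to 7.5 under these equations.
    print(cvss2_base_score(ACCESS_VECTOR["network"], ACCESS_COMPLEXITY["low"],
                           AUTHENTICATION["none"], IMPACT["partial"],
                           IMPACT["partial"], IMPACT["partial"]))
    ```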

  5. Spectral Analysis Methods of Social Networks

    Directory of Open Access Journals (Sweden)

    P. G. Klyucharev

    2017-01-01

    Full Text Available Online social networks (such as Facebook, Twitter, VKontakte, etc.), being an important channel for disseminating information, are often used to exert an impact on the social consciousness for various purposes - from advertising products or services to full-scale information war - thereby making them a very relevant object of research. The paper reviews methods for analysing social networks (primarily online ones) that are based on the spectral theory of graphs. Such methods use the spectrum of the social graph, i.e. the set of eigenvalues of its adjacency matrix, as well as the eigenvectors of the adjacency matrix. We describe measures of centrality (in particular, eigenvector centrality and PageRank), which reflect the degree of influence that one or another user of the social network has. The very popular PageRank measure uses, as the centrality of a graph vertex, the final (stationary) probabilities of a Markov chain whose matrix of transition probabilities is calculated on the basis of the adjacency matrix of the social graph; the vector of final probabilities is an eigenvector of the matrix of transition probabilities. We present a method of dividing the graph vertices into two groups, based on maximizing the network modularity by computing the eigenvector of the modularity matrix. We also consider a method for detecting bots, based on a non-randomness measure of a graph computed using the spectral coordinates of vertices - the sets of eigenvector components of the adjacency matrix of a social graph. In general, there are a number of algorithms for analysing social networks based on the spectral theory of graphs. These algorithms show very good results, but their disadvantage is the relatively high (albeit polynomial) computational complexity for large graphs. At the same time, it is obvious that the practical potential of spectral graph theory methods is still underestimated, and they may be used as a basis to develop new methods. The work
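
    A hedged sketch of the PageRank computation described above (plain power iteration on a damped transition matrix; the small directed graph is made up):

    ```python
    import numpy as np

    def pagerank(adjacency, damping=0.85, tol=1e-10):
        """PageRank scores via power iteration on the damped transition matrix."""
        A = np.asarray(adjacency, dtype=float)
        n = A.shape[0]
        out_degree = A.sum(axis=1)
        out_degree[out_degree == 0] = 1.0            # crude handling of dangling nodes
        T = A / out_degree[:, None]                  # row-stochastic transition matrix
        rank = np.full(n, 1.0 / n)
        while True:
            new_rank = (1 - damping) / n + damping * (T.T @ rank)
            if np.abs(new_rank - rank).sum() < tol:
                return new_rank / new_rank.sum()
            rank = new_rank

    # A small directed graph given by its adjacency matrix (row i links to column j).
    A = np.array([[0, 1, 1, 0],
                  [0, 0, 1, 0],
                  [1, 0, 0, 1],
                  [0, 0, 1, 0]])
    print(pagerank(A).round(3))
    ```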

  6. Computing and Network Systems Administration, Operations Research, and System Dynamics Modeling: A Proposed Research Framework

    Directory of Open Access Journals (Sweden)

    Michael W. Totaro

    2016-12-01

    Full Text Available Information and computing infrastructures (ICT) involve levels of complexity that are highly dynamic in nature. This is due in no small measure to the proliferation of technologies, such as: cloud computing and distributed systems architectures, data mining and multidimensional analysis, and large scale enterprise systems, to name a few. Effective computing and network systems administration is integral to the stability and scalability of these complex software, hardware and communication systems. Systems administration involves the design, analysis, and continuous improvement of the performance or operation of information and computing systems. Additionally, social and administrative responsibilities have become nearly as integral for the systems administrator as are the technical demands that have been imposed for decades. The areas of operations research (OR) and system dynamics (SD) modeling offer system administrators a rich array of analytical and optimization tools that have been developed from diverse disciplines, which include: industrial, scientific, engineering, economic and financial, to name a few. This paper proposes a research framework by which OR and SD modeling techniques may prove useful to computing and network systems administration, which include: linear programming, network analysis, integer programming, nonlinear optimization, Markov processes, queueing modeling, simulation, decision analysis, heuristic techniques, and system dynamics modeling.
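
    As a hedged example of one of the OR tools listed above (the classic M/M/1 queueing formulas, e.g. for sizing a single server handling requests; the arrival and service rates are made up):

    ```python
    def mm1_metrics(arrival_rate, service_rate):
        """Steady-state M/M/1 metrics; requires arrival_rate < service_rate."""
        if arrival_rate >= service_rate:
            raise ValueError("queue is unstable: utilization would be >= 1")
        rho = arrival_rate / service_rate            # server utilization
        return {
            "utilization": rho,
            "mean_in_system": rho / (1 - rho),                          # L
            "mean_time_in_system": 1 / (service_rate - arrival_rate),   # W
            "mean_wait_in_queue": rho / (service_rate - arrival_rate),  # Wq
        }

    # Example: 40 requests/s arriving at a server that can handle 50 requests/s.
    print(mm1_metrics(arrival_rate=40.0, service_rate=50.0))
    ```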

  7. Complex Network Analysis of Brazilian Power Grid

    CERN Document Server

    Martins, Gabriela C; Ribeiro, Fabiano L; Forgerini, Fabricio L

    2016-01-01

    Power grids and other delivery networks have attracted some attention in the network literature over the last decades. Although the dynamics of power grids are controlled by computer systems and human operators, the static features of this type of network can be studied and analyzed. The topology of the Brazilian Power Grid (BPG) was studied in this work. We obtained the spatial structure of the BPG from the ONS (electric systems national operator), consisting of high-voltage transmission lines, generating stations and substations. The local low-voltage substations and local power delivery, as well as the dynamic features of the network, were neglected. We analyze the complex network of the BPG and identify the main topological information, such as the mean degree, the degree distribution, the network size and the clustering coefficient, to characterize the complex network. We also detected the critical locations in the network and, therefore, the points more susceptible to lead to a cascading failure and even to a blac...

  8. A modular architecture for transparent computation in recurrent neural networks.

    Science.gov (United States)

    Carmantini, Giovanni S; Beim Graben, Peter; Desroches, Mathieu; Rodrigues, Serafim

    2017-01-01

    Computation is classically studied in terms of automata, formal languages and algorithms; yet, the relation between neural dynamics and symbolic representations and operations is still unclear in traditional eliminative connectionism. Therefore, we suggest a unique perspective on this central issue, to which we would like to refer as transparent connectionism, by proposing accounts of how symbolic computation can be implemented in neural substrates. In this study we first introduce a new model of dynamics on a symbolic space, the versatile shift, showing that it supports the real-time simulation of a range of automata. We then show that the Gödelization of versatile shifts defines nonlinear dynamical automata, dynamical systems evolving on a vectorial space. Finally, we present a mapping between nonlinear dynamical automata and recurrent artificial neural networks. The mapping defines an architecture characterized by its granular modularity, where data, symbolic operations and their control are not only distinguishable in activation space, but also spatially localizable in the network itself, while maintaining a distributed encoding of symbolic representations. The resulting networks simulate automata in real-time and are programmed directly, in the absence of network training. To discuss the unique characteristics of the architecture and their consequences, we present two examples: (i) the design of a Central Pattern Generator from a finite-state locomotive controller, and (ii) the creation of a network simulating a system of interactive automata that supports the parsing of garden-path sentences as investigated in psycholinguistics experiments. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Review On Applications Of Neural Network To Computer Vision

    Science.gov (United States)

    Li, Wei; Nasrabadi, Nasser M.

    1989-03-01

    Neural network models have many potential applications to computer vision due to their parallel structures, learnability, implicit representation of domain knowledge, fault tolerance, and ability to handle statistical data. This paper demonstrates the basic principles, typical models and their applications in this field. A variety of neural models, such as associative memory, multilayer back-propagation perceptron, self-stabilized adaptive resonance network, hierarchical structured neocognitron, high order correlator, network with gating control and other models, can be applied to visual signal recognition, reinforcement, recall, stereo vision, motion, object tracking and other vision processes. Most of the algorithms have been simulated on computers. Some have been implemented with special hardware. Some systems use features, such as edges and profiles, of images as the data form for input. Other systems use raw data as input signals to the networks. We will present some novel ideas contained in these approaches and provide a comparison of these methods. Some unsolved problems are mentioned, such as extracting the intrinsic properties of the input information, integrating those low-level functions into a high-level cognitive system, achieving invariances and other problems. Perspectives of applications of some human vision models and neural network models are analyzed.

  10. Topological Analysis of Wireless Networks (TAWN)

    Science.gov (United States)

    2016-05-31

    Final Report, 12-02-2015 to 31-05-2016: Topological Analysis of Wireless Networks (TAWN). Principal Investigator: Prof. Michael Robinson. The report draws on the mathematical literature on sheaves, which describes how to draw global (network-wide) inferences from them. Keywords: wireless network, local homology, sheaf, topology.

  11. Cohesion network analysis of CSCL participation.

    Science.gov (United States)

    Dascalu, Mihai; McNamara, Danielle S; Trausan-Matu, Stefan; Allen, Laura K

    2017-04-13

    The broad use of computer-supported collaborative-learning (CSCL) environments (e.g., instant messenger-chats, forums, blogs in online communities, and massive open online courses) calls for automated tools to support tutors in the time-consuming process of analyzing collaborative conversations. In this article, the authors propose and validate the cohesion network analysis (CNA) model, housed within the ReaderBench platform. CNA, grounded in theories of cohesion, dialogism, and polyphony, is similar to social network analysis (SNA), but it also considers text content and discourse structure and, uniquely, uses automated cohesion indices to generate the underlying discourse representation. Thus, CNA enhances the power of SNA by explicitly considering semantic cohesion while modeling interactions between participants. The primary purpose of this article is to describe CNA analysis and to provide a proof of concept, by using ten chat conversations in which multiple participants debated the advantages of CSCL technologies. Each participant's contributions were human-scored on the basis of their relevance in terms of covering the central concepts of the conversation. SNA metrics, applied to the CNA sociogram, were then used to assess the quality of each member's degree of participation. The results revealed that the CNA indices were strongly correlated to the human evaluations of the conversations. Furthermore, a stepwise regression analysis indicated that the CNA indices collectively predicted 54% of the variance in the human ratings of participation. The results provide promising support for the use of automated computational assessments of collaborative participation and of individuals' degrees of active involvement in CSCL environments.
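
    As an illustration of applying SNA-style metrics to a participant sociogram, in the spirit of the CNA step described above, the following Python sketch uses networkx on a small directed graph; the participants and edge weights are hypothetical cohesion scores, not ReaderBench output.

      # Hedged sketch: computing SNA-style indicators on a weighted participant
      # sociogram. The edge weights below are hypothetical, not CNA indices.
      import networkx as nx

      sociogram = nx.DiGraph()
      sociogram.add_weighted_edges_from([
          ("Ana", "Bo", 2.5), ("Bo", "Ana", 1.0), ("Ana", "Chris", 0.8),
          ("Chris", "Bo", 1.7), ("Dana", "Ana", 2.1), ("Bo", "Dana", 0.5),
      ])

      # Weighted out-degree as a crude proxy for a participant's contribution strength.
      out_strength = dict(sociogram.out_degree(weight="weight"))
      # Unweighted betweenness highlights participants who bridge sub-conversations.
      betweenness = nx.betweenness_centrality(sociogram)

      for person in sociogram.nodes:
          print(f"{person}: out-strength={out_strength[person]:.1f}, "
                f"betweenness={betweenness[person]:.2f}")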

  12. Personal Computer Transport Analysis Program

    Science.gov (United States)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.

  13. Computers and networks in the age of globalization

    DEFF Research Database (Denmark)

    Bloch Rasmussen, Leif; Beardon, Colin; Munari, Silvio

    In modernity, an individual identity was constituted from civil society, while in a globalized network society, human identity, if it develops at all, must grow from communal resistance. A communal resistance to an abstract conceptualized world, where there is no possibility for perception...... in a network society; the individual and knowledge-based organizations; human responsibility and technology; and exclusion and regeneration. This volume contains the edited proceedings of the Fifth World Conference on Human Choice and Computers (HCC-5), which was sponsored by the International Federation...

  14. Review Essay: Does Qualitative Network Analysis Exist?

    Directory of Open Access Journals (Sweden)

    Rainer Diaz-Bone

    2007-01-01

    Full Text Available Social network analysis was formed and established in the 1970s as a way of analyzing systems of social relations. In this review the theoretical-methodological standpoint of social network analysis ("structural analysis") is introduced and the different forms of social network analysis are presented. Structural analysis argues that social actors and social relations are embedded in social networks, meaning that action and perception of actors as well as the performance of social relations are influenced by the network structure. Since the 1990s structural analysis has integrated concepts such as agency, discourse and symbolic orientation and in this way structural analysis has opened itself. Since then there has been increasing use of qualitative methods in network analysis. They are used to include the perspective of the analyzed actors, to explore networks, and to understand network dynamics. In the reviewed book, edited by Betina HOLLSTEIN and Florian STRAUS, the twenty predominantly empirically orientated contributions demonstrate the possibilities of combining quantitative and qualitative methods in network analyses in different research fields. In this review we examine how the contributions succeed in applying and developing the structural analysis perspective, and the self-positioning of "qualitative network analysis" is evaluated. URN: urn:nbn:de:0114-fqs0701287

  15. Google matrix analysis of directed networks

    Science.gov (United States)

    Ermann, Leonardo; Frahm, Klaus M.; Shepelyansky, Dima L.

    2015-10-01

    In the past decade modern societies have developed enormous communication and social networks. Their classification and information retrieval processing has become a formidable task for the society. Because of the rapid growth of the World Wide Web, and social and communication networks, new mathematical methods have been invented to characterize the properties of these networks in a more detailed and precise way. Various search engines extensively use such methods. It is highly important to develop new tools to classify and rank a massive amount of network information in a way that is adapted to internal network structures and characteristics. This review describes the Google matrix analysis of directed complex networks demonstrating its efficiency using various examples including the World Wide Web, Wikipedia, software architectures, world trade, social and citation networks, brain neural networks, DNA sequences, and Ulam networks. The analytical and numerical matrix methods used in this analysis originate from the fields of Markov chains, quantum chaos, and random matrix theory.
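
    The Google matrix construction referenced above can be illustrated with a few lines of numpy: build the column-stochastic matrix S from a directed adjacency matrix, add the damping term, and extract the leading eigenvector (the PageRank vector) by power iteration. The five-node adjacency matrix below is made up for the example.

      # Illustrative sketch: Google matrix G = alpha*S + (1 - alpha)/N and its
      # PageRank vector via power iteration, for a small made-up directed network.
      import numpy as np

      A = np.array([  # A[i, j] = 1 if node j links to node i
          [0, 1, 1, 0, 0],
          [1, 0, 0, 0, 1],
          [0, 1, 0, 1, 0],
          [0, 0, 1, 0, 0],
          [1, 0, 0, 1, 0],
      ], dtype=float)
      N = A.shape[0]
      col_sums = A.sum(axis=0)
      # normalize columns; dangling nodes (zero out-degree) get a uniform column
      S = np.where(col_sums > 0, A / np.where(col_sums == 0, 1, col_sums), 1.0 / N)

      alpha = 0.85
      G = alpha * S + (1 - alpha) / N   # damping adds a uniform teleportation term

      p = np.full(N, 1.0 / N)
      for _ in range(200):              # power iteration converges to PageRank
          p = G @ p
      print(np.round(p / p.sum(), 3))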

  16. Smart photonic networks and computer security for image data

    Science.gov (United States)

    Campello, Jorge; Gill, John T.; Morf, Martin; Flynn, Michael J.

    1998-02-01

    Work reported here is part of a larger project on 'Smart Photonic Networks and Computer Security for Image Data', studying the interactions of coding and security, switching architecture simulations, and basic technologies. Coding and security: coding methods that are appropriate for data security in data fusion networks were investigated. These networks have several characteristics that distinguish them form other currently employed networks, such as Ethernet LANs or the Internet. The most significant characteristics are very high maximum data rates; predominance of image data; narrowcasting - transmission of data form one source to a designated set of receivers; data fusion - combining related data from several sources; simple sensor nodes with limited buffering. These characteristics affect both the lower level network design and the higher level coding methods.Data security encompasses privacy, integrity, reliability, and availability. Privacy, integrity, and reliability can be provided through encryption and coding for error detection and correction. Availability is primarily a network issue; network nodes must be protected against failure or routed around in the case of failure. One of the more promising techniques is the use of 'secret sharing'. We consider this method as a special case of our new space-time code diversity based algorithms for secure communication. These algorithms enable us to exploit parallelism and scalable multiplexing schemes to build photonic network architectures. A number of very high-speed switching and routing architectures and their relationships with very high performance processor architectures were studied. Indications are that routers for very high speed photonic networks can be designed using the very robust and distributed TCP/IP protocol, if suitable processor architecture support is available.

  17. Social network analysis community detection and evolution

    CERN Document Server

    Missaoui, Rokia

    2015-01-01

    This book is devoted to recent progress in social network analysis with a high focus on community detection and evolution. The eleven chapters cover the identification of cohesive groups, core components and key players either in static or dynamic networks of different kinds and levels of heterogeneity. Other important topics in social network analysis such as influential detection and maximization, information propagation, user behavior analysis, as well as network modeling and visualization are also presented. Many studies are validated through real social networks such as Twitter. This edit

  18. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis.

    Science.gov (United States)

    Ni, Jianhua; Qian, Tianlu; Xi, Changbai; Rui, Yikang; Wang, Jiechen

    2016-08-18

    The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between road network and healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the outside and inside areas of the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After computing correlations between the different categories of hospitals and street centrality, we find that the distribution of these hospitals correlates highly with the street centralities, and that the correlations are higher with private and small hospitals than with public and large hospitals. The comprehensive analysis results could help examine the reasonableness of existing urban healthcare facility distribution and optimize the location of new healthcare facilities.
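
    The correlation step described above can be sketched as follows: compute a street centrality measure on the road graph and correlate it with facility counts at each node. The street graph and hospital counts below are synthetic placeholders, not the Nanjing data.

      # Hedged sketch: correlating facility counts with street centrality.
      import networkx as nx
      from scipy.stats import pearsonr

      streets = nx.Graph()
      streets.add_weighted_edges_from([   # (node, node, segment length)
          (0, 1, 200), (1, 2, 150), (2, 3, 300), (1, 4, 120),
          (4, 5, 180), (5, 2, 220), (3, 5, 260), (0, 4, 310),
      ])

      closeness = nx.closeness_centrality(streets, distance="weight")
      hospitals_near_node = {0: 1, 1: 4, 2: 3, 3: 0, 4: 2, 5: 1}  # hypothetical counts

      nodes = sorted(streets.nodes)
      r, p_value = pearsonr([closeness[n] for n in nodes],
                            [hospitals_near_node[n] for n in nodes])
      print(f"Pearson r = {r:.2f} (p = {p_value:.2f})")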

  19. Network analysis literacy a practical approach to the analysis of networks

    CERN Document Server

    Zweig, Katharina A

    2014-01-01

    Network Analysis Literacy focuses on design principles for network analytics projects. The text enables readers to: pose a defined network analytic question; build a network to answer the question; choose or design the right network analytic methods for a particular purpose, and more.

  20. Social network analysis and dual rover communications

    Science.gov (United States)

    Litaker, Harry L.; Howard, Robert L.

    2013-10-01

    Social network analysis (SNA) refers to the collection of techniques, tools, and methods used in sociometry aiming at the analysis of social networks to investigate decision making, group communication, and the distribution of information. Human factors engineers at the National Aeronautics and Space Administration (NASA) conducted a social network analysis on communication data collected during a 14-day field study operating a dual rover exploration mission to better understand the relationships between certain network groups such as ground control, flight teams, and planetary science. The analysis identified two communication network structures for the continuous communication and Twice-a-Day Communication scenarios as a split network and a negotiated network, respectively. The major nodes or groups for the networks' architecture, transmittal status, and information were identified using graphical network mapping, quantitative analysis of subjective impressions, and quantified statistical analysis using Sociometric Status and Centrality. Post-questionnaire analysis along with interviews revealed advantages and disadvantages of each network structure, with team members identifying the need for a more stable continuous communication network, improved robustness of voice loops, and better systems training/capabilities for scientific imagery data and operational data during Twice-a-Day Communications.

  1. Using Social Network Analysis to Assess Mentorship and Collaboration in a Public Health Network.

    Science.gov (United States)

    Petrescu-Prahova, Miruna; Belza, Basia; Leith, Katherine; Allen, Peg; Coe, Norma B; Anderson, Lynda A

    2015-08-20

    Addressing chronic disease burden requires the creation of collaborative networks to promote systemic changes and engage stakeholders. Although many such networks exist, they are rarely assessed with tools that account for their complexity. This study examined the structure of mentorship and collaboration relationships among members of the Healthy Aging Research Network (HAN) using social network analysis (SNA). We invited 97 HAN members and partners to complete an online social network survey that included closed-ended questions about HAN-specific mentorship and collaboration during the previous 12 months. Collaboration was measured by examining the activity of the network on 6 types of products: published articles, in-progress manuscripts, grant applications, tools, research projects, and presentations. We computed network-level measures such as density, number of components, and centralization to assess the cohesiveness of the network. Sixty-three respondents completed the survey (response rate, 65%). Responses, which included information about collaboration with nonrespondents, suggested that 74% of HAN members were connected through mentorship ties and that all 97 members were connected through at least one form of collaboration. Mentorship and collaboration ties were present both within and across boundaries of HAN member organizations. SNA of public health collaborative networks provides understanding about the structure of relationships that are formed as a result of participation in network activities. This approach may offer members and funders a way to assess the impact of such networks that goes beyond simply measuring products and participation at the individual level.
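
    The network-level measures named above (density, number of components, centralization) can be computed directly on a collaboration graph; the sketch below uses networkx on a toy network, not the HAN survey data, and uses Freeman's degree centralization formula.

      # Hedged sketch of network-level cohesion measures on a toy collaboration graph.
      import networkx as nx

      collab = nx.Graph([("A", "B"), ("B", "C"), ("A", "C"), ("C", "D"), ("E", "F")])

      density = nx.density(collab)
      n_components = nx.number_connected_components(collab)

      # Freeman degree centralization: how strongly the network is dominated by
      # its most connected member (1 = perfect star, 0 = fully even degrees).
      n = collab.number_of_nodes()
      centrality = nx.degree_centrality(collab)
      c_max = max(centrality.values())
      centralization = sum(c_max - c for c in centrality.values()) / (n - 2)

      print(f"density={density:.2f}, components={n_components}, "
            f"centralization={centralization:.2f}")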

  2. Computers and networks in the age of globalization

    DEFF Research Database (Denmark)

    Bloch Rasmussen, Leif; Beardon, Colin; Munari, Silvio

    in a network society; the individual and knowledge-based organizations; human responsibility and technology; and exclusion and regeneration. This volume contains the edited proceedings of the Fifth World Conference on Human Choice and Computers (HCC-5), which was sponsored by the International Federation...... for Information Processing (IFIP) and held in Geneva, Switzerland in August 1998. Since the first HCC conference in 1974, IFIP's Technical Committee 9 has endeavoured to set the agenda for human choices and human actions vis-a-vis computers....

  3. Computer, Network, Software, and Hardware Engineering with Applications

    CERN Document Server

    Schneidewind, Norman F

    2012-01-01

    There are many books on computers, networks, and software engineering but none that integrate the three with applications. Integration is important because, increasingly, software dominates the performance, reliability, maintainability, and availability of complex computers and systems. Books on software engineering typically portray software as if it exists in a vacuum with no relationship to the wider system. This is wrong because a system is more than software. It is comprised of people, organizations, processes, hardware, and software. All of these components must be considered in an integr

  4. Numerical analysis mathematics of scientific computing

    CERN Document Server

    Kincaid, David

    2009-01-01

    This book introduces students with diverse backgrounds to various types of mathematical analysis that are commonly needed in scientific computing. The subject of numerical analysis is treated from a mathematical point of view, offering a complete analysis of methods for scientific computing with appropriate motivations and careful proofs. In an engaging and informal style, the authors demonstrate that many computational procedures and intriguing questions of computer science arise from theorems and proofs. Algorithms are presented in pseudocode, so that students can immediately write computer

  5. Multi-objective optimization in computer networks using metaheuristics

    CERN Document Server

    Donoso, Yezid

    2007-01-01

    Metaheuristics are widely used to solve important practical combinatorial optimization problems. Many new multicast applications emerging from the Internet-such as TV over the Internet, radio over the Internet, and multipoint video streaming-require reduced bandwidth consumption, end-to-end delay, and packet loss ratio. It is necessary to design and to provide for these kinds of applications as well as for those resources necessary for functionality. Multi-Objective Optimization in Computer Networks Using Metaheuristics provides a solution to the multi-objective problem in routing computer networks. It analyzes layer 3 (IP), layer 2 (MPLS), and layer 1 (GMPLS and wireless functions). In particular, it assesses basic optimization concepts, as well as several techniques and algorithms for the search of minimals; examines the basic multi-objective optimization concepts and the way to solve them through traditional techniques and through several metaheuristics; and demonstrates how to analytically model the compu...

  6. Advances in neural networks computational intelligence for ICT

    CERN Document Server

    Esposito, Anna; Morabito, Francesco; Pasero, Eros

    2016-01-01

    This carefully edited book puts emphasis on computational and artificial intelligent methods for learning and their relative applications in robotics, embedded systems, and ICT interfaces for psychological and neurological diseases. The book is a follow-up of the scientific workshop on Neural Networks (WIRN 2015) held in Vietri sul Mare, Italy, from the 20th to the 22nd of May 2015. The workshop, at its 27th edition, became a traditional scientific event that brought together scientists from many countries, and several scientific disciplines. Each chapter is an extended version of the original contribution presented at the workshop, and together with the reviewers’ peer revisions it also benefits from the live discussion during the presentation. The content of the book is organized in the following sections. 1. Introduction, 2. Machine Learning, 3. Artificial Neural Networks: Algorithms and models, 4. Intelligent Cyberphysical and Embedded System, 5. Computational Intelligence Methods for Biomedical ICT in...

  7. Computers and networks in the age of globalization

    DEFF Research Database (Denmark)

    Bloch Rasmussen, Leif; Beardon, Colin; Munari, Silvio

    their lives in a diversity of social and cultural contexts. In so doing, the book tries to imagine in what kind of networks humans may choose and act based on the knowledge and empirical evidence presented in the papers. The topics covered in the book include: people and their changing values; citizens...... in a network society; the individual and knowledge-based organizations; human responsibility and technology; and exclusion and regeneration. This volume contains the edited proceedings of the Fifth World Conference on Human Choice and Computers (HCC-5), which was sponsored by the International Federation...... for Information Processing (IFIP) and held in Geneva, Switzerland in August 1998. Since the first HCC conference in 1974, IFIP's Technical Committee 9 has endeavoured to set the agenda for human choices and human actions vis-a-vis computers....

  8. CONCEPTUAL GENERALIZATION OF STRUCTURAL ORGANIZATION OF COMPUTER NETWORKS MEDICAL SCHOOL

    Directory of Open Access Journals (Sweden)

    O. P. Mintser

    2014-01-01

    Full Text Available The basic principles of the structural organization of computer networks in schools are presented. The questions of universities' integration into the modern infrastructure of the information society are substantiated. Details of the structural organization of computer networks are presented. The effectiveness of implementing automated library information systems is shown. A strong dynamic growth in the technical and personal readiness of students to use the virtual educational space is reported. In this regard, universities are required to provide advanced information filling of the educational environment of the modern virtual university, including multimedia resources for industry professional education programs. Based on the information and educational environments of universities' virtual representations, distributed resource centers should be formed; these will avoid duplication of effort in the development of innovative educational technologies, provide a mutual exchange of results, and further the development of open continuous professional education, ensuring accessibility, modularity and mobility in the training and retraining of specialists.

  9. Applications of Social Network Analysis

    Science.gov (United States)

    Thilagam, P. Santhi

    A social network [2] is a description of the social structure between actors, mostly persons, groups or organizations. It indicates the ways in which they are connected with each other by some relationship such as friendship, kinship, or financial exchange. In a nutshell, when a person uses already known or unknown people to create new contacts, it forms a social network. The social network is not a new concept; rather, it is formed when similar people interact with each other, directly or indirectly, to perform a particular task. Examples of social networks include friendship networks, collaboration networks, co-authorship networks, and co-employee networks, which depict direct interaction among people. There are also other forms of social networks, such as entertainment networks, business networks, citation networks, and hyperlink networks, in which interaction among people is indirect. Generally, social networks operate on many levels, from families up to the level of nations, and assist in improving interactive knowledge sharing, interoperability and collaboration.

  10. Analysis and monitoring design for networks

    Energy Technology Data Exchange (ETDEWEB)

    Fedorov, V.; Flanagan, D.; Rowan, T.; Batsell, S.

    1998-06-01

    The idea of applying experimental design methodologies to develop monitoring systems for computer networks is relatively novel even though it has been applied in other areas such as meteorology, seismology, and transportation. One objective of a monitoring system should always be to collect as little data as necessary to be able to monitor specific parameters of the system with respect to assigned targets and objectives. This implies a purposeful monitoring where each piece of data has a reason to be collected and stored for future use. When a computer network system as large and complex as the Internet is the monitoring subject, providing an optimal and parsimonious observing system becomes even more important. Many data collection decisions must be made by the developers of a monitoring system. These decisions include but are not limited to the following: (1) The type of data collection hardware and software instruments to be used; (2) How to minimize interruption of regular network activities during data collection; (3) Quantification of the objectives and the formulation of optimality criteria; (4) The placement of data collection hardware and software devices; (5) The amount of data to be collected in a given time period, how large a subset of the available data to collect during the period, the length of the period, and the frequency of data collection; (6) The determination of the data to be collected (for instance, selection of response and explanatory variables); (7) Which data will be retained and how long (i.e., data storage and retention issues); and (8) The cost analysis of experiments. Mathematical statistics, and, in particular, optimal experimental design methods, may be used to address the majority of problems generated by 3--7. In this study, the authors focus their efforts on topics 3--5.

  11. "Us and them": a social network analysis of physicians' professional networks and their attitudes towards EBM.

    Science.gov (United States)

    Mascia, Daniele; Cicchetti, Americo; Damiani, Gianfranco

    2013-10-22

    Extant research suggests that there is a strong social component to Evidence-Based Medicine (EBM) adoption since professional networks amongst physicians are strongly associated with their attitudes towards EBM. Despite this evidence, it is still unknown whether individual attitudes to use scientific evidence in clinical decision-making influence the position that physicians hold in their professional network. This paper explores how physicians' attitudes towards EBM is related to the network position they occupy within healthcare organizations. Data pertain to a sample of Italian physicians, whose professional network relationships, demographics and work-profile characteristics were collected. A social network analysis was performed to capture the structural importance of physicians in the collaboration network by the means of a core-periphery analysis and the computation of network centrality indicators. Then, regression analysis was used to test the association between the network position of individual clinicians and their attitudes towards EBM. Findings documented that the overall network structure is made up of a dense cohesive core of physicians and of less connected clinicians who occupy the periphery. A negative association between the physicians' attitudes towards EBM and the coreness they exhibited in the professional network was also found. Network centrality indicators confirmed these results documenting a negative association between physicians' propensity to use EBM and their structural importance in the professional network. Attitudes that physicians show towards EBM are related to the part (core or periphery) of the professional networks to which they belong as well as to their structural importance. By identifying virtuous attitudes and behaviors of professionals within their organizations, policymakers and executives may avoid marginalization and stimulate integration and continuity of care, both within and across the boundaries of healthcare
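
    The core-periphery idea described above can be sketched with a simple stand-in: use k-core membership as a proxy for coreness and relate it to an attitude score. The physician network and the EBM attitude scores below are made up, and the paper's actual core-periphery and regression procedures are not reproduced.

      # Hedged sketch: relating structural position to an attitude score.
      import networkx as nx
      import numpy as np

      ties = nx.Graph([(1, 2), (1, 3), (2, 3), (2, 4), (3, 4), (4, 5), (5, 6)])
      ebm_attitude = {1: 4.1, 2: 3.6, 3: 3.9, 4: 3.2, 5: 4.5, 6: 4.8}  # hypothetical

      coreness = nx.core_number(ties)   # higher value = deeper in the cohesive core
      x = np.array([coreness[i] for i in sorted(ties.nodes)])
      y = np.array([ebm_attitude[i] for i in sorted(ties.nodes)])

      slope, intercept = np.polyfit(x, y, 1)   # simple linear association
      print(f"coreness -> EBM attitude slope: {slope:.2f}")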

  12. Understanding complex interactions using social network analysis.

    Science.gov (United States)

    Pow, Janette; Gayen, Kaberi; Elliott, Lawrie; Raeside, Robert

    2012-10-01

    The aim of this paper is to raise awareness of social network analysis as a method to facilitate research in nursing. The application of social network analysis in assessing network properties has allowed greater insight to be gained in many areas including sociology, politics, business organisation and health care. However, the use of social networks in nursing has not received sufficient attention. Review of literature and illustration of the application of the method of social network analysis using research examples. First, the value of social networks will be discussed. Then by using illustrative examples, the value of social network analysis to nursing will be demonstrated. The method of social network analysis is found to give greater insights into social situations involving interactions between individuals and has particular application to the study of interactions between nurses and between nurses and patients and other actors. Social networks are systems in which people interact. Two quantitative techniques help our understanding of these networks. The first is visualisation of the network. The second is centrality. Individuals with high centrality are key communicators in a network. Applying social network analysis to nursing provides a simple method that helps gain an understanding of human interaction and how this might influence various health outcomes. It allows influential individuals (actors) to be identified. Their influence on the formation of social norms and communication can determine the extent to which new interventions or ways of thinking are accepted by a group. Thus, working with key individuals in a network could be critical to the success and sustainability of an intervention. Social network analysis can also help to assess the effectiveness of such interventions for the recipient and the service provider. © 2012 Blackwell Publishing Ltd.

  13. Enhancing the Understanding of Computer Networking Courses through Software Tools

    OpenAIRE

    Dafalla, Z. I.; Balaji, R. D.

    2015-01-01

    Computer networking is an important specialization in Information and Communication Technologies. However, imparting the right knowledge to students can be a challenging task due to the fact that there is not enough time to deliver lengthy labs during normal lecture hours. Augmenting the use of physical machines with software tools helps students learn beyond the limited lab sessions within the environment of higher institutions of learning throughout the world. The Institutions focus mo...

  14. Computers and networks in the age of globalization

    DEFF Research Database (Denmark)

    Bloch Rasmussen, Leif; Beardon, Colin; Munari, Silvio

    In modernity, an individual identity was constituted from civil society, while in a globalized network society, human identity, if it develops at all, must grow from communal resistance. A communal resistance to an abstract conceptualized world, where there is no possibility for perception...... their lives in a diversity of social and cultural contexts. In so doing, the book tries to imagine in what kind of networks humans may choose and act based on the knowledge and empirical evidence presented in the papers. The topics covered in the book include: people and their changing values; citizens...... in a network society; the individual and knowledge-based organizations; human responsibility and technology; and exclusion and regeneration. This volume contains the edited proceedings of the Fifth World Conference on Human Choice and Computers (HCC-5), which was sponsored by the International Federation...

  15. Computer simulation of randomly cross-linked polymer networks

    CERN Document Server

    Williams, T P

    2002-01-01

    In this work, Monte Carlo and Stochastic Dynamics computer simulations of mesoscale model randomly cross-linked networks were undertaken. Task parallel implementations of the lattice Monte Carlo Bond Fluctuation model and Kremer-Grest Stochastic Dynamics bead-spring continuum model were designed and used for this purpose. Lattice and continuum precursor melt systems were prepared and then cross-linked to varying degrees. The resultant networks were used to study structural changes during deformation and relaxation dynamics. The effects of a random network topology featuring a polydisperse distribution of strand lengths and an abundance of pendant chain ends were qualitatively compared to recently published work. A preliminary investigation into the effects of temperature on the structural and dynamical properties was also undertaken. Structural changes during isotropic swelling and uniaxial deformation revealed a pronounced non-affine deformation dependent on the degree of cross-linking. Fractal heterogeneiti...

  16. An Optimal Path Computation Architecture for the Cloud-Network on Software-Defined Networking

    Directory of Open Access Journals (Sweden)

    Hyunhun Cho

    2015-05-01

    Full Text Available Legacy networks do not expose precise information about the network domain for scalability, management and commercial reasons, making it very hard to compute an optimal path to the destination. To meet the new network requirements arising from today's changing ICT environment, the concept of software-defined networking (SDN) has been developed as a technological alternative to overcome the limitations of the legacy network structure and to introduce innovative concepts. The purpose of this paper is to propose an application that calculates the optimal paths for general data transmission and real-time audio/video transmission, which constitute the major services of the National Research & Education Network (NREN), in the SDN environment. The proposed SDN routing computation (SRC) application is designed and applied in a multi-domain network for the efficient use of resources, selection of the optimal path between the multi-domains and optimal establishment of end-to-end connections.
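
    The core of such a path computation can be illustrated with a shortest-path query over a topology graph whose edges carry a cost metric. The two-domain topology and the delay values below are hypothetical; an SRC-like application would obtain them from the SDN controller rather than hard-coding them.

      # Illustrative sketch: controller-side optimal-path computation across two domains.
      import networkx as nx

      topo = nx.Graph()
      topo.add_weighted_edges_from([           # (switch, switch, delay in ms)
          ("d1-s1", "d1-s2", 2), ("d1-s2", "d1-gw", 1),
          ("d1-s1", "d1-gw", 5), ("d1-gw", "d2-gw", 8),   # inter-domain link
          ("d2-gw", "d2-s1", 1), ("d2-s1", "d2-s2", 2), ("d2-gw", "d2-s2", 6),
      ], weight="delay")

      path = nx.shortest_path(topo, "d1-s1", "d2-s2", weight="delay")
      delay = nx.shortest_path_length(topo, "d1-s1", "d2-s2", weight="delay")
      print(path, f"total delay = {delay} ms")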

  17. Deep Space Network (DSN), Network Operations Control Center (NOCC) computer-human interfaces

    Science.gov (United States)

    Ellman, Alvin; Carlton, Magdi

    1993-01-01

    The Network Operations Control Center (NOCC) of the DSN is responsible for scheduling the resources of DSN, and monitoring all multi-mission spacecraft tracking activities in real-time. Operations performs this job with computer systems at JPL connected to over 100 computers at Goldstone, Australia and Spain. The old computer system became obsolete, and the first version of the new system was installed in 1991. Significant improvements for the computer-human interfaces became the dominant theme for the replacement project. Major issues required innovative problem solving. Among these issues were: How to present several thousand data elements on displays without overloading the operator? What is the best graphical representation of DSN end-to-end data flow? How to operate the system without memorizing mnemonics of hundreds of operator directives? Which computing environment will meet the competing performance requirements? This paper presents the technical challenges, engineering solutions, and results of the NOCC computer-human interface design.

  18. A Study of the Impact of Virtualization on the Computer Networks

    OpenAIRE

    Timalsena, Pratik

    2013-01-01

    Virtualization is an emerging sector of Information Technology in the present world. It is advancing and being popularly implemented worldwide. Computer networks are not isolated from the global impact of virtualization, and virtualization is being deployed on computer networks to a great extent. In general, virtualization is an inevitable tool for computer networks. This report presents a surface-level overview of the impact of virtualization on computer networks. The report...

  19. Statistical Analysis of Bus Networks in India

    CERN Document Server

    Chatterjee, Atanu; Ramadurai, Gitakrishnan

    2015-01-01

    Through the past decade the field of network science has established itself as a common ground for the cross-fertilization of exciting inter-disciplinary studies which has motivated researchers to model almost every physical system as an interacting network consisting of nodes and links. Although public transport networks such as airline and railway networks have been extensively studied, the status of bus networks still remains in obscurity. In developing countries like India, where bus networks play an important role in day-to-day commutation, it is of significant interest to analyze its topological structure and answer some of the basic questions on its evolution, growth, robustness and resiliency. In this paper, we model the bus networks of major Indian cities as graphs in L-space, and evaluate their various statistical properties using concepts from network science. Our analysis reveals a wide spectrum of network topology with the common underlying feature of small-world property. We observe tha...

  20. A Newly Developed Method for Computing Reliability Measures in a Water Supply Network

    Directory of Open Access Journals (Sweden)

    Jacek Malinowski

    2016-01-01

    Full Text Available A reliability model of a water supply network has been examined. Its main features are: a topology that can be decomposed by the so-called state factorization into a (relatively small) number of derivative networks, each having a series-parallel structure (1); binary-state components (either operative or failed) with given flow capacities (2); a multi-state character of the whole network and its sub-networks - a network state is defined as the maximal flow between a source (sources) and a sink (sinks) (3); all capacities (component, network, and sub-network) have integer values (4). As the network operates, its state changes due to component failures, repairs, and replacements. A newly developed method of computing the inter-state transition intensities has been presented. It is based on the so-called state factorization and series-parallel aggregation. The analysis of these intensities shows that the failure-repair process of the considered system is an asymptotically homogenous Markov process. It is also demonstrated how certain reliability parameters useful for the network maintenance planning can be determined on the basis of the asymptotic intensities. For better understanding of the presented method, an illustrative example is given. (original abstract)
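
    The series-parallel aggregation step mentioned above reduces to two elementary rules, sketched below for binary components with known availabilities; this shows only the aggregation arithmetic, not the paper's state-factorization method, and the component values are invented.

      # Minimal sketch of series-parallel aggregation of component availabilities.
      from math import prod

      def series(availabilities):
          """A series block works only if every component works."""
          return prod(availabilities)

      def parallel(availabilities):
          """A parallel block fails only if every component fails."""
          return 1.0 - prod(1.0 - a for a in availabilities)

      # Two parallel pipe branches feeding a common pump in series with them.
      branch_1 = series([0.98, 0.95])          # two segments in series
      branch_2 = 0.97                          # a single redundant segment
      system = series([parallel([branch_1, branch_2]), 0.99])  # plus the pump
      print(f"system availability ~ {system:.4f}")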

  1. Improving Family Forest Knowledge Transfer through Social Network Analysis

    Science.gov (United States)

    Gorczyca, Erika L.; Lyons, Patrick W.; Leahy, Jessica E.; Johnson, Teresa R.; Straub, Crista L.

    2012-01-01

    To better engage Maine's family forest landowners, our study used social network analysis: a computational social science method for identifying stakeholders, evaluating models of engagement, and targeting areas for enhanced partnerships. Interviews with researchers associated with a research center were conducted to identify how social network…

  2. Application of microarray analysis on computer cluster and cloud platforms.

    Science.gov (United States)

    Bernau, C; Boulesteix, A-L; Knaus, J

    2013-01-01

    Analysis of recent high-dimensional biological data tends to be computationally intensive as many common approaches such as resampling or permutation tests require the basic statistical analysis to be repeated many times. A crucial advantage of these methods is that they can be easily parallelized due to the computational independence of the resampling or permutation iterations, which has induced many statistics departments to establish their own computer clusters. An alternative is to rent computing resources in the cloud, e.g. at Amazon Web Services. In this article we analyze whether a selection of statistical projects, recently implemented at our department, can be efficiently realized on these cloud resources. Moreover, we illustrate an opportunity to combine computer cluster and cloud resources. In order to compare the efficiency of computer cluster and cloud implementations and their respective parallelizations we use microarray analysis procedures and compare their runtimes on the different platforms. Amazon Web Services provide various instance types which meet the particular needs of the different statistical projects we analyzed in this paper. Moreover, the network capacity is sufficient and the parallelization is comparable in efficiency to standard computer cluster implementations. Our results suggest that many statistical projects can be efficiently realized on cloud resources. It is important to mention, however, that workflows can change substantially as a result of a shift from computer cluster to cloud computing.
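
    The embarrassingly parallel pattern described above (independent resampling or permutation iterations) can be sketched with a local process pool; the same structure maps onto cluster or cloud workers. The two toy expression groups below are synthetic, not microarray data.

      # Hedged sketch: parallel permutation test with a multiprocessing pool.
      import numpy as np
      from multiprocessing import Pool

      rng = np.random.default_rng(0)
      group_a = rng.normal(0.0, 1.0, size=50)
      group_b = rng.normal(0.4, 1.0, size=50)
      observed = group_a.mean() - group_b.mean()
      pooled = np.concatenate([group_a, group_b])

      def permuted_diff(seed):
          """One permutation iteration: reshuffle labels, recompute the statistic."""
          local = np.random.default_rng(seed).permutation(pooled)
          return local[:50].mean() - local[50:].mean()

      if __name__ == "__main__":
          with Pool() as pool:
              null = pool.map(permuted_diff, range(10_000))
          p_value = np.mean(np.abs(null) >= abs(observed))
          print(f"permutation p-value: {p_value:.4f}")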

  3. The spread of computer viruses over a reduced scale-free network

    Science.gov (United States)

    Yang, Lu-Xing; Yang, Xiaofan

    2014-02-01

    Due to the high dimensionality of an epidemic model of computer viruses over a general scale-free network, it is difficult to make a close study of its dynamics. In particular, it is extremely difficult, if not impossible, to prove the global stability of its viral equilibrium, if any. To overcome this difficulty, we suggest simplifying a general scale-free network by partitioning all of its nodes into two classes: higher-degree nodes and lower-degree nodes, and then equating the degrees of all higher-degree nodes and all lower-degree nodes, respectively, yielding a reduced scale-free network. We then propose an epidemic model of computer viruses over a reduced scale-free network. A theoretical analysis reveals that the proposed model is bound to have a globally stable viral equilibrium, implying that any attempt to eradicate network viruses would prove unavailing. As a result, the next best thing we can do is to restrain virus prevalence. Based on an analysis of the impact of different model parameters on virus prevalence, some practicable measures are recommended to contain virus spreading. The work in this paper adequately justifies the idea of reduced scale-free networks.
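
    The two-class reduction described above can be illustrated with a generic two-class SIS-style compartmental model (one "hub" class, one "leaf" class) integrated numerically; the parameter values and the specific coupling are illustrative stand-ins, not the paper's model.

      # Hedged sketch: a two-class SIS-style model in the spirit of a reduced
      # scale-free network. Parameters are invented for illustration.
      from scipy.integrate import solve_ivp

      k_hub, k_leaf = 20.0, 3.0           # representative degrees of the two classes
      n_hub, n_leaf = 0.05, 0.95          # fractions of nodes in each class
      beta, gamma = 0.1, 0.3              # per-contact infection rate, cure rate
      mean_k = n_hub * k_hub + n_leaf * k_leaf

      def rhs(t, y):
          i_hub, i_leaf = y
          # probability that a randomly followed edge points to an infected node
          theta = (n_hub * k_hub * i_hub + n_leaf * k_leaf * i_leaf) / mean_k
          return [beta * k_hub * (1 - i_hub) * theta - gamma * i_hub,
                  beta * k_leaf * (1 - i_leaf) * theta - gamma * i_leaf]

      sol = solve_ivp(rhs, (0, 200), [0.01, 0.01])
      print(f"endemic infection levels: hubs ~ {sol.y[0, -1]:.2f}, "
            f"leaves ~ {sol.y[1, -1]:.2f}")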

  4. Identification of Conserved Moieties in Metabolic Networks by Graph Theoretical Analysis of Atom Transition Networks

    Science.gov (United States)

    Haraldsdóttir, Hulda S.; Fleming, Ronan M. T.

    2016-01-01

    Conserved moieties are groups of atoms that remain intact in all reactions of a metabolic network. Identification of conserved moieties gives insight into the structure and function of metabolic networks and facilitates metabolic modelling. All moiety conservation relations can be represented as nonnegative integer vectors in the left null space of the stoichiometric matrix corresponding to a biochemical network. Algorithms exist to compute such vectors based only on reaction stoichiometry but their computational complexity has limited their application to relatively small metabolic networks. Moreover, the vectors returned by existing algorithms do not, in general, represent conservation of a specific moiety with a defined atomic structure. Here, we show that identification of conserved moieties requires data on reaction atom mappings in addition to stoichiometry. We present a novel method to identify conserved moieties in metabolic networks by graph theoretical analysis of their underlying atom transition networks. Our method returns the exact group of atoms belonging to each conserved moiety as well as the corresponding vector in the left null space of the stoichiometric matrix. It can be implemented as a pipeline of polynomial time algorithms. Our implementation completes in under five minutes on a metabolic network with more than 4,000 mass balanced reactions. The scalability of the method enables extension of existing applications for moiety conservation relations to genome-scale metabolic networks. We also give examples of new applications made possible by elucidating the atomic structure of conserved moieties. PMID:27870845
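
    The stoichiometry-only part of the problem (left null space vectors of the stoichiometric matrix) can be sketched symbolically; the paper's contribution goes further by using atom mappings to attach an atomic structure to each moiety. The three-metabolite toy network below is made up.

      # Hedged sketch: left null space of a small stoichiometric matrix with sympy,
      # i.e. candidate conservation relations from stoichiometry alone.
      import sympy as sp
      from math import lcm

      # Rows: metabolites A, B, C; columns: reactions A -> B and B -> C.
      S = sp.Matrix([
          [-1,  0],
          [ 1, -1],
          [ 0,  1],
      ])

      left_null = S.T.nullspace()          # vectors l with l^T * S = 0
      for v in left_null:
          # clear denominators so the conservation relation has integer coefficients
          denominators = [int(sp.fraction(x)[1]) for x in v]
          scaled = v * lcm(*denominators)
          print(list(scaled))              # e.g. [1, 1, 1]: total A + B + C is conserved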

  5. Simulated, Emulated, and Physical Investigative Analysis (SEPIA) of networked systems.

    Energy Technology Data Exchange (ETDEWEB)

    Burton, David P.; Van Leeuwen, Brian P.; McDonald, Michael James; Onunkwo, Uzoma A.; Tarman, Thomas David; Urias, Vincent E.

    2009-09-01

    This report describes recent progress made in developing and utilizing hybrid Simulated, Emulated, and Physical Investigative Analysis (SEPIA) environments. Many organizations require advanced tools to analyze their information system's security, reliability, and resilience against cyber attack. Today's security analysis utilize real systems such as computers, network routers and other network equipment, computer emulations (e.g., virtual machines) and simulation models separately to analyze interplay between threats and safeguards. In contrast, this work developed new methods to combine these three approaches to provide integrated hybrid SEPIA environments. Our SEPIA environments enable an analyst to rapidly configure hybrid environments to pass network traffic and perform, from the outside, like real networks. This provides higher fidelity representations of key network nodes while still leveraging the scalability and cost advantages of simulation tools. The result is to rapidly produce large yet relatively low-cost multi-fidelity SEPIA networks of computers and routers that let analysts quickly investigate threats and test protection approaches.

  6. A computational model of hemodynamic parameters in cortical capillary networks.

    Science.gov (United States)

    Safaeian, Navid; Sellier, Mathieu; David, Tim

    2011-02-21

    The analysis of hemodynamic parameters and functional reactivity of cerebral capillaries is still controversial. To assess the hemodynamic parameters in the cortical capillary network, a generic model was created using 2D Voronoi tessellation in which each edge represents a capillary segment. This method is capable of creating an appropriate generic model of the cerebral capillary network relating to each part of the brain cortex because the geometric model is able to vary the capillary density. The modeling presented here is based on morphometric parameters extracted from physiological data of the human cortex. The pertinent hemodynamic parameters were obtained by numerical simulation based on effective blood viscosity as a function of hematocrit and microvessel diameter, phase separation and plasma skimming effects. The hemodynamic parameters of capillary networks with two different densities (consistent with the variation of the morphometric data in the human cortical capillary network) were analyzed. The results show pertinent hemodynamic parameters for each model. The heterogeneity (coefficient of variation) and the mean value of hematocrits, flow rates and velocities of both network models were specified. The distributions of blood flow throughout both models seem to confirm the hypothesis that all capillaries in a cortical network are recruited at rest (normal condition). The results also demonstrate a discrepancy in network resistance between the two models, which derives from the difference in the number density of capillary segments between the models. Copyright © 2010 Elsevier Ltd. All rights reserved.
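
    The geometric step (a Voronoi tessellation whose edges stand in for capillary segments) can be sketched as follows; this covers only the network-generation part, with no hemodynamics, viscosity or phase-separation modelling, and the seed-point density is an arbitrary example.

      # Hedged sketch: a 2D Voronoi-based graph whose edges play the role of
      # capillary segments.
      import numpy as np
      import networkx as nx
      from scipy.spatial import Voronoi

      rng = np.random.default_rng(1)
      seeds = rng.uniform(0.0, 1.0, size=(60, 2))   # seed density sets capillary density
      vor = Voronoi(seeds)

      capillaries = nx.Graph()
      for v1, v2 in vor.ridge_vertices:
          if v1 == -1 or v2 == -1:                  # skip unbounded ridges
              continue
          length = np.linalg.norm(vor.vertices[v1] - vor.vertices[v2])
          capillaries.add_edge(v1, v2, length=length)

      lengths = [d["length"] for _, _, d in capillaries.edges(data=True)]
      print(f"{capillaries.number_of_nodes()} junctions, "
            f"{capillaries.number_of_edges()} segments, "
            f"mean segment length {np.mean(lengths):.3f}")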

  7. Egocentric social network analysis of pathological gambling.

    Science.gov (United States)

    Meisel, Matthew K; Clifton, Allan D; Mackillop, James; Miller, Joshua D; Campbell, W Keith; Goodie, Adam S

    2013-03-01

    Applying social network analysis (SNA) to investigate whether the frequency and severity of gambling problems are associated with different network characteristics among friends, family and co-workers is an innovative way to look at relationships among individuals; the current study was, to our knowledge, the first to apply SNA to gambling behaviors. Egocentric social network analysis was used to characterize formally the relationships between social network characteristics and gambling pathology. Laboratory-based questionnaire and interview administration. Forty frequent gamblers (22 non-pathological gamblers, 18 pathological gamblers) were recruited from the community. The SNA revealed significant social network compositional differences between the two groups: pathological gamblers (PGs) had more gamblers, smokers and drinkers in their social networks than did non-pathological gamblers (NPGs). PGs had more individuals in their network with whom they personally gambled, smoked and drank than did NPGs. Network ties were closer to individuals in their networks who gambled, smoked and drank more frequently. Associations between gambling severity and structural network characteristics were not significant. Pathological gambling is associated with compositional but not structural differences in social networks. Pathological gamblers differ from non-pathological gamblers in the number of gamblers, smokers and drinkers in their social networks. Homophily within the networks also indicates that gamblers tend to be closer with other gamblers. This homophily may serve to reinforce addictive behaviors, and may suggest avenues for future study or intervention. © 2012 The Authors, Addiction © 2012 Society for the Study of Addiction.

  8. Using new edges for anomaly detection in computer networks

    Science.gov (United States)

    Neil, Joshua Charles

    2015-05-19

    Creation of new edges in a network may be used as an indication of a potential attack on the network. Historical data of a frequency with which nodes in a network create and receive new edges may be analyzed. Baseline models of behavior among the edges in the network may be established based on the analysis of the historical data. A new edge that deviates from a respective baseline model by more than a predetermined threshold during a time window may be detected. The new edge may be flagged as potentially anomalous when the deviation from the respective baseline model is detected. Probabilities for both new and existing edges may be obtained for all edges in a path or other subgraph. The probabilities may then be combined to obtain a score for the path or other subgraph. A threshold may be obtained by calculating an empirical distribution of the scores under historical conditions.
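
    The idea of scoring new edges against historical behavior and combining edge scores over a path can be sketched as follows; the probability model, threshold and host names are simplistic placeholders, not the patented method.

      # Hedged sketch: score edges in an observed path by how surprising they are
      # given historical edge counts, then sum the scores over the path.
      import math
      from collections import Counter

      historical_edges = Counter({
          ("wkstn1", "fileserver"): 320, ("wkstn1", "mail"): 210,
          ("wkstn2", "fileserver"): 150, ("fileserver", "backup"): 400,
      })
      total = sum(historical_edges.values())

      def edge_surprise(edge, smoothing=0.5):
          """-log of a crudely smoothed historical probability of seeing this edge."""
          p = (historical_edges.get(edge, 0) + smoothing) / (
              total + smoothing * (len(historical_edges) + 1))
          return -math.log(p)

      def path_score(path):
          return sum(edge_surprise(e) for e in zip(path, path[1:]))

      suspect = ["wkstn2", "wkstn1", "mail", "domaincontroller"]   # contains new edges
      print(f"path score = {path_score(suspect):.1f}")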

  9. Using new edges for anomaly detection in computer networks

    Energy Technology Data Exchange (ETDEWEB)

    Neil, Joshua Charles

    2017-07-04

    Creation of new edges in a network may be used as an indication of a potential attack on the network. Historical data of a frequency with which nodes in a network create and receive new edges may be analyzed. Baseline models of behavior among the edges in the network may be established based on the analysis of the historical data. A new edge that deviates from a respective baseline model by more than a predetermined threshold during a time window may be detected. The new edge may be flagged as potentially anomalous when the deviation from the respective baseline model is detected. Probabilities for both new and existing edges may be obtained for all edges in a path or other subgraph. The probabilities may then be combined to obtain a score for the path or other subgraph. A threshold may be obtained by calculating an empirical distribution of the scores under historical conditions.

  10. Multivariate analysis: A statistical approach for computations

    Science.gov (United States)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, in education, in evaluating clusters in finance, and more recently in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in an image retrieval method and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult due to the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks on the network, such as DDoS attacks and network scanning.
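
    A correlation-matrix-based detector of the kind alluded to above can be sketched by comparing the correlation structure of a current traffic-feature window against a baseline window; the traffic features, injected anomaly and threshold below are synthetic and illustrative.

      # Hedged sketch: flag windows whose feature-correlation matrix deviates
      # strongly from a baseline correlation matrix.
      import numpy as np

      rng = np.random.default_rng(7)
      # columns: packets/s, bytes/s, flows/s measured over time
      baseline = rng.normal(size=(500, 3)) @ np.array([[1.0, 0.8, 0.2],
                                                       [0.0, 0.6, 0.1],
                                                       [0.0, 0.0, 0.5]])
      current = baseline[-200:].copy()
      current[:, 2] += rng.normal(scale=5.0, size=200)   # inject a scanning-like burst

      r_baseline = np.corrcoef(baseline, rowvar=False)
      r_current = np.corrcoef(current, rowvar=False)
      deviation = np.linalg.norm(r_current - r_baseline)  # Frobenius norm of the change

      print(f"correlation-matrix deviation = {deviation:.2f}",
            "-> anomalous" if deviation > 0.5 else "-> normal")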

  11. Optimal control strategy for a novel computer virus propagation model on scale-free networks

    Science.gov (United States)

    Zhang, Chunming; Huang, Haitao

    2016-06-01

    This paper aims to study the combined impact of reinstalling systems and network topology on the spread of computer viruses over the Internet. Based on a scale-free network, this paper proposes a novel computer virus propagation model, the SLBOS model. A systematic analysis of this new model shows that the virus-free equilibrium is globally asymptotically stable when its spreading threshold is less than one; nevertheless, it is proved that the viral equilibrium is permanent if the spreading threshold is greater than one. Then, the impacts of different model parameters on the spreading threshold are analyzed. Next, an optimally controlled SLBOS epidemic model on complex networks is also studied. We prove that an optimal control exists for the control problem. Some numerical simulations are finally given to illustrate the main results.

  12. Detecting Distributed Network Traffic Anomaly with Network-Wide Correlation Analysis

    Science.gov (United States)

    Zonglin, Li; Guangmin, Hu; Xingmiao, Yao; Dan, Yang

    2008-12-01

    Distributed network traffic anomaly refers to a traffic abnormal behavior involving many links of a network and caused by the same source (e.g., DDoS attack, worm propagation). The anomaly transiting in a single link might be unnoticeable and hard to detect, while the anomalous aggregation from many links can be prevailing, and does more harm to the networks. Aiming at the similar features of distributed traffic anomaly on many links, this paper proposes a network-wide detection method by performing anomalous correlation analysis of traffic signals' instantaneous parameters. In our method, traffic signals' instantaneous parameters are firstly computed, and their network-wide anomalous space is then extracted via traffic prediction. Finally, an anomaly is detected by a global correlation coefficient of anomalous space. Our evaluation using Abilene traffic traces demonstrates the excellent performance of this approach for distributed traffic anomaly detection.

  13. Detecting Distributed Network Traffic Anomaly with Network-Wide Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Yang Dan

    2008-12-01

    Full Text Available Distributed network traffic anomaly refers to a traffic abnormal behavior involving many links of a network and caused by the same source (e.g., DDoS attack, worm propagation). The anomaly transiting in a single link might be unnoticeable and hard to detect, while the anomalous aggregation from many links can be prevailing, and does more harm to the networks. Aiming at the similar features of distributed traffic anomaly on many links, this paper proposes a network-wide detection method by performing anomalous correlation analysis of traffic signals' instantaneous parameters. In our method, traffic signals' instantaneous parameters are firstly computed, and their network-wide anomalous space is then extracted via traffic prediction. Finally, an anomaly is detected by a global correlation coefficient of anomalous space. Our evaluation using Abilene traffic traces demonstrates the excellent performance of this approach for distributed traffic anomaly detection.

  14. Satellite image analysis using neural networks

    Science.gov (United States)

    Sheldon, Roger A.

    1990-01-01

    The tremendous backlog of unanalyzed satellite data necessitates the development of improved methods for data cataloging and analysis. Ford Aerospace has developed an image analysis system, SIANN (Satellite Image Analysis using Neural Networks) that integrates the technologies necessary to satisfy NASA's science data analysis requirements for the next generation of satellites. SIANN will enable scientists to train a neural network to recognize image data containing scenes of interest and then rapidly search data archives for all such images. The approach combines conventional image processing technology with recent advances in neural networks to provide improved classification capabilities. SIANN allows users to proceed through a four step process of image classification: filtering and enhancement, creation of neural network training data via application of feature extraction algorithms, configuring and training a neural network model, and classification of images by application of the trained neural network. A prototype experimentation testbed was completed and applied to climatological data.

  15. Open Problems in Network-aware Data Management in Exa-scale Computing and Terabit Networking Era

    Energy Technology Data Exchange (ETDEWEB)

    Balman, Mehmet; Byna, Surendra

    2011-12-06

    Accessing and managing large amounts of data is a great challenge in collaborative computing environments where resources and users are geographically distributed. Recent advances in network technology led to next-generation high-performance networks, allowing high-bandwidth connectivity. Efficient use of the network infrastructure is necessary in order to address the increasing data and compute requirements of large-scale applications. We discuss several open problems, evaluate emerging trends, and articulate our perspectives in network-aware data management.

  16. Transportation Research & Analysis Computing Center

    Data.gov (United States)

    Federal Laboratory Consortium — The technical objectives of the TRACC project included the establishment of a high performance computing center for use by USDOT research teams, including those from...

  17. Social Network Analysis and informal trade

    DEFF Research Database (Denmark)

    Walther, Olivier

    networks can be applied to better understand informal trade in developing countries, with a particular focus on Africa. The paper starts by discussing some of the fundamental concepts developed by social network analysis. Through a number of case studies, we show how social network analysis can...... illuminate the relevant causes of social patterns, the impact of social ties on economic performance, the diffusion of resources and information, and the exercise of power. The paper then examines some of the methodological challenges of social network analysis and how it can be combined with other...

  18. Numerical Analysis of Multiscale Computations

    CERN Document Server

    Engquist, Björn; Tsai, Yen-Hsi R

    2012-01-01

    This book is a snapshot of current research in multiscale modeling, computations and applications. It covers fundamental mathematical theory, numerical algorithms as well as practical computational advice for analysing single and multiphysics models containing a variety of scales in time and space. Complex fluids, porous media flow and oscillatory dynamical systems are treated in some extra depth, as well as tools like analytical and numerical homogenization, and fast multipole method.

  19. Social network analysis and supply chain management

    Directory of Open Access Journals (Sweden)

    Raúl Rodríguez Rodríguez

    2016-01-01

Full Text Available This paper deals with social network analysis and how it could be integrated within supply chain management from a decision-making point of view. Even though the benefits of using social network analysis are widely accepted in both academic and industry/services contexts, there is still a lack of solid frameworks that allow decision-makers to connect the usage and obtained results of social network analysis – mainly both information and knowledge flows and derived results – with supply chain management objectives and goals. This paper gives an overview of social network analysis, the main social network analysis metrics, supply chain performance and, finally, it identifies how future frameworks could close the gap and link the results of social network analysis with the supply chain management decision-making processes.

  20. GRETNA: a graph theoretical network analysis toolbox for imaging connectomics

    Directory of Open Access Journals (Sweden)

Jinhui Wang

    2015-06-01

Full Text Available Recent studies have suggested that the brain’s structural and functional networks (i.e., connectomics) can be constructed by various imaging technologies (e.g., EEG/MEG; structural, diffusion and functional MRI) and further characterized by graph theory. Given the huge complexity of network construction, analysis and statistics, toolboxes incorporating these functions are largely lacking. Here, we developed the GRaph thEoreTical Network Analysis (GRETNA) toolbox for imaging connectomics. The GRETNA contains several key features as follows: (i) an open-source, Matlab-based, cross-platform (Windows and UNIX OS) package with a graphical user interface; (ii) allowing topological analyses of global and local network properties with parallel computing ability, independent of imaging modality and species; (iii) providing flexible manipulations in several key steps during network construction and analysis, which include network node definition, network connectivity processing, network type selection and choice of thresholding procedure; (iv) allowing statistical comparisons of global, nodal and connectional network metrics and assessments of relationship between these network metrics and clinical or behavioral variables of interest; and (v) including functionality in image preprocessing and network construction based on resting-state functional MRI (R-fMRI) data. After applying the GRETNA to a publicly released R-fMRI dataset of 54 healthy young adults, we demonstrated that human brain functional networks exhibit efficient small-world, assortative, hierarchical and modular organizations and possess highly connected hubs and that these findings are robust against different analytical strategies. With these efforts, we anticipate that GRETNA will accelerate imaging connectomics in an easy, quick and flexible manner. GRETNA is freely available on the NITRC website (http://www.nitrc.org/projects/gretna/).
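
    The flavour of the thresholding and graph-metric steps described above can be sketched in a few lines of Python with NetworkX (GRETNA itself is a Matlab toolbox; the random time series, the 10% proportional threshold and the choice of metrics below are purely illustrative).

      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(1)
      signals = rng.normal(size=(90, 200))        # toy stand-in for 90 regional R-fMRI time series
      conn = np.corrcoef(signals)                 # functional connectivity matrix
      np.fill_diagonal(conn, 0)

      # Proportional thresholding: keep the strongest 10% of connections (sparsity is illustrative).
      sparsity = 0.10
      cutoff = np.quantile(np.abs(conn[np.triu_indices(90, k=1)]), 1 - sparsity)
      G = nx.from_numpy_array((np.abs(conn) >= cutoff).astype(int))

      # Global topological properties of the thresholded network.
      print("density:", round(nx.density(G), 3))
      print("mean clustering:", round(nx.average_clustering(G), 3))
      if nx.is_connected(G):
          print("characteristic path length:", round(nx.average_shortest_path_length(G), 3))

      # Nodal properties: candidate hubs ranked by degree.
      degree = dict(G.degree())
      print("top-degree nodes:", sorted(degree, key=degree.get, reverse=True)[:5])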

  1. GRETNA: a graph theoretical network analysis toolbox for imaging connectomics.

    Science.gov (United States)

    Wang, Jinhui; Wang, Xindi; Xia, Mingrui; Liao, Xuhong; Evans, Alan; He, Yong

    2015-01-01

    Recent studies have suggested that the brain's structural and functional networks (i.e., connectomics) can be constructed by various imaging technologies (e.g., EEG/MEG; structural, diffusion and functional MRI) and further characterized by graph theory. Given the huge complexity of network construction, analysis and statistics, toolboxes incorporating these functions are largely lacking. Here, we developed the GRaph thEoreTical Network Analysis (GRETNA) toolbox for imaging connectomics. The GRETNA contains several key features as follows: (i) an open-source, Matlab-based, cross-platform (Windows and UNIX OS) package with a graphical user interface (GUI); (ii) allowing topological analyses of global and local network properties with parallel computing ability, independent of imaging modality and species; (iii) providing flexible manipulations in several key steps during network construction and analysis, which include network node definition, network connectivity processing, network type selection and choice of thresholding procedure; (iv) allowing statistical comparisons of global, nodal and connectional network metrics and assessments of relationship between these network metrics and clinical or behavioral variables of interest; and (v) including functionality in image preprocessing and network construction based on resting-state functional MRI (R-fMRI) data. After applying the GRETNA to a publicly released R-fMRI dataset of 54 healthy young adults, we demonstrated that human brain functional networks exhibit efficient small-world, assortative, hierarchical and modular organizations and possess highly connected hubs and that these findings are robust against different analytical strategies. With these efforts, we anticipate that GRETNA will accelerate imaging connectomics in an easy, quick and flexible manner. GRETNA is freely available on the NITRC website.

  2. Line-plane broadcasting in a data communications network of a parallel computer

    Science.gov (United States)

    Archer, Charles J.; Berg, Jeremy E.; Blocksome, Michael A.; Smith, Brian E.

    2010-06-08

    Methods, apparatus, and products are disclosed for line-plane broadcasting in a data communications network of a parallel computer, the parallel computer comprising a plurality of compute nodes connected together through the network, the network optimized for point to point data communications and characterized by at least a first dimension, a second dimension, and a third dimension, that include: initiating, by a broadcasting compute node, a broadcast operation, including sending a message to all of the compute nodes along an axis of the first dimension for the network; sending, by each compute node along the axis of the first dimension, the message to all of the compute nodes along an axis of the second dimension for the network; and sending, by each compute node along the axis of the second dimension, the message to all of the compute nodes along an axis of the third dimension for the network.
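
    A toy Python model of the three broadcast phases claimed above, assuming a simple X-by-Y-by-Z mesh addressed by coordinates; it only counts point-to-point sends and does not model the actual torus hardware or the patented implementation.

      # Simulate line-plane broadcasting on an X x Y x Z mesh of compute nodes.
      X, Y, Z = 4, 3, 2
      root = (0, 0, 0)
      received, sends = {root}, 0

      # Phase 1: the broadcasting node sends along the axis of the first dimension.
      for x in range(X):
          if (x, 0, 0) not in received:
              received.add((x, 0, 0)); sends += 1

      # Phase 2: each node on that line sends along the axis of the second dimension.
      for x in range(X):
          for y in range(Y):
              if (x, y, 0) not in received:
                  received.add((x, y, 0)); sends += 1

      # Phase 3: each node on that plane sends along the axis of the third dimension.
      for x in range(X):
          for y in range(Y):
              for z in range(Z):
                  if (x, y, z) not in received:
                      received.add((x, y, z)); sends += 1

      assert len(received) == X * Y * Z
      print(f"all {len(received)} nodes reached with {sends} point-to-point sends")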

  3. Advanced Scientific Computing Research Network Requirements: ASCR Network Requirements Review Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Bacon, Charles [Argonne National Lab. (ANL), Argonne, IL (United States); Bell, Greg [ESnet, Berkeley, CA (United States); Canon, Shane [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dart, Eli [ESnet, Berkeley, CA (United States); Dattoria, Vince [Dept. of Energy (DOE), Washington DC (United States). Office of Science. Advanced Scientific Computing Research (ASCR); Goodwin, Dave [Dept. of Energy (DOE), Washington DC (United States). Office of Science. Advanced Scientific Computing Research (ASCR); Lee, Jason [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hicks, Susan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Holohan, Ed [Argonne National Lab. (ANL), Argonne, IL (United States); Klasky, Scott [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lauzon, Carolyn [Dept. of Energy (DOE), Washington DC (United States). Office of Science. Advanced Scientific Computing Research (ASCR); Rogers, Jim [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Shipman, Galen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Skinner, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Tierney, Brian [ESnet, Berkeley, CA (United States)

    2013-03-08

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.

  4. Computationally efficient analysis procedure for frames with ...

    African Journals Online (AJOL)

A computationally efficient analytical procedure that provides high-quality analysis results for two-dimensional skeletal structures with segmented (stepped) and linearly tapered non-prismatic flexural members has been developed, based on the stiffness method of structural analysis. A computer program coded in FORTRAN ...

  5. Computational Intelligence in Intelligent Data Analysis

    CERN Document Server

    Nürnberger, Andreas

    2013-01-01

Complex systems and their phenomena are ubiquitous as they can be found in biology, finance, the humanities, management sciences, medicine, physics and similar fields. For many problems in these fields, there are no conventional ways to mathematically or analytically solve them completely at low cost. On the other hand, nature has already solved many optimization problems efficiently. Computational intelligence attempts to mimic nature-inspired problem-solving strategies and methods. These strategies can be used to study, model and analyze complex systems such that it becomes feasible to handle them. Key areas of computational intelligence are artificial neural networks, evolutionary computation and fuzzy systems. As one of only a few researchers in that field, Rudolf Kruse has contributed in many important ways to the understanding, modeling and application of computational intelligence methods. On the occasion of his 60th birthday, a collection of original papers of leading researchers in the field of computational intell...

  6. Development of small scale cluster computer for numerical analysis

    Science.gov (United States)

    Zulkifli, N. H. N.; Sapit, A.; Mohammed, A. N.

    2017-09-01

In this study, two personal computers were successfully networked together to form a small-scale cluster. Each of the processors involved is a multicore processor with four cores, giving the cluster eight processor cores in total. The cluster runs an Ubuntu 14.04 Linux environment with an MPI implementation (MPICH2). Two main tests were conducted in order to test the cluster: a communication test and a performance test. The communication test was done to make sure that the computers are able to pass the required information without any problem, using a simple MPI Hello program written in C. Additionally, a performance test was done to prove that the cluster's calculation performance is much better than that of a single-CPU computer. In this performance test, four tests were done by running the same code using a single node, 2 processors, 4 processors, and 8 processors. The results show that with additional processors, the time required to solve the problem decreases; the calculation time is roughly halved when the number of processors is doubled. To conclude, we successfully developed a small-scale cluster computer using common hardware which is capable of higher computing power compared to a single-CPU computer, and this can be beneficial for research that requires high computing power, especially numerical analysis such as finite element analysis, computational fluid dynamics, and computational physics analysis.
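
    A hedged sketch of the two tests described above using mpi4py instead of the authors' C programs; the file name, the workload and the use of a reduction are illustrative. It could be launched with, e.g., `mpiexec -n 8 python cluster_test.py`.

      from mpi4py import MPI
      import numpy as np

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()
      size = comm.Get_size()

      # Communication test: every rank reports in (the "MPI Hello" check).
      print(f"Hello from rank {rank} of {size}")

      # Performance test: split a numerical workload across ranks and time it.
      N = 10_000_000
      chunk = N // size
      start = MPI.Wtime()
      # Each rank sums the squares of its own slice of the index range.
      local = np.sum(np.arange(rank * chunk, (rank + 1) * chunk, dtype=np.float64) ** 2)
      total = comm.reduce(local, op=MPI.SUM, root=0)
      elapsed = MPI.Wtime() - start

      if rank == 0:
          print(f"{size} process(es): result={total:.3e}, time={elapsed:.3f} s")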

  7. METHODOLOGY OF MATHEMATICAL ANALYSIS IN POWER NETWORK

    OpenAIRE

    Jerzy Szkutnik; Mariusz Kawecki

    2008-01-01

Power distribution network analysis is considered. Based on the correlation coefficient, the authors establish a methodology of mathematical analysis useful in finding the substations that bear responsibility for power stoppages. A methodology of risk assessment is also carried out.

  8. Emulation of the Active Immune Response in a Computer Network

    Science.gov (United States)

    2009-01-15

    there exist a number of methods connected to processes of optimization intended to solve several problems including immunotherapy and immuno ...researchers and security analysts to respond faster in order to keep up with these attacks. New approaches for network security analysis, reactive and

  9. Recurrent neural networks in computer-based clinical decision support for laryngopathies: an experimental study.

    Science.gov (United States)

    Szkoła, Jarosław; Pancerz, Krzysztof; Warchoł, Jan

    2011-01-01

The main goal of this paper is to give the basis for creating a computer-based clinical decision support (CDS) system for laryngopathies. One of the approaches that can be used in the proposed CDS is based on speech signal analysis using recurrent neural networks (RNNs). RNNs can be used for pattern recognition in time series data due to their ability to memorize information from the past. The Elman networks (ENs) are a classical representative of RNNs. To improve the learning ability of ENs, we may modify them and combine them with another kind of RNN, namely the Jordan networks. The modified Elman-Jordan networks (EJNs) achieve the target pattern faster and more exactly. Validation experiments were carried out on speech signals of patients from the control group and with two kinds of laryngopathies.
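
    A minimal NumPy sketch of the Elman-style recurrence behind the classifiers discussed above: a context layer feeds the previous hidden state back into the hidden layer at every time step. The weights are random, the "speech" features are synthetic, and the paper's Elman-Jordan modification and training procedure are not reproduced.

      import numpy as np

      class ElmanRNN:
          """Tiny Elman network: hidden state at t depends on input t and hidden state t-1."""
          def __init__(self, n_in, n_hidden, n_out, seed=0):
              rng = np.random.default_rng(seed)
              self.W_in = rng.normal(0, 0.1, (n_hidden, n_in))
              self.W_ctx = rng.normal(0, 0.1, (n_hidden, n_hidden))   # context (recurrent) weights
              self.W_out = rng.normal(0, 0.1, (n_out, n_hidden))

          def forward(self, sequence):
              h = np.zeros(self.W_ctx.shape[0])                        # context units start at zero
              for x in sequence:
                  h = np.tanh(self.W_in @ x + self.W_ctx @ h)          # memory of past time steps
              return self.W_out @ h                                    # class scores from final state

      # Toy use: score a short "speech-like" feature sequence (random features here).
      rng = np.random.default_rng(1)
      net = ElmanRNN(n_in=12, n_hidden=20, n_out=2)
      frames = rng.normal(size=(50, 12))        # 50 frames of 12 features each
      print("class scores:", net.forward(frames))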

  10. Measuring Road Network Vulnerability with Sensitivity Analysis

    Science.gov (United States)

    Jun-qiang, Leng; Long-hai, Yang; Liu, Wei-yi; Zhao, Lin

    2017-01-01

This paper focuses on the development of a method for road network vulnerability analysis from the perspective of capacity degradation, which seeks to identify the critical infrastructures in the road network and the operational performance of the whole traffic system. This research involves defining a traffic utility index and modeling the vulnerability of road segments, routes, OD (Origin-Destination) pairs and the road network. Meanwhile, a sensitivity analysis method is utilized to calculate the change of the traffic utility index due to capacity degradation. This method, compared to traditional traffic assignment, can improve calculation efficiency and makes the application of vulnerability analysis to large actual road networks possible. Finally, all the above models and the calculation method are applied to an actual road network evaluation to verify their efficiency and utility. This approach can be used as a decision-supporting tool for evaluating the performance of road networks and identifying critical infrastructures in transportation planning and management, especially in the resource allocation for mitigation and recovery. PMID:28125706

  11. Analysis of Classical Encryption Techniques in Cloud Computing

    National Research Council Canada - National Science Library

Muhammad Yasir Shabir; Asif Iqbal; Zahid Mahmood; Ata Ullah Ghafoor

    2016-01-01

Cloud computing has become a significant computing model in the IT industry. In this emerging model, computing resources such as software, hardware, networking, and storage can be accessed anywhere in the world on a pay-per-use basis...

  12. Robust flux balance analysis of multiscale biochemical reaction networks.

    Science.gov (United States)

    Sun, Yuekai; Fleming, Ronan M T; Thiele, Ines; Saunders, Michael A

    2013-07-30

    Biological processes such as metabolism, signaling, and macromolecular synthesis can be modeled as large networks of biochemical reactions. Large and comprehensive networks, like integrated networks that represent metabolism and macromolecular synthesis, are inherently multiscale because reaction rates can vary over many orders of magnitude. They require special methods for accurate analysis because naive use of standard optimization systems can produce inaccurate or erroneously infeasible results. We describe techniques enabling off-the-shelf optimization software to compute accurate solutions to the poorly scaled optimization problems arising from flux balance analysis of multiscale biochemical reaction networks. We implement lifting techniques for flux balance analysis within the openCOBRA toolbox and demonstrate our techniques using the first integrated reconstruction of metabolism and macromolecular synthesis for E. coli. Our techniques enable accurate flux balance analysis of multiscale networks using off-the-shelf optimization software. Although we describe lifting techniques in the context of flux balance analysis, our methods can be used to handle a variety of optimization problems arising from analysis of multiscale network reconstructions.
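
    Flux balance analysis as used above boils down to a linear program: maximize an objective (e.g., biomass) flux subject to steady-state mass balance S·v = 0 and flux bounds. The toy stoichiometric matrix below is illustrative, and the paper's lifting techniques for badly scaled networks are not shown.

      import numpy as np
      from scipy.optimize import linprog

      # Toy network: uptake -> A, A -> B, B -> biomass.
      # Columns are reactions (fluxes v), rows are internal metabolites (A, B).
      S = np.array([
          [ 1, -1,  0],   # A: produced by uptake, consumed by conversion
          [ 0,  1, -1],   # B: produced by conversion, consumed by the biomass reaction
      ])
      c = np.array([0, 0, -1.0])          # linprog minimizes, so negate to maximize biomass flux v3
      bounds = [(0, 10), (0, 10), (0, 10)]

      res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
      print("optimal fluxes:", res.x)      # expected: uptake = conversion = biomass = 10
      print("max biomass flux:", -res.fun)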

  13. Constructing an Intelligent Patent Network Analysis Method

    OpenAIRE

    Chao-Chan Wu; Ching-Bang Yao

    2012-01-01

    Patent network analysis, an advanced method of patent analysis, is a useful tool for technology management. This method visually displays all the relationships among the patents and enables the analysts to intuitively comprehend the overview of a set of patents in the field of the technology being studied. Although patent network analysis possesses relative advantages different from traditional methods of patent analysis, it is subject to several crucial limitations. To overcome the drawbacks...

  14. Transcription regulatory networks analysis using CAGE

    KAUST Repository

    Tegnér, Jesper N.

    2009-10-01

Mapping out cellular networks in general and transcriptional networks in particular has proved to be a bottleneck hampering our understanding of biological processes. Integrative approaches fusing computational and experimental technologies for decoding transcriptional networks at a high level of resolution are therefore of utmost importance. Yet, this is challenging since the control of gene expression in eukaryotes is a complex multi-level process influenced by several epigenetic factors and the fine interplay between regulatory proteins and the promoter structure governing the combinatorial regulation of gene expression. In this chapter we review how CAGE data can be integrated with other measurements such as expression, physical interactions and computational prediction of regulatory motifs, which together can provide a genome-wide picture of eukaryotic transcriptional regulatory networks at a new level of resolution. © 2010 by Pan Stanford Publishing Pte. Ltd. All rights reserved.

  15. Adjustment computations spatial data analysis

    CERN Document Server

    Ghilani, Charles D

    2011-01-01

The complete guide to adjusting for measurement error, expanded and updated. No measurement is ever exact. Adjustment Computations updates a classic, definitive text on surveying with the latest methodologies and tools for analyzing and adjusting errors, with a focus on least squares adjustments, the most rigorous methodology available and the one on which accuracy standards for surveys are based. This extensively updated Fifth Edition shares new information on advances in modern software and GNSS-acquired data. Expanded sections offer a greater number of computable problems and their worked solutions.

  16. [Forensic evidence-based medicine in computer communication networks].

    Science.gov (United States)

    Qiu, Yun-Liang; Peng, Ming-Qi

    2013-12-01

As an important component of judicial expertise, forensic science is broad and highly specialized. With the development of network technology, the growth of information resources, and the improvement of people's legal awareness, forensic scientists encounter many new problems and have been required to meet higher evidentiary standards in litigation. In view of this, an evidence-based concept should be established in forensic medicine. We should find the most suitable methods in the forensic science field and other related areas to solve specific problems in an evidence-based mode. Evidence-based practice can solve the problems in the legal medical field, and it will play a great role in promoting the progress and development of forensic science. This article reviews the basic theory of evidence-based medicine and its effect, ways, methods, and evaluation in forensic medicine in order to discuss the application value of forensic evidence-based medicine in computer communication networks.

  17. Reducing Computational Overhead of Network Coding with Intrinsic Information Conveying

    DEFF Research Database (Denmark)

    Heide, Janus; Zhang, Qi; Pedersen, Morten V.

This paper investigated the possibility of intrinsic information conveying in network coding systems. The information is embedded into the coding vector by constructing the vector based on a set of predefined rules. This information can subsequently be retrieved by any receiver. The starting point...... is RLNC (Random Linear Network Coding) and the goal is to reduce the amount of coding operations both at the coding and decoding node, and at the same time remove the need for dedicated signaling messages. In a traditional RLNC system, the coding operation takes up significant computational resources and adds...... to the overall energy consumption, which is particularly problematic for mobile battery-driven devices. In RLNC, coding is performed over a FF (Finite Field). We propose to divide this field into sub fields, and let each sub field signify some information or state. In order to embed the information correctly......
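
    A toy GF(2) sketch of the RLNC encode/decode cycle whose computational cost motivates the paper: coded packets are random linear combinations of the originals, and the receiver decodes by Gaussian elimination once the collected coding vectors have full rank. The sub-field signalling scheme itself is not implemented; packet sizes and the choice of GF(2) are illustrative.

      import numpy as np

      rng = np.random.default_rng(0)
      N_PACKETS, PKT_LEN = 4, 8
      packets = rng.integers(0, 2, size=(N_PACKETS, PKT_LEN))     # original packets as bit vectors

      def encode_one(packets, rng):
          """One coded packet: a random GF(2) combination of the originals, plus its coding vector."""
          vector = rng.integers(0, 2, size=(1, packets.shape[0]))
          return vector, vector @ packets % 2

      def decode(vectors, coded):
          """Gaussian elimination over GF(2); returns None until the coding vectors have full rank."""
          A = np.concatenate([vectors, coded], axis=1) % 2
          n = vectors.shape[1]
          row = 0
          for col in range(n):
              pivot = next((r for r in range(row, A.shape[0]) if A[r, col]), None)
              if pivot is None:
                  return None                        # not enough independent combinations yet
              A[[row, pivot]] = A[[pivot, row]]
              for r in range(A.shape[0]):
                  if r != row and A[r, col]:
                      A[r] ^= A[row]
              row += 1
          return A[:n, n:]

      # A receiver keeps collecting coded packets until it can decode.
      vectors = np.empty((0, N_PACKETS), dtype=np.int64)
      coded = np.empty((0, PKT_LEN), dtype=np.int64)
      recovered = None
      while recovered is None:
          v, c = encode_one(packets, rng)
          vectors, coded = np.vstack([vectors, v]), np.vstack([coded, c])
          recovered = decode(vectors, coded)

      print("coded packets needed:", len(vectors))
      print("decoded correctly:", np.array_equal(recovered, packets))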

  18. Condor-COPASI: high-throughput computing for biochemical networks

    Directory of Open Access Journals (Sweden)

    Kent Edward

    2012-07-01

Full Text Available Background: Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary expertise. Results: We present Condor-COPASI, a server-based software tool that integrates COPASI, a biological pathway simulation tool, with Condor, a high-throughput computing environment. Condor-COPASI provides a web-based interface, which makes it extremely easy for a user to run a number of model simulation and analysis tasks in parallel. Tasks are transparently split into smaller parts, and submitted for execution on a Condor pool. Result output is presented to the user in a number of formats, including tables and interactive graphical displays. Conclusions: Condor-COPASI can effectively use a Condor high-throughput computing environment to provide significant gains in performance for a number of model simulation and analysis tasks. Condor-COPASI is free, open source software, released under the Artistic License 2.0, and is suitable for use by any institution with access to a Condor pool. Source code is freely available for download at http://code.google.com/p/condor-copasi/, along with full instructions on deployment and usage.

  19. Sensitivity analysis of linear programming problem through a recurrent neural network

    Science.gov (United States)

    Das, Raja

    2017-11-01

    In this paper we study the recurrent neural network for solving linear programming problems. To achieve optimality in accuracy and also in computational effort, an algorithm is presented. We investigate the sensitivity analysis of linear programming problem through the neural network. A detailed example is also presented to demonstrate the performance of the recurrent neural network.
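
    The paper obtains the solution and its sensitivities with a recurrent neural network; as a plain numerical point of reference, the sketch below solves a small textbook LP with SciPy and estimates the sensitivity of the optimum to each right-hand side by perturb-and-re-solve (the example LP and step size are illustrative, not taken from the paper).

      import numpy as np
      from scipy.optimize import linprog

      # Example LP: maximize 3x + 5y subject to x <= 4, 2y <= 12, 3x + 2y <= 18, x, y >= 0.
      c = [-3, -5]                                  # linprog minimizes, so negate the objective
      A_ub = [[1, 0], [0, 2], [3, 2]]
      b_ub = np.array([4.0, 12.0, 18.0])

      base = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2, method="highs")
      print("optimum:", -base.fun, "at", base.x)

      # Sensitivity of the optimum to each constraint's right-hand side,
      # estimated by perturbing b_i and re-solving (a finite-difference shadow price).
      eps = 1e-4
      for i in range(len(b_ub)):
          b_pert = b_ub.copy()
          b_pert[i] += eps
          pert = linprog(c, A_ub=A_ub, b_ub=b_pert, bounds=[(0, None)] * 2, method="highs")
          shadow = (-pert.fun - (-base.fun)) / eps
          print(f"constraint {i}: shadow price ~ {shadow:.3f}")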

  20. Symbolic dynamics and computation in model gene networks.

    Science.gov (United States)

    Edwards, R.; Siegelmann, H. T.; Aziza, K.; Glass, L.

    2001-03-01

    We analyze a class of ordinary differential equations representing a simplified model of a genetic network. In this network, the model genes control the production rates of other genes by a logical function. The dynamics in these equations are represented by a directed graph on an n-dimensional hypercube (n-cube) in which each edge is directed in a unique orientation. The vertices of the n-cube correspond to orthants of state space, and the edges correspond to boundaries between adjacent orthants. The dynamics in these equations can be represented symbolically. Starting from a point on the boundary between neighboring orthants, the equation is integrated until the boundary is crossed for a second time. Each different cycle, corresponding to a different sequence of orthants that are traversed during the integration of the equation always starting on a boundary and ending the first time that same boundary is reached, generates a different letter of the alphabet. A word consists of a sequence of letters corresponding to a possible sequence of orthants that arise from integration of the equation starting and ending on the same boundary. The union of the words defines the language. Letters and words correspond to analytically computable Poincare maps of the equation. This formalism allows us to define bifurcations of chaotic dynamics of the differential equation that correspond to changes in the associated language. Qualitative knowledge about the dynamics found by integrating the equation can be used to help solve the inverse problem of determining the underlying network generating the dynamics. This work places the study of dynamics in genetic networks in a context comprising both nonlinear dynamics and the theory of computation. (c) 2001 American Institute of Physics.

  1. Topology analysis of social networks extracted from literature.

    Science.gov (United States)

    Waumans, Michaël C; Nicodème, Thibaut; Bersini, Hugues

    2015-01-01

    In a world where complex networks are an increasingly important part of science, it is interesting to question how the new reading of social realities they provide applies to our cultural background and in particular, popular culture. Are authors of successful novels able to reproduce social networks faithful to the ones found in reality? Is there any common trend connecting an author's oeuvre, or a genre of fiction? Such an analysis could provide new insight on how we, as a culture, perceive human interactions and consume media. The purpose of the work presented in this paper is to define the signature of a novel's story based on the topological analysis of its social network of characters. For this purpose, an automated tool was built that analyses the dialogs in novels, identifies characters and computes their relationships in a time-dependent manner in order to assess the network's evolution over the course of the story.
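
    An illustrative sketch of the general approach (not the authors' tool): characters who share a scene of dialog are linked, edge weights accumulate over the story, and the resulting graph's topology gives a simple "signature". The scene data below are invented.

      import networkx as nx

      # Each scene lists the characters who exchange dialog in it (toy data).
      scenes = [
          ["Alice", "Bob"],
          ["Alice", "Bob", "Carol"],
          ["Carol", "Dave"],
          ["Alice", "Dave"],
          ["Bob", "Carol", "Dave"],
      ]

      G = nx.Graph()
      for scene in scenes:
          for i, a in enumerate(scene):
              for b in scene[i + 1:]:
                  # Accumulate interaction weight over the course of the story.
                  w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
                  G.add_edge(a, b, weight=w + 1)

      # A simple topological "signature" of the story's social network.
      print("density:", round(nx.density(G), 3))
      print("clustering:", round(nx.average_clustering(G), 3))
      print("central characters:", nx.degree_centrality(G))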

  2. Topology analysis of social networks extracted from literature.

    Directory of Open Access Journals (Sweden)

    Michaël C Waumans

    Full Text Available In a world where complex networks are an increasingly important part of science, it is interesting to question how the new reading of social realities they provide applies to our cultural background and in particular, popular culture. Are authors of successful novels able to reproduce social networks faithful to the ones found in reality? Is there any common trend connecting an author's oeuvre, or a genre of fiction? Such an analysis could provide new insight on how we, as a culture, perceive human interactions and consume media. The purpose of the work presented in this paper is to define the signature of a novel's story based on the topological analysis of its social network of characters. For this purpose, an automated tool was built that analyses the dialogs in novels, identifies characters and computes their relationships in a time-dependent manner in order to assess the network's evolution over the course of the story.

  3. On the Optimality of Trust Network Analysis with Subjective Logic

    Directory of Open Access Journals (Sweden)

    PARK, Y.

    2014-08-01

Full Text Available Building and measuring trust is one of the crucial aspects of e-commerce, social networking and computer security. Trust networks are widely used to formalize trust relationships and to conduct formal reasoning about trust values. Diverse trust network analysis methods have been developed so far and one of the most widely used schemes is TNA-SL (Trust Network Analysis with Subjective Logic). Recent papers claimed that TNA-SL always finds the optimal solution by producing the least uncertainty. In this paper, we present some counter-examples, which imply that TNA-SL is not an optimal algorithm. Furthermore, we present a probabilistic algorithm for edge splitting to minimize uncertainty.

  4. Teaching strategies applied to teaching computer networks in Engineering in Telecommunications and Electronics

    Directory of Open Access Journals (Sweden)

    Elio Manuel Castañeda-González

    2016-07-01

Full Text Available Because of the large impact that computer networks have today, their study in related fields such as Telecommunications and Electronics Engineering holds great appeal for students. However, when the content is covered without a strong practical component, this interest can decrease considerably. This paper proposes the use of teaching strategies and analogies, media and interactive applications that enhance the teaching of the networks discipline and encourage its study. It starts from an analysis of how the discipline is currently taught, followed by a description of each of these strategies and its respective contribution to student learning.

  5. Analysis of Network Topologies Underlying Ethylene Growth Response Kinetics

    Directory of Open Access Journals (Sweden)

    Aaron M. Prescott

    2016-08-01

Full Text Available Most models for ethylene signaling involve a linear pathway. However, measurements of seedling growth kinetics when ethylene is applied and removed have resulted in more complex network models that include coherent feedforward, negative feedback, and positive feedback motifs. However, the dynamical responses of the proposed networks have not been explored in a quantitative manner. Here, we explore (i) whether any of the proposed models are capable of producing growth-response behaviors consistent with experimental observations and (ii) what mechanistic roles various parts of the network topologies play in ethylene signaling. To address this, we used computational methods to explore two general network topologies: The first contains a coherent feedforward loop that inhibits growth and a negative feedback from growth onto itself (CFF/NFB). In the second, ethylene promotes the cleavage of EIN2, with the product of the cleavage inhibiting growth and promoting the production of EIN2 through a positive feedback loop (PFB). Since few network parameters for ethylene signaling are known in detail, we used an evolutionary algorithm to explore sets of parameters that produce behaviors similar to experimental growth response kinetics of both wildtype and mutant seedlings. We generated a library of parameter sets by independently running the evolutionary algorithm many times. Both network topologies produce behavior consistent with experimental observations and analysis of the parameter sets allows us to identify important network interactions and parameter constraints. We additionally screened these parameter sets for growth recovery in the presence of sub-saturating ethylene doses, which is an experimentally-observed property that emerges in some of the evolved parameter sets. Finally, we probed simplified networks maintaining key features of the CFF/NFB and PFB topologies. From this, we verified observations drawn from the larger networks about mechanisms

  6. Trajectory Based Optimal Segment Computation in Road Network Databases

    DEFF Research Database (Denmark)

    Li, Xiaohui; Ceikute, Vaida; Jensen, Christian S.

    Finding a location for a new facility such that the facility attracts the maximal number of customers is a challenging problem. Existing studies either model customers as static sites and thus do not consider customer movement, or they focus on theoretical aspects and do not provide solutions tha...... that adopt different approaches to computing the query. Algorithm AUG uses graph augmentation, and ITE uses iterative road-network partitioning. Empirical studies with real data sets demonstrate that the algorithms are capable of offering high performance in realistic settings....

  7. A computational framework for the automated construction of glycosylation reaction networks.

    Directory of Open Access Journals (Sweden)

    Gang Liu

Full Text Available Glycosylation is among the most common and complex post-translational modifications identified to date. It proceeds through the catalytic action of multiple enzyme families that include the glycosyltransferases that add monosaccharides to growing glycans, and glycosidases which remove sugar residues to trim glycans. The expression level and specificity of these enzymes, in part, regulate the glycan distribution or glycome of specific cell/tissue systems. Currently, there is no systematic method to describe the enzymes and cellular reaction networks that catalyze glycosylation. To address this limitation, we present a streamlined machine-readable definition for the glycosylating enzymes and additional methodologies to construct and analyze glycosylation reaction networks. In this computational framework, the enzyme class is systematically designed to store detailed specificity data such as enzymatic functional group, linkage and substrate specificity. The new classes and their associated functions enable both single-reaction inference and automated full network reconstruction, when given a list of reactants and/or products along with the enzymes present in the system. In addition, graph theory is used to support functions that map the connectivity between two or more species in a network, and that generate subset models to identify rate-limiting steps regulating glycan biosynthesis. Finally, this framework allows the synthesis of biochemical reaction networks using mass spectrometry (MS) data. The features described above are illustrated using three case studies that examine: (i) O-linked glycan biosynthesis during the construction of functional selectin-ligands; (ii) automated N-linked glycosylation pathway construction; and (iii) the handling and analysis of glycomics based MS data. Overall, the new computational framework enables automated glycosylation network model construction and analysis by integrating knowledge of glycan structure and enzyme

  8. NML Computation Algorithms for Tree-Structured Multinomial Bayesian Networks

    Directory of Open Access Journals (Sweden)

    Kontkanen Petri

    2007-01-01

Full Text Available Typical problems in bioinformatics involve large discrete datasets. Therefore, in order to apply statistical methods in such domains, it is important to develop efficient algorithms suitable for discrete data. The minimum description length (MDL) principle is a theoretically well-founded, general framework for performing statistical inference. The mathematical formalization of MDL is based on the normalized maximum likelihood (NML) distribution, which has several desirable theoretical properties. In the case of discrete data, straightforward computation of the NML distribution requires exponential time with respect to the sample size, since the definition involves a sum over all the possible data samples of a fixed size. In this paper, we first review some existing algorithms for efficient NML computation in the case of multinomial and naive Bayes model families. Then we proceed by extending these algorithms to more complex, tree-structured Bayesian networks.
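
    The exponential cost mentioned above can be made concrete: the NML normalizing term for a k-valued multinomial sums the maximized likelihood over every possible count vector of a sample of size n. The brute-force sketch below is only feasible for tiny n and k, which is precisely why the efficient recursions reviewed in the paper matter.

      from itertools import product
      from math import factorial, prod

      def multinomial_coeff(counts):
          n = sum(counts)
          return factorial(n) // prod(factorial(c) for c in counts)

      def nml_normalizer(n, k):
          """Sum over count vectors (n1..nk) with n1+...+nk = n of
          multinomial(n; counts) * prod_i (ni/n)^ni  (the maximized likelihood)."""
          total = 0.0
          for counts in product(range(n + 1), repeat=k):
              if sum(counts) != n:
                  continue
              max_lik = prod((c / n) ** c for c in counts if c > 0)
              total += multinomial_coeff(counts) * max_lik
          return total

      # Brute force enumerates (n+1)^k count vectors, so it only works for toy sizes.
      print(nml_normalizer(n=5, k=3))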

  9. Numeric computation and statistical data analysis on the Java platform

    CERN Document Server

    Chekanov, Sergei V

    2016-01-01

    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...

  10. Combining MLP and Using Decision Tree in Order to Detect the Intrusion into Computer Networks

    OpenAIRE

    Saba Sedigh Rad; Alireza Zebarjad

    2013-01-01

The security of computer networks plays an important role in computer systems. The increasing use of computer networks results in the penetration and destruction of systems through system operations. So, in order to keep systems away from these hazards, it is essential to use an intrusion detection system (IDS). Intrusion detection is performed in order to detect illicit use and misuse and to avoid damage to systems and computer networks by both external and internal intruders. Intrusi...

  11. Weighted Complex Network Analysis of Pakistan Highways

    Directory of Open Access Journals (Sweden)

    Yasir Tariq Mohmand

    2013-01-01

    Full Text Available The structure and properties of public transportation networks have great implications in urban planning, public policies, and infectious disease control. This study contributes a weighted complex network analysis of travel routes on the national highway network of Pakistan. The network is responsible for handling 75 percent of the road traffic yet is largely inadequate, poor, and unreliable. The highway network displays small world properties and is assortative in nature. Based on the betweenness centrality of the nodes, the most important cities are identified as this could help in identifying the potential congestion points in the network. Keeping in view the strategic location of Pakistan, such a study is of practical importance and could provide opportunities for policy makers to improve the performance of the highway network.
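
    A short NetworkX sketch of the centrality computation referred to above, run on a made-up toy highway graph (city names and edge weights are illustrative and are not the paper's data); nodes with high weighted betweenness are the candidate congestion points.

      import networkx as nx

      # Toy weighted highway graph: edge weights are route lengths (illustrative numbers).
      edges = [
          ("Karachi", "Hyderabad", 160), ("Hyderabad", "Sukkur", 330),
          ("Sukkur", "Multan", 430), ("Multan", "Lahore", 340),
          ("Multan", "Islamabad", 560), ("Lahore", "Islamabad", 380),
          ("Islamabad", "Peshawar", 180),
      ]
      G = nx.Graph()
      G.add_weighted_edges_from(edges)

      # Betweenness based on shortest weighted routes: high values flag potential congestion points.
      bc = nx.betweenness_centrality(G, weight="weight")
      for city, score in sorted(bc.items(), key=lambda kv: -kv[1]):
          print(f"{city:10s} {score:.3f}")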

  12. Stability and Hopf Bifurcation in a Delayed SEIRS Worm Model in Computer Network

    Directory of Open Access Journals (Sweden)

    Zizhen Zhang

    2013-01-01

Full Text Available A delayed SEIRS epidemic model with vertical transmission in a computer network is considered. Sufficient conditions for the local stability of the positive equilibrium and the existence of a local Hopf bifurcation are obtained by analyzing the distribution of the roots of the associated characteristic equation. Furthermore, the direction of the local Hopf bifurcation and the stability of the bifurcating periodic solutions are determined by using the normal form theory and the center manifold theorem. Finally, a numerical example is presented to verify the theoretical analysis.
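
    A rough numerical sketch of a delayed SEIRS-type model in the spirit of the one above, integrated with a fixed-step Euler scheme and a history buffer for the delayed term; the parameter values and the exact placement of the delay are illustrative and not taken from the paper.

      import numpy as np

      beta, sigma, gamma, xi = 0.5, 0.3, 0.2, 0.1   # infection, incubation, recovery, immunity-loss rates
      tau, dt, T = 5.0, 0.01, 200.0                  # delay, step size, horizon (all illustrative)
      steps, lag = int(T / dt), int(tau / dt)

      S, E, I, R = (np.empty(steps + 1) for _ in range(4))
      S[0], E[0], I[0], R[0] = 0.95, 0.0, 0.05, 0.0

      for t in range(steps):
          R_lag = R[t - lag] if t >= lag else R[0]   # history buffer for the delayed term
          dS = -beta * S[t] * I[t] + xi * R_lag      # susceptibles regained after a delay tau
          dE = beta * S[t] * I[t] - sigma * E[t]
          dI = sigma * E[t] - gamma * I[t]
          dR = gamma * I[t] - xi * R_lag
          S[t + 1] = S[t] + dt * dS
          E[t + 1] = E[t] + dt * dE
          I[t + 1] = I[t] + dt * dI
          R[t + 1] = R[t] + dt * dR

      print("final fractions  S=%.3f E=%.3f I=%.3f R=%.3f" % (S[-1], E[-1], I[-1], R[-1]))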

  13. Predictive structural dynamic network analysis.

    Science.gov (United States)

    Chen, Rong; Herskovits, Edward H

    2015-04-30

    Classifying individuals based on magnetic resonance data is an important task in neuroscience. Existing brain network-based methods to classify subjects analyze data from a cross-sectional study and these methods cannot classify subjects based on longitudinal data. We propose a network-based predictive modeling method to classify subjects based on longitudinal magnetic resonance data. Our method generates a dynamic Bayesian network model for each group which represents complex spatiotemporal interactions among brain regions, and then calculates a score representing that subject's deviation from expected network patterns. This network-derived score, along with other candidate predictors, are used to construct predictive models. We validated the proposed method based on simulated data and the Alzheimer's Disease Neuroimaging Initiative study. For the Alzheimer's Disease Neuroimaging Initiative study, we built a predictive model based on the baseline biomarker characterizing the baseline state and the network-based score which was constructed based on the state transition probability matrix. We found that this combined model achieved 0.86 accuracy, 0.85 sensitivity, and 0.87 specificity. For the Alzheimer's Disease Neuroimaging Initiative study, the model based on the baseline biomarkers achieved 0.77 accuracy. The accuracy of our model is significantly better than the model based on the baseline biomarkers (p-value=0.002). We have presented a method to classify subjects based on structural dynamic network model based scores. This method is of great importance to distinguish subjects based on structural network dynamics and the understanding of the network architecture of brain processes and disorders. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Complex network problems in physics, computer science and biology

    Science.gov (United States)

    Cojocaru, Radu Ionut

There is a close relation between physics and mathematics, and the exchange of ideas between these two sciences is well established. However, until a few years ago there was no such close relation between physics and computer science. Moreover, only recently have biologists started to use methods and tools from statistical physics to study the behavior of complex systems. In this thesis we concentrate on applying and analyzing several methods borrowed from computer science in biology, and we also use methods from statistical physics to solve hard problems from computer science. In recent years physicists have been interested in studying the behavior of complex networks. Physics is an experimental science in which theoretical predictions are compared to experiments. In this definition, the term prediction plays a very important role: although the system is complex, it is still possible to get predictions for its behavior, but these predictions are of a probabilistic nature. Spin glasses, lattice gases and the Potts model are a few examples of complex systems in physics. Spin glasses and many frustrated antiferromagnets map exactly to computer science problems in the NP-hard class defined in Chapter 1. In Chapter 1 we discuss a common result from artificial intelligence (AI) which shows that there are some problems which are NP-complete, with the implication that these problems are difficult to solve. We introduce a few well known hard problems from computer science (Satisfiability, Coloring, Vertex Cover together with Maximum Independent Set, and Number Partitioning) and then discuss their mapping to problems from physics. In Chapter 2 we provide a short review of combinatorial optimization algorithms and their applications to ground state problems in disordered systems. We discuss the cavity method initially developed for studying the Sherrington-Kirkpatrick model of spin glasses. We extend this model to the study of a specific case of spin glass on the Bethe

  15. Reaction network analysis in biochemical signaling pathways

    OpenAIRE

    Martinez-Forero, I. (Iván); Pelaez, A. (Antonio); Villoslada, P. (Pablo)

    2010-01-01

The aim of this thesis is to improve the understanding of signaling pathways through a theoretical study of chemical reaction networks. The equilibrium solution to the equations derived from chemical networks will be analytically resolved using tools from algebraic geometry. The chapters are organized as follows: 1. An introduction to chemical dynamics in biological systems with a special emphasis on steady state analysis 2. A complete description of the chemical reaction network theor...

  16. Computer applications for engineering/structural analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zaslawsky, M.; Samaddar, S.K.

    1991-01-01

Analysts and organizations have a tendency to lock themselves into specific codes, with the obvious consequences of not addressing the real problem and thus reaching the wrong conclusion. This paper discusses the role of the analyst in selecting computer codes. The participation and support of a computation division in modifying the source program, configuration management, and pre- and post-processing of codes are among the subjects discussed. Specific examples illustrating the computer code selection process are described in the following problem areas: soil structure interaction, structural analysis of nuclear reactors, analysis of waste tanks where fluid structure interaction is important, analysis of equipment, structure-structure interaction, analysis of the operation of the Superconducting Super Collider which includes friction and transient temperature, and 3D analysis of the 10-meter telescope being built in Hawaii. Validation and verification of computer codes and their impact on the selection process are also discussed.

  17. Cluster analysis for computer workload evaluation

    CERN Document Server

    Landau, K

    1976-01-01

An introduction to computer workload analysis is given, showing its range of application in computer centre management, system and application programming. Cluster methods are discussed which can be used in conjunction with workload data, and cluster algorithms are adapted to the specific problem set. Several samples of CDC 7600 accounting data, collected at CERN (the European Organization for Nuclear Research), underwent a cluster analysis to determine job groups. The conclusions from the resource usage of typical job groups in relation to computer workload analysis are discussed. (17 refs).

  18. IUE Data Analysis Software for Personal Computers

    Science.gov (United States)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  19. Implications of computer networking and the Internet for nurse education.

    Science.gov (United States)

    Ward, R

    1997-06-01

    This paper sets out the history of computer networking and its use in nursing and health care education, and places this in its wider historical and social context. The increasing availability and use of computer networks and the internet are producing a changing climate in education as well as in health care. Moves away from traditional face-to-face teaching with a campus institution to widely distributed interactive multimedia learning will affect the roles of students and teachers. The use of electronic mail, mailing lists and the World Wide Web are specifically considered, along with changes to library and information management skills, research methods, journal publication and the like. Issues about the quality, as well as quantity, of information available, are considered. As more and more organizations and institutions begin to use electronic communication methods, it becomes an increasingly important part of the curriculum at all levels, and may lead to fundamental changes in geographical and professional boundaries. A glossary of terms is provided for those not familiar with the technology, along with the contact details for mailing lists and World Wide Web pages mentioned.

  20. Industrial entrepreneurial network: Structural and functional analysis

    Science.gov (United States)

    Medvedeva, M. A.; Davletbaev, R. H.; Berg, D. B.; Nazarova, J. J.; Parusheva, S. S.

    2016-12-01

The structure and functioning of two model industrial entrepreneurial networks are investigated in the present paper. One of these networks forms during the implementation of an integrated project and consists of eight agents, which interact with each other and with the external environment. The other one is obtained from the municipal economy and is based on a set of 12 real business entities. The analysis of the networks is carried out on the basis of a matrix of mutual payments aggregated over a certain time period. The matrix is created by the methods of experimental economics. Social Network Analysis (SNA) methods and instruments were used in the present research. The set of basic structural characteristics was investigated: quantitative parameters such as density, diameter, clustering coefficient, different kinds of centrality, etc. They were compared with random Bernoulli graphs of the corresponding size and density. The differences found between the structures of the random and entrepreneurial networks are explained by the peculiarities of how agents function in a production network. Separately, the closed exchange circuits (cyclically closed contours of the graph) forming an autopoietic (self-replicating) network pattern were identified. The purpose of the functional analysis was to identify the contribution of the autopoietic network pattern to its gross product. It was found that the magnitude of this contribution is more than 20%. Such a value allows the use of a complementary currency in order to stimulate the economic activity of network agents.

  1. Optimization of stochastic discrete systems and control on complex networks computational networks

    CERN Document Server

    Lozovanu, Dmitrii

    2014-01-01

    This book presents the latest findings on stochastic dynamic programming models and on solving optimal control problems in networks. It includes the authors' new findings on determining the optimal solution of discrete optimal control problems in networks and on solving game variants of Markov decision problems in the context of computational networks. First, the book studies the finite state space of Markov processes and reviews the existing methods and algorithms for determining the main characteristics in Markov chains, before proposing new approaches based on dynamic programming and combinatorial methods. Chapter two is dedicated to infinite horizon stochastic discrete optimal control models and Markov decision problems with average and expected total discounted optimization criteria, while Chapter three develops a special game-theoretical approach to Markov decision processes and stochastic discrete optimal control problems. In closing, the book's final chapter is devoted to finite horizon stochastic con...

  2. Eye tracking using artificial neural networks for human computer interaction.

    Science.gov (United States)

    Demjén, E; Aboši, V; Tomori, Z

    2011-01-01

This paper describes an ongoing project that aims to develop a low-cost application to replace a computer mouse for people with physical impairment. The application is based on an eye tracking algorithm and assumes that the camera and the head position are fixed. Color tracking and template matching methods are used for pupil detection. Calibration is provided by neural networks as well as by parametric interpolation methods. The neural networks use back-propagation for learning, and a bipolar sigmoid function is chosen as the activation function. The user's eye is scanned with a simple web camera with backlight compensation, which is attached to a head fixation device. Neural networks significantly outperform parametric interpolation techniques: 1) the calibration procedure is faster as they require fewer calibration marks and 2) cursor control is more precise. The system in its current stage of development is able to distinguish regions at least on the level of desktop icons. The main limitation of the proposed method is the lack of head-pose invariance and its relative sensitivity to illumination (especially to incidental pupil reflections).
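
    A toy NumPy sketch of the calibration step described above: a small feed-forward network with bipolar sigmoid (tanh) units is trained by back-propagation to map pupil-centre coordinates to screen coordinates; the synthetic mapping and all hyper-parameters are invented, not taken from the paper.

      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic calibration data: pupil (x, y) -> screen (x, y) via an unknown smooth mapping.
      pupil = rng.uniform(-1, 1, size=(200, 2))
      screen = np.tanh(pupil @ np.array([[1.5, -0.3], [0.4, 1.2]]) + 0.1)

      # One hidden layer with bipolar sigmoid (tanh) activations, trained by back-propagation.
      W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
      W2 = rng.normal(0, 0.5, (16, 2)); b2 = np.zeros(2)
      lr = 0.05

      for epoch in range(2000):
          h = np.tanh(pupil @ W1 + b1)            # hidden activations
          out = np.tanh(h @ W2 + b2)              # predicted screen coordinates
          err = out - screen
          # Backward pass: gradients of the mean-squared error through the tanh units.
          d_out = err * (1 - out ** 2)
          d_h = (d_out @ W2.T) * (1 - h ** 2)
          W2 -= lr * h.T @ d_out / len(pupil); b2 -= lr * d_out.mean(axis=0)
          W1 -= lr * pupil.T @ d_h / len(pupil); b1 -= lr * d_h.mean(axis=0)

      print("final MSE:", float(np.mean(err ** 2)))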

  3. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    Science.gov (United States)

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  4. Computable Analysis with Applications to Dynamic Systems

    NARCIS (Netherlands)

    P.J. Collins (Pieter)

    2010-01-01

In this article we develop a theory of computation for continuous mathematics. The theory is based on earlier developments of computable analysis, especially that of the school of Weihrauch, and is presented as a model of intuitionistic type theory. Every effort has been made to keep the

  5. Computational Fluids Domain Reduction to a Simplified Fluid Network

    Science.gov (United States)

    2012-04-19

best suited algorithm for a specific task. There are no theoretical guidelines to assist on proper feature selection for a specific situation. ... Kumar, Parvesh, and Siri Krishan Wasan. "Comparative Analysis of k-mean Based Algorithms." IJCSNS International Journal of Computer

  6. High-performance computing and networking as tools for accurate emission computed tomography reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Passeri, A. [Dipartimento di Fisiopatologia Clinica - Sezione di Medicina Nucleare, Universita` di Firenze (Italy); Formiconi, A.R. [Dipartimento di Fisiopatologia Clinica - Sezione di Medicina Nucleare, Universita` di Firenze (Italy); De Cristofaro, M.T.E.R. [Dipartimento di Fisiopatologia Clinica - Sezione di Medicina Nucleare, Universita` di Firenze (Italy); Pupi, A. [Dipartimento di Fisiopatologia Clinica - Sezione di Medicina Nucleare, Universita` di Firenze (Italy); Meldolesi, U. [Dipartimento di Fisiopatologia Clinica - Sezione di Medicina Nucleare, Universita` di Firenze (Italy)

    1997-04-01

    It is well known that the quantitative potential of emission computed tomography (ECT) relies on the ability to compensate for resolution, attenuation and scatter effects. Reconstruction algorithms which are able to take these effects into account are highly demanding in terms of computing resources. The reported work aimed to investigate the use of a parallel high-performance computing platform for ECT reconstruction taking into account an accurate model of the acquisition of single-photon emission tomographic (SPET) data. An iterative algorithm with an accurate model of the variable system response was ported on the MIMD (Multiple Instruction Multiple Data) parallel architecture of a 64-node Cray T3D massively parallel computer. The system was organized to make it easily accessible even from low-cost PC-based workstations through standard TCP/IP networking. A complete brain study of 30 (64 x 64) slices could be reconstructed from a set of 90 (64 x 64) projections with ten iterations of the conjugate gradients algorithm in 9 s, corresponding to an actual speed-up factor of 135. This work demonstrated the possibility of exploiting remote high-performance computing and networking resources from hospital sites by means of low-cost workstations using standard communication protocols without particular problems for routine use. The achievable speed-up factors allow the assessment of the clinical benefit of advanced reconstruction techniques which require a heavy computational burden for the compensation effects such as variable spatial resolution, scatter and attenuation. The possibility of using the same software on the same hardware platform with data acquired in different laboratories with various kinds of SPET instrumentation is appealing for software quality control and for the evaluation of the clinical impact of the reconstruction methods. (orig.). With 4 figs., 1 tab.

  7. Social Network Analysis and Critical Realism

    DEFF Research Database (Denmark)

    Buch-Hansen, Hubert

    2014-01-01

    Social network analysis (SNA) is an increasingly popular approach that provides researchers with highly developed tools to map and analyze complexes of social relations. Although a number of network scholars have explicated the assumptions that underpin SNA, the approach has yet to be discussed ...

  8. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  9. Computational methods for corpus annotation and analysis

    CERN Document Server

    Lu, Xiaofei

    2014-01-01

    This book reviews computational tools for lexical, syntactic, semantic, pragmatic and discourse analysis, with instructions on how to obtain, install and use each tool. Covers studies using Natural Language Processing, and offers ideas for better integration.

  10. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  11. Computationally efficient measure of topological redundancy of biological and social networks

    Science.gov (United States)

    Albert, Réka; Dasgupta, Bhaskar; Hegde, Rashmi; Sivanathan, Gowri Sangeetha; Gitter, Anthony; Gürsoy, Gamze; Paul, Pradyut; Sontag, Eduardo

    2011-09-01

    It is well known that biological and social interaction networks have a varying degree of redundancy, though a consensus on the precise cause of this is so far lacking. In this paper, we introduce a topological redundancy measure for labeled directed networks that is formal, computationally efficient, and applicable to a variety of directed networks such as cellular signaling, and metabolic and social interaction networks. We demonstrate the computational efficiency of our measure by computing its value and statistical significance on a number of biological and social networks with up to several thousands of nodes and edges. Our results suggest a number of interesting observations: (1) Social networks are more redundant than their biological counterparts, (2) transcriptional networks are less redundant than signaling networks, (3) the topological redundancy of the C. elegans metabolic network is largely due to its inclusion of currency metabolites, and (4) the redundancy of signaling networks is highly (negatively) correlated with the monotonicity of their dynamics.

  12. Electricity market price forecasting by grid computing optimizing artificial neural networks

    OpenAIRE

    Niimura, T.; Ozawa, K.; Sakamoto, N.

    2007-01-01

    This paper presents a grid computing approach to parallel-process a neural network time-series model for forecasting electricity market prices. A grid computing environment introduced in a university computing laboratory provides access to otherwise underused computing resources. The grid computing of the neural network model not only processes several times faster than a single iterative process, but also provides chances of improving forecasting accuracy. Results of numerical tests using re...

  13. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  14. Spectrum-Based and Collaborative Network Topology Analysis and Visualization

    Science.gov (United States)

    Hu, Xianlin

    2013-01-01

    Networks are of significant importance in many application domains, such as World Wide Web and social networks, which often embed rich topological information. Since network topology captures the organization of network nodes and links, studying network topology is very important to network analysis. In this dissertation, we study networks by…

  15. 10 CFR 73.54 - Protection of digital computer and communication systems and networks.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Protection of digital computer and communication systems... computer and communication systems and networks. By November 23, 2009 each licensee currently licensed to... provide high assurance that digital computer and communication systems and networks are adequately...

  16. Complex Network Analysis of Guangzhou Metro

    Directory of Open Access Journals (Sweden)

    Yasir Tariq Mohmand

    2015-11-01

    Full Text Available The structure and properties of public transportation networks can provide suggestions for urban planning and public policies. This study contributes a complex network analysis of the Guangzhou metro. The metro network has 236 kilometers of track and is the 6th busiest metro system in the world. In this paper the topological properties of the network are explored. We observed that the network displays small-world properties and is assortative in nature. The network possesses a high average degree of 17.5 with a small diameter of 5. Furthermore, we also identified the most important metro stations based on betweenness and closeness centralities. These could help in identifying the probable congestion points in the metro system and provide policy makers with an opportunity to improve the performance of the metro system.
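
    The quantities reported above (average degree, diameter, clustering, assortativity, betweenness and closeness) can all be computed with standard graph libraries. The sketch below does so with NetworkX on a small invented transit graph; it is illustrative only and does not use the Guangzhou data.

```python
# Illustrative only (toy graph, not the Guangzhou data): computing average
# degree, diameter, clustering, assortativity, betweenness and closeness
# with NetworkX.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("A", "B"), ("B", "C"), ("C", "D"), ("D", "E"),   # line 1
    ("F", "B"), ("B", "G"), ("G", "H"),               # line 2, interchange at B
    ("I", "C"), ("C", "G"),                           # line 3, interchanges at C and G
])

print("average degree :", 2 * G.number_of_edges() / G.number_of_nodes())
print("diameter       :", nx.diameter(G))
print("clustering     :", nx.average_clustering(G))
print("assortativity  :", nx.degree_assortativity_coefficient(G))

# Stations most likely to become congestion points score highest on
# betweenness and closeness centrality.
btw = nx.betweenness_centrality(G)
cls = nx.closeness_centrality(G)
print("top betweenness:", max(btw, key=btw.get))
print("top closeness  :", max(cls, key=cls.get))
```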

  17. Architecture Analysis of an FPGA-Based Hopfield Neural Network

    Directory of Open Access Journals (Sweden)

    Miguel Angelo de Abreu de Sousa

    2014-01-01

    Full Text Available Interconnections between electronic circuits and neural computation have been a strongly researched topic in the machine learning field in order to approach several practical requirements, including decreasing training and operation times in high-performance applications and reducing cost, size, and energy consumption for autonomous or embedded developments. Field programmable gate array (FPGA) hardware shows some inherent features typically associated with neural networks, such as parallel processing, modular execution, and dynamic adaptation, and work on different types of FPGA-based neural networks has been presented in recent years. This paper addresses different aspects of the architectural characteristics of a Hopfield Neural Network implemented on an FPGA, such as maximum operating frequency and chip-area occupancy as a function of the network capacity. The FPGA implementation methodology, which does not employ multipliers in the architecture developed for the Hopfield neural model, is also presented in detail.
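
    For readers unfamiliar with the Hopfield model analysed above, the following software sketch shows the basic recall dynamics: bipolar states, Hebbian weights and asynchronous sign updates. With +/-1 states the products in the update reduce to additions and subtractions, which is the kind of property a multiplier-free datapath can exploit; this is our illustration, not a description of the paper's FPGA architecture.

```python
# Software sketch of the Hopfield model: bipolar states, Hebbian weights,
# asynchronous sign updates.  With +/-1 states the products in the update
# reduce to additions/subtractions, the kind of property a multiplier-free
# datapath can exploit (our illustration, not the paper's FPGA design).
import numpy as np

rng = np.random.default_rng(1)

patterns = np.array([[ 1, -1,  1, -1,  1, -1,  1, -1],
                     [ 1,  1,  1,  1, -1, -1, -1, -1]])
n = patterns.shape[1]

# Hebbian learning rule with zero diagonal.
W = (patterns.T @ patterns).astype(float)
np.fill_diagonal(W, 0.0)

def recall(state, steps=200):
    state = state.copy()
    for _ in range(steps):
        i = rng.integers(n)                  # asynchronous random update
        state[i] = 1 if W[i] @ state >= 0 else -1
    return state

noisy = patterns[0].copy()
noisy[:2] *= -1                              # corrupt two bits
print("recovered stored pattern:", bool(np.array_equal(recall(noisy), patterns[0])))
```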

  18. Constructing an Intelligent Patent Network Analysis Method

    Directory of Open Access Journals (Sweden)

    Chao-Chan Wu

    2012-11-01

    Full Text Available Patent network analysis, an advanced method of patent analysis, is a useful tool for technology management. This method visually displays all the relationships among the patents and enables the analysts to intuitively comprehend the overview of a set of patents in the field of the technology being studied. Although patent network analysis possesses relative advantages different from traditional methods of patent analysis, it is subject to several crucial limitations. To overcome the drawbacks of the current method, this study proposes a novel patent analysis method, called the intelligent patent network analysis method, to make a visual network with great precision. Based on artificial intelligence techniques, the proposed method provides an automated procedure for searching patent documents, extracting patent keywords, and determining the weight of each patent keyword in order to generate a sophisticated visualization of the patent network. This study proposes a detailed procedure for generating an intelligent patent network that is helpful for improving the efficiency and quality of patent analysis. Furthermore, patents in the field of Carbon Nanotube Backlight Unit (CNT-BLU) were analyzed to verify the utility of the proposed method.

  19. Statistical Analysis of Bus Networks in India.

    Science.gov (United States)

    Chatterjee, Atanu; Manohar, Manju; Ramadurai, Gitakrishnan

    2016-01-01

    In this paper, we model the bus networks of six major Indian cities as graphs in L-space, and evaluate their various statistical properties. While airline and railway networks have been extensively studied, a comprehensive study on the structure and growth of bus networks is lacking. In India, where bus transport plays an important role in day-to-day commutation, it is of significant interest to analyze its topological structure and answer basic questions on its evolution, growth, robustness and resiliency. Although the common feature of small-world property is observed, our analysis reveals a wide spectrum of network topologies arising due to significant variation in the degree-distribution patterns in the networks. We also observe that these networks, although robust and resilient to random attacks, are particularly degree-sensitive. Unlike networks such as the Internet, the WWW and airline networks, which are virtual, bus networks are physically constrained. Our findings therefore throw light on the evolution of such geographically constrained networks and will help us in designing more efficient bus networks in the future.
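
    The degree-sensitivity observed above is commonly illustrated by comparing random node failures with targeted removal of high-degree hubs. The sketch below runs that comparison with NetworkX on a scale-free surrogate graph; the graph and removal fraction are illustrative and are not the Indian bus data.

```python
# Illustrative robustness comparison on a scale-free surrogate graph (not
# the Indian bus data): size of the giant component after random failures
# versus targeted removal of high-degree hubs.
import random
import networkx as nx

random.seed(0)
G = nx.barabasi_albert_graph(300, 2, seed=0)

def giant_fraction(H):
    if H.number_of_nodes() == 0:
        return 0.0
    return len(max(nx.connected_components(H), key=len)) / G.number_of_nodes()

def remove_and_measure(order, k=60):
    H = G.copy()
    H.remove_nodes_from(order[:k])
    return giant_fraction(H)

random_order = random.sample(list(G.nodes), G.number_of_nodes())
degree_order = sorted(G.nodes, key=lambda v: G.degree[v], reverse=True)

print("giant component after random failures:", round(remove_and_measure(random_order), 2))
print("giant component after hub removal    :", round(remove_and_measure(degree_order), 2))
```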

  20. Social network analysis for program implementation.

    Science.gov (United States)

    Valente, Thomas W; Palinkas, Lawrence A; Czaja, Sara; Chu, Kar-Hai; Brown, C Hendricks

    2015-01-01

    This paper introduces the use of social network analysis theory and tools for implementation research. The social network perspective is useful for understanding, monitoring, influencing, or evaluating the implementation process when programs, policies, practices, or principles are designed and scaled up or adapted to different settings. We briefly describe common barriers to implementation success and relate them to the social networks of implementation stakeholders. We introduce a few simple measures commonly used in social network analysis and discuss how these measures can be used in program implementation. Using the four stage model of program implementation (exploration, adoption, implementation, and sustainment) proposed by Aarons and colleagues [1] and our experience in developing multi-sector partnerships involving community leaders, organizations, practitioners, and researchers, we show how network measures can be used at each stage to monitor, intervene, and improve the implementation process. Examples are provided to illustrate these concepts. We conclude with expected benefits and challenges associated with this approach.

  1. DIMACS Workshop on Interconnection Networks and Mapping, and Scheduling Parallel Computations

    CERN Document Server

    Rosenberg, Arnold L; Sotteau, Dominique; NSF Science and Technology Center in Discrete Mathematics and Theoretical Computer Science; Interconnection networks and mapping and scheduling parallel computations

    1995-01-01

    The interconnection network is one of the most basic components of a massively parallel computer system. Such systems consist of hundreds or thousands of processors interconnected to work cooperatively on computations. One of the central problems in parallel computing is the task of mapping a collection of processes onto the processors and routing network of a parallel machine. Once this mapping is done, it is critical to schedule computations within, and communication among, processors so that inputs for a process are available where and when the process is scheduled to be computed. Focusing on interconnection networks of parallel architectures of today and of the near future, the book includes topics such as network topologies, network properties, message routing, network embeddings, network emulation, mappings, and efficient scheduling. It brings together contributions from universities and laboratories, as well as practitioners involved in the design, implementation, and application of massively parallel systems. This book contains the refereed pro...

  2. Multilayer motif analysis of brain networks

    Science.gov (United States)

    Battiston, Federico; Nicosia, Vincenzo; Chavez, Mario; Latora, Vito

    2017-04-01

    In the last decade, network science has shed new light both on the structural (anatomical) and on the functional (correlations in the activity) connectivity among the different areas of the human brain. The analysis of brain networks has made it possible to detect the central areas of a neural system and to identify its building blocks by looking at overabundant small subgraphs, known as motifs. However, network analysis of the brain has so far mainly focused on anatomical and functional networks as separate entities. The recently developed mathematical framework of multi-layer networks allows us to perform an analysis of the human brain where the structural and functional layers are considered together. In this work, we describe how to classify the subgraphs of a multiplex network, and we extend the motif analysis to networks with an arbitrary number of layers. We then extract multi-layer motifs in brain networks of healthy subjects by considering networks with two layers, anatomical and functional, respectively, obtained from diffusion and functional magnetic resonance imaging. Results indicate that subgraphs in which the presence of a physical connection between brain areas (links at the structural layer) coexists with a non-trivial positive correlation in their activities are statistically overabundant. Finally, we investigate the existence of a reinforcement mechanism between the two layers by looking at how the probability of finding a link in one layer depends on the intensity of the connection in the other one. Showing that functional connectivity is non-trivially constrained by the underlying anatomical network, our work contributes to a better understanding of the interplay between structure and function in the human brain.

  3. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

    Atsushi eFukushima

    2014-11-01

    Full Text Available One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data that contain the genome, transcriptome, proteome, and metabolome and mathematical models are expected to integrate and expand our knowledge of complex plant metabolism.

  4. SNAP: A General Purpose Network Analysis and Graph Mining Library.

    Science.gov (United States)

    Leskovec, Jure; Sosič, Rok

    2016-10-01

    Large networks are becoming a widely used abstraction for studying complex systems in a broad set of disciplines, ranging from social network analysis to molecular biology and neuroscience. Despite an increasing need to analyze and manipulate large networks, only a limited number of tools are available for this task. Here, we describe Stanford Network Analysis Platform (SNAP), a general-purpose, high-performance system that provides easy to use, high-level operations for analysis and manipulation of large networks. We present SNAP functionality, describe its implementational details, and give performance benchmarks. SNAP has been developed for single big-memory machines and it balances the trade-off between maximum performance, compact in-memory graph representation, and the ability to handle dynamic graphs where nodes and edges are being added or removed over time. SNAP can process massive networks with hundreds of millions of nodes and billions of edges. SNAP offers over 140 different graph algorithms that can efficiently manipulate large graphs, calculate structural properties, generate regular and random graphs, and handle attributes and meta-data on nodes and edges. Besides being able to handle large graphs, an additional strength of SNAP is that networks and their attributes are fully dynamic; they can be modified during the computation at low cost. SNAP is provided as an open source library in C++ as well as a module in Python. We also describe the Stanford Large Network Dataset, a set of social and information real-world networks and datasets, which we make publicly available. The collection is a complementary resource to our SNAP software and is widely used for development and benchmarking of graph analytics algorithms.
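
    As a brief illustration of the Python module mentioned above, the sketch below generates a random graph and reads off a few structural properties. It assumes the documented Snap.py API (the snap-stanford package); exact call names may differ slightly between versions.

```python
# Usage sketch assuming the documented Snap.py API (the snap-stanford
# package); exact call names may differ between versions.
import snap

# Erdos-Renyi random directed graph: 10,000 nodes, 50,000 edges.
G = snap.GenRndGnm(snap.PNGraph, 10000, 50000)
print("nodes:", G.GetNodes(), "edges:", G.GetEdges())

# Structural properties: clustering coefficient and an approximate diameter
# estimated from breadth-first searches out of 100 test nodes.
print("clustering coefficient:", snap.GetClustCf(G, -1))
print("approximate diameter  :", snap.GetBfsFullDiam(G, 100))

# Largest weakly connected component.
wcc = snap.GetMxWcc(G)
print("giant component nodes :", wcc.GetNodes())
```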

  5. Intelligent Soft Computing on Forex: Exchange Rates Forecasting with Hybrid Radial Basis Neural Network

    Directory of Open Access Journals (Sweden)

    Lukas Falat

    2016-01-01

    Full Text Available This paper deals with the application of quantitative soft computing prediction models in the financial area, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is supposed to enhance the outputs of the network using the error part of the original neural network. The authors test the suggested model on high-frequency time series data of USD/CAD and examine the ability to forecast exchange rate values for the horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model with autoregressive models and the standard neural network. They also incorporate a genetic algorithm as an optimization technique for adapting the parameters of the ANN, which is then compared with standard backpropagation and backpropagation combined with the K-means clustering algorithm. Finally, the authors find that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can help eliminate the risk of making bad decisions in the decision-making process.

  6. Intelligent Soft Computing on Forex: Exchange Rates Forecasting with Hybrid Radial Basis Neural Network.

    Science.gov (United States)

    Falat, Lukas; Marcek, Dusan; Durisova, Maria

    2016-01-01

    This paper deals with the application of quantitative soft computing prediction models in the financial area, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is supposed to enhance the outputs of the network using the error part of the original neural network. The authors test the suggested model on high-frequency time series data of USD/CAD and examine the ability to forecast exchange rate values for the horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model with autoregressive models and the standard neural network. They also incorporate a genetic algorithm as an optimization technique for adapting the parameters of the ANN, which is then compared with standard backpropagation and backpropagation combined with the K-means clustering algorithm. Finally, the authors find that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can help eliminate the risk of making bad decisions in the decision-making process.
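
    The hybrid described above combines an RBF network with a genetic algorithm and a moving-average error correction. The sketch below reproduces the general shape of that idea on a synthetic series: K-means picks the RBF centres, a simple width search stands in for the genetic algorithm, and a moving average of recent residuals corrects the forecasts. Everything here is an illustrative simplification, not the authors' model.

```python
# Illustrative simplification of the hybrid model: an RBF network (K-means
# centres + linear read-out) whose forecasts are corrected by a moving
# average of recent residuals.  A small width search stands in for the
# genetic algorithm; the series is synthetic, not USD/CAD data.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
rate = 1.3 + 0.05 * np.sin(np.arange(400) / 15.0) + 0.01 * rng.standard_normal(400)

lag = 5                                   # predict rate[t] from the 5 previous values
X = np.array([rate[t - lag:t] for t in range(lag, len(rate))])
y = rate[lag:]
X_tr, X_val, y_tr, y_val = X[:300], X[300:], y[:300], y[300:]

def rbf_design(X, centres, width):
    d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
    return np.exp(-(d / width) ** 2)

centres = KMeans(n_clusters=12, n_init=10, random_state=0).fit(X_tr).cluster_centers_

best = None
for width in (0.05, 0.1, 0.2, 0.5):       # stand-in for the GA parameter search
    w, *_ = np.linalg.lstsq(rbf_design(X_tr, centres, width), y_tr, rcond=None)
    mse = np.mean((rbf_design(X_val, centres, width) @ w - y_val) ** 2)
    if best is None or mse < best[0]:
        best = (mse, width, w)

_, width, w = best
pred = rbf_design(X_val, centres, width) @ w

# Moving-average correction: add the mean of the last k known residuals.
k = 5
resid = y_val - pred
corrected = pred.copy()
for t in range(k, len(pred)):
    corrected[t] = pred[t] + resid[t - k:t].mean()

print("validation MSE, plain RBF    :", float(np.mean((pred - y_val) ** 2)))
print("validation MSE, RBF + MA term:", float(np.mean((corrected - y_val) ** 2)))
```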

  7. Computational Genetic Regulatory Networks Evolvable, Self-organizing Systems

    CERN Document Server

    Knabe, Johannes F

    2013-01-01

    Genetic Regulatory Networks (GRNs) in biological organisms are primary engines for cells to enact their engagements with environments, via incessant, continually active coupling. In differentiated multicellular organisms, tremendous complexity has arisen in the course of evolution of life on earth. Engineering and science have so far achieved no working system that can compare with this complexity, depth and scope of organization. Abstracting the dynamics of genetic regulatory control to a computational framework in which artificial GRNs in artificial simulated cells differentiate while connected in a changing topology, it is possible to apply Darwinian evolution in silico to study the capacity of such developmental/differentiated GRNs to evolve. In this volume an evolutionary GRN paradigm is investigated for its evolvability and robustness in models of biological clocks, in simple differentiated multicellularity, and in evolving artificial developing 'organisms' which grow and express an ontogeny starting fr...

  8. Computers and networks in the age of globalization

    DEFF Research Database (Denmark)

    Bloch Rasmussen, Leif; Beardon, Colin; Munari, Silvio

    In modernity, an individual identity was constituted from civil society, while in a globalized network society, human identity, if it develops at all, must grow from communal resistance. A communal resistance to an abstract conceptualized world, where there is no possibility for perception and experience of power and therefore no possibility for human choice and action, is of utmost importance for the constituting of human choosers and actors. This book therefore focuses on those human choosers and actors wishing to read and enjoy the papers as they are actually perceiving and experiencing... The papers were presented at a conference organized by the International Federation for Information Processing (IFIP) and held in Geneva, Switzerland in August 1998. Since the first HCC conference in 1974, IFIP's Technical Committee 9 has endeavoured to set the agenda for human choices and human actions vis-a-vis computers.

  9. Topology Analysis of Social Networks Extracted from Literature

    Science.gov (United States)

    2015-01-01

    In a world where complex networks are an increasingly important part of science, it is interesting to question how the new reading of social realities they provide applies to our cultural background and in particular, popular culture. Are authors of successful novels able to reproduce social networks faithful to the ones found in reality? Is there any common trend connecting an author’s oeuvre, or a genre of fiction? Such an analysis could provide new insight on how we, as a culture, perceive human interactions and consume media. The purpose of the work presented in this paper is to define the signature of a novel’s story based on the topological analysis of its social network of characters. For this purpose, an automated tool was built that analyses the dialogs in novels, identifies characters and computes their relationships in a time-dependent manner in order to assess the network’s evolution over the course of the story. PMID:26039072

  10. Analysis of complex networks from biology to linguistics

    CERN Document Server

    Dehmer, Matthias

    2009-01-01

    Mathematical problems such as graph theory problems are of increasing importance for the analysis of modelling data in biomedical research such as in systems biology, neuronal network modelling etc. This book follows a new approach of including graph theory from a mathematical perspective with specific applications of graph theory in biomedical and computational sciences. The book is written by renowned experts in the field and offers valuable background information for a wide audience.

  11. Dynamic network-based epistasis analysis: Boolean examples

    Directory of Open Access Journals (Sweden)

    Eugenio eAzpeitia

    2011-12-01

    Full Text Available In this review we focus on how the hierarchical and single-path assumptions of epistasis analysis can bias the topologies of gene interactions inferred. This has been acknowledged in several previous papers and reviews, but here we emphasize the critical importance of dynamic analyses, and specifically illustrate the use of Boolean network models. Epistasis in a broad sense refers to gene interactions; however, as originally proposed by Bateson (herein, classical epistasis), it is defined as the blocking of a particular allelic effect due to the effect of another allele at a different locus. Classical epistasis analysis has proven powerful and useful, allowing researchers to infer and assign directionality to gene interactions. As larger data sets are becoming available, the analysis of classical epistasis is being complemented with computer science tools and systems biology approaches. We show that when the hierarchical and single-path assumptions are not met in classical epistasis analysis, the access to relevant information and the correct gene interaction topologies are hindered, and it becomes necessary to consider the temporal dynamics of gene interactions. The use of dynamical networks can overcome these limitations. We particularly focus on the use of Boolean networks that, like classical epistasis analysis, rely on logical formalisms, and hence can complement classical epistasis analysis and relax its assumptions. We develop a couple of theoretical examples and analyze them from a dynamic Boolean network model perspective. Boolean networks could help to guide additional experiments and discern among alternative regulatory schemes that would be impossible or difficult to infer without the elimination of these assumptions from the classical epistasis analysis. We also use examples from the literature to show how a Boolean network-based approach has resolved ambiguities and guided epistasis analysis. Our review complements previous accounts, not
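
    To make the dynamic perspective above concrete, the toy sketch below simulates a three-gene Boolean network for the wild type and for single and double loss-of-function mutants, then compares their steady states, which is the kind of comparison classical epistasis analysis performs on phenotypes. The wiring and update rules are invented for illustration and are not taken from the review.

```python
# Toy Boolean network (invented wiring): compare the steady states of the
# wild type and of single/double loss-of-function mutants, as a dynamic
# counterpart to classical epistasis analysis.

def step(state, knockouts=()):
    a, b, c = state
    nxt = {
        "A": 1,                        # A: constitutively active input
        "B": a,                        # B: activated by A
        "C": int(a and not b),         # C: activated by A, repressed by B
    }
    for gene in knockouts:
        nxt[gene] = 0                  # loss-of-function mutant
    return (nxt["A"], nxt["B"], nxt["C"])

def attractor(knockouts=()):
    state, seen = (0, 0, 0), []
    while state not in seen:
        seen.append(state)
        state = step(state, knockouts)
    return state                       # fixed point (or re-entry point of a cycle)

# The double mutant resembles the A mutant for gene C, i.e. A is epistatic to B.
for ko in [(), ("A",), ("B",), ("A", "B")]:
    print("knockout", ko or "none", "-> steady state (A, B, C):", attractor(ko))
```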

  12. Hydraulic Analysis of Water Distribution Network Using Shuffled Complex Evolution

    Directory of Open Access Journals (Sweden)

    Naser Moosavian

    2014-01-01

    Full Text Available Hydraulic analysis of water distribution networks is an important problem in civil engineering. A widely used approach in steady-state analysis of water distribution networks is the global gradient algorithm (GGA). However, when the GGA is applied to solve these networks, zero flows cause a computation failure. On the other hand, there are different mathematical formulations for hydraulic analysis under pressure-driven demand and leakage simulation. This paper introduces an optimization model for the hydraulic analysis of water distribution networks using a metaheuristic method called the shuffled complex evolution (SCE) algorithm. In this method, applying if-then rules in the optimization model is a simple way of handling pressure-driven demand and leakage simulation, and there is no need for an initial solution vector, which in many other procedures must be chosen carefully if numerical convergence is to be achieved. The overall results indicate that the proposed method is capable of handling various pipe network problems without changes to the model or mathematical formulation. Application of SCE to the optimization model can lead to accurate solutions in pipes with zero flows. Finally, it can be concluded that the proposed method is a suitable alternative optimizer, challenging other methods especially in terms of accuracy.
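
    The optimization formulation described above can be illustrated on a minimal example: drive the continuity and loop head-loss residuals of a tiny two-pipe network to zero with a population-based optimizer. SciPy's differential evolution is used below purely as a stand-in for the SCE algorithm of the paper, and the demand and pipe constants are invented.

```python
# Steady-state hydraulics cast as optimisation: drive the continuity and
# loop head-loss residuals of two parallel pipes to zero.  SciPy's
# differential evolution is only a stand-in for the paper's SCE algorithm;
# demand and pipe constants are invented.
import numpy as np
from scipy.optimize import differential_evolution

demand = 0.12             # m^3/s drawn at the downstream node
k1, k2 = 800.0, 300.0     # head-loss coefficients, h = k * q * |q|

def residual(q):
    q1, q2 = q
    continuity = q1 + q2 - demand                   # supplied flow meets demand
    loop = k1 * q1 * abs(q1) - k2 * q2 * abs(q2)    # equal head loss around the loop
    return continuity ** 2 + loop ** 2

result = differential_evolution(residual, bounds=[(0.0, 0.2), (0.0, 0.2)],
                                seed=1, tol=1e-12)
q1, q2 = result.x
print("pipe flows (m^3/s):", round(q1, 4), round(q2, 4))
print("head losses (m)   :", round(k1 * q1 ** 2, 3), round(k2 * q2 ** 2, 3))
```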

  13. Network Anomaly Detection Based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Ali A. Ghorbani

    2008-11-01

    Full Text Available Signal processing techniques have been applied recently for analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining the wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network where five attack types are successfully detected from over 30 million flows.
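
    As a much-simplified illustration of the wavelet component above, the sketch below decomposes a single synthetic traffic feature with PyWavelets and flags samples whose finest-scale detail coefficients exceed a robust threshold. The paper's detector additionally uses system identification and fifteen features; the data and threshold here are illustrative.

```python
# Simplified wavelet-based anomaly flagging on one synthetic traffic feature
# (the paper additionally uses system identification and fifteen features).
import numpy as np
import pywt

rng = np.random.default_rng(0)
flows = 100 + 5 * rng.standard_normal(1024)
flows[600:620] += 60                       # injected burst standing in for an attack

coeffs = pywt.wavedec(flows, "db4", level=4)
detail = coeffs[-1]                        # finest-scale detail coefficients

# Robust threshold from the median absolute deviation of the details.
sigma = np.median(np.abs(detail)) / 0.6745
anomalous = np.where(np.abs(detail) > 5 * sigma)[0]

# Each finest-scale coefficient covers roughly two samples of the series.
print("flagged sample positions (approx.):", sorted(set(2 * anomalous)))
```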

  14. Integration of a network aware traffic generation device into a computer network emulation platform

    CSIR Research Space (South Africa)

    Von Solms, S

    2014-07-01

    Full Text Available Flexible, open source network emulation tools can provide network researchers with significant benefits regarding network behaviour and performance. The evaluation of these networks can benefit greatly from the integration of realistic, network...

  15. Trimming of mammalian transcriptional networks using network component analysis

    Directory of Open Access Journals (Sweden)

    Liao James C

    2010-10-01

    Full Text Available Abstract Background Network Component Analysis (NCA) has been used to deduce the activities of transcription factors (TFs) from gene expression data and the TF-gene binding relationship. However, the TF-gene interaction varies in different environmental conditions and tissues, but such information is rarely available and cannot be predicted simply by motif analysis. Thus, it is beneficial to identify key TF-gene interactions under the experimental condition based on transcriptome data. Such information would be useful in identifying key regulatory pathways and gene markers of TFs in further studies. Results We developed an algorithm to trim network connectivity such that the important regulatory interactions between the TFs and the genes were retained and the regulatory signals were deduced. Theoretical studies demonstrated that the regulatory signals were accurately reconstructed even in the case where only three independent transcriptome datasets were available. At least 80% of the main target genes were correctly predicted in the extreme condition of high noise level and small number of datasets. Our algorithm was tested with transcriptome data taken from mice under rapamycin treatment. The initial network topology from the literature contains 70 TFs, 778 genes, and 1423 edges between the TFs and genes. Our method retained 1074 edges (i.e., 75% of the original edge number) and identified 17 TFs as being significantly perturbed under the experimental condition. Twelve of these TFs are involved in MAPK signaling or myeloid leukemia pathways defined in the KEGG database, or are known to physically interact with each other. Additionally, four of these TFs, which are Hif1a, Cebpb, Nfkb1, and Atf1, are known targets of rapamycin. Furthermore, the trimmed network was able to predict Eno1 as an important target of Hif1a; this key interaction could not be detected without trimming the regulatory network. Conclusions The advantage of our new algorithm

  16. Biological neural networks as model systems for designing future parallel processing computers

    Science.gov (United States)

    Ross, Muriel D.

    1991-01-01

    One of the more interesting debates of the present day centers on whether human intelligence can be simulated by computer. The author works under the premise that neurons individually are not smart at all. Rather, they are physical units which are impinged upon continuously by other matter that influences the direction of voltage shifts across the units' membranes. It is only through the action of a great many neurons, billions in the case of the human nervous system, that intelligent behavior emerges. What is required to understand even the simplest neural system is painstaking analysis, bit by bit, of the architecture and the physiological functioning of its various parts. The biological neural networks studied, the vestibular utricular and saccular maculas of the inner ear, are among the simplest of the mammalian neural networks to understand and model. While there is still a long way to go to understand even this most simple neural network in sufficient detail for extrapolation to computers and robots, a start was made. Moreover, the insights obtained and the technologies developed help advance the understanding of the more complex neural networks that underlie human intelligence.

  17. Information Dissemination of Public Health Emergency on Social Networks and Intelligent Computation

    Science.gov (United States)

    Hu, Hongzhi; Mao, Huajuan; Hu, Xiaohua; Hu, Feng; Sun, Xuemin; Jing, Zaiping; Duan, Yunsuo

    2015-01-01

    Due to the extensive social influence, public health emergency has attracted great attention in today's society. The booming social network is becoming a main information dissemination platform of those events and caused high concerns in emergency management, among which a good prediction of information dissemination in social networks is necessary for estimating the event's social impacts and making a proper strategy. However, information dissemination is largely affected by complex interactive activities and group behaviors in social network; the existing methods and models are limited to achieve a satisfactory prediction result due to the open changeable social connections and uncertain information processing behaviors. ACP (artificial societies, computational experiments, and parallel execution) provides an effective way to simulate the real situation. In order to obtain better information dissemination prediction in social networks, this paper proposes an intelligent computation method under the framework of TDF (Theory-Data-Feedback) based on ACP simulation system which was successfully applied to the analysis of A (H1N1) Flu emergency. PMID:26609303

  18. Information Dissemination of Public Health Emergency on Social Networks and Intelligent Computation.

    Science.gov (United States)

    Hu, Hongzhi; Mao, Huajuan; Hu, Xiaohua; Hu, Feng; Sun, Xuemin; Jing, Zaiping; Duan, Yunsuo

    2015-01-01

    Due to the extensive social influence, public health emergency has attracted great attention in today's society. The booming social network is becoming a main information dissemination platform of those events and caused high concerns in emergency management, among which a good prediction of information dissemination in social networks is necessary for estimating the event's social impacts and making a proper strategy. However, information dissemination is largely affected by complex interactive activities and group behaviors in social network; the existing methods and models are limited to achieve a satisfactory prediction result due to the open changeable social connections and uncertain information processing behaviors. ACP (artificial societies, computational experiments, and parallel execution) provides an effective way to simulate the real situation. In order to obtain better information dissemination prediction in social networks, this paper proposes an intelligent computation method under the framework of TDF (Theory-Data-Feedback) based on ACP simulation system which was successfully applied to the analysis of A (H1N1) Flu emergency.

  19. Interactive computer analysis of nuclear backscattering spectra

    Science.gov (United States)

    Saunders, Philip A.; Ziegler, J. F.

    1983-12-01

    A review will be made of a computer-based interactive nuclear backscattering analysis system. Users without computer experience can develop moderate competence with the system after only brief instruction because of the menu-driven organization. Publishable-quality figures can be obtained without any computer expertise. Among the quantities which can be displayed over the data are depth scales for any element, element identification, relative concentrations and theoretical spectra. Captions and titling can be made from a selection of 30 font styles. Lettering is put on the graphs under joy-stick control such that placement is exact without needing complicated commands.

  20. Traffic Flow Prediction Model for Large-Scale Road Network Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Zhaosheng Yang

    2014-01-01

    Full Text Available To increase the efficiency and precision of large-scale road network traffic flow prediction, a genetic algorithm-support vector machine (GA-SVM) model based on cloud computing is proposed in this paper, based on an analysis of the characteristics and defects of the genetic algorithm and the support vector machine. In the cloud computing environment, SVM parameters are first optimized by the parallel genetic algorithm, and then this optimized parallel SVM model is used to predict traffic flow. On the basis of the traffic flow data of Haizhu District in Guangzhou City, the proposed model was verified and compared with the serial GA-SVM model and a parallel GA-SVM model based on MPI (message passing interface). The results demonstrate that the parallel GA-SVM model based on cloud computing has higher prediction accuracy, shorter running time, and higher speedup.
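
    The sketch below shows the core GA-SVM idea on a single machine (no cloud or MPI layer): a tiny genetic algorithm searches SVR hyper-parameters for one-step-ahead traffic-flow prediction on a synthetic series. Population size, mutation scale and the data are illustrative, not the Haizhu District data.

```python
# Single-machine sketch of the GA-SVM idea (no cloud/MPI layer): a tiny
# genetic algorithm tunes SVR hyper-parameters (C, gamma) for one-step
# traffic-flow prediction on a synthetic series.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
flow = 300 + 120 * np.sin(np.arange(500) / 12.0) + 10 * rng.standard_normal(500)

lag = 6
X = np.array([flow[t - lag:t] for t in range(lag, len(flow))])
X = (X - X.mean()) / X.std()              # simple scaling for the RBF kernel
y = flow[lag:]
X_tr, X_val, y_tr, y_val = X[:380], X[380:], y[:380], y[380:]

def fitness(genome):
    C, gamma = np.exp(genome)             # genomes live in log-space
    model = SVR(C=C, gamma=gamma).fit(X_tr, y_tr)
    return -np.mean((model.predict(X_val) - y_val) ** 2)

pop = [rng.uniform([-2.0, -8.0], [6.0, 0.0]) for _ in range(8)]
for _ in range(10):                       # generations
    ranked = sorted(pop, key=fitness, reverse=True)
    parents = ranked[:4]                  # truncation selection
    children = [p + rng.normal(0.0, 0.3, size=2) for p in parents]  # mutation
    pop = parents + children

best = max(pop, key=fitness)
print("best C, gamma:", np.round(np.exp(best), 4), "validation MSE:", round(-fitness(best), 2))
```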

  1. Compartmentalization analysis using discrete fracture network models

    Energy Technology Data Exchange (ETDEWEB)

    La Pointe, P.R.; Eiben, T.; Dershowitz, W. [Golder Associates, Redmond, VA (United States); Wadleigh, E. [Marathon Oil Co., Midland, TX (United States)

    1997-08-01

    This paper illustrates how Discrete Fracture Network (DFN) technology can serve as a basis for the calculation of reservoir engineering parameters for the development of fractured reservoirs. It describes the development of quantitative techniques for defining the geometry and volume of structurally controlled compartments. These techniques are based on a combination of stochastic geometry, computational geometry, and graph theory. The parameters addressed are compartment size, matrix block size and tributary drainage volume. The concept of DFN models is explained and methodologies to compute these parameters are demonstrated.

  2. Network graph analysis of category fluency testing.

    Science.gov (United States)

    Lerner, Alan J; Ogrocki, Paula K; Thomas, Peter J

    2009-03-01

    Category fluency is impaired early in Alzheimer disease (AD). Graph theory is a technique to analyze complex relationships in networks. Features of interest in network analysis include the number of nodes and edges, and variables related to their interconnectedness. Other properties important in network analysis are "small world properties" and "scale-free" properties. The small world property (popularized as the so-called "6 degrees of separation") arises when the majority of connections are local, but a number of connections are to distant nodes. Scale-free networks are characterized by the presence of a few nodes with many connections, and many more nodes with fewer connections. To determine if category fluency data can be analyzed using graph theory. To compare normal elderly, mild cognitive impairment (MCI) and AD network graphs, and characterize changes seen with increasing cognitive impairment. Category fluency results ("animals" recorded over 60 s) from normals (n=38), MCI (n=33), and AD (n=40) completing uniform data set evaluations were converted to network graphs of all unique cooccurring neighbors, and compared for network variables. For Normal, MCI and AD, mean clustering coefficients were 0.21, 0.22, 0.30; characteristic path lengths were 3.27, 3.17, and 2.65; small world properties decreased with increasing cognitive impairment, and all graphs showed scale-free properties. Rank correlations of the 25 commonest items ranged from 0.75 to 0.83. Filtering of low-degree nodes in normal and MCI graphs resulted in properties similar to the AD network graph. Network graph analysis is a promising technique for analyzing changes in category fluency. Our technique results in nonrandom graphs consistent with well-characterized properties for these types of graphs.
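
    The graph construction described above can be reproduced with a few lines of NetworkX: each fluency list contributes edges between words produced as neighbours, and the clustering coefficient and characteristic path length are then read off. The word lists below are invented for illustration.

```python
# Illustrative reconstruction of the graph-building step with invented
# fluency lists: edges connect items produced as neighbours; clustering
# coefficient and characteristic path length follow directly.
import networkx as nx

fluency_lists = [
    ["dog", "cat", "horse", "cow", "pig", "sheep"],
    ["cat", "dog", "lion", "tiger", "elephant"],
    ["cow", "horse", "donkey", "goat", "sheep", "pig"],
    ["lion", "tiger", "leopard", "cheetah", "cat"],
]

G = nx.Graph()
for words in fluency_lists:
    G.add_edges_from(zip(words, words[1:]))   # unique co-occurring neighbours

giant = G.subgraph(max(nx.connected_components(G), key=len))
print("nodes / edges              :", G.number_of_nodes(), "/", G.number_of_edges())
print("mean clustering coefficient:", round(nx.average_clustering(G), 3))
print("characteristic path length :", round(nx.average_shortest_path_length(giant), 3))
```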

  3. Characterization of physiological networks in sleep apnea patients using artificial neural networks for Granger causality computation

    Science.gov (United States)

    Cárdenas, Jhon; Orjuela-Cañón, Alvaro D.; Cerquera, Alexander; Ravelo, Antonio

    2017-11-01

    Different studies have used Transfer Entropy (TE) and Granger Causality (GC) computation to quantify the interconnection between physiological systems. These methods have disadvantages in parametrization and in the availability of analytic formulas to evaluate the significance of the results. Another inconvenience is related to the assumptions about the distribution of the models generated from the data. In this document, the authors present a way to measure the causality that connects the Central Nervous System (CNS) and the Cardiac System (CS) in people diagnosed with obstructive sleep apnea syndrome (OSA) before and during treatment with continuous positive airway pressure (CPAP). For this purpose, artificial neural networks were used to obtain models for GC computation, based on time series of normalized powers calculated from electrocardiography (EKG) and electroencephalography (EEG) signals recorded in polysomnography (PSG) studies.
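
    A hedged sketch of the underlying idea follows: X "Granger-causes" Y if adding X's past to Y's past reduces the prediction error of a (here, nonlinear) regressor. Small scikit-learn MLPs stand in for the authors' networks, and the coupled series are synthetic rather than EEG/EKG power series.

```python
# Sketch of nonlinear Granger causality: X "Granger-causes" Y if adding X's
# past to Y's past lowers the prediction error of a regressor.  Small MLPs
# stand in for the authors' networks; the coupled series are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n, lag = 1200, 3
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 2] + 0.1 * rng.standard_normal()

rows = range(lag, n)
past_y = np.array([y[t - lag:t] for t in rows])
past_xy = np.array([np.r_[y[t - lag:t], x[t - lag:t]] for t in rows])
target = y[lag:]

def holdout_error(features, split=900):
    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=3000, random_state=0)
    model.fit(features[:split], target[:split])
    return np.mean((model.predict(features[split:]) - target[split:]) ** 2)

err_restricted = holdout_error(past_y)    # Y predicted from its own past only
err_full = holdout_error(past_xy)         # Y predicted from Y's and X's past
print("GC index x->y (log error ratio):", round(float(np.log(err_restricted / err_full)), 3))
```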

  4. Performance Analysis of 3G Communication Network

    Directory of Open Access Journals (Sweden)

    Toni Anwar

    2013-09-01

    Full Text Available In this project, research on third generation (3G) technologies was carried out to design and optimize conditions for a 3G network. The 3G wireless mobile communication networks are growing at an ever faster rate, and this is likely to continue in the foreseeable future. Some services, such as e-mail and web browsing, allow the transition of the network from circuit-switched to packet-switched operation, resulting in increased overall network performance. Higher reliability, better coverage and services, higher capacity, mobility management, and wireless multimedia are all parts of the network performance. Throughput and spectral efficiency are fundamental parameters in capacity planning for 3G cellular network deployments. This project also investigates the downlink (DL) and uplink (UL) throughput and spectral efficiency performance of the standard Universal Mobile Telecommunications System (UMTS) for different user scenarios and different technologies. A power consumption comparison for different mobile technologies is also discussed. The analysis can significantly help system engineers to obtain crucial performance characteristics of a 3G network. At the end of the paper, the 3G coverage area of one of the mobile networks in Malaysia is presented.

  5. Kinetic analysis of complex metabolic networks

    Energy Technology Data Exchange (ETDEWEB)

    Stephanopoulos, G. [MIT, Cambridge, MA (United States)

    1996-12-31

    A new methodology is presented for the analysis of complex metabolic networks with the goal of metabolite overproduction. The objective is to locate a small number of reaction steps in a network that have maximum impact on network flux amplification and whose rate can also be increased without functional network derangement. This method extends the concepts of Metabolic Control Analysis to groups of reactions and offers the means for calculating group control coefficients as measures of the control exercised by groups of reactions on the overall network fluxes and intracellular metabolite pools. It is further demonstrated that the optimal strategy for the effective increase of network fluxes, while maintaining an uninterrupted supply of intermediate metabolites, is through the coordinated amplification of multiple (as opposed to a single) reaction steps. Satisfying this requirement invokes the concept of the concentration control coefficient, which emerges as a critical parameter in the identification of feasible enzymatic modifications with maximal impact on the network flux. A case study of aromatic amino acid production is provided to illustrate these concepts.

  6. Providing full point-to-point communications among compute nodes of an operational group in a global combining network of a parallel computer

    Energy Technology Data Exchange (ETDEWEB)

    Archer, Charles J.; Faraj, Daniel A.; Inglett, Todd A.; Ratterman, Joseph D.

    2018-01-30

    Methods, apparatus, and products are disclosed for providing full point-to-point communications among compute nodes of an operational group in a global combining network of a parallel computer, each compute node connected to each adjacent compute node in the global combining network through a link, that include: receiving a network packet in a compute node, the network packet specifying a destination compute node; selecting, in dependence upon the destination compute node, at least one of the links for the compute node along which to forward the network packet toward the destination compute node; and forwarding the network packet along the selected link to the adjacent compute node connected to the compute node through the selected link.

  7. A New Generation of Networks and Computing Models for High Energy Physics in the LHC Era

    Science.gov (United States)

    Newman, H.

    2011-12-01

    Wide area networks of increasing end-to-end capacity and capability are vital for every phase of high energy physicists' work. Our bandwidth usage, and the typical capacity of the major national backbones and intercontinental links used by our field have progressed by a factor of several hundred times over the past decade. With the opening of the LHC era in 2009-10 and the prospects for discoveries in the upcoming LHC run, the outlook is for a continuation or an acceleration of these trends using next generation networks over the next few years. Responding to the need to rapidly distribute and access datasets of tens to hundreds of terabytes drawn from multi-petabyte data stores, high energy physicists working with network engineers and computer scientists are learning to use long range networks effectively on an increasing scale, and aggregate flows reaching the 100 Gbps range have been observed. The progress of the LHC, and the unprecedented ability of the experiments to produce results rapidly using worldwide distributed data processing and analysis has sparked major, emerging changes in the LHC Computing Models, which are moving from the classic hierarchical model designed a decade ago to more agile peer-to-peer-like models that make more effective use of the resources at Tier2 and Tier3 sites located throughout the world. A new requirements working group has gauged the needs of Tier2 centers, and charged the LHCOPN group that runs the network interconnecting the LHC Tier1s with designing a new architecture interconnecting the Tier2s. As seen from the perspective of ICFA's Standing Committee on Inter-regional Connectivity (SCIC), the Digital Divide that separates physicists in several regions of the developing world from those in the developed world remains acute, although many countries have made major advances through the rapid installation of modern network infrastructures. A case in point is Africa, where a new round of undersea cables promises to transform

  8. Exploration Knowledge Sharing Networks Using Social Network Analysis Methods

    Directory of Open Access Journals (Sweden)

    Győző Attila Szilágyi

    2017-10-01

    Full Text Available Knowledge sharing within an organization is one of the key factors for success. An organization where knowledge sharing takes place faster and more efficiently is able to adapt to changes in the market environment more successfully, and as a result it may obtain a competitive advantage. Knowledge sharing in an organization is carried out through formal and informal human communication contacts during work. This forms a multi-level complex network whose quantitative and topological characteristics largely determine how quickly and to what extent knowledge travels within the organization. Through a case study, this paper presents how different knowledge sharing networks in an organization can be explored by means of network analysis methods, and which role the properties of these networks play in the fast and sufficient spread of knowledge in organizations. The study also demonstrates the practical applications of our research results. Namely, on the basis of knowledge sharing, educational strategies can be developed in an organization, and further, the competitiveness of an organization may increase due to the application of those strategies.

  9. Using Granular-Evidence-Based Adaptive Networks for Sensitivity Analysis

    OpenAIRE

    Vališevskis, A.

    2002-01-01

    This paper considers the possibility of using adaptive networks for sensitivity analysis. An adaptive network that processes fuzzy granules is described. The adaptive network training algorithm can be used for sensitivity analysis of decision-making models. Furthermore, a case study concerning sensitivity analysis is described, which shows how the adaptive network can be used for sensitivity analysis.

  10. A Study of Complex Deep Learning Networks on High Performance, Neuromorphic, and Quantum Computers

    Energy Technology Data Exchange (ETDEWEB)

    Potok, Thomas E [ORNL; Schuman, Catherine D [ORNL; Young, Steven R [ORNL; Patton, Robert M [ORNL; Spedalieri, Federico [University of Southern California, Information Sciences Institute; Liu, Jeremy [University of Southern California, Information Sciences Institute; Yao, Ke-Thia [University of Southern California, Information Sciences Institute; Rose, Garrett [University of Tennessee (UT); Chakma, Gangotree [University of Tennessee (UT)

    2016-01-01

    Current Deep Learning models use highly optimized convolutional neural networks (CNN) trained on large graphical processing units (GPU)-based computers with a fairly simple layered network topology, i.e., highly connected layers, without intra-layer connections. Complex topologies have been proposed, but are intractable to train on current systems. Building the topologies of the deep learning network requires hand tuning, and implementing the network in hardware is expensive in both cost and power. In this paper, we evaluate deep learning models using three different computing architectures to address these problems: quantum computing to train complex topologies, high performance computing (HPC) to automatically determine network topology, and neuromorphic computing for a low-power hardware implementation. Due to input size limitations of current quantum computers we use the MNIST dataset for our evaluation. The results show the possibility of using the three architectures in tandem to explore complex deep learning networks that are untrainable using a von Neumann architecture. We show that a quantum computer can find high quality values of intra-layer connections and weights, while yielding a tractable time result as the complexity of the network increases; a high performance computer can find optimal layer-based topologies; and a neuromorphic computer can represent the complex topology and weights derived from the other architectures in low power memristive hardware. This represents a new capability that is not feasible with current von Neumann architecture. It potentially enables the ability to solve very complicated problems unsolvable with current computing technologies.

  11. Green pathways: Metabolic network analysis of plant systems.

    Science.gov (United States)

    Dersch, Lisa Maria; Beckers, Veronique; Wittmann, Christoph

    2016-03-01

    Metabolic engineering of plants with enhanced crop yield and value-added compositional traits is particularly challenging, as plants probably exhibit the highest metabolic network complexity of all living organisms. Therefore, approaches to plant metabolic network analysis, which can provide systems-level understanding of plant physiology, appear valuable as guidance for plant metabolic engineers. Strongly supported by the sequencing of plant genomes, a number of different experimental and computational methods have emerged in recent years to study plant systems at various levels: from heterotrophic cell cultures to autotrophic entire plants. The present review presents a state-of-the-art toolbox for plant metabolic network analysis. Among the described approaches are different in silico modeling techniques, including flux balance analysis, elementary flux mode analysis and kinetic flux profiling, as well as different variants of experiments with plant systems which use radioactive and stable isotopes to determine in vivo plant metabolic fluxes. The fundamental principles of these techniques, the required data input and the obtained flux information are enriched by technical advice specific to plants. In addition, pioneering and high-impact findings of plant metabolic network analysis highlight the potential of the field. Copyright © 2015 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.
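
    Of the in silico techniques listed above, flux balance analysis is the most compact to illustrate: maximise a biomass flux subject to the steady-state mass balance S v = 0 and flux bounds, solved as a linear programme. The four-reaction toy network below is invented for illustration and is not taken from the review.

```python
# Minimal flux balance analysis: maximise a biomass flux subject to the
# steady-state mass balance S v = 0 and flux bounds, as a linear programme.
# The four-reaction toy network is invented for illustration.
import numpy as np
from scipy.optimize import linprog

# Rows: internal metabolites A, B, C.  Columns: reactions
#   R1: uptake -> A,  R2: A -> B,  R3: A -> C,  R4: B + C -> biomass
S = np.array([
    [ 1, -1, -1,  0],   # A
    [ 0,  1,  0, -1],   # B
    [ 0,  0,  1, -1],   # C
])

bounds = [(0, 10), (0, None), (0, None), (0, None)]   # uptake capped at 10
c = [0, 0, 0, -1]                                     # maximise v4 (biomass)

res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds)
print("optimal fluxes v1..v4:", np.round(res.x, 3))
print("maximal biomass flux :", round(-res.fun, 3))
```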

  12. Computer-aided-engineering system for modeling and analysis of ECLSS integration testing

    Science.gov (United States)

    Sepahban, Sonbol

    1987-01-01

    The accurate modeling and analysis of two-phase fluid networks found in environmental control and life support systems is presently undertaken by computer-aided engineering (CAE) techniques whose generalized fluid dynamics package can solve arbitrary flow networks. The CAE system for integrated test bed modeling and analysis will also furnish interfaces and subsystem/test-article mathematical models. Three-dimensional diagrams of the test bed are generated by the system after performing the requisite simulation and analysis.

  13. Locating hardware faults in a data communications network of a parallel computer

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-01-12

    Hardware fault location in a data communications network of a parallel computer. Such a parallel computer includes a plurality of compute nodes and a data communications network that couples the compute nodes for data communications and organizes the compute nodes as a tree. Locating hardware faults includes identifying a next compute node as a parent node and a root of a parent test tree, identifying for each child compute node of the parent node a child test tree having the child compute node as root, running a same test suite on the parent test tree and each child test tree, and identifying the parent compute node as having a defective link connected from the parent compute node to a child compute node if the test suite fails on the parent test tree and succeeds on all the child test trees.
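
    The test-tree idea in this record can be illustrated with a small sketch. The following Python fragment is not the patented implementation; the tree representation and the run_test_suite callback are hypothetical stand-ins for the actual diagnostic suite.

        def parent_has_defective_link(tree, parent, run_test_suite):
            """Return True if the test suite fails on the parent test tree but
            succeeds on every child test tree, implying a defective link from
            the parent to one of its children (tree: dict node -> children)."""
            def subtree(root):
                nodes, stack = set(), [root]
                while stack:
                    n = stack.pop()
                    nodes.add(n)
                    stack.extend(tree.get(n, []))
                return nodes

            parent_ok = run_test_suite(subtree(parent))
            children_ok = all(run_test_suite(subtree(c)) for c in tree.get(parent, []))
            return (not parent_ok) and children_ok

        # Toy usage: pretend the link A -> C is broken.
        tree = {"A": ["B", "C"], "B": ["D"], "C": []}
        def run_test_suite(nodes):
            return not {"A", "C"} <= nodes    # fails only when A and C are both exercised
        print(parent_has_defective_link(tree, "A", run_test_suite))   # True
        print(parent_has_defective_link(tree, "B", run_test_suite))   # False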

  14. The Handicap Principle for Trust in Computer Security, the Semantic Web and Social Networking

    Science.gov (United States)

    Ma, Zhanshan (Sam); Krings, Axel W.; Hung, Chih-Cheng

    computer science, especially secure and resilient computing, the semantic web, and social networking. One important thread unifying the three aspects is the evolutionary game theory modeling or its extensions with survival analysis and agreement algorithms [19][20], which offer powerful game models for describing time-, space-, and covariate-dependent frailty (uncertainty and vulnerability) and deception (honesty).

  15. The Design and Analysis of Virtual Network Configuration for a Wireless Mobile ATM Network

    Science.gov (United States)

    Bush, Stephen F.

    1999-05-01

    This research concentrates on the design and analysis of an algorithm referred to as Virtual Network Configuration (VNC) which uses predicted future states of a system for faster network configuration and management. VNC is applied to the configuration of a wireless mobile ATM network. VNC is built on techniques from parallel discrete event simulation merged with constraints from real-time systems and applied to mobile ATM configuration and handoff. Configuration in a mobile network is a dynamic and continuous process. Factors such as load, distance, capacity and topology are all constantly changing in a mobile environment. The VNC algorithm anticipates configuration changes and speeds the reconfiguration process by pre-computing and caching results. VNC propagates local prediction results throughout the VNC enhanced system. The Global Positioning System is an enabling technology for the use of VNC in mobile networks because it provides location information and accurate time for each node. This research has resulted in well-defined structures for the encapsulation of physical processes within Logical Processes and a generic library for enhancing a system with VNC. Enhancing an existing system with VNC is straightforward assuming the existing physical processes do not have side effects. The benefit of prediction is gained at the cost of additional traffic and processing. This research includes an analysis of VNC and suggestions for optimization of the VNC algorithm and its parameters.

  16. THE BLENDED LEARNING ACCOMPLISHMENT OF COMPUTER AND NETWORK ENGINEERING EXPERTISE PROGRAM IN VOCATIONAL SCHOOLS

    Directory of Open Access Journals (Sweden)

    Aries Alfian Prasetyo

    2016-10-01

    This study aims to (1) describe supporting and inhibiting factors in blended learning implementation for students of the computer and network engineering expertise program and (2) describe the accomplishment level of the implementation. This study is designed as a descriptive study with a quantitative approach. The research object is the blended learning implementation in the computer and network engineering expertise program at SMK N 1 Baureno Bojonegoro. The research subjects consist of teachers, facilities, materials and applications, and students in the blended learning implementation process. The data were collected using observation, surveys and interviews, and analyzed using percentages and classification analysis. The results reveal that blended learning has been appropriately implemented. This is shown by the analysis of supporting and inhibiting factors, including facilities, teachers’ skill, materials and applications, and blended learning accomplishment. The result is also supported by the description of blended learning activity, the use of facilities, blended learning composition and the impact of implementing blended learning. The weaknesses in the implementation process are the low quantity and quality of personal computers and an inadequate internet connection. Teachers and school boards are expected to work collaboratively to solve these problems so that the implementation of blended learning can be maximized.

  17. Structural parameter identifiability analysis for dynamic reaction networks

    DEFF Research Database (Denmark)

    Davidescu, Florin Paul; Jørgensen, Sten Bay

    2008-01-01

    where for a given set of measured variables it is desirable to investigate which parameters may be estimated prior to spending computational effort on the actual estimation. This contribution addresses the structural parameter identifiability problem for the typical case of reaction network models....... The proposed analysis is performed in two phases. The first phase determines the structurally identifiable reaction rates based on reaction network stoichiometry. The second phase assesses the structural parameter identifiability of the specific kinetic rate expressions using a generating series expansion...... method based on Lie derivatives. The proposed systematic two phase methodology is illustrated on a mass action based model for an enzymatically catalyzed reaction pathway network where only a limited set of variables is measured. The methodology clearly pinpoints the structurally identifiable parameters...
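
    As a rough illustration of the first phase (stoichiometry-based identifiability), the sketch below checks which reaction-rate directions are invisible to a given set of measured species; the 2x3 stoichiometric matrix is a made-up example, and the sketch does not cover the Lie-derivative generating-series phase.

        import sympy as sp

        # With only the measured species, d(x_meas)/dt = N_meas * r, so any rate
        # direction in the null space of N_meas cannot be identified from data.
        N_meas = sp.Matrix([
            [-1, 0, 0],   # measured species consumed by reaction r1
            [ 0, 0, 1],   # measured species produced by reaction r3
        ])

        print("independent identifiable rate combinations:", N_meas.rank())
        for v in N_meas.nullspace():
            print("structurally unidentifiable rate direction:", list(v))   # here: r2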

  18. Network Intelligence Based on Network State Information for Connected Vehicles Utilizing Fog Computing

    Directory of Open Access Journals (Sweden)

    Seongjin Park

    2017-01-01

    This paper proposes a method to take advantage of fog computing and SDN in the connected vehicle environment, where communication channels are unstable and the topology changes frequently. A controller knows the current state of the network by maintaining the most recent network topology. Of all the information collected by the controller in the mobile environment, node mobility information is particularly important. Thus, we divide nodes into three classes according to their mobility types and use their related attributes to efficiently manage the mobile connections. Our approach utilizes mobility information to reduce control message overhead by adjusting the period of beacon messages and to support efficient failure recovery. Two recovery mechanisms are provided: one recovers connection failures using only mobility information, and the other is a real-time scheduling algorithm that recovers services for vehicles that lost their connection in the case of a fog server failure. The real-time scheduling method is first described and then evaluated. The results show that our scheme is effective in the connected vehicle environment. We then demonstrate the reduction of control overhead and the connection recovery using a network simulator. The simulation results show that control message overhead and failure recovery time are decreased by approximately 55% and 5%, respectively.

  19. Energy Research and Development Administration Ad Hoc Computer Networking Group: experimental program

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, I.

    1975-03-19

    The Ad Hoc Computer Networking Group was established to investigate the potential advantages and costs of newer forms of remote resource sharing and computer networking. The areas of research and investigation that are within the scope of the ERDA CNG are described. (GHT)

  20. Toward a Practical Technique to Halt Multiple Virus Outbreaks on Computer Networks

    OpenAIRE

    Hole, Kjell Jørgen

    2012-01-01

    The author analyzes a technique to prevent multiple simultaneous virus epidemics on any vulnerable computer network with inhomogeneous topology. The technique immunizes a small fraction of the computers and utilizes diverse software platforms to halt the virus outbreaks. The halting technique is of practical interest since a network's detailed topology need not be known.

  1. Synchronized Pair Configuration in Virtualization-Based Lab for Learning Computer Networks

    Science.gov (United States)

    Kongcharoen, Chaknarin; Hwang, Wu-Yuin; Ghinea, Gheorghita

    2017-01-01

    More studies are concentrating on using virtualization-based labs to facilitate computer or network learning concepts. Some benefits are lower hardware costs and greater flexibility in reconfiguring computer and network environments. However, few studies have investigated effective mechanisms for using virtualization fully for collaboration.…

  2. Social network analysis of study environment

    Directory of Open Access Journals (Sweden)

    Blaženka Divjak

    2010-06-01

    The student working environment influences student learning and achievement level. In this respect, social aspects of students’ formal and non-formal learning play a special role in the learning environment. The main research problem of this paper is to find out whether students' academic performance influences their position in different student social networks, and to identify other predictors of this position. In the process of problem solving we use Social Network Analysis (SNA), based on data collected from students at the Faculty of Organization and Informatics, University of Zagreb. There are two data samples: the basic sample (N=27) and the extended sample (N=52). We collected data on socio-demographic position, academic performance, learning and motivation styles, student status (full-time/part-time), attitudes towards individual and teamwork, and informal cooperation. Afterwards, five different networks (exchange of learning materials, teamwork, informal communication, basic and aggregated social network) were constructed. These networks were analyzed with different metrics, the most important being betweenness, closeness and degree centrality. The main result is, firstly, that the position in a social network cannot be forecast only by academic success and, secondly, that part-time students tend to form separate groups that are poorly connected with full-time students. In general, the position of a student in the social networks of the study environment can influence student learning as well as her/his future employability, and it is therefore worth investigating.

  3. Tensor Fusion Network for Multimodal Sentiment Analysis

    OpenAIRE

    Zadeh, Amir; Chen, Minghai; Poria, Soujanya; Cambria, Erik; Morency, Louis-Philippe

    2017-01-01

    Multimodal sentiment analysis is an increasingly popular research area, which extends the conventional language-based definition of sentiment analysis to a multimodal setup where other relevant modalities accompany language. In this paper, we pose the problem of multimodal sentiment analysis as modeling intra-modality and inter-modality dynamics. We introduce a novel model, termed Tensor Fusion Network, which learns both such dynamics end-to-end. The proposed approach is tailored for the vola...

  4. Network analysis of eight industrial symbiosis systems

    Science.gov (United States)

    Zhang, Yan; Zheng, Hongmei; Shi, Han; Yu, Xiangyi; Liu, Gengyuan; Su, Meirong; Li, Yating; Chai, Yingying

    2016-06-01

    Industrial symbiosis is the quintessential characteristic of an eco-industrial park. To divide parks into different types, previous studies mostly focused on qualitative judgments, and failed to use metrics to conduct quantitative research on the internal structural or functional characteristics of a park. To analyze a park's structural attributes, a range of metrics from network analysis have been applied, but few researchers have compared two or more symbioses using multiple metrics. In this study, we used two metrics (density and network degree centralization) to compare the degrees of completeness and dependence of eight diverse but representative industrial symbiosis networks. Through the combination of the two metrics, we divided the networks into three types: weak completeness, and two forms of strong completeness, namely "anchor tenant" mutualism and "equality-oriented" mutualism. The results showed that the networks with a weak degree of completeness were sparse and had few connections among nodes; for "anchor tenant" mutualism, the degree of completeness was relatively high, but the affiliated members were too dependent on core members; and the members in "equality-oriented" mutualism had equal roles, with diverse and flexible symbiotic paths. These results revealed some of the systems' internal structure and how different structures influenced the exchanges of materials, energy, and knowledge among members of a system, thereby providing insights into threats that may destabilize the network. Based on this analysis, we provide examples of the advantages and effectiveness of recent improvement projects in a typical Chinese eco-industrial park (Shandong Lubei).
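
    The two metrics used in this record are easy to reproduce on toy graphs. The sketch below (Python with networkx) computes density and Freeman degree centralization; the two example graphs are hypothetical stand-ins for an "anchor tenant" park and an "equality-oriented" park, not the eight parks studied.

        import networkx as nx

        def density_and_centralization(G):
            """Density and Freeman degree centralization of an undirected graph."""
            n = G.number_of_nodes()
            degrees = [d for _, d in G.degree()]
            dmax = max(degrees)
            # Normalized so a star graph attains centralization 1.
            centralization = sum(dmax - d for d in degrees) / ((n - 1) * (n - 2))
            return nx.density(G), centralization

        anchor_tenant = nx.star_graph(6)       # one core firm linked to six affiliates
        equality_oriented = nx.cycle_graph(7)  # symbiotic exchanges form a ring
        print(density_and_centralization(anchor_tenant))      # (0.29, 1.0)
        print(density_and_centralization(equality_oriented))  # (0.33, 0.0)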

  5. Automated Analysis of Security in Networking Systems

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2004-01-01

    It has for a long time been a challenge to build secure networking systems. One way to counter this problem is to provide developers of software applications for networking systems with easy-to-use tools that can check security properties before the applications ever reach the market. These tools...... will both help raise the general level of awareness of the problems and prevent the most basic flaws from occurring. This thesis contributes to the development of such tools. Networking systems typically try to attain secure communication by applying standard cryptographic techniques. In this thesis...... attacks, and attacks launched by insiders. Finally, the perspectives for the application of the analysis techniques are discussed, thereby coming a small step closer to providing developers with easy-to-use tools for validating the security of networking applications....

  6. A statistical analysis of UK financial networks

    Science.gov (United States)

    Chu, J.; Nadarajah, S.

    2017-04-01

    In recent years, with a growing interest in big or large datasets, there has been a rise in the application of large graphs and networks to financial big data. Much of this research has focused on the construction and analysis of the network structure of stock markets, based on the relationships between stock prices. Motivated by Boginski et al. (2005), who studied the characteristics of a network structure of the US stock market, we construct network graphs of the UK stock market using the same method. We fit four distributions to the degree density of the vertices from these graphs: the Pareto I, Fréchet, lognormal, and generalised Pareto distributions, and assess the goodness of fit. Our results show that the degree density of the complements of the market graphs, constructed using a negative threshold value close to zero, can be fitted well with the Fréchet and lognormal distributions.
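
    A fit of the four candidate distributions can be sketched with SciPy as below; the degree data here are synthetic stand-ins, since the UK market graphs themselves are not reproduced in this record, and goodness of fit is summarized with a simple Kolmogorov-Smirnov statistic rather than the authors' exact procedure.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        degrees = rng.lognormal(mean=2.0, sigma=0.8, size=2000)   # synthetic degree data

        candidates = {
            "Pareto I": stats.pareto,
            "Frechet": stats.invweibull,        # the Frechet law is SciPy's invweibull
            "lognormal": stats.lognorm,
            "generalised Pareto": stats.genpareto,
        }
        for name, dist in candidates.items():
            params = dist.fit(degrees)
            ks = stats.kstest(degrees, dist.cdf, args=params).statistic
            print(f"{name:20s} KS statistic = {ks:.3f}")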

  7. Visualization and Analysis of Complex Covert Networks

    DEFF Research Database (Denmark)

    Memon, Bisharat

    This report discusses and summarizes the results of my work so far in relation to my Ph.D. project entitled "Visualization and Analysis of Complex Covert Networks". The focus of my research is primarily on development of methods and supporting tools for visualization and analysis of networked...... systems that are covert and hence inherently complex. My Ph.D. is positioned within the wider framework of the CrimeFighter project. The framework envisions a number of key knowledge management processes that are involved in the workflow, and the toolbox provides supporting tools to assist human end...

  8. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...

  9. Surface computing and collaborative analysis work

    CERN Document Server

    Brown, Judith; Gossage, Stevenson; Hack, Chris

    2013-01-01

    Large surface computing devices (wall-mounted or tabletop) with touch interfaces and their application to collaborative data analysis, an increasingly important and prevalent activity, is the primary topic of this book. Our goals are to outline the fundamentals of surface computing (a still maturing technology), review relevant work on collaborative data analysis, describe frameworks for understanding collaborative processes, and provide a better understanding of the opportunities for research and development. We describe surfaces as display technologies with which people can interact directly, and emphasize how interaction design changes when designing for large surfaces. We review efforts to use large displays, surfaces or mixed display environments to enable collaborative analytic activity. Collaborative analysis is important in many domains, but to provide concrete examples and a specific focus, we frequently consider analysis work in the security domain, and in particular the challenges security personne...

  10. Computer-assisted qualitative data analysis software.

    Science.gov (United States)

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  11. In silico Biochemical Reaction Network Analysis (IBRENA): a package for simulation and analysis of reaction networks.

    Science.gov (United States)

    Liu, Gang; Neelamegham, Sriram

    2008-04-15

    We present In silico Biochemical Reaction Network Analysis (IBRENA), a software package which facilitates multiple functions including cellular reaction network simulation and sensitivity analysis (both forward and adjoint methods), coupled with principal component analysis, singular-value decomposition and model reduction. The software features a graphical user interface that aids simulation and plotting of in silico results. While the primary focus is to aid formulation, testing and reduction of theoretical biochemical reaction networks, the program can also be used for analysis of high-throughput genomic and proteomic data. The software package, manual and examples are available at http://www.eng.buffalo.edu/~neel/ibrena

  12. Finding Multi-step Attacks in Computer Networks using Heuristic Search and Mobile Ambients

    NARCIS (Netherlands)

    Nunes Leal Franqueira, V.

    2009-01-01

    An important aspect of IT security governance is the proactive and continuous identification of possible attacks in computer networks. This is complicated due to the complexity and size of networks, and due to the fact that usually network attacks are performed in several steps. This thesis proposes

  13. Mechanism Aligning (Gateway) Between Operating System Based on Computer Networks in Unocal

    OpenAIRE

    Vera Morina Oktavia Carla; Drs. Ida Ayu Wiastiti, M.KOM

    1997-01-01

    A computer network is a system that allows the sharing of information among users. The use of an operating system in network settings is an absolute must. Replacement of the network operating system should carefully consider the possible impacts. Aligning is one solution for connecting two different systems.

  14. Instrumentation for Scientific Computing in Neural Networks, Information Science, Artificial Intelligence, and Applied Mathematics.

    Science.gov (United States)

    1987-10-01

    Instrumentation for scientific computing in neural networks, information science, artificial intelligence, and applied mathematics: an instrumentation grant to purchase equipment in support of research in neural networks, information science, artificial intelligence, and applied mathematics. Contract AFOSR 86-0282. Principal Investigator: Stephen

  15. S-TSP: a novel routing algorithm for In-network processing of recursive computation in wireless sensor networks

    Science.gov (United States)

    Tang, Tingfang; Guo, Peng; Liu, Xuefeng

    2016-10-01

    In-network processing is an efficient way to reduce the transmission cost in wireless sensor networks (WSNs). The in-network processing of many domain-specific computation tasks in WSNs usually requires losslessly distributing the computation of the tasks across the sensor nodes, which is usually not easy. In this paper we are concerned with the kind of task whose computation can only be partitioned into a recursive computation mode. To distribute the recursive computations into WSNs, it is required to design an appropriate single in-network processing path, along which the intermediate data is forwarded and updated in the WSNs. We address recursive computation with a constant size of computation result, e.g., distributed least square estimation (D-LSE). Finding the optimal in-network processing path to minimize the total transmission cost in WSNs is a new problem that has seldom been studied before. To solve it, we propose a novel routing algorithm called S-TSP, and compare it with some other greedy algorithms. Extensive simulations are conducted, and the results show the good performance of the proposed S-TSP algorithm.
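
    The record does not spell out the S-TSP algorithm itself, so the sketch below only illustrates the underlying idea of choosing a single in-network processing path: a nearest-neighbour heuristic that visits every contributing sensor once while the constant-size intermediate result (as in D-LSE) is carried along; the coordinates and the Euclidean link cost are assumptions.

        import math

        def greedy_processing_path(coords, start):
            """Nearest-neighbour path visiting every sensor once, carrying the
            constant-size intermediate result; coords maps node -> (x, y)."""
            def dist(a, b):
                (x1, y1), (x2, y2) = coords[a], coords[b]
                return math.hypot(x1 - x2, y1 - y2)

            path, remaining, cost = [start], set(coords) - {start}, 0.0
            while remaining:
                nxt = min(remaining, key=lambda n: dist(path[-1], n))
                cost += dist(path[-1], nxt)
                path.append(nxt)
                remaining.remove(nxt)
            return path, cost

        coords = {0: (0, 0), 1: (1, 0), 2: (1, 1), 3: (0, 2)}   # hypothetical sensor layout
        print(greedy_processing_path(coords, start=0))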

  16. Computation for the analysis of designed experiments

    CERN Document Server

    Heiberger, Richard

    2015-01-01

    Addresses the statistical, mathematical, and computational aspects of the construction of packages and analysis of variance (ANOVA) programs. Includes a disk at the back of the book that contains all program codes in four languages, APL, BASIC, C, and FORTRAN. Presents illustrations of the dual space geometry for all designs, including confounded designs.

  17. Interactive Computer Graphics for System Analysis.

    Science.gov (United States)

    1983-12-01

    Fragment: the discussion notes a confusing distortion to the picture, and that George Washington University has documented this problem [38:37] and provided a solution for FORTRAN users. The remainder consists of bibliographic fragments: Analysis. New York: Holt, Rinehart, and Winston, 1974; Moore, M.V. and L.H. Nawrocki, The Educational Effectiveness of Graphic Displays for Computer...

  18. The propagation approach for computing biochemical reaction networks.

    Science.gov (United States)

    Henzinger, Thomas A; Mateescu, Maria

    2013-01-01

    We introduce propagation models (PMs), a formalism able to express several kinds of equations that describe the behavior of biochemical reaction networks. Furthermore, we introduce the propagation abstract data type (PADT), which separates concerns regarding different numerical algorithms for the transient analysis of biochemical reaction networks from concerns regarding their implementation, thus allowing for portable and efficient solutions. The state of a propagation abstract data type is given by a vector that assigns mass values to a set of nodes, and its next operator propagates mass values through this set of nodes. We propose an approximate implementation of the next operator, based on threshold abstraction, which propagates only "significant" mass values and thus achieves a compromise between efficiency and accuracy. Finally, we give three use cases for propagation models: the chemical master equation (CME), the reaction rate equation (RRE), and a hybrid method that combines these two equations. These three applications use propagation models in order to propagate probabilities and/or expected values and variances of the model's variables.
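
    The "next" operator with threshold abstraction can be sketched for a simple birth-death chain as below; this is only an explicit-Euler illustration of propagating probability mass and dropping insignificant states, not the authors' PADT implementation, and the rates are made up.

        def next_probabilities(mass, birth, death, dt, threshold=1e-9):
            """One explicit-Euler CME step for a birth-death chain; states whose
            probability mass drops below `threshold` are discarded."""
            new = {}
            for n, p in mass.items():
                outflow = birth + death * n
                new[n] = new.get(n, 0.0) + p * (1.0 - outflow * dt)
                new[n + 1] = new.get(n + 1, 0.0) + p * birth * dt
                if n > 0:
                    new[n - 1] = new.get(n - 1, 0.0) + p * death * n * dt
            return {n: p for n, p in new.items() if p > threshold}

        mass = {0: 1.0}                                    # start with zero molecules
        for _ in range(1000):
            mass = next_probabilities(mass, birth=1.0, death=0.5, dt=0.01)
        print(sorted(mass.items())[:5])                    # truncated probability vector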

  19. Algorithm-structured computer arrays and networks architectures and processes for images, percepts, models, information

    CERN Document Server

    Uhr, Leonard

    1984-01-01

    Computer Science and Applied Mathematics: Algorithm-Structured Computer Arrays and Networks: Architectures and Processes for Images, Percepts, Models, Information examines the parallel-array, pipeline, and other network multi-computers.This book describes and explores arrays and networks, those built, being designed, or proposed. The problems of developing higher-level languages for systems and designing algorithm, program, data flow, and computer structure are also discussed. This text likewise describes several sequences of successively more general attempts to combine the power of arrays wi

  20. Computing Nash Equilibrium in Wireless Ad Hoc Networks: A Simulation-Based Approach

    Directory of Open Access Journals (Sweden)

    Peter Bulychev

    2012-02-01

    Full Text Available This paper studies the problem of computing Nash equilibrium in wireless networks modeled by Weighted Timed Automata. Such formalism comes together with a logic that can be used to describe complex features such as timed energy constraints. Our contribution is a method for solving this problem using Statistical Model Checking. The method has been implemented in UPPAAL model checker and has been applied to the analysis of Aloha CSMA/CD and IEEE 802.15.4 CSMA/CA protocols.

  1. A New Model for Capturing the Spread of Computer Viruses on Complex-Networks

    Directory of Open Access Journals (Sweden)

    Chunming Zhang

    2013-01-01

    Based on complex networks, this paper proposes a novel computer virus propagation model motivated by the traditional SEIRQ model. A systematic analysis of this new model shows that the virus-free equilibrium is globally asymptotically stable when the basic reproduction number is less than one, and the viral equilibrium is globally attractive when the basic reproduction number is greater than one. Some numerical simulations are finally given to illustrate the main results, implying that these results are applicable for depicting the dynamics of virus propagation.
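
    For illustration only, the sketch below integrates a much simpler homogeneous SEIR-type virus model (not the paper's network-dependent SEIRQ equations) and reports a basic reproduction number; all parameter values are assumptions.

        from scipy.integrate import solve_ivp

        beta, sigma, gamma = 0.4, 0.3, 0.2    # infection, activation, cure rates (assumed)

        def seir(t, y):
            s, e, i, r = y
            return [-beta * s * i,
                    beta * s * i - sigma * e,
                    sigma * e - gamma * i,
                    gamma * i]

        print("basic reproduction number R0 =", beta / gamma)   # > 1: the virus spreads
        sol = solve_ivp(seir, (0, 200), [0.99, 0.0, 0.01, 0.0])
        print("fraction ever infected:", sol.y[3, -1])          # recovered at t = 200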

  2. Honey characterization using computer vision system and artificial neural networks.

    Science.gov (United States)

    Shafiee, Sahameh; Minaei, Saeid; Moghaddam-Charkari, Nasrollah; Barzegar, Mohsen

    2014-09-15

    This paper reports the development of a computer vision system (CVS) for non-destructive characterization of honey based on colour and its correlated chemical attributes including ash content (AC), antioxidant activity (AA), and total phenolic content (TPC). Artificial neural network (ANN) models were applied to transform RGB values of images to CIE L*a*b* colourimetric measurements and to predict AC, TPC and AA from colour features of images. The developed ANN models were able to convert RGB values to CIE L*a*b* colourimetric parameters with low generalization error of 1.01±0.99. In addition, the developed models for prediction of AC, TPC and AA showed high performance based on colour parameters of honey images, as the R² values for prediction were 0.99, 0.98, and 0.87, for AC, AA and TPC, respectively. The experimental results show the effectiveness and possibility of applying CVS for non-destructive honey characterization by the industry. Copyright © 2014 Elsevier Ltd. All rights reserved.
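
    The RGB-to-CIE L*a*b* mapping step can be sketched with a small multi-output neural network as below; the data here are synthetic (generated by an arbitrary smooth map), since the honey images, colour standards and exact ANN architecture of the study are not reproduced in this record.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        rgb = rng.uniform(0, 1, size=(1000, 3))            # synthetic RGB triples
        lab = np.column_stack([
            100 * rgb.mean(axis=1),                        # pseudo L*
            120 * (rgb[:, 0] - rgb[:, 1]),                 # pseudo a*
            120 * (rgb[:, 1] - rgb[:, 2]),                 # pseudo b*
        ])

        X_tr, X_te, y_tr, y_te = train_test_split(rgb, lab, random_state=0)
        model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
        model.fit(X_tr, y_tr)
        print("R^2 on held-out samples:", model.score(X_te, y_te))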

  3. Adaptive Management of Computing and Network Resources for Spacecraft Systems

    Science.gov (United States)

    Pfarr, Barbara; Welch, Lonnie R.; Detter, Ryan; Tjaden, Brett; Huh, Eui-Nam; Szczur, Martha R. (Technical Monitor)

    2000-01-01

    It is likely that NASA's future spacecraft systems will consist of distributed processes which will handle dynamically varying workloads in response to perceived scientific events, the spacecraft environment, spacecraft anomalies and user commands. Since all situations and possible uses of sensors cannot be anticipated during pre-deployment phases, an approach for dynamically adapting the allocation of distributed computational and communication resources is needed. To address this, we are evolving the DeSiDeRaTa adaptive resource management approach to enable reconfigurable ground and space information systems. The DeSiDeRaTa approach embodies a set of middleware mechanisms for adapting resource allocations, and a framework for reasoning about the real-time performance of distributed application systems. The framework and middleware will be extended to accommodate (1) the dynamic aspects of intra-constellation network topologies, and (2) the complete real-time path from the instrument to the user. We are developing a ground-based testbed that will enable NASA to perform early evaluation of adaptive resource management techniques without the expense of first deploying them in space. The benefits of the proposed effort are numerous, including the ability to use sensors in new ways not anticipated at design time; the production of information technology that ties the sensor web together; the accommodation of greater numbers of missions with fewer resources; and the opportunity to leverage the DeSiDeRaTa project's expertise, infrastructure and models for adaptive resource management for distributed real-time systems.

  4. Organizational network analysis for two networks in the Washington State Department of Transportation.

    Science.gov (United States)

    2010-10-01

    Organizational network analysis (ONA) consists of gathering data on information sharing and : connectivity in a group, calculating network measures, creating network maps, and using this : information to analyze and improve the functionality of the g...

  5. Sequential computation of elementary modes and minimal cut sets in genome-scale metabolic networks using alternate integer linear programming

    Energy Technology Data Exchange (ETDEWEB)

    Song, Hyun-Seob; Goldberg, Noam; Mahajan, Ashutosh; Ramkrishna, Doraiswami

    2017-03-27

    Elementary (flux) modes (EMs) have served as a valuable tool for investigating structural and functional properties of metabolic networks. Identification of the full set of EMs in genome-scale networks remains challenging due to combinatorial explosion of EMs in complex networks. It is often, however, that only a small subset of relevant EMs needs to be known, for which optimization-based sequential computation is a useful alternative. Most of the currently available methods along this line are based on the iterative use of mixed integer linear programming (MILP), the effectiveness of which significantly deteriorates as the number of iterations builds up. To alleviate the computational burden associated with the MILP implementation, we here present a novel optimization algorithm termed alternate integer linear programming (AILP). Results: Our algorithm was designed to iteratively solve a pair of integer programming (IP) and linear programming (LP) to compute EMs in a sequential manner. In each step, the IP identifies a minimal subset of reactions, the deletion of which disables all previously identified EMs. Thus, a subsequent LP solution subject to this reaction deletion constraint becomes a distinct EM. In cases where no feasible LP solution is available, IP-derived reaction deletion sets represent minimal cut sets (MCSs). Despite the additional computation of MCSs, AILP achieved significant time reduction in computing EMs by orders of magnitude. The proposed AILP algorithm not only offers a computational advantage in the EM analysis of genome-scale networks, but also improves the understanding of the linkage between EMs and MCSs.

  6. Computational approach in estimating the need of ditch network maintenance

    Science.gov (United States)

    Lauren, Ari; Hökkä, Hannu; Launiainen, Samuli; Palviainen, Marjo; Repo, Tapani; Leena, Finer; Piirainen, Sirpa

    2015-04-01

    Ditch network maintenance (DNM), implemented annually in 70 000 ha area in Finland, is the most controversial of all forest management practices. Nationwide, it is estimated to increase the forest growth by 1…3 million m3 per year, but simultaneously to cause 65 000 tons export of suspended solids and 71 tons of phosphorus (P) to water courses. A systematic approach that allows simultaneous quantification of the positive and negative effects of DNM is required. Excess water in the rooting zone slows the gas exchange and decreases biological activity interfering with the forest growth in boreal forested peatlands. DNM is needed when: 1) the excess water in the rooting zone restricts the forest growth before the DNM, and 2) after the DNM the growth restriction ceases or decreases, and 3) the benefits of DNM are greater than the caused adverse effects. Aeration in the rooting zone can be used as a drainage criterion. Aeration is affected by several factors such as meteorological conditions, tree stand properties, hydraulic properties of peat, ditch depth, and ditch spacing. We developed a 2-dimensional DNM simulator that allows the user to adjust these factors and to evaluate their effect on the soil aeration at different distance from the drainage ditch. DNM simulator computes hydrological processes and soil aeration along a water flowpath between two ditches. Applying daily time step it calculates evapotranspiration, snow accumulation and melt, infiltration, soil water storage, ground water level, soil water content, air-filled porosity and runoff. The model performance in hydrology has been tested against independent high frequency field monitoring data. Soil aeration at different distance from the ditch is computed under steady-state assumption using an empirical oxygen consumption model, simulated air-filled porosity, and diffusion coefficient at different depths in soil. Aeration is adequate and forest growth rate is not limited by poor aeration if the

  7. Computational modeling of the quorum-sensing network in bacteria

    Science.gov (United States)

    Fenley, Andrew; Banik, Suman; Kulkarni, Rahul

    2007-03-01

    Certain species of bacteria are able to produce and sense the concentration of small molecules called autoinducers in order to coordinate gene regulation in response to population density, a process known as "quorum-sensing". The resulting regulation of gene expression involves both transcriptional and post-transcriptional regulators. In particular, the species of bacteria in the Vibrio genus use small RNAs to regulate the master protein controlling the quorum-sensing response (luminescence, biofilm formation, virulence...). We model the network of interactions using a modular approach which provides a quantitative understanding of how signal transduction occurs. The parameters of the input module are fit to current experimental results, allowing testable predictions to be made for future experiments. The results of our analysis offer a revised perspective on quorum-sensing based regulation.

  8. Multifractal analysis of mobile social networks

    Science.gov (United States)

    Zheng, Wei; Zhang, Zifeng; Deng, Yufan

    2017-09-01

    As Wireless Fidelity (Wi-Fi)-enabled handheld devices have become widely used, mobile social networks (MSNs) have been attracting extensive attention. Fractal approaches have been widely applied to characterize natural networks, as useful tools to depict their spatial distribution and scaling properties. Moreover, when the complexity of the spatial distribution of MSNs cannot be properly characterized by a single fractal dimension, multifractal analysis is required. For further research, we introduce a multifractal analysis method based on a box-covering algorithm to describe the structure of MSNs. Using this method, we find that the networks are multifractal at different time intervals. The simulation results demonstrate that the proposed method is efficient for analyzing the multifractal characteristics of MSNs, providing a distribution of singularities that adequately describes both the heterogeneity of fractal patterns and the statistics of measurements across spatial scales in MSNs.
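
    A box-covering count in the style used for fractal and multifractal analysis can be sketched as below (greedy colouring of an auxiliary graph); the Barabasi-Albert graph is only a stand-in for an MSN snapshot, and a full multifractal spectrum would additionally weight the boxes by their mass.

        import networkx as nx

        def number_of_boxes(G, l_box):
            """Greedy box covering: nodes at distance >= l_box may not share a box,
            so connect such pairs in an auxiliary graph and colour it greedily."""
            dist = dict(nx.all_pairs_shortest_path_length(G))
            aux = nx.Graph()
            aux.add_nodes_from(G)
            for u in G:
                for v in G:
                    if u < v and dist[u].get(v, float("inf")) >= l_box:
                        aux.add_edge(u, v)
            return len(set(nx.greedy_color(aux).values()))

        G = nx.barabasi_albert_graph(200, 2, seed=1)       # stand-in for an MSN snapshot
        for l_box in (2, 3, 4, 5):
            print(l_box, number_of_boxes(G, l_box))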

  9. Bandwidth Analysis of Smart Meter Network Infrastructure

    DEFF Research Database (Denmark)

    Balachandran, Kardi; Olsen, Rasmus Løvenstein; Pedersen, Jens Myrup

    2014-01-01

    Advanced Metering Infrastructure (AMI) is a network infrastructure in Smart Grid, which links the electricity customers to the utility company. This network enables smart services by making it possible for the utility company to get an overview of their customers' power consumption and also control... to utilize smart meters and which existing broadband network technologies can facilitate this smart meter service. Initially, scenarios for smart meter infrastructure are identified. The paper defines abstraction models which cover the AMI scenarios. When the scenario has been identified a general overview... of the bandwidth requirements are analysed. For this analysis the assumptions and limitations are defined. The results obtained by the analysis show that the amount of data collected and transferred by a smart meter is very low compared to the available bandwidth of most internet connections. The results show...
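
    The order of magnitude involved can be seen from a back-of-envelope estimate like the one below; the payload size, overhead and reporting interval are assumptions for illustration, not figures from the paper.

        payload_bytes = 100            # one meter reading (assumed)
        overhead_bytes = 60            # protocol overhead (assumed)
        interval_s = 15 * 60           # one reading every 15 minutes
        meters = 1000                  # meters behind one concentrator

        bits_per_meter = (payload_bytes + overhead_bytes) * 8 / interval_s
        total_bps = bits_per_meter * meters
        print(f"per meter : {bits_per_meter:.2f} bit/s")
        print(f"aggregate : {total_bps / 1000:.1f} kbit/s for {meters} meters")
        # Even the aggregate is tiny next to a ~1 Mbit/s residential uplink.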

  10. Automatic detection of emerging threats to computer networks

    CSIR Research Space (South Africa)

    McDonald, A

    2015-10-01

    The aim of intrusion detection technology is to detect threats to networked information systems and networking infrastructure in an automated fashion, thereby providing an opportunity to deploy countermeasures. This presentation showcases the research and development...

  11. Use of medical information by computer networks raises major concerns about privacy.

    OpenAIRE

    OReilly, M.

    1995-01-01

    The development of computer databases and long-distance computer networks is leading to improvements in Canada's health care system. However, these developments come at a cost and require a balancing act between access and confidentiality. Columnist Michael OReilly, who in this article explores the security of computer networks, notes that respect for patients' privacy must be given as high a priority as the ability to see their records in the first place.

  12. Calculating a checksum with inactive networking components in a computing system

    Science.gov (United States)

    Aho, Michael E; Chen, Dong; Eisley, Noel A; Gooding, Thomas M; Heidelberger, Philip; Tauferner, Andrew T

    2015-01-27

    Calculating a checksum utilizing inactive networking components in a computing system, including: identifying, by a checksum distribution manager, an inactive networking component, wherein the inactive networking component includes a checksum calculation engine for computing a checksum; sending, to the inactive networking component by the checksum distribution manager, metadata describing a block of data to be transmitted by an active networking component; calculating, by the inactive networking component, a checksum for the block of data; transmitting, to the checksum distribution manager from the inactive networking component, the checksum for the block of data; and sending, by the active networking component, a data communications message that includes the block of data and the checksum for the block of data.
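
    The delegation pattern described in this record can be illustrated with a toy sketch; the class names are hypothetical, CRC32 stands in for whatever checksum the hardware engine computes, and this is not the patented implementation.

        import zlib

        class InactiveNetworkComponent:
            """Idle component whose checksum engine is borrowed for offload."""
            def compute_checksum(self, data: bytes) -> int:
                return zlib.crc32(data)

        class ChecksumDistributionManager:
            def __init__(self, idle_component):
                self.idle_component = idle_component
            def checksum_for(self, metadata, blocks):
                # metadata describes the block to be transmitted by the active component
                return self.idle_component.compute_checksum(blocks[metadata["block_id"]])

        class ActiveNetworkComponent:
            def send(self, block: bytes, checksum: int):
                return {"payload": block, "checksum": checksum}   # stand-in for transmission

        blocks = {0: b"telemetry frame 0", 1: b"telemetry frame 1"}
        manager = ChecksumDistributionManager(InactiveNetworkComponent())
        cs = manager.checksum_for({"block_id": 1}, blocks)
        print(ActiveNetworkComponent().send(blocks[1], cs))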

  13. Computational intelligent data analysis for sustainable development computational intelligent data analysis for sustainable development

    CERN Document Server

    Yu, Ting; Simoff, Simeon

    2016-01-01

    Contents include: Computational Intelligent Data Analysis for Sustainable Development: An Introduction and Overview (Ting Yu, Nitesh Chawla, and Simeon Simoff); Integrated Sustainability Analysis; Tracing Embodied CO2 in Trade Using High-Resolution Input-Output Tables (Daniel Moran and Arne Geschke); Aggregation Effects in Carbon Footprint Accounting Using Multi-Region Input-Output Analysis (Xin Zhou, Hiroaki Shirakawa, and Manfred Lenzen); Computational Intelligent Data Analysis for Climate Change; Climate Informatics (Claire Monteleoni, Gavin A. Schmidt, Francis Alexander, Alexandru Niculescu-Mizil, Karsten Steinhaeuser, Michael Tippett, Arindam Banerjee, M. Benno Blumenthal, Auroop R. Ganguly, Jason E. Smerdon, and Marco Tedesco); Computational Data Sciences for Actionable Insights on Climate Extremes and Uncertainty (Auroop R. Ganguly, Evan Kodra, Snigdhansu Chatterjee, Arindam Banerjee, and Habib N. Najm); Computational Intelligent Data Analysis for Biodiversity and Species Conservation; Mathematical Programming Applications to Land Conservation an...

  14. Diversity Performance Analysis on Multiple HAP Networks

    Directory of Open Access Journals (Sweden)

    Feihong Dong

    2015-06-01

    Full Text Available One of the main design challenges in wireless sensor networks (WSNs is achieving a high-data-rate transmission for individual sensor devices. The high altitude platform (HAP is an important communication relay platform for WSNs and next-generation wireless networks. Multiple-input multiple-output (MIMO techniques provide the diversity and multiplexing gain, which can improve the network performance effectively. In this paper, a virtual MIMO (V-MIMO model is proposed by networking multiple HAPs with the concept of multiple assets in view (MAV. In a shadowed Rician fading channel, the diversity performance is investigated. The probability density function (PDF and cumulative distribution function (CDF of the received signal-to-noise ratio (SNR are derived. In addition, the average symbol error rate (ASER with BPSK and QPSK is given for the V-MIMO model. The system capacity is studied for both perfect channel state information (CSI and unknown CSI individually. The ergodic capacity with various SNR and Rician factors for different network configurations is also analyzed. The simulation results validate the effectiveness of the performance analysis. It is shown that the performance of the HAPs network in WSNs can be significantly improved by utilizing the MAV to achieve overlapping coverage, with the help of the V-MIMO techniques.
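
    A Monte Carlo flavour of the symbol-error analysis can be sketched as below for BPSK with coherent detection over a plain (non-shadowed) Rician channel; the Rician factor and SNR grid are assumptions, and the shadowing, QPSK and V-MIMO combining of the paper are not reproduced.

        import numpy as np

        rng = np.random.default_rng(0)
        K, n = 3.0, 200_000                   # Rician factor and symbols per point (assumed)

        for snr_db in (0, 5, 10, 15):
            snr = 10 ** (snr_db / 10)
            los = np.sqrt(K / (K + 1))        # deterministic line-of-sight part
            nlos = np.sqrt(1 / (2 * (K + 1))) * (rng.standard_normal(n)
                                                 + 1j * rng.standard_normal(n))
            h = los + nlos                    # unit average power Rician channel
            bits = rng.integers(0, 2, n)
            s = 2 * bits - 1                  # BPSK symbols
            noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2 * snr)
            detected = (np.real(np.conj(h) * (h * s + noise)) > 0).astype(int)   # perfect CSI
            print(f"SNR {snr_db:2d} dB  SER = {np.mean(detected != bits):.4f}")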

  15. Mixed Methods Analysis of Enterprise Social Networks

    DEFF Research Database (Denmark)

    Behrendt, Sebastian; Richter, Alexander; Trier, Matthias

    2014-01-01

    The increasing use of enterprise social networks (ESN) generates vast amounts of data, giving researchers and managerial decision makers unprecedented opportunities for analysis. However, more transparency about the available data dimensions and how these can be combined is needed to yield accurate...

  16. Nonlinear Time Series Analysis via Neural Networks

    Science.gov (United States)

    Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin

    This article deals with time series analysis based on neural networks in order to perform effective forex market pattern recognition [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)]. Our goal is to find and recognize important patterns which repeatedly appear in the market history, in order to adapt our trading system behaviour based on them.

  17. Combining morphological analysis and Bayesian networks for ...

    African Journals Online (AJOL)

    Morphological analysis (MA) and Bayesian networks (BN) are two closely related modelling methods, each of which has its advantages and disadvantages for strategic decision support modelling. MA is a method for defining, linking and evaluating problem spaces. BNs are graphical models which consist of a qualitative ...

  18. Risk analysis of computer system designs

    Science.gov (United States)

    Vallone, A.

    1981-01-01

    Adverse events during implementation can affect the final capabilities, schedule and cost of a computer system even though the system was accurately designed and evaluated. Risk analysis enables the manager to forecast the impact of those events and to request design revisions or contingency plans in a timely manner before making any decision. This paper presents a structured procedure for an effective risk analysis. The procedure identifies the required activities, separates subjective assessments from objective evaluations, and defines a risk measure to determine the analysis results. The procedure is consistent with the system design evaluation and enables a meaningful comparison among alternative designs.

  19. Models of network reliability analysis, combinatorics, and Monte Carlo

    CERN Document Server

    Gertsbakh, Ilya B

    2009-01-01

    Unique in its approach, Models of Network Reliability: Analysis, Combinatorics, and Monte Carlo provides a brief introduction to Monte Carlo methods along with a concise exposition of reliability theory ideas. From there, the text investigates a collection of principal network reliability models, such as terminal connectivity for networks with unreliable edges and/or nodes, network lifetime distribution in the process of its destruction, network stationary behavior for renewable components, importance measures of network elements, reliability gradient, and network optimal reliability synthesis

  20. A statistical framework for differential network analysis from microarray data

    Directory of Open Access Journals (Sweden)

    Datta Somnath

    2010-02-01

    Background: It has long been well known that genes do not act alone; rather, groups of genes act in concert during a biological process. Consequently, the expression levels of genes are dependent on each other. Experimental techniques to detect such interacting pairs of genes have been in place for quite some time. With the advent of microarray technology, newer computational techniques to detect such interaction or association between gene expressions are being proposed, which lead to an association network. While most microarray analyses look for genes that are differentially expressed, it is of potentially greater significance to identify how entire association network structures change between two or more biological settings, say normal versus diseased cell types. Results: We provide a recipe for conducting a differential analysis of networks constructed from microarray data under two experimental settings. At the core of our approach lies a connectivity score that represents the strength of genetic association or interaction between two genes. We use this score to propose formal statistical tests for each of the following queries: (i) whether the overall modular structures of the two networks are different, (ii) whether the connectivity of a particular set of "interesting genes" has changed between the two networks, and (iii) whether the connectivity of a given single gene has changed between the two networks. A number of examples of this score are provided. We applied our method to two types of simulated data: Gaussian networks and networks based on differential equations. We show that, for appropriate choices of the connectivity scores and tuning parameters, our method works well on simulated data. We also analyze a real data set involving normal versus heavy mice and identify an interesting set of genes that may play key roles in obesity. Conclusions: Examining changes in network structure can provide valuable information about the
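
    One concrete way to realize the third query (has a single gene's connectivity changed?) is sketched below, using Pearson correlation as the connectivity score and a permutation test; this is an illustration in the spirit of the paper rather than its exact score, tuning parameters or test statistics, and the data are synthetic.

        import numpy as np

        rng = np.random.default_rng(0)

        def connectivity(expr, gene):
            """Connectivity of one gene: correlations with all other genes
            (expr has samples in rows, genes in columns)."""
            corr = np.corrcoef(expr, rowvar=False)
            return np.delete(corr[gene], gene)

        def gene_connectivity_change(expr_a, expr_b, gene, n_perm=2000):
            """Permutation p-value for a change in one gene's connectivity."""
            observed = np.abs(connectivity(expr_a, gene) - connectivity(expr_b, gene)).mean()
            pooled, n_a = np.vstack([expr_a, expr_b]), expr_a.shape[0]
            count = 0
            for _ in range(n_perm):
                idx = rng.permutation(pooled.shape[0])
                stat = np.abs(connectivity(pooled[idx[:n_a]], gene)
                              - connectivity(pooled[idx[n_a:]], gene)).mean()
                count += stat >= observed
            return (count + 1) / (n_perm + 1)

        # Synthetic example: gene 0 is correlated with gene 1 only in condition A.
        n, genes = 40, 5
        a = rng.standard_normal((n, genes))
        a[:, 1] = a[:, 0] + 0.3 * rng.standard_normal(n)
        b = rng.standard_normal((n, genes))
        print("p-value for gene 0:", gene_connectivity_change(a, b, gene=0))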