WorldWideScience

Sample records for networked computer tools

  1. Enhancing the Understanding of Computer Networking Courses through Software Tools

    OpenAIRE

    Dafalla, Z. I.; Balaji, R. D.

    2015-01-01

    Computer networking is an important specialization in Information and Communication Technologies. However, imparting the right knowledge to students can be a challenging task, because there is not enough time to deliver lengthy labs during normal lecture hours. Augmenting the use of physical machines with software tools helps students to learn beyond the limited lab sessions within the environment of higher institutions of learning throughout the world. The institutions focus mo...

  2. Computer-mediated-communication and social networking tools at work

    NARCIS (Netherlands)

    Ou, C.X.J.; Sia, C.L.; Hui, C.K.

    2013-01-01

    Purpose – Advances in information technology (IT) have resulted in the development of various computer‐mediated communication (CMC) and social networking tools. However, quantifying the benefits of utilizing these tools in the organizational context remains a challenge. In this study, the authors

  3. A Study on Parallel Computation Tools on Networked PCs

    Directory of Open Access Journals (Sweden)

    Heru Suhartanto

    2010-10-01

    Many models of natural phenomena, engineering applications and industrial processes need powerful computing resources to solve their problems. High-performance computing resources have been introduced by many researchers, in the form of supercomputers together with operating systems and development tools such as parallel compilers and their libraries. However, these resources are expensive to acquire and maintain, so alternatives are needed. Many people have therefore turned to parallel distributed computing using available computing resources such as PCs. Each PC is treated as a processor, so that a cluster of PCs behaves as a multiprocessor computer. Many tools have been developed for this purpose. This paper studies the performance of currently popular tools such as Parallel Virtual Machine (PVM), Message Passing Interface (MPI), Java Remote Method Invocation (RMI) and Java Common Object Request Broker Architecture (CORBA). Experiments conducted on a cluster of PCs show significant speed-up. Each of these tools is identified as suitable for certain implementation and programming purposes.
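    As a concrete taste of the message-passing style this record benchmarks, here is a minimal sketch using mpi4py (the Python MPI binding; assumed available, and not itself one of the tools the paper compares): each process in the cluster sums its own slice of a numerical integral, and a reduction combines the partial results.

```python
# Minimal message-passing sketch with mpi4py; run e.g.:
#   mpiexec -n 4 python pi_mpi.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()   # this process's id, 0..size-1
size = comm.Get_size()   # number of cooperating processes

N = 1_000_000            # integration steps (illustrative)
h = 1.0 / N
t0 = MPI.Wtime()
# Each process sums a strided slice of 4/(1+x^2) over [0,1] -> pi.
local = sum(4.0 / (1.0 + ((i + 0.5) * h) ** 2) for i in range(rank, N, size)) * h
pi = comm.reduce(local, op=MPI.SUM, root=0)  # combine partial sums on rank 0
if rank == 0:
    print(f"pi ~= {pi:.6f}, {MPI.Wtime() - t0:.3f}s on {size} processes")
```

    Timing the same run while varying `-n` gives exactly the kind of speed-up measurement the study reports.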

  4. Computational tools for large-scale biological network analysis

    OpenAIRE

    Pinto, José Pedro Basto Gouveia Pereira

    2012-01-01

    Doctoral thesis in Informatics. The surge of the field of Bioinformatics, among other contributions, provided biological researchers with powerful computational methods for processing and analysing the large amounts of data coming from recent biological experimental techniques such as genome sequencing and other omics. Naturally, this led to the opening of new avenues of biological research, among which is the analysis of large-scale biological networks. The an...

  5. Benchmarking selected computational gene network growing tools in context of virus-host interactions.

    Science.gov (United States)

    Taye, Biruhalem; Vaz, Candida; Tanavde, Vivek; Kuznetsov, Vladimir A; Eisenhaber, Frank; Sugrue, Richard J; Maurer-Stroh, Sebastian

    2017-07-19

    Several available online tools provide network growing functions, where an algorithm utilizing different data sources suggests additional genes/proteins that connect an input gene set into functionally meaningful networks. Using the well-studied system of influenza host interactions, we compare the network growing functions of two free tools, GeneMANIA and STRING, and the commercial IPA for their performance in recovering known influenza A virus host factors previously identified from siRNA screens. The results showed that, given small (~30 genes) or medium (~150 genes) input sets, all three network growing tools detect significantly more known host factors than random human genes, with STRING overall performing strongest. Extending the networks with all three tools significantly improved the detection of GO biological processes of known host factors compared to not growing the networks. Interestingly, the rate of identification of true host factors using computational network growing is equal to or better than that of another experimental siRNA screening study, which may also hold for other biological pathways/processes.
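    To make the "network growing" notion concrete, the following toy sketch ranks candidate genes by how many seed genes they touch in an interactome and adds the top scorers. This is not the GeneMANIA/STRING/IPA algorithm (their scoring draws on many weighted data sources); the interactome and gene names below are hypothetical.

```python
import networkx as nx

def grow_network(ppi: nx.Graph, seeds: set, k: int = 10) -> set:
    """Add the k non-seed genes with the most interactions to the seed set:
    a toy analogue of the network growing the record describes."""
    scores = {}
    for s in seeds:
        if s not in ppi:
            continue
        for nb in ppi.neighbors(s):
            if nb not in seeds:
                scores[nb] = scores.get(nb, 0) + 1
    top = sorted(scores, key=scores.get, reverse=True)[:k]
    return seeds | set(top)

# Hypothetical toy interactome and seed set:
g = nx.Graph([("NP", "IPO5"), ("NP", "XPO1"), ("PB2", "IPO5"), ("M1", "XPO1")])
print(grow_network(g, {"NP", "PB2"}, k=2))
```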

  6. Network Computing Infrastructure to Share Tools and Data in Global Nuclear Energy Partnership

    Science.gov (United States)

    Kim, Guehee; Suzuki, Yoshio; Teshima, Naoya

    CCSE/JAEA (Center for Computational Science and e-Systems/Japan Atomic Energy Agency) integrated a prototype system of a network computing infrastructure for sharing tools and data to support the U.S.-Japan collaboration in GNEP (Global Nuclear Energy Partnership). We focused on three technical issues in applying our information process infrastructure: accessibility, security, and usability. In designing the prototype system, we integrated and improved both network and Web technologies. For the accessibility issue, we adopted SSL-VPN (Secure Sockets Layer Virtual Private Network) technology for access beyond firewalls. For the security issue, we developed an authentication gateway based on the PKI (Public Key Infrastructure) authentication mechanism to strengthen security. We also set a fine-grained access control policy for shared tools and data and used a shared-key encryption method to protect tools and data against leakage to third parties. For the usability issue, we chose Web browsers as the user interface and developed a Web application to provide functions for sharing tools and data. By using the WebDAV (Web-based Distributed Authoring and Versioning) function, users can manipulate shared tools and data through a Windows-like folder environment. We implemented the prototype system on AEGIS (Atomic Energy Grid Infrastructure), a Grid infrastructure for atomic energy research developed by CCSE/JAEA. The prototype system was applied for trial use in the first period of GNEP.

  7. High-performance computing and networking as tools for accurate emission computed tomography reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Passeri, A. [Dipartimento di Fisiopatologia Clinica - Sezione di Medicina Nucleare, Universita` di Firenze (Italy); Formiconi, A.R. [Dipartimento di Fisiopatologia Clinica - Sezione di Medicina Nucleare, Universita` di Firenze (Italy); De Cristofaro, M.T.E.R. [Dipartimento di Fisiopatologia Clinica - Sezione di Medicina Nucleare, Universita` di Firenze (Italy); Pupi, A. [Dipartimento di Fisiopatologia Clinica - Sezione di Medicina Nucleare, Universita` di Firenze (Italy); Meldolesi, U. [Dipartimento di Fisiopatologia Clinica - Sezione di Medicina Nucleare, Universita` di Firenze (Italy)

    1997-04-01

    It is well known that the quantitative potential of emission computed tomography (ECT) relies on the ability to compensate for resolution, attenuation and scatter effects. Reconstruction algorithms which are able to take these effects into account are highly demanding in terms of computing resources. The reported work aimed to investigate the use of a parallel high-performance computing platform for ECT reconstruction taking into account an accurate model of the acquisition of single-photon emission tomographic (SPET) data. An iterative algorithm with an accurate model of the variable system response was ported to the MIMD (Multiple Instruction Multiple Data) parallel architecture of a 64-node Cray T3D massively parallel computer. The system was organized to make it easily accessible even from low-cost PC-based workstations through standard TCP/IP networking. A complete brain study of 30 (64 x 64) slices could be reconstructed from a set of 90 (64 x 64) projections with ten iterations of the conjugate gradients algorithm in 9 s, corresponding to an actual speed-up factor of 135. This work demonstrated the possibility of exploiting remote high-performance computing and networking resources from hospital sites by means of low-cost workstations using standard communication protocols, without particular problems for routine use. The achievable speed-up factors allow the assessment of the clinical benefit of advanced reconstruction techniques which require a heavy computational burden for compensating effects such as variable spatial resolution, scatter and attenuation. The possibility of using the same software on the same hardware platform with data acquired in different laboratories with various kinds of SPET instrumentation is appealing for software quality control and for the evaluation of the clinical impact of the reconstruction methods. (orig.). With 4 figs., 1 tab.
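    The iteration named in the record is the conjugate gradients algorithm; a generic dense-matrix version is sketched below with NumPy as a reminder of what each of those ten iterations does. The authors' actual implementation models the variable SPET system response and runs on a parallel Cray T3D, which this sketch does not attempt.

```python
import numpy as np

def conjugate_gradients(A, b, iters=10):
    """Plain CG for a symmetric positive-definite A: schematic of the
    reconstruction iteration, not the authors' SPET system model."""
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(iters):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if rs_new < 1e-12:  # converged
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Tiny SPD test problem standing in for the projection system:
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradients(A, b))
```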

  8. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    Science.gov (United States)

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks, the main problem that arises is the high price and limited number of network devices that students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  9. Computer networks

    Directory of Open Access Journals (Sweden)

    N. U. Ahmed

    2002-01-01

    In this paper, we construct a new dynamic model for the Token Bucket (TB) algorithm used in computer networks and use a systems approach for its analysis. This model is then augmented by adding a dynamic model for a multiplexor at an access node where the TB exercises a policing function. In the model, traffic policing, multiplexing and network utilization are formally defined. Based on the model, we study such issues as quality of service (QoS), traffic sizing and network dimensioning. We also propose an algorithm using feedback control to improve QoS and network utilization. Applying MPEG video traces as the input traffic to the model, we verify the usefulness and effectiveness of our model.
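    For readers unfamiliar with the mechanism being modeled, a minimal token-bucket policer is sketched below. The rate and burst parameters are illustrative; the paper's contribution is a dynamic systems model of this mechanism, not code like this.

```python
import time

class TokenBucket:
    """Minimal token-bucket policer: tokens refill at a fixed rate up to a
    burst capacity; a packet conforms if enough tokens are available."""
    def __init__(self, rate: float, capacity: float):
        self.rate = rate            # tokens (bits) added per second
        self.capacity = capacity    # bucket depth (burst allowance)
        self.tokens = capacity
        self.last = time.monotonic()

    def conforms(self, packet_bits: float) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if packet_bits <= self.tokens:
            self.tokens -= packet_bits
            return True   # conforming: forward the packet
        return False      # non-conforming: drop or mark

tb = TokenBucket(rate=1_000_000, capacity=64_000)  # 1 Mb/s, 64 kb burst
print(tb.conforms(1500 * 8))  # one 1500-byte frame
```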

  10. Experimental and computational tools useful for (re)construction of dynamic kinase-substrate networks

    DEFF Research Database (Denmark)

    Tan, Chris Soon Heng; Linding, Rune

    2009-01-01

    The explosion of site- and context-specific in vivo phosphorylation events presents a potentially rich source of biological knowledge and calls for novel data analysis and modeling paradigms. Perhaps the most immediate challenge is delineating detected phosphorylation sites to their effector kinases. This is important for (re)constructing transient kinase-substrate interaction networks that are essential for mechanistic understanding of cellular behaviors and therapeutic intervention, but has largely eluded high-throughput protein-interaction studies due to their transient nature and strong dependencies on cellular context. Here, we surveyed some of the computational approaches developed to dissect phosphorylation data detected in systematic proteomic experiments and reviewed some experimental and computational approaches used to map phosphorylation sites to their effector kinases in efforts...

  11. Computer networks monitoring

    OpenAIRE

    Antončič , Polona

    2012-01-01

    The present thesis, entitled Computer Networks Monitoring, introduces the basics of computer networks, the aims of and methods for collecting data from networking devices, and software for system monitoring, together with the case of monitoring a real network with tens of network devices. Networks represent an important part of modern information technology and serve for the exchange of data and resources, which makes their reliable operation crucially important. Correct and efficient sys...

  12. Introduction to computer networking

    CERN Document Server

    Robertazzi, Thomas G

    2017-01-01

    This book gives a broad look at both fundamental networking technology and new areas that support and use it. It is a concise introduction to the most prominent, recent technological topics in computer networking. Topics include network technology, such as wired and wireless networks; enabling technologies, such as data centers, software defined networking, cloud and grid computing; and applications such as networks on chips, space networking and network security. The accessible writing style and non-mathematical treatment make this a useful book for the student, network and communications engineer, computer scientist and IT professional. • Features a concise, accessible treatment of computer networking, focusing on new technological topics; • Provides a non-mathematical introduction to networks in their most common forms today; • Includes new developments in switching, optical networks, WiFi, Bluetooth, LTE, 5G, and quantum cryptography.

  13. Tools for computational finance

    CERN Document Server

    Seydel, Rüdiger U

    2017-01-01

    Computational and numerical methods are used in a number of ways across the field of finance. It is the aim of this book to explain how such methods work in financial engineering. By concentrating on the field of option pricing, a core task of financial engineering and risk analysis, this book explores a wide range of computational tools in a coherent and focused manner and will be of use to anyone working in computational finance. Starting with an introductory chapter that presents the financial and stochastic background, the book goes on to detail computational methods using both stochastic and deterministic approaches. Now in its sixth edition, Tools for Computational Finance has been significantly revised and contains: several new parts, such as a section on extended applications of tree methods, including multidimensional trees, trinomial trees, and the handling of dividends; additional material in the field of generating normal variates with acceptance-rejection methods, and on Monte Carlo methods...

  14. Experimental and computational tools for analysis of signaling networks in primary cells

    DEFF Research Database (Denmark)

    Schoof, Erwin M; Linding, Rune

    2014-01-01

    … or differentiation. Protein phosphorylation events play a major role in this process and are often involved in fundamental biological and cellular processes such as protein-protein interactions, enzyme activity, and immune responses. Determining which kinases phosphorylate specific phospho-sites poses a challenge; this information is critical when trying to elucidate key proteins involved in specific cellular responses. Here, methods to generate high-quality quantitative phosphorylation data from cell lysates originating from primary cells, and how to analyze the generated data to construct quantitative signaling network...

  15. A computer network with SCADA and case tools for on-line process control in greenhouses.

    Science.gov (United States)

    Gieling ThH; van Meurs WTh; Janssen, H J

    1996-01-01

    Climate control computers in greenhouses are used to control heating and ventilation, supply water, and dilute and dispense nutrients. They integrate models into optimally controlled systems. This paper describes how information technology, as in use in other sectors of industry, is applied to greenhouse control. The introduction of modern software and hardware concepts in horticulture adds power and extra opportunities to climate control in greenhouses.

  16. Basics of Computer Networking

    CERN Document Server

    Robertazzi, Thomas

    2012-01-01

    The Springer Brief Basics of Computer Networking provides a non-mathematical introduction to the world of networks, covering technology for both wired and wireless networks. Coverage includes transmission media, local area networks, wide area networks, and network security. It is written in a very accessible style for the interested layman by the author of a widely used textbook, drawing on many years of experience explaining concepts to the beginner.

  17. Groupware: A Tool for Interpersonal Computing.

    Science.gov (United States)

    Knupfer, Nancy Nelson; McLellan, Hilary

    Computer networks have provided a foundation for interpersonal computing, and new tools are emerging, the centerpiece of which is called "groupware." Groupware technology is reviewed, and the theoretical framework that will underlie interpersonal collaborative computing is discussed. Groupware can consist of hardware, software, services,…

  18. Computer network defense system

    Science.gov (United States)

    Urias, Vincent; Stout, William M. S.; Loverro, Caleb

    2017-08-22

    A method and apparatus for protecting virtual machines. A computer system creates a copy of a group of the virtual machines in an operating network in a deception network to form a group of cloned virtual machines in the deception network when the group of the virtual machines is accessed by an adversary. The computer system creates an emulation of components from the operating network in the deception network. The components are accessible by the group of the cloned virtual machines as if the group of the cloned virtual machines was in the operating network. The computer system moves network connections for the group of the virtual machines in the operating network used by the adversary from the group of the virtual machines in the operating network to the group of the cloned virtual machines, enabling protecting the group of the virtual machines from actions performed by the adversary.

  19. Trace Replay and Network Simulation Tool

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-22

    TraceR is a trace replay tool built upon the ROSS-based CODES simulation framework. TraceR can be used for predicting network performance and understanding network behavior by simulating messaging in High Performance Computing applications on interconnection networks.

  20. Computer-communication networks

    CERN Document Server

    Meditch, James S

    1983-01-01

    Computer-Communication Networks presents a collection of articles focusing on modeling, analysis, design, and performance optimization. It discusses the problem of modeling the performance of local area networks under file transfer and addresses the design of multi-hop, mobile-user radio networks. Topics covered in the book include distributed packet-switching queuing network design, investigations of communication switching techniques in computer networks, and minimum-hop flow assignment and routing subject to an average message delay constraint.

  1. Hyperswitch Communication Network Computer

    Science.gov (United States)

    Peterson, John C.; Chow, Edward T.; Priel, Moshe; Upchurch, Edwin T.

    1993-01-01

    Hyperswitch Communications Network (HCN) computer is prototype multiple-processor computer being developed. Incorporates improved version of hyperswitch communication network described in "Hyperswitch Network For Hypercube Computer" (NPO-16905). Designed to support high-level software and expansion of itself. HCN computer is message-passing, multiple-instruction/multiple-data computer offering significant advantages over older single-processor and bus-based multiple-processor computers with respect to price/performance ratio, reliability, availability, and manufacturing. Design of HCN operating-system software provides flexible computing environment accommodating both parallel and distributed processing. Also achieves balance among the following competing factors: performance in processing and communications, ease of use, and tolerance of (and recovery from) faults.

  2. Formal Methods for Information Protection Technology. Task 1: Formal Grammar-Based Approach and Tool for Simulation Attacks against Computer Network. Part 1

    National Research Council Canada - National Science Library

    Karsayev, O

    2004-01-01

    ... Integrity, confidentiality and availability of the network resources must be assured. To detect and suppress different types of unauthorized computer intrusions, modern network security systems (NSS...

  3. A computer tool to support in design of industrial Ethernet.

    Science.gov (United States)

    Lugli, Alexandre Baratella; Santos, Max Mauro Dias; Franco, Lucia Regina Horta Rodrigues

    2009-04-01

    This paper presents a computer tool to support the design and development of an industrial Ethernet network, verifying the physical layer (cable resistance and capacitance, scan time, network power supply via the Power over Ethernet (PoE) concept, and wireless links) and the occupation rate (the amount of information transmitted on the network versus the controller network scan time). These functions are accomplished without a single physical element installed in the network, using only simulation. The tool's software presents a detailed view of the network to the user, points out possible problems in the network, and offers an extremely friendly environment.
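    The occupation-rate check described above is, in essence, a ratio of transmission time to scan time. The paper's exact formula is not given in this record, so the formulation below is an assumption, with illustrative traffic figures.

```python
def occupation_rate(bits_per_cycle: int, bandwidth_bps: float,
                    scan_time_s: float) -> float:
    """Assumed formulation: fraction of the controller scan interval
    consumed by transmitting the cycle's traffic."""
    transmission_time_s = bits_per_cycle / bandwidth_bps
    return transmission_time_s / scan_time_s

# 20 devices x 100-byte frames per cycle, 100 Mb/s Ethernet, 10 ms scan time:
print(f"{occupation_rate(20 * 100 * 8, 100e6, 0.010):.1%}")  # -> 1.6%
```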

  4. Computer Networks and Globalization

    Directory of Open Access Journals (Sweden)

    J. Magliaro

    2007-07-01

    Communication and information computer networks connect the world in ways that make globalization more natural and inequity more subtle. As educators, we look at these phenomena holistically, analyzing them from the realist's view, thus exploring tensions, (in)equity and (in)justice, and from the idealist's view, thus embracing connectivity, convergence and the development of a collective consciousness. In an increasingly market-driven world we find examples of openness and human generosity that are based on networks, specifically the Internet. After addressing open movements in publishing, the software industry and education, we describe the possibility of a dialectic equilibrium between globalization and indigenousness in view of ecologically designed future smart networks.

  5. Computational tools for the modern andrologist.

    Science.gov (United States)

    Niederberger, C

    1996-01-01

    With such a wide array of computational tools to solve inference problems, andrologists and their mathematical or statistical collaborators face perhaps bewildering choices. It is tempting to criticize a method with which one is unfamiliar for its apparent complexity. Yet, many methods are quite elegant; neural computation uses nature's own best biological classifier, for example, and genetic algorithms apply rules of natural selection. Computer scientists will likely find no one single best inference engine to solve all classification problems. Rather, the modeler should choose the most appropriate computational tool based on the specific nature of a problem. If the problem can be separated into obvious components, a Markov chain may be useful. If the andrologist would like to encode a well-known clinical algorithm into the computer, the programmer may use an expert system. Once a modeler builds an inference engine, that engine is not truly useful until other andrologists use it to make inferences with their own data. Because a wide variety of computer hardware and software exists, it is a significant endeavor to translate, or "port," software designed and built on one machine to many other different computers. Fortunately, the World Wide Web offers a means by which computational tools may be made directly available to multiple users on many different systems, or "platforms." The World Wide Web refers to a standardization of information traffic on the global computer network, the Internet. The Internet is simply the linkage of many computers worldwide by computer operators who have chosen to allow other users access to their systems. Because many different types of computers exist, until recently only communication in very rudimentary form, such as text, or between select compatible machines, was available. Within the last half-decade, computer scientists and operators began to use standard means of communication between computers. Interpreters of these standard

  6. Computing networks from cluster to cloud computing

    CERN Document Server

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  7. GraphCrunch: A tool for large network analyses

    Directory of Open Access Journals (Sweden)

    Pržulj Nataša

    2008-01-01

    Background: The recent explosion in biological and other real-world network data has created the need for improved tools for large network analyses. In addition to well established global network properties, several new mathematical techniques for analyzing local structural properties of large networks have been developed. Small over-represented subgraphs, called network motifs, have been introduced to identify simple building blocks of complex networks. Small induced subgraphs, called graphlets, have been used to develop "network signatures" that summarize network topologies. Based on these network signatures, two new highly sensitive measures of network local structural similarities were designed: the relative graphlet frequency distance (RGF-distance) and the graphlet degree distribution agreement (GDD-agreement). Finding adequate null-models for biological networks is important in many research domains. Network properties are used to assess the fit of network models to the data. Various network models have been proposed. To date, there does not exist a software tool that measures the above mentioned local network properties. Moreover, none of the existing tools compare real-world networks against a series of network models with respect to these local as well as a multitude of global network properties. Results: Thus, we introduce GraphCrunch, a software tool that finds well-fitting network models by comparing large real-world networks against random graph models according to various network structural similarity measures. It has unique capabilities of finding computationally expensive RGF-distance and GDD-agreement measures. In addition, it computes several standard global network measures and thus supports the largest variety of network measures thus far. Also, it is the first software tool that compares real-world networks against a series of network models and that has built-in parallel computing capabilities allowing for a user

  8. GraphCrunch: a tool for large network analyses.

    Science.gov (United States)

    Milenković, Tijana; Lai, Jason; Przulj, Natasa

    2008-01-30

    The recent explosion in biological and other real-world network data has created the need for improved tools for large network analyses. In addition to well established global network properties, several new mathematical techniques for analyzing local structural properties of large networks have been developed. Small over-represented subgraphs, called network motifs, have been introduced to identify simple building blocks of complex networks. Small induced subgraphs, called graphlets, have been used to develop "network signatures" that summarize network topologies. Based on these network signatures, two new highly sensitive measures of network local structural similarities were designed: the relative graphlet frequency distance (RGF-distance) and the graphlet degree distribution agreement (GDD-agreement). Finding adequate null-models for biological networks is important in many research domains. Network properties are used to assess the fit of network models to the data. Various network models have been proposed. To date, there does not exist a software tool that measures the above mentioned local network properties. Moreover, none of the existing tools compare real-world networks against a series of network models with respect to these local as well as a multitude of global network properties. Thus, we introduce GraphCrunch, a software tool that finds well-fitting network models by comparing large real-world networks against random graph models according to various network structural similarity measures. It has unique capabilities of finding computationally expensive RGF-distance and GDD-agreement measures. In addition, it computes several standard global network measures and thus supports the largest variety of network measures thus far. Also, it is the first software tool that compares real-world networks against a series of network models and that has built-in parallel computing capabilities allowing for a user specified list of machines on which to perform
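    A toy analogue of what GraphCrunch automates at scale: compare one global statistic of a real network against an ensemble of random-graph null models. GraphCrunch itself uses many measures, including the far more expensive RGF-distance and GDD-agreement; the single clustering-coefficient comparison below is only meant to convey the model-fitting idea.

```python
import networkx as nx

def model_fit(real: nx.Graph, trials: int = 20) -> float:
    """Distance between a real network's average clustering coefficient and
    that of size-matched Erdos-Renyi (G(n,m)) null models: smaller means the
    null model fits this one statistic better."""
    n, m = real.number_of_nodes(), real.number_of_edges()
    c_real = nx.average_clustering(real)
    c_rand = sum(nx.average_clustering(nx.gnm_random_graph(n, m))
                 for _ in range(trials)) / trials
    return abs(c_real - c_rand)

# Zachary's karate club as a stand-in "real-world" network:
print(model_fit(nx.karate_club_graph()))
```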

  9. Design Tools for Large Networks

    CERN Document Server

    Panikashvili, E; 15th IEEE Real Time Conference 2007

    2007-01-01

    The ATLAS TDAQ network consists of four separate networks which together total over 4000 ports at 1 Gb/s, 40 ports at 10 Gb/s, 200 Ethernet switches at the edge of the network, 6 multi-blade chassis switches at the core and over 20 km of copper and fiber cabling. The NetDesign suite was developed to provide a comprehensive set of design tools that permits the generation of a navigable circuit design, automates a number of routine tasks and provides the user with a powerful and flexible reporting system. NetDesign is an add-on extension to the basic functionality of the Microsoft Visio CAD tool. It features: automatic labeling of cables, media/port connection validation, navigation along a cable, schematic hierarchical, cross-page and vertical navigation, overall verification of the network diagram and hyperlinks to other equipment databases. The internal network structure can be exported to a 3rd-party database from which a user-friendly meta-language is used to process a large variety of reports on the network des...

  10. Computer-aided translation tools

    DEFF Research Database (Denmark)

    Christensen, Tina Paulsen; Schjoldager, Anne

    2016-01-01

    The paper reports on a questionnaire survey from 2013 of the uptake and use of computer-aided translation (CAT) tools by Danish translation service providers (TSPs) and discusses how these tools appear to have impacted on the Danish translation industry. According to our results, the uptake in Denmark is rather high in general, but limited in the case of machine translation (MT) tools: While most TSPs use translation-memory (TM) software, often in combination with a terminology management system (TMS), only very few have implemented MT, which is criticised for its low quality output, especially when Danish is one of the languages, though some also express willingness to consider using MT (more) when output quality improves. Most respondents report that CAT has changed the translation industry, mentioning that the technology facilitates improved productivity and consistency, but also...

  11. Foundational Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton [Univ. of Wisconsin, Madison, WI (United States)

    2014-05-19

    The Paradyn project has a history of developing algorithms, techniques, and software that push the cutting edge of tool technology for high-end computing systems. Under this funding, we are working on a three-year agenda to make substantial new advances in support of new and emerging Petascale systems. The overall goal for this work is to address the steady increase in complexity of these petascale systems. Our work covers two key areas: (1) The analysis, instrumentation and control of binary programs. Work in this area falls under the general framework of the Dyninst API tool kits. (2) Infrastructure for building tools and applications at extreme scale. Work in this area falls under the general framework of the MRNet scalability framework. Note that work done under this funding is closely related to work done under a contemporaneous grant, “High-Performance Energy Applications and Systems”, SC0004061/FG02-10ER25972, UW PRJ36WV.

  12. Computing and networking at JINR

    CERN Document Server

    Zaikin, N S; Strizh, T A

    2001-01-01

    This paper describes the computing and networking facilities at the Joint Institute for Nuclear Research. The Joint Institute for Nuclear Research (JINR) is an international intergovernmental organization located in Dubna, a small town on the bank of the Volga river 120 km north of Moscow. At present JINR has 18 Member States. The Institute consists of 7 scientific Laboratories and several subdivisions. JINR has scientific cooperation with such scientific centres as CERN, FNAL and DESY, and is equipped with powerful and fast computational resources integrated into worldwide computer networks. The Laboratory of Information Technologies (LIT) is responsible for computing and networking at JINR. (5 refs).

  13. Administration of remote computer networks

    OpenAIRE

    Fjeldbo, Stig Jarle

    2005-01-01

    Master's thesis in network and system administration. Today's computer networks have gone from typically being a small local area network to wide area networks, where users and servers are interconnected with each other from all over the world. This development has gradually expanded as bandwidth has become higher and cheaper. But when dealing with network traffic, bandwidth is only one of the important properties. Delay, jitter and reliability are also important properties for t...

  14. Understanding and designing computer networks

    CERN Document Server

    King, Graham

    1995-01-01

    Understanding and Designing Computer Networks considers the ubiquitous nature of data networks, with particular reference to internetworking and the efficient management of all aspects of networked integrated data systems. In addition, it looks at the next phase of networking developments; efficiency and security are covered in the sections dealing with data compression and data encryption; and future examples of network operations, such as network parallelism, are introduced. A comprehensive case study is used throughout the text to apply and illustrate new techniques and concepts as th

  15. Computer Networks A Systems Approach

    CERN Document Server

    Peterson, Larry L

    2011-01-01

    This best-selling and classic book teaches you the key principles of computer networks with examples drawn from the real world of network and protocol design. Using the Internet as the primary example, the authors explain various protocols and networking technologies. Their systems-oriented approach encourages you to think about how individual network components fit into a larger, complex system of interactions. Whatever your perspective, whether it be that of an application developer, network administrator, or a designer of network equipment or protocols, you will come away with a "big pictur

  16. Developing security tools of WSN and WBAN networks applications

    CERN Document Server

    A M El-Bendary, Mohsen

    2015-01-01

    This book focuses on two of the most rapidly developing areas in wireless technology (WT) applications, namely, wireless sensor networks (WSNs) and wireless body area networks (WBANs). These networks can be considered smart applications of the recent WT revolutions. The book presents various security tools and scenarios for the proposed enhanced security of WSNs, which are supplemented with numerous computer simulations. In the computer simulation section, WSN modeling is addressed using the MATLAB programming language.

  17. Method and tool for network vulnerability analysis

    Science.gov (United States)

    Swiler, Laura Painton [Albuquerque, NM; Phillips, Cynthia A [Albuquerque, NM

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
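    The attack-graph idea lends itself to a small illustration: with states as nodes and attacker effort as edge weights, high-risk paths fall out of a cheapest-path search. The graph below is entirely hypothetical, and a plain Dijkstra shortest path stands in for the patent's "epsilon optimal paths" (which admit near-minimal paths as well).

```python
import networkx as nx

# Hypothetical attack graph: nodes are attacker states, edge weights model
# "attacker effort" in the spirit of the record; names are illustrative.
g = nx.DiGraph()
g.add_weighted_edges_from([
    ("outside", "dmz_web", 2.0),      # exploit public web server
    ("outside", "vpn", 5.0),          # phish VPN credentials
    ("dmz_web", "db_server", 3.0),    # pivot via SQL injection
    ("vpn", "db_server", 1.0),
    ("db_server", "domain_admin", 4.0),
])

# Lowest-effort path: the highest-risk route to prioritize countermeasures on.
path = nx.shortest_path(g, "outside", "domain_admin", weight="weight")
cost = nx.shortest_path_length(g, "outside", "domain_admin", weight="weight")
print(path, cost)
```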

  18. Risks in Networked Computer Systems

    OpenAIRE

    Klingsheim, André N.

    2008-01-01

    Networked computer systems yield great value to businesses and governments, but also create risks. The eight papers in this thesis highlight vulnerabilities in computer systems that lead to security and privacy risks. A broad range of systems is discussed in this thesis: Norwegian online banking systems, the Norwegian Automated Teller Machine (ATM) system during the 90's, mobile phones, web applications, and wireless networks. One paper also comments on legal risks to bank cust...

  19. Wireless Computational Networking Architectures

    Science.gov (United States)

    2013-12-01


  20. Computing with Spiking Neuron Networks

    NARCIS (Netherlands)

    H. Paugam-Moisy; S.M. Bohte (Sander); G. Rozenberg; T.H.W. Baeck (Thomas); J.N. Kok (Joost)

    2012-01-01

    Spiking Neuron Networks (SNNs) are often referred to as the 3rd generation of neural networks. Highly inspired by natural computing in the brain and recent advances in neuroscience, they derive their strength and interest from an accurate modeling of synaptic interactions

  1. A fast tool for minimum hybridization networks.

    Science.gov (United States)

    Chen, Zhi-Zhong; Wang, Lusheng; Yamanaka, Satoshi

    2012-07-02

    Due to hybridization events in evolution, studying two different genes of a set of species may yield two related but different phylogenetic trees for the set of species. In this case, we want to combine the two phylogenetic trees into a hybridization network with the fewest hybridization events. This leads to three computational problems, namely, the problem of computing the minimum size of a hybridization network, the problem of constructing one minimum hybridization network, and the problem of enumerating a representative set of minimum hybridization networks. The previously best software tools for these problems (namely, Chen and Wang's HybridNet and Albrecht et al.'s Dendroscope 3) run very slowly for large instances that cannot be reduced to relatively small instances. Indeed, when the minimum size of a hybridization network of two given trees is larger than 23 and the problem for the trees cannot be reduced to relatively smaller independent subproblems, then HybridNet almost always takes longer than 1 day and Dendroscope 3 often fails to complete. Thus, a faster software tool for the problems is needed. We develop a software tool in ANSI C, named FastHN, for the following problems: computing the minimum size of a hybridization network, constructing one minimum hybridization network, and enumerating a representative set of minimum hybridization networks. We obtain FastHN by refining HybridNet with three ideas. The first idea is to preprocess the input trees so that the trees become smaller or the problem becomes to solve two or more relatively smaller independent subproblems. The second idea is to use a fast algorithm for computing the rSPR distance of two given phylogenetic trees to cut more branches of the search tree in the exhaustive-search stage of the algorithm. The third idea is that during the exhaustive-search stage of the algorithm, we find two sibling leaves in one of the two forests (obtained from the given trees by cutting some edges) such that

  2. A fast tool for minimum hybridization networks

    Directory of Open Access Journals (Sweden)

    Chen Zhi-Zhong

    2012-07-01

    Background: Due to hybridization events in evolution, studying two different genes of a set of species may yield two related but different phylogenetic trees for the set of species. In this case, we want to combine the two phylogenetic trees into a hybridization network with the fewest hybridization events. This leads to three computational problems, namely, the problem of computing the minimum size of a hybridization network, the problem of constructing one minimum hybridization network, and the problem of enumerating a representative set of minimum hybridization networks. The previously best software tools for these problems (namely, Chen and Wang’s HybridNet and Albrecht et al.’s Dendroscope 3) run very slowly for large instances that cannot be reduced to relatively small instances. Indeed, when the minimum size of a hybridization network of two given trees is larger than 23 and the problem for the trees cannot be reduced to relatively smaller independent subproblems, then HybridNet almost always takes longer than 1 day and Dendroscope 3 often fails to complete. Thus, a faster software tool for the problems is needed. Results: We develop a software tool in ANSI C, named FastHN, for the following problems: computing the minimum size of a hybridization network, constructing one minimum hybridization network, and enumerating a representative set of minimum hybridization networks. We obtain FastHN by refining HybridNet with three ideas. The first idea is to preprocess the input trees so that the trees become smaller or the problem becomes to solve two or more relatively smaller independent subproblems. The second idea is to use a fast algorithm for computing the rSPR distance of two given phylogenetic trees to cut more branches of the search tree in the exhaustive-search stage of the algorithm. The third idea is that during the exhaustive-search stage of the algorithm, we find two sibling leaves in one of the two forests (obtained from

  3. Computational Social Network Analysis

    CERN Document Server

    Hassanien, Aboul-Ella

    2010-01-01

    Presents insight into the social behaviour of animals (including the study of animal tracks and learning by members of the same species). This book provides web-based evidence of social interaction, perceptual learning, information granulation and the behaviour of humans and affinities between web-based social networks

  4. Analysis of computer networks

    CERN Document Server

    Gebali, Fayez

    2015-01-01

    This textbook presents the mathematical theory and techniques necessary for analyzing and modeling high-performance global networks, such as the Internet. The three main building blocks of high-performance networks are links, switching equipment connecting the links together, and software employed at the end nodes and intermediate switches. This book provides the basic techniques for modeling and analyzing these last two components. Topics covered include, but are not limited to: Markov chains and queuing analysis, traffic modeling, interconnection networks, switch architectures and buffering strategies. · Provides techniques for modeling and analysis of network software and switching equipment; · Discusses design options used to build efficient switching equipment; · Includes many worked examples of the application of discrete-time Markov chains to communication systems; · Covers the mathematical theory and techniques necessary for ana...

  5. Network Patch Cables Demystified: A Super Activity for Computer Networking Technology

    Science.gov (United States)

    Brown, Douglas L.

    2004-01-01

    This article de-mystifies network patch cable secrets so that people can connect their computers and transfer those pesky files--without screaming at the cables. It describes a network cabling activity that can offer students a great hands-on opportunity for working with the tools, techniques, and media used in computer networking. Since the…

  6. DETECTING NETWORK ATTACKS IN COMPUTER NETWORKS BY USING DATA MINING METHODS

    OpenAIRE

    Platonov, V. V.; Semenov, P. O.

    2016-01-01

    The article describes an approach to the development of an intrusion detection system for computer networks. It is shown that the usage of several data mining methods and tools can improve the efficiency of protecting computer networks against network attacks, due to the combination of the benefits of signature detection and anomaly detection and the opportunity of adapting the system to the hardware and software structure of the computer network.

  7. A software tool for network intrusion detection

    CSIR Research Space (South Africa)

    Van der Walt, C

    2012-10-01

    This presentation illustrates how a recently developed software tool enables operators to easily monitor a network and detect intrusions without requiring expert knowledge of network intrusion detection.

  8. Computer Network Security- The Challenges of Securing a Computer Network

    Science.gov (United States)

    Scotti, Vincent, Jr.

    2011-01-01

    This article is intended to give the reader an overall perspective on what it takes to design, implement, enforce and secure a computer network in the federal and corporate world to ensure the confidentiality, integrity and availability of information. While we will be giving you an overview of network design and security, this article will concentrate on the technology and human factors of securing a network and the challenges faced by those doing so. It will cover the large number of policies and the limits of technology and physical efforts to enforce such policies.

  9. Computing chemical organizations in biological networks.

    Science.gov (United States)

    Centler, Florian; Kaleta, Christoph; di Fenizio, Pietro Speroni; Dittrich, Peter

    2008-07-15

    Novel techniques are required to analyze computational models of intracellular processes as they increase steadily in size and complexity. The theory of chemical organizations has recently been introduced as such a technique that links the topology of biochemical reaction network models to their dynamical repertoire. The network is decomposed into algebraically closed and self-maintaining subnetworks called organizations. They form a hierarchy representing all feasible system states, including all steady states. We present three algorithms to compute the hierarchy of organizations for network models provided in SBML format. Two of them compute the complete organization hierarchy, while the third one uses heuristics to obtain a subset of all organizations for large models. While the constructive approach computes the hierarchy starting from the smallest organization in a bottom-up fashion, the flux-based approach employs self-maintaining flux distributions to determine organizations. A runtime comparison on 16 different network models of natural systems showed that neither of the two exhaustive algorithms is superior in all cases. Studying a 'genome-scale' network model with 762 species and 1193 reactions, we demonstrate how the organization hierarchy helps to uncover the model structure and allows one to evaluate the model's quality, for example by detecting components and subsystems of the model whose maintenance is not explained by the model. All data and a Java implementation that plugs into the Systems Biology Workbench are available from http://www.minet.uni-jena.de/csb/prj/ot/tools.
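    Of the two defining conditions for an organization, algebraic closure is the easier one to illustrate. The sketch below computes the closure of a species set under a toy reaction list; self-maintenance, the flux-based condition that the record's algorithms also check, is deliberately omitted.

```python
def closure(species: set, reactions: list) -> set:
    """Smallest closed superset of `species`: repeatedly add the products of
    every reaction whose reactants are all already present. Reactions are
    (reactant_set, product_set) pairs; self-maintenance is not checked here."""
    s = set(species)
    changed = True
    while changed:
        changed = False
        for reactants, products in reactions:
            if reactants <= s and not products <= s:
                s |= products
                changed = True
    return s

# Toy reaction network: a + b -> c, c -> a + d
rxns = [({"a", "b"}, {"c"}), ({"c"}, {"a", "d"})]
print(closure({"a", "b"}, rxns))  # -> {'a', 'b', 'c', 'd'}
```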

  10. Data Logistics in Network Computing

    CERN Multimedia

    CERN. Geneva; Marquina, Miguel Angel

    2005-01-01

    In distributed computing environments, performance is often dominated by the time that it takes to move data over a network. In the case of data-centric applications, or Data Grids, this problem of data movement becomes one of the overriding concerns. This talk describes techniques for improving data movement in Grid environments that we refer to as 'logistics.' We demonstrate that by using storage and cooperative forwarding 'in' the network, we can improve end to end throughput in many cases. Our approach offers clear performance benefits for high-bandwidth, high-latency networks. This talk will introduce the Logistical Session Layer (LSL) and provide experimental results from that system.

  11. Collective network for computer structures

    Energy Technology Data Exchange (ETDEWEB)

    Blumrich, Matthias A [Ridgefield, CT; Coteus, Paul W [Yorktown Heights, NY; Chen, Dong [Croton On Hudson, NY; Gara, Alan [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Hoenicke, Dirk [Ossining, NY; Takken, Todd E [Brewster, NY; Steinmacher-Burow, Burkhard D [Wernau, DE; Vranas, Pavlos M [Bedford Hills, NY

    2011-08-16

    A system and method for enabling high-speed, low-latency global collective communications among interconnected processing nodes. The global collective network optimally enables collective reduction operations to be performed during parallel algorithm operations executing in a computer structure having a plurality of the interconnected processing nodes. Router devices are included that interconnect the nodes of the network via links to facilitate performance of low-latency global processing operations at nodes of the virtual network and class structures. The global collective network may be configured to provide global barrier and interrupt functionality in an asynchronous or synchronized manner. When implemented in a massively-parallel supercomputing structure, the global collective network is physically and logically partitionable according to the needs of a processing algorithm.

  12. Visualization Tools for Teaching Computer Security

    Science.gov (United States)

    Yuan, Xiaohong; Vega, Percy; Qadah, Yaseen; Archer, Ricky; Yu, Huiming; Xu, Jinsheng

    2010-01-01

    Using animated visualization tools has been an important teaching approach in computer science education. We have developed three visualization and animation tools that demonstrate various information security concepts and actively engage learners. The information security concepts illustrated include: packet sniffer and related computer network…

  13. Optimal monitoring of computer networks

    Energy Technology Data Exchange (ETDEWEB)

    Fedorov, V.V.; Flanagan, D.

    1997-08-01

    The authors apply the ideas from optimal design theory to the very specific area of monitoring large computer networks. The behavior of these networks is so complex and uncertain that it is quite natural to use the statistical methods of experimental design which were originated in such areas as biology, behavioral sciences and agriculture, where the random character of phenomena is a crucial component and systems are too complicated to be described by some sophisticated deterministic models. They want to emphasize that only the first steps have been completed, and relatively simple underlying concepts about network functions have been used. Their immediate goal is to initiate studies focused on developing efficient experimental design techniques which can be used by practitioners working with large networks operating and evolving in a random environment.

  14. Computational medicine tools and challenges

    CERN Document Server

    Trajanoski, Zlatko

    2014-01-01

    This book covers a number of contemporary computational medicine topics spanning scales from molecular to cell to organ and organism, presenting a state-of-the-art IT infrastructure, and reviewing four hierarchical scales.

  15. COMPUTER AIDED DESIGN OF CUTTING TOOLS

    Directory of Open Access Journals (Sweden)

    Jakub Matuszak

    2015-11-01

    A correct and stable machining process requires an appropriate cutting tool. In most cases the tool can be selected by using special tool catalogs, often available in online versions. But in some cases there is a need to design unusual tools, for special treatments, which are not available in tool manufacturers’ catalogs. Proper tool design requires strength and geometric calculations; moreover, in many cases specific technical documentation is required. By using computer-aided design of cutting tools, this task can be carried out quickly and with high accuracy. Cutting tool visualization in CAD programs gives a clear overview of the design process. Besides, these programs provide the ability to simulate the real machining process. Nowadays, 3D modeling in CAD programs is a fundamental tool for engineers. Therefore, it is important to use them in the education process.

  16. Markov Networks in Evolutionary Computation

    CERN Document Server

    Shakya, Siddhartha

    2012-01-01

    Markov networks and other probabilistic graphical models have recently received an upsurge in attention from the evolutionary computation community, particularly in the area of estimation of distribution algorithms (EDAs). EDAs have arisen as one of the most successful experiences in the application of machine learning methods in optimization, mainly due to their efficiency in solving complex real-world optimization problems and their suitability for theoretical analysis. This book focuses on the different steps involved in the conception, implementation and application of EDAs that use Markov networks, and undirected models in general. It can serve as a general introduction to EDAs but also covers an important current void in the study of these algorithms by explaining the specificities and benefits of modeling optimization problems by means of undirected probabilistic models. All major developments to date in the progressive introduction of Markov networks based EDAs are reviewed in the book. Hot current researc...

  17. Networks as Tools for Sustainable Urban Development

    DEFF Research Database (Denmark)

    Jensen, Jesper Ole; Tollin, Nicola

    Due to the increasing number of networks related to sustainable urban development (SUD), the paper focuses on understanding in which ways networks can be considered useful tools for sustainable urban development, taking particularly into consideration the networks' potential for spreading innovative policies, strategies and actions. There has been little theoretical development on the subject. In practice, networks for sustainable development can be seen as combining different theoretical approaches to networks, including governance, urban competition and innovation. To give a picture of the variety of sustainable networks, we present different examples of networks operating at different geographical scales, from global to local, with different missions (organizational, political, technical), fields (lobbying, learning, branding) and sizes. The potentials and challenges related to sustainable networks...

  18. Networks as Tools for Urban Sustainability

    DEFF Research Database (Denmark)

    Jensen, Jesper Ole; Tollin, Nicola

    2004-01-01

    Due to the increasing number of networks related to sustainable urban development (SUD), the paper focuses on understanding in which ways networks can be considered useful tools for sustainable urban development, taking particularly into consideration the networks' potential for spreading innovative policies, strategies and actions. There has been little theoretical development on the subject. In practice, networks for sustainable development can be seen as combining different theoretical approaches to networks, including governance, urban competition and innovation. To give a picture of the variety of sustainable networks, we present different examples of networks operating at different geographical scales, from global to local, with different missions (organizational, political, technical), fields (lobbying, learning, branding) and sizes. The potentials and challenges related to sustainable networks...

  19. Enhancing Classroom Effectiveness through Social Networking Tools

    Science.gov (United States)

    Kurthakoti, Raghu; Boostrom, Robert E., Jr.; Summey, John H.; Campbell, David A.

    2013-01-01

    To determine the usefulness of social networking Web sites such as Ning.com as a communication tool in marketing courses, a study was designed with special concern for social network use in comparison to Blackboard. Students from multiple marketing courses were surveyed. Assessments of Ning.com and Blackboard were performed both to understand how…

  20. Conceptual Aspects in the Modeling of Logistical Risk of the Networked Information Economy with the Use of Tools of Natural Computing

    Directory of Open Access Journals (Sweden)

    Vitlinskyy Valdemar V.

    2016-11-01

    Full Text Available Information and communication tools and technologies are rapidly changing the daily lives of people and business processes in economic activity, primarily in the field of logistics. In particular, the innovative nature of these transformations leads to the emergence of new logistical risks and changes the essence of the existing ones, which needs to be taken into account in the management of logistics systems at various levels. Besides, the problem of Big Data has become increasingly urgent: on the one hand, Big Data can improve the validity of managerial decisions; on the other hand, they require modern tools for their production, processing and analysis. Methods and models of natural computing can be used as such tools. In the paper, the basics of ant and bee algorithms, the particle swarm method, and artificial immune systems are summarized; the possibilities of their application in the modeling of various types of logistical risk are demonstrated, and the formalization of the risk modeling problem with the use of an artificial immune system is given as a conditional example.
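
    To make the idea of natural computing as a modeling tool concrete, below is a minimal particle swarm optimization sketch in Python. The risk function is a hypothetical stand-in, not taken from the paper: it simply penalizes distance from some preferred logistics configuration.

    ```python
    import random

    def pso(f, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
        """Minimize f over [-5, 5]^dim with a basic particle swarm."""
        pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]                  # personal best positions
        gbest = min(pbest, key=f)[:]                 # global best position
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = random.random(), random.random()
                    # Velocity update: inertia + pull toward personal and global bests.
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] += vel[i][d]
                if f(pos[i]) < f(pbest[i]):
                    pbest[i] = pos[i][:]
                    if f(pos[i]) < f(gbest):
                        gbest = pos[i][:]
        return gbest

    # Hypothetical risk surrogate: cost grows with distance from config (1, 2).
    risk = lambda x: (x[0] - 1) ** 2 + (x[1] - 2) ** 2
    print(pso(risk))   # converges near [1, 2]
    ```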

  1. Computational design tools for synthetic biology.

    Science.gov (United States)

    Marchisio, Mario A; Stelling, Jörg

    2009-08-01

    Computer-aided design, pervasive in other engineering disciplines, is currently developing in synthetic biology. Concepts for standardization and hierarchies of parts, devices and systems provide a basis for efficient engineering in biology. Recently developed computational tools, for instance, enable rational (and graphical) composition of genetic circuits from standard parts, and subsequent simulation for testing the predicted functions in silico. The computational design of DNA and proteins with predetermined quantitative functions has made similar advances. The biggest challenge, however, is the integration of tools and methods into powerful and intuitively usable workflows, and the field is only starting to address it.

  2. Social Networking Tools for Academic Libraries

    Science.gov (United States)

    Chu, Samuel Kai-Wah; Du, Helen S.

    2013-01-01

    This is an exploratory study investigating the use of social networking tools in academic libraries, examining the extent of their use, library staff's perceptions of their usefulness and challenges, and factors influencing decisions to use or not to use such tools. Invitations to participate in a web-based survey were sent to 140 university…

  3. A Multilayer Model of Computer Networks

    OpenAIRE

    Shchurov, Andrey A.

    2015-01-01

    The fundamental concept of applying the system methodology to network analysis declares that network architecture should take into account the services and applications which the network provides and supports. This work introduces a formal model of computer networks on the basis of hierarchical multilayer networks. In turn, individual layers are represented as multiplex networks. The concept of layered networks provides conditions for top-down consistency of the model. Next, we determined the...
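
    A minimal sketch of the kind of layered representation the abstract describes, assuming nothing about the paper's actual formalism; the layer names and node identifiers below are invented for illustration.

    ```python
    from collections import defaultdict

    class MultilayerNetwork:
        """Each layer keeps its own adjacency structure; the same node ID
        may appear in several layers, tying its representations together."""
        def __init__(self, layers):
            self.layers = {name: defaultdict(set) for name in layers}

        def add_edge(self, layer, u, v):
            self.layers[layer][u].add(v)
            self.layers[layer][v].add(u)

        def neighbors(self, layer, u):
            return self.layers[layer][u]

    net = MultilayerNetwork(["physical", "transport", "application"])
    net.add_edge("physical", "switch1", "host_a")
    net.add_edge("transport", "host_a", "host_b")    # e.g., a TCP session
    net.add_edge("application", "host_a", "host_b")  # e.g., an HTTP exchange
    print(net.neighbors("transport", "host_a"))      # -> {'host_b'}
    ```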

  4. Biological networks 101: computational modeling for molecular biologists

    NARCIS (Netherlands)

    Scholma, Jetse; Schivo, Stefano; Urquidi Camacho, Ricardo A.; van de Pol, Jan Cornelis; Karperien, Hermanus Bernardus Johannes; Post, Janine Nicole

    2014-01-01

    Computational modeling of biological networks permits the comprehensive analysis of cells and tissues to define molecular phenotypes and novel hypotheses. Although a large number of software tools have been developed, the versatility of these tools is limited by mathematical complexities that prevent their broad adoption and effective use by molecular biologists.

  5. Personal computer local networks report

    CERN Document Server

    1991-01-01

    Please note this is a Short Discount publication. Since the first microcomputer local networks of the late 1970s and early 1980s, personal computer LANs have expanded in popularity, especially since the introduction of IBM's first PC in 1981. The late 1980s saw a maturing in the industry, with only a few vendors maintaining a large share of the market. This report is intended to give the reader a thorough understanding of the technology used to build these systems ... from cable to chips ... to ... protocols to servers. The report also fully defines PC LANs and the marketplace, with in-

  6. GSSIM – A Tool for Distributed Computing Experiments

    Directory of Open Access Journals (Sweden)

    Sławomir Bąk

    2011-01-01

    Full Text Available In this paper we present the Grid Scheduling Simulator (GSSIM), a comprehensive and advanced simulation tool for distributed computing problems. Based on a classification of simulator features proposed in the paper, we define problems that can be simulated using GSSIM and compare it to other simulation tools. We focus on an extension of our previous works, including advanced workload generation methods, simulation of a network with advance reservation features, handling of specific application performance models, and energy efficiency modeling. Some important features of GSSIM are demonstrated by three diverse experiments conducted with the use of the tool. We also present an advanced web tool for the remote management and execution of simulation experiments, which makes GSSIM a comprehensive distributed computing simulator available on the Web.

  7. Network management tools for a GPS datalink network

    Science.gov (United States)

    Smyth, Padhraic; Chauvin, Todd; Oliver, Gordon; Statman, Joseph

    1991-01-01

    The availability of GPS (Global Positioning System) information in real-time via a datalink system is shown to significantly increase the capacity of flight test and training ranges in terms of missions supported. This increase in mission activity imposes demands on mission planning in the range-operations environment. In this context, network management tools which can improve the capability of range personnel to plan, monitor, and control network resources are of significant interest. The application of both simulation and artificial intelligence techniques to develop such network management tools is described.

  8. Terminal-oriented computer-communication networks.

    Science.gov (United States)

    Schwartz, M.; Boorstyn, R. R.; Pickholtz, R. L.

    1972-01-01

    Four examples of currently operating computer-communication networks are described in this tutorial paper. They include the TYMNET network, the GE Information Services network, the NASDAQ over-the-counter stock-quotation system, and the Computer Sciences Infonet. These networks all use programmable concentrators for combining a multiplicity of terminals. Included in the discussion of each network is a description of the overall network structure, the handling and transmission of messages, communication requirements, routing and reliability considerations where applicable, operating data and design specifications where available, and unique design features in the area of computer communications.

  9. Regional Use of Social Networking Tools

    Science.gov (United States)

    2014-12-01

    [Extraction residue from the report's table of contents and tables. Recoverable information: the report surveys regional preferences for social networking tools (including Facebook, Twitter, YouTube, LinkedIn, Pinterest, Tumblr and Instagram) and local social networking services, with active-user-base figures such as YouTube 280 million, Twitter 255 million, Tumblr 300 million and Instagram 200 million; it notes that Instagram, acquired by Facebook in 2012, is a mobile social networking service, and that some usage percentages may decline in the future.]

  10. Computer networks ISE a systems approach

    CERN Document Server

    Peterson, Larry L

    2007-01-01

    Computer Networks, 4E is the only introductory computer networking book written by authors who have had first-hand experience with many of the protocols discussed in the book, who have actually designed some of them as well, and who are still actively designing computer networks today. This newly revised edition continues to provide an enduring, practical understanding of networks and their building blocks through rich, example-based instruction. The authors' focus is on the why of network design, not just the specifications comprising today's systems but how key technologies and p

  11. A Study of the Impact of Virtualization on the Computer Networks

    OpenAIRE

    Timalsena, Pratik

    2013-01-01

    Virtualization is an imminent sector of information technology in the present world. It is advancing and being popularly implemented worldwide. Computer networks are not isolated from the global impact of virtualization, which is being deployed on computer networks to a great extent. In general, virtualization is an inevitable tool for computer networks. This report presents a high-level overview of the impact of virtualization on computer networks. The report...

  12. Computer Network Defense Through Radial Wave Functions

    OpenAIRE

    Malloy, Ian

    2016-01-01

    The purpose of this research was to synthesize basic and fundamental findings in quantum computing, as applied to the attack and defense of conventional computer networks. The concept focuses on uses of radio waves as a shield for, and attack against, traditional computers. A logic bomb is analogous to a landmine in a computer network, and implementing non-trivial mitigation against it will aid computer network defense. As has been seen in kinetic warfare, the use of landmines has be...

  13. Social Networking Sites as a Learning Tool

    Science.gov (United States)

    Sanchez-Casado, Noelia; Cegarra Navarro, Juan Gabriel; Wensley, Anthony; Tomaseti-Solano, Eva

    2016-01-01

    Purpose: Over the past few years, social networking sites (SNSs) have become very useful for firms, allowing companies to manage the customer-brand relationships. In this context, SNSs can be considered as a learning tool because of the brand knowledge that customers develop from these relationships. Because of the fact that knowledge in…

  14. Computer network and knowledge sharing. Computer network to chishiki kyoyu

    Energy Technology Data Exchange (ETDEWEB)

    Yoshimura, S. (The University of Tokyo, Tokyo (Japan))

    1991-10-20

    The information system has changed from the on-line database as a simple form of knowledge sharing, used in the times when devices were expensive, to dialogue-type approaches as a result of TSS advancement. This paper describes the advantages in and methods of utilizing personal computer communications from the standpoint of a person engaged in chemistry education. Electronic mail has a number of advantages: you can reach a person as immediately as by telephone but need not interrupt the receiver's work, and it is easier than writing a letter. In particular, the electronic signboard has a large living know-how effect, in that ''someone who happens to know it can answer''. The Japan Chemical Society has opened the ''Square of Chemistry'' on the NIFTY Serve. Although the Society provides information, it is important that the participants make proposals actively and provide topics. Such a network is expanding to a worldwide scale.

  15. Design of a computation tool for neutron spectrometry and dosimetry through evolutionary neural networks; Diseno de una herramienta de computo para la espectrometria y dosimetria de neutrones por medio de redes neuronales evolutivas

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz R, J. M.; Vega C, H. R. [Universidad Autonoma de Zacatecas, Unidad Academica de Ingenieria Electrica, Av. Ramon Lopez Velarde No. 801, Col. Centro, Zacatecas (Mexico); Martinez B, M. R. [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Av. Ramon Lopez Velarde No. 801, Col. Centro, Zacatecas (Mexico); Gallego, E. [Universidad Politecnica de Madrid, Departamento de Ingenieria Nuclear, Jose Gutierrez Abascal No. 2, E-28006 Madrid (Spain)], e-mail: morvymmyahoo@com.mx

    2009-10-15

    Neutron dosimetry is one of the most complicated tasks of radiation protection, because it is a complex technique and highly dependent on neutron energy. One of the first devices used to perform neutron spectrometry is the system known as the Bonner sphere spectrometric system, which continues to be one of the most commonly used spectrometers. This system has disadvantages such as: the weight of the components, the low resolution of the spectrum, and a long and drawn-out procedure for spectrum reconstruction, which requires an expert user in system management and the use of a reconstruction code such as BUNKIE, SAND, etc. These codes are based on an iterative reconstruction algorithm whose greatest inconvenience is that, for the spectrum reconstruction, the system must be provided with an initial spectrum as close as possible to the spectrum to be obtained. Consequently, researchers have mentioned the need to develop alternative measurement techniques to improve existing monitoring systems for workers. Among these alternative techniques, several reconstruction procedures based on artificial intelligence techniques have been reported, such as genetic algorithms, artificial neural networks, and hybrid systems of evolutionary artificial neural networks using genetic algorithms. However, the use of these techniques in the nuclear science area is not free of problems, so it has been suggested that more research be conducted in such a way as to solve these disadvantages. Because they are emerging technologies, there are no tools for the analysis of results, so in this paper we first present the design of a computation tool that allows analyzing the neutron spectra and equivalent doses obtained through the hybrid technology of neural networks and genetic algorithms. This tool provides a friendly, intuitive graphical user environment that is easy to operate. The speed of program operation is high, executing the analysis in a few seconds, so it may store and/or print the obtained information for
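
    The hybrid approach described (a genetic algorithm evolving the weights of a neural network) can be sketched in a few lines of Python. The following is illustrative only: the network topology, the fitness function, and the "calibration pairs" are hypothetical stand-ins, not the paper's actual detector model.

    ```python
    import math, random

    def forward(weights, x, n_hidden=4):
        """Tiny fixed-topology net: 3 inputs -> n_hidden tanh units -> 1 output."""
        w1 = weights[:3 * n_hidden]
        w2 = weights[3 * n_hidden:]
        hidden = [math.tanh(sum(x[i] * w1[h * 3 + i] for i in range(3)))
                  for h in range(n_hidden)]
        return sum(hidden[h] * w2[h] for h in range(n_hidden))

    def evolve(data, n_weights=16, pop=40, gens=200, sigma=0.1):
        """Evolve the weight vector with a simple truncation-selection GA."""
        population = [[random.gauss(0, 1) for _ in range(n_weights)] for _ in range(pop)]
        mse = lambda w: sum((forward(w, x) - y) ** 2 for x, y in data) / len(data)
        for _ in range(gens):
            population.sort(key=mse)
            parents = population[:pop // 2]            # keep the best half
            children = [[g + random.gauss(0, sigma)    # mutate a random parent
                         for g in random.choice(parents)]
                        for _ in range(pop - len(parents))]
            population = parents + children
        return min(population, key=mse)

    # Hypothetical calibration pairs: (normalized sphere counts, reference dose).
    data = [((0.2, 0.5, 0.3), 0.4), ((0.6, 0.1, 0.3), 0.7), ((0.1, 0.8, 0.1), 0.2)]
    best = evolve(data)
    print(forward(best, (0.2, 0.5, 0.3)))   # should approach 0.4
    ```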

  16. Learning Words through Computer-Adaptive Tool

    DEFF Research Database (Denmark)

    Zhang, Chun

    2005-01-01

    construction, I stress the design of a test theory, namely, a learning algorithm. The learning algorithm is designed under such principles that users experience both 'elaborative rehearsal' (aspects of receptive and productive learning) and 'expanding rehearsal' (memory-based learning and repetitive act...... the category of L2 lexical learning in a computer-adaptive learning environment. The reason to adopt a computer-adaptive tool in WPG is based on the following premises: 1. Lexical learning is incremental in nature. 2. Learning can be measured precisely with tests (objectivist epistemology). In the course of WPG...... These design principles are coupled with cognitive approaches for the design and analysis of learning and instruction in lexical learning....
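
    "Expanding rehearsal" is the spaced-repetition idea of growing the review interval after each successful recall. A minimal scheduling sketch follows; the doubling factor and reset rule are assumptions for illustration, since the record does not specify the tool's actual algorithm.

    ```python
    def next_interval(interval_days, recalled, factor=2.0):
        """Expanding rehearsal: grow the gap after a successful recall,
        reset to one day after a failed recall."""
        return interval_days * factor if recalled else 1.0

    interval = 1.0
    for recalled in [True, True, False, True, True]:
        interval = next_interval(interval, recalled)
        print(f"review again in {interval:g} day(s)")   # 2, 4, 1, 2, 4
    ```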

  17. Mobile Agents in Networking and Distributed Computing

    CERN Document Server

    Cao, Jiannong

    2012-01-01

    The book focuses on mobile agents, which are computer programs that can autonomously migrate between network sites. This text introduces the concepts and principles of mobile agents, provides an overview of mobile agent technology, and focuses on applications in networking and distributed computing.

  18. BGen: A UML Behavior Network Generator Tool

    Science.gov (United States)

    Huntsberger, Terry; Reder, Leonard J.; Balian, Harry

    2010-01-01

    BGen software was designed for autogeneration of code based on a graphical representation of a behavior network used for controlling automatic vehicles. A common format used for describing a behavior network, such as that used in the JPL-developed behavior-based control system, CARACaS ["Control Architecture for Robotic Agent Command and Sensing" (NPO-43635), NASA Tech Briefs, Vol. 32, No. 10 (October 2008), page 40] includes a graph with sensory inputs flowing through the behaviors in order to generate the signals for the actuators that drive and steer the vehicle. A computer program to translate Unified Modeling Language (UML) Freeform Implementation Diagrams into a legacy C implementation of Behavior Network has been developed in order to simplify the development of C-code for behavior-based control systems. UML is a popular standard developed by the Object Management Group (OMG) to model software architectures graphically. The C implementation of a Behavior Network is functioning as a decision tree.

  19. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

    Atsushi eFukushima

    2014-11-01

    Full Text Available One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data covering the genome, transcriptome, proteome and metabolome, together with mathematical models, are expected to integrate and expand our knowledge of complex plant metabolism.

  20. Automated classification of computer network attacks

    CSIR Research Space (South Africa)

    Van Heerden, R

    2013-11-01

    Full Text Available In this paper we demonstrate how an automated reasoner, HermiT, is used to classify instances of computer network based attacks in conjunction with a network attack ontology. The ontology describes different types of network attacks through classes...

  1. Computational tools to investigate genetic cardiac channelopathies

    Science.gov (United States)

    Abriel, Hugues; de Lange, Enno; Kucera, Jan P.; Loussouarn, Gildas; Tarek, Mounir

    2013-01-01

    The aim of this perspective article is to share with the community of ion channel scientists our thoughts and expectations regarding the increasing role that computational tools will play in the future of our field. The opinions and comments detailed here are the result of a 3-day long international exploratory workshop that took place in October 2013 and that was supported by the Swiss National Science Foundation. PMID:24421770

  2. WiFiSiM: An Educational Tool for the Study and Design of Wireless Networks

    Science.gov (United States)

    Mateo Sanguino, T. J.; Serrano Lopez, C.; Marquez Hernandez, F. A.

    2013-01-01

    A new educational simulation tool designed for the generic study of wireless networks, the Wireless Fidelity Simulator (WiFiSim), is presented in this paper. The goal of this work was to create and implement a didactic tool to improve the teaching and learning of computer networks by means of two complementary strategies: simulating the behavior…

  3. Final Report: Correctness Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [Rice Univ., Houston, TX (United States)

    2014-10-27

    In the course of developing parallel programs for leadership computing systems, subtle programming errors often arise that are extremely difficult to diagnose without tools. To meet this challenge, University of Maryland, the University of Wisconsin—Madison, and Rice University worked to develop lightweight tools to help code developers pinpoint a variety of program correctness errors that plague parallel scientific codes. The aim of this project was to develop software tools that help diagnose program errors including memory leaks, memory access errors, round-off errors, and data races. Research at Rice University focused on developing algorithms and data structures to support efficient monitoring of multithreaded programs for memory access errors and data races. This is a final report about research and development work at Rice University as part of this project.

  4. Integrating network awareness in ATLAS distributed computing

    CERN Document Server

    De, K; The ATLAS collaboration; Klimentov, A; Maeno, T; Mckee, S; Nilsson, P; Petrosyan, A; Vukotic, I; Wenaus, T

    2014-01-01

    A crucial contributor to the success of the massively scaled global computing system that delivers the analysis needs of the LHC experiments is the networking infrastructure upon which the system is built. The experiments have been able to exploit excellent high-bandwidth networking in adapting their computing models for the most efficient utilization of resources. New advanced networking technologies now becoming available such as software defined networks hold the potential of further leveraging the network to optimize workflows and dataflows, through proactive control of the network fabric on the part of high level applications such as experiment workload management and data management systems. End to end monitoring of networking and data flow performance further allows applications to adapt based on real time conditions. We will describe efforts underway in ATLAS on integrating network awareness at the application level, particularly in workload management.

  5. Network Management of the SPLICE Computer Network.

    Science.gov (United States)

    1982-12-01

    ...and the Lawrence Livermore National Laboratory Octopus network [Ref. 24]. Additionally, the Codex Distributed Network Control Systems 200 and 330... [the remainder of this OCR-damaged record, apparently reference-list fragments, is unrecoverable]

  6. System/360 Computer Assisted Network Scheduling (CANS) System

    Science.gov (United States)

    Brewer, A. C.

    1972-01-01

    Computer-assisted scheduling techniques that produce conflict-free and efficient schedules have been developed and implemented to meet the needs of the Manned Space Flight Network. The CANS system provides effective management of resources in a complex scheduling environment. The system is an automated resource scheduling, controlling, planning, and information storage and retrieval tool.

  7. Computational proteomics tools for identification and quality control.

    Science.gov (United States)

    Kopczynski, Dominik; Sickmann, Albert; Ahrends, Robert

    2017-11-10

    Computational proteomics is a constantly growing field that supports end users with powerful and reliable tools for performing several computational steps within an analytics workflow for proteomics experiments. Typically, after capture with a mass spectrometer, the proteins have to be identified and quantified. After certain follow-up analyses, an optional targeted approach is suitable for validating the results. The de.NBI (German network for bioinformatics infrastructure) service center in Dortmund provides several software applications and platforms as services to meet these demands. In this work, we present our tools and services, centered on the combination of SearchGUI and PeptideShaker. SearchGUI is a managing tool for several search engines to find peptide-spectrum matches for one or more complex MS2 measurements. PeptideShaker combines all matches and creates a consensus list of identified proteins, providing statistical confidence measures. In a next step, we are planning to release a web service for protein identification containing both tools. This system will be designed for high scalability and distributed computing, using solutions like the Docker container system among others. As an additional service, we offer a web-service-oriented database providing all necessary high-quality and high-resolution data for starting targeted proteomics analyses. The user can easily select proteins of interest, review the according spectra and download both protein sequences and spectral libraries. All systems are designed to be intuitive and user-friendly to operate. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Tools and Models for Integrating Multiple Cellular Networks

    Energy Technology Data Exchange (ETDEWEB)

    Gerstein, Mark [Yale Univ., New Haven, CT (United States). Gerstein Lab.

    2015-11-06

    In this grant, we have systematically investigated the integrated networks which are responsible for the coordination of activity between metabolic pathways in prokaryotes. We have developed several computational tools to analyze the topology of the integrated networks consisting of metabolic, regulatory, and physical interaction networks. The tools are all open-source, they are available to download from Github, and they can be incorporated in the Knowledgebase. Here, we summarize our work as follows. Understanding the topology of the integrated networks is the first step toward understanding their dynamics and evolution. For Aim 1 of this grant, we have developed a novel algorithm to determine and measure the hierarchical structure of transcriptional regulatory networks [1]. The hierarchy captures the direction of information flow in the network. The algorithm is generally applicable to regulatory networks in prokaryotes, yeast and higher organisms. Integrated datasets are extremely beneficial in understanding the biology of a system in a compact manner due to the conflation of multiple layers of information. Therefore, for Aim 2 of this grant, we have developed several tools and carried out analyses for integrating system-wide genomic information. To make use of the structural data, we have developed DynaSIN for protein-protein interaction networks with various dynamical interfaces [2]. We then examined the association between network topology and phenotypic effects such as gene essentiality. In particular, we have organized E. coli and S. cerevisiae transcriptional regulatory networks into hierarchies. We then correlated gene phenotypic effects by tinkering with different layers to elucidate which layers were more tolerant to perturbations [3]. In the context of evolution, we also developed a workflow to guide the comparison between different types of biological networks across various species using the concept of rewiring [4]. Furthermore, we have developed

  9. Computational network design from functional specifications

    KAUST Repository

    Peng, Chi Han

    2016-07-11

    Connectivity and layout of underlying networks largely determine agent behavior and usage in many environments. For example, transportation networks determine the flow of traffic in a neighborhood, whereas building floorplans determine the flow of people in a workspace. Designing such networks from scratch is challenging as even local network changes can have large global effects. We investigate how to computationally create networks starting from only high-level functional specifications. Such specifications can be in the form of network density, travel time versus network length, traffic type, destination location, etc. We propose an integer programming-based approach that guarantees that the resultant networks are valid by fulfilling all the specified hard constraints and that they score favorably in terms of the objective function. We evaluate our algorithm in two different design settings, street layout and floorplans to demonstrate that diverse networks can emerge purely from high-level functional specifications.
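
    A minimal sketch of the integer-programming idea, assuming the PuLP library is available; the nodes, segment costs, and constraints below are invented for illustration and are not the paper's actual formulation.

    ```python
    import pulp  # assumed available; any ILP solver would serve the same role

    nodes = ["A", "B", "C", "D"]
    # Hypothetical construction costs of candidate street segments.
    cost = {("A", "B"): 4, ("A", "C"): 2, ("B", "C"): 5, ("B", "D"): 10, ("C", "D"): 3}

    prob = pulp.LpProblem("network_design", pulp.LpMinimize)
    use = {e: pulp.LpVariable(f"use_{e[0]}{e[1]}", cat="Binary") for e in cost}

    # Objective: total construction cost of the selected segments.
    prob += pulp.lpSum(cost[e] * use[e] for e in cost)

    # Hard constraint from the functional spec: every node must be served
    # by at least one selected segment.
    for n in nodes:
        prob += pulp.lpSum(use[e] for e in cost if n in e) >= 1

    # Hard constraint on network density: at least 3 segments overall.
    prob += pulp.lpSum(use.values()) >= 3

    prob.solve()
    print([e for e in cost if use[e].value() == 1])
    ```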

  10. Reliable Interconnection Networks for Parallel Computers

    Science.gov (United States)

    1991-10-01

    [OCR-damaged report documentation page. Recoverable information: Technical Report 1294, "Reliable Interconnection Networks for Parallel Computers"; funding numbers N00014-80-C-0622, N00014-85-K-0124, N00014-91-J-1698; author: Larry...; subject terms: networks, fault tolerance, parallel computers, reliable routers; 78 pages.]

  11. Parallel computing and networking; Heiretsu keisanki to network

    Energy Technology Data Exchange (ETDEWEB)

    Asakawa, E.; Tsuru, T. [Japan National Oil Corp., Tokyo (Japan); Matsuoka, T. [Japan Petroleum Exploration Co. Ltd., Tokyo (Japan)

    1996-05-01

    This paper describes the trend of parallel computers used in geophysical exploration. Around 1993 was the early period when parallel computers began to be used for geophysical exploration. In those days, these computers were classified mainly as MIMD (multiple instruction stream, multiple data stream), SIMD (single instruction stream, multiple data stream) and the like. Parallel computers were publicized in the 1994 meeting of the Geophysical Exploration Society as a `high precision imaging technology`. Concerning libraries for parallel computers, there was a shift to PVM (parallel virtual machine) in 1993 and to MPI (message passing interface) in 1995. In addition, a FORTRAN90 compiler was released with support implemented for data-parallel and vector computers. In 1993, the networks used were Ethernet, FDDI, CDDI and HIPPI. In 1995, OC-3 products under ATM began to propagate. However, ATM remains an interoffice high-speed network because ATM service has not yet spread to the public network. 1 ref.
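
    The PVM-to-MPI shift the paper notes is still visible today; in Python, MPI-style message passing is commonly exercised through the mpi4py bindings. A minimal point-to-point sketch (the "seismic trace" payload is an invented example):

    ```python
    # Minimal message-passing sketch with mpi4py.
    # Run with, e.g.: mpiexec -n 2 python demo.py
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    if rank == 0:
        # Rank 0 sends a chunk of (made-up) seismic trace data to rank 1.
        comm.send([0.1, 0.4, 0.9], dest=1, tag=0)
    elif rank == 1:
        data = comm.recv(source=0, tag=0)
        print("rank 1 received", data)
    ```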

  12. Computer networking a top-down approach

    CERN Document Server

    Kurose, James

    2017-01-01

    Unique among computer networking texts, the Seventh Edition of the popular Computer Networking: A Top Down Approach builds on the author’s long tradition of teaching this complex subject through a layered approach in a “top-down manner.” The text works its way from the application layer down toward the physical layer, motivating readers by exposing them to important concepts early in their study of networking. Focusing on the Internet and the fundamentally important issues of networking, this text provides an excellent foundation for readers interested in computer science and electrical engineering, without requiring extensive knowledge of programming or mathematics. The Seventh Edition has been updated to reflect the most important and exciting recent advances in networking.

  13. Conceptual metaphors in computer networking terminology ...

    African Journals Online (AJOL)

    Conceptual metaphor theory (Lakoff & Johnson, 1980) is used as a basic framework for analysing and explaining the occurrence of metaphor in the terminology used by computer networking professionals in the information technology (IT) industry. An analysis of linguistic ...

  14. Computer Network Equipment for Intrusion Detection Research

    National Research Council Canada - National Science Library

    Ye, Nong

    2000-01-01

    .... To test the process model, the system-level intrusion detection techniques and the working prototype of the intrusion detection system, a set of computer and network equipment has been purchased...

  15. Risk, Privacy, and Security in Computer Networks

    OpenAIRE

    Årnes, Andre

    2006-01-01

    With an increasingly digitally connected society come complexity, uncertainty, and risk. Network monitoring, incident management, and digital forensics are of increasing importance with the escalation of cybercrime and other network-supported serious crimes. New laws and regulations governing electronic communications, cybercrime, and data retention are being proposed, continuously requiring new methods and tools. This thesis introduces a novel approach to real-time network risk assessmen...

  16. Computational Complexity of Bosons in Linear Networks

    Science.gov (United States)

    2017-03-01

    [Report documentation residue. Recoverable information: AFRL-AFOSR-JP-TR-2017-0020, "Computational complexity of bosons in linear networks", Andrew White, The University of Queensland; final report dated 07/27/2016, covering 02 Mar 2013 to 01 Mar 2016. Substantive fragment: the work enables direct exploration of the effect of partial distinguishability on the complexity class of the resulting sampling distribution, using a demultiplexed source.]

  17. Computer Networks and African Studies Centers.

    Science.gov (United States)

    Kuntz, Patricia S.

    The use of electronic communication in the 12 Title VI African Studies Centers is discussed, and the networks available for their use are reviewed. It is argued that the African Studies Centers should be on the cutting edge of contemporary electronic communication and that computer networks should be a fundamental aspect of their programs. An…

  18. A computer network attack taxonomy and ontology

    CSIR Research Space (South Africa)

    Van Heerden, RP

    2012-01-01

    Full Text Available of attacks, means that an attack could be mitigated accordingly. The authors extend a previous, initial taxonomy of computer network attacks which forms the basis of a proposed network attack ontology in this paper. The objective of this ontology...

  19. Virtual Network Computing Testbed for Cybersecurity Research

    Science.gov (United States)

    2015-08-17

    [Report documentation residue (Standard Form 298). Recoverable information: final report under grant W911NF-12-1-0393 (report 61504-CS-RIP.2). Reference fragment: Pullen, J. M., 2000. The network workbench: network simulation software for academic investigation of Internet concepts.]

  20. EFFICIENCY METRICS COMPUTING IN COMBINED SENSOR NETWORKS

    OpenAIRE

    Luntovskyy, Andriy; Vasyutynskyy, Volodymyr

    2014-01-01

    This paper discusses the computer-aided design of combined networks for offices and building automation systems based on diverse wired and wireless standards. The design requirements for these networks are often contradictory and have to consider performance, energy and cost efficiency together. For usual office communication, quality of service is more important. In wireless sensor networks, energy efficiency is a critical requirement to ensure their long life, to reduce maintenance ...

  1. Computational Tools for the Secondary Analysis of Metabolomics Experiments

    Directory of Open Access Journals (Sweden)

    Sean Cameron Booth

    2013-01-01

    Full Text Available Metabolomics experiments have become commonplace in a wide variety of disciplines. By identifying and quantifying metabolites researchers can achieve a systems level understanding of metabolism. These studies produce vast swaths of data which are often only lightly interpreted due to the overwhelmingly large amount of variables that are measured. Recently, a number of computational tools have been developed which enable much deeper analysis of metabolomics data. These data have been difficult to interpret as understanding the connections between dozens of altered metabolites has often relied on the biochemical knowledge of researchers and their speculations. Modern biochemical databases provide information about the interconnectivity of metabolism which can be automatically polled using metabolomics secondary analysis tools. Starting with lists of altered metabolites, there are two main types of analysis: enrichment analysis computes which metabolic pathways have been significantly altered whereas metabolite mapping contextualizes the abundances and significances of measured metabolites into network visualizations. Many different tools have been developed for one or both of these applications. In this review the functionality and use of these software is discussed. Together these novel secondary analysis tools will enable metabolomics researchers to plumb the depths of their data and produce farther reaching biological conclusions than ever before.

  2. COMPUTATIONAL TOOLS FOR THE SECONDARY ANALYSIS OF METABOLOMICS EXPERIMENTS

    Directory of Open Access Journals (Sweden)

    Sean C. Booth

    2013-01-01

    Full Text Available Metabolomics experiments have become commonplace in a wide variety of disciplines. By identifying and quantifying metabolites researchers can achieve a systems level understanding of metabolism. These studies produce vast swaths of data which are often only lightly interpreted due to the overwhelmingly large amount of variables that are measured. Recently, a number of computational tools have been developed which enable much deeper analysis of metabolomics data. These data have been difficult to interpret as understanding the connections between dozens of altered metabolites has often relied on the biochemical knowledge of researchers and their speculations. Modern biochemical databases provide information about the interconnectivity of metabolism which can be automatically polled using metabolomics secondary analysis tools. Starting with lists of altered metabolites, there are two main types of analysis: enrichment analysis computes which metabolic pathways have been significantly altered whereas metabolite mapping contextualizes the abundances and significances of measured metabolites into network visualizations. Many different tools have been developed for one or both of these applications. In this review the functionality and use of these software is discussed. Together these novel secondary analysis tools will enable metabolomics researchers to plumb the depths of their data and produce farther reaching biological conclusions than ever before.

  3. Autonomic computing enabled cooperative networked design

    CERN Document Server

    Wodczak, Michal

    2014-01-01

    This book introduces the concept of autonomic computing driven cooperative networked system design from an architectural perspective. As such it leverages and capitalises on the relevant advancements in both the realms of autonomic computing and networking by welding them closely together. In particular, a multi-faceted Autonomic Cooperative System Architectural Model is defined which incorporates the notion of Autonomic Cooperative Behaviour being orchestrated by the Autonomic Cooperative Networking Protocol of a cross-layer nature. The overall proposed solution not only advocates for the inc

  4. Spontaneous ad hoc mobile cloud computing network.

    Science.gov (United States)

    Lacuesta, Raquel; Lloret, Jaime; Sendra, Sandra; Peñalver, Lourdes

    2014-01-01

    Cloud computing helps users and companies to share computing resources instead of having local servers or personal devices to handle the applications. Smart devices are becoming one of the main information processing devices. Their computing features are reaching levels that let them create a mobile cloud computing network. But sometimes they are not able to create it and collaborate actively in the cloud because it is difficult for them to easily build a spontaneous network and configure its parameters. For this reason, in this paper, we present the design and deployment of a spontaneous ad hoc mobile cloud computing network. In order to perform this, we have developed a trusted algorithm that is able to manage the activity of the nodes when they join and leave the network. The paper shows the network procedures and classes that have been designed. Our simulation results using Castalia show that our proposal presents good efficiency and network performance even when using a high number of nodes.

  5. Spontaneous Ad Hoc Mobile Cloud Computing Network

    Directory of Open Access Journals (Sweden)

    Raquel Lacuesta

    2014-01-01

    Full Text Available Cloud computing helps users and companies to share computing resources instead of having local servers or personal devices to handle the applications. Smart devices are becoming one of the main information processing devices. Their computing features are reaching levels that let them create a mobile cloud computing network. But sometimes they are not able to create it and collaborate actively in the cloud because it is difficult for them to easily build a spontaneous network and configure its parameters. For this reason, in this paper, we present the design and deployment of a spontaneous ad hoc mobile cloud computing network. In order to perform this, we have developed a trusted algorithm that is able to manage the activity of the nodes when they join and leave the network. The paper shows the network procedures and classes that have been designed. Our simulation results using Castalia show that our proposal presents good efficiency and network performance even when using a high number of nodes.
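
    A toy sketch of the kind of trust-based membership management the abstract describes; the trust scores, thresholds, and update rules below are invented for illustration and are not the authors' actual algorithm.

    ```python
    import time

    class SpontaneousCloud:
        """Toy membership manager: nodes join with a starting trust score,
        lose trust when they misbehave, and are evicted below a threshold."""
        def __init__(self, min_trust=0.3):
            self.nodes = {}          # node_id -> {"trust": float, "joined": float}
            self.min_trust = min_trust

        def join(self, node_id):
            self.nodes[node_id] = {"trust": 0.5, "joined": time.time()}

        def report(self, node_id, behaved_well):
            node = self.nodes.get(node_id)
            if node is None:
                return
            node["trust"] += 0.1 if behaved_well else -0.2
            if node["trust"] < self.min_trust:
                del self.nodes[node_id]   # evict an untrusted node

    cloud = SpontaneousCloud()
    cloud.join("phone-1"); cloud.join("phone-2")
    cloud.report("phone-2", behaved_well=False)
    print(sorted(cloud.nodes))   # -> ['phone-1']
    ```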

  6. Algorithms and networking for computer games

    CERN Document Server

    Smed, Jouni

    2006-01-01

    Algorithms and Networking for Computer Games is an essential guide to solving the algorithmic and networking problems of modern commercial computer games, written from the perspective of a computer scientist. Combining algorithmic knowledge and game-related problems, the authors discuss all the common difficulties encountered in game programming. The first part of the book tackles algorithmic problems by presenting how they can be solved practically. As well as ""classical"" topics such as random numbers, tournaments and game trees, the authors focus on how to find a path in, create the terrai

  7. Computer methods in electric network analysis

    Energy Technology Data Exchange (ETDEWEB)

    Saver, P.; Hajj, I.; Pai, M.; Trick, T.

    1983-06-01

    The computational algorithms utilized in power system analysis have more than just a minor overlap with those used in electronic circuit computer aided design. This paper describes the computer methods that are common to both areas and highlights the differences in application through brief examples. Recognizing this commonality has stimulated the exchange of useful techniques in both areas and has the potential of fostering new approaches to electric network analysis through the interchange of ideas.
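
    The abstract does not name the shared methods, but a canonical kernel common to circuit CAD and power system analysis is nodal analysis, solving G v = i for the node voltages; a toy resistive example follows (component values invented).

    ```python
    import numpy as np

    # Nodal analysis G v = i for a 2-node resistive network (ground = node 0):
    # R1 = 1 ohm from node 1 to ground, R2 = 2 ohm between nodes 1 and 2,
    # R3 = 4 ohm from node 2 to ground, 1 A injected into node 1.
    G = np.array([[1/1 + 1/2, -1/2],
                  [-1/2, 1/2 + 1/4]])
    i = np.array([1.0, 0.0])
    v = np.linalg.solve(G, i)
    print(v)   # node voltages, approx. [0.857, 0.571]
    ```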

  8. Computer network time synchronization the network time protocol

    CERN Document Server

    Mills, David L

    2006-01-01

    What started with the sundial has, thus far, been refined to a level of precision based on atomic resonance: Time. Our obsession with time is evident in this continued scaling down to nanosecond resolution and beyond. But this obsession is not without warrant. Precision and time synchronization are critical in many applications, such as air traffic control and stock trading, and pose complex and important challenges in modern information networks.Penned by David L. Mills, the original developer of the Network Time Protocol (NTP), Computer Network Time Synchronization: The Network Time Protocol
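
    The heart of NTP is a pair of simple equations: from the four timestamps of one request/response exchange, the client estimates its clock offset and the round-trip delay. A self-contained sketch (the timestamps are fabricated for illustration):

    ```python
    def ntp_offset_delay(t0, t1, t2, t3):
        """Standard NTP clock equations: t0/t3 are the client's send/receive
        times, t1/t2 are the server's receive/send times."""
        offset = ((t1 - t0) + (t2 - t3)) / 2.0
        delay = (t3 - t0) - (t2 - t1)
        return offset, delay

    # Client clock runs ~50 ms ahead; each one-way trip takes ~10 ms.
    print(ntp_offset_delay(t0=100.000, t1=99.960, t2=99.961, t3=100.021))
    # -> (-0.05, 0.02): subtract 50 ms; round trip spent 20 ms on the wire
    ```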

  9. Social networks a framework of computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2014-01-01

    This volume provides the audience with updated, in-depth and highly coherent material on the conceptually appealing and practically sound information technology of Computational Intelligence applied to the analysis, synthesis and evaluation of social networks. The volume involves studies devoted to key issues of social networks including community structure detection in networks, online social networks, knowledge growth and evaluation, and diversity of collaboration mechanisms.  The book engages a wealth of methods of Computational Intelligence along with well-known techniques of linear programming, Formal Concept Analysis, machine learning, and agent modeling.  Human-centricity is of paramount relevance and this facet manifests in many ways including personalized semantics, trust metrics, and personal knowledge management; just to highlight a few of these aspects. The contributors to this volume report on various essential applications including cyber attack detection, building enterprise social network...

  10. VISTA - computational tools for comparative genomics

    Energy Technology Data Exchange (ETDEWEB)

    Frazer, Kelly A.; Pachter, Lior; Poliakov, Alexander; Rubin,Edward M.; Dubchak, Inna

    2004-01-01

    Comparison of DNA sequences from different species is a fundamental method for identifying functional elements in genomes. Here we describe the VISTA family of tools created to assist biologists in carrying out this task. Our first VISTA server at http://www-gsd.lbl.gov/VISTA/ was launched in the summer of 2000 and was designed to align long genomic sequences and visualize these alignments with associated functional annotations. Currently the VISTA site includes multiple comparative genomics tools and provides users with rich capabilities to browse pre-computed whole-genome alignments of large vertebrate genomes and other groups of organisms with VISTA Browser, submit their own sequences of interest to several VISTA servers for various types of comparative analysis, and obtain detailed comparative analysis results for a set of cardiovascular genes. We illustrate capabilities of the VISTA site by the analysis of a 180 kilobase (kb) interval on human chromosome 5 that encodes for the kinesin family member 3A (KIF3A) protein.

  11. Professional networking using computer-mediated communication.

    Science.gov (United States)

    Washer, Peter

    Traditionally, professionals have networked with others in their field through attending conferences, professional organizations, direct mailing, and via the workplace. Recently, there have been new possibilities to network with other professionals using the internet. This article looks at the possibilities that the internet offers for professional networking, particularly e-mailing lists, newsgroups and membership databases, and compares them against more traditional methods of professional networking. The different types of computer-mediated communication are discussed and their relative merits and disadvantages are examined. The benefits and potential pitfalls of internet professional networking, as it relates to the nursing profession, are examined. Practical advice is offered on how the internet can be used as a means to foster professional networks of academic, clinical or research interests.

  12. Natural computing for vehicular networks

    OpenAIRE

    Toutouh El Alamin, Jamal

    2016-01-01

    This thesis addresses the intelligent design of solutions for the deployment of vehicular ad hoc networks (VANETs). These are wireless communication networks formed mainly by vehicles and road infrastructure elements. VANETs offer the opportunity to develop revolutionary applications in the field of road safety and efficiency. Being such a novel domain, there is a series of open questions, such as the design of the infrastruct...

  13. Biological networks 101: computational modeling for molecular biologists.

    Science.gov (United States)

    Scholma, Jetse; Schivo, Stefano; Urquidi Camacho, Ricardo A; van de Pol, Jaco; Karperien, Marcel; Post, Janine N

    2014-01-01

    Computational modeling of biological networks permits the comprehensive analysis of cells and tissues to define molecular phenotypes and novel hypotheses. Although a large number of software tools have been developed, the versatility of these tools is limited by mathematical complexities that prevent their broad adoption and effective use by molecular biologists. This study clarifies the basic aspects of molecular modeling, how to convert data into useful input, as well as the number of time points and molecular parameters that should be considered for molecular regulatory models with both explanatory and predictive potential. We illustrate the necessary experimental preconditions for converting data into a computational model of network dynamics. This model requires neither a thorough background in mathematics nor precise data on intracellular concentrations, binding affinities or reaction kinetics. Finally, we show how an interactive model of crosstalk between signal transduction pathways in primary human articular chondrocytes allows insight into processes that regulate gene expression. © 2013 Elsevier B.V. All rights reserved.
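
    As a flavor of what such a model looks like in practice, here is a minimal dynamic model of a two-gene mutual-repression network integrated with SciPy; the parameters and topology are generic textbook choices, not the pathway model from the paper.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def network(t, y, k=2.0, n=2, gamma=1.0):
        """Two-gene toggle: each protein represses the other's synthesis
        (Hill repression), and both decay at rate gamma."""
        a, b = y
        da = k / (1.0 + b ** n) - gamma * a
        db = k / (1.0 + a ** n) - gamma * b
        return [da, db]

    sol = solve_ivp(network, t_span=(0, 20), y0=[1.5, 0.2],
                    t_eval=np.linspace(0, 20, 5))
    print(sol.y[:, -1])   # near-steady-state concentrations of A and B
    ```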

  14. Implementing Handheld Computers as Tools for First-Grade Writers

    Science.gov (United States)

    Kuhlman, Wilma D.; Danielson, Kathy Everts; Campbell, Elizabeth J.; Topp, Neal W.

    2006-01-01

    All humans use objects in their environment as tools for actions. Some tools are more useful than others for certain people and populations. This paper describes how different first-graders used handheld computers as tools when writing. While all 17 children in the observed classroom were competent users of their handheld computers, their use of…

  15. Integration of a network aware traffic generation device into a computer network emulation platform

    CSIR Research Space (South Africa)

    Von Solms, S

    2014-07-01

    Full Text Available Flexible, open source network emulation tools can provide network researchers with significant benefits regarding network behaviour and performance. The evaluation of these networks can benefit greatly from the integration of realistic, network...

  16. International Symposium on Computing and Network Sustainability

    CERN Document Server

    Akashe, Shyam

    2017-01-01

    The book is a compilation of technical papers presented at the International Research Symposium on Computing and Network Sustainability (IRSCNS 2016) held in Goa, India on 1st and 2nd July 2016. The areas covered in the book are sustainable computing and security, sustainable systems and technologies, sustainable methodologies and applications, sustainable network applications and solutions, user-centered services and systems, and mobile data management. The novel and recent technologies presented in the book will be helpful for researchers and industries in their advanced work.

  17. Passive Fingerprinting Of Computer Network Reconnaissance Tools

    Science.gov (United States)

    2009-09-01

    [Extraction residue from the thesis front matter and list of tables. Recoverable information: the work passively fingerprints computer network reconnaissance tools; tables include "Nmap (root) Fingerprint Summary" and "CDX Probable Nmap Scan Summary". Substantive fragment: when reviewing the different data captures for analysis, it was noted that one of the early default Nmap captures had...]

  18. On computer vision in wireless sensor networks.

    Energy Technology Data Exchange (ETDEWEB)

    Berry, Nina M.; Ko, Teresa H.

    2004-09-01

    Wireless sensor networks allow detailed sensing of otherwise unknown and inaccessible environments. While it would be beneficial to include cameras in a wireless sensor network because images are so rich in information, the power cost of transmitting an image across the wireless network can dramatically shorten the lifespan of the sensor nodes. This paper describes a new paradigm for the incorporation of imaging into wireless networks. Rather than focusing on transmitting images across the network, we show how an image can be processed locally for key features using simple detectors. Contrasted with traditional event detection systems that trigger an image capture, this enables a new class of sensors which uses a low-power imaging sensor to detect a variety of visual cues. Sharing these features among relevant nodes cues specific actions to better provide information about the environment. We report on various existing techniques developed for traditional computer vision research which can aid in this work.
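
    A minimal sketch of the kind of cheap, on-node cue detector the paper argues for: frame differencing over grayscale images, so only a boolean "motion" flag (not the image itself) needs to cross the radio. The thresholds are illustrative, not from the paper.

    ```python
    import numpy as np

    def motion_cue(prev_frame, frame, threshold=25, min_pixels=50):
        """Cheap on-node detector: flag motion when enough pixels change,
        instead of shipping the whole image across the wireless link."""
        diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
        changed = int((diff > threshold).sum())
        return changed >= min_pixels, changed

    prev = np.zeros((64, 64), dtype=np.uint8)
    cur = prev.copy()
    cur[10:20, 10:20] = 200          # simulated moving object
    print(motion_cue(prev, cur))     # -> (True, 100)
    ```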

  19. Distributed Denial of Service Tools, Trin00, Tribe Flood Network, Tribe Flood Network 2000 and Stacheldraht.

    Energy Technology Data Exchange (ETDEWEB)

    Criscuolo, P. J.

    2000-02-14

    One type of attack on computer systems is known as a Denial of Service (DoS) attack. A DoS attack is designed to prevent legitimate users from using a system. Traditional Denial of Service attacks are carried out by exploiting a buffer overflow, exhausting system resources, or exploiting a system bug that results in a system that is no longer functional. In the summer of 1999, a new breed of attack was developed, called the Distributed Denial of Service (DDoS) attack. Several educational and high-capacity commercial sites have been affected by these DDoS attacks. A DDoS attack uses multiple machines operating in concert to attack a network or site. There is very little that can be done if you are the target of a DDoS. The nature of these attacks causes so much extra network traffic that it is difficult for legitimate traffic to reach your site while blocking the forged attacking packets. The intent of this paper is to help sites not be involved in a DDoS attack. The first tools developed to perpetrate the DDoS attack were Trin00 and Tribe Flood Network (TFN). They spawned the next generation of tools, called Tribe Flood Network 2000 (TFN2K) and Stacheldraht (German for barbed wire). These DDoS attack tools are designed to bring one or more sites down by flooding the victim with large amounts of network traffic originating at multiple locations and remotely controlled by a single client. This paper discusses how these DDoS tools work, how to detect them, and specific technical information on each individual tool. It is written with the system administrator in mind. It assumes that the reader has basic knowledge of the TCP/IP protocol.
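
    On the detection side, the simplest first-pass heuristic is per-source rate counting over a time window; a toy defensive sketch follows. The threshold is invented, and as the paper notes, flood tools routinely forge source addresses, which is exactly why per-tool signatures matter beyond a heuristic like this.

    ```python
    from collections import Counter

    def flag_flood_sources(packets, window_pps=1000):
        """Toy detector: count packets per source over a 1-second window
        and flag sources exceeding a rate threshold. Spoofed source
        addresses defeat this, so treat it as a first-pass filter only."""
        counts = Counter(src for src, _ in packets)
        return [src for src, n in counts.items() if n > window_pps]

    # (source_ip, timestamp) pairs captured within one second
    packets = [("10.0.0.5", 0.001)] * 1500 + [("10.0.0.9", 0.002)] * 20
    print(flag_flood_sources(packets))   # -> ['10.0.0.5']
    ```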

  20. Computation, cryptography, and network security

    CERN Document Server

    Rassias, Michael

    2015-01-01

    Analysis, assessment, and data management are core competencies for operations research analysts. This volume addresses a number of issues and developed methods for improving those skills. It is an outgrowth of a conference held in April 2013 at the Hellenic Military Academy, and brings together a broad variety of mathematical methods and theories with several applications. It discusses directions and pursuits of scientists that pertain to engineering sciences. It also presents the theoretical background required for algorithms and techniques applied to a large variety of concrete problems. A number of open questions as well as new future areas are also highlighted.   This book will appeal to operations research analysts, engineers, community decision makers, academics, the military community, practitioners sharing the current “state-of-the-art,” and analysts from coalition partners. Topics covered include Operations Research, Games and Control Theory, Computational Number Theory and Information Securi...

  1. AASERT: Software Tools for Experimentation in Computational Geometry

    National Research Council Canada - National Science Library

    Dobkin, David

    2001-01-01

    This research has considered problems in computer graphics and visualization. The work has aimed to bring theoretical tools to practical problems as well as to develop tools with which to aid in the building of geometric software...

  2. Student Motivation in Computer Networking Courses

    Directory of Open Access Journals (Sweden)

    Wen-Jung Hsin

    2007-01-01

    Full Text Available This paper introduces several hands-on projects that have been used to motivate students in learning various computer networking concepts. These projects are shown to be very useful and applicable to the learners’ daily tasks and activities such as emailing, Web browsing, and online shopping and banking, and lead to an unexpected byproduct, self-motivation.
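
    In the spirit of such hands-on projects, a classic first exercise is issuing a raw HTTP request over a TCP socket, which demystifies what "Web browsing" does under the hood; a minimal sketch follows (example.com is a placeholder host).

    ```python
    import socket

    # Hands-on illustration of Web browsing at the protocol level:
    # a raw HTTP/1.1 request over a plain TCP socket.
    with socket.create_connection(("example.com", 80), timeout=5) as s:
        s.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
        response = b""
        while chunk := s.recv(4096):   # read until the server closes
            response += chunk
    print(response.split(b"\r\n")[0].decode())   # e.g., HTTP/1.1 200 OK
    ```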

  3. Computational Modeling of Complex Protein Activity Networks

    NARCIS (Netherlands)

    Schivo, Stefano; Leijten, Jeroen; Karperien, Marcel; Post, Janine N.; Prignet, Claude

    2017-01-01

    Because of the numerous entities interacting, the complexity of the networks that regulate cell fate makes it impossible to analyze and understand them using the human brain alone. Computational modeling is a powerful method to unravel complex systems. We recently described the development of a

  4. Student Motivation in Computer Networking Courses

    Directory of Open Access Journals (Sweden)

    Wen-Jung Hsin, PhD

    2007-08-01

    Full Text Available This paper introduces several hands-on projects that have been used to motivate students in learning various computer networking concepts. These projects are shown to be very useful and applicable to the learners’ daily tasks and activities such as emailing, Web browsing, and online shopping and banking, and lead to an unexpected byproduct, self-motivation.

  5. Non-harmful insertion of data mimicking computer network attacks

    Energy Technology Data Exchange (ETDEWEB)

    Neil, Joshua Charles; Kent, Alexander; Hash, Jr, Curtis Lee

    2016-06-21

    Non-harmful data mimicking computer network attacks may be inserted in a computer network. Anomalous real network connections may be generated between a plurality of computing systems in the network. Data mimicking an attack may also be generated. The generated data may be transmitted between the plurality of computing systems using the real network connections and measured to determine whether an attack is detected.

  6. [Renewal of NIHS computer network system].

    Science.gov (United States)

    Segawa, Katsunori; Nakano, Tatsuya; Saito, Yoshiro

    2012-01-01

    An updated version of the National Institute of Health Sciences Computer Network System (NIHS-NET) is described. In order to reduce its electric power consumption, the main server system was newly built using virtual machine technology. The services that each machine provided in the previous network system had to be maintained as much as possible, so an individual server was constructed for each service, because a virtual server often shows lower performance than a physical server. As a result, although the number of virtual servers increased and network communication among the servers became more complicated, the conventional services were maintained, the security level was somewhat improved, and electric power was saved. The updated NIHS-NET bears multiple security countermeasures. To make maximal use of these measures, awareness of network security by all users is expected.

  7. Computer-Supported Modelling of Multi modal Transportation Networks Rationalization

    Directory of Open Access Journals (Sweden)

    Ratko Zelenika

    2007-09-01

    Full Text Available This paper deals with issues of shaping and functioning of computer programs in the modelling and solving of multimodal transportation network problems. A methodology for the integrated use of a programming language for mathematical modelling is defined, as well as spreadsheets for the solving of complex multimodal transportation network problems. The paper contains a comparison of the partial and integral methods of solving multimodal transportation networks. The basic hypothesis set forth in this paper is that the integral method results in better multimodal transportation network rationalization effects, whereas a multimodal transportation network model based on the integral method, once built, can be used as the basis for all kinds of transportation problems within multimodal transport. As opposed to linear transport problems, a multimodal transport network can assume very complex shapes. This paper contains a comparison of the partial and integral approach to transportation network solving. In the partial approach, a straightforward model of a transportation network, which can be solved through the use of the Solver computer tool within the Excel spreadsheet interface, is quite sufficient. In the solving of a multimodal transportation problem through the integral method, it is necessary to apply sophisticated mathematical modelling programming languages which support the use of complex matrix functions and the processing of a vast number of variables and limitations. The LINGO programming language is more abstract than the Excel spreadsheet, and it requires certain programming knowledge. The definition and presentation of a problem logic within Excel, in a manner which is acceptable to computer software, is an ideal basis for modelling in the LINGO programming language, as well as a faster and more effective implementation of the mathematical model. This paper provides proof for the fact that it is more rational to solve the problem of multimodal transportation networks by
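
    The spreadsheet-based partial approach described here can equally be expressed in a few lines of code. A minimal sketch of the classical transportation problem (costs, supplies, and demands are invented; scipy's linprog stands in for Excel Solver or LINGO):

        # Classical transportation problem: minimize total shipping cost
        # subject to supply and demand constraints. All data are invented.
        import numpy as np
        from scipy.optimize import linprog

        cost = np.array([[4.0, 6.0, 9.0],   # unit cost, origin i -> destination j
                         [5.0, 3.0, 7.0]])
        supply = [30.0, 40.0]               # capacity at each origin
        demand = [20.0, 25.0, 25.0]         # requirement at each destination

        m, n = cost.shape
        c = cost.ravel()                    # decision variables x[i, j], flattened

        A_eq, b_eq = [], []
        for i in range(m):                  # supply rows: sum_j x[i, j] == supply[i]
            row = np.zeros(m * n)
            row[i * n:(i + 1) * n] = 1.0
            A_eq.append(row); b_eq.append(supply[i])
        for j in range(n):                  # demand cols: sum_i x[i, j] == demand[j]
            col = np.zeros(m * n)
            col[j::n] = 1.0
            A_eq.append(col); b_eq.append(demand[j])

        res = linprog(c, A_eq=np.array(A_eq), b_eq=b_eq, bounds=(0, None))
        print(res.x.reshape(m, n))          # optimal shipment plan
        print("total cost:", res.fun)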

  8. Integrating Computational Science Tools into a Thermodynamics Course

    Science.gov (United States)

    Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew

    2018-01-01

    Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of their disciplines, some universities have started to integrate these tools within core courses. This paper evaluates the effect of introducing three computational modules within a thermodynamics course on student disciplinary learning and self-beliefs about computation. The results suggest that using worked examples paired with computer simulations to implement these modules has a positive effect on (1) student disciplinary learning, (2) student perceived ability to do scientific computing, and (3) student perceived ability to do computer programming. These effects were identified regardless of the students' prior experiences with computer programming.

  9. Integration of computer tools in lessons about motion and dynamics

    OpenAIRE

    Klinec, Dimitrij

    2016-01-01

    This diploma thesis examines computer tools used in physics classes and their role in teaching natural science. Owing to the possibilities enabled by multiple representations, computer programmes with interactive simulations, programmes with mechanical equipment for computer-based experiments, and video analysis have come to the fore in natural science classes, becoming an indispensable part of the natural science classroom. In the thesis, computer simulations and computer-based e...

  10. GraphCrunch 2: Software tool for network modeling, alignment and clustering.

    Science.gov (United States)

    Kuchaiev, Oleksii; Stevanović, Aleksandar; Hayes, Wayne; Pržulj, Nataša

    2011-01-19

    Recent advancements in experimental biotechnology have produced large amounts of protein-protein interaction (PPI) data. The topology of PPI networks is believed to have a strong link to their function. Hence, the abundance of PPI data for many organisms stimulates the development of computational techniques for the modeling, comparison, alignment, and clustering of networks. In addition, finding representative models for PPI networks will improve our understanding of the cell just as a model of gravity has helped us understand planetary motion. To decide if a model is representative, we need quantitative comparisons of model networks to real ones. However, exact network comparison is computationally intractable and therefore several heuristics have been used instead. Some of these heuristics are easily computable "network properties," such as the degree distribution, or the clustering coefficient. An important special case of network comparison is the network alignment problem. Analogous to sequence alignment, this problem asks to find the "best" mapping between regions in two networks. It is expected that network alignment might have as strong an impact on our understanding of biology as sequence alignment has had. Topology-based clustering of nodes in PPI networks is another example of an important network analysis problem that can uncover relationships between interaction patterns and phenotype. We introduce the GraphCrunch 2 software tool, which addresses these problems. It is a significant extension of GraphCrunch which implements the most popular random network models and compares them with the data networks with respect to many network properties. Also, GraphCrunch 2 implements the GRAph ALigner algorithm ("GRAAL") for purely topological network alignment. GRAAL can align any pair of networks and exposes large, dense, contiguous regions of topological and functional similarities far larger than any other existing tool. Finally, GraphCrunch 2 implements an
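
    Properties like the degree distribution and clustering coefficient mentioned here are straightforward to compute with general-purpose libraries. A minimal sketch using networkx (a stand-in for GraphCrunch 2 itself; the graphs are toy examples, and the model comparison is reduced to a single random graph):

        # Compare a data network with a random model on two of the
        # "network properties" named above. networkx stands in for
        # GraphCrunch 2; the graphs are toy examples.
        import networkx as nx

        data_net = nx.karate_club_graph()  # stand-in for a PPI network
        model = nx.gnm_random_graph(data_net.number_of_nodes(),
                                    data_net.number_of_edges(), seed=1)

        for name, g in [("data", data_net), ("model", model)]:
            print(name,
                  "| degree histogram:", nx.degree_histogram(g),
                  "| avg clustering:", round(nx.average_clustering(g), 3))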

  11. GraphCrunch 2: Software tool for network modeling, alignment and clustering

    Directory of Open Access Journals (Sweden)

    Hayes Wayne

    2011-01-01

    Full Text Available Abstract Background Recent advancements in experimental biotechnology have produced large amounts of protein-protein interaction (PPI data. The topology of PPI networks is believed to have a strong link to their function. Hence, the abundance of PPI data for many organisms stimulates the development of computational techniques for the modeling, comparison, alignment, and clustering of networks. In addition, finding representative models for PPI networks will improve our understanding of the cell just as a model of gravity has helped us understand planetary motion. To decide if a model is representative, we need quantitative comparisons of model networks to real ones. However, exact network comparison is computationally intractable and therefore several heuristics have been used instead. Some of these heuristics are easily computable "network properties," such as the degree distribution, or the clustering coefficient. An important special case of network comparison is the network alignment problem. Analogous to sequence alignment, this problem asks to find the "best" mapping between regions in two networks. It is expected that network alignment might have as strong an impact on our understanding of biology as sequence alignment has had. Topology-based clustering of nodes in PPI networks is another example of an important network analysis problem that can uncover relationships between interaction patterns and phenotype. Results We introduce the GraphCrunch 2 software tool, which addresses these problems. It is a significant extension of GraphCrunch which implements the most popular random network models and compares them with the data networks with respect to many network properties. Also, GraphCrunch 2 implements the GRAph ALigner algorithm ("GRAAL") for purely topological network alignment. GRAAL can align any pair of networks and exposes large, dense, contiguous regions of topological and functional similarities far larger than any other

  12. Fuzzy logic, neural networks, and soft computing

    Science.gov (United States)

    Zadeh, Lotfi A.

    1994-01-01

    The past few years have witnessed a rapid growth of interest in a cluster of modes of modeling and computation which may be described collectively as soft computing. The distinguishing characteristic of soft computing is that its primary aims are to achieve tractability, robustness, low cost, and high MIQ (machine intelligence quotient) through an exploitation of the tolerance for imprecision and uncertainty. Thus, in soft computing what is usually sought is an approximate solution to a precisely formulated problem or, more typically, an approximate solution to an imprecisely formulated problem. A simple case in point is the problem of parking a car. Generally, humans can park a car rather easily because the final position of the car is not specified exactly. If it were specified to within, say, a few millimeters and a fraction of a degree, it would take hours or days of maneuvering and precise measurements of distance and angular position to solve the problem. What this simple example points to is the fact that, in general, high precision carries a high cost. The challenge, then, is to exploit the tolerance for imprecision by devising methods of computation which lead to an acceptable solution at low cost. By its nature, soft computing is much closer to human reasoning than the traditional modes of computation. At this juncture, the major components of soft computing are fuzzy logic (FL), neural network theory (NN), and probabilistic reasoning techniques (PR), including genetic algorithms, chaos theory, and part of learning theory. Increasingly, these techniques are used in combination to achieve significant improvement in performance and adaptability. Among the important application areas for soft computing are control systems, expert systems, data compression techniques, image processing, and decision support systems. It may be argued that it is soft computing, rather than the traditional hard computing, that should be viewed as the foundation for artificial

  13. Spiking network simulation code for petascale computers

    Science.gov (United States)

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M.; Plesser, Hans E.; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today. PMID:25346682

  14. Spiking network simulation code for petascale computers.

    Science.gov (United States)

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M; Plesser, Hans E; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today.
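
    The storage principle stated above, that a synapse consumes memory only on the compute node harboring its target neuron, can be pictured with a toy sketch (an illustrative Python fragment with invented numbers; the paper's actual data structure is a C++ metaprogramming construction):

        # Toy illustration of the distributed storage scheme: each rank
        # keeps only the synapses whose target neuron is local to it
        # (round-robin neuron placement; all numbers are invented).
        synapses = [(0, 5, 0.1), (2, 7, 0.4), (3, 1, 0.2),
                    (5, 2, 0.3)]  # (source, target, weight)
        NUM_RANKS = 4

        def local_synapses(rank):
            return [s for s in synapses if s[1] % NUM_RANKS == rank]

        for rank in range(NUM_RANKS):
            print("rank", rank, "stores", local_synapses(rank))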

  15. International Symposium on Complex Computing-Networks

    CERN Document Server

    Sevgi, L; CCN2005; Complex computing networks: Brain-like and wave-oriented electrodynamic algorithms

    2006-01-01

    This book uniquely combines new advances in electromagnetic theory and circuits-and-systems theory, integrating both fields with regard to computational aspects of common interest. Emphasized subjects are those methods which mimic brain-like and electrodynamic behaviour; among these are cellular neural networks, chaos and chaotic dynamics, attractor-based computation and stream ciphers. The book contains carefully selected contributions from the Symposium CCN2005. Pictures from the bestowal of Honorary Doctorate degrees to Leon O. Chua and Leopold B. Felsen are included.

  16. Fast computation of minimum hybridization networks.

    Science.gov (United States)

    Albrecht, Benjamin; Scornavacca, Celine; Cenci, Alberto; Huson, Daniel H

    2012-01-15

    Hybridization events in evolution may lead to incongruent gene trees. One approach to determining possible interspecific hybridization events is to compute a hybridization network that attempts to reconcile incongruent gene trees using a minimum number of hybridization events. We describe how to compute a representative set of minimum hybridization networks for two given bifurcating input trees, using a parallel algorithm and provide a user-friendly implementation. A simulation study suggests that our program performs significantly better than existing software on biologically relevant data. Finally, we demonstrate the application of such methods in the context of the evolution of the Aegilops/Triticum genera. The algorithm is implemented in the program Dendroscope 3, which is freely available from www.dendroscope.org and runs on all three major operating systems.

  17. Integrating Wireless Sensor Networks with Computational Grids

    Science.gov (United States)

    Preve, Nikolaos

    Wireless sensor networks (WSNs) have been greatly developed and have shown their significance in a wide range of important applications, such as the acquisition and processing of information from the physical world. The evolution of Grid computing has been based on the coordination of distributed and shared resources. A Sensor Grid network can integrate these two leading technologies, enabling real-time sensor data collection and the sharing of computational and storage grid resources for sensor data processing and management. Several issues have arisen from this integration which complicate the modern design of sensor grids. In order to address these issues, in this paper we propose a sensor grid architecture, supported by a testbed, which focuses on the design issues and on the improvement of our sensor grid architecture design.

  18. Optimizing the ASC WAN: evaluating network performance tools for comparing transport protocols.

    Energy Technology Data Exchange (ETDEWEB)

    Lydick, Christopher L.

    2007-07-01

    The Advanced Simulation & Computing Wide Area Network (ASC WAN), which is a high delay-bandwidth network connection between US Department of Energy National Laboratories, is constantly being examined and evaluated for efficiency. One of the transport-layer protocols currently in use, TCP, was developed for traffic demands which are different from those on the ASC WAN. The Stream Control Transmission Protocol (SCTP), on the other hand, has shown characteristics which make it more appealing to networks such as these. Most importantly, before considering a replacement for TCP on any network, a testing tool that performs well against certain criteria needs to be found. To find such a tool, two popular networking tools (Netperf v.2.4.3 & v.2.4.6 (OpenSS7 STREAMS), and Iperf v.2.0.6) were tested. These tools implement both TCP and SCTP and were evaluated using four metrics: (1) How effectively can the tool reach a throughput near the bandwidth? (2) How much of the CPU does the tool utilize during operation? (3) Is the tool freely and widely available? And (4) Is the tool actively developed? Following the analysis of those tools, this paper explains some recommendations and ideas for future work.
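
    The first metric, how closely a tool's throughput approaches the link bandwidth, comes down to timing a bulk transfer. A bare-bones sketch of what such tools do internally (a loopback demo with invented sizes; real tools such as Iperf add warm-up periods, socket tuning, and CPU accounting):

        # Minimal TCP bulk-transfer timing over loopback. Illustrative only;
        # real measurement tools add warm-up, buffer tuning, CPU accounting.
        import socket, threading, time

        PAYLOAD = b"x" * 65536
        TOTAL_BYTES = 64 * 1024 * 1024  # 64 MiB, arbitrary

        def sink(server_sock):
            conn, _ = server_sock.accept()
            while conn.recv(65536):     # drain until sender closes
                pass
            conn.close()

        server = socket.create_server(("127.0.0.1", 0))
        port = server.getsockname()[1]
        threading.Thread(target=sink, args=(server,), daemon=True).start()

        client = socket.create_connection(("127.0.0.1", port))
        start = time.monotonic()
        sent = 0
        while sent < TOTAL_BYTES:
            client.sendall(PAYLOAD)
            sent += len(PAYLOAD)
        client.close()
        elapsed = time.monotonic() - start
        print("throughput: %.1f Mbit/s" % (sent * 8 / elapsed / 1e6))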

  19. Computer network defense through radial wave functions

    Science.gov (United States)

    Malloy, Ian J.

    The purpose of this research is to synthesize basic and fundamental findings in quantum computing, as applied to the attack and defense of conventional computer networks. The concept focuses on uses of radio waves as a shield for, and an attack against, traditional computers. A logic bomb is analogous to a landmine in a computer network, and implementing non-trivial mitigation for it will aid computer network defense. As has been seen in kinetic warfare, the use of landmines has been devastating to geopolitical regions in that they are severely difficult for a civilian to avoid triggering, given the unknown position of a landmine. Thus, understanding a logic bomb is relevant and has corollaries to quantum mechanics as well. The research synthesizes quantum logic phase shifts in certain respects using the Dynamic Data Exchange protocol in software written for this work, as well as a C-NOT gate applied to a virtual quantum circuit environment by implementing a Quantum Fourier Transform. The research applies the principles of coherence and entanglement from quantum physics, the concept of expert systems in artificial intelligence, principles of prime-number-based cryptography with trapdoor functions, and modeling of radio wave propagation against an event with unknown parameters. This takes the form of a program relying on the artificial intelligence concept of an expert system in conjunction with trigger events for a trapdoor function relying on infinite recursion, as well as system mechanics for elliptic curve cryptography along orbital angular momenta. Here trapdoor denotes both the form of cipher and the implied relationship to logic bombs.

  20. The research of computer network security and protection strategy

    Science.gov (United States)

    He, Jian

    2017-05-01

    With the widespread popularity of computer network applications, network security has also received a high degree of attention. The factors affecting network security are complex, so ensuring network security is systematic work that poses a high challenge. Addressing the safety and reliability problems of computer network systems, this paper, drawing on practical work experience, offers suggestions and measures covering threats to network security, security technology, and system design principles, so that users of computer networks can enhance their security awareness and master certain network security techniques.

  1. Trends in tools; Computers in Libraries

    NARCIS (Netherlands)

    G.J.M. Bierens

    2007-01-01

    In April 2007 the conference 'Computers in Libraries' took place in the US. Gerard Bierens and Liesbeth Mantel identify the most striking trends. 'The library catalogue will inevitably go under the knife.'

  2. Using satellite communications for a mobile computer network

    Science.gov (United States)

    Wyman, Douglas J.

    1993-01-01

    The topics discussed include the following: patrol car automation, mobile computer network, network requirements, network design overview, MCN mobile network software, MCN hub operation, mobile satellite software, hub satellite software, the benefits of patrol car automation, the benefits of satellite mobile computing, and national law enforcement satellite.

  3. Analysis of Computer Network Information Based on "Big Data"

    Science.gov (United States)

    Li, Tianli

    2017-11-01

    With the development of the current era, computer networks and big data have gradually become part of people's lives. People use computers to bring convenience to their own lives, but at the same time there are many network information problems that demand attention. This paper analyzes the information security of computer networks based on "big data" and puts forward some solutions.

  4. Design and implementation of a local computer network

    Energy Technology Data Exchange (ETDEWEB)

    Fortune, P. J.; Lidinsky, W. P.; Zelle, B. R.

    1977-01-01

    An intralaboratory computer communications network was designed and is being implemented at Argonne National Laboratory. Parameters which were considered to be important in the network design are discussed; and the network, including its hardware and software components, is described. A discussion of the relationship between computer networks and distributed processing systems is also presented. The problems which the network is designed to solve and the consequent network structure represent considerations which are of general interest. 5 figures.

  5. MCF: a tool to find multi-scale community profiles in biological networks.

    Science.gov (United States)

    Gao, Shang; Chen, Alan; Rahmani, Ali; Jarada, Tamer; Alhajj, Reda; Demetrick, Doug; Zeng, Jia

    2013-12-01

    Recent developments in complex graph clustering methods have enabled practical applications to biological networks in different settings. Multi-scale Community Finder (MCF) is a tool to profile network communities (i.e., clusters of nodes) with control over community sizes. The controlling parameter is referred to as the scale of the network community profile. MCF is able to find communities in all major types of networks, including directed, signed, bipartite, and multi-slice networks. Its fast computation makes the tool practical for large-scale analysis (e.g., protein-protein interaction and gene co-expression networks). MCF is distributed as an open-source C++ package for academic use with both command line and user interface options, and can be downloaded at http://bsdxd.cpsc.ucalgary.ca/MCF. A detailed user manual and sample data sets are also available at the project website. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
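
    Community profiling of this kind can be approximated with general-purpose libraries. A small sketch using networkx's modularity-based clustering (not MCF's algorithm, and without its scale parameter; the bipartite example graph is arbitrary):

        # Community detection on a small bipartite graph, an illustrative
        # stand-in for MCF's community profiling (not its algorithm).
        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        g = nx.davis_southern_women_graph()  # classic bipartite network
        for i, community in enumerate(greedy_modularity_communities(g)):
            print("community", i, "size", len(community))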

  6. Computational Design Tools for Integrated Design

    DEFF Research Database (Denmark)

    Holst, Malene Kirstine; Kirkegaard, Poul Henning

    2010-01-01

    In an architectural conceptual sketching process, where an architect is working with the initial ideas for a design, the process is characterized by three phases: sketching, evaluation and modification. Basically the architect needs to address three areas in the conceptual sketching phase......: aesthetical, functional and technical requirements. The aim of the present paper is to address the problem of a vague or not existing link between digital conceptual design tools used by architects and designers and engineering analysis and simulation tools. Based on an analysis of the architectural design...... process different digital design methods are related to tasks in an integrated design process....

  7. Computational Tools for Stem Cell Biology.

    Science.gov (United States)

    Bian, Qin; Cahan, Patrick

    2016-12-01

    For over half a century, the field of developmental biology has leveraged computation to explore mechanisms of developmental processes. More recently, computational approaches have been critical in the translation of high throughput data into knowledge of both developmental and stem cell biology. In the past several years, a new subdiscipline of computational stem cell biology has emerged that synthesizes the modeling of systems-level aspects of stem cells with high-throughput molecular data. In this review, we provide an overview of this new field and pay particular attention to the impact that single cell transcriptomics is expected to have on our understanding of development and our ability to engineer cell fate. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Tools for Performance Evaluation of Computer Systems: Historical Evolution and Perspectives

    OpenAIRE

    Casale, Giuliano; Gribaudo, Marco; Serazzi, Giuseppe

    2010-01-01

    Part 1: Milestones and Evolutions; International audience; The development of software tools for performance evaluation and modeling has been an active research area since the early years of computer science. In this paper, we offer a short overview of historical evolution of the field with an emphasis on popular performance modeling techniques such as queuing networks and Petri nets. A review of recent works that provide new perspectives to software tools for performance modeling is presente...

  9. GKIN: a tool for drawing genetic networks

    Directory of Open Access Journals (Sweden)

    Jonathan Arnold

    2012-03-01

    Full Text Available We present GKIN, a simulator and a comprehensive graphical interface where one can draw the model specification of reactions between hypothesized molecular participants in a gene regulatory and biochemical reaction network (or genetic network for short). The solver is written in C++ in a nearly platform-independent manner to simulate large ensembles of models, which can run on PCs, Macintoshes, and UNIX machines, and its graphical user interface is written in Java, which can run as a standalone or WebStart application. The drawing capability for rendering a network significantly enhances the ease of use over other reaction network simulators, such as KINSOLVER (Aleman-Meza et al., 2009), and enforces a correct semantic specification of the network. In a usability study with novice users, drawing the network with GKIN was preferred and faster in comparison with entry through a dialog-box guided interface in COPASI (Hoops et al., 2006), with no difference in error rates between GKIN and COPASI in specifying the network. GKIN is freely available at http://faculty.cs.wit.edu/~ldeligia/PROJECTS/GKIN/.

  10. Computational fluid dynamics: science and tool

    NARCIS (Netherlands)

    B. Koren (Barry)

    2006-01-01

    The year 2003 marked the 100th anniversary of both the birth of John von Neumann and the first manned flight with a power plane. In the current paper, from a Dutch perspective, attention is paid to the great importance of both events for computational fluid dynamics in general and

  11. Applied and computational harmonic analysis on graphs and networks

    Science.gov (United States)

    Irion, Jeff; Saito, Naoki

    2015-09-01

    In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.
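
    The Laplacian eigenpairs discussed here take one line to compute once the graph is in matrix form. A short sketch (using the combinatorial Laplacian L = D - A on an unweighted path graph, the case in which, as the article notes, the Fourier interpretation is exact):

        # Eigenpairs of the combinatorial graph Laplacian L = D - A.
        # For an unweighted path graph these recover cosine-like basis
        # vectors, matching the Fourier interpretation in the text.
        import numpy as np
        import networkx as nx

        g = nx.path_graph(6)
        A = nx.to_numpy_array(g)
        L = np.diag(A.sum(axis=1)) - A
        eigenvalues, eigenvectors = np.linalg.eigh(L)  # ascending order
        print(np.round(eigenvalues, 3))         # the "frequencies"
        print(np.round(eigenvectors[:, 1], 3))  # first nontrivial basis vector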

  12. A Computational Investigation of Cohesion and Lexical Network Density in L2 Writing

    Science.gov (United States)

    Green, Clarence

    2012-01-01

    This study used a new computational linguistics tool, the Coh-Metrix, to investigate and measure the differences in cohesion and lexical network density between native speaker and non-native speaker writing, as well as to investigate L2 proficiency level differences in cohesion and lexical network density. This study analyzed data from three…

  13. Workshop on Software Development Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  14. Computational capabilities of graph neural networks.

    Science.gov (United States)

    Scarselli, Franco; Gori, Marco; Tsoi, Ah Chung; Hagenbuchner, Markus; Monfardini, Gabriele

    2009-01-01

    In this paper, we will consider the approximation properties of a recently introduced neural network model called the graph neural network (GNN), which can be used to process structured data inputs, e.g., acyclic graphs, cyclic graphs, and directed or undirected graphs. This class of neural networks implements a function τ(G, n) ∈ R^m that maps a graph G and one of its nodes n onto an m-dimensional Euclidean space. We characterize the functions that can be approximated by GNNs, in probability, up to any prescribed degree of precision. This set contains the maps that satisfy a property called preservation of the unfolding equivalence, and includes most of the practically useful functions on graphs; the only known exception is when the input graph contains particular patterns of symmetries for which unfolding equivalence may not be preserved. The result can be considered an extension of the universal approximation property established for classic feedforward neural networks (FNNs). Some experimental examples are used to show the computational capabilities of the proposed model.
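
    A function of the form τ(G, n) ∈ R^m is typically realized by repeated neighborhood aggregation. A toy numpy sketch of one such aggregation step (a generic message-passing layer, not the specific GNN model of the paper; all weights and features are random placeholders):

        # One round of neighborhood aggregation, the basic operation behind
        # graph neural networks. Weights and features are random placeholders.
        import numpy as np

        rng = np.random.default_rng(0)
        A = np.array([[0, 1, 0],            # adjacency of a 3-node graph
                      [1, 0, 1],
                      [0, 1, 0]], dtype=float)
        X = rng.normal(size=(3, 4))         # node features, 4 dimensions
        W = rng.normal(size=(4, 2))         # learned projection (placeholder)

        A_hat = A + np.eye(3)               # add self-loops
        deg_inv = np.diag(1.0 / A_hat.sum(axis=1))
        H = np.tanh(deg_inv @ A_hat @ X @ W)  # new 2-dim state per node
        print(H)                            # row n plays the role of tau(G, n)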

  15. Perancangan Network Monitoring Tools Menggunakan Autonomous Agent Java

    Directory of Open Access Journals (Sweden)

    Khurniawan Eko S

    2016-08-01

    Full Text Available Among the network management tasks performed by a network administrator is gathering information on the available network resources. SNMP (Simple Network Management Protocol) technology gives the network administrator the flexibility to manage the whole network from a single location. The Java-agent-based Network Monitoring Tools application consists of a Master agent, which manages the Request agents and database access, and Request agents, which monitor servers and implement the SNMP4j library in a multi-agent system. On the interface side, the Network Monitoring Tools application uses the web as the administrator interface, so it can be used from anywhere and at any time. The results of this study show that the application works as a network monitoring tool with error percentages in the range of 0-18%. In addition, the application produces more stable and faster server data readings than the Cacti application. This is supported by the ability of the Request agents to respond to the workload level of the monitored server.
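
    The SNMP polling the Request agents perform (via SNMP4j in the paper) looks much the same in any language. As an analogous sketch, a single SNMP GET with the Python pysnmp library (the host address and community string are placeholders, and API details may vary by pysnmp version):

        # Single SNMP GET of sysUpTime (OID 1.3.6.1.2.1.1.3.0), the kind
        # of poll a monitoring agent issues. Host/community are placeholders.
        from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                                  ContextData, ObjectType, ObjectIdentity, getCmd)

        error_indication, error_status, _, var_binds = next(getCmd(
            SnmpEngine(),
            CommunityData('public', mpModel=1),       # SNMPv2c
            UdpTransportTarget(('192.0.2.10', 161)),  # placeholder host
            ContextData(),
            ObjectType(ObjectIdentity('1.3.6.1.2.1.1.3.0'))))

        if error_indication or error_status:
            print("poll failed:", error_indication or error_status)
        else:
            for name, value in var_binds:
                print(name.prettyPrint(), "=", value.prettyPrint())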

  16. A Computational Tool for Helicopter Rotor Noise Prediction Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This SBIR project proposes to develop a computational tool for helicopter rotor noise prediction based on hybrid Cartesian grid/gridless approach. The uniqueness of...

  17. Computer Aided Design Tools for Extreme Environment Electronics Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project aims to provide Computer Aided Design (CAD) tools for radiation-tolerant, wide-temperature-range digital, analog, mixed-signal, and radio-frequency...

  18. Computational Tool for Aerothermal Environment Around Transatmospheric Vehicles Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this Project is to develop a high-fidelity computational tool for accurate prediction of aerothermal environment on transatmospheric vehicles. This...

  19. Models as Tools of Analysis of a Network Organisation

    Directory of Open Access Journals (Sweden)

    Wojciech Pająk

    2013-06-01

    Full Text Available The paper presents models which may be applied as tools of analysis of a network organisation. The starting point of the discussion is defining the following terms: supply chain and network organisation. Further parts of the paper present basic assumptions analysis of a network organisation. Then the study characterises the best known models utilised in analysis of a network organisation. The purpose of the article is to define the notion and the essence of network organizations and to present the models used for their analysis.

  20. Computational Tools and Studies of Graphene Nanostructures

    DEFF Research Database (Denmark)

    Papior, Nick Rübner

    The nano-electronics industry has during the past decade decreased feature sizes to roughly 10 nm. Such feature sizes are at the quantum limit, requiring a description at the quantum mechanical level. Parallel to the experimental work reside the theoretical tools used to investigate and understand...... require revised algorithms. Furthermore, the advent of 2D materials may prove prominent in future nanoelectronics for electronic and heat-transport devices. Such materials include the Nobel Prize-winning material graphene, which has unique properties. The main focus of the work presented in this thesis

  1. Mapping, Awareness, And Virtualization Network Administrator Training Tool Virtualization Module

    Science.gov (United States)

    2016-03-01

    Mapping, Awareness, and Virtualization Network Administrator Training Tool: Virtualization Module. Master's thesis by Erik W. Berndt, Naval Postgraduate School, March 2016; thesis advisor: John Gibson.

  2. COMPUTATIONAL TOOLS FOR RATIONAL PROTEIN ENGINEERING OF ALDOLASES

    OpenAIRE

    Widmann, Michael; Pleiss, Jürgen; Samland, Anne K.

    2012-01-01

    In this mini-review we describe the different strategies for rational protein engineering and summarize the computational tools available. Computational tools can either be used to design focused libraries, to predict sequence-function relationships or for structure-based molecular modelling. This also includes de novo design of enzymes. Examples for protein engineering of aldolases and transaldolases are given in the second part of the mini-review.

  3. Computational tools for rational protein engineering of aldolases

    Directory of Open Access Journals (Sweden)

    Michael Widmann

    2012-09-01

    Full Text Available In this mini-review we describe the different strategies for rational protein engineering and summarize the computational tools available. Computational tools can either be used to design focused libraries, to predict sequence-function relationships or for structure-based molecular modelling. This also includes de novo design of enzymes. Examples for protein engineering of aldolases and transaldolases are given in the second part of the mini-review.

  4. Social sciences via network analysis and computation

    CERN Document Server

    Kanduc, Tadej

    2015-01-01

    In recent years information and communication technologies have gained significant importance in the social sciences. Because there is such rapid growth of knowledge, methods and computer infrastructure, research can now seamlessly connect interdisciplinary fields such as business process management, data processing and mathematics. This study presents some of the latest results, practices and state-of-the-art approaches in network analysis, machine learning, data mining, data clustering and classification in the context of the social sciences. It also covers various real-life examples such as t

  5. Computer network security and cyber ethics

    CERN Document Server

    Kizza, Joseph Migga

    2014-01-01

    In its 4th edition, this book remains focused on increasing public awareness of the nature and motives of cyber vandalism and cybercriminals, the weaknesses inherent in cyberspace infrastructure, and the means available to protect ourselves and our society. This new edition aims to integrate security education and awareness with discussions of morality and ethics. The reader will gain an understanding of how the security of information in general and of computer networks in particular, on which our national critical infrastructure and, indeed, our lives depend, is based squarely on the individ

  6. Some queuing network models of computer systems

    Science.gov (United States)

    Herndon, E. S.

    1980-01-01

    Queuing network models of a computer system operating with a single workload type are presented. Program algorithms are adapted for use on the Texas Instruments SR-52 programmable calculator. By slightly altering the algorithm to process the G and H matrices row by row instead of column by column, six devices and an unlimited job/terminal population could be handled on the SR-52. Techniques are also introduced for handling a simple load dependent server and for studying interactive systems with fixed multiprogramming limits.
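
    Although the paper's algorithm handles networks of several devices via the G and H matrices, the single-queue case underlying such models has simple closed forms. A sketch of the standard M/M/1 formulas (textbook material, not the paper's algorithm):

        # Standard M/M/1 queue metrics: utilization rho = lambda/mu, mean
        # number in system N = rho/(1-rho), and mean response time from
        # Little's law, T = N/lambda. Requires rho < 1 for stability.
        def mm1_metrics(arrival_rate, service_rate):
            rho = arrival_rate / service_rate
            assert rho < 1, "queue is unstable"
            n = rho / (1 - rho)
            t = n / arrival_rate
            return rho, n, t

        print(mm1_metrics(8.0, 10.0))  # rho=0.8, N=4 jobs, T=0.5 time units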

  7. RADYBAN: A tool for reliability analysis of dynamic fault trees through conversion into dynamic Bayesian networks

    Energy Technology Data Exchange (ETDEWEB)

    Montani, S. [Dipartimento di Informatica, Universita del Piemonte Orientale, Via Bellini 25g, 15100 Alessandria (Italy)], E-mail: stefania@mfn.unipmn.it; Portinale, L. [Dipartimento di Informatica, Universita del Piemonte Orientale, Via Bellini 25g, 15100 Alessandria (Italy)], E-mail: portinal@mfn.unipmn.it; Bobbio, A. [Dipartimento di Informatica, Universita del Piemonte Orientale, Via Bellini 25g, 15100 Alessandria (Italy)], E-mail: bobbio@mfn.unipmn.it; Codetta-Raiteri, D. [Dipartimento di Informatica, Universita del Piemonte Orientale, Via Bellini 25g, 15100 Alessandria (Italy)], E-mail: raiteri@mfn.unipmn.it

    2008-07-15

    In this paper, we present RADYBAN (Reliability Analysis with DYnamic BAyesian Networks), a software tool which makes it possible to analyze a dynamic fault tree by converting it into a dynamic Bayesian network. The tool implements a modular algorithm for automatically translating a dynamic fault tree into the corresponding dynamic Bayesian network and exploits classical algorithms for inference on dynamic Bayesian networks in order to compute reliability measures. After describing the basic features of the tool, we show how it operates on a real-world example and we compare the unreliability results it generates with those returned by other methodologies, in order to verify the correctness and consistency of the results obtained.
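
    Underlying any fault-tree analysis are the static AND/OR gate probability computations, which the dynamic Bayesian network generalizes with time-dependent gates. A minimal sketch of the static case (independent components assumed; this is not RADYBAN's conversion algorithm, and the probabilities are invented):

        # Static fault-tree gates under component independence: an AND gate
        # fails when all inputs fail, an OR gate when any input fails.
        from math import prod

        def p_and(ps):
            return prod(ps)

        def p_or(ps):
            return 1 - prod(1 - p for p in ps)

        # Example: system fails if (A and B) fail, or C fails.
        p_a, p_b, p_c = 0.01, 0.02, 0.005
        print(p_or([p_and([p_a, p_b]), p_c]))  # top-event unreliability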

  8. xMWAS: a data-driven integration and differential network analysis tool.

    Science.gov (United States)

    Uppal, Karan; Ma, Chunyu; Go, Young-Mi; Jones, Dean P; Wren, Jonathan

    2018-02-15

    Integrative omics is a central component of most systems biology studies. Computational methods are required for extracting meaningful relationships across different omics layers. Various tools have been developed to facilitate integration of paired heterogeneous omics data; however, most existing tools allow integration of only two omics datasets. Furthermore, existing data integration tools do not incorporate the additional steps of identifying sub-networks or communities of highly connected entities and evaluating the topology of the integrative network under different conditions. Here we present xMWAS, a software tool for data integration, network visualization, clustering, and differential network analysis of data from biochemical and phenotypic assays and two or more omics platforms. https://kuppal.shinyapps.io/xmwas (Online) and https://github.com/kuppal2/xMWAS/ (R). kuppal2@emory.edu. Supplementary data are available at Bioinformatics online.

  9. Applications of computational tools in biosciences and medical engineering

    CERN Document Server

    Altenbach, Holm

    2015-01-01

     This book presents the latest developments and applications of computational tools related to the biosciences and medical engineering. It also reports the findings of different multi-disciplinary research projects, for example, from the areas of scaffolds and synthetic bones, implants and medical devices, and medical materials. It is also shown that the application of computational tools often requires mathematical and experimental methods. Computational tools such as the finite element methods, computer-aided design and optimization as well as visualization techniques such as computed axial tomography open up completely new research fields that combine the fields of engineering and bio/medical. Nevertheless, there are still hurdles since both directions are based on quite different ways of education. Often even the “language” can vary from discipline to discipline.

  10. WEB BASED LEARNING OF COMPUTER NETWORK COURSE

    Directory of Open Access Journals (Sweden)

    Hakan KAPTAN

    2004-04-01

    Full Text Available As a result of developments in the Internet and computer fields, web-based education has become one of the areas in which many development and research studies are carried out. In this study, web-based education materials are described for a multimedia animation- and simulation-aided Computer Networks course in Technical Education Faculties. The course content is formed from university course books, web-based education materials, and the technology web pages of companies. The content combines texts, pictures, and figures to increase student motivation and ease learning, and some topics are supported by animations. Furthermore, to illustrate the working principles of routing algorithms and congestion control algorithms, simulators are constructed for interactive learning.

  11. Bayesian Network Webserver: a comprehensive tool for biological network modeling.

    Science.gov (United States)

    Ziebarth, Jesse D; Bhattacharya, Anindya; Cui, Yan

    2013-11-01

    The Bayesian Network Webserver (BNW) is a platform for comprehensive network modeling of systems genetics and other biological datasets. It allows users to quickly and seamlessly upload a dataset, learn the structure of the network model that best explains the data and use the model to understand relationships between network variables. Many datasets, including those used to create genetic network models, contain both discrete (e.g. genotype) and continuous (e.g. gene expression traits) variables, and BNW allows for modeling hybrid datasets. Users of BNW can incorporate prior knowledge during structure learning through an easy-to-use structural constraint interface. After structure learning, users are immediately presented with an interactive network model, which can be used to make testable hypotheses about network relationships. BNW, including a downloadable structure learning package, is available at http://compbio.uthsc.edu/BNW. (The BNW interface for adding structural constraints uses HTML5 features that are not supported by current version of Internet Explorer. We suggest using other browsers (e.g. Google Chrome or Mozilla Firefox) when accessing BNW). ycui2@uthsc.edu. Supplementary data are available at Bioinformatics online.

  12. Water Network Tool for Resilience (WNTR) User Manual -

    Science.gov (United States)

    The Water Network Tool for Resilience (WNTR) is a new Python package designed to simulate and analyze resilience of water distribution networks to a variety of disaster scenarios. WNTR can help water utilities to explore the capacity of their systems to handle disasters and gui...
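
    A hedged sketch of the package's basic workflow, following its documented load-simulate-inspect pattern (the input file name is a placeholder, and details may vary by WNTR version):

        # Basic WNTR workflow: load an EPANET input file, run a hydraulic
        # simulation, inspect node pressures. 'network.inp' is a placeholder.
        import wntr

        wn = wntr.network.WaterNetworkModel('network.inp')
        sim = wntr.sim.EpanetSimulator(wn)
        results = sim.run_sim()
        pressure = results.node['pressure']  # pandas DataFrame, time x node
        print(pressure.head())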

  13. Computational Tools for Accelerating Carbon Capture Process Development

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David; Sahinidis, N V; Cozad, A; Lee, A; Kim, H; Morinelly, J; Eslick, J; Yuan, Z

    2013-06-04

    This presentation reports the development of advanced computational tools to accelerate next-generation technology development. These tools are used to develop an optimized process using rigorous models. They include: Process Models; Simulation-Based Optimization; Optimized Process; Uncertainty Quantification; Algebraic Surrogate Models; and Superstructure Optimization (Determine Configuration).

  14. On Computational Fluid Dynamics Tools in Architectural Design

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Hougaard, Mads; Stærdahl, Jesper Winther

    In spite of being apparently easy to use, computational fluid dynamics (CFD) based tools require specialist knowledge for modeling as well as for the interpretation of results. This point of view also implies that users of CFD-based tools have to choose and use them carefully. Especially w...

  15. Caesy: A software tool for computer-aided engineering

    Science.gov (United States)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.

  16. A GIS Tool for simulating Nitrogen transport along schematic Network

    Science.gov (United States)

    Tavakoly, A. A.; Maidment, D. R.; Yang, Z.; Whiteaker, T.; David, C. H.; Johnson, S.

    2012-12-01

    An automated method called the Arc Hydro Schematic Processor has been developed for water process computation on schematic networks formed from the NHDPlus and similar GIS river networks. The schematic network represents the hydrologic features on the ground and is a network of links and nodes. SchemaNodes represent hydrologic features, such as catchments or stream junctions. SchemaLinks prescribe the connections between nodes. The Schematic Processor uses the schematic network to pass information through a watershed and move water or pollutants downstream. In addition, the Schematic Processor can apply additional programming to the passed and/or received values, manipulating data as it moves through the network. This paper describes how the Schematic Processor can be used to simulate nitrogen transport and transformation on river networks. For this purpose the nitrogen load is estimated on the NHDPlus river network using the Schematic Processor coupled with a river routing model for the Texas Gulf Coast Hydrologic Region.
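
    Passing values downstream through a link-node network in this way reduces to accumulation in topological order. A small sketch with networkx (invented loads on a toy network; this is not the actual tool or the NHDPlus data):

        # Accumulate nitrogen loads downstream over a directed link-node
        # network, visiting nodes in topological order. Loads are invented.
        import networkx as nx

        g = nx.DiGraph([("A", "C"), ("B", "C"), ("C", "D")])   # toy river network
        local_load = {"A": 1.0, "B": 2.0, "C": 0.5, "D": 0.0}  # kg/day per node

        total = {}
        for node in nx.topological_sort(g):
            total[node] = local_load[node] + sum(total[u]
                                                 for u in g.predecessors(node))
        print(total)  # D receives everything upstream: 3.5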

  17. Distinguishing humans from computers in the game of go: A complex network approach

    Science.gov (United States)

    Coquidé, C.; Georgeot, B.; Giraud, O.

    2017-08-01

    We compare complex networks built from the game of go and obtained from databases of human-played games with those obtained from computer-played games. Our investigations show that statistical features of the human-based networks and the computer-based networks differ, and that these differences can be statistically significant on a relatively small number of games using specific estimators. We show that the deterministic or stochastic nature of the computer algorithm playing the game can also be distinguished from these quantities. This can be seen as a tool to implement a Turing-like test for go simulators.

  18. The Use of Computer Tools to Support Meaningful Learning

    Science.gov (United States)

    Keengwe, Jared; Onchwari, Grace; Wachira, Patrick

    2008-01-01

    This article attempts to provide a review of literature pertaining to computer technology use in education. The authors discuss the benefits of learning with technology tools when integrated into teaching. The argument that introducing computer technology into schools will neither improve nor change the quality of classroom instruction unless…

  19. Advanced Computing Tools and Models for Accelerator Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  20. Social network diagnostics: a tool for monitoring group interventions.

    Science.gov (United States)

    Gesell, Sabina B; Barkin, Shari L; Valente, Thomas W

    2013-10-01

    Many behavioral interventions designed to improve health outcomes are delivered in group settings. To date, however, group interventions have not been evaluated to determine whether the groups generate interaction among members and how changes in group interaction may affect program outcomes at the individual or group level. This article presents a model and practical tool for monitoring how social ties and social structure change within the group during program implementation. The approach is based on social network analysis and has two phases: collecting network measurements at strategic intervention points to determine if group dynamics are evolving in ways anticipated by the intervention, and providing the results back to the group leader to guide the next steps of implementation. This process aims to initially increase network connectivity and ultimately accelerate the diffusion of desirable behaviors through the new network. This article presents the Social Network Diagnostic Tool and, as proof of concept, pilot data collected during the formative phase of a childhood obesity intervention. The number of reported advice partners and discussion partners increased during program implementation. Density, the number of ties among people in the network expressed as a percentage of all possible ties, increased from 0.082 to 0.182 (p < 0.05) in the discussion network. The observed two-fold increase in network density represents a significant shift in advice partners over the intervention period. Using the Social Network Diagnostic Tool to empirically guide program activities of an obesity intervention was feasible.
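
    The density measure reported here follows directly from the definition given in the text. A tiny sketch (the actor and tie counts are illustrative, not the study's raw data):

        # Network density: observed ties as a fraction of all possible ties.
        # For a directed network with n actors that is ties / (n * (n - 1)).
        def density(n_actors, n_ties, directed=True):
            possible = n_actors * (n_actors - 1)
            if not directed:
                possible //= 2
            return n_ties / possible

        # e.g. 69 directed ties among 20 people -> ~0.182, the same scale
        # as the post-intervention density reported above (invented counts).
        print(density(20, 69))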

  1. Choice Of Computer Networking Cables And Their Effect On Data ...

    African Journals Online (AJOL)

    Computer networking is the order of the day in this Information and Communication Technology (ICT) age. Although a network can be built with wireless devices, most local connections are made using cables. There are three main computer-networking cables, namely coaxial cable, unshielded twisted pair cable and the optic ...

  2. Computational Aspects of Sensor Network Protocols (Distributed Sensor Network Simulator

    Directory of Open Access Journals (Sweden)

    Vasanth Iyer

    2009-08-01

In this work, we model sensor networks as an unsupervised learning and clustering process. We classify nodes according to their static distribution to form known class densities (CCPD). These densities are chosen from specific cross-layer features which maximize the lifetime of power-aware routing algorithms. To circumvent the computational complexities of a power-aware communication stack, we introduce path-loss models at the nodes only for high-density deployments. We study the cluster heads and formulate the data handling capacity for an expected deployment, and use localized probability models to fuse the data with its side information before transmission. So each cluster head has a unique Pmax, but not all cluster heads have the same measured value. In a lossless mode, if there are no faults in the sensor network, then we can show that the highest probability given by Pmax is ambiguous if its frequency is ≤ n/2; otherwise it can be determined by a local function. We further show that event detection at the cluster heads can be modelled with a pattern 2m and m, the number of bits, can be a correlated pattern of 2 bits, and for a tight lower bound we use 3-bit Huffman codes which have entropy < 1. These local algorithms are further studied to optimize power and fault detection and to maximize the distributed routing algorithm used at the higher layers. From these bounds, in a large network it is observed that the power dissipation is network-size invariant. The performance of the routing algorithms is solely based on the success of finding healthy nodes in a large distribution. It is also observed that if the network size is kept constant and the density of the nodes is increased, the local path-loss model affects the performance of the routing algorithms. We also obtain the maximum intensity of transmitting nodes for a given category of routing algorithms under an outage constraint, i.e., the lifetime of the sensor network.

  3. On Distributed Computation in Noisy Random Planar Networks

    OpenAIRE

    Kanoria, Y.; Manjunath, D.

    2007-01-01

    We consider distributed computation of functions of distributed data in random planar networks with noisy wireless links. We present a new algorithm for computation of the maximum value which is order optimal in the number of transmissions and computation time.We also adapt the histogram computation algorithm of Ying et al to make the histogram computation time optimal.

  4. Mobile Computing and Ubiquitous Networking: Concepts, Technologies and Challenges.

    Science.gov (United States)

    Pierre, Samuel

    2001-01-01

    Analyzes concepts, technologies and challenges related to mobile computing and networking. Defines basic concepts of cellular systems. Describes the evolution of wireless technologies that constitute the foundations of mobile computing and ubiquitous networking. Presents characterization and issues of mobile computing. Analyzes economical and…

  5. Chemical Reaction Networks for Computing Polynomials.

    Science.gov (United States)

    Salehi, Sayed Ahmad; Parhi, Keshab K; Riedel, Marc D

    2017-01-20

Chemical reaction networks (CRNs) provide a fundamental model in the study of molecular systems. Widely used as a formalism for the analysis of chemical and biochemical systems, CRNs have received renewed attention as a model for molecular computation. This paper demonstrates that, with a new encoding, CRNs can compute any set of polynomial functions subject only to the limitation that these functions must map the unit interval to itself. These polynomials can be expressed as linear combinations of Bernstein basis polynomials with positive coefficients less than or equal to 1. In the proposed encoding approach, each variable is represented using two molecular types: a type-0 and a type-1. The value is the ratio of the concentration of type-1 molecules to the sum of the concentrations of type-0 and type-1 molecules. The proposed encoding naturally exploits the expansion of a power-form polynomial into a Bernstein polynomial. Molecular encoders for converting any input in a standard representation to the fractional representation as well as decoders for converting the computed output from the fractional to a standard representation are presented. The method is illustrated first for generic CRNs; then chemical reactions designed for an example are mapped to DNA strand-displacement reactions.
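    The mathematical core of the encoding can be sketched outside chemistry. Below is a hedged Python sketch: a power-form polynomial on [0,1] is converted to Bernstein coefficients (which must lie in [0,1] for a valid CRN), and the fractional value x = c1/(c0 + c1) is evaluated in both bases. The example polynomial is an assumption, and the CRN itself is not simulated:

    ```python
    from math import comb

    def power_to_bernstein(a):
        """a[k] are power-basis coefficients; returns Bernstein coefficients."""
        n = len(a) - 1
        return [sum(comb(j, k) / comb(n, k) * a[k] for k in range(j + 1))
                for j in range(n + 1)]

    def bernstein_eval(b, x):
        n = len(b) - 1
        return sum(b[j] * comb(n, j) * x**j * (1 - x)**(n - j)
                   for j in range(n + 1))

    a = [0.25, 0.0, 0.5]       # p(x) = 0.25 + 0.5 x^2, maps [0,1] to [0,1]
    b = power_to_bernstein(a)  # CRN-realizable if all b[j] lie in [0,1]
    x = 0.6                    # encoded chemically as c1 / (c0 + c1)
    print(b, bernstein_eval(b, x), 0.25 + 0.5 * x**2)   # both give 0.43
    ```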

  6. Discerning molecular interactions: A comprehensive review on biomolecular interaction databases and network analysis tools.

    Science.gov (United States)

    Miryala, Sravan Kumar; Anbarasu, Anand; Ramaiah, Sudha

    2017-11-09

Computational analysis of biomolecular interaction networks is now gaining importance in understanding the functions of novel genes/proteins. Gene interaction (GI) network analysis and protein-protein interaction (PPI) network analysis play a major role in predicting the functionality of interacting genes or proteins and give insight into the functional relationships and evolutionary conservation of interactions among the genes. An interaction network is a graphical representation of the gene/protein interactome, where each gene/protein is a node and each interaction between genes/proteins is an edge. In this review, we discuss the popular open-source databases that serve as data repositories to search and collect protein/gene interaction data, as well as the tools available for the generation of interaction networks, visualization, and network analysis. Various network analysis approaches, such as topological and clustering approaches to study network properties, and functional enrichment servers, which illustrate the functions and pathways of the genes and proteins, are also discussed. Hence the distinctive attribute of this review is not only to provide an overview of tools and web servers for gene and protein-protein interaction (PPI) network analysis but also to show how to extract useful and meaningful information from the interaction networks.
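    A minimal sketch of the node/edge representation the review describes, using networkx with a few illustrative (not authoritative) protein interactions:

    ```python
    import networkx as nx

    # Proteins as nodes, interactions as edges; names are for illustration only
    ppi = nx.Graph()
    ppi.add_edges_from([("P53", "MDM2"), ("P53", "BRCA1"),
                        ("BRCA1", "RAD51"), ("MDM2", "UBE3A")])

    # Topological properties of the kind reported by the surveyed tools
    print(nx.degree_centrality(ppi))                 # hub detection
    print(nx.clustering(ppi))                        # local clustering coefficient
    print(nx.shortest_path(ppi, "RAD51", "UBE3A"))   # path between proteins
    ```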

  7. Computer tools for systems engineering at LaRC

    Science.gov (United States)

    Walters, J. Milam

    1994-01-01

The Systems Engineering Office (SEO) has been established to provide life cycle systems engineering support to Langley Research Center projects. Over the last two years, the computing market has been reviewed for tools which could enhance the effectiveness and efficiency of activities directed towards this mission. A group of interrelated applications has been procured, or is under development, including a requirements management tool, a system design and simulation tool, and a project and engineering database. This paper will review the current configuration of these tools and provide information on future milestones and directions.

  8. Navy Network Dependability: Models, Metrics, and Tools

    Science.gov (United States)

    2010-01-01

Abbreviations recoverable from the report's figures: TCDL = Tactical Common Data Link; UHF = ultra high frequency; UFO = ultra-high-frequency follow-on; WGS = Wideband Gapfiller Satellite.

  9. Criminal Network Investigation: Processes, Tools, and Techniques

    DEFF Research Database (Denmark)

    Petersen, Rasmus Rosenqvist

…intelligence products that can be disseminated to their customers. Investigators deal with an increasing amount of information from a variety of sources, especially the Internet, all of which is important to their analysis and decision-making process. But information abundance is far from the only or most… a target-centric process model (acquisition, synthesis, sense-making, dissemination, cooperation) encouraging and supporting an iterative and incremental evolution of the criminal network across all five investigation processes. The first priority of the process model is to address the problems of linear…

  10. A Mobile Network Planning Tool Based on Data Analytics

    Directory of Open Access Journals (Sweden)

    Jessica Moysen

    2017-01-01

Planning future mobile networks entails multiple challenges due to the high complexity of the network to be managed. Beyond-4G and 5G networks are expected to be characterized by a high densification of nodes and heterogeneity of layers, applications, and Radio Access Technologies (RAT). In this context, a network planning tool capable of dealing with this complexity is highly convenient. The objective is to exploit the information produced by and already available in the network to properly deploy, configure, and optimise network nodes. This work presents such a smart network planning tool that exploits Machine Learning (ML) techniques. The proposed approach is able to predict the Quality of Service (QoS) experienced by the users based on the measurement history of the network. We select Physical Resource Blocks (PRB) per Megabit (Mb) as our main QoS indicator to optimise, since minimizing this metric allows offering the same service to users while consuming fewer resources, and is thus more cost-effective. Two case studies are considered in order to evaluate the performance of the proposed scheme, one to smartly plan the small cell deployment in a dense indoor scenario and a second one to face a detected fault in a macrocell network in a timely manner.
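    As a hedged sketch of the idea, assuming synthetic measurements and generic feature names rather than the authors' dataset, a regression model can be trained to predict PRB per Mb from past network measurements:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.uniform(size=(500, 3))   # stand-ins for e.g. RSRP, SINR, cell load
    # Synthetic QoS target: PRB per Mb rises with load, falls with SINR
    prb_per_mb = 2.0 + 3.0 * X[:, 2] - 1.5 * X[:, 1] + rng.normal(0, 0.1, 500)

    X_tr, X_te, y_tr, y_te = train_test_split(X, prb_per_mb, random_state=0)
    model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    print("R^2 on held-out measurements:", model.score(X_te, y_te))
    ```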

  11. Collaboration tools for the global accelerator network: Workshop Report

    Energy Technology Data Exchange (ETDEWEB)

    Agarwal, Deborah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Olson, Gary [Univ. of Michigan, Ann Arbor, MI (United States); Olson, Judy [Univ. of Michigan, Ann Arbor, MI (United States)

    2002-09-15

The concept of a "Global Accelerator Network" (GAN) has been put forward as a means for inter-regional collaboration in the operation of internationally constructed and operated frontier accelerator facilities. A workshop was held to allow representatives of the accelerator community and of the collaboratory development community to meet and discuss collaboration tools for the GAN environment. This workshop, called the Collaboration Tools for the Global Accelerator Network (GAN) Workshop, was held on August 26, 2002 at Lawrence Berkeley National Laboratory. The goal was to provide input about collaboration tools in general and to provide a strawman for the GAN collaborative tools environment. The participants at the workshop represented accelerator physicists, high-energy physicists, operations, technology tool developers, and social scientists who study scientific collaboration.

  12. Planning and management of cloud computing networks

    Science.gov (United States)

    Larumbe, Federico

The evolution of the Internet has a great impact on a big part of the population. People use it to communicate, query information, receive news, work, and as entertainment. Its extraordinary usefulness as a communication medium made the number of applications and technological resources explode. However, that network expansion comes at the cost of significant power consumption. If the power consumption of telecommunication networks and data centers were considered as that of a country, it would rank 5th in the world. Furthermore, the number of servers in the world is expected to grow by a factor of 10 between 2013 and 2020. This context motivates us to study techniques and methods to allocate cloud computing resources in an optimal way with respect to cost, quality of service (QoS), power consumption, and environmental impact. The results we obtained from our test cases show that besides minimizing capital expenditures (CAPEX) and operational expenditures (OPEX), the response time can be reduced up to 6 times, power consumption by 30%, and CO2 emissions by a factor of 60. Cloud computing provides dynamic access to IT resources as a service. In this paradigm, programs are executed on servers connected to the Internet that users access from their computers and mobile devices. The first advantage of this architecture is to reduce the time of application deployment and interoperability, because a new user only needs a web browser and does not need to install software on local computers with specific operating systems. Second, applications and information are available from everywhere and with any device with Internet access. Also, servers and IT resources can be dynamically allocated depending on the number of users and the workload, a feature called elasticity. This thesis studies the resource management of cloud computing networks and is divided in three main stages. We start by analyzing the planning of cloud computing networks to get a…

  13. Rahnuma: hypergraph-based tool for metabolic pathway prediction and network comparison.

    Science.gov (United States)

    Mithani, Aziz; Preston, Gail M; Hein, Jotun

    2009-07-15

We present a tool called Rahnuma for prediction and analysis of metabolic pathways and comparison of metabolic networks. Rahnuma represents metabolic networks as hypergraphs and computes all possible pathways between two or more metabolites. It provides an intuitive way to answer biological questions focusing on differences between organisms or the evolution of different species by allowing pathway-based metabolic network comparisons at an organism as well as at a phylogenetic level. Rahnuma is available online at http://portal.stats.ox.ac.uk:8080/rahnuma/.
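    A toy sketch of pathway computation between metabolites, with a hypothetical three-reaction set. Rahnuma's actual hypergraph algorithm enumerates all pathways and handles multi-substrate reactions; this breadth-first search returns just one path:

    ```python
    from collections import deque

    reactions = {                       # reaction id: (substrates, products)
        "r1": ({"glucose"}, {"g6p"}),
        "r2": ({"g6p"}, {"f6p"}),
        "r3": ({"f6p"}, {"pyruvate"}),
    }

    def find_pathway(source, target):
        """Return one reaction sequence from source to target, or None."""
        queue, seen = deque([(source, [])]), {source}
        while queue:
            met, path = queue.popleft()
            if met == target:
                return path
            for rid, (subs, prods) in reactions.items():
                if met in subs:              # simplification: one substrate suffices
                    for p in prods - seen:
                        seen.add(p)
                        queue.append((p, path + [rid]))
        return None

    print(find_pathway("glucose", "pyruvate"))   # ['r1', 'r2', 'r3']
    ```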

  14. 2013 International Conference on Computer Engineering and Network

    CERN Document Server

    Zhu, Tingshao

    2014-01-01

    This book aims to examine innovation in the fields of computer engineering and networking. The book covers important emerging topics in computer engineering and networking, and it will help researchers and engineers improve their knowledge of state-of-art in related areas. The book presents papers from The Proceedings of the 2013 International Conference on Computer Engineering and Network (CENet2013) which was held on July 20-21, in Shanghai, China.

  15. AUTOMATIC CONTROL OF INTELLECTUAL RIGHTS IN THE GLOBAL COMPUTER NETWORKS

    OpenAIRE

    Anatoly P. Yakimaho; Victoriya V. Bessarabova

    2013-01-01

The problems of using intellectual property in global computer networks are stated. The main attention is focused on ways of solving the problems that arise when working in computer networks. Legal problems of the information society are considered. An analysis of global computer networks as venues for the organization of collective management of copyright on a world scale is carried out. Issues of creating a system of automatic control of the property rights of authors and ...

  16. High Performance Networks From Supercomputing to Cloud Computing

    CERN Document Server

    Abts, Dennis

    2011-01-01

    Datacenter networks provide the communication substrate for large parallel computer systems that form the ecosystem for high performance computing (HPC) systems and modern Internet applications. The design of new datacenter networks is motivated by an array of applications ranging from communication intensive climatology, complex material simulations and molecular dynamics to such Internet applications as Web search, language translation, collaborative Internet applications, streaming video and voice-over-IP. For both Supercomputing and Cloud Computing the network enables distributed applicati

  17. Study of Tools for Network Discovery and Network Mapping

    Science.gov (United States)

    2003-11-01

OptiView Console supports central and distributed architectures. OptiView Console consists of the Viewer and the Service Manager… The Service Manager is the engine that performs network discovery, data management, and data analysis, and provides notification services… The Service Manager gives you status information and configuration control of the services that are part of the OptiView Console application…

  18. Computational tools for in silico fragment-based drug design.

    Science.gov (United States)

    Mortier, Jeremie; Rakers, Christin; Frederick, Raphael; Wolber, Gerhard

    2012-01-01

Fragment-based strategies in drug design involve the initial discovery of low-molecular-mass molecules. Owing to their small size, fragments are molecular tools to probe specific sub-pockets within a protein active site. Once their interaction within the enzyme cavity is clearly understood and experimentally validated, they represent a unique opportunity to design potent and efficient larger compounds. Computer-aided methods can essentially support the identification of suitable fragments. In this review, available tools for computational drug design are discussed in the frame of fragment-based approaches. We analyze and review (i) available commercial fragment libraries with respect to their properties and size, (ii) computational methods for the construction of such a library, (iii) the different strategies and software packages for the selection of fragments with predicted affinity to a given target, and (iv) tools for the in silico linkage of fragments into an actual high-affinity lead structure candidate.

  19. Network Computer Technology. Phase I: Viability and Promise within NASA's Desktop Computing Environment

    Science.gov (United States)

    Paluzzi, Peter; Miller, Rosalind; Kurihara, West; Eskey, Megan

    1998-01-01

    Over the past several months, major industry vendors have made a business case for the network computer as a win-win solution toward lowering total cost of ownership. This report provides results from Phase I of the Ames Research Center network computer evaluation project. It identifies factors to be considered for determining cost of ownership; further, it examines where, when, and how network computer technology might fit in NASA's desktop computing architecture.

  20. An overview of existing modeling tools making use of model checking in the analysis of biochemical networks.

    Science.gov (United States)

    Carrillo, Miguel; Góngora, Pedro A; Rosenblueth, David A

    2012-01-01

    Model checking is a well-established technique for automatically verifying complex systems. Recently, model checkers have appeared in computer tools for the analysis of biochemical (and gene regulatory) networks. We survey several such tools to assess the potential of model checking in computational biology. Next, our overview focuses on direct applications of existing model checkers, as well as on algorithms for biochemical network analysis influenced by model checking, such as those using binary decision diagrams (BDDs) or Boolean-satisfiability solvers. We conclude with advantages and drawbacks of model checking for the analysis of biochemical networks.
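    A toy illustration of the model-checking idea on a two-gene Boolean network (entirely hypothetical): enumerate the reachable states and check a reachability property of the kind a model checker would verify:

    ```python
    def step(state):
        """Synchronous update: A is repressed by B, B is activated by A."""
        a, b = state
        return (not b, a)

    def reachable(init):
        """Exhaustively explore the (finite) Boolean state space."""
        seen, frontier = {init}, [init]
        while frontier:
            nxt = step(frontier.pop())
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
        return seen

    # CTL-style property "EF (A and B)": can both genes be on simultaneously?
    print(any(a and b for a, b in reachable((True, False))))   # True
    ```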

  1. Importance of simulation tools for the planning of optical network

    Science.gov (United States)

Martins, Indayara B.; Martins, Yara; Rudge, Felipe; Moschim, Edson

    2015-10-01

The main proposal of this work is to show the importance of using simulation tools to design optical networks. The simulation method supports the investigation of several system and network parameters, such as bit error rate and blocking probability, as well as physical layer issues, such as attenuation, dispersion, and nonlinearities, as these are all important to evaluate and validate the operability of optical networks. The work was divided into two parts: firstly, physical layer preplanning was proposed for the distribution of amplifiers, compensating for the attenuation and dispersion effects in span transmission; in this part, we also analyzed the quality of the transmitted signal. In the second part, an analysis of the transport layer was completed, proposing wavelength distribution planning according to the total utilization of each link. The main network parameters used to evaluate the transport and physical layer design were delay (latency), blocking probability, and bit error rate (BER). This work was carried out with commercially available simulation tools.
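    As a sketch of the physical-layer preplanning step, the span length between amplifiers follows from the power budget; all figures below are illustrative assumptions, not values from the paper:

    ```python
    # Back-of-envelope amplifier placement from a simple link budget
    ATTENUATION_DB_PER_KM = 0.2     # typical for single-mode fiber at 1550 nm
    LAUNCH_POWER_DBM = 3.0
    SENSITIVITY_DBM = -28.0
    AMP_GAIN_DB = 20.0

    def amplifier_positions(link_km):
        """Place one amplifier per full span; span limited by budget and gain."""
        span_km = (LAUNCH_POWER_DBM - SENSITIVITY_DBM) / ATTENUATION_DB_PER_KM
        span_km = min(span_km, AMP_GAIN_DB / ATTENUATION_DB_PER_KM)
        return [round(k * span_km, 1) for k in range(1, int(link_km // span_km) + 1)]

    print(amplifier_positions(400))   # amplifier sites along a 400 km link
    ```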

  2. ATHENA: the analysis tool for heritable and environmental network associations.

    Science.gov (United States)

    Holzinger, Emily R; Dudek, Scott M; Frase, Alex T; Pendergrass, Sarah A; Ritchie, Marylyn D

    2014-03-01

Advancements in high-throughput technology have allowed researchers to examine the genetic etiology of complex human traits in a robust fashion. Although genome-wide association studies have identified many novel variants associated with hundreds of traits, a large proportion of the estimated trait heritability remains unexplained. One hypothesis is that the commonly used statistical techniques and study designs are not robust to the complex etiology that may underlie these human traits. This etiology could include non-linear gene × gene or gene × environment interactions. Additionally, other levels of biological regulation may play a large role in trait variability. To address the need for computational tools that can explore enormous datasets to detect complex susceptibility models, we have developed a software package called the Analysis Tool for Heritable and Environmental Network Associations (ATHENA). ATHENA combines various variable filtering methods with machine learning techniques to analyze high-throughput categorical (i.e. single nucleotide polymorphisms) and quantitative (i.e. gene expression levels) predictor variables to generate multivariable models that predict either a categorical (i.e. disease status) or a quantitative (i.e. cholesterol levels) outcome. The goal of this article is to demonstrate the utility of ATHENA using simulated and biological datasets that consist of both single nucleotide polymorphisms and gene expression variables to identify complex prediction models. Importantly, this method is flexible and can be expanded to include other types of high-throughput data (i.e. RNA-seq data and biomarker measurements). ATHENA is freely available for download. The software, user manual and tutorial can be downloaded from http://ritchielab.psu.edu/ritchielab/software.

  3. Email networks and the spread of computer viruses

    Science.gov (United States)

    Newman, M. E.; Forrest, Stephanie; Balthrop, Justin

    2002-09-01

    Many computer viruses spread via electronic mail, making use of computer users' email address books as a source for email addresses of new victims. These address books form a directed social network of connections between individuals over which the virus spreads. Here we investigate empirically the structure of this network using data drawn from a large computer installation, and discuss the implications of this structure for the understanding and prevention of computer virus epidemics.
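    A minimal simulation of the mechanism described, on a random directed graph standing in for the measured address-book network; the graph and the per-edge infection probability are assumptions:

    ```python
    import random
    import networkx as nx

    random.seed(1)
    g = nx.gnp_random_graph(200, 0.03, seed=1, directed=True)

    infected, frontier = {0}, {0}          # patient zero
    while frontier:
        new = set()
        for u in frontier:
            for v in g.successors(u):      # addresses in u's address book
                if v not in infected and random.random() < 0.5:
                    new.add(v)
        infected |= new
        frontier = new

    print(f"{len(infected)} of {g.number_of_nodes()} hosts infected")
    ```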

  4. Operatory-logistic analysis method of computational tools

    OpenAIRE

    Alejandra Behar, Patricia; Bortolozo Pivoto, Deise; Santos da Silveira, Fabiana

    2012-01-01

This contribution reports results obtained by the group investigating the Operative Analysis of Computational Environments. The group belongs to the Nucleus of Digital Technology applied in Education (NUTED) at the School of Education of the Federal University of Rio Grande do Sul. We use Piagetian theory, in particular the logical-operatory model, to construct a methodology for analysing computational tools. Therefore, we have to define the basic concepts in a fr...

  5. An Overview of Computer Network security and Research Technology

    OpenAIRE

    Rathore, Vandana

    2016-01-01

    The rapid development in the field of computer networks and systems brings both convenience and security threats for users. Security threats include network security and data security. Network security refers to the reliability, confidentiality, integrity and availability of the information in the system. The main objective of network security is to maintain the authenticity, integrity, confidentiality, availability of the network. This paper introduces the details of the technologies used in...

  6. Three-Phase Unbalanced Load Flow Tool for Distribution Networks

    DEFF Research Database (Denmark)

    Demirok, Erhan; Kjær, Søren Bækhøj; Sera, Dezso

    2012-01-01

This work develops a three-phase unbalanced load flow tool tailored for radial distribution networks, based on Matlab®. The tool can be used to assess steady-state voltage variations, thermal limits of grid components, and power losses in radial MV-LV networks with photovoltaic (PV) generators, where most of the systems are single phase. New ancillary services, such as static reactive power support by PV inverters, can also be merged with the load flow solution tool, and thus the impact of the various reactive power control strategies on steady-state grid operation can be simply investigated. The performance of the load flow solution tool, in the sense of resulting bus voltage magnitudes, is compared and validated with the IEEE 13-bus test feeder.
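    For the solution method such radial load flow tools typically rely on, here is a simplified single-phase backward/forward sweep on a tiny feeder (the actual tool is three-phase and unbalanced; impedances and loads below are made up):

    ```python
    import numpy as np

    parent = [None, 0, 1, 1]                  # bus 0 = slack (substation)
    z = np.array([0, 0.01 + 0.02j, 0.015 + 0.03j, 0.02 + 0.04j])  # line impedances (pu)
    s = np.array([0, 0.5 + 0.2j, 0.3 + 0.1j, 0.4 + 0.15j])        # bus loads (pu)

    v = np.ones(4, dtype=complex)             # flat start
    for _ in range(20):
        i = np.conj(s / v)                    # load currents at each bus
        for b in (3, 2, 1):                   # backward sweep: sum currents to root
            if parent[b]:                     # no accumulation into the slack bus
                i[parent[b]] += i[b]
        for b in (1, 2, 3):                   # forward sweep: update voltages
            v[b] = v[parent[b]] - z[b] * i[b]

    print(np.round(np.abs(v), 4))             # converged voltage magnitudes (pu)
    ```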

  7. Computer Tools for Construction, Modification and Analysis of Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt

    1987-01-01

The practical use of Petri nets is — just as any other description technique — very dependent on the existence of adequate computer tools, which may assist the user to cope with the many details of a large description. For Petri nets there is a need for tools supporting construction of nets, as well as modification and analysis. Graphical work stations provide the opportunity to work — not only with textual representations of Petri nets — but also directly with the graphical representations. This paper describes some of the different kinds of tools which are needed in the Petri net area...
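    A tiny sketch of the kind of net construction and analysis such tools support: places with token counts, transitions with input/output places, and a naive enabled/fire routine (the net is hypothetical):

    ```python
    marking = {"p1": 1, "p2": 1, "p3": 0}
    transitions = {"t1": ({"p1": 1, "p2": 1}, {"p3": 1})}   # (consume, produce)

    def enabled(t):
        consume, _ = transitions[t]
        return all(marking[p] >= n for p, n in consume.items())

    def fire(t):
        consume, produce = transitions[t]
        assert enabled(t), f"{t} is not enabled"
        for p, n in consume.items():
            marking[p] -= n
        for p, n in produce.items():
            marking[p] += n

    fire("t1")
    print(marking)    # {'p1': 0, 'p2': 0, 'p3': 1}
    ```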

  8. New Tools for Computational Geometry and Rejuvenation of Screw Theory

    Science.gov (United States)

    Hestenes, David

Conformal Geometric Algebra (CGA) provides ideal mathematical tools for construction, analysis, and integration of classical Euclidean, Inversive & Projective Geometries, with practical applications to computer science, engineering, and physics. This paper is a comprehensive introduction to a CGA tool kit. Synthetic statements in classical geometry translate directly to coordinate-free algebraic forms. Invariant and covariant methods are coordinated by conformal splits, which are readily related to the literature using methods of matrix algebra, biquaternions, and screw theory. Designs for a complete system of powerful tools for the mechanics of linked rigid bodies are presented.

  9. Neuronmaster: an integrated tool for applications in neural networks

    Science.gov (United States)

    Rivas-Echeverria, Francklin; Colina-Morles, Eliezer; Sole, Solazver; Perez-Mendez, Anna; Bravo-Bravo, Cesar; Bravo-Bravo, Victor

    2001-03-01

This work presents the design of an integral environment for the suitable development of neural network applications. The integrated environment contemplates the following features: a data-processing module which encompasses statistical data analysis techniques for variable selection and reduction, a variety of learning algorithms, a code generator for different computer languages to enable network implementation, a learning-session planning module, and database connectivity facilities via ODBC, RPC, and APIs.

  10. Computational intelligence synergies of fuzzy logic, neural networks and evolutionary computing

    CERN Document Server

    Siddique, Nazmul

    2013-01-01

    Computational Intelligence: Synergies of Fuzzy Logic, Neural Networks and Evolutionary Computing presents an introduction to some of the cutting edge technological paradigms under the umbrella of computational intelligence. Computational intelligence schemes are investigated with the development of a suitable framework for fuzzy logic, neural networks and evolutionary computing, neuro-fuzzy systems, evolutionary-fuzzy systems and evolutionary neural systems. Applications to linear and non-linear systems are discussed with examples. Key features: Covers all the aspect

  11. Social network sites: Indispensable or optional social tools?

    DEFF Research Database (Denmark)

    Shklovski, Irina

    2012-01-01

Much research has enumerated potential benefits of online social network sites. Given the pervasiveness of these sites and the numbers of people that use them daily, both research and media tend to make the assumption that social network sites have become indispensable to their users. Based on the analysis of qualitative data from users of social network sites in Russia and Kazakhstan, this paper considers under what conditions social network sites can become indispensable to their users and when these technologies remain on the periphery of life despite fulfilling useful functions. For some respondents, these sites had become indispensable tools as they were integrated into everyday routines of communicating with emotionally important and proximal contacts and were often used for coordination of offline activities. For others social network sites remained spaces where they occasionally visited...

  12. RMOD: a tool for regulatory motif detection in signaling network.

    Directory of Open Access Journals (Sweden)

    Jinki Kim

Regulatory motifs are patterns of activation and inhibition that appear repeatedly in various signaling networks and that show specific regulatory properties. However, the network structures of regulatory motifs are highly diverse and complex, rendering their identification difficult. Here, we present RMOD, a web-based system for the identification of regulatory motifs and their properties in signaling networks. RMOD finds various network structures of regulatory motifs by compressing the signaling network and detecting the compressed forms of regulatory motifs. To apply it to large-scale signaling networks, it adopts a new subgraph search algorithm using a novel data structure called path-tree, which is a tree structure composed of isomorphic graphs of query regulatory motifs. This algorithm was evaluated using various sizes of signaling networks generated from the integration of various human signaling pathways, and it showed that the speed and scalability of this algorithm outperform those of other algorithms. RMOD includes interactive analysis and auxiliary tools that make it possible to manipulate the whole process from building the signaling network and querying regulatory motifs to analyzing regulatory motifs with graphical illustration and summarized descriptions. As a result, RMOD provides an integrated view of the regulatory motifs and the mechanism underlying their regulatory motif activities within the signaling network. RMOD is freely accessible online at the following URL: http://pks.kaist.ac.kr/rmod.
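    A naive baseline for the task RMOD accelerates, using networkx subgraph isomorphism to locate a signed two-node feedback motif; the toy network is an assumption, and none of this reflects RMOD's path-tree algorithm:

    ```python
    import networkx as nx
    from networkx.algorithms import isomorphism

    # Hypothetical signed signaling network
    net = nx.DiGraph()
    net.add_edges_from([("A", "B", {"sign": "+"}), ("B", "A", {"sign": "-"}),
                        ("B", "C", {"sign": "+"})])

    # Query motif: negative feedback pair (x activates y, y inhibits x)
    motif = nx.DiGraph()
    motif.add_edges_from([("x", "y", {"sign": "+"}), ("y", "x", {"sign": "-"})])

    gm = isomorphism.DiGraphMatcher(
        net, motif, edge_match=lambda e1, e2: e1["sign"] == e2["sign"])
    print(list(gm.subgraph_isomorphisms_iter()))   # [{'A': 'x', 'B': 'y'}]
    ```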

  13. RMOD: a tool for regulatory motif detection in signaling network.

    Science.gov (United States)

    Kim, Jinki; Yi, Gwan-Su

    2013-01-01

Regulatory motifs are patterns of activation and inhibition that appear repeatedly in various signaling networks and that show specific regulatory properties. However, the network structures of regulatory motifs are highly diverse and complex, rendering their identification difficult. Here, we present RMOD, a web-based system for the identification of regulatory motifs and their properties in signaling networks. RMOD finds various network structures of regulatory motifs by compressing the signaling network and detecting the compressed forms of regulatory motifs. To apply it to large-scale signaling networks, it adopts a new subgraph search algorithm using a novel data structure called path-tree, which is a tree structure composed of isomorphic graphs of query regulatory motifs. This algorithm was evaluated using various sizes of signaling networks generated from the integration of various human signaling pathways, and it showed that the speed and scalability of this algorithm outperform those of other algorithms. RMOD includes interactive analysis and auxiliary tools that make it possible to manipulate the whole process from building the signaling network and querying regulatory motifs to analyzing regulatory motifs with graphical illustration and summarized descriptions. As a result, RMOD provides an integrated view of the regulatory motifs and the mechanism underlying their regulatory motif activities within the signaling network. RMOD is freely accessible online at the following URL: http://pks.kaist.ac.kr/rmod.

  14. Computer-based tools to support curriculum developers

    NARCIS (Netherlands)

    Nieveen, N.M.; Gustafson, Kent

    2000-01-01

Since the start of the early 1990s, an increasing number of people are interested in supporting the complex tasks of the curriculum development process with computer-based tools. 'Curriculum development' refers to an intentional process or activity directed at (re)designing, developing and

  15. Multimedia Instructional Tools and Student Learning in Computer Applications Courses

    Science.gov (United States)

    Chapman, Debra Laier

    2013-01-01

    Advances in technology and changes in educational strategies have resulted in the integration of technology into the classroom. Multimedia instructional tools (MMIT) have been identified as a way to provide student-centered active-learning instructional material to students. MMITs are common in introductory computer applications courses based on…

  16. Integrating Computer-Assisted Translation Tools into Language Learning

    Science.gov (United States)

    Fernández-Parra, María

    2016-01-01

    Although Computer-Assisted Translation (CAT) tools play an important role in the curriculum in many university translator training programmes, they are seldom used in the context of learning a language, as a good command of a language is needed before starting to translate. Since many institutions often have translator-training programmes as well…

  17. A computer aided tolerancing tool, II: tolerance analysis

    NARCIS (Netherlands)

    Salomons, O.W.; Haalboom, F.J.; Jonge poerink, H.J.; van Slooten, F.; van Slooten, F.; van Houten, Frederikus J.A.M.; Kals, H.J.J.

    1996-01-01

    A computer aided tolerance analysis tool is presented that assists the designer in evaluating worst case quality of assembly after tolerances have been specified. In tolerance analysis calculations, sets of equations are generated. The number of equations can be restricted by using a minimum number
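    The generated equations typically reduce to stack-up evaluations. A minimal sketch of the two standard ones, worst-case and root-sum-square, with invented sensitivities and tolerances:

    ```python
    import math

    sensitivities = [1.0, -1.0, 1.0]   # d(gap)/d(dimension_i) for a linear chain
    tolerances = [0.05, 0.02, 0.03]    # symmetric tolerances (mm)

    worst_case = sum(abs(s) * t for s, t in zip(sensitivities, tolerances))
    rss = math.sqrt(sum((s * t) ** 2 for s, t in zip(sensitivities, tolerances)))
    print(f"worst case ±{worst_case:.3f} mm, statistical (RSS) ±{rss:.3f} mm")
    ```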

  18. Cartoons beyond Clipart: A Computer Tool for Storyboarding and Storywriting

    Science.gov (United States)

    Madden, M.; Chung, P. W. H.; Dawson, C. W.

    2009-01-01

    This paper describes the motivation, proposal, and early prototype testing of a computer tool for story visualisation. An analysis of current software for making various types of visual story is made; this identifies a gap between software which emphasises preset banks of artwork, and software which emphasises low-level construction and/or…

  19. Computational tools for the synthetic design of biochemical pathways

    NARCIS (Netherlands)

    Medema, Marnix H.; van Raaphorst, Renske; Takano, Eriko; Breitling, Rainer

    As the field of synthetic biology is developing, the prospects for de novo design of biosynthetic pathways are becoming more and more realistic. Hence, there is an increasing need for computational tools that can support these efforts. A range of algorithms has been developed that can be used to

  20. Development of Desktop Computing Applications and Engineering Tools on GPUs

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik Brandenborg; Glimberg, Stefan Lemvig; Hansen, Toke Jansen

(GPUs) for high-performance computing applications and software tools in science and engineering, inverse problems, visualization, imaging, dynamic optimization. The goals are to contribute to the development of new state-of-the-art mathematical models and algorithms for maximum throughput performance...

  1. Evaluation of a computer-based nutrition education tool.

    Science.gov (United States)

    Kreisel, Katrin

    2004-04-01

To evaluate the efficacy and feasibility of using a computer-based teaching tool (http://www.coolfoodplanet.org) for nutrition and lifestyle education developed for primary school children. This was a 2-week school-based intervention in third and fourth grades. The study design was multi-factorial with repeated measures of nutrition knowledge, at three points in time, of dependent samples from control and intervention groups. Control schools (n=7) used 'traditional' nutrition education materials and intervention schools (n=8) additionally used the computer-based educational tool. Qualitative information was collected in focus group discussions with student teachers and pupils, and by observing the nutrition lessons. Pupils aged 8-11 years (n=271) from participating schools in Vienna, Austria. Nutrition knowledge increased significantly in both intervention and control schools, irrespective of the teaching tool used (P < 0.05); there was no difference in nutrition knowledge post intervention or at follow-up between the two study groups. In intervention schools, younger pupils (8-9 years) had better nutrition knowledge than older pupils (10-11 years) (P=0.011). This computer-based tool increases the possibilities of school-based nutrition education. If the tool's weaknesses identified during the formative evaluation are eliminated, it has the potential to make learning about nutrition more enjoyable, exciting and effective. This is of great importance considering that 'healthy' nutrition is not necessarily a topic that easily attracts pupils' attention and in view of the potential long-term health benefits of early and effective nutrition education.

  2. Predictive Behavior of a Computational Foot/Ankle Model through Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Ruchi D. Chande

    2017-01-01

Computational models are useful tools to study the biomechanics of human joints. Their predictive performance is heavily dependent on bony anatomy and soft tissue properties. Imaging data provides anatomical requirements, while approximate tissue properties are implemented from literature data, when available. We sought to improve the predictive capability of a computational foot/ankle model by optimizing its ligament stiffness inputs using feedforward and radial basis function neural networks. While the former demonstrated better performance than the latter as measured by mean square error, both networks provided reasonable stiffness predictions for implementation into the computational model.
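    A hedged sketch of the approach with synthetic stand-in data: a small feedforward network mapping kinematic inputs to ligament stiffness values of the kind that would be fed back into the finite-element model:

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    kinematics = rng.uniform(size=(200, 4))                   # e.g. joint angles
    # Synthetic target: stiffness as a function of the kinematic inputs
    stiffness = kinematics @ np.array([30., 12., 5., 20.]) + 40.0

    net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
    net.fit(kinematics, stiffness)
    print(net.predict(kinematics[:3]))   # candidate stiffness inputs (N/mm)
    ```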

  3. A Social Network Approach to Provisioning and Management of Cloud Computing Services for Enterprises

    DEFF Research Database (Denmark)

    Kuada, Eric; Olesen, Henning

    2011-01-01

This paper proposes a social network approach to the provisioning and management of cloud computing services, termed Opportunistic Cloud Computing Services (OCCS), for enterprises, and presents the research issues that need to be addressed for its implementation. We hypothesise that OCCS will facilitate the adoption process of cloud computing services by enterprises. OCCS deals with the concept of enterprises taking advantage of cloud computing services to meet their business needs without having to pay or paying a minimal fee for the services. The OCCS network will be modelled and implemented as a social network of enterprises collaborating strategically for the provisioning and consumption of cloud computing services without entering into any business agreements. We conclude that it is possible to configure current cloud service technologies and management tools for OCCS but there is a need...

  4. Throughput capacity computation model for hybrid wireless networks

    African Journals Online (AJOL)

wireless networks. We present in this paper a computational model for obtaining throughput capacity for hybrid wireless networks. For a hybrid network with n nodes and m base stations, we observe through simulation that the throughput capacity increases linearly with the base station infrastructure connected by the wired ...

  5. Novel Ethernet Based Optical Local Area Networks for Computer Interconnection

    NARCIS (Netherlands)

    Radovanovic, Igor; van Etten, Wim; Taniman, R.O.; Kleinkiskamp, Ronny

    2003-01-01

In this paper we present new optical local area networks for fiber-to-the-desk applications. The presented networks are expected to bring a solution for having optical fibers all the way to computers. To bring the overall implementation costs down we have based our networks on short-wavelength optical

  6. 4th International Conference on Computer Engineering and Networks

    CERN Document Server

    2015-01-01

This book aims to examine innovation in the fields of computer engineering and networking. The book covers important emerging topics in computer engineering and networking, and it will help researchers and engineers improve their knowledge of the state of the art in related areas. The book presents papers from the 4th International Conference on Computer Engineering and Networks (CENet2014) held July 19-20, 2014 in Shanghai, China. It covers emerging topics for computer engineering and networking, discusses how to improve productivity by using the latest advanced technologies, and examines innovation in the fields of computer engineering and networking.

  7. Computing tools for implementing standards for single-case designs.

    Science.gov (United States)

    Chen, Li-Ting; Peng, Chao-Ying Joanne; Chen, Ming-E

    2015-11-01

In the single-case design (SCD) literature, five sets of standards have been formulated and distinguished: design standards, assessment standards, analysis standards, reporting standards, and research synthesis standards. This article reviews computing tools that can assist researchers and practitioners in meeting the analysis standards recommended by the What Works Clearinghouse Procedures and Standards Handbook (the WWC standards). These tools consist of specialized web-based calculators or downloadable software for SCD data, and algorithms or programs written in Excel, SAS procedures, SPSS commands/macros, or the R programming language. We aligned these tools with the WWC standards and evaluated them for accuracy and treatment of missing data, using two published data sets. All tools were tested to be accurate. When missing data were present, most tools either gave an error message or conducted analysis based on the available data. Only one program used a single imputation method. This article concludes with suggestions for an inclusive computing tool or environment, additional research on the treatment of missing data, and reasonable and flexible interpretations of the WWC standards.
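    One analysis such tools implement is a nonoverlap effect size. Below is a sketch of percentage of non-overlapping data (PND), a common SCD metric, with invented phase data:

    ```python
    def pnd(baseline, treatment, improvement="increase"):
        """Share of treatment points exceeding the best baseline point (%)."""
        ref = max(baseline) if improvement == "increase" else min(baseline)
        better = [(x > ref) if improvement == "increase" else (x < ref)
                  for x in treatment]
        return 100 * sum(better) / len(treatment)

    print(pnd(baseline=[2, 3, 4, 3], treatment=[5, 6, 4, 7, 8]))  # 80.0
    ```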

  8. Constructing Precisely Computing Networks with Biophysical Spiking Neurons.

    Science.gov (United States)

    Schwemmer, Michael A; Fairhall, Adrienne L; Denéve, Sophie; Shea-Brown, Eric T

    2015-07-15

While spike timing has been shown to carry detailed stimulus information at the sensory periphery, its possible role in network computation is less clear. Most models of computation by neural networks are based on population firing rates. In equivalent spiking implementations, firing is assumed to be random such that averaging across populations of neurons recovers the rate-based approach. Recently, however, Denève and colleagues have suggested that the spiking behavior of neurons may be fundamental to how neuronal networks compute, with precise spike timing determined by each neuron's contribution to producing the desired output (Boerlin and Denève, 2011; Boerlin et al., 2013). By postulating that each neuron fires to reduce the error in the network's output, it was demonstrated that linear computations can be performed by networks of integrate-and-fire neurons that communicate through instantaneous synapses. This left open, however, the possibility that realistic networks, with conductance-based neurons with subthreshold nonlinearity and the slower timescales of biophysical synapses, may not fit into this framework. Here, we show how the spike-based approach can be extended to biophysically plausible networks. We then show that our network reproduces a number of key features of cortical networks including irregular and Poisson-like spike times and a tight balance between excitation and inhibition. Lastly, we discuss how the behavior of our model scales with network size or with the number of neurons "recorded" from a larger computing network. These results significantly increase the biological plausibility of the spike-based approach to network computation. We derive a network of neurons with standard spike-generating currents and synapses with realistic timescales that computes based upon the principle that the precise timing of each spike is important for the computation. We then show that our network reproduces a number of key features of cortical networks
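    A minimal sketch of the spike rule described above for a scalar signal: each neuron fires only when its spike reduces the readout error. Parameters are illustrative, and unlike the paper's networks this uses instantaneous, current-based dynamics:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, T, dt, tau = 20, 2000, 1e-3, 0.05
    w = rng.normal(0.0, 0.1, n)                # decoding weights (both signs)
    r = np.zeros(n)                            # leaky filtered spike trains
    x = np.sin(2 * np.pi * np.arange(T) * dt)  # 1 Hz signal over 2 s
    x_hat, readout = 0.0, []

    for t in range(T):
        V = w * (x[t] - x_hat)                 # voltage = projected coding error
        i = int(np.argmax(V - w**2 / 2))
        if V[i] > w[i]**2 / 2:                 # spike only if it reduces the error
            r[i] += 1.0
        r *= 1.0 - dt / tau                    # leak of the decoding trace
        x_hat = float(w @ r)
        readout.append(x_hat)

    print(round(float(np.corrcoef(x, readout)[0, 1]), 3))  # readout tracks x
    ```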

  9. Development and psychometric testing of the clinical networks engagement tool.

    Directory of Open Access Journals (Sweden)

    Jill M Norris

Clinical networks are being used widely to facilitate large system transformation in healthcare, by engagement of stakeholders throughout the health system. However, there are no available instruments that measure engagement in these networks. The study purpose was to develop and assess the measurement properties of a multiprofessional tool to measure engagement in clinical network initiatives. Based on components of the International Association for Public Participation Spectrum and expert panel review, we developed 40 items for testing. The draft instrument was distributed to 1,668 network stakeholders across different governance levels (leaders, members, support, frontline stakeholders) in 9 strategic clinical networks in Alberta (January to July 2014). With data from 424 completed surveys (25.4% response rate), descriptive statistics, exploratory and confirmatory factor analysis, Pearson correlations, linear regression, multivariate analysis, and Cronbach's alpha were used to assess the reliability and validity of the scores. Sixteen items were retained in the instrument. Exploratory factor analysis indicated a four-factor solution and accounted for 85.7% of the total variance in engagement with clinical network initiatives: global engagement, inform (provided with information), involve (worked together to address concerns), and empower (given final decision-making authority). All subscales demonstrated acceptable reliability (Cronbach's alpha 0.87 to 0.99). Both the confirmatory factor analysis and the regression analysis confirmed that inform, involve, and empower were all significant predictors of global engagement, with involve as the strongest predictor. Leaders had higher mean scores than frontline stakeholders, while members and support staff did not differ in mean scores. This study provided foundational evidence for the use of this tool for assessing engagement in clinical networks. Further work is necessary to evaluate engagement in broader network
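    The reliability statistic reported above can be computed directly from an item-response matrix (rows = respondents, columns = scale items); the small data set is invented:

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha: k/(k-1) * (1 - sum(item variances)/total variance)."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    responses = [[4, 5, 4, 4], [2, 3, 3, 2], [5, 5, 4, 5], [3, 3, 2, 3]]
    print(round(cronbach_alpha(responses), 2))
    ```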

  10. Second International Conference on Advanced Computing, Networking and Informatics

    CERN Document Server

    Mohapatra, Durga; Konar, Amit; Chakraborty, Aruna

    2014-01-01

Advanced Computing, Networking and Informatics are three distinct and mutually exclusive disciplines of knowledge with no apparent sharing/overlap among them. However, their convergence is observed in many real world applications, including cyber-security, internet banking, healthcare, sensor networks, cognitive radio, pervasive computing amidst many others. This two-volume proceedings explores the combined use of Advanced Computing and Informatics in the next generation wireless networks and security, signal and image processing, ontology and human-computer interfaces (HCI). The two volumes together include 148 scholarly papers, which have been accepted for presentation from over 640 submissions at the Second International Conference on Advanced Computing, Networking and Informatics, 2014, held in Kolkata, India during June 24-26, 2014. The first volume includes innovative computing techniques and relevant research results in informatics with selective applications in pattern recognition, signal/image process...

  11. Computer- Aided Design in Power Engineering Application of Software Tools

    CERN Document Server

    Stojkovic, Zlatan

    2012-01-01

This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation, guiding through complex problems on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using software tools like MATLAB, EMTP/ATP, Excel & Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of a power system. In the fourth chapter, the application of software tools in the project management in power systems ...

  12. Elementary mathematical and computational tools for electrical and computer engineers using Matlab

    CERN Document Server

    Manassah, Jamal T

    2013-01-01

Ideal for use as a short-course textbook and for self-study, Elementary Mathematical and Computational Tools for Electrical and Computer Engineers Using MATLAB fills that gap. Accessible after just one semester of calculus, it introduces the many practical analytical and numerical tools that are essential to success both in future studies and in professional life. Sharply focused on the needs of the electrical and computer engineering communities, the text provides a wealth of relevant exercises and design problems. Changes in MATLAB's version 6.0 are included in a special addendum.

  13. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Directory of Open Access Journals (Sweden)

    Ezio Bartocci

    2016-01-01

As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  14. A Tangible Programming Tool for Children to Cultivate Computational Thinking

    Science.gov (United States)

    Wang, Danli; Liu, Zhen

    2014-01-01

Games and creation are activities with good potential for cultivating computational thinking skills. In this paper we present T-Maze, an economical tangible programming tool for children aged 5–9 to build computer programs in maze games by placing wooden blocks. Through the use of computer vision technology, T-Maze provides a live programming interface with real-time graphical and voice feedback. We conducted a user study with 7 children using T-Maze to play two levels of maze-escape games and create their own mazes. The results show that T-Maze is not only easy to use, but also has the potential to help children cultivate computational thinking skills such as abstraction, problem decomposition, and creativity.

  15. Network selection, Information filtering and Scalable computation

    Science.gov (United States)

    Ye, Changqing

This dissertation explores two application scenarios of sparsity-pursuit methods on large-scale data sets. The first scenario is classification and regression in analyzing high-dimensional structured data, where predictors correspond to nodes of a given directed graph. This arises in, for instance, identification of disease genes for Parkinson's disease from a network of candidate genes. In such a situation, the directed graph describes dependencies among the genes, where directions of edges represent certain causal effects. Key to high-dimensional structured classification and regression is how to utilize dependencies among predictors as specified by directions of the graph. In this dissertation, we develop a novel method that fully takes into account such dependencies formulated through certain nonlinear constraints. We apply the proposed method to two applications, feature selection in large-margin binary classification and in linear regression. We implement the proposed method through difference convex programming for the cost function and constraints. Finally, theoretical and numerical analyses suggest that the proposed method achieves the desired objectives. An application to disease gene identification is presented. The second application scenario is personalized information filtering, which extracts the information specifically relevant to a user, predicting his/her preference over a large number of items based on the opinions of users who think alike or on item content. This problem is cast into the framework of regression and classification, where we introduce novel partial latent models to integrate additional user-specific and content-specific predictors, for higher predictive accuracy. In particular, we factorize a user-over-item preference matrix into a product of two matrices, each representing a user's preference and an item preference by users. Then we propose a likelihood method to seek a sparsest latent factorization, from a class of over
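    A sketch of the core preference-matrix factorization, without the sparsity-pursuit and side-information extensions the dissertation develops; the rating matrix, rank, and learning rates are toy choices:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    R = np.array([[5, 3, 0, 1], [4, 0, 0, 1], [1, 1, 0, 5], [0, 1, 5, 4.]])
    mask = R > 0                          # only observed ratings contribute
    U = rng.normal(0, 0.1, (4, 2))        # user latent factors
    V = rng.normal(0, 0.1, (4, 2))        # item latent factors
    lr, reg = 0.02, 0.05

    for _ in range(2000):                 # gradient descent, L2-regularized
        E = mask * (R - U @ V.T)          # error on observed entries only
        U += lr * (E @ V - reg * U)
        V += lr * (E.T @ U - reg * V)

    print(np.round(U @ V.T, 1))           # predicted preferences, incl. missing
    ```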

  16. CATIA Core Tools Computer Aided Three-Dimensional Interactive Application

    CERN Document Server

    Michaud, Michel

    2012-01-01

CATIA Core Tools: Computer-Aided Three-Dimensional Interactive Application explains how to use the essential features of this cutting-edge solution for product design and innovation. The book begins with the basics, such as launching the software, configuring the settings, and managing files. Next, you'll learn about sketching, modeling, drafting, and visualization tools and techniques. Easy-to-follow instructions along with detailed illustrations and screenshots help you get started using several CATIA workbenches right away. Reverse engineering, a valuable product development skill, is also covered in this practical resource.

  17. Hybrid Neural Network Approach Based Tool for the Modelling of Photovoltaic Panels

    Directory of Open Access Journals (Sweden)

    Antonino Laudani

    2015-01-01

A hybrid neural network approach based tool for identifying the photovoltaic one-diode model is presented. The generalization capabilities of neural networks are used together with the robustness of the reduced form of the one-diode model. Indeed, from the studies performed by the authors and the works present in the literature, it was found that a direct computation of the five parameters via a multiple-input multiple-output neural network is a very difficult task. The reduced form consists of a series of explicit formulae supporting the neural network that, in our case, is aimed at predicting just two of the five parameters identifying the model: the other three parameters are computed by the reduced form. The present hybrid approach is efficient from the computational cost point of view and accurate in the estimation of the five parameters. It constitutes a complete and extremely easy tool suitable to be implemented in a microcontroller-based architecture. Validations are made on about 10000 PV panels belonging to the California Energy Commission database.
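    To illustrate the model being identified, the one-diode equation can be solved for current at a given voltage once the five parameters are known; the parameter values below are typical-looking assumptions, not the tool's outputs:

    ```python
    import numpy as np
    from scipy.optimize import brentq

    Iph, I0 = 8.0, 1e-9        # photocurrent, saturation current (assumed; in the
                               # tool the NN plus reduced form would supply these)
    Rs, Rsh = 0.3, 200.0       # series / shunt resistance (ohm)
    n, Vt = 1.2, 0.0258 * 60   # ideality factor; thermal voltage, 60-cell module

    def current(V):
        # Implicit one-diode relation:
        # I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh
        def f(I):
            return (Iph - I0 * (np.exp((V + I * Rs) / (n * Vt)) - 1)
                    - (V + I * Rs) / Rsh - I)
        return brentq(f, -2.0, 2 * Iph)   # f is monotone decreasing in I

    for V in (0.0, 20.0, 30.0):
        print(f"V = {V:4.1f} V -> I = {current(V):.2f} A")
    ```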

  18. Field-programmable custom computing technology architectures, tools, and applications

    CERN Document Server

    Luk, Wayne; Pocek, Ken

    2000-01-01

    Field-Programmable Custom Computing Technology: Architectures, Tools, and Applications brings together in one place important contributions and up-to-date research results in this fast-moving area. In seven selected chapters, the book describes the latest advances in architectures, design methods, and applications of field-programmable devices for high-performance reconfigurable systems. The contributors to this work were selected from the leading researchers and practitioners in the field. It will be valuable to anyone working or researching in the field of custom computing technology. It serves as an excellent reference, providing insight into some of the most challenging issues being examined today.

  19. Extending peripersonal space representation without tool-use: evidence from a combined behavioural-computational approach

    Directory of Open Access Journals (Sweden)

    Andrea eSerino

    2015-02-01

    Full Text Available Stimuli from different sensory modalities occurring on or close to the body are integrated in a multisensory representation of the space surrounding the body, i.e. peripersonal space (PPS). PPS is dynamically modified by experience; for example, it extends after a tool is used to reach far objects. However, the neural mechanism underlying PPS plasticity after tool use is largely unknown. Here we use a combined computational-behavioural approach to propose and test a possible mechanism accounting for PPS extension. We first present a neural network model simulating the audio-tactile representation of the PPS around one hand. Simulation experiments showed that our model reproduces the main property of PPS neurons, i.e. a selective multisensory response for stimuli occurring close to the hand. We used the neural network model to simulate the effects of tool-use training. In terms of sensory inputs, tool use was conceptualized as concurrent tactile stimulation from the hand, due to holding the tool, and auditory stimulation from the far space, due to tool-mediated action. Results showed that after exposure to those inputs, PPS neurons responded also to multisensory stimuli far from the hand. The model thus suggests that synchronous pairing of tactile hand stimulation and auditory stimulation from the far space is sufficient to extend PPS, as after tool use. This prediction was confirmed by a behavioural experiment, in which we used an audio-tactile interaction paradigm to measure the boundaries of the PPS representation: PPS extended after synchronous tactile-hand and auditory-far stimulation in a group of healthy volunteers. Control experiments in both simulation and behavioural settings showed that asynchronous tactile and auditory inputs did not change PPS. We conclude by proposing a biologically plausible model, supported by computational and behavioural data, to explain plasticity in the PPS representation after tool use.
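
    The proposed pairing mechanism can be caricatured with a one-layer Hebbian update; this sketch is an illustrative stand-in for the paper's full network model, and every number in it is an assumption:

      # Toy PPS model: response weights over distance from the hand. Synchronous
      # tactile (hand) and auditory (far-space) input strengthens far-space
      # synapses, extending the multisensory response beyond the hand.
      import numpy as np

      dist = np.linspace(0, 100, 11)        # probe positions, cm from the hand
      w = np.exp(-dist / 15)                # initial PPS: strong response near hand
      far = (dist > 60).astype(float)       # auditory input arriving from far space
      eta = 0.2                             # learning rate
      for _ in range(20):                   # simulated tool-use training trials
          tactile = 1.0                     # synchronous touch on the hand
          w += eta * tactile * far          # Hebb: co-active inputs strengthen
      w = np.clip(w, 0.0, 1.0)
      print(np.round(w, 2))                 # far-space responses now elevated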

  20. 3rd International Conference on Advanced Computing, Networking and Informatics

    CERN Document Server

    Mohapatra, Durga; Chaki, Nabendu

    2016-01-01

    Advanced Computing, Networking and Informatics are three distinct and mutually exclusive disciplines of knowledge with no apparent sharing/overlap among them. However, their convergence is observed in many real-world applications, including cyber-security, internet banking, healthcare, sensor networks, cognitive radio, and pervasive computing, among many others. These two proceedings volumes explore the combined use of Advanced Computing and Informatics in next-generation wireless networks and security, signal and image processing, ontology, and human-computer interfaces (HCI). The two volumes together include 132 scholarly articles, which have been accepted for presentation from over 550 submissions to the Third International Conference on Advanced Computing, Networking and Informatics, 2015, held in Bhubaneswar, India during June 23–25, 2015.

  1. HeNCE: A Heterogeneous Network Computing Environment

    Directory of Open Access Journals (Sweden)

    Adam Beguelin

    1994-01-01

    Full Text Available Network computing seeks to utilize the aggregate resources of many networked computers to solve a single problem. In so doing it is often possible to obtain supercomputer performance from an inexpensive local area network. The drawback is that network computing is complicated and error prone when done by hand, especially if the computers have different operating systems and data formats and are thus heterogeneous. The heterogeneous network computing environment (HeNCE) is an integrated graphical environment for creating and running parallel programs over a heterogeneous collection of computers. It is built on a lower level package called parallel virtual machine (PVM). The HeNCE philosophy of parallel programming is to have the programmer graphically specify the parallelism of a computation and to automate, as much as possible, the tasks of writing, compiling, executing, debugging, and tracing the network computation. Key to HeNCE is a graphical language based on directed graphs that describe the parallelism and data dependencies of an application. Nodes in the graphs represent conventional Fortran or C subroutines and the arcs represent data and control flow. This article describes the present state of HeNCE, its capabilities, limitations, and areas of future research.
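
    The directed-graph execution model can be sketched as below; this simple wave-by-wave scheduler is an illustrative stand-in for HeNCE's graphical environment, with all names assumed:

      # Tasks are ordinary subroutines (here, Python callables); arcs are data
      # dependencies. Each "wave" runs every task whose dependencies are done.
      from concurrent.futures import ThreadPoolExecutor

      def run_dag(tasks, deps):
          """tasks: {name: callable taking the results of its dependencies};
          deps: {name: list of names it depends on}."""
          done, results = set(), {}
          with ThreadPoolExecutor() as pool:
              while len(done) < len(tasks):
                  ready = [t for t in tasks
                           if t not in done and all(d in done for d in deps[t])]
                  if not ready:
                      raise ValueError("dependency cycle")
                  futures = {t: pool.submit(tasks[t], *(results[d] for d in deps[t]))
                             for t in ready}
                  for t, f in futures.items():
                      results[t] = f.result()
                      done.add(t)
          return results

      # Two independent nodes feed a third, as in a HeNCE-style graph.
      out = run_dag({"a": lambda: 2, "b": lambda: 3, "c": lambda x, y: x + y},
                    {"a": [], "b": [], "c": ["a", "b"]})
      print(out["c"])  # 5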

  2. Computational design approaches and tools for synthetic biology.

    Science.gov (United States)

    MacDonald, James T; Barnes, Chris; Kitney, Richard I; Freemont, Paul S; Stan, Guy-Bart V

    2011-02-01

    A proliferation of new computational methods and software tools for synthetic biology design has emerged in recent years but the field has not yet reached the stage where the design and construction of novel synthetic biology systems has become routine. To a large degree this is due to the inherent complexity of biological systems. However, advances in biotechnology and our scientific understanding have already enabled a number of significant achievements in this area. A key concept in engineering is the ability to assemble simpler standardised modules into systems of increasing complexity but it has yet to be adequately addressed how this approach can be applied to biological systems. In particular, the use of computer aided design tools is common in other engineering disciplines and it should eventually become centrally important to the field of synthetic biology if the challenge of dealing with the stochasticity and complexity of biological systems can be overcome.

  3. Dynamics of Bottlebrush Networks: A Computational Study

    Science.gov (United States)

    Dobrynin, Andrey; Cao, Zhen; Sheiko, Sergei

    We study the dynamics of deformation of bottlebrush networks using molecular dynamics simulations and theoretical calculations. Analysis of our simulation results shows that the dynamics of bottlebrush network deformation can be described by a Rouse model for polydisperse networks with an effective Rouse time of the bottlebrush network strand, τR = τ0 Ns²(Nsc + 1), where Ns is the number-average degree of polymerization of the bottlebrush backbone strands between crosslinks, Nsc is the degree of polymerization of the side chains, and τ0 is a characteristic monomeric relaxation time. At time scales t smaller than the Rouse time, t < τR, the network stress relaxes through Rouse-like modes; at time scales larger than the Rouse time of the strands between crosslinks, the network response is purely elastic with shear modulus G(t) = G0, where G0 is the equilibrium shear modulus at small deformation. The stress evolution in the bottlebrush networks can be described by a universal function of t/τR. NSF DMR-1409710.

  4. Complex network problems in physics, computer science and biology

    Science.gov (United States)

    Cojocaru, Radu Ionut

    There is a close relation between physics and mathematics, and the exchange of ideas between these two sciences is well established. However, until a few years ago there was no such close relation between physics and computer science. Moreover, only recently have biologists started to use methods and tools from statistical physics to study the behavior of complex systems. In this thesis we concentrate on applying and analyzing several methods borrowed from computer science in biology, and we also use methods from statistical physics to solve hard problems from computer science. In recent years physicists have been interested in studying the behavior of complex networks. Physics is an experimental science in which theoretical predictions are compared to experiments. In this definition, the term prediction plays a very important role: although the system is complex, it is still possible to get predictions for its behavior, but these predictions are of a probabilistic nature. Spin glasses, lattice gases, and the Potts model are a few examples of complex systems in physics. Spin glasses and many frustrated antiferromagnets map exactly to computer science problems in the NP-hard class defined in Chapter 1. In Chapter 1 we discuss a common result from artificial intelligence (AI) which shows that some problems are NP-complete, with the implication that these problems are difficult to solve. We introduce a few well-known hard problems from computer science (Satisfiability, Coloring, Vertex Cover together with Maximum Independent Set, and Number Partitioning) and then discuss their mapping to problems from physics. In Chapter 2 we provide a short review of combinatorial optimization algorithms and their applications to ground-state problems in disordered systems. We discuss the cavity method initially developed for studying the Sherrington-Kirkpatrick model of spin glasses. We extend this model to the study of a specific case of spin glass on the Bethe

  5. Simulation and Noise Analysis of Multimedia Transmission in Optical CDMA Computer Networks

    Directory of Open Access Journals (Sweden)

    Nasaruddin Nasaruddin

    2013-09-01

    Full Text Available This paper simulates and analyzes noise in multimedia transmission over a flexible optical code division multiple access (OCDMA) computer network with different quality of service (QoS) requirements. To achieve multimedia transmission in OCDMA, we have proposed strict variable-weight optical orthogonal codes (VW-OOCs), which can guarantee the smallest correlation value of one by optimal design. In developing multimedia transmission for a computer network, a simulation tool is essential for analyzing the effectiveness of various service transmissions. In this paper, implementation models are proposed to analyze multimedia transmission in representative OCDMA computer networks using MATLAB Simulink tools. Simulation results of the models are discussed, including the output spectra of transmitted signals, superimposed signals, and received signals, and eye diagrams with and without noise. Using the proposed models, a multimedia OCDMA computer network using the strict VW-OOC is practically evaluated. Furthermore, system performance is evaluated by considering avalanche photodiode (APD) noise and thermal noise. The results show that system performance depends on code weight, received laser power, APD noise, and thermal noise, which should be considered as important parameters when designing and implementing multimedia transmission in OCDMA computer networks.

  17. Understanding organometallic reaction mechanisms and catalysis: experimental and computational tools

    CERN Document Server

    Ananikov, Valentin P

    2014-01-01

    The goal of this book is to explore and highlight the new horizons opened in the study of reaction mechanisms by the joint application of experimental studies and theoretical calculations. The latest insights and developments in the mechanistic studies of organometallic reactions and catalytic processes are presented and reviewed. The book adopts a unique approach, exemplifying how to use experiments, spectroscopy measurements, and computational methods to reveal reaction pathways and molecular structures of catalysts, rather than concentrating solely on one discipline. The result is a deeper

  8. Final Report for Foundational Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Hollingsworth, Jeff [Univ. of Maryland, College Park, MD (United States)

    2015-02-12

    This project concentrated on various aspects of creating tool infrastructure to make it easier to program large-scale parallel computers. The project was a collaboration with the University of Wisconsin and was closely related to project DE-SC0002606 (“Tools for the Development of High Performance Energy Applications and Systems”). The research conducted during this project is summarized in this report; the complete details of the work are available in the ten publications listed at the end of the report. Many of the concepts created during this project have been incorporated into tools and made available as freely downloadable software (at www.dyninst.org). The project also supported the Ph.D. studies of three students and one research staff member.

  9. WaveJava: Wavelet-based network computing

    Science.gov (United States)

    Ma, Kun; Jiao, Licheng; Shi, Zhuoer

    1997-04-01

    Wavelet is a powerful theory, but its successful application still needs suitable programming tools. Java is a simple, object-oriented, distributed, interpreted, robust, secure, architecture-neutral, portable, high-performance, multi-threaded, dynamic language. This paper addresses the design and development of a cross-platform software environment for experimenting with and applying wavelet theory. WaveJava, a wavelet class library designed with object-oriented programming, is developed to take advantage of wavelet features such as multi-resolution analysis and parallel processing in network computing. A new application architecture is designed for the net-wide distributed client-server environment. The data are transmitted as multi-resolution packets. At distributed sites around the net, these data packets undergo matching or recognition processing in parallel, and the results are fed back to determine the next operation, so more robust results can be reached quickly. WaveJava is easy to use and to extend for special applications. This paper gives a solution for a distributed fingerprint information processing system; the approach also fits other net-based multimedia information processing, such as network libraries, remote teaching, and filmless picture archiving and communication systems.
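
    The multi-resolution packet idea can be illustrated with one level of a Haar wavelet transform; this sketch is purely illustrative and does not reflect WaveJava's actual class library:

      # One Haar step splits a signal into a coarse approximation (what a
      # low-resolution packet would carry) and detail coefficients that can
      # be transmitted later to refine the reconstruction.
      def haar_step(x):
          avg = [(a + b) / 2 for a, b in zip(x[::2], x[1::2])]   # coarse packet
          det = [(a - b) / 2 for a, b in zip(x[::2], x[1::2])]   # refinement packet
          return avg, det

      def haar_inverse(avg, det):
          out = []
          for a, d in zip(avg, det):
              out += [a + d, a - d]
          return out

      signal = [4, 6, 10, 12, 8, 8, 0, 2]
      avg, det = haar_step(signal)
      assert haar_inverse(avg, det) == signal  # details restore full resolution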

  10. Computational quality control tools for mass spectrometry proteomics.

    Science.gov (United States)

    Bittremieux, Wout; Valkenborg, Dirk; Martens, Lennart; Laukens, Kris

    2017-02-01

    As mass-spectrometry-based proteomics has matured during the past decade, a growing emphasis has been placed on quality control. For this purpose, multiple computational quality control tools have been introduced. These tools generate a set of metrics that can be used to assess the quality of a mass spectrometry experiment. Here we review which types of quality control metrics can be generated, and how they can be used to monitor both intra- and inter-experiment performances. We discuss the principal computational tools for quality control and list their main characteristics and applicability. As most of these tools have specific use cases, it is not straightforward to compare their performances. For this survey, we used different sets of quality control metrics derived from information at various stages in a mass spectrometry process and evaluated their effectiveness at capturing qualitative information about an experiment using a supervised learning approach. Furthermore, we discuss currently available algorithmic solutions that enable the usage of these quality control metrics for decision-making. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Computing properties of stable configurations of thermodynamic binding networks

    OpenAIRE

    Breik, Keenan; Prakash, Lakshmi; Thachuk, Chris; Heule, Marijn; Soloveichik, David

    2017-01-01

    Models of molecular computing generally embed computation in kinetics--the specific time evolution of a chemical system. However, if the desired output is not thermodynamically stable, basic physical chemistry dictates that thermodynamic forces will drive the system toward error throughout the computation. The Thermodynamic Binding Network (TBN) model was introduced to formally study how the thermodynamic equilibrium can be made consistent with the desired computation, and it idealizes bindin...

  12. Artificial Neural Network Metamodels of Stochastic Computer Simulations

    Science.gov (United States)

    1994-08-10

  13. Wireless Networks: New Meaning to Ubiquitous Computing.

    Science.gov (United States)

    Drew, Wilfred, Jr.

    2003-01-01

    Discusses the use of wireless technology in academic libraries. Topics include wireless networks; standards (IEEE 802.11); wired versus wireless; why libraries implement wireless technology; wireless local area networks (WLANs); WLAN security; examples of wireless use at Indiana State University and Morrisville College (New York); and useful…

  14. CFD Optimization on Network-Based Parallel Computer System

    Science.gov (United States)

    Cheung, Samson H.; VanDalsem, William (Technical Monitor)

    1994-01-01

    Combining multiple engineering workstations into a network-based heterogeneous parallel computer allows the application of aerodynamic optimization with advanced computational fluid dynamics codes, which is computationally expensive on a mainframe supercomputer. This paper introduces a nonlinear quasi-Newton optimizer designed for this network-based heterogeneous parallel computer, built on software called Parallel Virtual Machine. The paper describes the methodology behind coupling a Parabolized Navier-Stokes flow solver to the nonlinear optimizer. This parallel optimization package has been applied to reduce the wave drag of a body of revolution and of a wing/body configuration, with resulting drag reductions of 5% to 6%.

  15. RTOL: design and implementation of a network equipment testing tool

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J.Y.; Kim, H.J.; Kim, B.S.; Park, K.H.; An, S.S. [Korea University, Seoul (Korea, Republic of)]; Choi, H.S.; Yu, S.H. [Samsung Electronics, Suwon (Korea, Republic of)]

    1997-11-01

    As the infrastructure of information communication networks becomes larger and more complicated, much new network equipment is being developed. To verify the reliability of such equipment, many test methods have been proposed, but these require considerable cost and effort. In this paper, we design and implement a test tool, called RTOL (Router Testing command Language system), to verify the functions of network equipment, especially routers. RTOL can be used to test OSPF, AppleTalk, and DECnet as well as IP, and it supports the functions of an SNMP manager. By using the virtual router functions of RTOL, we can operate many virtual routers with only one physical router. Finally, we present test results for specific routers obtained using RTOL. (author). 17 refs., 8 figs., 5 tabs.

  16. Phoebus: Network Middleware for Next-Generation Network Computing

    Energy Technology Data Exchange (ETDEWEB)

    Martin Swany

    2012-06-16

    The Phoebus project investigated algorithms, protocols, and middleware infrastructure to improve end-to-end performance in high speed, dynamic networks. The Phoebus system essentially serves as an adaptation point for networks with disparate capabilities or provisioning. This adaptation can take a variety of forms including acting as a provisioning agent across multiple signaling domains, providing transport protocol adaptation points, and mapping between distributed resource reservation paradigms and the optical network control plane. We have successfully developed the system and demonstrated benefits. The Phoebus system was deployed in Internet2 and in ESnet, as well as in GEANT2, RNP in Brazil and over international links to Korea and Japan. Phoebus is a system that implements a new protocol and associated forwarding infrastructure for improving throughput in high-speed dynamic networks. It was developed to serve the needs of large DOE applications on high-performance networks. The idea underlying the Phoebus model is to embed Phoebus Gateways (PGs) in the network as on-ramps to dynamic circuit networks. The gateways act as protocol translators that allow legacy applications to use dedicated paths with high performance.

  17. Design of computational retrobiosynthesis tools for the design of de novo synthetic pathways.

    Science.gov (United States)

    Hadadi, Noushin; Hatzimanikatis, Vassily

    2015-10-01

    Designing putative metabolic pathways is of great interest in synthetic biology. Retrobiosynthesis is a discipline that involves the design, evaluation, and optimization of de novo biosynthetic pathways for the production of high-value compounds and drugs from renewable resources and natural or engineered enzymes. The best candidate pathways are then engineered within a metabolic network of microorganisms that serve as synthetic platforms for synthetic biology. The complexity of biological chemistry and metabolism requires computational approaches to explore the full possibilities of engineering synthetic pathways towards target compounds. Herein, we discuss recent developments in the design of computational tools for retrosynthetic biochemistry and outline the workflow and design elements for such tools. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.

  18. Computational tools and algorithms for designing customized synthetic genes.

    Science.gov (United States)

    Gould, Nathan; Hendy, Oliver; Papamichail, Dimitris

    2014-01-01

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.

  19. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    Directory of Open Access Journals (Sweden)

    Nathan eGould

    2014-10-01

    Full Text Available Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de-novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.

  20. Natural language processing tools for computer assisted language learning

    Directory of Open Access Journals (Sweden)

    Vandeventer Faltin, Anne

    2003-01-01

    Full Text Available This paper illustrates the usefulness of natural language processing (NLP) tools for computer assisted language learning (CALL) through the presentation of three NLP tools integrated within a CALL software for French. These tools are (i) a sentence structure viewer; (ii) an error diagnosis system; and (iii) a conjugation tool. The sentence structure viewer helps language learners grasp the structure of a sentence, by providing lexical and grammatical information. This information is derived from a deep syntactic analysis. Two different outputs are presented. The error diagnosis system is composed of a spell checker, a grammar checker, and a coherence checker. The spell checker makes use of alpha-codes, phonological reinterpretation, and some ad hoc rules to provide correction proposals. The grammar checker employs constraint relaxation and phonological reinterpretation as diagnosis techniques. The coherence checker compares the underlying "semantic" structures of a stored answer and of the learners' input to detect semantic discrepancies. The conjugation tool is a resource with enhanced capabilities when put on an electronic format, enabling searches from inflected and ambiguous verb forms.

  1. Computationally Efficient Neural Network Intrusion Security Awareness

    Energy Technology Data Exchange (ETDEWEB)

    Todd Vollmer; Milos Manic

    2009-08-01

    An enhanced version of an algorithm to provide anomaly based intrusion detection alerts for cyber security state awareness is detailed. A unique aspect is the training of an error back-propagation neural network with intrusion detection rule features to provide a recognition basis. Network packet details are subsequently provided to the trained network to produce a classification. This leverages rule knowledge sets to produce classifications for anomaly based systems. Several test cases executed on ICMP protocol revealed a 60% identification rate of true positives. This rate matched the previous work, but 70% less memory was used and the run time was reduced to less than 1 second from 37 seconds.

  2. An efficient algorithm for computing attractors of synchronous and asynchronous Boolean networks.

    Science.gov (United States)

    Zheng, Desheng; Yang, Guowu; Li, Xiaoyu; Wang, Zhicai; Liu, Feng; He, Lei

    2013-01-01

    Biological networks, such as genetic regulatory networks, often contain positive and negative feedback loops that settle down to dynamically stable patterns. Identifying these patterns, the so-called attractors, can provide important insights for biologists into the molecular mechanisms underlying many coordinated cellular processes such as cell division, differentiation, and homeostasis. Both synchronous and asynchronous Boolean networks have been used to simulate genetic regulatory networks and identify their attractors. Common methods of computing attractors start with a randomly selected initial state and end with an exhaustive search of the state space of a network; however, the time complexity of these methods grows exponentially with the number and length of the attractors. Here, we build two algorithms for computing attractors in synchronous and asynchronous Boolean networks. For the synchronous scenario, combining iterative methods with reduced-order binary decision diagrams (ROBDDs), we propose an improved algorithm to compute attractors. In the second algorithm, the attractors of synchronous Boolean networks are used in asynchronous Boolean transition functions to derive the attractors of the asynchronous scenario. The proposed algorithms are implemented in a procedure called geneFAtt. Compared to existing tools such as genYsis, geneFAtt is significantly faster in computing attractors for empirical experimental systems. The software package is available at https://sites.google.com/site/desheng619/download.
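
    A minimal sketch of the exhaustive synchronous attractor search that such tools improve upon; the three-node network and all names are made up for illustration:

      # Enumerate every state, follow the synchronous trajectory until a state
      # repeats, and record the cycle it falls into (the attractor).
      from itertools import product

      def attractors(update, n):
          """update: function mapping a state tuple to the next state tuple."""
          found = set()
          for state in product((0, 1), repeat=n):   # all 2^n initial states
              seen = {}
              while state not in seen:              # follow the trajectory
                  seen[state] = len(seen)
                  state = update(state)
              cycle_start = seen[state]             # index of first repeated state
              cycle = tuple(sorted(s for s, i in seen.items() if i >= cycle_start))
              found.add(cycle)
          return found

      # Toy network: x1' = x2, x2' = x1 AND x3, x3' = NOT x1
      step = lambda s: (s[1], s[0] & s[2], 1 - s[0])
      for a in attractors(step, 3):
          print(a)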

  3. Students Computer Literacy: Covariate For Assessing The Efficacy Of Computer Assisted Learning Tools

    OpenAIRE

    Ronald R. Tidd; Richard Fenzl

    2011-01-01

    The purpose of this paper is to focus attention on the need to more rigorously measure computer-specific student characteristics when assessing the efficacy of computer assisted learning tools and benchmarking a curriculum's impact. It accomplishes this by first modeling learning outcomes assessment, identifying appropriate instruments, and discussing the absence of such measures in accounting education research. Then, the measurement process employed by the authors is discussed. The unsurpri...

  4. Evolving ATLAS Computing For Today’s Networks

    CERN Document Server

    Campana, S; The ATLAS collaboration; Jezequel, S; Negri, G; Serfon, C; Ueda, I

    2012-01-01

    The ATLAS computing infrastructure was designed many years ago based on the assumption of rather limited network connectivity between computing centres. ATLAS sites have been organized in a hierarchical model, where only a static subset of all possible network links can be exploited and a static subset of well-connected sites (CERN and the T1s) can cover important functional roles such as hosting master copies of the data. The pragmatic adoption of such a simplified approach, compared with a more relaxed scenario interconnecting all sites, was very beneficial during the commissioning of the ATLAS distributed computing system and essential in reducing the operational cost during the first two years of LHC data taking. In the meantime, networks have evolved far beyond this initial scenario: while a few countries are still poorly connected with the rest of the WLCG infrastructure, most of the ATLAS computing centres are now efficiently interlinked. Our operational experience in running the computing infrastructure in ...

  5. Networks and Project Work: Alternative Pedagogies for Writing with Computers.

    Science.gov (United States)

    Susser, Bernard

    1993-01-01

    Describes three main uses of computers for writing as a social activity: networking, telecommunications, and project work. Examines advantages and disadvantages of teaching writing on a network. Argues that reports in the literature and the example of an English as a foreign language writing class show that project work shares most of the…

  6. Computer Networking Strategies for Building Collaboration among Science Educators.

    Science.gov (United States)

    Aust, Ronald

    The development and dissemination of science materials can be associated with technical delivery systems such as the Unified Network for Informatics in Teacher Education (UNITE). The UNITE project was designed to investigate ways for using computer networking to improve communications and collaboration among university schools of education and…

  7. SURFmap: A Network Monitoring Tool Based on the Google Maps API

    NARCIS (Netherlands)

    Hofstede, R.J.; Hofstede, R. J.; Fioreze, Tiago

    2009-01-01

    Network monitoring allows network managers to get a better insight in the network traffic transiting in a managed network. In order to make the tasks of a network manager easier, many network monitoring tools are made available for a wide range of purposes (e.g., traffic accounting, performance

  8. Computational and Physical Quality Assurance Tools for Radiotherapy

    Science.gov (United States)

    Graves, Yan Jiang

    Radiation therapy aims at delivering a prescribed amount of radiation dose to cancerous targets while sparing dose to normal organs. Treatment planning and delivery in modern radiotherapy are highly complex. To ensure the accuracy of the dose delivered to a patient, a quality assurance (QA) procedure is needed before the actual treatment delivery. This dissertation aims at developing computational and physical tools to facilitate the QA process. In Chapter 2, we have developed a fast and accurate computational QA tool using a graphics-processing-unit-based Monte Carlo (MC) dose engine. This QA tool aims at identifying any errors in the treatment planning stage and the machine delivery process by comparing three dose distributions: the planned dose computed by a treatment planning system, and the planned dose and delivered dose reconstructed using the MC method. Within this tool, several modules have been built: (1) a denoising algorithm to smooth the MC-calculated dose (we have also investigated the effects of statistical uncertainty in MC simulations on a commonly used dose comparison metric); (2) a linear accelerator source model with a semi-automatic commissioning process; (3) a fluence generation module. With all these modules, a web application for this QA tool with a user-friendly interface has been developed to give users easy access to our tool, facilitating its clinical utilization. Even after an initial treatment plan fulfills the QA requirements, a patient may experience inter-fractional anatomy variations, which compromise the initial plan's optimality. To resolve this issue, adaptive radiotherapy (ART) has been proposed, where the treatment plan is redesigned based on the most recent patient anatomy. In Chapter 3, we have constructed a physical deformable head and neck (HN) phantom with in-vivo dosimetry capability. This phantom resembles HN patient geometry and simulates tumor shrinkage with a high level of realism. The ground truth deformation field can be measured

  9. Computers and the internet: tools for youth empowerment.

    Science.gov (United States)

    Valaitis, Ruta K

    2005-10-04

    Youth are often disenfranchised in their communities and may feel they have little voice. Since computers are an important aspect of youth culture, they may offer solutions to increasing youth participation in communities. This qualitative case study investigated the perceptions of 19 (predominantly female) inner-city school youth about their use of computers and the Internet in a school-based community development project. Youth working with public health nurses in a school-based community development project communicated with local community members using computer-mediated communication, surveyed peers online, built websites, searched for information online, and prepared project materials using computers and the Internet. Participant observation, semistructured interviews, analysis of online messages, and online- and paper-based surveys were used to gather data about youth's and adults' perceptions and use of the technologies. The constant comparison method and between-method triangulation were used in the analysis to verify the existence of themes. Not all youth were interested in working with computers. Some electronic messages from adults were perceived to be critical, and writing to adults was intimidating for some youth. In addition, technical problems were experienced. Despite these barriers, most youth perceived that using computers and the Internet reduced their anxiety concerning communication with adults, increased their control when dealing with adults, raised their perception of their social status, increased participation within the community, supported reflective thought, increased efficiency, and improved their access to resources. Overall, youth perceived computers and the Internet to be empowering tools, and they should be encouraged to use such technology to support them in community initiatives.

  10. Fundamentals of computational intelligence neural networks, fuzzy systems, and evolutionary computation

    CERN Document Server

    Keller, James M; Fogel, David B

    2016-01-01

    This book covers the three fundamental topics that form the basis of computational intelligence: neural networks, fuzzy systems, and evolutionary computation. The text focuses on inspiration, design, theory, and practical aspects of implementing procedures to solve real-world problems. While other books in the three fields that comprise computational intelligence are written by specialists in one discipline, this book is co-written by a former Editor-in-Chief of IEEE Transactions on Neural Networks and Learning Systems, a former Editor-in-Chief of IEEE Transactions on Fuzzy Systems, and the founding Editor-in-Chief of IEEE Transactions on Evolutionary Computation. The coverage across the three topics is both uniform and consistent in style and notation. Discusses single-layer and multilayer neural networks, radial-basis function networks, and recurrent neural networks. Covers fuzzy set theory, fuzzy relations, fuzzy logic inference, fuzzy clustering and classification, fuzzy measures and fuzz...

  11. Neuromorphic computing applications for network intrusion detection systems

    Science.gov (United States)

    Garcia, Raymond C.; Pino, Robinson E.

    2014-05-01

    What is presented here is a sequence of evolving concepts for network intrusion detection. These concepts start with neuromorphic structures for XOR-based signature matching and conclude with computationally based network intrusion detection system with an autonomous structuring algorithm. There is evidence that neuromorphic computation for network intrusion detection is fractal in nature under certain conditions. Specifically, the neural structure can take fractal form when simple neural structuring is autonomous. A neural structure is fractal by definition when its fractal dimension exceeds the synaptic matrix dimension. The authors introduce the use of fractal dimension of the neuromorphic structure as a factor in the autonomous restructuring feedback loop.

  12. Trajectory Based Optimal Segment Computation in Road Network Databases

    DEFF Research Database (Denmark)

    Li, Xiaohui; Ceikute, Vaida; Jensen, Christian S.

    2013-01-01

    Given a road network, a set of existing facilities, and a collection of customer route traversals, an optimal segment query returns the optimal road network segment(s) for a new facility. We propose a practical framework for computing this query, where each route... We present two algorithms that adopt different approaches to computing the query and that are shown empirically to be scalable. Algorithm AUG uses graph augmentation, and ITE uses iterative road-network partitioning. Empirical studies with real data sets demonstrate that the algorithms are capable of offering high performance in realistic settings.

  13. Computational tools for the synthetic design of biochemical pathways.

    Science.gov (United States)

    Medema, Marnix H; van Raaphorst, Renske; Takano, Eriko; Breitling, Rainer

    2012-01-23

    As the field of synthetic biology is developing, the prospects for de novo design of biosynthetic pathways are becoming more and more realistic. Hence, there is an increasing need for computational tools that can support these efforts. A range of algorithms has been developed that can be used to identify all possible metabolic pathways and their corresponding enzymatic parts. These can then be ranked according to various properties and modelled in an organism-specific context. Finally, design software can aid the biologist in the integration of a selected pathway into smartly regulated transcriptional units. Here, we review key existing tools and offer suggestions for how informatics can help to shape the future of synthetic microbiology.
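
    A minimal sketch of the pathway-enumeration step described above, using breadth-first search over a toy reaction network; all compounds, enzymes, and names are illustrative, not drawn from any of the reviewed tools:

      # Enumerate all reaction paths from a source metabolite to a target in a
      # toy (acyclic) reaction network: substrate -> [(product, enzyme), ...].
      from collections import deque

      reactions = {
          "glucose": [("g6p", "hexokinase")],
          "g6p": [("f6p", "pgi"), ("6pg", "g6pdh")],
          "f6p": [("fbp", "pfk")],
          "6pg": [("ru5p", "6pgdh")],
      }

      def pathways(source, target):
          queue, found = deque([[("start", source)]]), []
          while queue:
              path = queue.popleft()
              compound = path[-1][1]
              if compound == target:
                  found.append(path[1:])   # drop the start marker
                  continue
              for product, enzyme in reactions.get(compound, []):
                  queue.append(path + [(enzyme, product)])
          return found

      for p in pathways("glucose", "fbp"):
          print(" -> ".join(f"{c} [{e}]" for e, c in p))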

  14. Benefits and Pitfalls: Simple Guidelines for the Use of Social Networking Tools in K-12 Education

    Science.gov (United States)

    Huffman, Stephanie

    2013-01-01

    The article will outline a framework for the use of social networking tools in K-12 education framed around four thought-provoking questions: 1) what are the benefits and pitfalls of using social networking tools in P-12 education, 2) how do we plan effectively for the use of social networking tools, 3) what role does professional development play…

  15. Social Networking Tools and Teacher Education Learning Communities: A Case Study

    Science.gov (United States)

    Poulin, Michael T.

    2014-01-01

    Social networking tools have become an integral part of a pre-service teacher's educational experience. As a result, the educational value of social networking tools in teacher preparation programs must be examined. The specific problem addressed in this study is that the role of social networking tools in teacher education learning communities…

  16. Classification and Analysis of Computer Network Traffic

    DEFF Research Database (Denmark)

    Bujlow, Tomasz

    2014-01-01

    Traffic monitoring and analysis can be done for multiple different reasons: to investigate the usage of network resources, assess the performance of network applications, adjust Quality of Service (QoS) policies in the network, log the traffic to comply with the law, or create realistic models ... for traffic classification, which can be used for nearly real-time processing of big amounts of data using affordable CPU and memory resources. Other questions are related to methods for real-time estimation of the application Quality of Service (QoS) level based on the results obtained by the traffic classifier. This thesis is focused on topics connected with traffic classification and analysis, while the work on methods for QoS assessment is limited to defining the connections with the traffic classification and proposing a general algorithm. We introduced the already known methods for traffic...

  17. Advanced computational tools for 3-D seismic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A. [Oak Ridge National Lab., TN (United States)] [and others]

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis and to test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 1993-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  18. A computer aided engineering tool for ECLS systems

    Science.gov (United States)

    Bangham, Michal E.; Reuter, James L.

    1987-01-01

    The Computer-Aided Systems Engineering and Analysis tool used by NASA for environmental control and life support system design studies is capable of simulating atmospheric revitalization systems, water recovery and management systems, and single-phase active thermal control systems. The designer/analyst interface is graphics-based and allows the designer to build a model by constructing a schematic of the system under consideration. Data management functions are performed, and the program is translated into a format that is compatible with the solution routines.

  19. Optical interconnection networks for high-performance computing systems.

    Science.gov (United States)

    Biberman, Aleksandr; Bergman, Keren

    2012-04-01

    Enabled by silicon photonic technology, optical interconnection networks have the potential to be a key disruptive technology in computing and communication industries. The enduring pursuit of performance gains in computing, combined with stringent power constraints, has fostered the ever-growing computational parallelism associated with chip multiprocessors, memory systems, high-performance computing systems and data centers. Sustaining these parallelism growths introduces unique challenges for on- and off-chip communications, shifting the focus toward novel and fundamentally different communication approaches. Chip-scale photonic interconnection networks, enabled by high-performance silicon photonic devices, offer unprecedented bandwidth scalability with reduced power consumption. We demonstrate that the silicon photonic platforms have already produced all the high-performance photonic devices required to realize these types of networks. Through extensive empirical characterization in much of our work, we demonstrate such feasibility of waveguides, modulators, switches and photodetectors. We also demonstrate systems that simultaneously combine many functionalities to achieve more complex building blocks. We propose novel silicon photonic devices, subsystems, network topologies and architectures to enable unprecedented performance of these photonic interconnection networks. Furthermore, the advantages of photonic interconnection networks extend far beyond the chip, offering advanced communication environments for memory systems, high-performance computing systems, and data centers.

  20. Active system area networks for data intensive computations. Final report

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-04-01

    The goal of the Active System Area Networks (ASAN) project is to develop hardware and software technologies for the implementation of active system area networks (ASANs). The use of the term "active" refers to the ability of the network interfaces to perform application-specific as well as system-level computations in addition to their traditional role of data transfer. This project adopts the view that the network infrastructure should be an active computational entity capable of supporting certain classes of computations that would otherwise be performed on the host CPUs. The result is a unique network-wide programming model where computations are dynamically placed within the host CPUs or the NIs depending upon the quality-of-service demands and network/CPU resource availability. The project seeks to demonstrate that such an approach is a better match for data-intensive network-based applications and that the advent of low-cost powerful embedded processors and configurable hardware makes such an approach economically viable and desirable.

  1. Console Networks for Major Computer Systems

    Energy Technology Data Exchange (ETDEWEB)

    Ophir, D; Shepherd, B; Spinrad, R J; Stonehill, D

    1966-07-22

    A concept for interactive time-sharing of a major computer system is developed in which satellite computers mediate between the central computing complex and the various individual user terminals. These techniques allow the development of a satellite system substantially independent of the details of the central computer and its operating system. Although the user terminals' roles may be rich and varied, the demands on the central facility are merely those of a tape drive or similar batched information transfer device. The particular system under development provides service for eleven visual display and communication consoles, sixteen general purpose, low rate data sources, and up to thirty-one typewriters. Each visual display provides a flicker-free image of up to 4000 alphanumeric characters or tens of thousands of points by employing a swept raster picture generating technique directly compatible with that of commercial television. Users communicate either by typewriter or a manually positioned light pointer.

  2. Electromagnetic field computation by network methods

    CERN Document Server

    Felsen, Leopold B; Russer, Peter

    2009-01-01

    This monograph proposes a systematic and rigorous treatment of electromagnetic field representations in complex structures. The book presents new strong models by combining important computational methods. This is the last book of the late Leopold Felsen.

  3. Realistic computer network simulation for network intrusion detection dataset generation

    Science.gov (United States)

    Payer, Garrett

    2015-05-01

    The KDD-99 Cup dataset is dead. While it can continue to be used as a toy example, the age of this dataset makes it all but useless for intrusion detection research and data mining. Many of the attacks used within the dataset are obsolete and do not reflect the features important for intrusion detection in today's networks. Creating a new dataset encompassing a large cross section of the attacks found on the Internet today could be useful, but would eventually fall to the same problem as the KDD-99 Cup; its usefulness would diminish after a period of time. To continue research into intrusion detection, the generation of new datasets needs to be as dynamic and as quick as the attacker. Simply examining existing network traffic and using domain experts such as intrusion analysts to label traffic is inefficient, expensive, and not scalable. The only viable methodology is simulation using technologies including virtualization, attack-toolsets such as Metasploit and Armitage, and sophisticated emulation of threat and user behavior. Simulating actual user behavior and network intrusion events dynamically not only allows researchers to vary scenarios quickly, but enables online testing of intrusion detection mechanisms by interacting with data as it is generated. As new threat behaviors are identified, they can be added to the simulation to make quicker determinations as to the effectiveness of existing and ongoing network intrusion technology, methodology and models.

  4. 1st International Conference on Signal, Networks, Computing, and Systems

    CERN Document Server

    Mohapatra, Durga; Nagar, Atulya; Sahoo, Manmath

    2016-01-01

    The book is a collection of high-quality peer-reviewed research papers presented at the first International Conference on Signal, Networks, Computing, and Systems (ICSNCS 2016) held at Jawaharlal Nehru University, New Delhi, India during February 25–27, 2016. The book is organized into two volumes and primarily focuses on theory and applications in the broad areas of communication technology, computer science, and information security. The book aims to bring together the latest scientific research by academic scientists, professors, research scholars, and students in the areas of signal, networks, computing, and systems, detailing the practical challenges encountered and the solutions adopted.

  5. Dynamical Systems Theory for Transparent Symbolic Computation in Neuronal Networks

    OpenAIRE

    Carmantini, Giovanni Sirio

    2017-01-01

    In this thesis, we explore the interface between symbolic and dynamical system computation, with particular regard to dynamical system models of neuronal networks. In doing so, we adhere to a definition of computation as the physical realization of a formal system, where we say that a dynamical system performs a computation if a correspondence can be found between its dynamics on a vectorial space and the formal system’s dynamics on a symbolic space. Guided by this definition, we characterize...

  6. CX: A Scalable, Robust Network for Parallel Computing

    Directory of Open Access Journals (Sweden)

    Peter Cappello

    2002-01-01

    Full Text Available CX, a network-based computational exchange, is presented. The system's design integrates variations of ideas from other researchers, such as work stealing, non-blocking tasks, eager scheduling, and space-based coordination. The object-oriented API is simple, compact, and cleanly separates application logic from the logic that supports interprocess communication and fault tolerance. Computations, of course, run to completion in the presence of computational hosts that join and leave the ongoing computation. Such hosts, or producers, use task caching and prefetching to overlap computation with interprocessor communication. To break a potential task server bottleneck, a network of task servers is presented. Even though task servers are envisioned as reliable, the self-organizing, scalable network of n servers, described as a sibling-connected height-balanced fat tree, tolerates a sequence of n-1 server failures. Tasks are distributed throughout the server network via a simple "diffusion" process. CX is intended as a test bed for research on automated silent auctions, reputation services, authentication services, and bonding services. CX also provides a test bed for algorithm research into network-based parallel computation.
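
    A minimal sketch of the work-stealing idea CX builds on, with made-up task lists; CX's actual protocol additionally provides task caching, prefetching, and fault tolerance:

      # Each producer works from the top of its own deque (LIFO) and, when
      # idle, steals the oldest task (FIFO) from a randomly chosen peer.
      import collections
      import random

      def work_steal(task_lists, seed=0):
          """task_lists: one list of zero-argument callables per producer."""
          rng = random.Random(seed)
          deques = [collections.deque(t) for t in task_lists]
          results = []
          while any(deques):
              for w in range(len(deques)):
                  if deques[w]:
                      results.append(deques[w].pop()())          # local LIFO work
                  else:
                      victims = [v for v in range(len(deques))
                                 if v != w and deques[v]]
                      if victims:
                          v = rng.choice(victims)
                          results.append(deques[v].popleft()())  # steal oldest task
          return results

      # Producer 1 starts idle and steals from producer 0.
      print(sum(work_steal([[lambda i=i: i for i in range(4)], []])))  # 6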

  7. Signaling networks: information flow, computation, and decision making.

    Science.gov (United States)

    Azeloglu, Evren U; Iyengar, Ravi

    2015-04-01

    Signaling pathways come together to form networks that connect receptors to many different cellular machines. Such networks not only receive and transmit signals but also process information. The complexity of these networks requires the use of computational models to understand how information is processed and how input-output relationships are determined. Two major computational approaches used to study signaling networks are graph theory and dynamical modeling. Both approaches are useful; network analysis (application of graph theory) helps us understand how the signaling network is organized and what its information-processing capabilities are, whereas dynamical modeling helps us determine how the system changes in time and space upon receiving stimuli. Computational models have helped us identify a number of emergent properties that signaling networks possess. Such properties include ultrasensitivity, bistability, robustness, and noise-filtering capabilities. These properties endow cell-signaling networks with the ability to ignore small or transient signals and/or amplify signals to drive cellular machines that spawn numerous physiological functions associated with different cell states. Copyright © 2015 Cold Spring Harbor Laboratory Press; all rights reserved.

  8. Integrating Network Management for Cloud Computing Services

    Science.gov (United States)

    2015-06-01

    DeviceConfigIsControllable is calculated based on whether the device is powered up, whether the device can be reached via SSH/Telnet from the management network... lines of C# and C++ code, plus a number of internal libraries. At its core, it is a highly-available RESTful web service with persistent storage. Below...

  9. Propagation models for computing biochemical reaction networks

    OpenAIRE

    Henzinger, Thomas A; Mateescu, Maria

    2011-01-01

    We introduce propagation models, a formalism designed to support general and efficient data structures for the transient analysis of biochemical reaction networks. We give two use cases for propagation abstract data types: the uniformization method and numerical integration. We also sketch an implementation of a propagation abstract data type, which uses abstraction to approximate states.
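
    One of the two use cases named above, the uniformization method, can be sketched directly; the toy below computes the transient distribution of a two-state CTMC and is standard uniformization, not the paper's propagation data structures. The generator matrix is invented.

        import numpy as np
        from math import exp, factorial

        def uniformization(Q, p0, t, eps=1e-10):
            """Transient distribution p(t) for generator Q and initial distribution p0."""
            lam = max(-Q[i, i] for i in range(Q.shape[0]))  # uniformization rate
            P = np.eye(Q.shape[0]) + Q / lam                # uniformized DTMC matrix
            result = np.zeros_like(p0, dtype=float)
            term, k, weight_sum = p0.astype(float), 0, 0.0
            while weight_sum < 1.0 - eps:                   # truncate the Poisson sum
                w = exp(-lam * t) * (lam * t) ** k / factorial(k)
                result += w * term
                weight_sum += w
                term = term @ P
                k += 1
            return result

        Q = np.array([[-2.0, 2.0], [1.0, -1.0]])            # toy 2-state generator
        print(uniformization(Q, np.array([1.0, 0.0]), t=0.5))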

  10. Predictive Control of Networked Multiagent Systems via Cloud Computing.

    Science.gov (United States)

    Liu, Guo-Ping

    2017-01-18

    This paper studies the design and analysis of networked multiagent predictive control systems via cloud computing. A cloud predictive control scheme for networked multiagent systems (NMASs) is proposed to achieve consensus and stability simultaneously and to compensate for network delays actively. The design of the cloud predictive controller for NMASs is detailed. The analysis of the cloud predictive control scheme gives the necessary and sufficient conditions of stability and consensus of closed-loop networked multiagent control systems. The proposed scheme is verified to characterize the dynamical behavior and control performance of NMASs through simulations. The outcome provides a foundation for the development of cooperative and coordinative control of NMASs and its applications.
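
    The active delay compensation described above can be illustrated with a minimal sketch: the controller (imagined to run in the cloud) receives a state that is several steps old and rolls a model forward over the delay before computing feedback. The scalar dynamics and gain are invented toy values, not the paper's NMAS design.

        def predictive_control(x_delayed, past_inputs, a=0.9, b=0.5, k_gain=0.8):
            """past_inputs: the controls applied since the state was measured."""
            x_pred = x_delayed
            for u in past_inputs:            # roll the model forward over the delay
                x_pred = a * x_pred + b * u
            return -k_gain * x_pred          # feedback on the predicted current state

        u = predictive_control(x_delayed=1.0, past_inputs=[0.2, -0.1, 0.0])
        print(f"compensated control: {u:.3f}")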

  11. APINetworks: A general API for the treatment of complex networks in arbitrary computational environments

    Science.gov (United States)

    Niño, Alfonso; Muñoz-Caro, Camelia; Reyes, Sebastián

    2015-11-01

    The last decade witnessed a great development of the structural and dynamic study of complex systems described as a network of elements. Such systems can be described as a set of possibly heterogeneous entities or agents (the network nodes) interacting in possibly different ways (defining the network edges). In this context, it is of practical interest to model and handle not only static and homogeneous networks but also dynamic, heterogeneous ones. Depending on the size and type of the problem, these networks may require different computational approaches involving sequential, parallel or distributed systems, with or without the use of disk-based data structures. In this work, we develop an Application Programming Interface (APINetworks) for the modeling and treatment of general networks in arbitrary computational environments. To minimize dependency between components, we decouple the network structure from its function, using different packages for grouping sets of related tasks. The structural package, the one in charge of building and handling the network structure, is the core element of the system. In this work, we focus on this structural component of the API. We apply an object-oriented approach that makes use of inheritance and polymorphism. In this way, we can model static and dynamic networks with heterogeneous elements in the nodes and heterogeneous interactions in the edges. In addition, this approach permits a unified treatment of different computational environments. Tests performed on a C++11 version of the structural package show that, on current standard computers, the system can handle, in main memory, directed and undirected linear networks formed by tens of millions of nodes and edges. Our results compare favorably to those of existing tools.
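
    A toy Python rendering of the object-oriented idea described above: heterogeneous node and edge types behind a common interface via inheritance and polymorphism. The class names are illustrative only and are not the actual APINetworks API (which is C++11).

        class Node:
            def __init__(self, node_id):
                self.id = node_id

        class Agent(Node):                      # a heterogeneous node type
            def __init__(self, node_id, state):
                super().__init__(node_id)
                self.state = state

        class Edge:
            def __init__(self, source, target):
                self.source, self.target = source, target

        class WeightedEdge(Edge):               # a heterogeneous interaction type
            def __init__(self, source, target, weight):
                super().__init__(source, target)
                self.weight = weight

        class Network:
            def __init__(self):
                self.nodes, self.edges = {}, []
            def add_node(self, node):           # polymorphic: any Node subclass
                self.nodes[node.id] = node
            def add_edge(self, edge):           # polymorphic: any Edge subclass
                self.edges.append(edge)

        net = Network()
        net.add_node(Agent("a", state=0))
        net.add_node(Agent("b", state=1))
        net.add_edge(WeightedEdge("a", "b", weight=2.5))
        print(len(net.nodes), len(net.edges))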

  12. Databases and tools for constructing signal transduction networks in cancer.

    Science.gov (United States)

    Nam, Seungyoon

    2017-01-01

    Traditionally, biologists have devoted their careers to studying individual biological entities of their own interest, partly due to the lack of available data regarding those entities. Large, high-throughput data, too complex for conventional processing methods (i.e., "big data"), have accumulated in cancer biology and are freely available in public data repositories. Such challenges urge biologists to inspect their biological entities of interest using novel approaches, beginning with repository data retrieval. Essentially, these revolutionary changes demand new interpretations of huge datasets at a systems level, by so-called "systems biology". One of the representative applications of systems biology is to generate a biological network from high-throughput big data, providing a global map of the molecular events associated with specific phenotype changes. In this review, we introduce the repositories of cancer big data and cutting-edge systems biology tools for network generation and improved identification of therapeutic targets. [BMB Reports 2017; 50(1): 12-19].

  13. Lensfree Computational Microscopy Tools and their Biomedical Applications

    Science.gov (United States)

    Sencan, Ikbal

    Conventional microscopy has been a revolutionary tool for biomedical applications since its invention several centuries ago. The ability to non-destructively observe very fine details of biological objects in real time has made it possible to answer many important questions about their structures and functions. Unfortunately, most of these advanced microscopes are complex, bulky, expensive, and/or hard to operate, so they could not reach beyond the walls of well-equipped laboratories. Recent improvements in optoelectronic components and computational methods allow creating imaging systems that better fulfill the specific needs of clinics or research-related biomedical applications. In this respect, lensfree computational microscopy aims to replace bulky and expensive optical components with compact and cost-effective alternatives through the use of computation, which can be particularly useful for lab-on-a-chip platforms as well as imaging applications in low-resource settings. Several high-throughput on-chip platforms are built with this approach for applications including, but not limited to, cytometry, micro-array imaging, rare cell analysis, telemedicine, and water quality screening. The lack of optical complexity in these lensfree on-chip imaging platforms is compensated for by using computational techniques. These computational methods are utilized for various purposes in coherent, incoherent and fluorescent on-chip imaging platforms, e.g., improving the spatial resolution, undoing the light diffraction without using lenses, localizing objects in a large volume, and retrieving the phase or the color/spectral content of the objects. For instance, pixel super-resolution approaches based on source shifting are used in lensfree imaging platforms to prevent undersampling, Bayer-pattern, and aliasing artifacts. Another method, iterative phase retrieval, is utilized to compensate for the lack of lenses by undoing the diffraction and removing the twin-image noise of in-line holograms
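
    As an illustration of the iterative phase retrieval mentioned above, the Python sketch below runs a Gerchberg-Saxton-style loop between two planes linked by a plain FFT, enforcing the known amplitude in each. Real lensfree pipelines propagate in-line holograms with an angular-spectrum kernel, which this toy omits; the test object is synthetic.

        import numpy as np

        def phase_retrieval(amp_object, amp_fourier, iterations=100):
            rng = np.random.default_rng(0)
            field = amp_object * np.exp(1j * rng.uniform(0, 2 * np.pi, amp_object.shape))
            for _ in range(iterations):
                spectrum = np.fft.fft2(field)
                spectrum = amp_fourier * np.exp(1j * np.angle(spectrum))  # keep phase
                field = np.fft.ifft2(spectrum)
                field = amp_object * np.exp(1j * np.angle(field))         # keep phase
            return np.angle(field)                                        # phase estimate

        yy, xx = np.mgrid[0:64, 0:64]
        true_phase = 0.5 * np.sin(2 * np.pi * xx / 64)                    # synthetic object
        obj = np.ones((64, 64))
        four = np.abs(np.fft.fft2(obj * np.exp(1j * true_phase)))
        print(phase_retrieval(obj, four).shape)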

  14. Network analytical tool for monitoring global food safety highlights China.

    Directory of Open Access Journals (Sweden)

    Tamás Nepusz

    Full Text Available BACKGROUND: The Beijing Declaration on food safety and security was signed by over fifty countries with the aim of developing comprehensive programs for monitoring food safety and security on behalf of their citizens. Currently, comprehensive systems for food safety and security are absent in many countries, and the systems that are in place have been developed on different principles, allowing poor opportunities for integration. METHODOLOGY/PRINCIPAL FINDINGS: We have developed a user-friendly analytical tool based on network approaches for instant customized analysis of food alert patterns in the European dataset from the Rapid Alert System for Food and Feed. Data taken from alert logs between January 2003 and August 2008 were processed using network analysis to (i) capture complexity, (ii) analyze trends, and (iii) predict possible effects of interventions by identifying patterns of reporting activities between countries. The detector and transgressor relationships are readily identifiable between countries, which are ranked using (i) Google's PageRank algorithm and (ii) the HITS algorithm of Kleinberg. The program identifies Iran, China and Turkey as the transgressors with the largest number of alerts. However, when characterized by impact, counting the transgressor index and the number of countries involved, China predominates as a transgressor country. CONCLUSIONS/SIGNIFICANCE: This study reports the first development of a network analysis approach to inform countries on their transgressor and detector profiles as a user-friendly aid for the adoption of the Beijing Declaration. The ability to instantly access the country-specific components of the several thousand annual reports will enable each country to identify the major transgressors and detectors within its trading network. Moreover, the tool can be used to monitor trading countries for improved detector/transgressor ratios.
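
    The ranking step described above can be reproduced in miniature with the networkx implementations of PageRank and Kleinberg's HITS on a toy detector-to-transgressor alert graph (an edge means the source country reported an alert about the target; the country codes and edges below are invented).

        import networkx as nx

        alerts = [("DE", "CN"), ("UK", "CN"), ("IT", "IR"), ("DE", "TR"), ("FR", "CN")]
        g = nx.DiGraph()
        g.add_edges_from(alerts)

        pagerank = nx.pagerank(g)
        hubs, authorities = nx.hits(g)   # hubs ~ detectors, authorities ~ transgressors

        print(sorted(pagerank.items(), key=lambda kv: -kv[1]))
        print(sorted(authorities.items(), key=lambda kv: -kv[1]))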

  15. Low Computational Complexity Network Coding For Mobile Networks

    DEFF Research Database (Denmark)

    Heide, Janus

    2012-01-01

    -flow coding technique. One of the key challenges of this technique is its inherent computational complexity, which can lead to high computational load and energy consumption, in particular on the mobile platforms that are the target platform in this work. To increase the coding throughput several...... library and will be available for researchers and students in the future. Chapter 1 introduces motivating examples and the state of the art when this work commenced. In Chapter 2 selected publications are presented and how their content is related. Chapter 3 presents the main outcome of the work and briefly...
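
    For reference, the computational kernel whose complexity is at issue can be sketched as follows: in random linear network coding, coded packets are random linear combinations of source packets over a finite field (GF(2) here for brevity; practical systems often use larger fields), and decoding requires Gaussian elimination. This is a generic sketch, not the thesis's specific design.

        import random

        def encode(packets, rng=random):
            """One coded packet: XOR of a random subset of the source packets."""
            coeffs = [rng.randint(0, 1) for _ in packets]
            if not any(coeffs):
                coeffs[0] = 1  # avoid the useless all-zero combination
            payload = bytes(len(packets[0]))
            for c, p in zip(coeffs, packets):
                if c:
                    payload = bytes(a ^ b for a, b in zip(payload, p))
            return coeffs, payload

        sources = [b"\x01\x02", b"\x03\x04", b"\x05\x06"]
        print(encode(sources))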

  16. Computational Tools for the Integrated Design of Advanced Nuclear Reactors

    Directory of Open Access Journals (Sweden)

    Nicholas W. Touran

    2017-08-01

    Full Text Available Advanced nuclear reactors offer safe, clean, and reliable energy at the global scale. The development of such devices relies heavily upon computational models, from the pre-conceptual stages through detailed design, licensing, and operation. An integrated reactor modeling framework that enables seamless communication, coupling, automation, and continuous development brings significant new capabilities and efficiencies to the practice of reactor design. In such a system, key performance metrics (e.g., optimal fuel management, peak cladding temperature in design-basis accidents, levelized cost of electricity) can be explicitly linked to design inputs (e.g., assembly duct thickness, tolerances), enabling an exceptional level of design consistency. Coupled with high-performance computing, thousands of integrated cases can be executed simultaneously to analyze the full system, perform complete sensitivity studies, and efficiently and robustly evaluate various design tradeoffs. TerraPower has developed such a tool—the Advanced Reactor Modeling Interface (ARMI) code system—and has deployed it to support the TerraPower Traveling Wave Reactor design and other innovative energy products currently under development. The ARMI code system employs pre-existing tools with strong pedigrees alongside many new physics and data management modules necessary for innovative design. Verification and validation against previous and new physical measurements, which remain an essential element of any sound design, are being carried out. This paper summarizes the integrated core engineering tools and practices in production at TerraPower.

  17. Development of Computer Science Disciplines - A Social Network Analysis Approach

    CERN Document Server

    Pham, Manh Cuong; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science gives substantial weight to conference publications. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss their papers with peers. Previous work on knowledge mapping focused on the map of all sciences or of a particular domain based on the ISI-published JCR (Journal Citation Report). Although this data covers most of the important journals, it lacks computer science conference and workshop proceedings, which results in an imprecise and incomplete analysis of computer science knowledge. This paper presents an analysis of the computer science knowledge network constructed from all types of publications, aiming at providing a complete view of computer science research. Based on the combination of two important digital libraries (DBLP and CiteSeerX), we study the knowledge network created at the journal/conference level using citation linkage, to identify the development of sub-disciplines. We investiga...

  18. FY 1999 Blue Book: Computing, Information, and Communications: Networked Computing for the 21st Century

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — U.S. research and development (R&D) in computing, communications, and information technologies has enabled unprecedented scientific and engineering advances,...

  19. HEP Computing Tools, Grid and Supercomputers for Genome Sequencing Studies

    Science.gov (United States)

    De, K.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Novikov, A.; Poyda, A.; Tertychnyy, I.; Wenaus, T.

    2017-10-01

    PanDA, the Production and Distributed Analysis workload management system, was developed to address the data processing and analysis challenges of the ATLAS experiment at the LHC. Recently PanDA has been extended to run HEP scientific applications on Leadership Class Facilities and supercomputers. The success of the projects using PanDA beyond HEP and the Grid has drawn attention from other compute-intensive sciences such as bioinformatics. Recent advances in Next Generation Genome Sequencing (NGS) technology have led to increasing streams of sequencing data that need to be processed, analysed and made available for bioinformaticians worldwide. Analysis of genome sequencing data using the popular software pipeline PALEOMIX can take a month even when run on a powerful computing resource. In this paper we describe the adaptation of the PALEOMIX pipeline to run on a distributed computing environment powered by PanDA. To run the pipeline we split input files into chunks which are processed separately on different nodes as separate inputs for PALEOMIX, and finally merge the output files; this is very similar to what is done by ATLAS to process and simulate data. We dramatically decreased the total walltime thanks to automated job (re)submission and brokering within PanDA. Using software tools developed initially for HEP and the Grid can reduce payload execution time for mammoth DNA samples from weeks to days.
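
    A minimal sketch of the split-run-merge pattern described above, with a local function standing in for one PALEOMIX job; in the paper each chunk becomes a separate PanDA job and the outputs are merged at the end. Names and data are placeholders.

        from concurrent.futures import ProcessPoolExecutor

        def split(reads, n_chunks):
            return [reads[i::n_chunks] for i in range(n_chunks)]

        def run_pipeline(chunk):               # stand-in for one PALEOMIX job
            return [r.upper() for r in chunk]

        def scatter_gather(reads, n_chunks=4):
            with ProcessPoolExecutor() as pool:
                results = pool.map(run_pipeline, split(reads, n_chunks))
            merged = []
            for part in results:               # final merge step
                merged.extend(part)
            return merged

        if __name__ == "__main__":
            print(scatter_gather(["acgt", "ttga", "ccat", "gggc", "atat"]))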

  20. Social Networking Tools for Informal Scholarly Communication Prove Popular for Academics at Two Universities

    Directory of Open Access Journals (Sweden)

    Aoife Lawton

    2016-04-01

    Full Text Available Objective – To investigate the adoption, use, perceived impact of, and barriers to using social networking tools for scholarly communication at two universities. Design – Cross-institutional quantitative study using an online survey. Setting – Academics working in the disciplines of the humanities and social sciences at two universities: one in Europe and one in the Middle East. Methods – An online survey was devised based on a previous survey (Al-Aufi, 2007) and informed by relevant research. The survey was piloted by 10 academics at the 2 participating universities. After the pilot it was revised and then circulated to all academics from similar faculties at the two universities. Three follow-up emails were sent to both sets of academics. The data was analyzed using Statistical Package for the Social Sciences (SPSS) software. Descriptive and inferential statistics were analyzed using ANOVA tests. Main Results – The survey achieved a 34% response rate (n=130). The majority of participants were from the university based in the Middle East and were male (70.8%). Most of the responses were from academics under 40 years of age. The use of notebooks was prevalent at both universities. "Notebooks" is used as a term to describe laptops, netbooks, or ultra-book computers. The majority reported use of social networking tools for informal scholarly communication (70.1%), valuing this type of use; 29.9% of respondents reported they do not use social networking tools for this purpose. Barriers were identified as lack of incentive, digital literacy, training, and concerns over Internet security. Among the non-users, barriers included low interest in their use and a perceived lack of relevancy of such tools for scholarly communication. The types of tools used the most were those with social connection functions, such as Facebook and Twitter. The tools used the least were social bookmarking tools. A one-way analysis of variance (ANOVA) test indicated that

  1. Dynamic Defensive Posture for Computer Network Defence

    Science.gov (United States)

    2006-12-01

    algorithms for ranking the severity of attacks on the network and mechanisms for assigning a value to the elements... power outages and social engineering attacks. Because it has such a large knowledge base on which to draw, it can reason very thoroughly about network... service attacks, eavesdropping and sniffing attacks on data in transit, or data tampering; more complex still would be models of social engineering

  2. Characterization and Planning for Computer Network Operations

    Science.gov (United States)

    2010-07-01

    Cell phones, personal computers, laptops, and personal digital assistants represent a small number of the technology-based devices used around the... C. Simpson, editors. Assistive Technology and Artificial Intelligence, Applications in Robotics, User Interfaces and Natural Language Processing... retrieval agents: Experiments with automated web browsing. pages 13–18, 1995. [206] V. A. Siris and F. Papagalou. Application of anomaly detection

  3. Wirelessly powered sensor networks and computational RFID

    CERN Document Server

    2013-01-01

    The Wireless Identification and Sensing Platform (WISP) is the first of a new class of RF-powered sensing and computing systems.  Rather than being powered by batteries, these sensor systems are powered by radio waves that are either deliberately broadcast or ambient.  Enabled by ongoing exponential improvements in the energy efficiency of microelectronics, RF-powered sensing and computing is rapidly moving along a trajectory from impossible (in the recent past), to feasible (today), toward practical and commonplace (in the near future). This book is a collection of key papers on RF-powered sensing and computing systems including the WISP.  Several of the papers grew out of the WISP Challenge, a program in which Intel Corporation donated WISPs to academic applicants who proposed compelling WISP-based projects.  The book also includes papers presented at the first WISP Summit, a workshop held in Berkeley, CA in association with the ACM Sensys conference, as well as other relevant papers. The book provides ...

  4. Six Networks on a Universal Neuromorphic Computing Substrate

    Science.gov (United States)

    Pfeil, Thomas; Grübl, Andreas; Jeltsch, Sebastian; Müller, Eric; Müller, Paul; Petrovici, Mihai A.; Schmuker, Michael; Brüderle, Daniel; Schemmel, Johannes; Meier, Karlheinz

    2013-01-01

    In this study, we present a highly configurable neuromorphic computing substrate and use it for emulating several types of neural networks. At the heart of this system lies a mixed-signal chip, with analog implementations of neurons and synapses and digital transmission of action potentials. Major advantages of this emulation device, which has been explicitly designed as a universal neural network emulator, are its inherent parallelism and high acceleration factor compared to conventional computers. Its configurability allows the realization of almost arbitrary network topologies and the use of widely varied neuronal and synaptic parameters. Fixed-pattern noise inherent to analog circuitry is reduced by calibration routines. An integrated development environment allows neuroscientists to operate the device without any prior knowledge of neuromorphic circuit design. As a showcase for the capabilities of the system, we describe the successful emulation of six different neural networks which cover a broad spectrum of both structure and functionality. PMID:23423583

  5. Computing Path Tables for Quickest Multipaths In Computer Networks

    Energy Technology Data Exchange (ETDEWEB)

    Grimmell, W.C.

    2004-12-21

    We consider the transmission of a message from a source node to a terminal node in a network with n nodes and m links, where the message is divided into parts and each part is transmitted over a different path in a set of paths from the source node to the terminal node. Here each link is characterized by a bandwidth and a delay. The set of paths, together with the transmission rates used for the message, is referred to as a multipath. We present two algorithms that produce a minimum-end-to-end-message-delay multipath path table that, for every message length, specifies a multipath that will achieve the minimum end-to-end delay. The algorithms also generate a function that maps the minimum end-to-end message delay to the message length. The time complexities of the algorithms are O(n^2((n^2/log n) + m) min(D_max, C_max)) and O(nm(C_max + n min(D_max, C_max))) when the link delays and bandwidths are non-negative integers. Here D_max and C_max are, respectively, the maximum link delay and the maximum link bandwidth, both greater than zero.
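
    The underlying delay model is easy to state concretely: sending x units over a path costs the sum of its link delays plus x divided by its bottleneck bandwidth, and a multipath splits the message so the parts finish together. The brute-force split below is for illustration only; the paper's algorithms build complete path tables far more efficiently. All numbers are toy values.

        def path_delay(length, delays, bandwidths):
            """Total link delay plus bottleneck transmission time."""
            return sum(delays) + length / min(bandwidths)

        L = 1000.0                            # message length (toy units)
        p1 = ([2.0, 3.0], [50.0, 40.0])       # path 1: (link delays, link bandwidths)
        p2 = ([1.0], [25.0])                  # path 2

        def split_delay(x):
            """Completion time when x units go on path 1 and L - x on path 2."""
            t1 = path_delay(x, *p1) if x > 0 else 0.0
            t2 = path_delay(L - x, *p2) if x < L else 0.0
            return max(t1, t2)

        best_x = min(range(0, 1001, 10), key=split_delay)
        print(f"best split: {best_x} on path 1, delay {split_delay(best_x):.2f}")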

  6. The Poor Man's Guide to Computer Networks and their Applications

    DEFF Research Database (Denmark)

    Sharp, Robin

    2003-01-01

    These notes for DTU course 02220, Concurrent Programming, give an introduction to computer networks, with focus on the modern Internet. Basic Internet protocols such as IP, TCP and UDP are presented, and two Internet application protocols, SMTP and HTTP, are described in some detail. Techniques for network programming are described, with concrete examples in Java. Techniques considered include simple socket programming, RMI, Corba, and Web services with SOAP.
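
    The notes' concrete examples are in Java; the same simple-socket idea, as a one-shot TCP echo server and client, looks like this in Python:

        import socket, threading, time

        def echo_server(port=5050):
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
                srv.bind(("127.0.0.1", port))
                srv.listen(1)
                conn, _ = srv.accept()
                with conn:
                    conn.sendall(conn.recv(1024))   # echo one message back

        threading.Thread(target=echo_server, daemon=True).start()
        time.sleep(0.2)                             # give the server time to bind
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
            cli.connect(("127.0.0.1", 5050))
            cli.sendall(b"hello")
            print(cli.recv(1024))                   # b'hello'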

  7. Large Scale Evolution of Convolutional Neural Networks Using Volunteer Computing

    OpenAIRE

    Desell, Travis

    2017-01-01

    This work presents a new algorithm called evolutionary exploration of augmenting convolutional topologies (EXACT), which is capable of evolving the structure of convolutional neural networks (CNNs). EXACT is in part modeled after the neuroevolution of augmenting topologies (NEAT) algorithm, with notable exceptions to allow it to scale to large scale distributed computing environments and evolve networks with convolutional filters. In addition to multithreaded and MPI versions, EXACT has been ...

  8. COPERITE: Computer-aided tool for power engineering research, instruction, training and education

    Energy Technology Data Exchange (ETDEWEB)

    Chowdhury, B.H.; Clark, D.E. (Wyoming Univ., Laramie, WY (United States). Dept. of Electrical Engineering)

    1992-11-01

    In this paper a graphics-oriented, primarily PC-based tool for education, research and training in power engineering is introduced. The tool called COPERITE has all user interfaces resident on an IBM-386 microcomputer. Menus and windows are used generously for the interface and attractive graphical representations and displays are used. Application programs that are interfaced are power flow, contingency analysis, economic dispatch, security-constrained dispatch, system stability and fault analysis. These programs are executed on a VAX 8800 computer mainly for speed of execution. Information exchange between the PC and the VAX is made through an ethernet connection which is transparent to the user. Results of execution show up on the graphical front-end accessible to the user. COPERITE has a powerful network editor having the capabilities of adding, deleting, moving and finding symbols with a graphics cursor. Provisions are present for building and using artificial intelligence techniques for system operation enhancement.

  9. Efficient Capacity Computation and Power Optimization for Relay Networks

    CERN Document Server

    Parvaresh, Farzad

    2011-01-01

    The capacity, or approximations to the capacity, of various single-source single-destination relay network models has been characterized in terms of the cut-set upper bound. In principle, a direct computation of this bound requires evaluating the cut capacity over exponentially many cuts. We show that the minimum cut capacity of a relay network under some special assumptions can be cast as a minimization of a submodular function and, as a result, can be computed efficiently. We use this result to show that the capacity, or an approximation to the capacity within a constant gap, for the Gaussian, wireless erasure, and Avestimehr-Diggavi-Tse deterministic relay network models can be computed in polynomial time. We present some empirical results showing that computing constant-gap approximations to the capacity of Gaussian relay networks with around 300 nodes can be done on the order of minutes. For Gaussian networks, cut-set capacities are also functions of the powers assigned to the nodes. We consider a family of power o...
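
    For intuition about the quantity involved, the sketch below computes the min-cut value of a toy single-source single-destination network via networkx max-flow (max-flow equals min-cut). The paper's point is that the relay cut-set bound can be minimized efficiently via submodularity rather than by enumerating cuts; the topology and capacities here are invented.

        import networkx as nx

        g = nx.DiGraph()
        g.add_edge("s", "r1", capacity=3.0)
        g.add_edge("s", "r2", capacity=2.0)
        g.add_edge("r1", "r2", capacity=1.0)
        g.add_edge("r1", "d", capacity=2.5)
        g.add_edge("r2", "d", capacity=2.0)

        cut_value, (side_s, side_d) = nx.minimum_cut(g, "s", "d")
        print(cut_value, sorted(side_s), sorted(side_d))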

  10. Introduction to Naval Hydrodynamics using Advanced Computational and Experimental Tools

    Science.gov (United States)

    Buchholz, James; Carrica, Pablo; Russell, Jae-Eun; Pontarelli, Matthew; Krebill, Austin; Berdon, Randall

    2017-11-01

    An undergraduate certificate program in naval hydrodynamics has been recently established at the University of Iowa. Despite several decades of graduate research in this area, this is the first formal introduction to naval hydrodynamics for University of Iowa undergraduate students. Central to the curriculum are two new courses that emphasize open-ended projects conducted in a novel laboratory/learning community that exposes students to advanced tools in computational and experimental fluid mechanics, respectively. Learning is pursued in a loosely-structured environment in which students work in small groups to conduct simulations and experiments relating to resistance, propulsion, and seakeeping using a revised version of the naval hydrodynamics research flow solver, REX, and a small towing tank. Survey responses indicate that the curriculum and course format has strongly increased student interest in naval hydrodynamics and effectively facilitated depth of student learning. This work was supported by the Office of Naval Research under Award Number N00014-15-1-2448.

  11. Computer games as a tool for language education

    Directory of Open Access Journals (Sweden)

    Ivan Lombardi

    2012-05-01

    Full Text Available When it comes to examining the diffusion of videogames, and of computer games in particular, outside of a recreational context, the use of this peculiar tool for schooling is certainly one of the most interesting subjects an educator could hope for. In fact, according to data collected by myself and a growing number of researchers in the field of education (Egenfeldt-Nielsen, 2006; Felicia, 2009; Wastiau, Kearney & VandenBerghe, 2009; Minoli, 2009; Lombardi, 2012), teachers are actually intrigued by the educational potential of digital games, but have no idea how to harness this latent power and/or cannot work out how to accommodate the medium's specificities in the school curriculum.

  12. Ecoupling server: A tool to compute and analyze electronic couplings.

    Science.gov (United States)

    Cabeza de Vaca, Israel; Acebes, Sandra; Guallar, Victor

    2016-07-05

    Electron transfer processes are often studied through the evaluation and analysis of the electronic coupling (EC). Since most standard QM codes do not readily provide such a measure, additional user-friendly tools to compute and analyze electronic coupling from external wave functions are of high value. The first server to provide a friendly interface for the evaluation and analysis of electronic couplings under two different approximations (FDC and GMH) is presented in this communication. The Ecoupling server accepts inputs from common QM and QM/MM software and provides useful plots to understand and analyze the results easily. The web server has been implemented in CGI-python using Apache and is accessible at http://ecouplingserver.bsc.es. The Ecoupling server is free and open to all users without login. © 2016 Wiley Periodicals, Inc.

  13. Translation Memory and Computer Assisted Translation Tool for Medieval Texts

    Directory of Open Access Journals (Sweden)

    Törcsvári Attila

    2013-05-01

    Full Text Available Translation memories (TMs), as part of Computer Assisted Translation (CAT) tools, support translators in reusing portions of formerly translated text. Fencing books are good candidates for using TMs due to the high number of repeated terms. Medieval texts suffer a number of drawbacks that make even "simple" rewording to the modern version of the same language hard. The difficulties analyzed are: lack of systematic spelling, unusual word orders, and typos in the original. A hypothesis is made and verified that even simple modernization increases legibility and is feasible, and that it is worthwhile to apply translation memories due to the numerous and even extremely long repeated terms. Therefore, methods and algorithms are presented (1) for the automated transcription of medieval texts (when a limited training set is available), and (2) for the collection of repeated patterns. The efficiency of the algorithms is analyzed for recall and precision.
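
    The translation-memory core, looking up the stored segment most similar to a new one, can be sketched with difflib standing in for a CAT tool's fuzzy-match scoring; the segments and "modern renderings" below are invented placeholders.

        from difflib import SequenceMatcher

        memory = {
            "thrust with the long edge": "modern rendering A",
            "strike with the short edge": "modern rendering B",
        }

        def best_match(segment, tm, threshold=0.6):
            scored = [(SequenceMatcher(None, segment, src).ratio(), src, tgt)
                      for src, tgt in tm.items()]
            score, src, tgt = max(scored)
            return (src, tgt, score) if score >= threshold else None

        print(best_match("thrust with the longe edge", memory))  # tolerates the typo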

  14. Building Social Networks with Computer Networks: A New Deal for Teaching and Learning.

    Science.gov (United States)

    Thurston, Thomas

    2001-01-01

    Discusses the role of computer technology and Web sites in expanding social networks. Focuses on the New Deal Network using two examples: (1) uniting a Julia C. Lathrop Housing (Chicago, Illinois) resident with a university professor; and (2) saving the Hugo Gellert art murals at the Seward Park Coop Apartments (New York). (CMK)

  15. Collaborative Tools for e-Participation across Networks: The Comuno Networking Site for Public Governance and Services

    Directory of Open Access Journals (Sweden)

    Michael Kaschesky

    2010-04-01

    Full Text Available This paper presents collaborative tools for public participation across multiple networking sites. The tools are part of the Comuno networking site for public governance and services, which is particularly targeted at the public sector (currently in alpha testing at http://comuno.org. The Broadcast tool allows cross-posting content from Comuno to a wide variety of other networking sites, such as Facebook or Twitter. The UserFeed and TopicFeed tools build RSS feeds from content published by a specific user or under a specific topic. The LifeStream tool gathers a user’s activities across multiple networking sites in the private account section at Comuno. These tools and related aspects of the Comuno networking site are discussed and presented in the context of deliberation and opinion-forming in a Swiss bilingual city.

  16. Development of computer-based analytical tool for assessing physical protection system

    Energy Technology Data Exchange (ETDEWEB)

    Mardhi, Alim, E-mail: alim-m@batan.go.id [National Nuclear Energy Agency Indonesia, (BATAN), PUSPIPTEK area, Building 80, Serpong, Tangerang Selatan, Banten (Indonesia); Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand); Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com [Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand)

    2016-01-22

    Assessment of physical protection system effectiveness is the priority for ensuring the optimum protection against unlawful acts targeting a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to estimate the likelihood of threat scenarios. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that utilizes a network methodological approach for modelling the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and to quantify the probability of effectiveness of the system as a performance measure.
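
    A minimal sketch of that network approach, under invented numbers: give each edge of a toy facility graph a detection probability, and find the most critical adversary path (the one minimizing the cumulative probability of detection) by weighting each edge with -log(1 - p) and running a shortest-path search.

        import math
        import networkx as nx

        g = nx.DiGraph()
        edges = [("outside", "fence", 0.3), ("fence", "door", 0.5),
                 ("outside", "gate", 0.6), ("gate", "door", 0.2),
                 ("door", "target", 0.7)]
        for u, v, p_detect in edges:
            g.add_edge(u, v, w=-math.log(1.0 - p_detect))

        path = nx.shortest_path(g, "outside", "target", weight="w")
        log_sum = nx.shortest_path_length(g, "outside", "target", weight="w")
        print(path, f"P(detection) = {1.0 - math.exp(-log_sum):.3f}")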

  17. Machine learning based Intelligent cognitive network using fog computing

    Science.gov (United States)

    Lu, Jingyang; Li, Lun; Chen, Genshe; Shen, Dan; Pham, Khanh; Blasch, Erik

    2017-05-01

    In this paper, a Cognitive Radio Network (CRN) based on artificial intelligence is proposed to distribute the limited radio spectrum resources more efficiently. The CRN framework can analyze time-sensitive signal data close to the signal source using fog computing with different types of machine learning techniques. Depending on the computational capabilities of the fog nodes, different features and machine learning techniques are chosen to optimize spectrum allocation. Also, the computing nodes send a periodic signal summary, which is much smaller than the original signal, to the cloud, so that the overall system spectrum allocation strategies are dynamically updated. By applying fog computing, the system is more adaptive to the local environment and robust to spectrum changes. As most of the signal data is processed at the fog level, this further strengthens system security by reducing the communication burden of the network.

  18. Service-oriented Software Defined Optical Networks for Cloud Computing

    Science.gov (United States)

    Liu, Yuze; Li, Hui; Ji, Yuefeng

    2017-10-01

    With the development of big data and cloud computing technology, the traditional software-defined network is facing new challenges (e.g., ubiquitous accessibility, higher bandwidth, more flexible management and greater security). This paper proposes a new service-oriented software defined optical network architecture, including a resource layer, a service abstract layer, a control layer and an application layer. We then dwell on the corresponding service providing method. A different service ID is used to identify each service a device can offer. Finally, we experimentally verify that the proposed service providing method can be applied to transmit different services based on the service ID in the service-oriented software defined optical network.

  19. A local area computer network expert system framework

    Science.gov (United States)

    Dominy, Robert

    1987-01-01

    Over the past years an expert system called LANES, designed to detect and isolate faults in the Goddard-wide Hybrid Local Area Computer Network (LACN), was developed. As a result, the need for developing a more generic LACN fault isolation expert system has become apparent. An object-oriented approach was explored to create a set of generic classes, objects, rules, and methods that would be necessary to meet this need. The object classes provide a convenient mechanism for separating high-level information from low-level, network-specific information. This approach yields a framework which can be applied to different network configurations and be easily expanded to meet new needs.

  20. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  1. Test experience on an ultrareliable computer communication network

    Science.gov (United States)

    Abbott, L. W.

    1984-01-01

    The dispersed sensor processing mesh (DSPM) is an experimental, ultra-reliable, fault-tolerant computer communications network that exhibits an organic-like ability to regenerate itself after suffering damage. The regeneration is accomplished by two routines - grow and repair. This paper discusses the DSPM concept for achieving fault tolerance and provides a brief description of the mechanization of both the experiment and the six-node experimental network. The main topic of this paper is the system performance of the growth algorithm contained in the grow routine. The characteristics imbued to DSPM by the growth algorithm are also discussed. Data from an experimental DSPM network and software simulation of larger DSPM-type networks are used to examine the inherent limitation on growth time by the growth algorithm and the relationship of growth time to network size and topology.

  2. Analytical Computation of the Epidemic Threshold on Temporal Networks

    Directory of Open Access Journals (Sweden)

    Eugenio Valdano

    2015-04-01

    Full Text Available The time variation of contacts in a networked system may fundamentally alter the properties of spreading processes and affect the condition for large-scale propagation, as encoded in the epidemic threshold. Despite the great interest in the problem for the physics, applied mathematics, computer science, and epidemiology communities, a full theoretical understanding is still missing and currently limited to the cases where the time-scale separation holds between spreading and network dynamics or to specific temporal network models. We consider a Markov chain description of the susceptible-infectious-susceptible process on an arbitrary temporal network. By adopting a multilayer perspective, we develop a general analytical derivation of the epidemic threshold in terms of the spectral radius of a matrix that encodes both network structure and disease dynamics. The accuracy of the approach is confirmed on a set of temporal models and empirical networks and against numerical results. In addition, we explore how the threshold changes when varying the overall time of observation of the temporal network, so as to provide insights on the optimal time window for data collection of empirical temporal networked systems. Our framework is of both fundamental and practical interest, as it offers novel understanding of the interplay between temporal networks and spreading dynamics.
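
    A compact sketch of that computation: for a discrete-time SIS process with infection probability lam and recovery probability mu on a sequence of adjacency snapshots A_t, large-scale spreading sets in when the spectral radius of the product of the per-step matrices (1 - mu) I + lam A_t exceeds one. The snapshots below are toy matrices.

        import numpy as np

        def propagator_radius(snapshots, lam, mu):
            n = snapshots[0].shape[0]
            P = np.eye(n)
            for A in snapshots:                 # infection propagator over one period
                P = P @ ((1.0 - mu) * np.eye(n) + lam * A)
            return max(abs(np.linalg.eigvals(P)))

        A1 = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
        A2 = np.array([[0, 0, 1], [0, 0, 1], [1, 1, 0]], dtype=float)
        for lam in (0.1, 0.5, 0.9):
            print(lam, propagator_radius([A1, A2], lam, mu=0.5))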

  3. Three computational tools for predicting bacterial essential genes.

    Science.gov (United States)

    Guo, Feng-Biao; Ye, Yuan-Nong; Ning, Lu-Wen; Wei, Wen

    2015-01-01

    Essential genes are those genes indispensable for the survival of any living cell. Bacterial essential genes constitute the cornerstones of synthetic biology and are often attractive targets in the development of antibiotics and vaccines. Because identification of essential genes by wet-lab methods often entails high economic costs and tremendous labor, scientists have turned to the alternative of computational prediction. Aiming to help solve this issue, our research group (CEFG: group of Computational, Comparative, Evolutionary and Functional Genomics, http://cefg.uestc.edu.cn) has constructed three online services to predict essential genes in bacterial genomes. These freely available tools are applicable to single gene sequences without annotated functions, single genes with definite names, and complete genomes of bacterial strains. To ensure reliable predictions, the investigated species should belong to the same family (for EGP) or phylum (for CEG_Match and Geptop) as one of the reference species, respectively. As the pilot software for the issue, their predictive accuracies have been assessed and compared with existing algorithms; note that none of the other published algorithms offers an online service. We hope these services at CEFG will help scientists and researchers in the field of essential genes.

  4. Propagation of computer virus both across the Internet and external computers: A complex-network approach

    Science.gov (United States)

    Gan, Chenquan; Yang, Xiaofan; Liu, Wanping; Zhu, Qingyi; Jin, Jian; He, Li

    2014-08-01

    Based on the assumption that external computers (particularly, infected external computers) are connected to the Internet, and by considering the influence of the Internet topology on computer virus spreading, this paper establishes a novel computer virus propagation model with a complex-network approach. This model possesses a unique (viral) equilibrium which is globally attractive. Some numerical simulations are also given to illustrate this result. Further study shows that the computers with higher node degrees are more susceptible to infection than those with lower node degrees. In this regard, some appropriate protective measures are suggested.
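
    The degree-dependence result can be reproduced qualitatively in a few lines: simulate a toy SIS-like virus on a scale-free (Barabási-Albert) graph and compare how often high- and low-degree nodes are infected. Rates and sizes are arbitrary toy values, not the paper's model parameters.

        import random
        import networkx as nx

        g = nx.barabasi_albert_graph(500, 2, seed=1)
        infected = {random.randrange(500)}
        counts = {v: 0 for v in g}

        for _ in range(200):
            new = set(infected)
            for v in infected:
                for nb in g[v]:
                    if random.random() < 0.05:   # per-contact infection probability
                        new.add(nb)
                if random.random() < 0.10:       # cure probability
                    new.discard(v)
            infected = new
            for v in infected:
                counts[v] += 1

        by_degree = sorted(g, key=g.degree, reverse=True)
        top, bottom = by_degree[:50], by_degree[-50:]
        print(sum(counts[v] for v in top) / 50, sum(counts[v] for v in bottom) / 50)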

  5. Identifying failure in a tree network of a parallel computer

    Science.gov (United States)

    Archer, Charles J.; Pinnow, Kurt W.; Wallenfelt, Brian P.

    2010-08-24

    Methods, parallel computers, and products are provided for identifying failure in a tree network of a parallel computer. The parallel computer includes one or more processing sets including an I/O node and a plurality of compute nodes. For each processing set embodiments include selecting a set of test compute nodes, the test compute nodes being a subset of the compute nodes of the processing set; measuring the performance of the I/O node of the processing set; measuring the performance of the selected set of test compute nodes; calculating a current test value in dependence upon the measured performance of the I/O node of the processing set, the measured performance of the set of test compute nodes, and a predetermined value for I/O node performance; and comparing the current test value with a predetermined tree performance threshold. If the current test value is below the predetermined tree performance threshold, embodiments include selecting another set of test compute nodes. If the current test value is not below the predetermined tree performance threshold, embodiments include selecting from the test compute nodes one or more potential problem nodes and testing individually potential problem nodes and links to potential problem nodes.

  6. Regional Computation of TEC Using a Neural Network Model

    Science.gov (United States)

    Leandro, R. F.; Santos, M. C.

    2004-05-01

    One of the main sources of error in GPS measurements is ionospheric refraction. As a dispersive medium, the ionosphere allows its influence to be computed by using dual frequency receivers. In the case of single frequency receivers it is necessary to use models that tell us how large the ionospheric refraction is. The GPS broadcast message carries parameters of such a model, namely the Klobuchar model. Dual frequency receivers allow estimating the influence of the ionosphere on the GPS signal by the computation of TEC (Total Electron Content) values, which have a direct relationship with the magnitude of the delay caused by the ionosphere. One alternative is to create a regional model based on a network of dual frequency receivers. In this case, the regional behaviour of the ionosphere is modelled in such a way that it is possible to estimate TEC values in or near this region. Such a regional model can be based on polynomials, for example. In this work we present a Neural Network-based model for the regional computation of TEC. The advantage of using a Neural Network is that it is not necessary to have great knowledge of the behaviour of the modelled surface, due to the adaptation capability of the neural network training process, which is an iterative adjustment of the synaptic weights as a function of residuals, using the training parameters. Therefore, previous knowledge of the modelled phenomena is important to define what kind of and how many parameters are needed to train the neural network so that reasonable results are obtained from the estimations. We have used data from the GPS tracking network in Brazil, and we have tested the accuracy of the new model at all locations where there is a station, assessing the efficiency of the model everywhere. TEC values were computed for each station of the network. After that, the training parameter data set for the test station was formed with the TEC values of all the others (all stations except the test one). The Neural Network was
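
    A minimal sketch of the regression set-up described above, with a small scikit-learn multilayer perceptron trained on synthetic (latitude, longitude, hour)-to-TEC data standing in for the Brazilian network observations; the held-out rows play the role of the test station.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        X = rng.uniform([-30, -60, 0], [5, -35, 24], size=(500, 3))  # lat, lon, hour
        y = 20 + 10 * np.sin(np.pi * X[:, 2] / 24) + 0.1 * X[:, 0] \
            + rng.normal(0, 1, 500)                                  # synthetic TEC

        model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
        model.fit(X[:450], y[:450])                  # "all other stations"
        print(model.score(X[450:], y[450:]))         # skill at the held-out station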

  7. Design, Implementation and Optimization of Innovative Internet Access Networks, based on Fog Computing and Software Defined Networking

    OpenAIRE

    Iotti, Nicola

    2017-01-01

    1. DESIGN In this dissertation we introduce a new approach to Internet access networks in public spaces, such as the Wi-Fi networks commonly known as hotspots, based on Fog Computing (or Edge Computing), Software Defined Networking (SDN) and the deployment of Virtual Machines (VMs) and Linux containers on the edge of the network. In this vision we deploy specialized network elements, called Fog Nodes, on the edge of the network, able to virtualize the physical infrastructure and expose APIs to e...

  8. Small-world networks in neuronal populations: a computational perspective.

    Science.gov (United States)

    Zippo, Antonio G; Gelsomino, Giuliana; Van Duin, Pieter; Nencini, Sara; Caramenti, Gian Carlo; Valente, Maurizio; Biella, Gabriele E M

    2013-08-01

    The analysis of the brain in terms of integrated neural networks may offer insights on the reciprocal relation between structure and information processing. Even with inherent technical limits, many studies acknowledge neuron spatial arrangements and communication modes as key factors. In this perspective, we investigated the functional organization of neuronal networks by explicitly assuming a specific functional topology, the small-world network. We developed two different computational approaches. Firstly, we asked whether neuronal populations actually express small-world properties during a definite task, such as a learning task. For this purpose we developed the Inductive Conceptual Network (ICN), which is a hierarchical bio-inspired spiking network, capable of learning invariant patterns by using variable-order Markov models implemented in its nodes. As a result, we actually observed small-world topologies during learning in the ICN. Speculating that the expression of small-world networks is not solely related to learning tasks, we then built a de facto network assuming that the information processing in the brain may occur through functional small-world topologies. In this de facto network, synchronous spikes reflected functional small-world network dependencies. In order to verify the consistency of the assumption, we tested the null-hypothesis by replacing the small-world networks with random networks. As a result, only small world networks exhibited functional biomimetic characteristics such as timing and rate codes, conventional coding strategies and neuronal avalanches, which are cascades of bursting activities with a power-law distribution. Our results suggest that small-world functional configurations are liable to underpin brain information processing at neuronal level. Copyright © 2013 Elsevier Ltd. All rights reserved.
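
    The small-world property assumed above is easy to check numerically: a Watts-Strogatz graph keeps clustering high while a little rewiring collapses path lengths relative to a random graph of the same size. The networkx sketch below uses arbitrary sizes.

        import networkx as nx

        def metrics(g):
            giant = g.subgraph(max(nx.connected_components(g), key=len))
            return (round(nx.average_clustering(giant), 3),
                    round(nx.average_shortest_path_length(giant), 2))

        ws = nx.watts_strogatz_graph(500, k=10, p=0.1, seed=1)
        er = nx.gnm_random_graph(500, ws.number_of_edges(), seed=1)
        print("small-world:", metrics(ws))   # high clustering, short paths
        print("random:     ", metrics(er))   # low clustering, short paths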

  9. DANNP: an efficient artificial neural network pruning tool

    Directory of Open Access Journals (Sweden)

    Mona Alshahrani

    2017-11-01

    Full Text Available Background Artificial neural networks (ANNs) are a robust class of machine learning models and are a frequent choice for solving classification problems. However, determining the structure of the ANNs is not trivial as a large number of weights (connection links) may lead to overfitting the training data. Although several ANN pruning algorithms have been proposed for the simplification of ANNs, these algorithms are not able to efficiently cope with intricate ANN structures required for complex classification problems. Methods We developed DANNP, a web-based tool, that implements parallelized versions of several ANN pruning algorithms. The DANNP tool uses a modified version of the Fast Compressed Neural Network software implemented in C++ to considerably enhance the running time of the ANN pruning algorithms we implemented. In addition to the performance evaluation of the pruned ANNs, we systematically compared the set of features that remained in the pruned ANN with those obtained by different state-of-the-art feature selection (FS) methods. Results Although the ANN pruning algorithms are not entirely parallelizable, DANNP was able to speed up the ANN pruning up to eight times on a 32-core machine, compared to the serial implementations. To assess the impact of the ANN pruning by the DANNP tool, we used 16 datasets from different domains. In eight out of the 16 datasets, DANNP significantly reduced the number of weights by 70%–99%, while maintaining a competitive or better model performance compared to the unpruned ANN. Finally, we used a naïve Bayes classifier derived with the features selected as a byproduct of the ANN pruning and demonstrated that its accuracy is comparable to those obtained by the classifiers trained with the features selected by several state-of-the-art FS methods. The FS ranking methodology proposed in this study allows the users to identify the most discriminant features of the problem at hand. To the best of our knowledge

  10. DANNP: an efficient artificial neural network pruning tool

    KAUST Repository

    Alshahrani, Mona

    2017-11-06

    Background Artificial neural networks (ANNs) are a robust class of machine learning models and are a frequent choice for solving classification problems. However, determining the structure of the ANNs is not trivial as a large number of weights (connection links) may lead to overfitting the training data. Although several ANN pruning algorithms have been proposed for the simplification of ANNs, these algorithms are not able to efficiently cope with intricate ANN structures required for complex classification problems. Methods We developed DANNP, a web-based tool, that implements parallelized versions of several ANN pruning algorithms. The DANNP tool uses a modified version of the Fast Compressed Neural Network software implemented in C++ to considerably enhance the running time of the ANN pruning algorithms we implemented. In addition to the performance evaluation of the pruned ANNs, we systematically compared the set of features that remained in the pruned ANN with those obtained by different state-of-the-art feature selection (FS) methods. Results Although the ANN pruning algorithms are not entirely parallelizable, DANNP was able to speed up the ANN pruning up to eight times on a 32-core machine, compared to the serial implementations. To assess the impact of the ANN pruning by DANNP tool, we used 16 datasets from different domains. In eight out of the 16 datasets, DANNP significantly reduced the number of weights by 70%–99%, while maintaining a competitive or better model performance compared to the unpruned ANN. Finally, we used a naïve Bayes classifier derived with the features selected as a byproduct of the ANN pruning and demonstrated that its accuracy is comparable to those obtained by the classifiers trained with the features selected by several state-of-the-art FS methods. The FS ranking methodology proposed in this study allows the users to identify the most discriminant features of the problem at hand. To the best of our knowledge, DANNP (publicly
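
    A generic illustration of the pruning idea: zero out the smallest-magnitude weights of a toy weight matrix and count what survives. Magnitude-based pruning is a simple representative approach and not necessarily one of the algorithms DANNP implements.

        import numpy as np

        def prune_by_magnitude(weights, fraction):
            """Zero out the smallest-magnitude fraction of the weights."""
            w = weights.copy()
            cutoff = np.quantile(np.abs(w), fraction)
            w[np.abs(w) < cutoff] = 0.0
            return w

        rng = np.random.default_rng(0)
        W = rng.normal(size=(8, 8))
        Wp = prune_by_magnitude(W, 0.7)
        print(f"kept {np.count_nonzero(Wp)} of {W.size} weights")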

  11. Teaching Students How to Integrate and Assess Social Networking Tools in Marketing Communications

    Science.gov (United States)

    Schlee, Regina Pefanis; Harich, Katrin R.

    2013-01-01

    This research is based on two studies that focus on teaching students how to integrate and assess social networking tools in marketing communications. Study 1 examines how students in marketing classes utilize social networking tools and explores their attitudes regarding the use of such tools for marketing communications. Study 2 focuses on an…

  12. REMOD: a computational tool for remodeling neuronal dendrites

    Directory of Open Access Journals (Sweden)

    Panagiotis Bozelos

    2014-05-01

    Full Text Available In recent years, several modeling studies have indicated that dendritic morphology is a key determinant of how individual neurons acquire a unique signal processing profile. The highly branched dendritic structure that originates from the cell body explores the surrounding 3D space in a fractal-like manner, until it reaches a certain amount of complexity. Its shape undergoes significant alterations not only in various neuropathological conditions, but in physiological ones, too. Yet, despite the profound effect that these alterations can have on neuronal function, the causal relationship between structure and function remains largely elusive. The lack of a systematic approach for remodeling neuronal cells and their dendritic trees is a key limitation that contributes to this problem. In this context, we developed a computational tool that allows the remodeling of any type of neuron, given a set of exemplar morphologies. The tool is written in Python and provides a simple GUI that guides the user through various options to manipulate selected neuronal morphologies. It provides the ability to load one or more morphology files (.swc or .hoc) and choose specific dendrites on which to perform one of the following actions: shrink, remove, extend or branch (as shown in Figure 1). The user retains complete control over the extent of each alteration and, if a chosen action is not possible due to pre-existing structural constraints, appropriate warnings are produced. Importantly, the tool can also be used to extract morphology statistics for one or multiple morphologies, including features such as the total dendritic length, path length to the root, branch order, diameter tapering, etc. Finally, an experimental utility enables the user to remodel entire dendritic trees based on preloaded statistics from a database of cell-type specific neuronal morphologies. To our knowledge, this is the first tool that allows (a) the remodeling of existing –as opposed to the de novo
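
    The statistics-extraction side mentioned above can be sketched by parsing SWC rows (id, type, x, y, z, radius, parent) and summing segment lengths into a total dendritic length; the three-point morphology below is a made-up toy, not a real reconstruction.

        import math

        swc_rows = [
            (1, 1, 0.0, 0.0, 0.0, 5.0, -1),   # soma
            (2, 3, 10.0, 0.0, 0.0, 1.0, 1),   # dendrite point, parent = soma
            (3, 3, 10.0, 8.0, 0.0, 1.0, 2),
        ]

        points = {r[0]: r for r in swc_rows}
        total = 0.0
        for rid, rtype, x, y, z, rad, parent in swc_rows:
            if parent != -1 and rtype == 3:   # type 3 = basal dendrite in SWC
                px, py, pz = points[parent][2:5]
                total += math.dist((x, y, z), (px, py, pz))
        print(f"total dendritic length: {total:.1f} um")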

  13. Web based educational tool for neural network robot control

    Directory of Open Access Journals (Sweden)

    Jure Čas

    2007-05-01

    Full Text Available Abstract— This paper describes an application for teleoperation of the SCARA robot via the internet. The SCARA robot is used by students of mechatronics at the University of Maribor as a remote educational tool. The developed software consists of two parts, i.e. the continuous neural network sliding mode controller (CNNSMC) and the graphical user interface (GUI). The application is based on two well-known commercially available software packages, i.e. MATLAB/Simulink and LabVIEW. Matlab/Simulink and the DSP2 Library for Simulink are used for control algorithm development, simulation and executable code generation. While this code executes on the DSP-2 Roby controller and drives the real process through the analog and digital I/O lines, a LabVIEW virtual instrument (VI), running on the PC, is used as the user front end. The LabVIEW VI provides the ability for on-line parameter tuning, signal monitoring and on-line analysis, and, via Remote Panels technology, also teleoperation. The main advantage of a CNNSMC is the exploitation of its self-learning capability. When friction or an unexpected impediment occurs, for example, the user of a remote application has no information about the changed robot dynamics and is thus unable to deal with it manually. This is no longer a control problem because, when a CNNSMC is used, any approximation of the changed robot dynamics is estimated independently of the remote user. Index Terms—LabVIEW; Matlab/Simulink; Neural network control; remote educational tool; robotics

  14. A computational study of routing algorithms for realistic transportation networks

    Energy Technology Data Exchange (ETDEWEB)

    Jacob, R.; Marathe, M.V.; Nagel, K.

    1998-12-01

    The authors carry out an experimental analysis of a number of shortest path (routing) algorithms investigated in the context of the TRANSIMS (Transportation Analysis and Simulation System) project. The main focus of the paper is to study how various heuristic and exact solutions and associated data structures affected the computational performance of software developed especially for realistic transportation networks. For this purpose the authors used the Dallas–Fort Worth road network with a very high degree of resolution. The following general results are obtained: (1) they discuss and experimentally analyze various one-to-one shortest path algorithms, which include classical exact algorithms studied in the literature as well as heuristic solutions that are designed to take into account the geometric structure of the input instances; (2) they describe a number of extensions to the basic shortest path algorithm. These extensions were primarily motivated by practical problems arising in TRANSIMS and ITS (Intelligent Transportation Systems) related technologies. Extensions discussed include: (i) time-dependent networks, (ii) multi-modal networks, (iii) networks with public transportation and associated schedules. Computational results are provided to empirically compare the efficiency of various algorithms. The studies indicate that a modified Dijkstra's algorithm is computationally fast and an excellent candidate for use in various transportation planning applications as well as ITS related technologies.
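    For reference, the sketch below is a compact Python version of the classical exact one-to-one algorithm the study takes as its baseline; the adjacency-dict graph encoding, the early exit at the target and the toy road graph are illustrative choices, not TRANSIMS code.

        import heapq

        def dijkstra(graph, source, target):
            """graph: {node: [(neighbor, weight), ...]} -> shortest distance."""
            dist = {source: 0.0}
            heap = [(0.0, source)]
            while heap:
                d, u = heapq.heappop(heap)
                if u == target:
                    return d            # early exit for a one-to-one query
                if d > dist.get(u, float('inf')):
                    continue            # stale heap entry
                for v, w in graph.get(u, []):
                    nd = d + w
                    if nd < dist.get(v, float('inf')):
                        dist[v] = nd
                        heapq.heappush(heap, (nd, v))
            return float('inf')         # target unreachable

        roads = {'A': [('B', 4.0), ('C', 1.0)], 'C': [('B', 2.0)], 'B': []}
        print(dijkstra(roads, 'A', 'B'))  # 3.0 via C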

  15. Improving a Computer Networks Course Using the Partov Simulation Engine

    Science.gov (United States)

    Momeni, B.; Kharrazi, M.

    2012-01-01

    Computer networks courses are hard to teach as there are many details in the protocols and techniques involved that are difficult to grasp. Employing programming assignments as part of the course helps students to obtain a better understanding and gain further insight into the theoretical lectures. In this paper, the Partov simulation engine and…

  16. Fish species recognition using computer vision and a neural network

    NARCIS (Netherlands)

    Storbeck, F.; Daan, B.

    2001-01-01

    A system is described to recognize fish species by computer vision and a neural network program. The vision system measures a number of features of fish as seen by a camera perpendicular to a conveyor belt. The features used here are the widths and heights at various locations along the fish. First

  17. Computing Nash Equilibrium in Wireless Ad Hoc Networks

    DEFF Research Database (Denmark)

    Bulychev, Peter E.; David, Alexandre; Larsen, Kim G.

    2012-01-01

    This paper studies the problem of computing Nash equilibrium in wireless networks modeled by Weighted Timed Automata. Such formalism comes together with a logic that can be used to describe complex features such as timed energy constraints. Our contribution is a method for solving this problem...

  18. High Performance Computing and Networking for Science--Background Paper.

    Science.gov (United States)

    Congress of the U.S., Washington, DC. Office of Technology Assessment.

    The Office of Technology Assessment is conducting an assessment of the effects of new information technologies--including high performance computing, data networking, and mass data archiving--on research and development. This paper offers a view of the issues and their implications for current discussions about Federal supercomputer initiatives…

  19. An Analysis of Attitudes toward Computer Networks and Internet Addiction.

    Science.gov (United States)

    Tsai, Chin-Chung; Lin, Sunny S. J.

    The purpose of this study was to explore the interplay between young people's attitudes toward computer networks and Internet addiction. After analyzing questionnaire responses of an initial sample of 615 Taiwanese high school students, 78 subjects, viewed as possible Internet addicts, were selected for further explorations. It was found that…

  20. A Three-Dimensional Computational Model of Collagen Network Mechanics

    Science.gov (United States)

    Lee, Byoungkoo; Zhou, Xin; Riching, Kristin; Eliceiri, Kevin W.; Keely, Patricia J.; Guelcher, Scott A.; Weaver, Alissa M.; Jiang, Yi

    2014-01-01

    Extracellular matrix (ECM) strongly influences cellular behaviors, including cell proliferation, adhesion, and particularly migration. In cancer, the rigidity of the stromal collagen environment is thought to control tumor aggressiveness, and collagen alignment has been linked to tumor cell invasion. While the mechanical properties of collagen at both the single fiber scale and the bulk gel scale are quite well studied, how the fiber network responds to local stress or deformation, both structurally and mechanically, is poorly understood. This intermediate scale knowledge is important to understanding cell-ECM interactions and is the focus of this study. We have developed a three-dimensional elastic collagen fiber network model (bead-and-spring model) and studied fiber network behaviors for various biophysical conditions: collagen density, crosslinker strength, crosslinker density, and fiber orientation (random vs. prealigned). We found the best-fit crosslinker parameter values using shear simulation tests in a small strain region. Using this calibrated collagen model, we simulated both shear and tensile tests in a large linear strain region for different network geometry conditions. The results suggest that network geometry is a key determinant of the mechanical properties of the fiber network. We further demonstrated how the fiber network structure and mechanics evolve with a local deformation, mimicking the effect of pulling by a pseudopod during cell migration. Our computational fiber network model is a step toward a full biomechanical model of cellular behaviors in various ECM conditions. PMID:25386649

  1. The nitrogen footprint tool network: a multi-institution program ...

    Science.gov (United States)

    Anthropogenic sources of reactive nitrogen have local and global impacts on air and water quality and detrimental effects on human and ecosystem health. This paper uses the nitrogen footprint tool (NFT) to determine the amount of nitrogen (N) released as a result of institutional consumption. The sectors accounted for include food (consumption and the upstream production), energy, transportation, fertilizer, research animals, and agricultural research. The NFT is then used for scenario analysis to manage and track reductions to institution N footprints, which are driven by the consumption behaviors of both the institution itself and its constituent individuals. In this paper, the first seven institution N footprint results are presented. The institution NFT network aims to develop footprints for many institutions to encourage widespread upper-level management strategies that will create significant reductions in reactive N released to the environment. Energy use and food purchases are the two largest contributors to institution N footprints. Ongoing efforts by institutions to reduce greenhouse gas emissions also help to reduce the N footprint, but the impact of food production on N pollution has not been directly addressed by the higher-ed sustainability community. The NFT Network found that institutions could reduce their N footprints by optimizing food purchasing to reduce consumption of animal products and minimize food waste, as well as reducing dependence o

  2. Computer network time synchronization the network time protocol on earth and in space

    CERN Document Server

    Mills, David L

    2010-01-01

    Carefully coordinated, reliable, and accurate time synchronization is vital to a wide spectrum of fields-from air and ground traffic control, to buying and selling goods and services, to TV network programming. Ill-gotten time could even lead to the unimaginable and cause DNS caches to expire, leaving the entire Internet to implode on the root servers.Written by the original developer of the Network Time Protocol (NTP), Computer Network Time Synchronization: The Network Time Protocol on Earth and in Space, Second Edition addresses the technological infrastructure of time dissemination, distrib
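    As a hedged taste of the protocol the book covers, the sketch below sends a minimal SNTP mode-3 query in Python and decodes the server's transmit timestamp; the server name is an assumption, and round-trip delay correction and NTP's full clock discipline are deliberately omitted.

        import socket, struct, time

        NTP_EPOCH_OFFSET = 2208988800  # seconds between 1900-01-01 and 1970-01-01

        def sntp_time(server='pool.ntp.org'):
            """Return the server's transmit time as a Unix timestamp (uncorrected)."""
            packet = b'\x1b' + 47 * b'\x00'  # LI=0, VN=3, Mode=3 (client)
            with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
                s.settimeout(2.0)
                s.sendto(packet, (server, 123))
                data, _ = s.recvfrom(512)
            secs = struct.unpack('!I', data[40:44])[0]  # transmit timestamp, integer part
            return secs - NTP_EPOCH_OFFSET

        print(time.ctime(sntp_time()))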

  3. Cloud computing: An innovative tool for library services

    OpenAIRE

    Sahu, R.

    2015-01-01

    Cloud computing is a new technique of information and communication technology, offering potential benefits such as reduced cost and accessibility anywhere at any time, as well as elasticity and flexibility. This paper defines cloud computing, its essential characteristics, models of cloud computing and components of the cloud, discusses the advantages and drawbacks of cloud computing, and describes cloud computing in libraries.

  4. Computation emerges from adaptive synchronization of networking neurons.

    Directory of Open Access Journals (Sweden)

    Massimiliano Zanin

    Full Text Available The activity of networking neurons is largely characterized by the alternation of synchronous and asynchronous spiking sequences. One of the most relevant challenges that scientists are facing today is, then, relating that evidence with the fundamental mechanisms through which the brain computes and processes information, as well as with the arousal (or progress) of a number of neurological illnesses. In other words, the problem is how to associate an organized dynamics of interacting neural assemblies with a computational task. Here we show that computation can be seen as a feature emerging from the collective dynamics of an ensemble of networking neurons, which interact by means of adaptive dynamical connections. Namely, by associating logical states with synchronous neuronal dynamics, we show how the usual Boolean logic can be fully recovered, and a universal Turing machine can be constructed. Furthermore, we show that, besides static binary gates, a wider class of logical operations can be efficiently constructed as the fundamental computational elements interact within an adaptive network, each operation being represented by a specific motif. Our approach qualitatively differs from past attempts to encode information and compute with complex systems, where computation was instead the consequence of applying control loops that enforce a desired state into the specific system's dynamics. Being the result of an emergent process, the computation mechanism described here is not limited to a binary Boolean logic, but can involve a much larger number of states. As such, our results can enlighten new concepts for the understanding of the real computing processes taking place in the brain.

  5. Synchronization-based computation through networks of coupled oscillators

    Directory of Open Access Journals (Sweden)

    Daniel eMalagarriga

    2015-08-01

    Full Text Available The mesoscopic activity of the brain is strongly dynamical, while at the same time exhibiting remarkable computational capabilities. In order to examine how these two features coexist, here we show that the patterns of synchronized oscillations displayed by networks of neural mass models, representing cortical columns, can be used as substrates for Boolean computation. Our results reveal that different logical operations can be implemented by the same neural mass network at different times, following the dynamics of the input. The results are reproduced experimentally with electronic circuits of coupled Chua oscillators, showing the robustness of this kind of computation to the intrinsic noise and parameter mismatch of the oscillators responsible for the functioning of the gates. We also show that the information-processing capabilities of coupled oscillations go beyond the simple juxtaposition of logic gates.
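    As a hedged illustration of the synchronization substrate, the sketch below Euler-integrates the classic Kuramoto model of coupled phase oscillators in Python and reports the order parameter r as a synchrony measure; the all-to-all coupling and parameter values are textbook simplifications, not the neural mass or Chua circuit models used in the paper.

        import numpy as np

        def kuramoto_order(n=50, coupling=2.0, dt=0.01, steps=5000, seed=0):
            """Integrate dtheta_i/dt = w_i + (K/n) * sum_j sin(theta_j - theta_i)."""
            rng = np.random.default_rng(seed)
            theta = rng.uniform(0.0, 2.0 * np.pi, n)
            omega = rng.normal(0.0, 0.5, n)    # heterogeneous natural frequencies
            for _ in range(steps):
                phase_diff = theta[None, :] - theta[:, None]
                theta += dt * (omega + coupling / n * np.sin(phase_diff).sum(axis=1))
            return abs(np.exp(1j * theta).mean())  # r = 1 means full synchrony

        print(kuramoto_order(coupling=0.1))  # weak coupling: r stays small
        print(kuramoto_order(coupling=4.0))  # strong coupling: r close to 1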

  6. Validation of RetroPath, a computer-aided design tool for metabolic pathway engineering.

    Science.gov (United States)

    Fehér, Tamás; Planson, Anne-Gaëlle; Carbonell, Pablo; Fernández-Castané, Alfred; Grigoras, Ioana; Dariy, Ekaterina; Perret, Alain; Faulon, Jean-Loup

    2014-11-01

    Metabolic engineering has succeeded in the biosynthesis of numerous commodity or high-value compounds. However, the choice of pathways and enzymes used for production was often made ad hoc, or required expert knowledge of the specific biochemical reactions. In order to rationalize the process of engineering producer strains, we developed the computer-aided design (CAD) tool RetroPath that explores and enumerates metabolic pathways connecting the endogenous metabolites of a chassis cell to the target compound. To experimentally validate our tool, we constructed 12 top-ranked enzyme combinations producing the flavonoid pinocembrin, four of which displayed significant yields. Namely, our tool queried the enzymes found in metabolic databases based on their annotated and predicted activities. Next, it ranked pathways based on the predicted efficiency of the available enzymes, the toxicity of the intermediate metabolites and the calculated maximum product flux. To implement the top-ranking pathway, our procedure narrowed down a list of nine million possible enzyme combinations to 12, a number easily assembled and tested. One round of metabolic network optimization based on RetroPath output further increased pinocembrin titers 17-fold. In total, 12 out of the 13 enzymes tested in this work displayed a relative performance that was in accordance with their predicted scores. These results validate the ranking function of our CAD tool, and open the way to its utilization in the biosynthesis of novel compounds. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. User-friendly Tool for Power Flow Analysis and Distributed Generation Optimisation in Radial Distribution Networks

    Directory of Open Access Journals (Sweden)

    M. F. Akorede

    2017-06-01

    Full Text Available The intent of power distribution companies (DISCOs) is to deliver electric power to their customers in an efficient and reliable manner – with minimal energy loss cost. One major way to minimise power loss on a given power system is to install distributed generation (DG) units on the distribution networks. However, to maximise benefits, it is highly crucial for a DISCO to ensure that these DG units are of optimal size and sited in the best locations on the network. This paper gives an overview of a software package developed in this study, called Power System Analysis and DG Optimisation Tool (PFADOT). The main purpose of the graphical-user-interface-based package is to guide a DISCO in finding the optimal size and location for DG placement in radial distribution networks. The package, which is also suitable for load flow analysis, employs the GUI feature of MATLAB. Three objective functions are formulated into a single optimisation problem and solved with a fuzzy genetic algorithm to simultaneously obtain the optimal DG size and location. The accuracy and reliability of the developed tool were validated using several radial test systems, and the results obtained were evaluated against an existing similar package cited in the literature; the results are impressive and computationally efficient.

  8. Advances in neural networks computational and theoretical issues

    CERN Document Server

    Esposito, Anna; Morabito, Francesco

    2015-01-01

    This book collects research works that exploit neural networks and machine learning techniques from a multidisciplinary perspective. Subjects covered include theoretical, methodological and computational topics which are grouped together into chapters devoted to the discussion of novelties and innovations related to the field of Artificial Neural Networks as well as the use of neural networks for applications, pattern recognition, signal processing, and special topics such as the detection and recognition of multimodal emotional expressions and daily cognitive functions, and  bio-inspired memristor-based networks.  Providing insights into the latest research interest from a pool of international experts coming from different research fields, the volume becomes valuable to all those with any interest in a holistic approach to implement believable, autonomous, adaptive, and context-aware Information Communication Technologies.

  9. Connect the dot: Computing feed-links for network extension

    Directory of Open Access Journals (Sweden)

    Boris Aronov

    2011-12-01

    Full Text Available Road network analysis can require distances from points that are not on the network themselves. We study the algorithmic problem of connecting a point inside a face (region) of the road network to its boundary while minimizing the detour factor of that point to any point on the boundary of the face. We show that the optimal single connection (feed-link) can be computed in O(lambda_7(n) log n) time, where n is the number of vertices that bound the face and lambda_7(n) is the slightly superlinear maximum length of a Davenport-Schinzel sequence of order 7 on n symbols. We also present approximation results for placing more feed-links, deal with the case that there are obstacles in the face of the road network that contains the point to be connected, and present various related results.

  10. Computational modeling of signal transduction networks: a pedagogical exposition.

    Science.gov (United States)

    Prasad, Ashok

    2012-01-01

    We give a pedagogical introduction to computational modeling of signal transduction networks, starting from explaining the representations of chemical reactions by differential equations via the law of mass action. We discuss elementary biochemical reactions such as Michaelis-Menten enzyme kinetics and cooperative binding, and show how these allow the representation of large networks as systems of differential equations. We discuss the importance of looking for simpler or reduced models, such as network motifs or dynamical motifs within the larger network, and describe methods to obtain qualitative behavior by bifurcation analysis, using freely available continuation software. We then discuss stochastic kinetics and show how to implement easy-to-use methods of rule-based modeling for stochastic simulations. We finally suggest some methods for comprehensive parameter sensitivity analysis, and discuss the insights that it could yield. Examples, including code to try out, are provided based on a paper that modeled Ras kinetics in thymocytes.
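    As a small companion to that exposition, the sketch below integrates a Michaelis-Menten reaction with SciPy; the rate constants and initial concentrations are arbitrary illustrative values, not parameters of the Ras model the paper uses.

        from scipy.integrate import solve_ivp

        VMAX, KM = 1.0, 0.5  # illustrative Michaelis-Menten parameters

        def rhs(t, y):
            s, p = y                   # substrate and product concentrations
            v = VMAX * s / (KM + s)    # Michaelis-Menten rate law
            return [-v, v]

        sol = solve_ivp(rhs, (0.0, 20.0), [2.0, 0.0])
        print(sol.y[1, -1])  # product approaches the initial substrate amount, 2.0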

  11. A computational method based on CVSS for quantifying the vulnerabilities in computer network

    Directory of Open Access Journals (Sweden)

    Shahriyar Mohammadi

    2014-10-01

    Full Text Available Network vulnerability taxonomy has become increasingly important in the area of information and data exchange, not only for its potential use in the identification of vulnerabilities but also in their assessment and prioritization. Computer networks play an important role in information and communication infrastructure. However, they are constantly exposed to a variety of vulnerability risks. In their attempts to create secure information exchange systems, scientists have concentrated on understanding the nature and typology of these vulnerabilities. Their efforts aimed at establishing secure networks have led to the development of a variety of methods and techniques for quantifying vulnerability. The objective of the present paper is to develop a method based on the second edition of the common vulnerability scoring system (CVSS) for the quantification of computer network vulnerabilities. It is expected that the proposed model will help in the identification and effective management of vulnerabilities by their quantification.
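    For orientation, the sketch below evaluates the published CVSS v2 base-score equation in Python; the metric weights are the standard v2 constants, while the example vector is illustrative and the paper's extensions are not shown.

        AV = {'L': 0.395, 'A': 0.646, 'N': 1.0}    # access vector
        AC = {'H': 0.35, 'M': 0.61, 'L': 0.71}     # access complexity
        AU = {'M': 0.45, 'S': 0.56, 'N': 0.704}    # authentication
        CIA = {'N': 0.0, 'P': 0.275, 'C': 0.660}   # conf./integ./avail. impact

        def cvss2_base(av, ac, au, c, i, a):
            """CVSS v2 base score from the six base-metric letters."""
            impact = 10.41 * (1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a]))
            exploitability = 20 * AV[av] * AC[ac] * AU[au]
            f = 0.0 if impact == 0 else 1.176
            return round((0.6 * impact + 0.4 * exploitability - 1.5) * f, 1)

        print(cvss2_base('N', 'L', 'N', 'C', 'C', 'C'))  # AV:N/AC:L/Au:N/C:C/I:C/A:C -> 10.0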

  12. Computing and Network Systems Administration, Operations Research, and System Dynamics Modeling: A Proposed Research Framework

    Directory of Open Access Journals (Sweden)

    Michael W. Totaro

    2016-12-01

    Full Text Available Information and computing infrastructures (ICT) involve levels of complexity that are highly dynamic in nature. This is due in no small measure to the proliferation of technologies, such as: cloud computing and distributed systems architectures, data mining and multidimensional analysis, and large scale enterprise systems, to name a few. Effective computing and network systems administration is integral to the stability and scalability of these complex software, hardware and communication systems. Systems administration involves the design, analysis, and continuous improvement of the performance or operation of information and computing systems. Additionally, social and administrative responsibilities have become nearly as integral for the systems administrator as are the technical demands that have been imposed for decades. The areas of operations research (OR) and system dynamics (SD) modeling offer system administrators a rich array of analytical and optimization tools that have been developed from diverse disciplines, which include: industrial, scientific, engineering, economic and financial, to name a few. This paper proposes a research framework by which OR and SD modeling techniques may prove useful to computing and network systems administration, which include: linear programming, network analysis, integer programming, nonlinear optimization, Markov processes, queueing modeling, simulation, decision analysis, heuristic techniques, and system dynamics modeling.
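    As a taste of one tool on that list, the sketch below computes steady-state M/M/1 queueing metrics in Python for a hypothetical server-sizing question; the arrival and service rates are made-up numbers.

        def mm1_metrics(arrival_rate, service_rate):
            """Steady-state M/M/1 results; requires arrival_rate < service_rate."""
            rho = arrival_rate / service_rate  # utilization
            if rho >= 1.0:
                raise ValueError('unstable queue: utilization >= 1')
            return {
                'utilization': rho,
                'mean_jobs_in_system': rho / (1.0 - rho),                    # L
                'mean_time_in_system': 1.0 / (service_rate - arrival_rate),  # W
            }

        # e.g. 80 requests/s arriving at a server that completes 100 requests/s
        print(mm1_metrics(80.0, 100.0))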

  13. Joint refinement of FRET measurements using spectroscopic and computational tools.

    Science.gov (United States)

    Kyrychenko, Alexander; Rodnin, Mykola V; Ghatak, Chiranjib; Ladokhin, Alexey S

    2017-04-01

    The variability of the orientation factor is a long-standing challenge in converting FRET efficiency measurements into donor-acceptor distances. We propose the use of molecular dynamics (MD) simulations to characterize orientation distributions and thus improve the accuracy of distance measurements. Here, we test this approach by comparing experimental and simulated FRET efficiencies for a model donor-acceptor pair of enhanced cyan and enhanced yellow FPs connected by a flexible linker. Several spectroscopic techniques were used to characterize FRET in solution. In addition, a series of atomistic MD simulations of a total length of 1.5 μs were carried out to calculate the distances and the orientation factor in the FRET pair. The resulting MD-based and experimentally measured FRET efficiency histograms coincided with each other, allowing for direct comparison of distance distributions. Despite the fact that the calculated average orientation factor was close to 2/3, the application of the average κ² to the entire histogram of FRET efficiencies resulted in a substantial artificial broadening of the calculated distribution of apparent donor-acceptor distances. By combining single-pair FRET measurements with computational tools, we demonstrate that accounting for the donor and acceptor orientation heterogeneity is critical for accurate representation of the donor-acceptor distance distribution from FRET measurements. Copyright © 2017 Elsevier Inc. All rights reserved.
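    For the textbook relations behind this argument, the sketch below inverts the Förster equation E = 1/(1 + (r/R0)^6) and rescales R0 for a non-2/3 orientation factor, since R0^6 is proportional to κ²; the R0 value is an illustrative number for a CFP/YFP-like pair, not one taken from the paper.

        def apparent_distance(E, R0):
            """Invert E = 1 / (1 + (r/R0)**6) for the donor-acceptor distance r."""
            return R0 * (1.0 / E - 1.0) ** (1.0 / 6.0)

        def r0_for_kappa2(R0_iso, kappa2):
            """Rescale a Forster radius quoted for kappa^2 = 2/3 to another kappa^2."""
            return R0_iso * (kappa2 / (2.0 / 3.0)) ** (1.0 / 6.0)

        R0 = 49.0   # illustrative Forster radius (angstroms) for a CFP/YFP-like pair
        E = 0.40    # a measured transfer efficiency
        print(apparent_distance(E, R0))                      # assuming kappa^2 = 2/3
        print(apparent_distance(E, r0_for_kappa2(R0, 1.2)))  # same E, other orientations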

  14. Applying DNA computation to intractable problems in social network analysis.

    Science.gov (United States)

    Chen, Rick C S; Yang, Stephen J H

    2010-09-01

    From ancient times to the present day, social networks have played an important role in the formation of various organizations for a range of social behaviors. As such, social networks inherently describe the complicated relationships between elements around the world. Based on mathematical graph theory, social network analysis (SNA) has been developed in and applied to various fields such as Web 2.0 for Web applications and product developments in industries, etc. However, some definitions of SNA, such as finding a clique, N-clique, N-clan, N-club and K-plex, are NP-complete problems, which are not easily solved via traditional computer architecture. These challenges have restricted the uses of SNA. This paper provides DNA-computing-based approaches with inherently high information density and massive parallelism. Using these approaches, we aim to solve the three primary problems of social networks: N-clique, N-clan, and N-club. The accuracy and feasible time complexities discussed in the paper demonstrate that DNA computing can be used to facilitate the development of SNA. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.

  15. Use Computer-Aided Tools to Parallelize Large CFD Applications

    Science.gov (United States)

    Jin, H.; Frumkin, M.; Yan, J.

    2000-01-01

    Porting applications to high performance parallel computers is always a challenging task. It is time consuming and costly. With rapid progress in hardware architectures and the increasing complexity of real applications in recent years, the problem has become even more severe. Today, scalability and high performance mostly involve handwritten parallel programs using message-passing libraries (e.g. MPI). However, this process is very difficult and often error-prone. The recent reemergence of shared memory parallel (SMP) architectures, such as the cache coherent Non-Uniform Memory Access (ccNUMA) architecture used in the SGI Origin 2000, shows good prospects for scaling beyond hundreds of processors. Programming on an SMP is simplified by working in a globally accessible address space. The user can supply compiler directives, such as OpenMP, to parallelize the code. As an industry standard for portable implementation of parallel programs for SMPs, OpenMP is a set of compiler directives and callable runtime library routines that extend Fortran, C and C++ to express shared memory parallelism. It promises an incremental path for parallel conversion of existing software, as well as scalability and performance for a complete rewrite or an entirely new development. Perhaps the main disadvantage of programming with directives is that inserted directives may not necessarily enhance performance. In the worst cases, they can create erroneous results. While vendors have provided tools to perform error-checking and profiling, automation in directive insertion is very limited and often fails on large programs, primarily due to the lack of a thorough enough data dependence analysis. To overcome this deficiency, we have developed a toolkit, CAPO, to automatically insert OpenMP directives in Fortran programs and apply certain degrees of optimization. CAPO is aimed at taking advantage of detailed inter-procedural dependence analysis provided by CAPTools, developed by the University of

  16. Frances: A Tool for Understanding Computer Architecture and Assembly Language

    Science.gov (United States)

    Sondag, Tyler; Pokorny, Kian L.; Rajan, Hridesh

    2012-01-01

    Students in all areas of computing require knowledge of the computing device including software implementation at the machine level. Several courses in computer science curricula address these low-level details such as computer architecture and assembly languages. For such courses, there are advantages to studying real architectures instead of…

  17. A modular architecture for transparent computation in recurrent neural networks.

    Science.gov (United States)

    Carmantini, Giovanni S; Beim Graben, Peter; Desroches, Mathieu; Rodrigues, Serafim

    2017-01-01

    Computation is classically studied in terms of automata, formal languages and algorithms; yet, the relation between neural dynamics and symbolic representations and operations is still unclear in traditional eliminative connectionism. Therefore, we suggest a unique perspective on this central issue, to which we would like to refer as transparent connectionism, by proposing accounts of how symbolic computation can be implemented in neural substrates. In this study we first introduce a new model of dynamics on a symbolic space, the versatile shift, showing that it supports the real-time simulation of a range of automata. We then show that the Gödelization of versatile shifts defines nonlinear dynamical automata, dynamical systems evolving on a vectorial space. Finally, we present a mapping between nonlinear dynamical automata and recurrent artificial neural networks. The mapping defines an architecture characterized by its granular modularity, where data, symbolic operations and their control are not only distinguishable in activation space, but also spatially localizable in the network itself, while maintaining a distributed encoding of symbolic representations. The resulting networks simulate automata in real-time and are programmed directly, in the absence of network training. To discuss the unique characteristics of the architecture and their consequences, we present two examples: (i) the design of a Central Pattern Generator from a finite-state locomotive controller, and (ii) the creation of a network simulating a system of interactive automata that supports the parsing of garden-path sentences as investigated in psycholinguistics experiments. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Review On Applications Of Neural Network To Computer Vision

    Science.gov (United States)

    Li, Wei; Nasrabadi, Nasser M.

    1989-03-01

    Neural network models have many potential applications to computer vision due to their parallel structures, learnability, implicit representation of domain knowledge, fault tolerance, and ability to handle statistical data. This paper demonstrates the basic principles, typical models and their applications in this field. A variety of neural models, such as associative memory, multilayer back-propagation perceptron, self-stabilized adaptive resonance network, hierarchically structured neocognitron, high order correlator, network with gating control and other models, can be applied to visual signal recognition, reinforcement, recall, stereo vision, motion, object tracking and other vision processes. Most of the algorithms have been simulated on computers. Some have been implemented with special hardware. Some systems use features, such as edges and profiles, of images as the input data form. Other systems use raw data as input signals to the networks. We present some novel ideas contained in these approaches and provide a comparison of these methods. Some unsolved problems are mentioned, such as extracting the intrinsic properties of the input information, integrating low-level functions into a high-level cognitive system, and achieving invariances. Perspectives on applications of some human vision models and neural network models are analyzed.

  19. Analysis of Intrusion Detection and Attack Proliferation in Computer Networks

    Science.gov (United States)

    Rangan, Prahalad; Knuth, Kevin H.

    2007-11-01

    One of the popular models to describe computer worm propagation is the Susceptible-Infected (SI) model [1]. This model of worm propagation has been implemented on the simulation toolkit Network Simulator v2 (ns-2) [2]. The ns-2 toolkit has the capability to simulate networks of different topologies. The topology studied in this work, however, is a simple star topology. This work introduces our initial efforts to learn the relevant quantities describing an infection given synthetic data obtained from running the ns-2 worm model. We aim to use Bayesian methods to gain a predictive understanding of how computer infections spread in real-world network topologies. This understanding would greatly reinforce dissemination of targeted immunization strategies, which may prevent real-world epidemics. The data consist of reports of infection from a subset of nodes in a large network during an attack. The infection equation obtained from [1] enables us to derive a likelihood function for the infection reports. This prior information can be used in the Bayesian framework to obtain the posterior probabilities for network properties of interest, such as the rate at which nodes contact one another (also referred to as the contact rate or scan rate). Our preliminary analyses indicate an effective spread rate of only 1/5th the actual scan rate used for a star-type topology. This implies that as the population becomes saturated with infected nodes, the actual spread rate will become much less than the scan rate used in the simulation.
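    For concreteness, the sketch below integrates the logistic SI equation dI/dt = beta * I * (N - I) / N that such worm models reduce to; the scan rate, population size and time step are illustrative, and the star topology and ns-2 machinery are omitted.

        def si_epidemic(n_hosts=10000, scan_rate=0.5, i0=1.0, dt=0.01, t_end=60.0):
            """Euler-integrate the logistic SI worm model."""
            infected, t, trace = i0, 0.0, []
            while t < t_end:
                infected += dt * scan_rate * infected * (n_hosts - infected) / n_hosts
                t += dt
                trace.append((t, infected))
            return trace

        print(si_epidemic()[-1])  # near n_hosts once the logistic curve saturates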

  20. Computers and networks in the age of globalization

    DEFF Research Database (Denmark)

    Bloch Rasmussen, Leif; Beardon, Colin; Munari, Silvio

    In modernity, an individual identity was constituted from civil society, while in a globalized network society, human identity, if it develops at all, must grow from communal resistance. A communal resistance to an abstract conceptualized world, where there is no possibility for perception...... in a network society; the individual and knowledge-based organizations; human responsibility and technology; and exclusion and regeneration. This volume contains the edited proceedings of the Fifth World Conference on Human Choice and Computers (HCC-5), which was sponsored by the International Federation...

  1. Spatial Analysis Along Networks Statistical and Computational Methods

    CERN Document Server

    Okabe, Atsuyuki

    2012-01-01

    In the real world, there are numerous and various events that occur on and alongside networks, including the occurrence of traffic accidents on highways, the location of stores alongside roads, the incidence of crime on streets and the contamination along rivers. In order to carry out analyses of those events, the researcher needs to be familiar with a range of specific techniques. Spatial Analysis Along Networks provides a practical guide to the necessary statistical techniques and their computational implementation. Each chapter illustrates a specific technique, from Stochastic Point Process

  2. NETWORKING TOOLS FOR SHOPPING MALLS: HOW TO IMPLEMENT THEM AND MEASURE THEIR EFFECTIVENESS

    Directory of Open Access Journals (Sweden)

    Evgeniya Vasilevna Elistratova

    2017-10-01

    Full Text Available Despite the importance of shopping malls for contemporary society and their network nature, there is no algorithmic basis to support the implementation of networking tools in the business activity of shopping malls. The goal of the present paper is to develop this basis. The paper contains an algorithm for the implementation of networking tools which includes five stages. For each stage, detailed comments and recommendations are given. A list of indexes of the effectiveness of networking tools is proposed. A method for evaluating the effectiveness of measures for implementing networking tools is described. A diagnostic matrix is given which can be used to evaluate the effectiveness of the company during the process of implementing networking tools.

  3. Smart photonic networks and computer security for image data

    Science.gov (United States)

    Campello, Jorge; Gill, John T.; Morf, Martin; Flynn, Michael J.

    1998-02-01

    Work reported here is part of a larger project on 'Smart Photonic Networks and Computer Security for Image Data', studying the interactions of coding and security, switching architecture simulations, and basic technologies. Coding and security: coding methods that are appropriate for data security in data fusion networks were investigated. These networks have several characteristics that distinguish them from other currently employed networks, such as Ethernet LANs or the Internet. The most significant characteristics are very high maximum data rates; predominance of image data; narrowcasting - transmission of data from one source to a designated set of receivers; data fusion - combining related data from several sources; and simple sensor nodes with limited buffering. These characteristics affect both the lower level network design and the higher level coding methods. Data security encompasses privacy, integrity, reliability, and availability. Privacy, integrity, and reliability can be provided through encryption and coding for error detection and correction. Availability is primarily a network issue; network nodes must be protected against failure or routed around in the case of failure. One of the more promising techniques is the use of 'secret sharing'. We consider this method as a special case of our new space-time code diversity based algorithms for secure communication. These algorithms enable us to exploit parallelism and scalable multiplexing schemes to build photonic network architectures. A number of very high-speed switching and routing architectures and their relationships with very high performance processor architectures were studied. Indications are that routers for very high speed photonic networks can be designed using the very robust and distributed TCP/IP protocol, if suitable processor architecture support is available.
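    As a hedged illustration of the secret-sharing technique the abstract singles out, the sketch below implements (t, n) Shamir sharing over a small prime field in Python; the field size is for demonstration only and far too small for real security, and the paper's space-time code variant is not shown.

        import random

        P = 2**31 - 1  # a small Mersenne prime; real systems use far larger fields

        def split(secret, n_shares, threshold):
            """Shares are points on a random polynomial of degree threshold - 1."""
            coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
            poly = lambda x: sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
            return [(x, poly(x)) for x in range(1, n_shares + 1)]

        def reconstruct(shares):
            """Lagrange interpolation at x = 0 recovers the secret."""
            secret = 0
            for i, (xi, yi) in enumerate(shares):
                num = den = 1
                for j, (xj, _) in enumerate(shares):
                    if i != j:
                        num = num * (-xj) % P
                        den = den * (xi - xj) % P
                secret = (secret + yi * num * pow(den, P - 2, P)) % P
            return secret

        shares = split(123456789, n_shares=5, threshold=3)
        print(reconstruct(shares[:3]))  # any 3 of the 5 shares recover 123456789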

  4. Condor-COPASI: high-throughput computing for biochemical networks

    Directory of Open Access Journals (Sweden)

    Kent Edward

    2012-07-01

    Full Text Available Abstract Background Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary expertise. Results We present Condor-COPASI, a server-based software tool that integrates COPASI, a biological pathway simulation tool, with Condor, a high-throughput computing environment. Condor-COPASI provides a web-based interface, which makes it extremely easy for a user to run a number of model simulation and analysis tasks in parallel. Tasks are transparently split into smaller parts, and submitted for execution on a Condor pool. Result output is presented to the user in a number of formats, including tables and interactive graphical displays. Conclusions Condor-COPASI can effectively use a Condor high-throughput computing environment to provide significant gains in performance for a number of model simulation and analysis tasks. Condor-COPASI is free, open source software, released under the Artistic License 2.0, and is suitable for use by any institution with access to a Condor pool. Source code is freely available for download at http://code.google.com/p/condor-copasi/, along with full instructions on deployment and usage.

  5. statnet: Software Tools for the Representation, Visualization, Analysis and Simulation of Network Data

    Directory of Open Access Journals (Sweden)

    Mark S. Handcock

    2007-12-01

    Full Text Available statnet is a suite of software packages for statistical network analysis. The packages implement recent advances in network modeling based on exponential-family random graph models (ERGM). The components of the package provide a comprehensive framework for ERGM-based network modeling, including tools for model estimation, model evaluation, model-based network simulation, and network visualization. This broad functionality is powered by a central Markov chain Monte Carlo (MCMC) algorithm. The coding is optimized for speed and robustness.

  6. Computers and networks in the age of globalization

    DEFF Research Database (Denmark)

    Bloch Rasmussen, Leif; Beardon, Colin; Munari, Silvio

    in a network society; the individual and knowledge-based organizations; human responsibility and technology; and exclusion and regeneration. This volume contains the edited proceedings of the Fifth World Conference on Human Choice and Computers (HCC-5), which was sponsored by the International Federation...... for Information Processing (IFIP) and held in Geneva, Switzerland in August 1998. Since the first HCC conference in 1974, IFIP's Technical Committee 9 has endeavoured to set the agenda for human choices and human actions vis-a-vis computers....

  7. Computer, Network, Software, and Hardware Engineering with Applications

    CERN Document Server

    Schneidewind, Norman F

    2012-01-01

    There are many books on computers, networks, and software engineering but none that integrate the three with applications. Integration is important because, increasingly, software dominates the performance, reliability, maintainability, and availability of complex computers and systems. Books on software engineering typically portray software as if it exists in a vacuum with no relationship to the wider system. This is wrong because a system is more than software. It is comprised of people, organizations, processes, hardware, and software. All of these components must be considered in an integr

  8. Larvicidal activity prediction against Aedes aegypti mosquito using computational tools.

    Science.gov (United States)

    Cañizares-Carmenate, Yudith; Hernandez-Morfa, Mirelys; Torrens, Francisco; Castellano, Gloria; Castillo-Garit, Juan A

    2017-01-01

    Aedes aegypti is an important vector for the transmission of dengue, yellow fever, chikungunya, arthritis, and Zika fever. According to the World Health Organization, it is estimated that Ae. aegypti causes 50 million infections and 25,000 deaths per year. The use of larvicidal agents is one of the recommendations of health organizations to control mosquito populations and limit their distribution. The aim of the present study was to deduce a mathematical model to predict the larvicidal action of chemical compounds based on their structure. A series of different compounds with experimental evidence of larvicidal activity were selected to develop a predictive model, using multiple linear regression and a genetic algorithm for the selection of variables, implemented in the QSARINS software. The model was assessed and validated using the OECD principles. The best model showed a good value for the determination coefficient (R2 = 0.752), and other parameters were appropriate for fitting (s = 0.278 and RMSEtr = 0.261). The validation results confirmed that the model has good robustness (Q2LOO = 0.682) and stability (R2-Q2LOO = 0.070) with low correlation between the descriptors (KXX = 0.241) and an excellent predictive power (R2ext = 0.834), and was the product of a non-random correlation (R2 Y-scr = 0.100). The present model shows better parameters than the models reported earlier in the literature using the same dataset, indicating that the proposed computational tools are more efficient in identifying novel larvicidal compounds against Ae. aegypti.

  9. Multi-objective optimization in computer networks using metaheuristics

    CERN Document Server

    Donoso, Yezid

    2007-01-01

    Metaheuristics are widely used to solve important practical combinatorial optimization problems. Many new multicast applications emerging from the Internet - such as TV over the Internet, radio over the Internet, and multipoint video streaming - require reduced bandwidth consumption, end-to-end delay, and packet loss ratio. It is necessary to design and provision these kinds of applications, as well as the resources necessary for their functionality. Multi-Objective Optimization in Computer Networks Using Metaheuristics provides a solution to the multi-objective problem in routing computer networks. It analyzes layer 3 (IP), layer 2 (MPLS), and layer 1 (GMPLS and wireless functions). In particular, it assesses basic optimization concepts, as well as several techniques and algorithms for the search of minimals; examines the basic multi-objective optimization concepts and the way to solve them through traditional techniques and through several metaheuristics; and demonstrates how to analytically model the compu...

  10. Advances in neural networks computational intelligence for ICT

    CERN Document Server

    Esposito, Anna; Morabito, Francesco; Pasero, Eros

    2016-01-01

    This carefully edited book puts emphasis on computational and artificial intelligent methods for learning and their applications in robotics, embedded systems, and ICT interfaces for psychological and neurological diseases. The book is a follow-up of the scientific workshop on Neural Networks (WIRN 2015) held in Vietri sul Mare, Italy, from the 20th to the 22nd of May 2015. The workshop, at its 27th edition, has become a traditional scientific event bringing together scientists from many countries and several scientific disciplines. Each chapter is an extended version of the original contribution presented at the workshop, and together with the reviewers' peer revisions it also benefits from the live discussion during the presentation. The content of the book is organized in the following sections: 1. Introduction, 2. Machine Learning, 3. Artificial Neural Networks: Algorithms and models, 4. Intelligent Cyberphysical and Embedded System, 5. Computational Intelligence Methods for Biomedical ICT in...

  11. Computers and networks in the age of globalization

    DEFF Research Database (Denmark)

    Bloch Rasmussen, Leif; Beardon, Colin; Munari, Silvio

    their lives in a diversity of social and cultural contexts. In so doing, the book tries to imagine in what kind of networks humans may choose and act based on the knowledge and empirical evidence presented in the papers. The topics covered in the book include: people and their changing values; citizens...... in a network society; the individual and knowledge-based organizations; human responsibility and technology; and exclusion and regeneration. This volume contains the edited proceedings of the Fifth World Conference on Human Choice and Computers (HCC-5), which was sponsored by the International Federation...... for Information Processing (IFIP) and held in Geneva, Switzerland in August 1998. Since the first HCC conference in 1974, IFIP's Technical Committee 9 has endeavoured to set the agenda for human choices and human actions vis-a-vis computers....

  12. CONCEPTUAL GENERALIZATION OF STRUCTURAL ORGANIZATION OF COMPUTER NETWORKS MEDICAL SCHOOL

    Directory of Open Access Journals (Sweden)

    O. P. Mintser

    2014-01-01

    Full Text Available The basic principles of the structural organization of computer networks in medical schools are presented. The integration of universities into the modern infrastructure of the information society is justified, and the structural organization of computer networks is described in detail. The effectiveness of implementing automated library information systems is shown, as is the strong growth in students' technical and personal readiness to use the virtual educational space. In this regard, universities are required to fill the educational environment of the modern virtual university in advance, including multimedia resources for professional education programs. Distributed resource centers should be formed on the basis of the information and educational environments and virtual representations of universities; this will avoid duplication of effort in the development of innovative educational technologies, provide a mutual exchange of results, and further the development of open continuous professional education, ensuring accessibility, modularity and mobility in the training and retraining of specialists.

  13. Novel Screening Tool for Stroke Using Artificial Neural Network.

    Science.gov (United States)

    Abedi, Vida; Goyal, Nitin; Tsivgoulis, Georgios; Hosseinichimeh, Niyousha; Hontecillas, Raquel; Bassaganya-Riera, Josep; Elijovich, Lucas; Metter, Jeffrey E; Alexandrov, Anne W; Liebeskind, David S; Alexandrov, Andrei V; Zand, Ramin

    2017-06-01

    The timely diagnosis of stroke at the initial examination is extremely important given the disease morbidity and narrow time window for intervention. The goal of this study was to develop a supervised learning method to recognize acute cerebral ischemia (ACI) and differentiate it from stroke mimics in an emergency setting. Consecutive patients presenting to the emergency department with stroke-like symptoms, within 4.5 hours of symptom onset, in 2 tertiary care stroke centers were randomized for inclusion in the model. We developed an artificial neural network (ANN) model. The learning algorithm was based on backpropagation. To validate the model, we used a 10-fold cross-validation method. A total of 260 patients (equal numbers of stroke mimics and ACIs) were enrolled for the development and validation of our ANN model. Our analysis indicated that the average sensitivity and specificity of ANN for the diagnosis of ACI based on the 10-fold cross-validation analysis were 80.0% (95% confidence interval, 71.8-86.3) and 86.2% (95% confidence interval, 78.7-91.4), respectively. The median precision of ANN for the diagnosis of ACI was 92% (95% confidence interval, 88.7-95.3). Our results show that ANN can be an effective tool for the recognition of ACI and differentiation of ACI from stroke mimics at the initial examination. © 2017 American Heart Association, Inc.
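    For a flavour of the setup, the sketch below trains a small backpropagation ANN and scores it with 10-fold cross-validation using scikit-learn; the synthetic 260-sample feature matrix stands in for the clinical variables, which the abstract does not enumerate, so this is an illustrative harness rather than the published model.

        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # synthetic stand-in for the 260-patient, two-class (ACI vs. mimic) data
        X, y = make_classification(n_samples=260, n_features=20, random_state=0)

        model = make_pipeline(
            StandardScaler(),
            MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
        )
        scores = cross_val_score(model, X, y, cv=10)  # 10-fold cross-validation
        print(scores.mean())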

  14. e-Dermatology: social networks and other web based tools.

    Science.gov (United States)

    Taberner, R

    2016-03-01

    The use by patients of social networking sites and the Internet to look for health related information has already become an everyday phenomenon. If, as dermatologists, we want to be part of this new conversation and provide quality content, we will have to adapt to digital media and find new ways of communicating with both our patients and our colleagues. Dozens of Spanish dermatologists have already ventured into the online space and have begun to provide important content through blogs, which they also disseminate via the social media. However, the use of these new technologies can also pose certain risks from the standpoint of ethics and our codes of practice and even place an individual's digital reputation in jeopardy. Another aspect of this new situation is that the Internet produces information saturation, and the appropriate use of certain tools can help to improve our productivity and prevent such information overload or infoxication. Copyright © 2015 AEDV. Published by Elsevier España, S.L.U. All rights reserved.

  15. Automated Parallel Computing Tools for Multicore Machines and Clusters Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to improve productivity of high performance computing for applications on multicore computers and clusters. These machines built from one or more chips...

  16. Computational study of noise in a large signal transduction network

    Directory of Open Access Journals (Sweden)

    Ruohonen Keijo

    2011-06-01

    Full Text Available Abstract Background Biochemical systems are inherently noisy due to the discrete reaction events that occur in a random manner. Although noise is often perceived as a disturbing factor, the system might actually benefit from it. In order to understand the role of noise better, its quality must be studied in a quantitative manner. Computational analysis and modeling play an essential role in this demanding endeavor. Results We implemented a large nonlinear signal transduction network combining protein kinase C, mitogen-activated protein kinase, phospholipase A2, and β isoform of phospholipase C networks. We simulated the network in 300 different cellular volumes using the exact Gillespie stochastic simulation algorithm and analyzed the results in both the time and frequency domain. In order to perform simulations in a reasonable time, we used modern parallel computing techniques. The analysis revealed that time and frequency domain characteristics depend on the system volume. The simulation results also indicated that there are several kinds of noise processes in the network, all of them representing different kinds of low-frequency fluctuations. In the simulations, the power of noise decreased on all frequencies when the system volume was increased. Conclusions We concluded that basic frequency domain techniques can be applied to the analysis of simulation results produced by the Gillespie stochastic simulation algorithm. This approach is suited not only to the study of fluctuations but also to the study of pure noise processes. Noise seems to have an important role in biochemical systems and its properties can be numerically studied by simulating the reacting system in different cellular volumes. Parallel computing techniques make it possible to run massive simulations in hundreds of volumes and, as a result, accurate statistics can be obtained from computational studies.
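    As a minimal illustration of the exact Gillespie SSA the study runs, the sketch below simulates a two-reaction birth-death process in Python; the reaction system and rates are a toy stand-in for the full signal transduction network.

        import math, random

        def gillespie_birth_death(k_birth=10.0, k_death=0.1, x0=0, t_end=100.0, seed=1):
            """Exact SSA for 0 -> X (rate k_birth) and X -> 0 (rate k_death * x)."""
            random.seed(seed)
            t, x, trace = 0.0, x0, []
            while t < t_end:
                a_birth, a_death = k_birth, k_death * x
                a_total = a_birth + a_death
                t += -math.log(1.0 - random.random()) / a_total  # exponential wait
                x += 1 if random.random() * a_total < a_birth else -1
                trace.append((t, x))
            return trace

        print(gillespie_birth_death()[-1])  # x fluctuates around k_birth/k_death = 100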

  17. Computers and networks in the age of globalization

    DEFF Research Database (Denmark)

    Bloch Rasmussen, Leif; Beardon, Colin; Munari, Silvio

    In modernity, an individual identity was constituted from civil society, while in a globalized network society, human identity, if it develops at all, must grow from communal resistance. A communal resistance to an abstract conceptualized world, where there is no possibility for perception...... their lives in a diversity of social and cultural contexts. In so doing, the book tries to imagine in what kind of networks humans may choose and act based on the knowledge and empirical evidence presented in the papers. The topics covered in the book include: people and their changing values; citizens...... in a network society; the individual and knowledge-based organizations; human responsibility and technology; and exclusion and regeneration. This volume contains the edited proceedings of the Fifth World Conference on Human Choice and Computers (HCC-5), which was sponsored by the International Federation...

  18. Computer simulation of randomly cross-linked polymer networks

    CERN Document Server

    Williams, T P

    2002-01-01

    In this work, Monte Carlo and Stochastic Dynamics computer simulations of mesoscale model randomly cross-linked networks were undertaken. Task-parallel implementations of the lattice Monte Carlo Bond Fluctuation model and the Kremer-Grest Stochastic Dynamics bead-spring continuum model were designed and used for this purpose. Lattice and continuum precursor melt systems were prepared and then cross-linked to varying degrees. The resultant networks were used to study structural changes during deformation and relaxation dynamics. The effects of a random network topology featuring a polydisperse distribution of strand lengths and an abundance of pendant chain ends were qualitatively compared to recent published work. A preliminary investigation into the effects of temperature on the structural and dynamical properties was also undertaken. Structural changes during isotropic swelling and uniaxial deformation revealed a pronounced non-affine deformation dependent on the degree of cross-linking. Fractal heterogeneiti...

  19. An Optimal Path Computation Architecture for the Cloud-Network on Software-Defined Networking

    Directory of Open Access Journals (Sweden)

    Hyunhun Cho

    2015-05-01

    Full Text Available Legacy networks do not open the precise information of the network domain because of scalability, management and commercial reasons, and it is very hard to compute an optimal path to the destination. In order to meet new network requirements arising from today's changing ICT environment, the concept of software-defined networking (SDN) has been developed as a technological alternative to overcome the limitations of the legacy network structure and to introduce innovative concepts. The purpose of this paper is to propose an application that calculates the optimal paths for general data transmission and real-time audio/video transmission, which constitute the major services of the National Research & Education Network (NREN), in the SDN environment. The proposed SDN routing computation (SRC) application is designed and applied in a multi-domain network for the efficient use of resources, selection of the optimal path between the multi-domains and optimal establishment of end-to-end connections.
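    The abstract does not give the SRC application's algorithm, but optimal path selection over a weighted multi-domain graph is classically a shortest-path computation. The sketch below is a generic Dijkstra implementation over an invented topology; the link weights could stand for latency (for real-time audio/video) or hop cost (for bulk data transfers).

        # Generic Dijkstra shortest path; the topology and weights are hypothetical.
        import heapq

        def dijkstra(graph, src, dst):
            """graph: {node: [(neighbor, weight), ...]}; returns (cost, path)."""
            pq, visited = [(0.0, src, [src])], set()
            while pq:
                cost, node, path = heapq.heappop(pq)
                if node == dst:
                    return cost, path
                if node in visited:
                    continue
                visited.add(node)
                for nbr, w in graph.get(node, []):
                    if nbr not in visited:
                        heapq.heappush(pq, (cost + w, nbr, path + [nbr]))
            return float("inf"), []

        topology = {                       # invented multi-domain example
            "domainA": [("core1", 2.0), ("core2", 5.0)],
            "core1":   [("core2", 1.0), ("domainB", 4.0)],
            "core2":   [("domainB", 1.5)],
        }
        print(dijkstra(topology, "domainA", "domainB"))
        # -> (4.5, ['domainA', 'core1', 'core2', 'domainB'])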

  20. Deep Space Network (DSN), Network Operations Control Center (NOCC) computer-human interfaces

    Science.gov (United States)

    Ellman, Alvin; Carlton, Magdi

    1993-01-01

    The Network Operations Control Center (NOCC) of the DSN is responsible for scheduling the resources of DSN, and monitoring all multi-mission spacecraft tracking activities in real-time. Operations performs this job with computer systems at JPL connected to over 100 computers at Goldstone, Australia and Spain. The old computer system became obsolete, and the first version of the new system was installed in 1991. Significant improvements for the computer-human interfaces became the dominant theme for the replacement project. Major issues required innovative problem solving. Among these issues were: How to present several thousand data elements on displays without overloading the operator? What is the best graphical representation of DSN end-to-end data flow? How to operate the system without memorizing mnemonics of hundreds of operator directives? Which computing environment will meet the competing performance requirements? This paper presents the technical challenges, engineering solutions, and results of the NOCC computer-human interface design.

  1. Computational Modeling of Single Neuron Extracellular Electric Potentials and Network Local Field Potentials using LFPsim.

    Science.gov (United States)

    Parasuram, Harilal; Nair, Bipin; D'Angelo, Egidio; Hines, Michael; Naldi, Giovanni; Diwakar, Shyam

    2016-01-01

    Local Field Potentials (LFPs) are population signals generated by complex spatiotemporal interaction of current sources and dipoles. Mathematical computations of LFPs allow the study of circuit functions and dysfunctions via simulations. This paper introduces LFPsim, a NEURON-based tool for computing population LFP activity and single neuron extracellular potentials. LFPsim was developed to be used on existing cable compartmental neuron and network models. Point source, line source, and RC based filter approximations can be used to compute extracellular activity. As a demonstration of efficient implementation, we showcase LFPs from mathematical models of electrotonically compact cerebellum granule neurons and morphologically complex neurons of the neocortical column. LFPsim reproduced neocortical LFP at 8, 32, and 56 Hz via current injection, in vitro post-synaptic N2a, N2b waves and in vivo T-C waves in cerebellum granular layer. LFPsim also includes a simulation of multi-electrode array of LFPs in network populations to aid computational inference between biophysical activity in neural networks and corresponding multi-unit activity resulting in extracellular and evoked LFP signals.
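    As a minimal illustration of the point-source approximation named in the abstract (one of the three schemes LFPsim offers), the sketch below sums phi = I / (4*pi*sigma*r) over a set of toy compartmental currents. The geometry, currents, and conductivity value are invented; this is not LFPsim itself.

        # Point-source extracellular potential in an infinite homogeneous medium.
        import numpy as np

        def point_source_potential(i_m, src_pos, elec_pos, sigma=0.3):
            """i_m: compartment current (A); sigma: conductivity (S/m); positions in m."""
            r = np.linalg.norm(np.asarray(elec_pos, float) - np.asarray(src_pos, float))
            return i_m / (4.0 * np.pi * sigma * r)

        # Summing contributions from many compartments approximates the LFP.
        sources = [((0.0, 0.0, z * 1e-5), 1e-9 * np.sin(z)) for z in range(10)]  # toy
        electrode = (1e-4, 0.0, 0.0)
        lfp = sum(point_source_potential(i, pos, electrode) for pos, i in sources)
        print(f"LFP at electrode: {lfp:.3e} V")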

  2. Enhancing Lifelong Competence Development and Management Systems with Social Network-based Concepts and Tools

    NARCIS (Netherlands)

    Cheak, Alicia; Angehrn, Albert; Sloep, Peter

    2006-01-01

    This paper addresses the challenge of enhancing the social dimension of lifelong Competence Development and Management Systems with social network-based concepts and tools. Our premise is that through a combination of social network visualization tools, simulations, stimulus agents and management

  3. Application of artificial neural networks in computer-aided diagnosis.

    Science.gov (United States)

    Liu, Bei

    2015-01-01

    Computer-aided diagnosis (CAD) is a diagnostic procedure in which a radiologist uses the outputs of computer analysis of medical images as a second opinion in the interpretation of medical images, either to help with lesion detection or to help determine if the lesion is benign or malignant. Artificial neural networks (ANNs) are usually employed to formulate the statistical models for computer analysis. Receiver operating characteristic curves are used to evaluate the performance of the ANN alone, as well as the diagnostic performance of radiologists who take into account the ANN output as a second opinion. In this chapter, we use mammograms to illustrate how an ANN model is trained, tested, and evaluated, and how a radiologist should use the ANN output as a second opinion in CAD.
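    A hedged sketch of the evaluation workflow the chapter describes: train an ANN on lesion features and summarize its standalone performance with an ROC statistic. The features below are synthetic stand-ins generated with scikit-learn, not real mammographic data.

        # Train a small ANN and compute ROC AUC on synthetic "lesion features".
        from sklearn.datasets import make_classification
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier

        X, y = make_classification(n_samples=600, n_features=8, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
        ann.fit(X_tr, y_tr)

        # The predicted probability is the "second opinion" score a radiologist sees.
        scores = ann.predict_proba(X_te)[:, 1]
        print(f"ANN standalone ROC AUC: {roc_auc_score(y_te, scores):.3f}")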

  4. Professors' and students' perceptions and experiences of computational simulations as learning tools

    Science.gov (United States)

    Magana de Leon, Alejandra De Jesus

    Computational simulations are becoming a critical component of scientific and engineering research, and now are becoming an important component for learning. This dissertation provides findings from a multifaceted research study exploring the ways computational simulations have been perceived and experienced as learning tools by instructors and students. Three studies were designed with an increasing focus on the aspects of learning and instructing with computational simulation tools. Study One used a student survey with undergraduate and graduate students whose instructors enhanced their teaching using online computational tools. Results of this survey were used to identify students' perceptions and experiences with these simulations as learning tools. The results provided both an evaluation of the instructional design and an indicator of which instructors were selected in Study Two. Study Two used a phenomenographic research design resulting in a two dimensional outcome space with six qualitatively different ways instructors perceived their learning outcomes associated with using simulation tools as part of students' learning experiences. Results from this work provide a framework for identifying major learning objectives to promote learning with computational simulation tools. Study Three used a grounded theory methodology to expand on instructors' learning objectives to include their perceptions of formative assessment and pedagogy. These perceptions were compared and contrasted with students' perceptions associated with learning with computational tools. The study is organized around three phases and analyzed as a collection of case studies focused on the instructors and their students' perceptions and experiences of computational simulations as learning tools. This third study resulted in a model for using computational simulations as learning tools. This model indicates the potential of integrating the computational simulation tools into formal learning

  5. A Visual Tool for Computer Supported Learning: The Robot Motion Planning Example

    Science.gov (United States)

    Elnagar, Ashraf; Lulu, Leena

    2007-01-01

    We introduce an effective computer aided learning visual tool (CALVT) to teach graph-based applications. We present the robot motion planning problem as an example of such applications. The proposed tool can be used to simulate and/or further to implement practical systems in different areas of computer science such as graphics, computational…

  6. Computer simulation models as a tool to investigate the role of microRNAs in osteoarthritis.

    Directory of Open Access Journals (Sweden)

    Carole J Proctor

    Full Text Available The aim of this study was to show how computational models can be used to increase our understanding of the role of microRNAs in osteoarthritis (OA), using miR-140 as an example. Bioinformatics analysis and experimental results from the literature were used to create and calibrate models of gene regulatory networks in OA involving miR-140 along with key regulators such as NF-κB, SMAD3, and RUNX2. The individual models were created with the modelling standard, Systems Biology Markup Language, and integrated to examine the overall effect of miR-140 on cartilage homeostasis. Down-regulation of miR-140 may have either detrimental or protective effects for cartilage, indicating that the role of miR-140 is complex. Studies of individual networks in isolation may therefore lead to different conclusions. This indicated the need to combine the five chosen individual networks involving miR-140 into an integrated model. This model suggests that the overall effect of miR-140 is to change the response to an IL-1 stimulus from a prolonged increase in matrix degrading enzymes to a pulse-like response so that cartilage degradation is temporary. Our current model can easily be modified and extended as more experimental data become available about the role of miR-140 in OA. In addition, networks of other microRNAs that are important in OA could be incorporated. A fully integrated model could not only aid our understanding of the mechanisms of microRNAs in ageing cartilage but could also provide a useful tool to investigate the effect of potential interventions to prevent cartilage loss.
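    The paper's SBML models are not reproduced here, but the qualitative behavior described -- a feedback regulator converting a sustained response to an IL-1 stimulus into a pulse-like one -- can be sketched with a two-variable toy ODE. All rate constants below are invented.

        # Toy negative-feedback model: the regulator damps enzyme production.
        from scipy.integrate import solve_ivp

        def rhs(t, y, feedback_on):
            enzyme, regulator = y
            il1 = 1.0 if t > 10 else 0.0            # step stimulus at t = 10
            k_fb = 2.0 if feedback_on else 0.0      # miR-140-like damping (toy)
            d_enzyme = il1 / (1.0 + k_fb * regulator) - 0.5 * enzyme
            d_regulator = 0.2 * enzyme - 0.1 * regulator
            return [d_enzyme, d_regulator]

        for feedback_on in (False, True):
            sol = solve_ivp(rhs, (0, 100), [0.0, 0.0], args=(feedback_on,), max_step=0.1)
            print(f"feedback={feedback_on}: peak enzyme {sol.y[0].max():.2f}, "
                  f"enzyme at t=100 {sol.y[0, -1]:.2f}")

    With the feedback branch on, the enzyme level peaks and then relaxes well below its peak; with it off, the response stays at its plateau.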

  7. Gear cutting tools fundamentals of design and computation

    CERN Document Server

    Radzevich, Stephen P

    2010-01-01

    Presents the DG/K-based method of surface generation, a novel and practical mathematical method for designing gear cutting tools with optimal parameters. This book proposes a scientific classification for the various kinds of the gear machining meshes, discussing optimal designs of gear cutting tools.

  8. New computer architectures as tools for ecological thought.

    Science.gov (United States)

    Villa, F

    1992-06-01

    Recent achievements of computer science provide unrivaled power for the advancement of ecology. This power is not merely computational: parallel computers, having hierarchical organization as their architectural principle, also provide metaphors for understanding complex systems. In this sense they might play for a science of ecological complexity a role like equilibrium-based metaphors had in the development of dynamic systems ecology. Parallel computers provide this opportunity through an informational view of ecological reality and multilevel modelling paradigms. Spatial and individual-oriented models allow application and full understanding of the new metaphors in the ecological context. Copyright © 1992. Published by Elsevier Ltd.

  9. Building Model for the University of Mosul Computer Network Using OPNET Simulator

    Directory of Open Access Journals (Sweden)

    Modhar A. Hammoudi

    2013-04-01

    Full Text Available This paper aims at establishing a model in the OPNET (Optimized Network Engineering Tool) simulator for the University of Mosul computer network. The proposed network model was made up of two routers (Cisco 2600), a core switch (Cisco 6509), two servers, an IP32 cloud and 37 VLANs. These VLANs were connected to the core switch using fiber optic cables (1000BaseX). Three applications were added to test the network model: FTP (File Transfer Protocol), HTTP (Hyper Text Transfer Protocol) and VoIP (Voice over Internet Protocol). The results showed that the proposed model was effective for designing and managing the targeted network and can be used to view the data flow in it. The simulation results also showed that the maximum number of VoIP service users could be raised up to 5,000 when working under IP telephony, meaning that VoIP service in this network can be maintained and performs better under an IP telephony scheme.

  10. Open Problems in Network-aware Data Management in Exa-scale Computing and Terabit Networking Era

    Energy Technology Data Exchange (ETDEWEB)

    Balman, Mehmet; Byna, Surendra

    2011-12-06

    Accessing and managing large amounts of data is a great challenge in collaborative computing environments where resources and users are geographically distributed. Recent advances in network technology led to next-generation high-performance networks, allowing high-bandwidth connectivity. Efficient use of the network infrastructure is necessary in order to address the increasing data and compute requirements of large-scale applications. We discuss several open problems, evaluate emerging trends, and articulate our perspectives in network-aware data management.

  11. Line-plane broadcasting in a data communications network of a parallel computer

    Science.gov (United States)

    Archer, Charles J.; Berg, Jeremy E.; Blocksome, Michael A.; Smith, Brian E.

    2010-06-08

    Methods, apparatus, and products are disclosed for line-plane broadcasting in a data communications network of a parallel computer, the parallel computer comprising a plurality of compute nodes connected together through the network, the network optimized for point to point data communications and characterized by at least a first dimension, a second dimension, and a third dimension, that include: initiating, by a broadcasting compute node, a broadcast operation, including sending a message to all of the compute nodes along an axis of the first dimension for the network; sending, by each compute node along the axis of the first dimension, the message to all of the compute nodes along an axis of the second dimension for the network; and sending, by each compute node along the axis of the second dimension, the message to all of the compute nodes along an axis of the third dimension for the network.
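    The claimed operation is easy to visualize on a small mesh. The sketch below (a toy re-enactment, not the patented implementation) counts how many nodes of an X x Y x Z grid have received the message after the line, plane, and cube stages.

        # Stage-by-stage coverage of a line-plane broadcast on a toy 3-D mesh.
        def line_plane_broadcast(dims, root=(0, 0, 0)):
            reached = {root}
            line = [(x, root[1], root[2]) for x in range(dims[0])]
            plane = [(x, y, root[2]) for x, _, _ in line for y in range(dims[1])]
            cube = [(x, y, z) for x, y, _ in plane for z in range(dims[2])]
            total = dims[0] * dims[1] * dims[2]
            for stage, nodes in (("line", line), ("plane", plane), ("cube", cube)):
                reached.update(nodes)
                print(f"after {stage} stage: {len(reached)}/{total} nodes reached")

        line_plane_broadcast((4, 4, 4))
        # after line stage: 4/64 ... after plane stage: 16/64 ... after cube stage: 64/64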

  12. Managing Laboratory Data Using Cloud Computing as an Organizational Tool

    Science.gov (United States)

    Bennett, Jacqueline; Pence, Harry E.

    2011-01-01

    One of the most significant difficulties encountered when directing undergraduate research and developing new laboratory experiments is how to efficiently manage the data generated by a number of students. Cloud computing, where both software and computer files reside online, offers a solution to this data-management problem and allows researchers…

  13. Computer Art--A New Tool in Advertising Graphics.

    Science.gov (United States)

    Wassmuth, Birgit L.

    Using computers to produce art began with scientists, mathematicians, and individuals with strong technical backgrounds who used the graphic material as visualizations of data in technical fields. People are using computer art in advertising, as well as in painting; sculpture; music; textile, product, industrial, and interior design; architecture;…

  14. The Computer: A Tool for Writing with LEP Students.

    Science.gov (United States)

    Johnson, Mary

    One way to help minority students with limited English proficiency (LEP) achieve a higher level of literacy is to use computers in language arts classes. Word processors now enable the production of software that involves students in text production and manipulation. This development has made it possible to introduce the computer into the…

  15. Data Visualization and Analysis Tools for the Global Precipitation Measurement (GPM) Validation Network

    Science.gov (United States)

    Morris, Kenneth R.; Schwaller, Mathew

    2010-01-01

    The Validation Network (VN) prototype for the Global Precipitation Measurement (GPM) Mission compares data from the Tropical Rainfall Measuring Mission (TRMM) satellite Precipitation Radar (PR) to similar measurements from U.S. and international operational weather radars. This prototype is a major component of the GPM Ground Validation System (GVS). The VN provides a means for the precipitation measurement community to identify and resolve significant discrepancies between the ground radar (GR) observations and similar satellite observations. The VN prototype is based on research results and computer code described by Anagnostou et al. (2001), Bolen and Chandrasekar (2000), and Liao et al. (2001), and has previously been described by Morris, et al. (2007). Morris and Schwaller (2009) describe the PR-GR volume-matching algorithm used to create the VN match-up data set used for the comparisons. This paper describes software tools that have been developed for visualization and statistical analysis of the original and volume matched PR and GR data.

  16. An efficient algorithm for computing fixed length attractors based on bounded model checking in synchronous Boolean networks with biochemical applications.

    Science.gov (United States)

    Li, X Y; Yang, G W; Zheng, D S; Guo, W S; Hung, W N N

    2015-04-28

    Genetic regulatory networks are the key to understanding biochemical systems. The state of a genetic regulatory network under different living environments can be modeled as a synchronous Boolean network. The attractors of these Boolean networks help biologists to identify determinant and stable factors. Existing methods identify attractors based on a random initial state or the entire state space simultaneously; they cannot identify fixed-length attractors directly, and their time complexity increases exponentially with the number and length of the attractors. This study used bounded model checking to quickly locate fixed-length attractors. Based on a SAT solver, we propose a new algorithm for efficiently computing fixed-length attractors, which is better suited to large Boolean networks and networks with numerous attractors. After comparison with the tool BooleNet, empirical experiments involving biochemical systems demonstrated the feasibility and efficiency of our approach.
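    The paper's contribution is a SAT/bounded-model-checking encoding, which is not reproduced here. As a conceptual stand-in, the sketch below brute-forces every state of a tiny synchronous Boolean network (update rules invented for illustration) and reports the attractors of a requested length; this only scales to very small networks, which is precisely the limitation SAT-based methods address.

        # Brute-force search for attractors of a fixed length in a toy network.
        from itertools import product

        def step(state):
            a, b, c = state                  # toy synchronous update rules
            return (b and not c, a, a or b)

        def attractors_of_length(n_vars, length):
            found = set()
            for state in product((False, True), repeat=n_vars):
                seen = []
                while state not in seen:     # iterate until the orbit repeats
                    seen.append(state)
                    state = step(state)
                cycle = seen[seen.index(state):]          # the periodic part
                if len(cycle) == length:
                    found.add(tuple(sorted(cycle)))       # canonical form
            return found

        for length in (1, 2, 3):
            print(f"length-{length} attractors:", attractors_of_length(3, length))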

  17. Advanced Scientific Computing Research Network Requirements: ASCR Network Requirements Review Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Bacon, Charles [Argonne National Lab. (ANL), Argonne, IL (United States); Bell, Greg [ESnet, Berkeley, CA (United States); Canon, Shane [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dart, Eli [ESnet, Berkeley, CA (United States); Dattoria, Vince [Dept. of Energy (DOE), Washington DC (United States). Office of Science. Advanced Scientific Computing Research (ASCR); Goodwin, Dave [Dept. of Energy (DOE), Washington DC (United States). Office of Science. Advanced Scientific Computing Research (ASCR); Lee, Jason [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hicks, Susan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Holohan, Ed [Argonne National Lab. (ANL), Argonne, IL (United States); Klasky, Scott [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lauzon, Carolyn [Dept. of Energy (DOE), Washington DC (United States). Office of Science. Advanced Scientific Computing Research (ASCR); Rogers, Jim [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Shipman, Galen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Skinner, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Tierney, Brian [ESnet, Berkeley, CA (United States)

    2013-03-08

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.

  18. An integrated geometric modelling framework for patient-specific computational haemodynamic study on wide-ranged vascular network.

    Science.gov (United States)

    Torii, Ryo; Oshima, Marie

    2012-01-01

    Patient-specific haemodynamic computations have been used as an effective tool in research on cardiovascular diseases associated with haemodynamics, such as atherosclerosis and aneurysm. Recent developments in computing resources have enabled 3D haemodynamic computations in wide-spread arterial networks, but there are still difficulties in modelling vascular geometry because of noise and limited resolution in medical images. In this paper, an integrated framework to model an arterial network tree for patient-specific computational haemodynamic study is developed. With this framework, 3D vascular geometry reconstruction of an arterial network and quantification of its geometric features are aimed at. The combination of 3D haemodynamic computation and vascular morphology quantification helps better understand the relationship between vascular morphology and the haemodynamic forces behind the 'geometric risk factor' for cardiovascular diseases. The proposed method is applied to an intracranial arterial network to demonstrate its accuracy and effectiveness. The results are compared with the marching-cubes (MC) method. The comparison shows that the present modelling method can reconstruct a wide-ranged vascular network anatomically more accurately than the MC method, particularly in peripheral circulation where the image resolution is low in comparison to the vessel diameter, because of its recognition of arterial network connectivity based on centrelines.

  19. A Dedicated Computational Platform for Cellular Monte Carlo T-CAD Software Tools

    Science.gov (United States)

    2015-07-14

    Report AFRL-OSR-VA-TR-2015-0176: "A Dedicated Computational Platform for Cellular Monte Carlo T-CAD Software Tools," Marco Saraniti, Arizona State University; grant FA9550-14... From the abstract: "...with an optimized architecture for the Cellular Monte Carlo particle-based T-CAD simulation tools developed by our group. Such code is used for the..."

  20. MODELING COMPARATIVE THERMAL PERFORMANCE OF LIGHTWEIGHT FABRICS USING A COMPUTATIONAL DESIGN TOOL

    Science.gov (United States)

    2017-04-14

    Modeling Comparative Thermal Performance of Lightweight Fabrics Using a Computational Design Tool, by Judith Sennett and Phillip Gibson, April 2017. Cited within: ...M., Bieszczad, J., Gagne, J., Fogg, D., Fan, J., "Design Tool for Clothing Applications: Wind Resistant Fabric Layers and Permeable Vents," Journal ...

  1. "Development Radar": The Co-Configuration of a Tool in a Learning Network

    Science.gov (United States)

    Toiviainen, Hanna; Kerosuo, Hannele; Syrjala, Tuula

    2009-01-01

    Purpose: The paper aims to argue that new tools are needed for operating, developing and learning in work-life networks where academic and practice knowledge are intertwined in multiple levels of and in boundary-crossing across activities. At best, tools for learning are designed in a process of co-configuration, as the analysis of one tool,…

  2. The Utility of Computer Tracking Tools for User-Centered Design.

    Science.gov (United States)

    Gay, Geri; Mazur, Joan

    1993-01-01

    Describes tracking tools used by designers and users to evaluate the efficacy of hypermedia systems. Highlights include human-computer interaction research; tracking tools and user-centered design; and three examples from the Interactive Multimedia Group at Cornell University that illustrate uses of various tracking tools. (27 references) (LRW)

  3. iTools: a framework for classification, categorization and integration of computational biology resources.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    2008-05-01

    Full Text Available The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource management.

  4. iTools: a framework for classification, categorization and integration of computational biology resources.

    Science.gov (United States)

    Dinov, Ivo D; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H V; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D Stott; Toga, Arthur W

    2008-05-28

    The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource management

  5. Network Security Hacks Tips & Tools for Protecting Your Privacy

    CERN Document Server

    Lockhart, Andrew

    2009-01-01

    This second edition of Network Security Hacks offers 125 concise and practical hacks, including more information for Windows administrators, hacks for wireless networking (such as setting up a captive portal and securing against rogue hotspots), and techniques to ensure privacy and anonymity, including ways to evade network traffic analysis, encrypt email and files, and protect against phishing attacks. System administrators looking for reliable answers will also find concise examples of applied encryption, intrusion detection, logging, trending, and incident response.

  6. Development, Exploitation, and Transition of Computer Aided Engineering (CAE) Tools

    National Research Council Canada - National Science Library

    Carter, Harold W

    2003-01-01

    .... Tasks include CMOS-based microwave component design and fabrication, parallel and mixed-signal VHDL and VHDL-AMS simulator algorithms, conversion tools for VHDL-AMS models, System-on-a-Chip methods...

  7. Computational tools and algorithms for designing customized synthetic genes

    National Research Council Canada - National Science Library

    Gould, Nathan; Hendy, Oliver; Papamichail, Dimitris

    2014-01-01

    ... that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties...

  8. IHT: Tools for Computing Insolation Absorption by Particle Laden Flows

    Energy Technology Data Exchange (ETDEWEB)

    Grout, R. W.

    2013-10-01

    This report describes IHT, a toolkit for computing radiative heat exchange between particles. Well suited for insolation absorption computations, it also has potential applications in combustion (sooting flames), biomass gasification, and similar processes. The algorithm is based on the 'Photon Monte Carlo' approach and implemented in a library that can be interfaced with a variety of computational fluid dynamics codes to analyze radiative heat transfer in particle-laden flows. The emphasis in this report is on the data structures and organization of IHT for developers seeking to use the IHT toolkit to add Photon Monte Carlo capabilities to their own codes.
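    As a one-dimensional caricature of the Photon Monte Carlo approach (the report's actual geometry, data structures, and particle coupling are not reproduced), the sketch below traces photon bundles through an absorbing slab with an invented extinction coefficient and tallies where they are absorbed.

        # 1-D photon Monte Carlo: free paths ~ Exp(kappa); tally absorption per bin.
        import math
        import random

        def trace_photons(n_photons=100_000, kappa=2.0, slab_depth=1.0, n_bins=10):
            absorbed = [0] * n_bins
            escaped = 0
            for _ in range(n_photons):
                x = random.expovariate(kappa)        # distance to absorption
                if x >= slab_depth:
                    escaped += 1
                else:
                    absorbed[int(x / slab_depth * n_bins)] += 1
            return absorbed, escaped

        absorbed, escaped = trace_photons()
        print("absorbed fraction per bin:", [round(n / 100_000, 3) for n in absorbed])
        print(f"escaped: {escaped / 100_000:.3f} (analytic: {math.exp(-2.0):.3f})")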

  9. EFL Learners' Attitudes towards Using Computers as a Learning Tool in Language Learning

    Science.gov (United States)

    Kitchakarn, Orachorn

    2015-01-01

    The study was conducted to investigate attitudes toward using computers as a learning tool among undergraduate students in a private university. In this regard, some variables which might be potential antecedents of attitudes toward computers, including gender, experience of using computers, and perceived abilities in using programs, were examined.…

  10. [Forensic evidence-based medicine in computer communication networks].

    Science.gov (United States)

    Qiu, Yun-Liang; Peng, Ming-Qi

    2013-12-01

    As an important component of judicial expertise, forensic science is broad and highly specialized. With the development of network technology, the increase in information resources, and the improvement of people's legal consciousness, forensic scientists encounter many new problems and have been required to meet higher evidentiary standards in litigation. In view of this, an evidence-based concept should be established in forensic medicine. We should find the most suitable methods in the forensic science field and other related areas to solve specific problems in the evidence-based mode. Evidence-based practice can solve the problems in the legal medical field, and it will play a great role in promoting the progress and development of forensic science. This article reviews the basic theory of evidence-based medicine and its effect, way, method, and evaluation in forensic medicine in order to discuss the application value of forensic evidence-based medicine in computer communication networks.

  11. Computational analysis of protein interaction networks for infectious diseases.

    Science.gov (United States)

    Pan, Archana; Lahiri, Chandrajit; Rajendiran, Anjana; Shanmugham, Buvaneswari

    2016-05-01

    Infectious diseases caused by pathogens, including viruses, bacteria and parasites, pose a serious threat to human health worldwide. Frequent changes in the pattern of infection mechanisms and the emergence of multidrug-resistant strains among pathogens have weakened the current treatment regimen. This necessitates the development of new therapeutic interventions to prevent and control such diseases. To cater to this need, analysis of protein interaction networks (PINs) has gained importance as one of the promising strategies. The present review aims to discuss various computational approaches to analyse PINs in the context of infectious diseases. Topology and modularity analysis of the network with their biological relevance, and the scenario till date about host-pathogen and intra-pathogenic protein interaction studies, are delineated. This would provide useful insights to the research community, thereby enabling them to design novel biomedicine against such infectious diseases. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  12. Reducing Computational Overhead of Network Coding with Intrinsic Information Conveying

    DEFF Research Database (Denmark)

    Heide, Janus; Zhang, Qi; Pedersen, Morten V.

    This paper investigates the possibility of intrinsic information conveying in network coding systems. The information is embedded into the coding vector by constructing the vector based on a set of predefined rules. This information can subsequently be retrieved by any receiver. The starting point is RLNC (Random Linear Network Coding), and the goal is to reduce the amount of coding operations both at the coding and decoding node, and at the same time remove the need for dedicated signaling messages. In a traditional RLNC system, coding operations take up significant computational resources and add to the overall energy consumption, which is particularly problematic for mobile battery-driven devices. In RLNC, coding is performed over a FF (Finite Field). We propose to divide this field into sub fields, and let each sub field signify some information or state. In order to embed the information correctly...

  13. Risks and benefits of social computing as a healthcare tool

    CSIR Research Space (South Africa)

    Mxoli, Avuya

    2016-03-01

    Full Text Available Cybercitizen describes a frequent user of the Internet or in other terms, a member of an online community (cybercommunity). This digital space can be used to participate in educational, economical and cultural activities. Social computing...

  14. Symbolic dynamics and computation in model gene networks.

    Science.gov (United States)

    Edwards, R.; Siegelmann, H. T.; Aziza, K.; Glass, L.

    2001-03-01

    We analyze a class of ordinary differential equations representing a simplified model of a genetic network. In this network, the model genes control the production rates of other genes by a logical function. The dynamics in these equations are represented by a directed graph on an n-dimensional hypercube (n-cube) in which each edge is directed in a unique orientation. The vertices of the n-cube correspond to orthants of state space, and the edges correspond to boundaries between adjacent orthants. The dynamics in these equations can be represented symbolically. Starting from a point on the boundary between neighboring orthants, the equation is integrated until the boundary is crossed for a second time. Each different cycle, corresponding to a different sequence of orthants that are traversed during the integration of the equation always starting on a boundary and ending the first time that same boundary is reached, generates a different letter of the alphabet. A word consists of a sequence of letters corresponding to a possible sequence of orthants that arise from integration of the equation starting and ending on the same boundary. The union of the words defines the language. Letters and words correspond to analytically computable Poincare maps of the equation. This formalism allows us to define bifurcations of chaotic dynamics of the differential equation that correspond to changes in the associated language. Qualitative knowledge about the dynamics found by integrating the equation can be used to help solve the inverse problem of determining the underlying network generating the dynamics. This work places the study of dynamics in genetic networks in a context comprising both nonlinear dynamics and the theory of computation. (c) 2001 American Institute of Physics.
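    The orthant-to-symbol idea can be demonstrated numerically. The sketch below integrates a two-gene Glass-type network (piecewise-constant production, linear decay; the particular switching rules are invented, not taken from the paper) and compresses the trajectory into the sequence of orthant symbols it visits.

        # Record the orthant word of a toy two-gene Glass network trajectory.
        from scipy.integrate import solve_ivp

        def glass_rhs(t, state):
            x, y = state
            fx = 1.0 if y < 0 else -1.0      # x produced while y is "off"
            fy = 1.0 if x > 0 else -1.0      # y produced while x is "on"
            return [fx - x, fy - y]

        sol = solve_ivp(glass_rhs, (0, 30), [0.5, 0.2], max_step=0.01)
        orthants = ["".join("+" if v > 0 else "-" for v in s) for s in sol.y.T]
        word = [o for i, o in enumerate(orthants) if i == 0 or o != orthants[i - 1]]
        print("orthant word:", " ".join(word[:12]))   # e.g. ++ -+ -- +- ++ ...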

  15. Prospectives to tractor cabin design with computational acoustics tools

    OpenAIRE

    Mönkölä, Sanna; Airaksinen, Tuomas; Makkonen, Pekka; Tuovinen, Tero; Neittaanmäki, Pekka

    2011-01-01

    Computational acoustical models allow automated optimization of tractor design with respect to acoustic properties, which could significantly speed up the design process of tractor cabin prototypes. This article gives insightful perspectives on the tractor design process by considering modern computational acoustics technology. A mathematical formulation for a system consisting of a vibrating elastic tractor structure and an air-filled acoustic enclosure is given and a related numerical solution tec...

  16. Triadic Scaffolds: Tools for Teaching English Language Learners with Computers

    Directory of Open Access Journals (Sweden)

    Carla Meskill

    2005-01-01

    Full Text Available Active communication with others is key to human learning. This straightforward premise currently undergirds much theory and research in student learning in general, and in second language and literacy learning in particular. Both of these academic areas have long acknowledged communication's central role in successful learning, with the exact intricacies of instructional conversations and the forms these take having been the focus of close analysis (Cazden, 1988; Gee, 2001; Nystrand, Gamoran, Kachur, & Prendergast, 1997; Tharp & Gallimore, 1991; van Lier, 2000). In this examination of computer-supported classroom discourse, specific forms of instructional conversation employed by a veteran elementary teacher of beginning-level English language learners (ELLs) are examined. The focal teacher orchestrates instructional conversations around computers with children whose immediate needs are to learn the English language, specifically the "language of school" and the concomitant social complexities implied in order to participate in mainstream instructional activity. With these goals shaping language and literacy activity, their ESOL (English for speakers of other languages) teacher makes use of the computer to capture, motivate, and anchor learner attention to, and render comprehensible, the target language they hear and see on and around the computer screen. The anatomy of the activity she orchestrates around the computer and the language she uses to support it -- labeled here as triadic scaffolds -- are the focus of analysis. Forms and functions of triadic discourse (teacher, learner, computer) are examined for their potential unique role in second language and literacy instruction.

  17. Neural networks as a tool for unit commitment

    DEFF Research Database (Denmark)

    Rønne-Hansen, Peter; Rønne-Hansen, Jan

    1991-01-01

    Some of the fundamental problems when solving the power system unit commitment problem by means of neural networks have been attacked. It has been demonstrated for a small example that neural networks might be a viable alternative. Some of the major problems solved in this initiating phase form...

  18. Topological and Geometric Tools for the Analysis of Complex Networks

    Science.gov (United States)

    2013-10-01

    ...faster than the current subgradient techniques for network optimization. Most existing work uses dual decomposition and subgradient methods to solve network optimization problems in a distributed manner... Simulation results illustrate multiple-order-of-magnitude performance gains of this method relative to subgradient methods.

  19. Trajectory Based Optimal Segment Computation in Road Network Databases

    DEFF Research Database (Denmark)

    Li, Xiaohui; Ceikute, Vaida; Jensen, Christian S.

    Finding a location for a new facility such that the facility attracts the maximal number of customers is a challenging problem. Existing studies either model customers as static sites and thus do not consider customer movement, or they focus on theoretical aspects and do not provide solutions that are practical in real settings. This study presents two algorithms that adopt different approaches to computing the query. Algorithm AUG uses graph augmentation, and ITE uses iterative road-network partitioning. Empirical studies with real data sets demonstrate that the algorithms are capable of offering high performance in realistic settings.

  20. Auditory Display as a Tool for Teaching Network Intrusion Detection

    Directory of Open Access Journals (Sweden)

    M.A. Garcia-Ruiz

    2008-06-01

    Full Text Available Teaching network intrusion detection, or NID (the identification of violations of a security policy in a computer network), is a challenging task, because students need to analyze large amounts of data from network logs, in real time, to identify patterns of network attacks, making these activities visually tiring. This paper describes an ongoing research project concerned with designing and applying sounds that represent meaningful information in interfaces (sonification) to support the teaching of NID. A usability test was conducted with engineering students. Natural sound effects (auditory icons) and musical sounds (earcons) were used to represent network attacks. A post-activity questionnaire showed that most students preferred auditory icons for analyzing NID, and all of them were very interested in the design and application of sonifications.

  1. Probabilistic graphs as a conceptual and computational tool in hydrology and water management

    Science.gov (United States)

    Schoups, Gerrit

    2014-05-01

    Originally developed in the fields of machine learning and artificial intelligence, probabilistic graphs constitute a general framework for modeling complex systems in the presence of uncertainty. The framework consists of three components: 1. Representation of the model as a graph (or network), with nodes depicting random variables in the model (e.g. parameters, states, etc), which are joined together by factors. Factors are local probabilistic or deterministic relations between subsets of variables, which, when multiplied together, yield the joint distribution over all variables. 2. Consistent use of probability theory for quantifying uncertainty, relying on basic rules of probability for assimilating data into the model and expressing unknown variables as a function of observations (via the posterior distribution). 3. Efficient, distributed approximation of the posterior distribution using general-purpose algorithms that exploit model structure encoded in the graph. These attributes make probabilistic graphs potentially useful as a conceptual and computational tool in hydrology and water management (and beyond). Conceptually, they can provide a common framework for existing and new probabilistic modeling approaches (e.g. by drawing inspiration from other fields of application), while computationally they can make probabilistic inference feasible in larger hydrological models. The presentation explores, via examples, some of these benefits.
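    The three components listed above fit in a few lines for a discrete toy model. Below, a two-node graph (Rain -> Runoff) is encoded as two factors, and Bayes' rule recovers the posterior over rain after observing high runoff; all numbers are invented.

        # A two-variable probabilistic graph: factors, joint, and a posterior.
        p_rain = {0: 0.7, 1: 0.3}                          # factor over Rain
        p_runoff_given_rain = {(0, 0): 0.9, (0, 1): 0.1,   # factor linking the nodes
                               (1, 0): 0.2, (1, 1): 0.8}

        def joint(rain, runoff):
            """Product of the local factors gives the joint distribution."""
            return p_rain[rain] * p_runoff_given_rain[(rain, runoff)]

        # "Assimilating data": condition on observed runoff = 1.
        unnorm = {rain: joint(rain, 1) for rain in (0, 1)}
        z = sum(unnorm.values())
        print("P(rain | runoff=1) =", round(unnorm[1] / z, 3))   # -> 0.774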

  2. Utah's Regional/Urban ANSS Seismic Network---Strategies and Tools for Quality Performance

    Science.gov (United States)

    Burlacu, R.; Arabasz, W. J.; Pankow, K. L.; Pechmann, J. C.; Drobeck, D. L.; Moeinvaziri, A.; Roberson, P. M.; Rusho, J. A.

    2007-05-01

    The University of Utah's regional/urban seismic network (224 stations recorded: 39 broadband, 87 strong-motion, 98 short-period) has become a model for locally implementing the Advanced National Seismic System (ANSS) because of successes in integrating weak- and strong-motion recording and in developing an effective real-time earthquake information system. Early achievements included implementing ShakeMap, ShakeCast, point-to- multipoint digital telemetry, and an Earthworm Oracle database, as well as in-situ calibration of all broadband and strong-motion stations and submission of all data and metadata into the IRIS DMC. Regarding quality performance, our experience as a medium-size regional network affirms the fundamental importance of basics such as the following: for data acquisition, deliberate attention to high-quality field installations, signal quality, and computer operations; for operational efficiency, a consistent focus on professional project management and human resources; and for customer service, healthy partnerships---including constant interactions with emergency managers, engineers, public policy-makers, and other stakeholders as part of an effective state earthquake program. (Operational cost efficiencies almost invariably involve trade-offs between personnel costs and the quality of hardware and software.) Software tools that we currently rely on for quality performance include those developed by UUSS (e.g., SAC and shell scripts for estimating local magnitudes) and software developed by other organizations such as: USGS (Earthworm), University of Washington (interactive analysis software), ISTI (SeisNetWatch), and IRIS (PDCC, BUD tools). Although there are many pieces, there is little integration. One of the main challenges we face is the availability of a complete and coherent set of tools for automatic and post-processing to assist in achieving the goals/requirements set forth by ANSS. Taking our own network---and ANSS---to the next level

  3. GPU-FS-kNN: a software tool for fast and scalable kNN computation using GPUs.

    Directory of Open Access Journals (Sweden)

    Ahmed Shamsul Arefin

    Full Text Available BACKGROUND: The analysis of biological networks has become a major challenge due to the recent development of high-throughput techniques that are rapidly producing very large data sets. The exploding volumes of biological data are craving for extreme computational power and special computing facilities (i.e. super-computers). An inexpensive solution, such as General Purpose computation based on Graphics Processing Units (GPGPU), can be adapted to tackle this challenge, but the limitation of the device internal memory can pose a new problem of scalability. An efficient data and computational parallelism with partitioning is required to provide a fast and scalable solution to this problem. RESULTS: We propose an efficient parallel formulation of the k-Nearest Neighbour (kNN) search problem, which is a popular method for classifying objects in several fields of research, such as pattern recognition, machine learning and bioinformatics. Being very simple and straightforward, the performance of the kNN search degrades dramatically for large data sets, since the task is computationally intensive. The proposed approach is not only fast but also scalable to large-scale instances. Based on our approach, we implemented a software tool GPU-FS-kNN (GPU-based Fast and Scalable k-Nearest Neighbour) for CUDA enabled GPUs. The basic approach is simple and adaptable to other available GPU architectures. We observed speed-ups of 50-60 times compared with CPU implementation on a well-known breast microarray study and its associated data sets. CONCLUSION: Our GPU-based Fast and Scalable k-Nearest Neighbour search technique (GPU-FS-kNN) provides a significant performance improvement for nearest neighbour computation in large-scale networks. Source code and the software tool are available under GNU Public License (GPL) at https://sourceforge.net/p/gpufsknn/.
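    For contrast with the GPU implementation, here is the CPU baseline the paper accelerates: brute-force kNN through a full pairwise distance matrix. The O(n^2) memory and time of this step are what motivate the paper's partitioning and GPU offload; the data below are random stand-ins for an expression matrix.

        # Brute-force kNN via squared Euclidean distances (CPU reference).
        import numpy as np

        def knn_brute_force(data, k):
            """Indices of the k nearest neighbours of each row (self excluded)."""
            sq = np.sum(data ** 2, axis=1)
            d2 = sq[:, None] + sq[None, :] - 2.0 * data @ data.T
            np.fill_diagonal(d2, np.inf)          # exclude self-matches
            return np.argsort(d2, axis=1)[:, :k]  # k smallest per row

        rng = np.random.default_rng(0)
        expr = rng.normal(size=(200, 30))         # e.g. 200 genes x 30 samples
        print(knn_brute_force(expr, k=5)[:3])     # neighbours of the first 3 rows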

  4. NML Computation Algorithms for Tree-Structured Multinomial Bayesian Networks

    Directory of Open Access Journals (Sweden)

    Kontkanen Petri

    2007-01-01

    Full Text Available Typical problems in bioinformatics involve large discrete datasets. Therefore, in order to apply statistical methods in such domains, it is important to develop efficient algorithms suitable for discrete data. The minimum description length (MDL) principle is a theoretically well-founded, general framework for performing statistical inference. The mathematical formalization of MDL is based on the normalized maximum likelihood (NML) distribution, which has several desirable theoretical properties. In the case of discrete data, straightforward computation of the NML distribution requires exponential time with respect to the sample size, since the definition involves a sum over all the possible data samples of a fixed size. In this paper, we first review some existing algorithms for efficient NML computation in the case of multinomial and naive Bayes model families. Then we proceed by extending these algorithms to more complex, tree-structured Bayesian networks.
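    As a flavor of the multinomial algorithms the paper reviews, the sketch below computes the NML normalizing sum C(K, n) with the recurrence C_1(n) = 1, C_2(n) given by a binomial sum, and C_{k+2}(n) = C_{k+1}(n) + (n/k) C_k(n), which is the linear-time result from this line of work as best I recall it; verify against the paper before relying on it.

        # Multinomial NML normalizing constant via the (recalled) linear recurrence.
        from math import comb

        def multinomial_nml_constant(K, n):
            c_prev = 1.0                                      # C_1(n)
            c_curr = sum(comb(n, h) * (h / n) ** h * ((n - h) / n) ** (n - h)
                         for h in range(n + 1))               # C_2(n); note 0**0 == 1
            if K == 1:
                return c_prev
            for k in range(1, K - 1):                         # lift up to C_K(n)
                c_prev, c_curr = c_curr, c_curr + (n / k) * c_prev
            return c_curr

        print(multinomial_nml_constant(K=4, n=100))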

  5. Evaluating tablet computers as a survey tool in rural communities.

    Science.gov (United States)

    Newell, Steve M; Logan, Henrietta L; Guo, Yi; Marks, John G; Shepperd, James A

    2015-01-01

    Although tablet computers offer advantages in data collection over traditional paper-and-pencil methods, little research has examined whether the 2 formats yield similar responses, especially with underserved populations. We compared the 2 survey formats and tested whether participants' responses to common health questionnaires or perceptions of usability differed by survey format. We also tested whether we could replicate established paper-and-pencil findings via tablet computer. We recruited a sample of low-income community members living in the rural southern United States. Participants were 170 residents (black = 49%; white = 36%; other races and missing data = 15%) drawn from 2 counties meeting Florida's state statutory definition of rural with 100 persons or fewer per square mile. We randomly assigned participants to complete scales (Center for Epidemiologic Studies Depression Inventory and Regulatory Focus Questionnaire) along with survey format usability ratings via paper-and-pencil or tablet computer. All participants rated a series of previously validated posters using a tablet computer. Finally, participants completed comparisons of the survey formats and reported survey format preferences. Participants preferred using the tablet computer and showed no significant differences between formats in mean responses, scale reliabilities, or in participants' usability ratings. Overall, participants reported similar scales responses and usability ratings between formats. However, participants reported both preferring and enjoying responding via tablet computer more. Collectively, these findings are among the first data to show that tablet computers represent a suitable substitute among an underrepresented rural sample for paper-and-pencil methodology in survey research. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.

  6. P2PStudio - Monitoring, Controlling and Visualization Tool for Peer-to-Peer Networks Research

    OpenAIRE

    Kotilainen, Niko; Vapa, Mikko; Auvinen, Annemari; Weber, Matthieu; Vuori, Jarkko

    2006-01-01

    Peer-to-Peer Studio has been developed as a monitoring, controlling and visualization tool for peer-to-peer networks. It uses a centralized architecture to gather events from a peer-to-peer network and can be used to visualize network topology and to send different commands to individual peer-to-peer nodes. The tool has been used with the Chedar peer-to-peer network to study the behavior of different peer-to-peer resource discovery and topology management algorithms and for visualizing the result...

  7. A scalable computational framework for establishing long-term behavior of stochastic reaction networks.

    Directory of Open Access Journals (Sweden)

    Ankit Gupta

    2014-06-01

    Full Text Available Reaction networks are systems in which the populations of a finite number of species evolve through predefined interactions. Such networks are found as modeling tools in many biological disciplines such as biochemistry, ecology, epidemiology, immunology, systems biology and synthetic biology. It is now well-established that, for small population sizes, stochastic models for biochemical reaction networks are necessary to capture randomness in the interactions. The tools for analyzing such models, however, still lag far behind their deterministic counterparts. In this paper, we bridge this gap by developing a constructive framework for examining the long-term behavior and stability properties of the reaction dynamics in a stochastic setting. In particular, we address the problems of determining ergodicity of the reaction dynamics, which is analogous to having a globally attracting fixed point for deterministic dynamics. We also examine when the statistical moments of the underlying process remain bounded with time and when they converge to their steady state values. The framework we develop relies on a blend of ideas from probability theory, linear algebra and optimization theory. We demonstrate that the stability properties of a wide class of biological networks can be assessed from our sufficient theoretical conditions that can be recast as efficient and scalable linear programs, well-known for their tractability. It is notably shown that the computational complexity is often linear in the number of species. We illustrate the validity, the efficiency and the wide applicability of our results on several reaction networks arising in biochemistry, systems biology, epidemiology and ecology. The biological implications of the results as well as an example of a non-ergodic biological network are also discussed.

  8. A scalable computational framework for establishing long-term behavior of stochastic reaction networks.

    Science.gov (United States)

    Gupta, Ankit; Briat, Corentin; Khammash, Mustafa

    2014-06-01

    Reaction networks are systems in which the populations of a finite number of species evolve through predefined interactions. Such networks are found as modeling tools in many biological disciplines such as biochemistry, ecology, epidemiology, immunology, systems biology and synthetic biology. It is now well-established that, for small population sizes, stochastic models for biochemical reaction networks are necessary to capture randomness in the interactions. The tools for analyzing such models, however, still lag far behind their deterministic counterparts. In this paper, we bridge this gap by developing a constructive framework for examining the long-term behavior and stability properties of the reaction dynamics in a stochastic setting. In particular, we address the problems of determining ergodicity of the reaction dynamics, which is analogous to having a globally attracting fixed point for deterministic dynamics. We also examine when the statistical moments of the underlying process remain bounded with time and when they converge to their steady state values. The framework we develop relies on a blend of ideas from probability theory, linear algebra and optimization theory. We demonstrate that the stability properties of a wide class of biological networks can be assessed from our sufficient theoretical conditions that can be recast as efficient and scalable linear programs, well-known for their tractability. It is notably shown that the computational complexity is often linear in the number of species. We illustrate the validity, the efficiency and the wide applicability of our results on several reaction networks arising in biochemistry, systems biology, epidemiology and ecology. The biological implications of the results as well as an example of a non-ergodic biological network are also discussed.

  9. A Scalable Computational Framework for Establishing Long-Term Behavior of Stochastic Reaction Networks

    Science.gov (United States)

    Khammash, Mustafa

    2014-01-01

    Reaction networks are systems in which the populations of a finite number of species evolve through predefined interactions. Such networks are found as modeling tools in many biological disciplines such as biochemistry, ecology, epidemiology, immunology, systems biology and synthetic biology. It is now well-established that, for small population sizes, stochastic models for biochemical reaction networks are necessary to capture randomness in the interactions. The tools for analyzing such models, however, still lag far behind their deterministic counterparts. In this paper, we bridge this gap by developing a constructive framework for examining the long-term behavior and stability properties of the reaction dynamics in a stochastic setting. In particular, we address the problems of determining ergodicity of the reaction dynamics, which is analogous to having a globally attracting fixed point for deterministic dynamics. We also examine when the statistical moments of the underlying process remain bounded with time and when they converge to their steady state values. The framework we develop relies on a blend of ideas from probability theory, linear algebra and optimization theory. We demonstrate that the stability properties of a wide class of biological networks can be assessed from our sufficient theoretical conditions that can be recast as efficient and scalable linear programs, well-known for their tractability. It is notably shown that the computational complexity is often linear in the number of species. We illustrate the validity, the efficiency and the wide applicability of our results on several reaction networks arising in biochemistry, systems biology, epidemiology and ecology. The biological implications of the results as well as an example of a non-ergodic biological network are also discussed. PMID:24968191
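
    The linear-programming machinery of the paper does not compress into a few lines, but the object it analyzes, the long-run behavior of a stochastic reaction network, can be illustrated with Gillespie's stochastic simulation algorithm on a minimal birth-death network (0 -> X at rate k1, X -> 0 at rate k2*x), a textbook ergodic example whose stationary distribution is Poisson with mean k1/k2. The rates, horizon, and seed below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def gillespie_birth_death(k1=10.0, k2=1.0, x0=0, t_end=50.0, rng=None):
    """Gillespie SSA for the birth-death network: 0 -> X (rate k1), X -> 0 (rate k2*x).

    Returns the event times and the population trajectory.
    """
    rng = rng or np.random.default_rng(0)
    t, x = 0.0, x0
    times, states = [t], [x]
    while t < t_end:
        birth, death = k1, k2 * x
        total = birth + death                   # total propensity (always > 0 here)
        t += rng.exponential(1.0 / total)       # waiting time to the next reaction
        x += 1 if rng.random() < birth / total else -1
        times.append(t)
        states.append(x)
    return np.array(times), np.array(states)

times, states = gillespie_birth_death()
# For this ergodic network the long-run distribution is Poisson(k1/k2),
# so the time-averaged population should settle near k1/k2 = 10.
print("late-time average population ~", states[len(states) // 2:].mean())
```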

  10. Combining MLP and Using Decision Tree in Order to Detect the Intrusion into Computer Networks

    OpenAIRE

    Saba Sedigh Rad; Alireza Zebarjad

    2013-01-01

    The security of computer networks plays an important role in computer systems. The increasing use of computer networks brings attempts to penetrate and damage systems during their operation. So, in order to keep systems away from these hazards, it is essential to use an intrusion detection system (IDS). Intrusion detection is performed to detect illicit use and misuse, and to avoid damage to systems and computer networks by both external and internal intruders. Intrusi...

  11. Systematic Methods and Tools for Computer Aided Modelling

    DEFF Research Database (Denmark)

    Fedorova, Marina

    Models are playing important roles in design and analysis of chemicals/bio-chemicals based products and the processes that manufacture them. Model-based methods and tools have the potential to decrease the number of experiments, which can be expensive and time consuming, and point to candidates… aided methods and tools, that include procedures to perform model translation, model analysis, model verification/validation, model solution and model documentation; 4) model transfer – export/import to/from other applications for further extension and application – several types of formats, such as XML…

  12. Safe manning of merchant ships: an approach and computer tool

    DEFF Research Database (Denmark)

    Alapetite, Alexandre; Kozin, Igor

    2017-01-01

    In the shipping industry, staffing expenses have become a vital competition parameter. In this paper, an approach and a software tool are presented to support decisions on the staffing of merchant ships. The tool is implemented in the form of a Web user interface that makes use of discrete-event simulation and allows estimation of the workload and of whether different scenarios are successfully performed, taking account of the number of crewmembers, watch schedules, distribution of competencies, and others. The software library ‘SimManning’ at the core of the project is provided as open source…

  13. ATLAS Distributed Computing Monitoring tools during the LHC Run I

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Di Girolamo, A; Jezequel, S; Ueda, I; Wenaus, T

    2014-01-01

    This contribution summarizes the evolution of the ATLAS Distributed Computing (ADC) Monitoring project during LHC Run I. ADC Monitoring targets three groups of customers: the ADC Operations team, to identify malfunctions early and escalate issues to an activity or service expert; ATLAS national contacts and sites, for real-time monitoring and long-term measurement of the performance of the provided computing resources; and ATLAS Management, for long-term trends and accounting information about ATLAS Distributed Computing resources. During LHC Run I a significant development effort was invested in the standardization of the monitoring and accounting applications in order to provide an extensive monitoring and accounting suite. ADC Monitoring applications separate the data layer and the visualization layer. The data layer exposes data in a predefined format. The visualization layer is designed bearing in mind the visual identity of the provided graphical elements, and re-usability of the visua...

  14. ATLAS Distributed Computing Monitoring tools during the LHC Run I

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Di Girolamo, A; Jezequel, S; Ueda, I; Wenaus, T

    2013-01-01

    This contribution summarizes the evolution of the ATLAS Distributed Computing (ADC) Monitoring project during LHC Run I. ADC Monitoring targets three groups of customers: the ADC Operations team, to identify malfunctions early and escalate issues to an activity or service expert; ATLAS national contacts and sites, for real-time monitoring and long-term measurement of the performance of the provided computing resources; and ATLAS Management, for long-term trends and accounting information about ATLAS Distributed Computing resources. During LHC Run I a significant development effort was invested in the standardization of the monitoring and accounting applications in order to provide an extensive monitoring and accounting suite. ADC Monitoring applications separate the data layer and the visualization layer. The data layer exposes data in a predefined format. The visualization layer is designed bearing in mind the visual identity of the provided graphical elements, and re-usability of the visua...

  15. Computer Cartography--A New Tool for Institutional Planning.

    Science.gov (United States)

    Creswell, John W.; Self, Burl E.

    This paper discusses the Synagraphic Mapping System (SYMAP), a computerized cartographic planning tool, and describes its use by a large metropolitan junior college. The authors offer a brief introductory description of SYMAP and then discuss three possible uses of the system, illustrating their discussion with sample SYMAP-generated map displays.…

  16. Understanding Computation of Impulse Response in Microwave Software Tools

    Science.gov (United States)

    Potrebic, Milka M.; Tosic, Dejan V.; Pejovic, Predrag V.

    2010-01-01

    In modern microwave engineering curricula, the introduction of the many new topics in microwave industrial development, or of software tools for design and simulation, sometimes results in students having an inadequate understanding of the fundamental theory. The terminology for and the explanation of algorithms for calculating impulse response in…

  17. Enhancing interest in statistics among computer science students using computer tool entrepreneur role play

    Science.gov (United States)

    Judi, Hairulliza Mohamad; Sahari @ Ashari, Noraidah; Eksan, Zanaton Hj

    2017-04-01

    Previous research in Malaysia indicates that there is a problem regarding attitudes towards statistics among students. Students did not show positive attitudes in the affective, cognitive, capability, value, interest, and effort aspects, although they did well on the difficulty aspect. This issue should be given substantial attention because students' attitudes towards statistics may affect the teaching and learning process of the subject. Teaching statistics using role play is an appropriate attempt to improve attitudes towards statistics, to enhance the learning of statistical techniques and statistical thinking, and to increase generic skills. The objectives of the paper are to give an overview of role play in statistics learning and to assess the effect of these activities on students' attitude and learning within an action research framework. The computer tool entrepreneur role play was conducted in a two-hour tutorial class session of first-year students in the Faculty of Information Sciences and Technology (FTSM), Universiti Kebangsaan Malaysia, enrolled in the Probability and Statistics course. The results show that most students felt that they had an enjoyable and great time in the role play. Furthermore, benefits and disadvantages of the role play activities are highlighted to complete the review. Role play is expected to serve as an important activity that takes into account students' experiences, emotions, and responses to provide useful information on how to modify students' thinking or behavior to improve learning.

  18. 15th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing

    CERN Document Server

    2015-01-01

    This edited book presents scientific results of 15th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD 2014) held on June 30 – July 2, 2014 in Las Vegas Nevada, USA. The aim of this conference was to bring together scientists, engineers, computer users, and students to share their experiences and exchange new ideas, research results about all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them. The conference organizers selected the 13 outstanding papers from those papers accepted for presentation at the conference.

  19. 14th ACIS/IEEE International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing

    CERN Document Server

    Studies in Computational Intelligence : Volume 492

    2013-01-01

    This edited book presents scientific results of the 14th ACIS/IEEE International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD 2013), held in Honolulu, Hawaii, USA on July 1-3, 2013. The aim of this conference was to bring together scientists, engineers, computer users, and students to share their experiences and exchange new ideas, research results about all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them. The conference organizers selected the 17 outstanding papers from those papers accepted for presentation at the conference.  

  20. STARNET 2: a web-based tool for accelerating discovery of gene regulatory networks using microarray co-expression data

    Directory of Open Access Journals (Sweden)

    VanBuren Vincent

    2009-10-01

    Full Text Available Abstract Background Although expression microarrays have become a standard tool used by biologists, analysis of data produced by microarray experiments may still present challenges. Comparison of data from different platforms, organisms, and labs may involve complicated data processing, and inferring relationships between genes remains difficult. Results STARNET 2 is a new web-based tool that allows post hoc visual analysis of correlations that are derived from expression microarray data. STARNET 2 facilitates user discovery of putative gene regulatory networks in a variety of species (human, rat, mouse, chicken, zebrafish, Drosophila, C. elegans, S. cerevisiae, Arabidopsis and rice by graphing networks of genes that are closely co-expressed across a large heterogeneous set of preselected microarray experiments. For each of the represented organisms, raw microarray data were retrieved from NCBI's Gene Expression Omnibus for a selected Affymetrix platform. All pairwise Pearson correlation coefficients were computed for expression profiles measured on each platform, respectively. These precompiled results were stored in a MySQL database, and supplemented by additional data retrieved from NCBI. A web-based tool allows user-specified queries of the database, centered at a gene of interest. The result of a query includes graphs of correlation networks, graphs of known interactions involving genes and gene products that are present in the correlation networks, and initial statistical analyses. Two analyses may be performed in parallel to compare networks, which is facilitated by the new HEATSEEKER module. Conclusion STARNET 2 is a useful tool for developing new hypotheses about regulatory relationships between genes and gene products, and has coverage for 10 species. Interpretation of the correlation networks is supported with a database of previously documented interactions, a test for enrichment of Gene Ontology terms, and heat maps of correlation
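
    The core precomputation behind a tool like STARNET 2, all pairwise Pearson correlation coefficients over an expression matrix followed by thresholding around a gene of interest, can be sketched in a few lines. The toy matrix, gene names, and cutoff below are illustrative assumptions; the actual tool works on curated Affymetrix data stored in MySQL.

```python
import numpy as np

# Toy expression matrix: rows = genes, columns = microarray experiments.
rng = np.random.default_rng(1)
expr = rng.normal(size=(5, 40))
expr[1] = expr[0] + 0.1 * rng.normal(size=40)   # make gene 1 co-expressed with gene 0
genes = ["geneA", "geneB", "geneC", "geneD", "geneE"]

corr = np.corrcoef(expr)                        # all pairwise Pearson coefficients

# Build the edges of a correlation network centered at a gene of interest.
center, threshold = 0, 0.8
edges = [(genes[center], genes[j], round(corr[center, j], 3))
         for j in range(len(genes))
         if j != center and abs(corr[center, j]) >= threshold]
print(edges)   # e.g. [('geneA', 'geneB', 0.995)]
```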

  1. Implications of computer networking and the Internet for nurse education.

    Science.gov (United States)

    Ward, R

    1997-06-01

    This paper sets out the history of computer networking and its use in nursing and health care education, and places this in its wider historical and social context. The increasing availability and use of computer networks and the Internet are producing a changing climate in education as well as in health care. Moves away from traditional face-to-face teaching within a campus institution to widely distributed interactive multimedia learning will affect the roles of students and teachers. The use of electronic mail, mailing lists, and the World Wide Web is specifically considered, along with changes to library and information management skills, research methods, journal publication, and the like. Issues about the quality, as well as the quantity, of available information are considered. As more and more organizations and institutions begin to use electronic communication methods, it becomes an increasingly important part of the curriculum at all levels, and may lead to fundamental changes in geographical and professional boundaries. A glossary of terms is provided for those not familiar with the technology, along with contact details for the mailing lists and World Wide Web pages mentioned.

  2. Neural Networks as a Tool for Georadar Data Processing

    Directory of Open Access Journals (Sweden)

    Szymczyk Piotr

    2015-12-01

    Full Text Available In this article a new neural network based method for automatic classification of ground penetrating radar (GPR traces is proposed. The presented approach is based on a new representation of GPR signals by polynomials approximation. The coefficients of the polynomial (the feature vector are neural network inputs for automatic classification of a special kind of geologic structure—a sinkhole. The analysis and results show that the classifier can effectively distinguish sinkholes from other geologic structures.
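
    A minimal sketch of the feature-extraction step described above: fit a polynomial to each GPR trace and use its coefficients as the classifier's input vector. The polynomial degree and the synthetic trace are assumptions for illustration, not values from the article, and the downstream neural network is omitted.

```python
import numpy as np

def trace_features(trace, degree=8):
    """Fit a polynomial to one GPR trace (amplitude vs. sample index)
    and return its coefficients as a compact feature vector."""
    t = np.linspace(-1.0, 1.0, len(trace))      # normalized time axis
    return np.polyfit(t, trace, deg=degree)     # degree + 1 coefficients

# Toy trace: a damped oscillation standing in for a real A-scan.
t = np.linspace(0, 1, 512)
trace = np.exp(-4 * t) * np.sin(60 * t)
features = trace_features(trace)
print(features.shape)   # (9,) -- fixed-length input for a neural classifier
```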

  3. Social networking as an advertising tool in Russia and abroad

    Directory of Open Access Journals (Sweden)

    Ageeva Y. A.

    2016-01-01

    Full Text Available This study contrasts the behavioural patterns of users on Facebook with those on VKontakte using data collected by Facebook and a survey of Russian VKontakte users. The authors analyse the key differences between the two popular social networks, including what users perceived to be the most attractive options, the amount of time spent online and attitudes toward advertising. The results have been used to evaluate the potential of social networks (SMM for business promotion in Russia.

  4. Optimization of stochastic discrete systems and control on complex networks computational networks

    CERN Document Server

    Lozovanu, Dmitrii

    2014-01-01

    This book presents the latest findings on stochastic dynamic programming models and on solving optimal control problems in networks. It includes the authors' new findings on determining the optimal solution of discrete optimal control problems in networks and on solving game variants of Markov decision problems in the context of computational networks. First, the book studies the finite state space of Markov processes and reviews the existing methods and algorithms for determining the main characteristics in Markov chains, before proposing new approaches based on dynamic programming and combinatorial methods. Chapter two is dedicated to infinite horizon stochastic discrete optimal control models and Markov decision problems with average and expected total discounted optimization criteria, while Chapter three develops a special game-theoretical approach to Markov decision processes and stochastic discrete optimal control problems. In closing, the book's final chapter is devoted to finite horizon stochastic con...

  5. The Computer as a Tool for Learning through Reflection.

    Science.gov (United States)

    1986-03-01

    the reasons why he made certain choices (see Bundy, 1983), a simple and rewarding exercise if the annotation menu has built into it strategic terms...

  6. Coordinated computer-supported collaborative learning: Awareness and awareness tools

    NARCIS (Netherlands)

    Janssen, J.J.H.M.|info:eu-repo/dai/nl/242063667; Bodermer, D.

    2013-01-01

    Traditionally, research on awareness during online collaboration focused on topics such as the effects of spatial information about group members’ activities on the collaborative process. When the concept of awareness was introduced to computer-supported collaborative learning, this focus shifted to

  7. A Tool for Measuring and Analyzing End User Computing Abilities.

    Science.gov (United States)

    Cheney, Paul H.; Nelson, R. Ryan

    1988-01-01

    Discusses the need for a method to measure computer user abilities, and presents an instrument that identifies individuals' technical, modeling, and applications skills. An evaluation of the instrument in terms of validity and reliability is reported, and applications in user education research are suggested. (12 references) (CLB)

  8. Development of Computer Aided Database Design and Maintenance Tools.

    Science.gov (United States)

    1984-12-01

    * PROCEDURE HIGH-LOW BEGIN USE QUICK SORT METHOD FOUND IN FUNDAMENTALS OF DATA STRUCTURES, HOROWITZ and SAHNI, COMPUTER SCIENCE PRESS, 1976, pp...

  9. Computer modelling as a tool for understanding language evolution

    NARCIS (Netherlands)

    de Boer, Bart; Gontier, N; VanBendegem, JP; Aerts, D

    2006-01-01

    This paper describes the uses of computer models in studying the evolution of language. Language is a complex dynamic system that can be studied at the level of the individual and at the level of the population. Much of the dynamics of language evolution and language change occur because of the

  10. Computer-based Training: A Tool of Sexual Harassment Policy.

    Science.gov (United States)

    Wellbrock, Richard D.

    1999-01-01

    Summarizes the concepts and lead issues involved in the need for a well-stated sexual harassment policy in a community college environment. Links computer-based training to successful policy implementation and provides an overview of software designed to instruct employees about an institution's sexual harassment policies. (Contains 17…

  11. Computer Vision Tools for Finding Images and Video Sequences.

    Science.gov (United States)

    Forsyth, D. A.

    1999-01-01

    Computer vision offers a variety of techniques for searching for pictures in large collections of images. Appearance methods compare images based on the overall content of the image using certain criteria. Finding methods concentrate on matching subparts of images, defined in a variety of ways, in hope of finding particular objects. These ideas…

  12. Adaption of computers in Dutch Museums: interpreting the new tool

    NARCIS (Netherlands)

    Navarrete, Trilce

    2015-01-01

    The adoption of computers in Dutch museums has been marked by the changing technology as much as by the interpretation of what the technology is meant to do. The Social Construction of Technology framework is used to review the adoption of a digital work method and to highlight the

  13. Computer Aided Model Development for Automatic Tool Wear ...

    African Journals Online (AJOL)

    The pre-processing operations on the images (taken on photographic cards) included scanning, in order to transfer them onto a computer and convert them to digital images. Thresholding and segmentation were done in order to convert the altered background of the scanned images to a pure white background; the images were ...

  14. Eye tracking using artificial neural networks for human computer interaction.

    Science.gov (United States)

    Demjén, E; Aboši, V; Tomori, Z

    2011-01-01

    This paper describes an ongoing project that aims to develop a low-cost application to replace a computer mouse for people with physical impairment. The application is based on an eye tracking algorithm and assumes that the camera and the head position are fixed. Color tracking and template matching methods are used for pupil detection. Calibration is provided by neural networks as well as by parametric interpolation methods. The neural networks use back-propagation for learning, and the bipolar sigmoid function is chosen as the activation function. The user's eye is scanned with a simple web camera with backlight compensation, which is attached to a head fixation device. Neural networks significantly outperform parametric interpolation techniques: 1) the calibration procedure is faster as they require fewer calibration marks and 2) cursor control is more precise. The system in its current stage of development is able to distinguish regions at least on the level of desktop icons. The main limitation of the proposed method is the lack of head-pose invariance and its relative sensitivity to illumination (especially to incidental pupil reflections).
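
    The calibration step described above, a back-propagation network with a bipolar sigmoid (tanh) activation mapping pupil coordinates to screen coordinates, can be sketched with a toy dataset. The layer size, learning rate, and synthetic screen mapping are illustrative assumptions, not the authors' configuration.

```python
import numpy as np

# Minimal calibration network: pupil position (x, y) -> screen position (x, y).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 2))            # pupil coords at calibration marks
Y = 0.8 * X + 0.1 * X**2                        # toy nonlinear screen mapping

W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 2)); b2 = np.zeros(2)

lr = 0.05
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)                    # bipolar sigmoid hidden layer
    P = H @ W2 + b2                             # linear output layer
    err = P - Y
    dW2 = H.T @ err / len(X); db2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H**2)              # tanh derivative
    dW1 = X.T @ dH / len(X); db1 = dH.mean(0)
    W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2

print("mean calibration error:", np.abs(P - Y).mean())
```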

  15. Optimization on a Network-based Parallel Computer System for Supersonic Laminar Wing Design

    Science.gov (United States)

    Garcia, Joseph A.; Cheung, Samson; Holst, Terry L. (Technical Monitor)

    1995-01-01

    A set of Computational Fluid Dynamics (CFD) routines and flow transition prediction tools are integrated into a network based parallel numerical optimization routine. Through this optimization routine, the design of a 2-D airfoil and an infinitely swept wing will be studied in order to advance the design cycle capability of supersonic laminar flow wings. The goal of advancing supersonic laminar flow wing design is achieved by wisely choosing the design variables used in the optimization routine. The design variables are represented by the theory of Fourier series and potential theory. These theories, combined with the parallel CFD flow routines and flow transition prediction tools, provide a design space for a global optimal point to be searched. Finally, the parallel optimization routine enables gradient evaluations to be performed in a fast and parallel fashion.

  16. Social Networking as a Tool for Lifelong Learning with Orthopedically Impaired Learners

    National Research Council Canada - National Science Library

    Metin Ersoy; Ahmet Güneyli

    2016-01-01

      This paper discusses how Turkish Cypriot orthopedically impaired learners who are living in North Cyprus use social networking as a tool for leisure and education, and to what extent they satisfy...

  17. FILTSoft: A computational tool for microstrip planar filter design

    Science.gov (United States)

    Elsayed, M. H.; Abidin, Z. Z.; Dahlan, S. H.; Cholan N., A.; Ngu, Xavier T. I.; Majid, H. A.

    2017-09-01

    Filters are a key component of any communication system, used to control spectrum and suppress interference. Designing a filter involves a long process as well as a good understanding of the underlying hardware technology. Hence this paper introduces an automated design tool based on a Matlab GUI, called FILTSoft (an acronym for Filter Design Software), to ease the process. FILTSoft is a user-friendly filter design tool to aid, guide, and expedite calculations from the lumped-element level to the microstrip structure. Users just have to provide the required filter specifications as well as the material description. FILTSoft will calculate and display the lumped-element details, the planar filter structure, and the expected filter response. An example of a lowpass filter design was calculated using FILTSoft and the results were validated through prototype measurement for comparison purposes.
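
    FILTSoft's internals are not given here, but the lumped-element stage of a lowpass design of this kind conventionally starts from the Butterworth prototype values g_k = 2 sin((2k-1)π/(2n)), scaled to the design impedance and cutoff frequency. A minimal sketch, with an assumed order, cutoff, and impedance:

```python
import numpy as np

def butterworth_lowpass(n=5, f_c=2.4e9, z0=50.0):
    """LC ladder for an nth-order Butterworth lowpass prototype,
    impedance- and frequency-scaled (first-element-shunt-C convention)."""
    wc = 2 * np.pi * f_c
    g = [2 * np.sin((2 * k - 1) * np.pi / (2 * n)) for k in range(1, n + 1)]
    elements = []
    for k, gk in enumerate(g, start=1):
        if k % 2 == 1:
            elements.append(("C_shunt", gk / (z0 * wc)))   # farads
        else:
            elements.append(("L_series", gk * z0 / wc))    # henries
    return elements

for name, value in butterworth_lowpass():
    print(f"{name}: {value:.3e}")
```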

  18. Innovative education networking aimed at multimedia tools for geometrical optics learning

    Science.gov (United States)

    García-Martínez, P.; Zapata-Rodríguez, C. J.; Ferreira, C.; Fernández, I.; Pastor, D.; Nasenpour, M.; Moreno, I.; Sánchez-López, M. M.; Espinosa, J.; Mas, D.; Miret, J. J.

    2015-10-01

    We present a purposeful initiative to open new ground for teaching Geometrical Optics. It is based on the creation of an innovative education network involving academic staff from three Spanish universities linked together around Optics. Nowadays, students demand online resources such as innovative multimedia tools to complement the understanding of their studies. Geometrical Optics relies on the basics of light phenomena like reflection and refraction and the use of simple optical elements such as mirrors, prisms, lenses, and fibers. The mathematical treatment is simple and the equations are not too complicated. But from our long experience in teaching undergraduate students, we realize that these students miss important concepts because they do not practice ray tracing as they should. Moreover, the Geometrical Optics laboratory is crucial, providing many short Optics experiments and thus stimulating students' interest in the study of such a topic. Multimedia applications help teachers meet these student demands. In that sense, our educational network shares and develops online materials based on 1) video tutorials of laboratory experiences and of ray-tracing exercises, 2) different online platforms for student self-examination and 3) computer-assisted geometrical optics exercises. This will result in interesting educational synergies and promote student autonomy in learning Optics.

  19. Tools for Structured Matrix Computations : Stratifications and Coupled Sylvester Equations

    OpenAIRE

    Dmytryshyn, Andrii

    2015-01-01

    Developing theory, algorithms, and software tools for analyzing matrix pencils whose matrices have various structures is a contemporary research problem. Such matrices often come from discretizations of systems of differential-algebraic equations. Therefore, preserving the structures in the simulations as well as during the analyses of the mathematical models typically means respecting their physical meanings and may be crucial for the applications. This leads to a fast development of st...

  20. Sequential computation of elementary modes and minimal cut sets in genome-scale metabolic networks using alternate integer linear programming

    Energy Technology Data Exchange (ETDEWEB)

    Song, Hyun-Seob; Goldberg, Noam; Mahajan, Ashutosh; Ramkrishna, Doraiswami

    2017-03-27

    Elementary (flux) modes (EMs) have served as a valuable tool for investigating structural and functional properties of metabolic networks. Identification of the full set of EMs in genome-scale networks remains challenging due to the combinatorial explosion of EMs in complex networks. Often, however, only a small subset of relevant EMs needs to be known, for which optimization-based sequential computation is a useful alternative. Most of the currently available methods along this line are based on the iterative use of mixed-integer linear programming (MILP), the effectiveness of which deteriorates significantly as the number of iterations builds up. To alleviate the computational burden associated with the MILP implementation, we here present a novel optimization algorithm termed alternate integer linear programming (AILP). Results: Our algorithm was designed to iteratively solve a pair of integer programming (IP) and linear programming (LP) problems to compute EMs in a sequential manner. In each step, the IP identifies a minimal subset of reactions whose deletion disables all previously identified EMs. Thus, a subsequent LP solution subject to this reaction-deletion constraint becomes a distinct EM. In cases where no feasible LP solution is available, IP-derived reaction deletion sets represent minimal cut sets (MCSs). Despite the additional computation of MCSs, AILP reduced the time needed to compute EMs by orders of magnitude. The proposed AILP algorithm not only offers a computational advantage in the EM analysis of genome-scale networks, but also improves the understanding of the linkage between EMs and MCSs.
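
    The LP half of the IP/LP pair can be sketched with SciPy: given the stoichiometric matrix S and the set of reactions zeroed out by the preceding IP step, look for a nonzero flux v >= 0 with S v = 0. The toy network, the sum-to-one normalization, and the function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: R1: -> A,  R2: A -> B,  R3: B -> ,  R4: A ->
S = np.array([[1, -1,  0, -1],    # metabolite A balance
              [0,  1, -1,  0]])   # metabolite B balance
n = S.shape[1]

def flux_mode(zeroed_reactions):
    """LP step: find v >= 0 with S v = 0 and sum(v) = 1, with v_j = 0 for
    the reactions deleted by the preceding IP step."""
    bounds = [(0.0, 0.0) if j in zeroed_reactions else (0.0, None)
              for j in range(n)]
    A_eq = np.vstack([S, np.ones(n)])           # S v = 0 and sum(v) = 1
    b_eq = np.append(np.zeros(S.shape[0]), 1.0)
    res = linprog(c=np.zeros(n), A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x if res.success else None       # None: deletion set is a cut set

print(flux_mode(set()))    # a vertex solution: one of the two EMs of this toy network
print(flux_mode({0}))      # deleting R1 (the only input) cuts all flux: returns None
```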

  1. Some issues related to simulation of the tracking and communications computer network

    Science.gov (United States)

    Lacovara, Robert C.

    1989-01-01

    The Communications Performance and Integration branch of the Tracking and Communications Division has an ongoing involvement in the simulation of its flight hardware for Space Station Freedom. Specifically, the communication process between central processor(s) and orbital replaceable units (ORU's) is simulated with varying degrees of fidelity. The results of investigations into three aspects of this simulation effort are given. The most general area involves the use of computer assisted software engineering (CASE) tools for this particular simulation. The second area of interest is simulation methods for systems of mixed hardware and software. The final area investigated is the application of simulation methods to one of the proposed computer network protocols for space station, specifically IEEE 802.4.

  2. Computationally efficient measure of topological redundancy of biological and social networks

    Science.gov (United States)

    Albert, Réka; Dasgupta, Bhaskar; Hegde, Rashmi; Sivanathan, Gowri Sangeetha; Gitter, Anthony; Gürsoy, Gamze; Paul, Pradyut; Sontag, Eduardo

    2011-09-01

    It is well known that biological and social interaction networks have a varying degree of redundancy, though a consensus on the precise cause of this is so far lacking. In this paper, we introduce a topological redundancy measure for labeled directed networks that is formal, computationally efficient, and applicable to a variety of directed networks such as cellular signaling, metabolic, and social interaction networks. We demonstrate the computational efficiency of our measure by computing its value and statistical significance on a number of biological and social networks with up to several thousands of nodes and edges. Our results suggest a number of interesting observations: (1) social networks are more redundant than their biological counterparts, (2) transcriptional networks are less redundant than signaling networks, (3) the topological redundancy of the C. elegans metabolic network is largely due to its inclusion of currency metabolites, and (4) the redundancy of signaling networks is highly (negatively) correlated with the monotonicity of their dynamics.
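
    The measure defined in the paper is more subtle than this, but the underlying intuition, how many edges can be removed without changing what the network can reach, can be illustrated with a crude reachability-based proxy on a toy directed network. This proxy is an assumption for illustration, not the authors' definition.

```python
import networkx as nx

def reachability_redundant_edges(G):
    """Fraction of edges whose removal leaves every pairwise reachability
    relation intact -- a crude topological-redundancy proxy."""
    base = {(u, v) for u in G for v in nx.descendants(G, u)}
    redundant = 0
    for e in list(G.edges()):
        G.remove_edge(*e)
        if {(u, v) for u in G for v in nx.descendants(G, u)} == base:
            redundant += 1
        G.add_edge(*e)
    return redundant / G.number_of_edges()

# Toy signaling-like network: A -> C is implied by A -> B -> C, so it is redundant.
G = nx.DiGraph([("A", "B"), ("B", "C"), ("A", "C"), ("C", "D")])
print(reachability_redundant_edges(G))   # 0.25: only the edge A -> C is redundant
```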

  3. Electricity market price forecasting by grid computing optimizing artificial neural networks

    OpenAIRE

    Niimura, T.; Ozawa, K.; Sakamoto, N.

    2007-01-01

    This paper presents a grid computing approach to parallel-processing a neural network time-series model for forecasting electricity market prices. A grid computing environment introduced in a university computing laboratory provides access to otherwise underused computing resources. The grid computing of the neural network model not only processes several times faster than a single iterative process, but also provides opportunities for improving forecasting accuracy. Results of numerical tests using re...

  4. PCE: web tools to compute protein continuum electrostatics

    Science.gov (United States)

    Miteva, Maria A.; Tufféry, Pierre; Villoutreix, Bruno O.

    2005-01-01

    PCE (protein continuum electrostatics) is an online service for protein electrostatic computations presently based on the MEAD (macroscopic electrostatics with atomic detail) package initially developed by D. Bashford [(2004) Front Biosci., 9, 1082–1099]. This computer method uses a macroscopic electrostatic model for the calculation of protein electrostatic properties, such as pKa values of titratable groups and electrostatic potentials. The MEAD package generates electrostatic energies via finite difference solution to the Poisson–Boltzmann equation. Users submit a PDB file and PCE returns potentials and pKa values as well as color (static or animated) figures displaying electrostatic potentials mapped on the molecular surface. This service is intended to facilitate electrostatics analyses of proteins and thereby broaden the accessibility to continuum electrostatics to the biological community. PCE can be accessed at . PMID:15980492
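
    Downstream of a pKa calculation such as PCE's, the protonation state of a titratable group at a given pH follows from the Henderson-Hasselbalch relation. A minimal sketch with illustrative values, not output from the service:

```python
def protonated_fraction(pka, ph):
    """Henderson-Hasselbalch: fraction of a titratable group that
    carries its proton at the given pH."""
    return 1.0 / (1.0 + 10.0 ** (ph - pka))

# A histidine side chain (pKa ~ 6.0) at physiological pH 7.4:
print(f"{protonated_fraction(6.0, 7.4):.3f}")   # ~0.038, i.e. mostly deprotonated
```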

  5. Computer aided systems human engineering: A hypermedia tool

    Science.gov (United States)

    Boff, Kenneth R.; Monk, Donald L.; Cody, William J.

    1992-01-01

    The Computer Aided Systems Human Engineering (CASHE) system, Version 1.0, is a multimedia ergonomics database on CD-ROM for the Apple Macintosh II computer, being developed for use by human system designers, educators, and researchers. It will initially be available on CD-ROM and will allow users to access ergonomics data and models stored electronically as text, graphics, and audio. The CASHE CD-ROM, Version 1.0 will contain the Boff and Lincoln (1988) Engineering Data Compendium, MIL-STD-1472D and a unique, interactive simulation capability, the Perception and Performance Prototyper. Its features also include a specialized data retrieval, scaling, and analysis capability and the state of the art in information retrieval, browsing, and navigation.

  6. Present status of computational tools for maglev development

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Z.; Chen, S.S.; Rote, D.M.

    1991-10-01

    High-speed vehicles that employ magnetic levitation (maglev) have received great attention worldwide as a means of relieving both highway and air-traffic congestion. At this time, Japan and Germany are leading the development of maglev. After fifteen years of inactivity that is attributed to technical policy decisions, the federal government of the United States has reconsidered the possibility of using maglev in the United States. The National Maglev Initiative (NMI) was established in May 1990 to assess the potential of maglev in the United States. One of the tasks of the NMI, which is also the objective of this report, is to determine the status of existing computer software that can be applied to maglev-related problems. The computational problems involved in maglev assessment, research, and development can be classified into two categories: electromagnetic and mechanical. Because most maglev problems are complicated and difficult to solve analytically, proper numerical methods are needed to find solutions. To determine the status of maglev-related software, developers and users of computer codes were surveyed. The results of the survey are described in this report. 25 refs.

  7. 10 CFR 73.54 - Protection of digital computer and communication systems and networks.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Protection of digital computer and communication systems... computer and communication systems and networks. By November 23, 2009 each licensee currently licensed to... provide high assurance that digital computer and communication systems and networks are adequately...

  8. Bridging the Gap: Using Interactive Computer Tools To Build Fraction Schemes.

    Science.gov (United States)

    Olive, John

    2002-01-01

    Explores ways to help children make connections between whole-number multiplication and their notion of a fraction. Illustrates an approach to constructing fraction concepts that builds on children's whole-number knowledge using specially designed computer tools. (KHR)

  9. Computational Tool for Coupled Simulation of Nonequilibrium Hypersonic Flows with Ablation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this SBIR project is to develop a predictive computational tool for the aerothermal environment around ablation-cooled hypersonic atmospheric entry...

  10. Computational Tool for Coupled Simulation of Nonequilibrium Hypersonic Flows with Ablation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this SBIR project is to develop a computational tool with unique predictive capabilities for the aerothermodynamic environment around ablation-cooled...

  11. Performance Evaluation of 5G Millimeter-Wave Cellular Access Networks Using a Capacity-Based Network Deployment Tool

    Directory of Open Access Journals (Sweden)

    Michel Matalatala

    2017-01-01

    Full Text Available The coming fifth generation (5G) of wireless communication networks brings a set of new features to satisfy the demands of data-intensive applications: millimeter-wave frequencies, massive antenna arrays, beamforming, dense cells, and so forth. In this paper, we investigate the use of beamforming techniques through various architectures and evaluate the performance of 5G wireless access networks, using a capacity-based network deployment tool. This tool is proposed and applied to a realistic area in Ghent, Belgium, to simulate realistic 5G networks that respond to the instantaneous bit rate required by the active users. The results show that, with beamforming, 5G networks require almost 15% more base stations and four times less power to provide more capacity to the users and the same coverage performance, in comparison with the 4G reference network. Moreover, they are three times more energy efficient than the 4G network, and the hybrid beamforming architecture appears to be a suitable architecture to consider when designing a 5G cellular network.

  12. Security Enhanced Multi-Domain Network Management for Joint Warrior Interoperability Demonstration (JWID)

    National Research Council Canada - National Science Library

    Marcinkowski, James; Miller, Roger

    2005-01-01

    .... To assist administrative personnel in maintaining and monitoring a computer network, commercial network management tools are used that collect data regarding network functionality and availability...

  13. GFI Network Security and PCI Compliance Power Tools

    CERN Document Server

    Posey, Brien

    2008-01-01

    Today all companies, U.S. federal agencies, and non-profit organizations have valuable data on their servers that needs to be secured. One of the challenges for IT experts is learning how to use new products in a time-efficient manner, so that new implementations can go quickly and smoothly. Learning how to set up sophisticated products is time-consuming, and can be confusing. GFI's LANguard Network Security Scanner reports vulnerabilities so that they can be mitigated before unauthorized intruders can wreak havoc on your network. To take advantage of the best things that GFI's LANguard Networ

  14. Synthesize, optimize, analyze, repeat (SOAR): Application of neural network tools to ECG patient monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Watrous, R.; Towell, G.; Glassman, M.S. [Siemens Corporate Research, Princeton, NJ (United States)

    1995-12-31

    Results are reported from the application of tools for synthesizing, optimizing and analyzing neural networks to an ECG Patient Monitoring task. A neural network was synthesized from a rule-based classifier and optimized over a set of normal and abnormal heartbeats. The classification error rate on a separate and larger test set was reduced by a factor of 2. When the network was analyzed and reduced in size by a factor of 40%, the same level of performance was maintained.
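
    Synthesizing a network from a rule-based classifier, in the spirit of Towell's knowledge-based neural network work, can be illustrated by mapping a single conjunctive rule onto a sigmoid unit: one positive weight per antecedent and a bias that lets the unit fire only when all antecedents hold. The weight value and the rule are toy assumptions, not the Siemens ECG system.

```python
import numpy as np

def rule_to_unit(n_antecedents, w=4.0):
    """Encode 'IF a1 AND ... AND an THEN conclusion' as a sigmoid unit:
    weight +w per antecedent, bias placed between n-1 and n true inputs."""
    weights = np.full(n_antecedents, w)
    bias = -w * (n_antecedents - 0.5)
    return weights, bias

def fire(x, weights, bias):
    return 1.0 / (1.0 + np.exp(-(x @ weights + bias)))

w, b = rule_to_unit(3)
print(fire(np.array([1, 1, 1]), w, b))   # ~0.88 -> rule satisfied
print(fire(np.array([1, 1, 0]), w, b))   # ~0.12 -> one antecedent missing
```

    The point of such an initialization is that the resulting weights remain trainable, so optimization over labeled heartbeats can then refine what the rules only approximate.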

  15. Evaluation of Internet Social Networks using Net scoring Tool: A Case Study in Adverse Drug Reaction Mining.

    Science.gov (United States)

    Katsahian, Sandrine; Simond Moreau, Erica; Leprovost, Damien; Lardon, Jeremy; Bousquet, Cedric; Kerdelhué, Gaétan; Abdellaoui, Redhouane; Texier, Nathalie; Burgun, Anita; Boussadi, Abdelali; Faviez, Carole

    2015-01-01

    Suspected adverse drug reactions (ADRs) reported by patients through social media can be a complementary tool to existing ADR signal detection processes. However, several studies have shown that the quality of medical information published online varies drastically whatever the health topic addressed. The aim of this study is to use an existing rating tool on a set of social network web sites in order to assess the capability of such tools to guide experts in selecting the social network web site best suited to mining ADRs. First, we reviewed and rated 132 Internet forums and social networks according to three major criteria: the number of visits, the notoriety of the forum, and the number of messages posted in relation to health and drug therapy. Second, a pharmacist reviewed the topic-oriented message boards with a small number of drug names to ensure that they were not off topic. Six experts were chosen to assess the selected Internet forums using a French scoring tool, the Net Scoring. Three different scores were computed, along with the agreement between experts for each set of scores using weighted kappa, pooled using the mean. Three Internet forums were retained at the end of the selection step. Some criteria received high scores (scores 3-4) regardless of the website evaluated, such as accessibility (45-46) or design (34-36); conversely, some criteria always received low scores, such as quantitative aspects (40-42), ethical aspects (43-44), and hyperlink currency (30-33). Kappa values were positive but very small, which corresponds to weak agreement between experts. The personal opinion of the expert seems to have a major impact, undermining the relevance of the criteria. Our future work is to collect the results given by this evaluation grid and propose a new scoring tool for the assessment of Internet social networks.
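
    The agreement computation described above, weighted kappa per expert pair pooled using the mean, can be sketched with scikit-learn. The ratings below are invented stand-ins for the experts' Net Scoring grades.

```python
from itertools import combinations
from sklearn.metrics import cohen_kappa_score

# Illustrative ratings: one row per expert, one score (0-4) per criterion.
ratings = [
    [4, 3, 1, 0, 2, 3],   # expert 1
    [4, 2, 1, 1, 2, 3],   # expert 2
    [3, 3, 0, 0, 1, 3],   # expert 3
]

# Quadratic-weighted kappa for every expert pair, pooled using the mean.
kappas = [cohen_kappa_score(a, b, weights="quadratic")
          for a, b in combinations(ratings, 2)]
print(sum(kappas) / len(kappas))   # mean pairwise kappa; low values = weak agreement
```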

  16. DIMACS Workshop on Interconnection Networks and Mapping, and Scheduling Parallel Computations

    CERN Document Server

    Rosenberg, Arnold L; Sotteau, Dominique; NSF Science and Technology Center in Discrete Mathematics and Theoretical Computer Science; Interconnection networks and mapping and scheduling parallel computations

    1995-01-01

    The interconnection network is one of the most basic components of a massively parallel computer system. Such systems consist of hundreds or thousands of processors interconnected to work cooperatively on computations. One of the central problems in parallel computing is the task of mapping a collection of processes onto the processors and routing network of a parallel machine. Once this mapping is done, it is critical to schedule computations within and communication among processors so that inputs for a process are available where and when the process is scheduled to be computed. The workshop brought together researchers from universities and laboratories, as well as practitioners involved in the design, implementation, and application of massively parallel systems. Focusing on interconnection networks of parallel architectures of today and of the near future, the book includes topics such as network topologies, network properties, message routing, network embeddings, network emulation, mappings, and efficient scheduling. This book contains the refereed pro...

  17. Synthetic RNAs for Gene Regulation: Design Principles and Computational Tools.

    Science.gov (United States)

    Laganà, Alessandro; Shasha, Dennis; Croce, Carlo Maria

    2014-01-01

    The use of synthetic non-coding RNAs for post-transcriptional regulation of gene expression has not only become a standard laboratory tool for gene functional studies but it has also opened up new perspectives in the design of new and potentially promising therapeutic strategies. Bioinformatics has provided researchers with a variety of tools for the design, the analysis, and the evaluation of RNAi agents such as small-interfering RNA (siRNA), short-hairpin RNA (shRNA), artificial microRNA (a-miR), and microRNA sponges. More recently, a new system for genome engineering based on the bacterial CRISPR-Cas9 system (Clustered Regularly Interspaced Short Palindromic Repeats) was shown to have the potential to also regulate gene expression at both transcriptional and post-transcriptional level in a more specific way. In this mini review, we present RNAi and CRISPRi design principles and discuss the advantages and limitations of the current design approaches.

  18. Synthetic RNAs for gene regulation: design principles and computational tools

    Directory of Open Access Journals (Sweden)

    Alessandro eLaganà

    2014-12-01

    Full Text Available The use of synthetic non-coding RNAs for post-transcriptional regulation of gene expression has not only become a standard laboratory tool for gene functional studies, but it has also opened up new perspectives in the design of new and potentially promising therapeutic strategies. Bioinformatics has provided researchers with a variety of tools for the design, the analysis and the evaluation of RNAi agents such as small-interfering RNA (siRNA), short-hairpin RNA (shRNA), artificial microRNA (a-miR) and microRNA sponges. More recently, a new system for genome engineering based on the bacterial CRISPR-Cas9 system (Clustered Regularly Interspaced Short Palindromic Repeats) was shown to have the potential to also regulate gene expression at both transcriptional and post-transcriptional level in a more specific way. In this mini review, we present RNAi and CRISPRi design principles and discuss the advantages and limitations of the current design approaches.
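
    One classic rational-design filter for siRNA candidates, keeping GC content in a moderate window (roughly 30-52% in published design rules), is easy to script. The window bounds and the example sequence are illustrative assumptions, not taken from the review.

```python
def gc_content(seq):
    """Percentage of G and C bases in an RNA/DNA sequence."""
    seq = seq.upper()
    return 100.0 * sum(base in "GC" for base in seq) / len(seq)

def passes_gc_rule(sirna, low=30.0, high=52.0):
    """Simple rational-design filter: keep candidates with moderate GC content."""
    return low <= gc_content(sirna) <= high

candidate = "GAUCUUAGCCAAUGGCAAU"   # 19-nt guide strand (invented for illustration)
print(round(gc_content(candidate), 1), passes_gc_rule(candidate))   # 42.1 True
```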

  19. Towards a Tool for Computer Supported Structuring of Products

    DEFF Research Database (Denmark)

    Hansen, Claus Thorp

    1997-01-01

    When an engineering designer creates a product he has to consider not only functionality; life cycle concerns such as manufacturing, service, and recycling must also be taken into account. The product structure plays a large role in the product's performance and suitability for the life phases… However, a product possesses not only a component structure but also various organ structures which are superimposed on the component structure. The organ structures carry behaviour and make the product suited for its life phases. Our long-term research goal is to develop a computer-based system…

  20. Computer-assisted Learning as an Educational Tool

    OpenAIRE

    Lassen, Laura Signe; Troelsen, Lasse Blom; Andersen, Jesper Hjøllund; Schou, Kristoffer Carl

    2015-01-01

    This report examines the possibilities of constructing a serious educational computer game. The game is aimed at supporting differentiated instruction through methods from Vygotsky's Zone of Proximal Development and Bruner's scaffolding. Through an analysis of the state of the art we have pointed out some specific affordances to include in our constructed game. After finishing the construction of the game, we tested it in a 5th grade, and lastly discussed our design goals with the outco...

  1. Development of Tools for DER Components in a Distribution Network

    DEFF Research Database (Denmark)

    Mihet-Popa, Lucian; Koch-Ciobotaru, C; Isleifsson, Fridrik Rafn

    2012-01-01

    The increasing penetration of Distributed Energy Resources (DER) components into distribution networks requires the development of accurate simulation models that take into account an increasing number of factors that influence the output power of DG systems. This paper presents two simulation m...

  2. Faculty Use of Author Identifiers and Researcher Networking Tools

    Science.gov (United States)

    Tran, Clara Y.; Lyon, Jennifer A.

    2017-01-01

    This cross-sectional survey focused on faculty use and knowledge of author identifiers and researcher networking systems, and professional use of social media, at a large state university. Results from 296 completed faculty surveys representing all disciplines (9.3% response rate) show low levels of awareness and variable resource preferences. The…

  3. Social Networking Tools to Facilitate Cross-Program Collaboration

    Science.gov (United States)

    Wallace, Paul; Howard, Barbara

    2010-01-01

    Students working on a highly collaborative project used social networking technology for community building activities as well as basic project-related communication. Requiring students to work on cross-program projects gives them real-world experience working in diverse, geographically dispersed groups. An application used at Appalachian State…

  4. HCI^2 Workbench: A Development Tool for Multimodal Human-Computer Interaction Systems

    NARCIS (Netherlands)

    Shen, Jie; Wenzhe, Shi; Pantic, Maja

    In this paper, we present a novel software tool designed and implemented to simplify the development process of Multimodal Human-Computer Interaction (MHCI) systems. This tool, called the HCI^2 Workbench, exploits a Publish / Subscribe (P/S) architecture [13] [14] to facilitate efficient

  5. Teachers' Use of Computational Tools to Construct and Explore Dynamic Mathematical Models

    Science.gov (United States)

    Santos-Trigo, Manuel; Reyes-Rodriguez, Aaron

    2011-01-01

    To what extent does the use of computational tools offer teachers the possibility of constructing dynamic models to identify and explore diverse mathematical relations? What ways of reasoning or thinking about the problems emerge during the model construction process that involves the use of the tools? These research questions guided the…

  6. DEVELOPMENT AND USE OF COMPUTER-AIDED PROCESS ENGINEERING TOOLS FOR POLLUTION PREVENTION

    Science.gov (United States)

    The use of Computer-Aided Process Engineering (CAPE) and process simulation tools has become established industry practice. With the advent of modern simulation software, new opportunities are available for the creation of a wide range of ancillary tools that can be used from within multiple sim...

  7. The UEA Small RNA Workbench: A Suite of Computational Tools for Small RNA Analysis.

    Science.gov (United States)

    Mohorianu, Irina; Stocks, Matthew Benedict; Applegate, Christopher Steven; Folkes, Leighton; Moulton, Vincent

    2017-01-01

    RNA silencing (RNA interference, RNAi) is a complex, highly conserved mechanism mediated by short, typically 20-24 nt in length, noncoding RNAs known as small RNAs (sRNAs). They act as guides for the sequence-specific transcriptional and posttranscriptional regulation of target mRNAs and play a key role in the fine-tuning of biological processes such as growth, response to stresses, or defense mechanisms. High-throughput sequencing (HTS) technologies are employed to capture the expression levels of sRNA populations. The processing of the resulting big data sets has facilitated the computational analysis of sRNA patterns of variation within biological samples such as time-point experiments, tissue series, or various treatments. Rapid technological advances enable larger experiments, often with biological replicates, leading to a vast amount of raw data. As a result, in this fast-evolving field, the existing methods for sequence characterization and prediction of interaction (regulatory) networks periodically require adapting or, in extreme cases, a complete redesign to cope with the data deluge. In addition, the presence of numerous tools focused only on particular steps of HTS analysis hinders the systematic parsing of the results and their interpretation. The UEA small RNA Workbench (v1-4), described in this chapter, provides a user-friendly, modular, interactive analysis in the form of a suite of computational tools designed to process and mine sRNA datasets for interesting characteristics that can be linked back to the observed phenotypes. First, we show how to preprocess the raw sequencing output and prepare it for downstream analysis. Then we review some quality checks that can be used as a first indication of sources of variability between samples. Next we show how the Workbench can provide a comparison of the effects of different normalization approaches on the distributions of expression, enhanced methods for the identification of differentially expressed
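
    Two of the preprocessing steps such a workbench automates, collapsing raw reads to unique sequences with abundance counts and normalizing to reads per million so libraries of different depth are comparable, look roughly like this. Adapter trimming and file I/O are omitted, and the reads are invented for illustration.

```python
from collections import Counter

def collapse_reads(reads):
    """Collapse raw sRNA reads to unique sequences with abundance counts."""
    return Counter(reads)

def normalize_rpm(counts):
    """Scale raw counts to reads per million (RPM) so samples of
    different sequencing depth can be compared."""
    total = sum(counts.values())
    return {seq: 1e6 * c / total for seq, c in counts.items()}

reads = ["UGAGGUAGUAGGUUGUAUAGUU"] * 700 + ["UAGCUUAUCAGACUGAUGUUGA"] * 300
counts = collapse_reads(reads)
print(normalize_rpm(counts))   # {'UGAGG...': 700000.0, 'UAGCU...': 300000.0}
```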

  8. Configuration monitoring tool for large-scale distributed computing

    CERN Document Server

    Wu, Y; Fisk, I; Graham, G; Kim, B J; Lü, X

    2004-01-01

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) at CERN will likely use a grid system to achieve much of its offline processing needs. Given the heterogeneous and dynamic nature of grid systems, it is desirable to have in place a configuration monitor. The configuration monitoring tool is built using the Globus toolkit and web services. It consists of an information provider for the Globus MDS, a relational database for keeping track of the current and old configurations, and client interfaces to query and administer the configuration system. The Grid Security Infrastructure (GSI), together with EDG Java Security packages, is used for secure authentication and transparent access to the configuration information across the CMS grid. This work has been prototyped and tested using US-CMS grid resources.

  9. Platformation: Cloud Computing Tools at the Service of Social Change

    Directory of Open Access Journals (Sweden)

    Anil Patel

    2012-07-01

    Full Text Available The following article establishes some context and definitions for what is termed the “sharing imperative” – a movement or tendency towards sharing information online and in real time that has rapidly transformed several industries. As internet-enabled devices proliferate to all corners of the globe, ways of working and accessing information have changed. Users now expect to be able to access the products, services, and information that they want from anywhere, at any time, on any device. This article addresses how the nonprofit sector might respond to those demands by embracing the sharing imperative. It suggests that how well an organization shares has become one of the most pressing governance questions a nonprofit organization must tackle. Finally, the article introduces Platformation, a project whereby tools that enable better inter and intra-organizational sharing are tested for scalability, affordability, interoperability, and security, all with a non-profit lens.

  10. Dynamic Security Assessment Of Computer Networks In Siem-Systems

    Directory of Open Access Journals (Sweden)

    Elena Vladimirovna Doynikova

    2015-10-01

    Full Text Available The paper suggests an approach to the security assessment of computer networks. The approach is based on attack graphs and intended for Security Information and Event Management (SIEM) systems. The key feature of the approach is the application of a multilevel taxonomy of security metrics. The taxonomy allows definition of the system profile according to the input data used for the metrics calculation and the techniques of security metrics calculation. This allows specification of the security assessment in near real time, identification of previous and future attacker steps, and identification of attackers' goals and characteristics. A security assessment system prototype has been implemented for the suggested approach, and analysis of its operation is conducted for several attack scenarios.
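
    An attack-graph metric of the kind this approach builds on can be illustrated with networkx: nodes are system states, edges are exploits weighted by success probability, and one simple metric is the success probability of the most likely attack path from the entry point to a critical asset. The topology and probabilities below are invented for illustration, not part of the paper.

```python
import math
import networkx as nx

# Edges: (from_state, to_state, exploit success probability) -- illustrative values.
exploits = [("internet", "dmz_web", 0.8),
            ("dmz_web", "app_server", 0.5),
            ("internet", "vpn", 0.3),
            ("vpn", "app_server", 0.9),
            ("app_server", "database", 0.6)]

G = nx.DiGraph()
for u, v, p in exploits:
    # Maximizing a product of probabilities = minimizing the sum of -log(p).
    G.add_edge(u, v, cost=-math.log(p))

path = nx.shortest_path(G, "internet", "database", weight="cost")
prob = math.exp(-nx.shortest_path_length(G, "internet", "database", weight="cost"))
print(path, f"-> most likely attack path succeeds with p = {prob:.2f}")   # p = 0.24
```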

  11. Computational Genetic Regulatory Networks Evolvable, Self-organizing Systems

    CERN Document Server

    Knabe, Johannes F

    2013-01-01

    Genetic Regulatory Networks (GRNs) in biological organisms are primary engines for cells to enact their engagements with environments, via incessant, continually active coupling. In differentiated multicellular organisms, tremendous complexity has arisen in the course of evolution of life on earth. Engineering and science have so far achieved no working system that can compare with this complexity, depth and scope of organization. Abstracting the dynamics of genetic regulatory control to a computational framework in which artificial GRNs in artificial simulated cells differentiate while connected in a changing topology, it is possible to apply Darwinian evolution in silico to study the capacity of such developmental/differentiated GRNs to evolve. In this volume an evolutionary GRN paradigm is investigated for its evolvability and robustness in models of biological clocks, in simple differentiated multicellularity, and in evolving artificial developing 'organisms' which grow and express an ontogeny starting fr...
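
    Purely as an illustration of what applying Darwinian evolution in silico to a GRN can look like (a generic sketch, not the model from this volume), the code below shows two core ingredients: a synchronous gene-state update driven by regulatory weights, and a mutation operator over those weights. All sizes and constants are arbitrary.

      # Minimal sketch of a generic artificial GRN (not the book's model):
      # gene states are updated from a weighted sum of regulator states, and
      # "evolution" point-mutates the regulatory weights between generations.
      import random

      N_GENES = 5
      random.seed(1)

      # Regulatory weights: w[i][j] is the influence of gene j on gene i.
      weights = [[random.uniform(-1, 1) for _ in range(N_GENES)]
                 for _ in range(N_GENES)]

      def step(state, w):
          """Synchronous update: gene i is on if its net input is positive."""
          return [1 if sum(w[i][j] * state[j] for j in range(N_GENES)) > 0
                  else 0
                  for i in range(N_GENES)]

      def mutate(w, rate=0.1):
          """Perturb a fraction of regulatory weights (a crude GA operator)."""
          return [[x + random.gauss(0, 0.2) if random.random() < rate else x
                   for x in row] for row in w]

      state = [1, 0, 0, 0, 0]
      for _ in range(4):
          state = step(state, weights)
          print(state)
      weights = mutate(weights)   # one generation's worth of variation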

  12. Computers and networks in the age of globalization

    DEFF Research Database (Denmark)

    Bloch Rasmussen, Leif; Beardon, Colin; Munari, Silvio

    In modernity, an individual identity was constituted from civil society, while in a globalized network society, human identity, if it develops at all, must grow from communal resistance. A communal resistance to an abstract, conceptualized world, where there is no possibility for perception and experience of power and therefore no possibility for human choice and action, is of utmost importance for the constitution of human choosers and actors. This book therefore focuses on those human choosers and actors, wishing them to read and enjoy the papers as they actually perceive and experience… The papers were presented at a conference organized by the International Federation for Information Processing (IFIP) and held in Geneva, Switzerland in August 1998. Since the first HCC conference in 1974, IFIP's Technical Committee 9 has endeavoured to set the agenda for human choices and human actions vis-a-vis computers.

  13. Assessing Affordances of Selected Cloud Computing Tools for Language Teacher Education in Nigeria

    Science.gov (United States)

    Ofemile, Abdulmalik Yusuf

    2015-01-01

    This paper reports part of a study that aimed to understand Teacher Educators' (TE) assessment of the affordances of selected cloud computing tools ranked among the top 100 for the year 2010. Research has shown that ICT, and by extension cloud computing, have positive impacts on daily life, and this informed the Nigerian government's policy to…

  14. The computer in the mathematics classroom: the tool, the tutor and ...

    African Journals Online (AJOL)

    This paper seeks to enlighten other educational practitioners that there is much more a computer can do for a teacher in the classroom. Robert Taylor's three modes of computer application in the classroom (the tool, the tutor, and the tutee) are discussed here.

  15. Development and Assessment of a Chemistry-Based Computer Video Game as a Learning Tool

    Science.gov (United States)

    Martinez-Hernandez, Kermin Joel

    2010-01-01

    The chemistry-based computer video game is a multidisciplinary collaboration between the fields of chemistry and computer graphics and technology, developed to explore the use of video games as a possible learning tool. This innovative approach aims to integrate elements of commercial video games and authentic chemistry contexts into a learning…

  16. Radon concentration: A tool for assessing the fracture network at ...

    African Journals Online (AJOL)

    2003-01-01

  17. Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.

    Science.gov (United States)

    Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P

    2010-12-22

    Comparative genomics resources, such as ortholog detection tools and repositories, are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource, Roundup, using cloud computing; describe the operating principles required to achieve computational efficiency on the cloud; and detail important procedures for improving cost-effectiveness so as to ensure maximal computation at minimal cost. Using the comparative genomics tool Roundup as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. To manage the ortholog processes, we designed a strategy around the web service Elastic MapReduce that maximizes use of the cloud while minimizing costs. Specifically, we created a model that estimates cloud runtime from the size and complexity of the genomes being compared and determines in advance the optimal order in which jobs should be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud in random order with respect to runtime. Our cost-savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable to other comparative genomics tools and is potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
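
    A sketch of the scheduling idea described above, under invented assumptions: runtimes are predicted from genome sizes with a made-up cost model, jobs are sorted longest-first, and a greedy longest-processing-time (LPT) assignment balances load across a fixed number of cloud workers. Nothing here reproduces the paper's fitted runtime model.

      # Illustrative sketch of runtime-aware job ordering: predict a runtime
      # for each genome-to-genome comparison from genome sizes, then assign
      # jobs longest-first so workers stay evenly loaded. The cost model and
      # all numbers are invented, not the paper's fitted model.
      from itertools import combinations

      # Hypothetical genome sizes in megabases.
      genomes = {"g1": 4.6, "g2": 12.1, "g3": 3.2, "g4": 120.0}

      def predicted_runtime(size_a, size_b):
          """Toy model: comparison time grows with the product of sizes."""
          return 0.05 * size_a * size_b   # arbitrary units

      jobs = sorted(combinations(genomes, 2),
                    key=lambda p: predicted_runtime(genomes[p[0]],
                                                    genomes[p[1]]),
                    reverse=True)         # longest jobs first

      # Greedy assignment onto a fixed cluster (classic LPT heuristic):
      n_workers = 2
      loads = [0.0] * n_workers
      for a, b in jobs:
          w = loads.index(min(loads))     # least-loaded worker takes the job
          loads[w] += predicted_runtime(genomes[a], genomes[b])
      print("makespan estimate:", max(loads))

    Balancing the makespan across a fixed pool of instances is what keeps billed instance-hours (and hence cost) close to the theoretical minimum, which is the intuition behind ordering submissions by predicted runtime rather than randomly.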

  18. Use of Social Networks as an Educational Tool

    OpenAIRE

    Tiryakioglu, Filiz; Erzurum, Funda

    2011-01-01

    A social network, particularly Facebook, can be defined as a unique online service, platform, or space where social communication and/or social relations can be established and individuals intensively share information. This definition implies that communication specialists should have more expertise in, and interest in, social media than any other group of experts. Based on this assumption, the present study investigated the views and attitudes of instructors in the Faculty of Communication Sciences a...

  19. Professional Readiness of Teachers to Use Computer Visualization Tools: A Crucial Drive

    Directory of Open Access Journals (Sweden)

    Elena V. Semenikhina

    2016-12-01

    Full Text Available The training of teachers involves the formation of skills which are meant to be used in their future professional activities. Given the exponential increase in information content, there is a need to look into the levels and components of the professional readiness of teachers to use computer visualization tools. This article describes the four levels of teachers’ readiness [passive, basic, conscious, active] to use computer visualization tools. These levels are based on the proposed components of teachers’ readiness [motivational, cognitive, technological, reflexive] to use these tools.

  20. Computationally Inexpensive Incorporation of Solute Transport Physics into Pore-Network Models

    Science.gov (United States)

    Mehmani, Y.; Oostrom, M.

    2014-12-01

    Several modeling approaches have been developed in the literature for simulating solute transport at the pore scale. These include "direct modeling", where the fundamental equations are solved directly on the actual pore-scale geometry (obtained from digital images). Such methods, although very accurate, come at a high computational cost. A pore-network representation of the pore-scale geometry is a first step in reducing this cost. However, the geometric simplification is typically accompanied by a secondary simplification of the physics of the problem, which contributes to the models' inaccuracy. This is seen in the widely used "mixed-cell method", which simplifies two key components: 1) intra-pore mixing, and 2) inter-pore rate expressions. Nevertheless, the method is popular because it is computationally inexpensive, allowing larger and more representative computational domains to be examined. In this work, we explore two novel methods for circumventing the aforementioned limitations of the mixed-cell method (intra-pore mixing and inter-pore rate expressions), all while making an effort to keep the computational cost low. We show that while intra-pore mixing can be accurately taken into account, correcting the inter-pore rate expressions has fundamental implications for the applicability of Eulerian pore-network models and for the interpretation of the results obtained from them. Despite recent important progress in the development of accurate and robust direct modeling tools, there remains a need in the literature for simple, accurate, and inexpensive models, from both a scientific and a practical point of view.
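
    For readers unfamiliar with the baseline method being critiqued, here is a minimal sketch of a mixed-cell pore-network transport step on a 1D chain of pores, with invented volumes and flow rates: each pore is assumed perfectly mixed, and inter-pore transfer uses a simple advective upwind rate expression.

      # Minimal sketch of the "mixed-cell" idea the abstract refers to: each
      # pore is treated as perfectly mixed, and solute moves between pores
      # with a simplified advective upwind rate expression. The geometry,
      # volumes, and flow rates below are invented for illustration.

      n_pores = 5
      volume = [1.0] * n_pores   # pore volumes
      q = 0.2                    # flow rate through every throat (left->right)
      conc = [0.0] * n_pores
      conc[0] = 1.0              # solute injected into the first pore

      dt = 0.1                   # explicit time step (q*dt/V < 1 for stability)
      for _ in range(50):
          new = conc[:]
          for i in range(n_pores - 1):
              # Upwind flux from pore i to pore i+1 (flow is left-to-right).
              flux = q * conc[i] * dt
              new[i]     -= flux / volume[i]
              new[i + 1] += flux / volume[i + 1]
          conc = new
      print([round(c, 3) for c in conc])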