WorldWideScience

Sample records for automated network analysis

  1. Automated Analysis of Security in Networking Systems

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2004-01-01

    It has for a long time been a challenge to build secure networking systems. One way to counter this problem is to provide developers of software applications for networking systems with easy-to-use tools that can check security properties before the applications ever reach the market. These tools will both help raise the general level of awareness of the problems and prevent the most basic flaws from occurring. This thesis contributes to the development of such tools. Networking systems typically try to attain secure communication by applying standard cryptographic techniques. In this thesis such networking systems are modelled in the process calculus LySa. On top of this programming-language-based formalism an analysis is developed, which relies on techniques from data and control flow analysis. These are techniques that can be fully automated, which makes them an ideal basis for tools targeted at non...

  2. Automated drawing of network plots in network meta-analysis.

    Science.gov (United States)

    Rücker, Gerta; Schwarzer, Guido

    2016-03-01

    In systematic reviews based on network meta-analysis, the network structure should be visualized. Network plots have often been drawn by hand using generic graphical software. A typical way of drawing networks, also implemented in statistical software for network meta-analysis, is a circular representation, often with many crossing lines. We use methods from graph theory in order to generate network plots in an automated way. We give a number of requirements for graph drawing and present an algorithm that fits prespecified ideal distances between the nodes representing the treatments. The method was implemented in the function netgraph of the R package netmeta and applied to a number of networks from the literature. We show that graph representations with a small number of crossing lines are often preferable to circular representations. PMID:26060934
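The netgraph function itself is implemented in R and not reproduced here; the sketch below only illustrates the underlying idea of fitting prespecified ideal distances between nodes by gradient descent on a stress function. All names and the toy treatment network are invented for illustration.

```python
import math
import random

def stress(pos, ideal):
    """Sum of squared gaps between realized and ideal pairwise distances."""
    total = 0.0
    for (i, j), d_ij in ideal.items():
        dx = pos[i][0] - pos[j][0]
        dy = pos[i][1] - pos[j][1]
        total += (math.hypot(dx, dy) - d_ij) ** 2
    return total

def layout(nodes, ideal, steps=2000, lr=0.01, seed=0):
    """Place nodes in the plane by gradient descent on the stress function."""
    rng = random.Random(seed)
    pos = {n: [rng.random(), rng.random()] for n in nodes}
    for _ in range(steps):
        for (i, j), d_ij in ideal.items():
            dx = pos[i][0] - pos[j][0]
            dy = pos[i][1] - pos[j][1]
            dist = math.hypot(dx, dy) or 1e-9
            g = (dist - d_ij) / dist          # pull the pair together or push apart
            pos[i][0] -= lr * g * dx
            pos[i][1] -= lr * g * dy
            pos[j][0] += lr * g * dx
            pos[j][1] += lr * g * dy
    return pos

# Toy 4-treatment network whose ideal distances form a unit square.
nodes = ["A", "B", "C", "D"]
ideal = {("A", "B"): 1.0, ("B", "C"): 1.0, ("C", "D"): 1.0, ("D", "A"): 1.0,
         ("A", "C"): 2 ** 0.5, ("B", "D"): 2 ** 0.5}
pos = layout(nodes, ideal)
```

Crossing lines are reduced because nodes that compare directly end up close together, rather than being pinned to a circle.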

  3. Taiwan Automated Telescope Network

    OpenAIRE

    Shuhrat Ehgamberdiev; Alexander Serebryanskiy; Antonio Jimenez; Li-Han Wang; Ming-Tsung Sun; Javier Fernandez Fernandez; Dean-Yi Chou

    2010-01-01

    A global network of small automated telescopes, the Taiwan Automated Telescope (TAT) network, dedicated to photometric measurements of stellar pulsations, is under construction. Two telescopes have been installed in Teide Observatory, Tenerife, Spain and Maidanak Observatory, Uzbekistan. The third telescope will be installed at Mauna Loa Observatory, Hawaii, USA. Each system uses a 9-cm Maksutov-type telescope. The effective focal length is 225 cm, corresponding to an f-ratio of 25. The field...

  4. Artificial neural networks for automation of Rutherford backscattering spectroscopy experiments and data analysis

    International Nuclear Information System (INIS)

    We present an algorithm based on artificial neural networks able to determine optimized experimental conditions for Rutherford backscattering measurements of Ge-implanted Si. The algorithm can be implemented for any other element implanted into a lighter substrate. It is foreseeable that the method developed in this work can be applied to many other systems. The algorithm presented is a push-button black box, and does not require any human intervention. It is thus suited for automated control of an experimental setup, given an interface to the relevant hardware. Once the experimental conditions are optimized, the algorithm analyzes the final data obtained, and determines the desired parameters. The method is thus also suited for automated analysis of the data. The algorithm presented can be easily extended to other ion beam analysis techniques. Finally, it is suggested how the artificial neural networks required for automated control and analysis of experiments could be automatically generated. This would be suited for automated generation of the required computer code. RBS could thus be done without experimentalists, data analysts, or programmers, with only technicians to keep the machines running

  5. Automated condition classification of a reciprocating compressor using time frequency analysis and an artificial neural network

    Science.gov (United States)

    Lin, Yih-Hwang; Wu, Hsien-Chang; Wu, Chung-Yung

    2006-12-01

    The purpose of this study is to develop an automated system for condition classification of a reciprocating compressor. Various time-frequency analysis techniques will be examined for decomposition of the vibration signals. Because a time-frequency distribution is a 3D data map, data reduction is indispensable for subsequent analysis. The extraction of the system characteristics using three indices, namely the time index, frequency index, and amplitude index, will be presented and examined for their applicability. A probabilistic neural network is applied for automated condition classification using a combination of the three indices. The study reveals that a proper choice of the index combination and the time-frequency band can provide excellent classification accuracy for the machinery conditions examined in this work.
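The abstract does not give closed-form definitions of the three indices; one plausible minimal reading reduces a time-frequency magnitude map to an amplitude index (total energy) plus energy-weighted time and frequency centroids. The function name and data below are illustrative only, not the authors' definitions.

```python
def tf_indices(tf):
    """Collapse a time-frequency magnitude map tf[t][f] into three scalars:
    an amplitude index (total energy) and energy-weighted time and
    frequency centroids."""
    energy = [[v * v for v in row] for row in tf]
    total = sum(sum(row) for row in energy)
    t_idx = sum(t * sum(row) for t, row in enumerate(energy)) / total
    f_idx = sum(f * sum(row[f] for row in energy)
                for f in range(len(energy[0]))) / total
    return total, t_idx, f_idx

# A single component concentrated at time bin 2, frequency bin 1.
tf = [[0, 0, 0],
      [0, 0, 0],
      [0, 3, 0],
      [0, 0, 0]]
amp, t_idx, f_idx = tf_indices(tf)   # → (9, 2.0, 1.0)
```

The three scalars then form the input vector for a classifier, sidestepping the dimensionality of the full 3D map.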

  6. Network based automation for SMEs

    DEFF Research Database (Denmark)

    Shahabeddini Parizi, Mohammad; Radziwon, Agnieszka

    2016-01-01

    The implementation of appropriate automation concepts which increase productivity in Small and Medium-Sized Enterprises (SMEs) requires a lot of effort, due to their limited resources. Therefore, it is strongly recommended for small firms to open up to external sources of knowledge, which could be obtained through network interaction. Based on two extreme cases of SMEs representing low-tech industry and an in-depth analysis of their manufacturing facilities, this paper presents how collaboration between firms embedded in a regional ecosystem could result in implementation of new... other members of the same regional ecosystem. The findings highlight two main automation-related areas where manufacturing SMEs could leverage external sources of knowledge: assistance in defining the automation problem as well as in selecting an appropriate solution and provider. Consequently, this...

  7. REMOTE CONTROL OF AN INDUCTION MACHINE WITH INDUSTRIAL AUTOMATION NETWORK AND THE PERFORMANCE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Nihat ÖZTÜRK

    2007-02-01

    Full Text Available In this study, a Profibus-based industrial automation system has been designed and used for remote speed control of a three-phase induction machine over the network. The delay occurring on the network during speed control via the network has been examined. It has been determined that the network delay varies depending on data traffic on the network. The observed delay has been found to be within the acceptable limits of maximum network-induced delay for motion systems.

  8. Automated Analysis of Infinite Scenarios

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    The security of a network protocol crucially relies on the scenario in which the protocol is deployed. This paper describes syntactic constructs for modelling network scenarios and presents an automated analysis tool, which can guarantee that security properties hold in all of the (infinitely many...

  9. Automated minimax design of networks

    DEFF Research Database (Denmark)

    Madsen, Kaj; Schjær-Jacobsen, Hans; Voldby, J

    1975-01-01

    A new gradient algorithm for the solution of nonlinear minimax problems has been developed. The algorithm is well suited for automated minimax design of networks and it is very simple to use. It compares favorably with recent minimax and least-pth algorithms. General convergence problems related to minimax design of networks are discussed. Finally, minimax design of equalization networks for reflection-type microwave amplifiers is carried out by means of the proposed algorithm.
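The authors' gradient algorithm is not specified in this abstract. As a rough illustration of the minimax objective itself, the sketch below minimizes the maximum absolute error of a line fit with a simple subgradient step on the worst-case sample; this is a much cruder scheme than the paper's algorithm, and all names and data are invented.

```python
def minimax_fit(xs, ys, steps=5000, lr=0.01):
    """Fit y ~ a*x + b by driving down the maximum absolute error:
    at each step, nudge the line away from the worst-fitting sample."""
    a, b = 0.0, 0.0
    for _ in range(steps):
        # Locate the sample with the largest absolute residual.
        r, x_w = max(((a * x + b - y, x) for x, y in zip(xs, ys)),
                     key=lambda t: abs(t[0]))
        if r == 0.0:
            break
        s = 1.0 if r > 0 else -1.0    # subgradient of the max-|residual| objective
        a -= lr * s * x_w
        b -= lr * s
    err = max(abs(a * x + b - y) for x, y in zip(xs, ys))
    return a, b, err

# Nearly-linear data; the minimax line balances the extreme residuals.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.1, 1.0, 2.1, 2.9]
a, b, err = minimax_fit(xs, ys)
```

Unlike a least-squares fit, the minimax criterion bounds the single worst deviation, which is what matters for worst-case network responses such as equalizer ripple.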

  10. Taiwan Automated Telescope Network

    Directory of Open Access Journals (Sweden)

    Dean-Yi Chou

    2010-01-01

    Each telescope can be operated either interactively or fully automatically. In the interactive mode, it can be controlled through the Internet. In the fully automatic mode, the telescope operates with preset parameters without any human attention, including taking dark frames and flat frames. The network can also be used for studies that require continuous observations of selected objects.

  11. An expert diagnostic system based on neural networks and image analysis techniques in the field of automated cytogenetics.

    Science.gov (United States)

    Beksaç, M S; Eskiizmirliler, S; Cakar, A N; Erkmen, A M; Dağdeviren, A; Lundsteen, C

    1996-03-01

    In this study, we introduce an expert system for intelligent chromosome recognition and classification based on artificial neural networks (ANN) and features obtained by automated image analysis techniques. A microscope equipped with a CCTV camera, integrated with an IBM-PC compatible computer environment including a frame grabber, is used for image data acquisition. Features of the chromosomes are obtained directly from the digital chromosome images. Two new algorithms for automated object detection and object skeletonizing constitute the basis of the feature extraction phase which constructs the components of the input vector to the ANN part of the system. This first version of our intelligent diagnostic system uses a trained unsupervised neural network structure and an original rule-based classification algorithm to find a karyotyped form of randomly distributed chromosomes over a complete metaphase. We investigate the effects of network parameters on the classification performance and discuss the adaptability and flexibility of the neural system in order to reach a structure giving an output including information about both structural and numerical abnormalities. Moreover, the classification performances of the neural and rule-based systems are compared for each class of chromosome. PMID:8705397

  12. Neural Network Based Boolean Factor Analysis: Efficient Tool for Automated Topics Search.

    Czech Academy of Sciences Publication Activity Database

    Húsek, Dušan; Frolov, A. A.; Polyakov, P.Y.; Řezanková, H.

    Amman: Applied Science Private University, 2006 - (Issa, G.; El-Qawasmeh, E.; Raho, G.), s. 321-327 ISBN 9957-8592-0-X. [CSIT 2006. International Multiconference on Computer Science and Information Technology /4./. Amman (JO), 05.04.2006-07.04.2006] R&D Projects: GA AV ČR 1ET100300419 Institutional research plan: CEZ:AV0Z10300504 Keywords : Boolean factor analysis * neural networks * associative memory * clustering * web searching * semantic web * information retrieval * document indexing * document classification * document processing * data mining * machine learning Subject RIV: BB - Applied Statistics, Operational Research

  13. Automated Motivic Analysis

    DEFF Research Database (Denmark)

    Lartillot, Olivier

    2016-01-01

    Motivic analysis provides a very detailed understanding of musical compositions, but is also particularly difficult to formalize and systematize. A computational automation of the discovery of motivic patterns cannot be reduced to a mere extraction of all possible sequences of descriptions. The systematic approach inexorably leads to a proliferation of redundant structures that needs to be addressed properly. Global filtering techniques cause a drastic elimination of interesting structures that damages the quality of the analysis. On the other hand, a selection of closed patterns allows...

  14. Technological Developments in Networking, Education and Automation

    CERN Document Server

    Elleithy, Khaled; Iskander, Magued; Kapila, Vikram; Karim, Mohammad A; Mahmood, Ausif

    2010-01-01

    "Technological Developments in Networking, Education and Automation" includes a set of rigorously reviewed world-class manuscripts addressing and detailing state-of-the-art research projects in the following areas: Computer Networks: Access Technologies, Medium Access Control, Network architectures and Equipment, Optical Networks and Switching, Telecommunication Technology, and Ultra Wideband Communications. Engineering Education and Online Learning: including development of courses and systems for engineering, technical and liberal studies programs; online laboratories; intelligent

  15. A Systematic, Automated Network Planning Method

    DEFF Research Database (Denmark)

    Holm, Jens Åge; Pedersen, Jens Myrup

    2006-01-01

    This paper describes a case study conducted to evaluate the viability of a systematic, automated network planning method. The motivation for developing the network planning method was that many data networks are planned in an ad-hoc manner with no assurance of quality of the solution with respect to consistency and long-term characteristics. The developed method gives significant improvements on these parameters. The case study was conducted as a comparison between an existing network where the traffic was known and a proposed network designed by the developed method. It turned out that the proposed network performed better than the existing network with regard to the performance measurements used, which reflected how well the traffic was routed in the networks and the cost of establishing the networks. Challenges that need to be solved before the developed method can be used to design network...

  16. Controlling high speed automated transport network operations

    OpenAIRE

    de Feijter, R.

    2006-01-01

    This thesis presents a framework for the control of automated guided vehicles (AGVs). The framework implements the transport system as a community of cooperating agents. Besides the architecture and elements of the framework a wide range of infrastructure scene templates is described. These scene templates, ranging from terminal infrastructure to freeways, can be used as building blocks to create a control system for an automated transport network.

  17. Automated document analysis system

    Science.gov (United States)

    Black, Jeffrey D.; Dietzel, Robert; Hartnett, David

    2002-08-01

    A software application has been developed to aid law enforcement and government intelligence gathering organizations in the translation and analysis of foreign language documents with potential intelligence content. The Automated Document Analysis System (ADAS) provides the capability to search (data or text mine) documents in English and the most commonly encountered foreign languages, including Arabic. Hardcopy documents are scanned by a high-speed scanner and are processed by optical character recognition (OCR). Documents obtained in an electronic format bypass the OCR and are copied directly to a working directory. For translation and analysis, the script and the language of the documents are first determined. If the document is not in English, the document is machine translated to English. The documents are searched for keywords and key features in either the native language or translated English. The user can quickly review the document to determine if it has any intelligence content and whether detailed, verbatim human translation is required. The documents and document content are cataloged for potential future analysis. The system allows non-linguists to evaluate foreign language documents and allows for the quick analysis of a large quantity of documents. All document processing can be performed manually or automatically on a single document or a batch of documents.

  18. AUTOMATED ANALYSIS OF BREAKERS

    Directory of Open Access Journals (Sweden)

    E. M. Farhadzade

    2014-01-01

    Full Text Available Breakers are part of Electric Power Systems' equipment whose reliability greatly influences the reliability of power plants. In particular, breakers determine the structural reliability of the switchgear circuits of power stations and network substations. Failure of a breaker to switch off a short circuit, followed by failure of the backup unit or of the long-distance protection system, quite often leads to a system emergency. The problem of improving breakers' reliability and reducing maintenance expenses is becoming ever more urgent as the costs of maintaining and repairing oil and air-break circuit breakers systematically increase. The main direction for solving this problem is the improvement of diagnostic control methods and the organization of on-condition maintenance. This, however, demands a great amount of statistical information about breakers' nameplate data and operating conditions, their failures, testing and repair, as well as advanced computer software and a specialized automated information system (AIS). A new AIS, named AISV, was developed at the "Reliability of power equipment" department of AzRDSI of Energy. The main features of AISV are: to provide database security and accuracy; to carry out systematic control of breakers' conformity with operating conditions; to estimate the value of individual reliability and the characteristics of its change for a given combination of characteristics; and to provide the personnel responsible for technical maintenance of breakers not only with information but also with methodological support, including recommendations for solving the given problem and advanced methods for its realization.

  19. Automated Scheduling for NASA's Deep Space Network

    OpenAIRE

    Johnston, Mark D.; Jet Propulsion Laboratory, California Institute of Technology; Tran, Daniel; Jet Propulsion Laboratory, California Institute of Technology; Arroyo, Belinda; Jet Propulsion Laboratory, California Institute of Technology; Sorensen, Sugi; Jet Propulsion Laboratory, California Institute of Technology; Tay, Peter; Jet Propulsion Laboratory, California Institute of Technology; Carruth, Butch; Innovative Productivity Solutions, Inc.; Coffman, Adam; Innovative Productivity Solutions, Inc.; Wallace, Mike; Innovative Productivity Solutions, Inc.

    2014-01-01

    This article describes the DSN scheduling engine (DSE) component of a new scheduling system being deployed for NASA's Deep Space Network. The DSE provides core automation functionality for scheduling the network, including the interpretation of scheduling requirements expressed by users, their elaboration into tracking passes, and the resolution of conflicts and constraint violations. The DSE incorporates both systematic search and repair-based algorithms, used for different phases and purpos...

  20. Automation of Network Management with Multidisciplinary Concepts

    OpenAIRE

    Shaleeza Sohail

    2010-01-01

    In today's growing and ever-changing world of computer networks, management systems need to have the abilities of intellectual reasoning, dynamic real-time decision making, and experience-based self-adaptation and improvement. Furthermore, the ever-increasing size and complexity of computer networks require automation of their management systems. Automation minimizes human involvement, which produces effective and time-saving solutions for proper and dynamic supervision of these large and heterogeneous netwo...

  1. Automated Experimentation for Ecological Networks

    OpenAIRE

    Lurgi, M.; Robertson, D

    2011-01-01

    Background: In ecological networks, natural communities are studied from a complex systems perspective by representing interactions among species within them in the form of a graph, which is in turn analysed using mathematical tools. Topological features encountered in complex networks have been proved to provide the systems they represent with interesting attributes such as robustness and stability, which in ecological systems translates into the ability of communities to resist perturbations ...

  2. Automated experimentation in ecological networks

    OpenAIRE

    Lurgi, Miguel; Robertson, David

    2011-01-01

    Background: In ecological networks, natural communities are studied from a complex systems perspective by representing interactions among species within them in the form of a graph, which is in turn analysed using mathematical tools. Topological features encountered in complex networks have been proved to provide the systems they represent with interesting attributes such as robustness and stability, which in ecological systems translates into the ability of communities to resist perturbations...

  3. Performance assessment of the Tactical Network Analysis and Planning System Plus (TNAPS+) automated planning tool for C4I systems

    OpenAIRE

    Ziegenfuss, Paul C.

    1999-01-01

    The Joint Staff established the Tactical Network Analysis and Planning System Plus (TNAPS+) as the interim joint communications planning and management system. The Marines' Command and Control Systems Course and the Army's Joint Task Force System Planning Course both utilize TNAPS+ to conduct tactical C4I network planning in their course requirements. This thesis is a Naval Postgraduate School C4I curriculum practical application of TNAPS+ in an expeditionary Joint Task Force environment, focu...

  4. Optimization-based Method for Automated Road Network Extraction

    Energy Technology Data Exchange (ETDEWEB)

    Xiong, D

    2001-09-18

    Automated road information extraction has significant applicability in transportation. It provides a means for creating, maintaining, and updating transportation network databases that are needed for purposes ranging from traffic management to automated vehicle navigation and guidance. This paper reviews the literature on road extraction and describes a study of an optimization-based method for automated road network extraction.

  5. Optimization-based Method for Automated Road Network Extraction

    International Nuclear Information System (INIS)

    Automated road information extraction has significant applicability in transportation. It provides a means for creating, maintaining, and updating transportation network databases that are needed for purposes ranging from traffic management to automated vehicle navigation and guidance. This paper reviews the literature on road extraction and describes a study of an optimization-based method for automated road network extraction.

  6. Automating Risk Analysis of Software Design Models

    OpenAIRE

    Maxime Frydman; Guifré Ruiz; Elisa Heymann; Eduardo César; Barton P. Miller

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security e...

  7. Automated activation-analysis system

    International Nuclear Information System (INIS)

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. The system and its mode of operation for a large reconnaissance survey are described

  8. Automated Analysis of Corpora Callosa

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Davies, Rhodri H.

    2003-01-01

    This report describes and evaluates the steps needed to perform modern model-based interpretation of the corpus callosum in MRI. The process is discussed from the initial landmark-free contours to full-fledged statistical models based on the Active Appearance Models framework. Topics treated include landmark placement, background modelling and multi-resolution analysis. Preliminary quantitative and qualitative validation in a cross-sectional study shows that fully automated analysis and segmentation of the corpus callosum are feasible.

  9. Home Network Technologies and Automating Demand Response

    Energy Technology Data Exchange (ETDEWEB)

    McParland, Charles

    2009-12-01

    sophisticated energy consumers, it has been possible to improve the DR 'state of the art' with a manageable commitment of technical resources on both the utility and consumer side. Although numerous C & I DR applications of a DRAS infrastructure are still in either prototype or early production phases, these early attempts at automating DR have been notably successful for both utilities and C & I customers. Several factors have strongly contributed to this success and will be discussed below. These successes have motivated utilities and regulators to look closely at how DR programs can be expanded to encompass the remaining (roughly) half of the state's energy load - the light commercial and, in numerical terms, the more important residential customer market. This survey examines technical issues facing the implementation of automated DR in the residential environment. In particular, we will look at the potential role of home automation networks in implementing wide-scale DR systems that communicate directly to individual residences.

  10. Automated Wildfire Detection Through Artificial Neural Networks

    Science.gov (United States)

    Miller, Jerry; Borne, Kirk; Thomas, Brian; Huang, Zhenping; Chi, Yuechen

    2005-01-01

    We have tested and deployed Artificial Neural Network (ANN) data mining techniques to analyze remotely sensed multi-channel imaging data from MODIS, GOES, and AVHRR. The goal is to train the ANN to learn the signatures of wildfires in remotely sensed data in order to automate the detection process. We train the ANN using the set of human-detected wildfires in the U.S., which are provided by the Hazard Mapping System (HMS) wildfire detection group at NOAA/NESDIS. The ANN is trained to mimic the behavior of fire detection algorithms and the subjective decision-making by NOAA HMS Fire Analysts. We use a local extremum search in order to isolate fire pixels, and then we extract a 7x7 pixel array around that location in 3 spectral channels. The corresponding 147 pixel values are used to populate a 147-dimensional input vector that is fed into the ANN. The ANN accuracy is tested and overfitting is avoided by using a subset of the training data that is set aside as a test data set. We have achieved an automated fire detection accuracy of 80-92%, depending on a variety of ANN parameters and for different instrument channels among the 3 satellites. We believe that this system can be deployed worldwide or for any region to detect wildfires automatically in satellite imagery of those regions. These detections can ultimately be used to provide thermal inputs to climate models.
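The 147-dimensional input vector described above (a 7x7 pixel window in each of 3 spectral channels around a local extremum) can be sketched as follows; the function name and the synthetic data are illustrative, not the authors' code.

```python
def fire_feature_vector(channels, row, col, win=7):
    """Flatten a win x win window around a candidate fire pixel, taken
    from each spectral channel, into one feature vector for the ANN."""
    half = win // 2
    vec = []
    for img in channels:
        for r in range(row - half, row + half + 1):
            for c in range(col - half, col + half + 1):
                vec.append(img[r][c])
    return vec

# Three synthetic 16x16 "channels" with a single hot pixel at (8, 8).
channels = [[[float(r == 8 and c == 8) for c in range(16)] for r in range(16)]
            for _ in range(3)]
vec = fire_feature_vector(channels, 8, 8)   # len(vec) == 7 * 7 * 3 == 147
```

Each candidate pixel found by the local extremum search yields one such vector, which the trained ANN then labels as fire or non-fire.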

  11. Flux-P: Automating Metabolic Flux Analysis

    Directory of Open Access Journals (Sweden)

    Birgitta E. Ebert

    2012-11-01

    Full Text Available Quantitative knowledge of intracellular fluxes in metabolic networks is invaluable for inferring metabolic system behavior and the design principles of biological systems. However, intracellular reaction rates often cannot be calculated directly but have to be estimated; for instance, via 13C-based metabolic flux analysis, a model-based interpretation of stable carbon isotope patterns in intermediates of metabolism. Existing software such as FiatFlux, OpenFLUX or 13CFLUX supports experts in this complex analysis, but requires several steps that have to be carried out manually, hence restricting the use of this software for data interpretation to a rather small number of experiments. In this paper, we present Flux-P as an approach to automate and standardize 13C-based metabolic flux analysis, using the Bio-jETI workflow framework. Based, as an example, on the FiatFlux software, it demonstrates how services can be created that carry out the different analysis steps autonomously and how these can subsequently be assembled into software workflows that perform automated, high-throughput intracellular flux analysis of high quality and reproducibility. Besides significant acceleration and standardization of the data analysis, the agile workflow-based realization supports flexible changes of the analysis workflows on the user level, making it easy to perform custom analyses.
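The Bio-jETI workflows themselves are not shown in this abstract; as a generic illustration of assembling independent analysis steps into an automated pipeline, consider the sketch below. The step names and toy data are invented and do not correspond to FiatFlux internals.

```python
def run_pipeline(data, steps):
    """Feed the output of each analysis step into the next; log progress."""
    log = []
    for step in steps:
        data = step(data)        # each step is an autonomous service
        log.append(step.__name__)
    return data, log

# Invented placeholder steps standing in for real flux-analysis services.
def load_raw(d):        return {"raw": d}
def correct_spectra(d): return {**d, "corrected": [x * 0.98 for x in d["raw"]]}
def estimate_fluxes(d): return {**d, "fluxes": [round(x, 2) for x in d["corrected"]]}

result, log = run_pipeline([1.0, 2.0],
                           [load_raw, correct_spectra, estimate_fluxes])
```

Because each step is a self-contained function, workflows can be reordered or swapped at the user level, which is the flexibility the abstract attributes to the workflow-based realization.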

  12. Application of Artificial Neural Network Modeling to the Analysis of the Automated Radioxenon Sampler-Analyzer State of Health Sensors

    Energy Technology Data Exchange (ETDEWEB)

    Hayes, James C.; Doctor, Pam G.; Heimbigner, Tom R.; Hubbard, Charles W.; Kangas, Lars J.; Keller, Paul E.; McIntyre, Justin I.; Schrom, Brian T.; Suarez, Reynold

    2006-09-19

    The Automated Radioxenon Analyzer/Sampler (ARSA) is a radioxenon gas collection and analysis system operating autonomously under computer control. The ARSA systems are deployed as part of an international network of sensors, with individual stations feeding radioxenon concentration data to a central data center. Because the ARSA instrument is complex and is often deployed in remote areas, it requires constant self-monitoring to verify that it is operating according to specifications. System performance monitoring is accomplished by over 200 internal sensors, with some values reported to the data center. Several sensors are designated as safety sensors that can automatically shut down the ARSA when unsafe conditions arise. In this case, the data center is advised of the shutdown and the cause, so that repairs may be initiated. The other sensors, called state of health (SOH) sensors, also provide valuable information on the functioning of the ARSA and are particularly useful for detecting impending malfunctions before they occur to avoid unscheduled shutdowns. Any of the sensor readings can be displayed by an ARSA Data Viewer, but interpretation of the data is difficult without specialized technical knowledge not routinely available at the data center. Therefore it would be advantageous to have sensor data automatically evaluated for the precursors of malfunctions and the results transmitted to the data center. Artificial Neural Networks (ANN) are a class of data analysis methods that have shown wide application to monitoring systems with large numbers of information inputs, such as the ARSA. In this work supervised and unsupervised ANN methods were applied to ARSA SOH data recorded during normal operation of the instrument, and the ability of ANN methods to predict system state is presented.

  13. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

    Full Text Available The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  14. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increasing number of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling, two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
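The identification-tree idea can be sketched as a predicate tree walked over each element of a data-flow diagram. The element properties and threat label below are hypothetical and do not reproduce AutSEC's actual rule set.

```python
# A miniature data-flow diagram: each element is a dict of properties
# (names and properties invented for illustration).
dfd = [
    {"id": "login_form", "type": "process",
     "crosses_trust_boundary": True, "input_validated": False},
    {"id": "audit_log", "type": "data_store",
     "crosses_trust_boundary": False, "input_validated": True},
]

# An identification tree: a predicate tree walked per element; reaching a
# string leaf means the threat applies (a mitigation tree would hang off it).
injection_tree = ("crosses_trust_boundary", {
    True: ("input_validated", {False: "tampering/injection", True: None}),
    False: None,
})

def walk(tree, element):
    if tree is None or isinstance(tree, str):
        return tree                       # leaf: threat label, or no threat
    prop, branches = tree
    return walk(branches.get(element.get(prop)), element)

threats = {e["id"]: walk(injection_tree, e) for e in dfd}
```

Walking every tree over every DFD element yields the candidate threat list that a mitigation-advice stage would then rank by cost.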

  15. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time-consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces the user's task to defining fuel pin types, enrichments, assembly maps and operational parameters through a user-friendly GUI. The second part of reload safety analysis calculations is done by CycleKit, a code which is linked with our core physics code ANDREA. Through CycleKit, large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable HTML format. Using this set of tools for reload safety analysis simplifies

  16. Cost Analysis of an Automated and Manual Cataloging and Book Processing System.

    Science.gov (United States)

    Druschel, Joselyn

    1981-01-01

    Cost analysis of an automated network system and a manual system of cataloging and book processing indicates a 20 percent savings using automation. Per unit costs based on the average monthly automation rate are used for comparison. Higher manual system costs are attributed to staff costs. (RAA)

  17. Automated quantitative analysis for pneumoconiosis

    Science.gov (United States)

    Kondo, Hiroshi; Zhao, Bin; Mino, Masako

    1998-09-01

    Automated quantitative analysis for pneumoconiosis is presented. In this paper, Japanese standard radiographs of pneumoconiosis are categorized by measuring the area density and the number density of small rounded opacities. Furthermore, the size and shape of the opacities are classified by measuring the equivalent radius of each opacity. The proposed method includes a bi-level unsharp masking filter with a 1D uniform impulse response in order to eliminate undesired parts such as the images of blood vessels and ribs in the chest X-ray image. Fuzzy contrast enhancement is also introduced in this method for easy and exact detection of small rounded opacities. Many simulation examples show that the proposed method is more reliable than the former method.
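The unsharp-masking step can be sketched in one dimension: subtracting a box-filtered version of an intensity profile (a uniform impulse response) suppresses slowly varying structures such as ribs and vessels while a narrow opacity survives. The profile values are synthetic, not radiograph data.

```python
# 1-D unsharp masking with a uniform (box) impulse response.
def box_blur(signal, radius):
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - radius), min(len(signal), i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def unsharp(signal, radius):
    # High-pass detail: original minus its local average.
    return [s - b for s, b in zip(signal, box_blur(signal, radius))]

profile = [0.1 * i for i in range(40)]   # broad "rib" intensity ramp
profile[20] += 5.0                       # narrow opacity on top of it
detail = unsharp(profile, 8)
peak = max(range(len(detail)), key=lambda i: detail[i])
```

After the subtraction the ramp contributes almost nothing to the detail signal, so the opacity stands out for the subsequent density measurements.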

  18. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  19. Automation for System Safety Analysis

    Science.gov (United States)

    Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul

    2009-01-01

    This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
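Task (3) above, finding possible paths from hazard sources to vulnerable entities, reduces to graph traversal over the extracted component/connection model. The component names below are invented for illustration, not drawn from the Orion models.

```python
from collections import deque

# A hypothetical component/connection model: edges point from each element
# to the elements a failure can propagate to.
model = {
    "thruster_leak": ["propellant_line"],
    "propellant_line": ["valve_ctrl_sw"],
    "valve_ctrl_sw": ["flight_computer"],
    "flight_computer": [],
}

def hazard_paths(graph, source, target):
    # Breadth-first enumeration of simple paths from a hazard source to a
    # vulnerable entity.
    paths, queue = [], deque([[source]])
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            paths.append(path)
            continue
        for nxt in graph.get(path[-1], []):
            if nxt not in path:           # keep paths simple (no cycles)
                queue.append(path + [nxt])
    return paths

paths = hazard_paths(model, "thruster_leak", "flight_computer")
```

Each enumerated path is a candidate scenario for software integration testing, matching step (4) of the toolchain.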

  20. A Method for Automated Planning of FTTH Access Network Infrastructures

    DEFF Research Database (Denmark)

    Riaz, Muhammad Tahir; Pedersen, Jens Myrup; Madsen, Ole Brun

    2005-01-01

    In this paper a method for automated planning of Fiber to the Home (FTTH) access networks is proposed. We introduced a systematic approach for planning access network infrastructure. The GIS data and a set of algorithms were employed to make the planning process more automatic. The method explains...

  1. Management issues in automated audit analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, K.A.; Hochberg, J.G.; Wilhelmy, S.K.; McClary, J.F.; Christoph, G.G.

    1994-03-01

    This paper discusses management issues associated with the design and implementation of an automated audit analysis system that we use to detect security events. It gives the viewpoint of a team directly responsible for developing and managing such a system. We use Los Alamos National Laboratory's Network Anomaly Detection and Intrusion Reporter (NADIR) as a case in point. We examine issues encountered at Los Alamos, detail our solutions to them, and where appropriate suggest general solutions. After providing an introduction to NADIR, we explore four general management issues: cost-benefit questions, privacy considerations, legal issues, and system integrity. Our experiences are of general interest both to security professionals and to anyone who may wish to implement a similar system. While NADIR investigates security events, the methods used and the management issues are potentially applicable to a broad range of complex systems. These include those used to audit credit card transactions, medical care payments, and procurement systems.

  2. Automated Pipelines for Spectroscopic Analysis

    CERN Document Server

    Prieto, Carlos Allende

    2016-01-01

    The Gaia mission will have a profound impact on our understanding of the structure and dynamics of the Milky Way. Gaia is providing an exhaustive census of stellar parallaxes, proper motions, positions, colors and radial velocities, but also leaves some glaring holes in an otherwise complete data set. The radial velocities measured with the on-board high-resolution spectrograph will only reach some 10% of the full sample of stars with astrometry and photometry from the mission, and detailed chemical information will be obtained for less than 1%. Teams all over the world are organizing large-scale projects to provide complementary radial velocities and chemistry, since this can now be done very efficiently from the ground thanks to large and mid-size telescopes with a wide field-of-view and multi-object spectrographs. As a result, automated data processing is taking on ever-increasing relevance, and the concept is being applied to many more areas, from targeting to analysis. In this paper, I provide a quick overview...

  3. An Automated 3d Indoor Topological Navigation Network Modelling

    Science.gov (United States)

    Jamali, A.; Rahman, A. A.; Boguslawski, P.; Gold, C. M.

    2015-10-01

    Indoor navigation is important for various applications such as disaster management and safety analysis. In the last decade, the indoor environment has been a focus of wide research that includes developing techniques for acquiring indoor data (e.g. terrestrial laser scanning), 3D indoor modelling and 3D indoor navigation models. In this paper, an automated 3D topological indoor network generated from inaccurate 3D building models is proposed. In a normal scenario, 3D indoor navigation network derivation needs accurate 3D models with no errors (e.g. gaps, intersections), and two cells (e.g. rooms, corridors) must touch each other for their connection to be built. The presented 3D modelling of the indoor navigation network is based on surveying control points and is less dependent on the 3D geometrical building model. To reduce the time and cost of the indoor building data acquisition process, a Trimble LaserAce 1000 was used as the surveying instrument. The modelling results were validated against an accurate geometry of the indoor building environment acquired using a Trimble M3 total station.

  4. Automated neural network-based instrument validation system

    Science.gov (United States)

    Xu, Xiao

    2000-10-01

    In a complex control process, instrument calibration is periodically performed to maintain the instruments within the calibration range, which assures proper control and minimizes down time. Instruments are usually calibrated under out-of-service conditions using manual calibration methods, which may cause incorrect calibration or equipment damage. Continuous in-service calibration monitoring of sensors and instruments will reduce unnecessary instrument calibrations, give operators more confidence in instrument measurements, increase plant efficiency or product quality, and minimize the possibility of equipment damage during unnecessary manual calibrations. In this dissertation, an artificial neural network (ANN)-based instrument calibration verification system is designed to achieve the on-line monitoring and verification goal for scheduling maintenance. Since an ANN is a data-driven model, it can learn the relationships among signals without prior knowledge of the physical model or process, which is usually difficult to establish for complex non-linear systems. Furthermore, the ANNs provide a noise-reduced estimate of the signal measurement. More importantly, since a neural network learns the relationships among signals, it can give an unfaulted estimate of a faulty signal based on information provided by other unfaulted signals; that is, provide a correct estimate of a faulty signal. This ANN-based instrument verification system is capable of detecting small degradations or drifts occurring in instrumentation, and can preclude false control actions or system damage caused by instrument degradation. In this dissertation, an automated scheme of neural network construction is developed. Previously, the neural network structure design required extensive knowledge of neural networks. An automated design methodology was developed so that a network structure can be created without expert interaction. This validation system was designed to monitor process sensors plant
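The core idea, estimating one signal from its correlated neighbours and flagging large residuals as drift, can be sketched with ordinary least squares standing in for the neural network. The sensor relationship, noise level, and alarm threshold below are hypothetical.

```python
import math
import random

random.seed(1)

# Correlated plant signals: sensor C is physically tied to sensors A and B
# (the linear relation and noise level are invented for illustration).
samples = []
for k in range(100):
    t = 0.1 * k
    a = 10.0 + 2.0 * math.sin(t)
    b = 5.0 + math.cos(t)
    c = 0.5 * a + 1.5 * b + random.gauss(0.0, 0.01)
    samples.append((a, b, c))

# Fit c ~ wa*a + wb*b by ordinary least squares (normal equations) --
# a stand-in for the data-driven model that learns inter-signal relations.
saa = sum(a * a for a, b, c in samples)
sab = sum(a * b for a, b, c in samples)
sbb = sum(b * b for a, b, c in samples)
sac = sum(a * c for a, b, c in samples)
sbc = sum(b * c for a, b, c in samples)
det = saa * sbb - sab * sab
wa = (sac * sbb - sbc * sab) / det
wb = (saa * sbc - sab * sac) / det

def residual(a, b, c):
    # Difference between measured C and its estimate from A and B.
    return abs(c - (wa * a + wb * b))

a, b = 11.0, 5.5
c_true = 0.5 * a + 1.5 * b
healthy_alarm = residual(a, b, c_true) > 0.1        # calibrated sensor
drifted_alarm = residual(a, b, c_true + 0.8) > 0.1  # drifted sensor
```

Because the estimate of C comes only from the unfaulted signals A and B, a drift in C itself does not corrupt its own reference value, which is the property the dissertation exploits.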

  5. Realtime Automation Networks in moVing industrial Environments

    Directory of Open Access Journals (Sweden)

    Rafael Leidinger

    2012-04-01

    Full Text Available Radio-based wireless data communication has made the realization of new technical solutions possible in many fields of automation technology (AT). For about ten years, a constant disproportionate growth of wireless technologies has been observed in automation technology. However, it shows that, especially for AT, conventional technologies of office automation are unsuitable and/or not manageable. The employment of mobile services in industrial automation technology has the potential of significant cost and time savings. This leads to increased productivity in various fields of AT, for example in factory and process automation or in production logistics. In this paper, technologies and solutions for an automation-suited supply of mobile wireless services are introduced under the criteria of real-time suitability, IT security and service orientation. Emphasis is put on the investigation and development of wireless convergence layers for different radio technologies, on the central provision of support services for an easy-to-use, central, backup-enabled management of combined wired/wireless networks, and on the study of integrability in a Profinet real-time Ethernet network [1].

  6. Distribution system analysis and automation

    CERN Document Server

    Gers, Juan

    2013-01-01

    A comprehensive guide to techniques that allow engineers to simulate, analyse and optimise power distribution systems which, combined with automation, underpin the emerging concept of the "smart grid". The book supports theoretical concepts with real-world applications and MATLAB exercises.

  7. Feasibility Analysis of Crane Automation

    Institute of Scientific and Technical Information of China (English)

    DONG Ming-xiao; MEI Xue-song; JIANG Ge-dong; ZHANG Gui-qing

    2006-01-01

    This paper summarizes the modeling methods, open-loop control and closed-loop control techniques of various forms of cranes, worldwide, and discusses their feasibilities and limitations in engineering. Then the dynamic behaviors of cranes are analyzed. Finally, we propose applied modeling methods and feasible control techniques and demonstrate the feasibilities of crane automation.

  8. AUTOMATED DEFECT CLASSIFICATION USING AN ARTIFICIAL NEURAL NETWORK

    International Nuclear Information System (INIS)

    The automated defect classification algorithm based on an artificial neural network with a multilayer backpropagation structure was utilized. Selected features of flaws were used as input data. In order to train the neural network, it is necessary to prepare learning data, which is a representative database of defects. Database preparation requires the following steps: image acquisition and pre-processing, image enhancement, defect detection and feature extraction. Real digital radiographs of welded parts of a ship were used for this purpose.

  9. Automated Defect Classification Using AN Artificial Neural Network

    Science.gov (United States)

    Chady, T.; Caryk, M.; Piekarczyk, B.

    2009-03-01

    The automated defect classification algorithm based on an artificial neural network with a multilayer backpropagation structure was utilized. Selected features of flaws were used as input data. In order to train the neural network, it is necessary to prepare learning data, which is a representative database of defects. Database preparation requires the following steps: image acquisition and pre-processing, image enhancement, defect detection and feature extraction. Real digital radiographs of welded parts of a ship were used for this purpose.
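A minimal version of such a classifier can be sketched as a small multilayer perceptron trained by backpropagation on two hypothetical flaw features (area and elongation). The feature ranges, network size, and labels are invented for illustration, not taken from the radiograph database.

```python
import math
import random

random.seed(2)
sig = lambda z: 1.0 / (1.0 + math.exp(-z))

# Hypothetical flaw features: (area, elongation) scaled to [0, 1];
# label 1.0 = crack (elongated), 0.0 = porosity (rounded).
data = ([([random.uniform(0.1, 0.5), random.uniform(0.6, 0.9)], 1.0) for _ in range(40)]
        + [([random.uniform(0.1, 0.5), random.uniform(0.1, 0.4)], 0.0) for _ in range(40)])

# 2-2-1 multilayer perceptron trained with plain backpropagation;
# fixed initial weights keep the example deterministic.
w1, b1 = [[1.0, 1.0], [-1.0, 1.0]], [0.0, 0.0]
w2, b2, lr = [0.1, -0.1], 0.0, 0.5

for _ in range(2000):
    for x, y in data:
        h = [sig(r[0] * x[0] + r[1] * x[1] + bj) for r, bj in zip(w1, b1)]
        o = sig(w2[0] * h[0] + w2[1] * h[1] + b2)
        d_o = o - y                                   # output-layer error
        d_h = [d_o * w2[j] * h[j] * (1.0 - h[j]) for j in range(2)]
        w2 = [w2[j] - lr * d_o * h[j] for j in range(2)]
        b2 -= lr * d_o
        for j in range(2):
            w1[j] = [w1[j][i] - lr * d_h[j] * x[i] for i in range(2)]
            b1[j] -= lr * d_h[j]

def classify(x):
    h = [sig(r[0] * x[0] + r[1] * x[1] + bj) for r, bj in zip(w1, b1)]
    return 1 if sig(w2[0] * h[0] + w2[1] * h[1] + b2) > 0.5 else 0

acc = sum(classify(x) == int(y) for x, y in data) / len(data)
```

A real system would feed many more extracted features through a larger hidden layer; the backpropagation update rule is the same.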

  10. The contaminant analysis automation robot implementation for the automated laboratory

    International Nuclear Information System (INIS)

    The Contaminant Analysis Automation (CAA) project defines the automated laboratory as a series of standard laboratory modules (SLM) serviced by a robotic standard support module (SSM). These SLMs are designed to allow plug-and-play integration into automated systems that perform standard analysis methods (SAM). While the SLMs are autonomous in the execution of their particular chemical processing task, the SAM concept relies on a high-level task sequence controller (TSC) to coordinate the robotic delivery of materials requisite for SLM operations, initiate an SLM operation with the chemical-method-dependent operating parameters, and coordinate the robotic removal of materials from the SLM when its operation is complete. A standard set of commands and events has been established to ready the SLMs for transport operations as well as processing. The Supervisor and Subsystems (GENISAS) software governs events from the SLMs and robot. The Intelligent System Operating Environment (ISOE) enables the inter-process communications used by GENISAS. CAA selected the Hewlett-Packard Optimized Robot for Chemical Analysis (ORCA) and its associated Windows-based Methods Development Software (MDS) as the robot SSM. The MDS software is used to teach the robot each SLM position and required material port motions. To allow the TSC to command these SLM motions, a hardware and software implementation was required that allowed message passing between different operating systems. This implementation involved the use of a Virtual Memory Extended (VME) rack with a Force CPU-30 computer running VxWorks, a real-time multitasking operating system, and a RadiSys PC-compatible VME computer running MDS. A GENISAS server on the Force computer accepts a transport command from the TSC, a GENISAS supervisor, over Ethernet and notifies software on the RadiSys PC of the pending command through VMEbus shared memory. The command is then delivered to the MDS robot control software using a Windows Dynamic Data Exchange conversation

  11. A framework for automated service composition in collaborative networks

    NARCIS (Netherlands)

    H. Afsarmanesh; M. Sargolzaei; M. Shadi

    2012-01-01

    This paper proposes a novel framework for automated software service composition that can significantly support and enhance collaboration among enterprises in service provision industry, such as in tourism insurance and e-commerce collaborative networks (CNs). Our proposed framework is founded on se

  12. Building Automation Networks for Smart Grids

    Directory of Open Access Journals (Sweden)

    Peizhong Yi

    2011-01-01

    Full Text Available Smart grid, as an intelligent power generation, distribution, and control system, needs various communication systems to meet its requirements. The ability to communicate seamlessly across multiple networks and domains is an open issue which is yet to be adequately addressed in smart grid architectures. In this paper, we present a framework for end-to-end interoperability in home and building area networks within smart grids. 6LoWPAN and the compact application protocol are utilized to facilitate the use of IPv6 and Zigbee application profiles such as Zigbee smart energy for network and application layer interoperability, respectively. A differential service medium access control scheme enables end-to-end connectivity between 802.15.4 and IP networks while providing quality of service guarantees for Zigbee traffic over Wi-Fi. We also address several issues including interference mitigation, load scheduling, and security and propose solutions to them.

  13. Building Automation Networks for Smart Grids

    OpenAIRE

    Peizhong Yi; Abiodun Iwayemi; Chi Zhou

    2011-01-01

    Smart grid, as an intelligent power generation, distribution, and control system, needs various communication systems to meet its requirements. The ability to communicate seamlessly across multiple networks and domains is an open issue which is yet to be adequately addressed in smart grid architectures. In this paper, we present a framework for end-to-end interoperability in home and building area networks within smart grids. 6LoWPAN and the compact application protocol are utilized to facili...

  14. Network Traffic Obfuscation and Automated Internet Censorship

    OpenAIRE

    Dixon, Lucas; Ristenpart, Thomas; Shrimpton, Thomas

    2016-01-01

    Internet censors seek ways to identify and block internet access to information they deem objectionable. Increasingly, censors deploy advanced networking tools such as deep-packet inspection (DPI) to identify such connections. In response, activists and academic researchers have developed and deployed network traffic obfuscation mechanisms. These apply specialized cryptographic tools to attempt to hide from DPI the true nature and content of connections. In this survey, we give an overview of...

  15. Automated analysis of 3D echocardiography

    NARCIS (Netherlands)

    Stralen, Marijn van

    2009-01-01

    In this thesis we aim at automating the analysis of 3D echocardiography, mainly targeting the functional analysis of the left ventricle. Manual analysis of these data is cumbersome, time-consuming and is associated with inter-observer and inter-institutional variability. Methods for reconstruction o

  16. Automated Negotiation for Resource Assignment in Wireless Surveillance Sensor Networks

    Directory of Open Access Journals (Sweden)

    Enrique de la Hoz

    2015-11-01

    Full Text Available Due to the low cost of CMOS IP-based cameras, wireless surveillance sensor networks have emerged as a new application of sensor networks, able to monitor public or private areas or even country borders. Since these networks are bandwidth-intensive and the radioelectric spectrum is limited, especially in unlicensed bands, it is mandatory to assign frequency channels in a smart manner. In this work, we propose the application of automated negotiation techniques for frequency assignment. Results show that these techniques are very well suited to the problem, obtaining the best solutions among the techniques with which we compared them.
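The underlying assignment problem can be sketched as colouring an interference graph. This greedy sketch is a simplified stand-in for the negotiation protocol of the paper, and the camera topology is hypothetical.

```python
# Frequency assignment as greedy colouring of an interference graph.
# Nodes are cameras; an edge means the two cameras would interfere if
# assigned the same channel.
interference = {
    "cam1": {"cam2", "cam3"},
    "cam2": {"cam1", "cam3"},
    "cam3": {"cam1", "cam2", "cam4"},
    "cam4": {"cam3"},
}
channels = [1, 6, 11]                     # non-overlapping 2.4 GHz channels

# Assign the most-constrained cameras first; pick the lowest free channel.
assignment = {}
for cam in sorted(interference, key=lambda c: -len(interference[c])):
    used = {assignment[n] for n in interference[cam] if n in assignment}
    assignment[cam] = next(ch for ch in channels if ch not in used)

# Count residual same-channel conflicts (each edge counted once).
conflicts = sum(assignment[a] == assignment[b]
                for a in interference for b in interference[a]) // 2
```

Where a centralized greedy pass needs global knowledge, the negotiation techniques of the paper let the camera agents converge on such an assignment by exchanging offers.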

  17. NetBench. Automated Network Performance Testing

    CERN Document Server

    Cadeddu, Mattia

    2016-01-01

    In order to evaluate the operation of high performance routers, CERN has developed the NetBench software to run benchmarking tests by injecting various traffic patterns and observing the network devices behaviour in real-time. The tool features a modular design with a Python based console used to inject traffic and collect the results in a database, and a web user

  18. Automation of the Analysis of Moessbauer Spectra

    International Nuclear Information System (INIS)

    In the present report we propose the automation of least-squares fitting of Moessbauer spectra, the identification of the substance, its crystal structure and access to the references with the help of a genetic algorithm, fuzzy logic and an artificial neural network associated with a databank of Moessbauer parameters and references. This system could be useful for specialists and non-specialists, in industry as well as in research laboratories

  19. Automated diagnosis of rolling bearings using MRA and neural networks

    Science.gov (United States)

    Castejón, C.; Lara, O.; García-Prada, J. C.

    2010-01-01

    Any industry needs an efficient predictive plan in order to optimize the management of resources and improve the economy of the plant by reducing unnecessary costs and increasing the level of safety. A great percentage of breakdowns in productive processes are caused by bearings. They begin to deteriorate from early stages of their functional life, also called the incipient level. This manuscript develops an automated diagnosis of rolling bearings based on the analysis and classification of signature vibrations. The novelty of this work is the application of the methodology proposed for data collected from a quasi-real industrial machine, where rolling bearings support the radial and axial loads the bearings are designed for. Multiresolution analysis (MRA) is used in a first stage in order to extract the most interesting features from signals. Features will be used in a second stage as inputs of a supervised neural network (NN) for classification purposes. Experimental results carried out in a real system show the soundness of the method which detects four bearing conditions (normal, inner race fault, outer race fault and ball fault) in a very incipient stage.
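The MRA feature-extraction stage can be sketched with one level of a Haar wavelet decomposition: an incipient bearing fault adds impulsive content that raises the detail-band energy, which is the kind of feature the classifier receives. The signals below are synthetic, not measured vibrations.

```python
import math

# One level of a Haar multiresolution analysis: split a signal into
# approximation (low-pass) and detail (high-pass) coefficients.
def haar_step(signal):
    approx = [(signal[2 * i] + signal[2 * i + 1]) / math.sqrt(2.0)
              for i in range(len(signal) // 2)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / math.sqrt(2.0)
              for i in range(len(signal) // 2)]
    return approx, detail

def band_energy(coeffs):
    return sum(c * c for c in coeffs)

healthy = [math.sin(0.1 * i) for i in range(64)]     # smooth vibration
faulty = [s + (1.0 if i % 16 == 0 else 0.0)          # periodic impacts
          for i, s in enumerate(healthy)]

_, d_healthy = haar_step(healthy)
_, d_faulty = haar_step(faulty)
```

Repeating `haar_step` on the approximation coefficients yields the multi-level decomposition; the per-band energies then form the input vector for the supervised neural network.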

  20. An automated inverse analysis system using neural networks and computational mechanics with its application to identification of 3-D crack shape hidden in solid

    International Nuclear Information System (INIS)

    This paper describes a new inverse analysis system using hierarchical (multilayer) neural networks and computational mechanics. The present inverse analysis basically consists of the following three subprocesses: first, by parametrically varying system parameters, their corresponding system responses are calculated through computational mechanics simulations, each of which is an ordinary direct analysis; each data pair of system parameters and system responses is called a training pattern. Second, a back-propagation neural network is iteratively trained using a number of training patterns; the system responses are given to the input units of the network, while the system parameters to be identified are given to its output units as teacher signals. Third, some measured system responses are given to the well-trained network, which immediately outputs appropriate system parameters even for untrained patterns. This is an inverse analysis. To demonstrate its practical performance, the present system is applied to identify the locations and shapes of two adjacent dissimilar surface cracks hidden in a pipe with the electric potential drop method. The results clearly show that the present system is very efficient and accurate. (author). 7 refs., 10 figs
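The three-subprocess workflow can be sketched end to end, with the neural network replaced by linear interpolation for brevity: run the direct analysis at sampled parameters, store the (response, parameter) pairs, then invert a measured response. The forward model below is a hypothetical stand-in for a computational-mechanics solver.

```python
import bisect

def forward(depth):
    # Direct analysis: crack depth -> potential-drop response
    # (monotone, synthetic stand-in for the mechanics simulation).
    return 0.2 * depth + 0.05 * depth ** 2

depths = [0.5 * i for i in range(1, 11)]        # sampled system parameters
responses = [forward(d) for d in depths]        # one direct analysis each

def invert(measured):
    # Piecewise-linear inverse of the tabulated forward model -- the role
    # the trained network plays in the paper.
    j = min(max(bisect.bisect_left(responses, measured), 1), len(responses) - 1)
    r0, r1 = responses[j - 1], responses[j]
    d0, d1 = depths[j - 1], depths[j]
    return d0 + (d1 - d0) * (measured - r0) / (r1 - r0)

estimate = invert(forward(2.3))   # recover a parameter not in the table
```

The appeal of the trained network is the same as of this table: each inversion is immediate, with all the simulation cost paid up front when generating training patterns.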

  1. Research of Network System Reconfigurable Model Based on the Finite State Automation

    Directory of Open Access Journals (Sweden)

    Shenghan Zhou

    2014-05-01

    Full Text Available Network analysis models based on system state face issues of survivability, safety and fault tolerance, and must adapt dynamically to environmental changes; in this paper, a network system model based on finite state automata with a reconfigurable quality is proposed. The model first puts forward the concept of reconfigurable network systems and reveals their basic attributes of robustness, evolution and survivability. By establishing a hierarchical model of system state, the robust behavior, evolution behavior and survival behavior of the system are described. Secondly, taking network topology reconfigurability measurement as an example, quantitative reconfigurability metrics are put forward. Finally, an example verification is presented. Experiments show that the proposed quantitative reconfigurability indicators of [1.391, 1.140, 1.591] demonstrate that the network is an efficient reconfigurable network topology, which can effectively adapt to dynamic changes in the environment

  2. Evaluation of feature-based methods for automated network orientation

    OpenAIRE

    Apollonio, F I; Ballabeni, A.; M. Gaiani; F. Remondino

    2014-01-01

    Every day new tools and algorithms for automated image processing and 3D reconstruction purposes become available, giving the possibility to process large networks of unoriented and markerless images, delivering sparse 3D point clouds at reasonable processing time. In this paper we evaluate some feature-based methods used to automatically extract the tie points necessary for calibration and orientation procedures, in order to better understand their performances for 3D reconstruction...

  3. Using neural networks to assess flight deck human–automation interaction

    International Nuclear Information System (INIS)

    The increased complexity and interconnectivity of flight deck automation has made the prediction of human–automation interaction (HAI) difficult and has resulted in a number of accidents and incidents. There is a need to develop objective and robust methods by which the changes in HAI brought about by the introduction of new automation into the flight deck could be predicted and assessed prior to implementation and without use of extensive simulation. This paper presents a method to model a parametrization of flight deck automation known as HART and link it to HAI consequences using a backpropagation neural network approach. The transformation of the HART into a computational model suitable for modeling as a neural network is described. To test and train the network, data were collected from 40 airline pilots for six HAI consequences based on one scenario family consisting of a baseline and four variants. For a binary classification of HAI consequences, the neural network successfully classified 62–78.5% depending on the consequence. The results were verified using a decision tree analysis

  4. Automated Technology for Verificiation and Analysis

    DEFF Research Database (Denmark)

    This volume contains the papers presented at the 7th International Symposium on Automated Technology for Verification and Analysis held during October 13-16 in Macao SAR, China. The primary objective of the ATVA conferences remains the same: to exchange and promote the latest advances of state-of...

  5. Anomaly detection in an automated safeguards system using neural networks

    International Nuclear Information System (INIS)

    An automated safeguards system must be able to detect an anomalous event, identify the nature of the event, and recommend a corrective action. Neural networks represent a new way of thinking about basic computational mechanisms for intelligent information processing. In this paper, we discuss the issues involved in applying a neural network model to the first step of this process: anomaly detection in materials accounting systems. We extend our previous model to a 3-tank problem and compare different neural network architectures and algorithms. We evaluate the computational difficulties in training neural networks and explore how certain design principles affect the problems. The issues involved in building a neural network architecture include how the information flows, how the network is trained, how the neurons in a network are connected, how the neurons process information, and how the connections between neurons are modified. Our approach is based on the demonstrated ability of neural networks to model complex, nonlinear, real-time processes. By modeling the normal behavior of the processes, we can predict how a system should be behaving and, therefore, detect when an abnormality occurs
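The "model normal behaviour, flag deviations" scheme the paper builds on can be sketched with the neural network replaced by an exact one-tank material balance for brevity; the flow figures below are invented.

```python
# Normal-behaviour model: a simple material balance for one tank
# (a neural network would learn this relation from process data).
def predict(inventory, inflow, outflow):
    return inventory + inflow - outflow

# One record per time step: (previous inventory, inflow, outflow,
# measured inventory). The last step hides a 2-unit loss.
steps = [
    (100.0, 10.0, 8.0, 102.0),
    (102.0, 9.0, 9.5, 101.5),
    (101.5, 10.0, 10.0, 99.5),
]

# Alarm when the measurement departs from the prediction by more than
# the (hypothetical) measurement-uncertainty threshold.
alarms = [abs(meas - predict(inv, fin, fout)) > 0.5
          for inv, fin, fout, meas in steps]
```

Extending this to the paper's 3-tank problem means predicting every tank's inventory jointly, which is where a learned nonlinear model replaces the hand-written balance.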

  6. Ecological network analysis: network construction

    NARCIS (Netherlands)

    Fath, B.D.; Scharler, U.M.; Ulanowicz, R.E.; Hannon, B.

    2007-01-01

    Ecological network analysis (ENA) is a systems-oriented methodology for analyzing within-system interactions, used to identify holistic properties that are otherwise not evident from direct observations. Like any analysis technique, the accuracy of the results is only as good as the data available, but t

  7. Automated Functional Analysis in Dynamic Medical Imaging

    Czech Academy of Sciences Publication Activity Database

    Tichý, Ondřej

    Praha : Katedra matematiky, FSv ČVUT v Praze, 2012, s. 19-20. [Aplikovaná matematika – Rektorysova soutěž. Praha (CZ), 07.12.2012] Institutional support: RVO:67985556 Keywords : Factor Analysis * Dynamic Sequence * Scintigraphy Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2012/AS/tichy-automated functional analysis in dynamic medical imaging.pdf

  8. Statistical modelling of networked human-automation performance using working memory capacity.

    Science.gov (United States)

    Ahmed, Nisar; de Visser, Ewart; Shaw, Tyler; Mohamed-Ameen, Amira; Campbell, Mark; Parasuraman, Raja

    2014-01-01

    This study examines the challenging problem of modelling the interaction between individual attentional limitations and decision-making performance in networked human-automation system tasks. Analysis of real experimental data from a task involving networked supervision of multiple unmanned aerial vehicles by human participants shows that both task load and network message quality affect performance, but that these effects are modulated by individual differences in working memory (WM) capacity. These insights were used to assess three statistical approaches for modelling and making predictions with real experimental networked supervisory performance data: classical linear regression, non-parametric Gaussian processes and probabilistic Bayesian networks. It is shown that each of these approaches can help designers of networked human-automated systems cope with various uncertainties in order to accommodate future users by linking expected operating conditions and performance from real experimental data to observable cognitive traits like WM capacity. Practitioner Summary: Working memory (WM) capacity helps account for inter-individual variability in operator performance in networked unmanned aerial vehicle supervisory tasks. This is useful for reliable performance prediction near experimental conditions via linear models; robust statistical prediction beyond experimental conditions via Gaussian process models and probabilistic inference about unknown task conditions/WM capacities via Bayesian network models. PMID:24308716
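Of the three statistical approaches compared, the classical linear-regression one is simple to sketch: fit performance on task load, message quality, and a WM-by-load interaction term, so that WM capacity modulates the load effect. All data and coefficients below are synthetic, not the study's.

```python
import numpy as np

# Synthetic stand-in data for a networked UAV-supervision task.
rng = np.random.default_rng(2)
n = 500
load = rng.uniform(1, 5, n)            # task load (vehicles supervised)
quality = rng.uniform(0, 1, n)         # network message quality
wm = rng.normal(0, 1, n)               # working-memory capacity score
perf = 80 - 4 * load + 10 * quality + 2 * wm * load + rng.normal(0, 1, n)

# Design matrix with a WM-by-load interaction, fit by ordinary least squares.
X = np.column_stack([np.ones(n), load, quality, wm * load])
beta, *_ = np.linalg.lstsq(X, perf, rcond=None)
print(np.round(beta, 1))
```

The recovered coefficients closely match the generating ones, illustrating how an interaction term lets individual WM differences modulate the task-load effect.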

  9. Automated Modeling of Microwave Structures by Enhanced Neural Networks

    Directory of Open Access Journals (Sweden)

    Z. Raida

    2006-12-01

    The paper describes a methodology for the automated creation of neural models of microwave structures. During the creation process, artificial neural networks are trained using a combination of particle swarm optimization and the quasi-Newton method to avoid critical training problems of conventional neural nets. In the paper, neural networks are used to approximate the behavior of a planar microwave filter (moment method, Zeland IE3D). In order to evaluate the efficiency of neural modeling, global optimizations are performed using both numerical models and neural ones. The two approaches are compared in terms of CPU-time demands and accuracy. Based on these conclusions, methodological recommendations for including neural networks in microwave design are formulated.

  10. A computational framework for the automated construction of glycosylation reaction networks.

    Directory of Open Access Journals (Sweden)

    Gang Liu

    Glycosylation is among the most common and complex post-translational modifications identified to date. It proceeds through the catalytic action of multiple enzyme families that include the glycosyltransferases that add monosaccharides to growing glycans, and glycosidases which remove sugar residues to trim glycans. The expression level and specificity of these enzymes, in part, regulate the glycan distribution or glycome of specific cell/tissue systems. Currently, there is no systematic method to describe the enzymes and cellular reaction networks that catalyze glycosylation. To address this limitation, we present a streamlined machine-readable definition for the glycosylating enzymes and additional methodologies to construct and analyze glycosylation reaction networks. In this computational framework, the enzyme class is systematically designed to store detailed specificity data such as enzymatic functional group, linkage and substrate specificity. The new classes and their associated functions enable both single-reaction inference and automated full network reconstruction, when given a list of reactants and/or products along with the enzymes present in the system. In addition, graph theory is used to support functions that map the connectivity between two or more species in a network, and that generate subset models to identify rate-limiting steps regulating glycan biosynthesis. Finally, this framework allows the synthesis of biochemical reaction networks using mass spectrometry (MS) data. The features described above are illustrated using three case studies that examine: (i) O-linked glycan biosynthesis during the construction of functional selectin-ligands; (ii) automated N-linked glycosylation pathway construction; and (iii) the handling and analysis of glycomics-based MS data. Overall, the new computational framework enables automated glycosylation network model construction and analysis by integrating knowledge of glycan structure and enzyme
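A minimal sketch of the rule-based network construction idea: enzymes are encoded as acceptor-specificity rules and the network is grown by breadth-first expansion from a seed glycan. The enzyme names and specificities below are hypothetical, monosaccharides are abbreviated as strings, and real specificities would also cover linkage and branching.

```python
from collections import deque

# Hypothetical enzymes: each appends one monosaccharide when the glycan's
# non-reducing end matches its acceptor specificity.
enzymes = {
    "GalT": {"acceptor_end": "GlcNAc", "adds": "Gal"},
    "SiaT": {"acceptor_end": "Gal",    "adds": "Neu5Ac"},
    "FucT": {"acceptor_end": "GlcNAc", "adds": "Fuc"},
}

def build_network(seed, max_len=4):
    """Breadth-first expansion: apply every applicable enzyme to every glycan."""
    edges, queue, seen = [], deque([seed]), {seed}
    while queue:
        glycan = queue.popleft()
        if len(glycan) >= max_len:
            continue
        for name, rule in enzymes.items():
            if glycan[-1] == rule["acceptor_end"]:
                product = glycan + (rule["adds"],)
                edges.append((glycan, name, product))   # reaction edge
                if product not in seen:
                    seen.add(product)
                    queue.append(product)
    return edges

net = build_network(("GlcNAc",))
for edge in net:
    print(edge)
```

From the single seed, the three rules generate a small reaction network whose edges record reactant, enzyme, and product, the same triples a full framework would store.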

  11. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  12. Survey on Wireless Sensor Network Technologies for Industrial Automation: The Security and Quality of Service Perspectives

    Directory of Open Access Journals (Sweden)

    Delphine Christin

    2010-04-01

    Wireless Sensor Networks (WSNs) are gradually being adopted in the industrial world due to their advantages over wired networks. In addition to saving cabling costs, WSNs widen the realm of environments feasible for monitoring. They thus add sensing and acting capabilities to objects in the physical world and allow for communication among these objects or with services in the future Internet. However, the acceptance of WSNs by the industrial automation community is impeded by open issues, such as security guarantees and provision of Quality of Service (QoS). To examine both of these perspectives, we select and survey relevant WSN technologies dedicated to industrial automation. We determine QoS requirements and carry out a threat analysis, which act as the basis of our evaluation of the current state-of-the-art. According to the results of this evaluation, we identify and discuss open research issues.

  13. Automated refinement and inference of analytical models for metabolic networks

    Science.gov (United States)

    Schmidt, Michael D.; Vallabhajosyula, Ravishankar R.; Jenkins, Jerry W.; Hood, Jonathan E.; Soni, Abhishek S.; Wikswo, John P.; Lipson, Hod

    2011-10-01

    The reverse engineering of metabolic networks from experimental data is traditionally a labor-intensive task requiring a priori systems knowledge. Using a proven model as a test system, we demonstrate an automated method to simplify this process by modifying an existing or related model--suggesting nonlinear terms and structural modifications--or even constructing a new model that agrees with the system's time series observations. In certain cases, this method can identify the full dynamical model from scratch without prior knowledge or structural assumptions. The algorithm selects between multiple candidate models by designing experiments to make their predictions disagree. We performed computational experiments to analyze a nonlinear seven-dimensional model of yeast glycolytic oscillations. This approach corrected mistakes reliably in both approximated and overspecified models. The method performed well to high levels of noise for most states, could identify the correct model de novo, and make better predictions than ordinary parametric regression and neural network models. We identified an invariant quantity in the model, which accurately derived kinetics and the numerical sensitivity coefficients of the system. Finally, we compared the system to dynamic flux estimation and discussed the scaling and application of this methodology to automated experiment design and control in biological systems in real time.
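The model-discrimination step described above, designing the next experiment so that candidate models' predictions disagree, reduces to a small search over experiment settings. The two candidate rate laws below are illustrative stand-ins for models fitted to the same data.

```python
import numpy as np

# Two candidate kinetic laws that agree near the fitted data but diverge
# elsewhere; both are illustrative, not the paper's yeast-glycolysis model.
def michaelis_menten(s, vmax=1.0, km=0.5):
    return vmax * s / (km + s)

def linear_law(s, k=0.9):
    return k * s

# Candidate experiment settings: substrate concentrations we could test next.
substrate = np.linspace(0.0, 3.0, 301)
disagreement = np.abs(michaelis_menten(substrate) - linear_law(substrate))
best = substrate[np.argmax(disagreement)]     # most informative next experiment
print(round(best, 2))
```

Here the saturating and linear laws separate most at the high end of the tested range, so that is the concentration the algorithm would propose measuring next.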

  14. Automated refinement and inference of analytical models for metabolic networks

    International Nuclear Information System (INIS)

    The reverse engineering of metabolic networks from experimental data is traditionally a labor-intensive task requiring a priori systems knowledge. Using a proven model as a test system, we demonstrate an automated method to simplify this process by modifying an existing or related model--suggesting nonlinear terms and structural modifications--or even constructing a new model that agrees with the system's time series observations. In certain cases, this method can identify the full dynamical model from scratch without prior knowledge or structural assumptions. The algorithm selects between multiple candidate models by designing experiments to make their predictions disagree. We performed computational experiments to analyze a nonlinear seven-dimensional model of yeast glycolytic oscillations. This approach corrected mistakes reliably in both approximated and overspecified models. The method performed well to high levels of noise for most states, could identify the correct model de novo, and make better predictions than ordinary parametric regression and neural network models. We identified an invariant quantity in the model, which accurately derived kinetics and the numerical sensitivity coefficients of the system. Finally, we compared the system to dynamic flux estimation and discussed the scaling and application of this methodology to automated experiment design and control in biological systems in real time.

  15. An Automated System for Discovering Neighborhood Patterns in Ego Networks

    OpenAIRE

    Muhammad, Syed Agha; Van Laerhoven, Kristof

    2015-01-01

    Generally, social network analysis has focused on the topology of the network without considering the characteristics of the individuals involved. Less attention has been given to studying the behavior of individuals, even though they are the basic entities of a graph. Given a mobile social network graph, what are good features to extract key information from the nodes? How many distinct neighborhood patterns exist for ego nodes? What clues does such information provide to study nodes over a lo...

  16. Automating Trend Analysis for Spacecraft Constellations

    Science.gov (United States)

    Davis, George; Cooter, Miranda; Updike, Clark; Carey, Everett; Mackey, Jennifer; Rykowski, Timothy; Powers, Edward I. (Technical Monitor)

    2001-01-01

    Spacecraft trend analysis is a vital mission operations function performed by satellite controllers and engineers, who perform detailed analyses of engineering telemetry data to diagnose subsystem faults and to detect trends that may potentially lead to degraded subsystem performance or failure in the future. It is this latter function that is of greatest importance, for careful trending can often predict or detect events that may lead to a spacecraft's entry into safe-hold. Early prediction and detection of such events could result in the avoidance of, or rapid return to service from, spacecraft safing, which not only results in reduced recovery costs but also in a higher overall level of service for the satellite system. Contemporary spacecraft trending activities are manually intensive and are primarily performed diagnostically after a fault occurs, rather than proactively to predict its occurrence. They also tend to rely on information systems and software that are outdated when compared to current technologies. When coupled with the fact that flight operations teams often have limited resources, proactive trending opportunities are limited, and detailed trend analysis is often reserved for critical responses to safe holds or other on-orbit events such as maneuvers. While the contemporary trend analysis approach has sufficed for current single-spacecraft operations, it will be unfeasible for NASA's planned and proposed space science constellations. Missions such as the Dynamics, Reconnection and Configuration Observatory (DRACO), for example, are planning to launch as many as 100 'nanospacecraft' to form a homogenous constellation. A simple extrapolation of resources and manpower based on single-spacecraft operations suggests that trending for such a large spacecraft fleet will be unmanageable, unwieldy, and cost-prohibitive. It is therefore imperative that an approach to automating the spacecraft trend analysis function be studied, developed, and applied to

  17. Automated Radiochemical Separation, Analysis, and Sensing

    International Nuclear Information System (INIS)

    Chapter 14 for the 2nd edition of the Handbook of Radioactivity Analysis. The techniques and examples described in this chapter demonstrate that modern fluidic techniques and instrumentation can be used to develop automated radiochemical separation workstations. In many applications, these can be mechanically simple and key parameters can be controlled from software. If desired, many of the fluidic components and solution can be located remotely from the radioactive samples and other hot sample processing zones. There are many issues to address in developing automated radiochemical separation that perform reliably time after time in unattended operation. These are associated primarily with the separation and analytical chemistry aspects of the process. The relevant issues include the selectivity of the separation, decontamination factors, matrix effects, and recoveries from the separation column. In addition, flow rate effects, column lifetimes, carryover from one sample to another, and sample throughput must be considered. Nevertheless, successful approaches for addressing these issues have been developed. Radiochemical analysis is required not only for processing nuclear waste samples in the laboratory, but also for at-site or in situ applications. Monitors for nuclear waste processing operations represent an at-site application where continuous unattended monitoring is required to assure effective process radiochemical separations that produce waste streams that qualify for conversion to stable waste forms. Radionuclide sensors for water monitoring and long term stewardship represent an application where at-site or in situ measurements will be most effective. Automated radiochemical analyzers and sensors have been developed that demonstrate that radiochemical analysis beyond the analytical laboratory is both possible and practical

  18. NEW TECHNIQUES USED IN AUTOMATED TEXT ANALYSIS

    Directory of Open Access Journals (Sweden)

    M. Istrate

    2010-12-01

    Automated analysis of natural language texts is one of the most important knowledge discovery tasks for any organization. According to Gartner Group, almost 90% of the knowledge available in an organization today is dispersed throughout piles of documents buried within unstructured text. Analyzing huge volumes of textual information is often involved in making informed and correct business decisions. Traditional statistics-based analysis methods fail when processing unstructured texts, and society is in search of new technologies for text analysis. There exist a variety of approaches to the analysis of natural language texts, but most of them do not provide results that can be successfully applied in practice. This article concentrates on recent ideas and practical implementations in this area.

  19. APSAS; an Automated Particle Size Analysis System

    Science.gov (United States)

    Poppe, Lawrence J.; Eliason, A.H.; Fredericks, J.J.

    1985-01-01

    The Automated Particle Size Analysis System integrates a settling tube and an electroresistance multichannel particle-size analyzer (Coulter Counter) with a Pro-Comp/gg microcomputer and a Hewlett Packard 2100 MX (HP 2100 MX) minicomputer. This system and its associated software digitize the raw sediment grain-size data, combine the coarse- and fine-fraction data into complete grain-size distributions, perform method of moments and inclusive graphics statistics, verbally classify the sediment, generate histogram and cumulative frequency plots, and transfer the results into a data-retrieval system. This system saves time and labor and affords greater reliability, resolution, and reproducibility than conventional methods do.
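The method-of-moments statistics such a system reports can be sketched directly from a binned phi-scale distribution. The class midpoints and weights below are invented for illustration.

```python
import numpy as np

# Invented grain-size distribution: class midpoints in phi units and the
# weight-percent retained in each size class.
phi = np.array([0.0, 1.0, 2.0, 3.0, 4.0])       # class midpoints (phi)
freq = np.array([5.0, 20.0, 50.0, 20.0, 5.0])   # weight-% per class
w = freq / freq.sum()                           # normalized weights

mean = (w * phi).sum()                                    # 1st moment
sorting = np.sqrt((w * (phi - mean) ** 2).sum())          # 2nd moment (std dev)
skewness = (w * (phi - mean) ** 3).sum() / sorting ** 3   # standardized 3rd
print(round(mean, 3), round(sorting, 3), round(skewness, 3))
```

For this symmetric toy distribution the mean sits at the central class and the skewness vanishes; real settling-tube or Coulter data would feed the same formulas.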

  20. Automated Analysis, Classification, and Display of Waveforms

    Science.gov (United States)

    Kwan, Chiman; Xu, Roger; Mayhew, David; Zhang, Frank; Zide, Alan; Bonggren, Jeff

    2004-01-01

    A computer program partly automates the analysis, classification, and display of waveforms represented by digital samples. In the original application for which the program was developed, the raw waveform data to be analyzed by the program are acquired from space-shuttle auxiliary power units (APUs) at a sampling rate of 100 Hz. The program could also be modified for application to other waveforms -- for example, electrocardiograms. The program begins by performing principal-component analysis (PCA) of 50 normal-mode APU waveforms. Each waveform is segmented. A covariance matrix is formed by use of the segmented waveforms. Three eigenvectors corresponding to three principal components are calculated. To generate features, each waveform is then projected onto the eigenvectors. These features are displayed on a three-dimensional diagram, facilitating the visualization of the trend of APU operations.
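The PCA pipeline described, a covariance matrix from segmented waveforms, three principal eigenvectors, and projection onto them for 3-D features, can be sketched on synthetic waveforms built from three latent modes (the APU data themselves are not reproduced here).

```python
import numpy as np

# 50 synthetic "normal-mode" waveforms: random mixtures of 3 latent modes
# plus small noise, standing in for segmented APU telemetry.
rng = np.random.default_rng(3)
t = np.linspace(0, 1, 100)
basis = np.array([np.sin(2 * np.pi * f * t) for f in (3, 5, 7)])
coef = rng.normal(size=(50, 3))
waves = coef @ basis + rng.normal(scale=0.05, size=(50, 100))

centered = waves - waves.mean(axis=0)
cov = np.cov(centered, rowvar=False)          # 100 x 100 covariance matrix
eigval, eigvec = np.linalg.eigh(cov)          # eigenvalues in ascending order
features = centered @ eigvec[:, -3:]          # project onto top-3 components
explained = eigval[-3:].sum() / eigval.sum()  # variance captured by 3 PCs
print(features.shape, round(explained, 3))
```

Each waveform becomes a single 3-D feature point, which is exactly what the program plots to visualize operational trends.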

  1. A Holistic Approach to ZigBee Performance Enhancement for Home Automation Networks

    OpenAIRE

    August Betzler; Carles Gomez; Ilker Demirkol; Josep Paradells

    2014-01-01

    Wireless home automation networks are gaining importance for smart homes. In this context, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected by the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible perfo...

  2. Hybrid digital signal processing and neural networks for automated diagnostics using NDE methods

    International Nuclear Information System (INIS)

    The primary purpose of the current research was to develop an integrated approach by combining information compression methods and artificial neural networks for the monitoring of plant components using nondestructive examination data. Specifically, data from eddy current inspection of heat exchanger tubing were utilized to evaluate this technology. The focus of the research was to develop and test various data compression methods (for eddy current data) and the performance of different neural network paradigms for defect classification and defect parameter estimation. Feedforward, fully-connected neural networks, that use the back-propagation algorithm for network training, were implemented for defect classification and defect parameter estimation using a modular network architecture. A large eddy current tube inspection database was acquired from the Metals and Ceramics Division of ORNL. These data were used to study the performance of artificial neural networks for defect type classification and for estimating defect parameters. A PC-based data preprocessing and display program was also developed as part of an expert system for data management and decision making. The results of the analysis showed that for effective (low-error) defect classification and estimation of parameters, it is necessary to identify proper feature vectors using different data representation methods. The integration of data compression and artificial neural networks for information processing was established as an effective technique for automation of diagnostics using nondestructive examination methods

  3. ASteCA - Automated Stellar Cluster Analysis

    CERN Document Server

    Perren, Gabriel I; Piatti, Andrés E

    2014-01-01

    We present ASteCA (Automated Stellar Cluster Analysis), a suite of tools designed to fully automate the standard tests applied to stellar clusters to determine their basic parameters. The set of functions included in the code makes use of positional and photometric data to obtain precise and objective values for a given cluster's center coordinates, radius, luminosity function and integrated color magnitude, as well as characterizing, through a statistical estimator, its probability of being a true physical cluster rather than a random overdensity of field stars. ASteCA incorporates a Bayesian field star decontamination algorithm capable of assigning membership probabilities using photometric data alone. An isochrone fitting process based on the generation of synthetic clusters from theoretical isochrones and selection of the best fit through a genetic algorithm is also present, which allows ASteCA to provide accurate estimates of a cluster's metallicity, age, extinction and distance values along with its unce...

  4. Automated Stellar Classification for Large Surveys with EKF and RBF Neural Networks

    Institute of Scientific and Technical Information of China (English)

    Ling Bai; Ping Guo; Zhan-Yi Hu

    2005-01-01

    An automated classification technique for large size stellar surveys is proposed. It uses the extended Kalman filter as a feature selector and pre-classifier of the data, and radial basis function neural networks for the classification. Experiments with real data have shown that the correct classification rate can reach as high as 93%, which is quite satisfactory. When different system models are selected for the extended Kalman filter, the classification results are relatively stable. It is shown that for this particular case the result using the extended Kalman filter is better than using principal component analysis.

  5. PERFORMANCE ANALYSIS OF FIELD LAYER COMMUNICATING NETWORK BASED ON POLLING PROTOCOL IN SUBSTATION AUTOMATION SYSTEM

    Institute of Scientific and Technical Information of China (English)

    王峥; 胡敏强; 郑建勇

    2001-01-01

    Considering the particularities of small and medium integrated substation automation systems, this article first analyzes the data generated by field measuring and controlling units (FMCUs) using elementary probability theory. Detailed discussions of each segment of the program running in an FMCU are then given, and the constraint that the FMCU's limited CPU resources place on the polling cycle is studied. After each part of the polling cycle is determined, expressions for the steady-state performance of the field-layer communication network, such as channel efficiency and average delay, are derived as functions of the polling cycle and the number of FMCUs, from which the optimum number of FMCUs in such a network is obtained.
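The steady-state quantities named above follow classic polling-model relations. The sketch below uses the textbook mean-cycle formula rather than the paper's exact expressions, and all parameter values are invented.

```python
# Classic polling-model sketch (not the paper's exact derivation):
# N stations (FMCUs), per-station polling overhead r (s), per-station frame
# arrival rate lam (frames/s), frame transmission time s_tx (s).
def cycle_time(n_stations, r, lam, s_tx):
    rho = n_stations * lam * s_tx          # total data load offered to channel
    assert rho < 1.0, "channel overloaded"
    return n_stations * r / (1.0 - rho)    # textbook mean cycle: N*r / (1 - rho)

def channel_efficiency(n_stations, lam, s_tx):
    return n_stations * lam * s_tx         # fraction of time carrying data

N, r, lam, s_tx = 16, 0.002, 2.0, 0.004    # invented example values
print(round(cycle_time(N, r, lam, s_tx), 4),
      round(channel_efficiency(N, lam, s_tx), 3))
```

With these numbers the channel carries data about 12.8% of the time and the mean polling cycle stays under 40 ms; sweeping N in such formulas is how an optimum station count can be read off.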

  6. Communication Network Analysis Methods.

    Science.gov (United States)

    Farace, Richard V.; Mabee, Timothy

    This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…

  7. An approach to automated chromosome analysis

    International Nuclear Information System (INIS)

    This study, divided into three parts, describes methods developed for the automatic processing of the different stages of chromosome analysis. Part 1 covers automated selection of metaphase spreads, which applies a decision process to reject all non-pertinent images and keep the good ones. This approach was achieved by writing a simulation program that made it possible to establish the proper selection algorithms and to design a kit of electronic logic units. Part 2 deals with the automatic processing of the morphological study of the chromosome complements in a metaphase: the metaphase photographs are processed by an optical-to-digital converter, which extracts the image information and writes it out as a digital data set on magnetic tape. For one metaphase image this data set includes some 200 000 grey values, encoded on a 16-, 32- or 64-grey-level scale, and is processed by a pattern recognition program that isolates the chromosomes and investigates their characteristic features (arm tips, centromere areas) in order to obtain measurements equivalent to the lengths of the four arms. Part 3 describes a program for automated karyotyping by optimized pairing of human chromosomes. The data are derived from direct digitizing of the arm lengths by means of a BENSON digital reader. The program supplies: 1/ a list of the pairs, 2/ a graphic representation of the pairs so constituted according to their respective lengths and centromeric indexes, and 3/ another BENSON graphic drawing according to the author's own representation of the chromosomes, i.e. crosses with orthogonal arms, each branch being the accurate measurement of the corresponding chromosome arm. This conventionalized karyotype indicates on the last line the genuinely abnormal or non-standard images left unpaired by the program, which are of special interest to the biologist. (author)
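The optimized-pairing step of Part 3 can be sketched with a simple greedy rule: sort chromosomes by measured length and pair neighbours. The single-length criterion and the values below are illustrative; the study pairs on four arm lengths and centromeric indexes.

```python
# Invented total-length measurements per chromosome (arbitrary units).
lengths = {"c1": 98, "c2": 51, "c3": 100, "c4": 49, "c5": 73, "c6": 75}

ordered = sorted(lengths, key=lengths.get, reverse=True)   # longest first
pairs = [(ordered[i], ordered[i + 1]) for i in range(0, len(ordered) - 1, 2)]
unpaired = [ordered[-1]] if len(ordered) % 2 else []       # reported separately
print(pairs, unpaired)
```

Chromosomes of similar length end up paired, and any leftover (as with genuinely abnormal images in the study) is listed separately for the biologist's attention.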

  8. Ecological Automation Design, Extending Work Domain Analysis

    NARCIS (Netherlands)

    Amelink, M.H.J.

    2010-01-01

    In high–risk domains like aviation, medicine and nuclear power plant control, automation has enabled new capabilities, increased the economy of operation and has greatly contributed to safety. However, automation increases the number of couplings in a system, which can inadvertently lead to more com

  9. Automation literature: A brief review and analysis

    Science.gov (United States)

    Smith, D.; Dieterly, D. L.

    1980-01-01

    Current thought and research positions that may allow for an improved capability to understand the impact of introducing automation to an existing system are established. The orientation was toward the type of studies that may provide some general insight into automation; specifically, the impact of automation on human performance and the resulting system performance. While an extensive number of articles were reviewed, only those that addressed the issue of automation and human performance were selected for discussion. The literature is organized along two dimensions: time (pre-1970 vs. post-1970) and type of approach (engineering or behavioral science). The conclusions reached are not definitive, but they do provide the initial stepping stones in an attempt to begin to bridge the concept of automation in a systematic progression.

  10. EddyOne automated analysis of PWR/WWER steam generator tubes eddy current data

    International Nuclear Information System (INIS)

    The INETEC Institute for Nuclear Technology has developed a software package called EddyOne, which offers automated analysis of bobbin coil eddy current data. During its development and on-site use, many valuable lessons were learned, which are described in this article. Accordingly, the following topics are covered: general requirements for automated analysis of bobbin coil eddy current data; main approaches to automated analysis; multi-rule algorithms for data screening; landmark detection algorithms as a prerequisite for automated analysis (threshold algorithms and algorithms based on neural network principles); field experience with the EddyOne software; development directions (use of artificial intelligence with self-learning abilities for indication detection and sizing); automated analysis software qualification; and conclusions. Special emphasis is given to results obtained on different types of steam generators, condensers, and heat exchangers. Such results are then compared with results obtained by other automated-analysis software vendors, showing a clear advantage for the INETEC approach. It should be pointed out that INETEC's field experience was also collected on WWER steam generators, which is so far a unique experience. (author)

  11. Automated analysis and annotation of basketball video

    Science.gov (United States)

    Saur, Drew D.; Tan, Yap-Peng; Kulkarni, Sanjeev R.; Ramadge, Peter J.

    1997-01-01

    Automated analysis and annotation of video sequences are important for digital video libraries, content-based video browsing and data mining projects. A successful video annotation system should provide users with a useful video content summary in a reasonable processing time. Given the wide variety of video genres available today, automatically extracting meaningful video content for annotation still remains hard using currently available techniques. However, a wide range of video has inherent structure, such that some prior knowledge about the video content can be exploited to improve our understanding of the high-level video semantic content. In this paper, we develop tools and techniques for analyzing structured video by using the low-level information available directly from MPEG compressed video. Being able to work directly in the video compressed domain can greatly reduce the processing time and enhance storage efficiency. As a testbed, we have developed a basketball annotation system which combines the low-level information extracted from the MPEG stream with prior knowledge of basketball video structure to provide high-level content analysis, annotation and browsing for events such as wide-angle and close-up views, fast breaks, steals, potential shots, number of possessions and possession times. We expect our approach can also be extended to structured video in other domains.

  12. Automated segmentation and classification of multispectral magnetic resonance images of brain using artificial neural networks.

    Science.gov (United States)

    Reddick, W E; Glass, J O; Cook, E N; Elkin, T D; Deaton, R J

    1997-12-01

    We present a fully automated process for segmentation and classification of multispectral magnetic resonance (MR) images. This hybrid neural network method uses a Kohonen self-organizing neural network for segmentation and a multilayer backpropagation neural network for classification. To separate different tissue types, this process uses the standard T1-, T2-, and PD-weighted MR images acquired in clinical examinations. Volumetric measurements of brain structures, relative to intracranial volume, were calculated for an index transverse section in 14 normal subjects (median age 25 years; seven male, seven female). This index slice was at the level of the basal ganglia, included both genu and splenium of the corpus callosum, and generally, showed the putamen and lateral ventricle. An intraclass correlation of this automated segmentation and classification of tissues with the accepted standard of radiologist identification for the index slice in the 14 volunteers demonstrated coefficients (ri) of 0.91, 0.95, and 0.98 for white matter, gray matter, and ventricular cerebrospinal fluid (CSF), respectively. An analysis of variance for estimates of brain parenchyma volumes in five volunteers imaged five times each demonstrated high intrasubject reproducibility with a significance of at least p < 0.05 for white matter, gray matter, and white/gray partial volumes. The population variation, across 14 volunteers, demonstrated little deviation from the averages for gray and white matter, while partial volume classes exhibited a slightly higher degree of variability. This fully automated technique produces reliable and reproducible MR image segmentation and classification while eliminating intra- and interobserver variability. PMID:9533591
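A hedged sketch of the segmentation stage: cluster per-voxel multispectral intensity vectors into tissue classes. A plain k-means pass stands in here for the Kohonen self-organizing map, and the intensity distributions (loosely labeled as tissue-like) are invented.

```python
import numpy as np

# Two synthetic "tissue" classes in a 3-channel intensity space, standing in
# for per-voxel T1/T2/PD values. Means and spreads are invented.
rng = np.random.default_rng(4)
tissue_a = rng.normal([1.0, 0.2, 0.5], 0.05, size=(300, 3))
tissue_b = rng.normal([0.3, 0.9, 0.6], 0.05, size=(300, 3))
voxels = np.vstack([tissue_a, tissue_b])

centroids = voxels[[0, -1]].copy()      # crude initialization, one per class
for _ in range(10):
    d = np.linalg.norm(voxels[:, None] - centroids[None], axis=2)
    labels = d.argmin(axis=1)           # assign each voxel to nearest centroid
    centroids = np.array([voxels[labels == k].mean(axis=0) for k in range(2)])

print(np.round(centroids, 1))
```

The recovered centroids match the generating class means; a full pipeline would then feed such cluster outputs to a supervised classifier, as the hybrid method does.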

  13. Domotics – A Cost Effective Smart Home Automation System Using Wifi as Network Infrastructure

    OpenAIRE

    Abhinav Talgeri; Abheesh Kumar B A

    2014-01-01

    This paper describes an investigation into the potential for remote-controlled operation of home automation (also called Domotics) systems. It considers problems with their implementation, discusses possible solutions through various network technologies and indicates how to optimize the use of such systems. The paper emphasizes the design and prototype implementation of a new home automation system that uses WiFi technology as a network infrastructure connecting its part...

  14. Automated Detection of Soma Location and Morphology in Neuronal Network Cultures

    OpenAIRE

    Burcin Ozcan; Pooran Negi; Fernanda Laezza; Manos Papadakis; Demetrio Labate

    2015-01-01

    Automated identification of the primary components of a neuron and extraction of its sub-cellular features are essential steps in many quantitative studies of neuronal networks. The focus of this paper is the development of an algorithm for the automated detection of the location and morphology of somas in confocal images of neuronal network cultures. This problem is motivated by applications in high-content screenings (HCS), where the extraction of multiple morphological features of neurons ...

  15. Automated quantitative image analysis of nanoparticle assembly

    Science.gov (United States)

    Murthy, Chaitanya R.; Gao, Bo; Tao, Andrea R.; Arya, Gaurav

    2015-05-01

    The ability to characterize higher-order structures formed by nanoparticle (NP) assembly is critical for predicting and engineering the properties of advanced nanocomposite materials. Here we develop a quantitative image analysis software to characterize key structural properties of NP clusters from experimental images of nanocomposites. This analysis can be carried out on images captured at intermittent times during assembly to monitor the time evolution of NP clusters in a highly automated manner. The software outputs averages and distributions in the size, radius of gyration, fractal dimension, backbone length, end-to-end distance, anisotropic ratio, and aspect ratio of NP clusters as a function of time along with bootstrapped error bounds for all calculated properties. The polydispersity in the NP building blocks and biases in the sampling of NP clusters are accounted for through the use of probabilistic weights. This software, named Particle Image Characterization Tool (PICT), has been made publicly available and could be an invaluable resource for researchers studying NP assembly. To demonstrate its practical utility, we used PICT to analyze scanning electron microscopy images taken during the assembly of surface-functionalized metal NPs of differing shapes and sizes within a polymer matrix. PICT is used to characterize and analyze the morphology of NP clusters, providing quantitative information that can be used to elucidate the physical mechanisms governing NP assembly.
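    Two of the cluster descriptors reported here, radius of gyration and aspect ratio, reduce to short computations once a cluster's particle coordinates are extracted. The sketch below is illustrative and not part of PICT; the four-particle "cluster" is a toy example.

```python
import numpy as np

def radius_of_gyration(coords):
    """Root-mean-square distance of cluster members from their center of mass."""
    coords = np.asarray(coords, dtype=float)
    com = coords.mean(axis=0)
    return float(np.sqrt(((coords - com) ** 2).sum(axis=1).mean()))

def aspect_ratio(coords):
    """Ratio of principal-axis lengths from the coordinate covariance matrix."""
    coords = np.asarray(coords, dtype=float)
    ev = np.linalg.eigvalsh(np.cov(coords.T))   # eigenvalues in ascending order
    return float(np.sqrt(ev[-1] / ev[0]))

# Toy cluster: four particles on the corners of a unit square.
square = [(0, 0), (0, 1), (1, 0), (1, 1)]
rg = radius_of_gyration(square)   # sqrt(0.5), about 0.707
ar = aspect_ratio(square)         # 1.0 for a fully symmetric cluster
```

    Computed over many sampled clusters per frame, such per-cluster values yield the averages and distributions the software reports as a function of time.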

  16. Request-Driven Schedule Automation for the Deep Space Network

    Science.gov (United States)

    Johnston, Mark D.; Tran, Daniel; Arroyo, Belinda; Call, Jared; Mercado, Marisol

    2010-01-01

    The DSN Scheduling Engine (DSE) has been developed to increase the level of automated scheduling support available to users of NASA's Deep Space Network (DSN). We have adopted a request-driven approach to DSN scheduling, in contrast to the activity-oriented approach used up to now. Scheduling requests allow users to declaratively specify patterns and conditions on their DSN service allocations, including timing, resource requirements, gaps, overlaps, time linkages among services, repetition, priorities, and a wide range of additional factors and preferences. The DSE incorporates a model of the key constraints and preferences of the DSN scheduling domain, along with algorithms to expand scheduling requests into valid resource allocations, to resolve schedule conflicts, and to repair unsatisfied requests. We use time-bounded systematic search with constraint relaxation to return nearby solutions if exact ones cannot be found, where the relaxation options and order are under user control. To explore the usability aspects of our approach we have developed a graphical user interface incorporating some crucial features to make it easier to work with complex scheduling requests. Among these are: progressive revelation of relevant detail, immediate propagation and visual feedback from a user's decisions, and a meeting calendar metaphor for repeated patterns of requests. Even as a prototype, the DSE has been deployed and adopted as the initial step in building the operational DSN schedule, thus representing an important initial validation of our overall approach. The DSE is a core element of the DSN Service Scheduling Software (S³), a web-based collaborative scheduling system now under development for deployment to all DSN users.

  17. An Automated Capacitance-Based Fuel Level Monitoring System for Networked Tanks

    Directory of Open Access Journals (Sweden)

    Oke Alice O

    2015-08-01

    Building an effective fuel-measuring system has been a great challenge in the Nigerian industry, as oil organizations face problems ranging from fire outbreaks and oil pilfering to oil spillage and other negative effects. The use of a meter rule or long rod at most petrol filling stations to assess the quantity of fuel in a tank is inefficient, stressful, dangerous and almost impossible in a networked environment. This archaic method provides neither a reliable reorder date nor a good inventory. There is therefore a need to automate the system by providing real-time measurement of fuel storage to meet customer demand. In this paper, a system was designed to sense the level of fuel in networked tanks using a capacitive sensor controlled by an ATMEGA 328 Arduino microcontroller. The readings were transmitted in both digital and analogue form over radio frequency using XBee and interfaced to a computer system for notification of fuel level and refill operations. This enables consumption control, cost analysis and tax accounting for fuel purchases.
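    A capacitive level probe's capacitance grows roughly linearly with the immersed fraction of the probe, so the level can be recovered by inverting a two-point calibration. The sketch below is a simplified model of that conversion, not the paper's firmware; the calibration values and tank size are invented for illustration.

```python
def fuel_level_fraction(c_measured, c_empty, c_full):
    """Invert the linear probe model C = C_empty + (C_full - C_empty) * h / H."""
    frac = (c_measured - c_empty) / (c_full - c_empty)
    return min(max(frac, 0.0), 1.0)   # clamp readings outside the calibration range

# Hypothetical calibration: 120 pF with a dry probe, 390 pF fully submerged.
level = fuel_level_fraction(255.0, c_empty=120.0, c_full=390.0)   # 0.5
litres = level * 33_000.0   # assumed 33,000 L tank capacity
```

    In a networked deployment each tank node would perform this conversion locally and transmit only the level and volume figures over the radio link.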

  18. Automated Measurement and Signaling Systems for the Transactional Network

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Brown, Richard; Price, Phillip; Page, Janie; Granderson, Jessica; Riess, David; Czarnecki, Stephen; Ghatikar, Girish; Lanzisera, Steven

    2013-12-31

    The Transactional Network Project is a multi-lab activity funded by the US Department of Energy's Building Technologies Office. The project team included staff from Lawrence Berkeley National Laboratory, Pacific Northwest National Laboratory and Oak Ridge National Laboratory. The team designed, prototyped and tested a transactional network (TN) platform to support energy, operational and financial transactions between any networked entities (equipment, organizations, buildings, grid, etc.). PNNL was responsible for the development of the TN platform, with agents for this platform developed by each of the three labs. LBNL contributed applications to measure the whole-building electric load response to various changes in building operations, particularly energy efficiency improvements and demand response events. We also provide a demand response signaling agent and an agent for cost savings analysis. LBNL and PNNL demonstrated actual transactions between packaged rooftop units and the electric grid using the platform and selected agents. This document describes the agents and applications developed by the LBNL team, and associated tests of the applications.

  19. Area γ radiation monitoring network systems based on totally integrated automation

    International Nuclear Information System (INIS)

    This paper introduces an area γ radiation monitoring network system based on Totally Integrated Automation. It features simple and safe process control; easy integration of the information network, field bus and field instrumentation; and a modular design with powerful system expansion. It implements integrated management and control, and is of positive significance for the localization of radiation monitoring systems. (authors)

  20. 32 CFR 2001.50 - Telecommunications automated information systems and network security.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Telecommunications automated information systems and network security. 2001.50 Section 2001.50 National Defense Other Regulations Relating to National... network security. Each agency head shall ensure that classified information electronically...

  1. Distributed microprocessor automation network for synthesizing radiotracers used in positron emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Russell, J.A.G.; Alexoff, D.L.; Wolf, A.P.

    1984-09-01

    This presentation describes an evolving distributed microprocessor network for automating the routine production synthesis of radiotracers used in Positron Emission Tomography. We first present a brief overview of the PET method for measuring biological function, and then outline the general procedure for producing a radiotracer. The paper identifies several reasons for our automating the syntheses of these compounds. There is a description of the distributed microprocessor network architecture chosen and the rationale for that choice. Finally, we speculate about how this network may be exploited to extend the power of the PET method from the large university or National Laboratory to the biomedical research and clinical community at large. 20 refs. (DT)

  2. Distributed Microprocessor Automation Network for Synthesizing Radiotracers Used in Positron Emission Tomography [PET

    Science.gov (United States)

    Russell, J. A. G.; Alexoff, D. L.; Wolf, A. P.

    1984-09-01

    This presentation describes an evolving distributed microprocessor network for automating the routine production synthesis of radiotracers used in Positron Emission Tomography. We first present a brief overview of the PET method for measuring biological function, and then outline the general procedure for producing a radiotracer. The paper identifies several reasons for our automating the syntheses of these compounds. There is a description of the distributed microprocessor network architecture chosen and the rationale for that choice. Finally, we speculate about how this network may be exploited to extend the power of the PET method from the large university or National Laboratory to the biomedical research and clinical community at large. (DT)

  3. Distributed microprocessor automation network for synthesizing radiotracers used in positron emission tomography

    International Nuclear Information System (INIS)

    This presentation describes an evolving distributed microprocessor network for automating the routine production synthesis of radiotracers used in Positron Emission Tomography. We first present a brief overview of the PET method for measuring biological function, and then outline the general procedure for producing a radiotracer. The paper identifies several reasons for our automating the syntheses of these compounds. There is a description of the distributed microprocessor network architecture chosen and the rationale for that choice. Finally, we speculate about how this network may be exploited to extend the power of the PET method from the large university or National Laboratory to the biomedical research and clinical community at large. 20 refs. (DT)

  4. Automated quantification and analysis of mandibular asymmetry

    DEFF Research Database (Denmark)

    Darvann, T. A.; Hermann, N. V.; Larsen, P.; Ólafsdóttir, Hildur; Hansen, I. V.; Hove, H. D.; Christensen, L.; Rueckert, D.; Kreiborg, S.

    We present an automated method of spatially detailed 3D asymmetry quantification in mandibles extracted from CT and apply it to a population of infants with unilateral coronal synostosis (UCS). An atlas-based method employing non-rigid registration of surfaces is used for determining deformation...

  5. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized ''intelligent'' software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment

  6. Automated Steel Cleanliness Analysis Tool (ASCAT)

    International Nuclear Information System (INIS)

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized ''intelligent'' software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. 
Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet

  7. Neural-network-based state of health diagnostics for an automated radioxenon sampler/analyzer

    Science.gov (United States)

    Keller, Paul E.; Kangas, Lars J.; Hayes, James C.; Schrom, Brian T.; Suarez, Reynold; Hubbard, Charles W.; Heimbigner, Tom R.; McIntyre, Justin I.

    2009-05-01

    Artificial neural networks (ANNs) are used to determine the state-of-health (SOH) of the Automated Radioxenon Analyzer/Sampler (ARSA). ARSA is a gas collection and analysis system used for non-proliferation monitoring in detecting radioxenon released during nuclear tests. SOH diagnostics are important for automated, unmanned sensing systems so that remote detection and identification of problems can be made without onsite staff. Both recurrent and feed-forward ANNs are presented. The recurrent ANN is trained to predict sensor values based on current valve states, which control air flow, so that with only valve states the normal SOH sensor values can be predicted. Deviation between modeled value and actual is an indication of a potential problem. The feed-forward ANN acts as a nonlinear version of principal components analysis (PCA) and is trained to replicate the normal SOH sensor values. Because of ARSA's complexity, this nonlinear PCA is better able to capture the relationships among the sensors than standard linear PCA and is applicable to both sensor validation and recognizing off-normal operating conditions. Both models provide valuable information to detect impending malfunctions before they occur to avoid unscheduled shutdown. Finally, the ability of ANN methods to predict the system state is presented.
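    The feed-forward network described here acts as a nonlinear PCA trained to reproduce normal sensor values, flagging states whose reconstruction error is large. As a minimal sketch of the same residual-based idea, the example below uses linear PCA on synthetic "sensor" data; the data, component count and threshold rule are assumptions, and the actual ARSA system uses an ANN rather than an SVD.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "normal operation" data: three sensors driven by one latent factor.
latent = rng.standard_normal(500)
normal = np.column_stack([latent, 2 * latent, -latent]) + 0.05 * rng.standard_normal((500, 3))

mean = normal.mean(axis=0)
_, _, vt = np.linalg.svd(normal - mean, full_matrices=False)
pcs = vt[:1]                              # subspace spanned by normal behavior

def residual(sample):
    """Reconstruction error after projecting onto the normal-operation subspace."""
    x = np.asarray(sample, dtype=float) - mean
    return float(np.linalg.norm(x - (x @ pcs.T) @ pcs))

# Flag states whose reconstruction error is far above the normal level.
threshold = 3.0 * np.mean([residual(s) for s in normal])

consistent = residual([1.0, 2.0, -1.0])   # follows the learned sensor correlations
deviant = residual([1.0, 2.0, 1.0])       # third sensor disagrees with the others
```

    The nonlinear ANN version serves the same purpose but can capture curved relationships among sensors that a linear subspace misses.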

  8. Neural Network Based State of Health Diagnostics for an Automated Radioxenon Sampler/Analyzer

    International Nuclear Information System (INIS)

    Artificial neural networks (ANNs) are used to determine the state-of-health (SOH) of the Automated Radioxenon Analyzer/Sampler (ARSA). ARSA is a gas collection and analysis system used for non-proliferation monitoring in detecting radioxenon released during nuclear tests. SOH diagnostics are important for automated, unmanned sensing systems so that remote detection and identification of problems can be made without onsite staff. Both recurrent and feed-forward ANNs are presented. The recurrent ANN is trained to predict sensor values based on current valve states, which control air flow, so that with only valve states the normal SOH sensor values can be predicted. Deviation between modeled value and actual is an indication of a potential problem. The feed-forward ANN acts as a nonlinear version of principal components analysis (PCA) and is trained to replicate the normal SOH sensor values. Because of ARSA's complexity, this nonlinear PCA is better able to capture the relationships among the sensors than standard linear PCA and is applicable to both sensor validation and recognizing off-normal operating conditions. Both models provide valuable information to detect impending malfunctions before they occur to avoid unscheduled shutdown. Finally, the ability of ANN methods to predict the system state is presented

  9. Neural Network Based State of Health Diagnostics for an Automated Radioxenon Sampler/Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Keller, Paul E.; Kangas, Lars J.; Hayes, James C.; Schrom, Brian T.; Suarez, Reynold; Hubbard, Charles W.; Heimbigner, Tom R.; McIntyre, Justin I.

    2009-05-13

    Artificial neural networks (ANNs) are used to determine the state-of-health (SOH) of the Automated Radioxenon Analyzer/Sampler (ARSA). ARSA is a gas collection and analysis system used for non-proliferation monitoring in detecting radioxenon released during nuclear tests. SOH diagnostics are important for automated, unmanned sensing systems so that remote detection and identification of problems can be made without onsite staff. Both recurrent and feed-forward ANNs are presented. The recurrent ANN is trained to predict sensor values based on current valve states, which control air flow, so that with only valve states the normal SOH sensor values can be predicted. Deviation between modeled value and actual is an indication of a potential problem. The feed-forward ANN acts as a nonlinear version of principal components analysis (PCA) and is trained to replicate the normal SOH sensor values. Because of ARSA’s complexity, this nonlinear PCA is better able to capture the relationships among the sensors than standard linear PCA and is applicable to both sensor validation and recognizing off-normal operating conditions. Both models provide valuable information to detect impending malfunctions before they occur to avoid unscheduled shutdown. Finally, the ability of ANN methods to predict the system state is presented.

  10. Network performance analysis

    CERN Document Server

    Bonald, Thomas

    2013-01-01

    The book presents some key mathematical tools for the performance analysis of communication networks and computer systems. Communication networks and computer systems have become extremely complex. The statistical resource sharing induced by the random behavior of users and the underlying protocols and algorithms may affect Quality of Service. This book introduces the main results of queuing theory that are useful for analyzing the performance of these systems. These mathematical tools are key to the development of robust dimensioning rules and engineering methods. A number of examples i
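    Queuing theory of the kind the book introduces yields closed-form performance metrics for simple systems. For instance, for an M/M/1 queue with Poisson arrivals at rate λ and exponential service at rate μ, the utilization is ρ = λ/μ, the mean number in the system is L = ρ/(1 − ρ), and Little's law gives the mean sojourn time W = L/λ:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state M/M/1 results: utilization rho = lambda/mu, mean number in
    system L = rho/(1 - rho), and mean sojourn time W = L/lambda (Little's law)."""
    rho = arrival_rate / service_rate
    if rho >= 1.0:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    L = rho / (1.0 - rho)
    W = L / arrival_rate
    return rho, L, W

# 8 requests/s arriving at a server that completes 10 requests/s:
rho, L, W = mm1_metrics(8.0, 10.0)   # rho = 0.8, L = 4 customers, W = 0.5 s
```

    Note how sharply L grows as ρ approaches 1: this nonlinearity is what makes dimensioning rules derived from such models valuable.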

  11. Automated Cache Performance Analysis And Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-23

    While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand" requiring significant time and expertise. To the best of our knowledge no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters

  12. Automated migration analysis based on cell texture: method & reliability

    Directory of Open Access Journals (Sweden)

    Chittenden Thomas W

    2005-03-01

    Abstract Background In this paper, we present and validate a way to automatically measure the extent of cell migration based on automated examination of a series of digital photographs. It was designed specifically to identify the impact of Second Hand Smoke (SHS) on endothelial cell migration but has broader applications. The analysis has two stages: (1) preprocessing of image texture, and (2) migration analysis. Results The output is a graphic overlay that indicates the front lines of cell migration superimposed on each original image, with automated reporting of the distance traversed vs. time. Expert comparison with manual placement of the leading edge shows complete equivalence of automated vs. manual leading-edge definition for cell migration measurement. Conclusion Our method is indistinguishable from careful manual determination of cell front lines, with the advantages of full automation, objectivity, and speed.
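    The front-line overlay described above amounts to locating, per image row, the furthest position at which cell texture is still present. A toy sketch of that idea follows; it is not the authors' texture preprocessing, which is more involved, and the threshold and image are invented.

```python
import numpy as np

def migration_front(frame, threshold):
    """Per image row, the right-most column whose texture score exceeds the
    threshold; -1 marks rows with no detected cells."""
    front = np.full(frame.shape[0], -1)
    for r, row in enumerate(frame):
        cols = np.flatnonzero(row > threshold)
        if cols.size:
            front[r] = cols[-1]
    return front

# Toy texture map: cells reach column 4 in the first row, column 6 in the second.
frame = np.zeros((2, 10))
frame[0, :5] = 1.0
frame[1, :7] = 1.0
front = migration_front(frame, 0.5)   # array([4, 6])
mean_advance = front.mean()           # 5.0, a per-frame distance summary
```

    Tracking the mean front position across a photo series gives the distance-vs-time report the method produces automatically.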

  13. Automated detection of soma location and morphology in neuronal network cultures.

    Directory of Open Access Journals (Sweden)

    Burcin Ozcan

    Automated identification of the primary components of a neuron and extraction of its sub-cellular features are essential steps in many quantitative studies of neuronal networks. The focus of this paper is the development of an algorithm for the automated detection of the location and morphology of somas in confocal images of neuronal network cultures. This problem is motivated by applications in high-content screenings (HCS), where the extraction of multiple morphological features of neurons on large data sets is required. Existing algorithms are not very efficient when applied to the analysis of confocal image stacks of neuronal cultures. In addition to the usual difficulties associated with the processing of fluorescent images, these types of stacks contain a small number of images so that only a small number of pixels are available along the z-direction and it is challenging to apply conventional 3D filters. The algorithm we present in this paper applies a number of innovative ideas from the theory of directional multiscale representations and involves the following steps: (i) image segmentation based on support vector machines with specially designed multiscale filters; (ii) soma extraction and separation of contiguous somas, using a combination of the level set method and directional multiscale filters. We also present an approach to extract the soma's surface morphology using the 3D shearlet transform. Extensive numerical experiments show that our algorithms are computationally efficient and highly accurate in segmenting the somas and separating contiguous ones. The algorithms presented in this paper will facilitate the development of a high-throughput quantitative platform for the study of neuronal networks for HCS applications.

  14. Network systems security analysis

    Science.gov (United States)

    Yılmaz, İsmail

    2015-05-01

    Network systems security analysis is of utmost importance in today's world. Many companies, like banks that give priority to data management, test their data security systems with penetration tests from time to time. In this context, companies must also test their own network/server systems and take precautions, as data security draws ever more attention. Based on this idea, this study researches cyber-attacks thoroughly and examines penetration testing techniques. With this information, cyber-attacks are classified, and network systems' security is then tested systematically. After the testing period, all data is reported and filed for future reference. It is found that human beings are the weakest link in the chain and that simple mistakes may unintentionally cause huge problems. Thus, precautions must be taken against such threats, for instance keeping security software up to date.
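    Security testing of the kind described typically begins with a TCP connect scan: attempt a handshake on each port and record which ones accept. A minimal sketch follows; dedicated tools such as Nmap do far more, and scans should only be run against hosts you are authorized to test.

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """TCP connect scan: a completed handshake marks the port open."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:   # 0 means the connection succeeded
                open_ports.append(port)
    return open_ports

# Check the local machine for a few common service ports.
print(scan_ports("127.0.0.1", [22, 80, 443, 8080]))
```

    Any open port found this way is then a candidate for the vulnerability assessment stage, where the listening service itself is probed for known weaknesses.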

  15. Social Network Analysis Based on Network Motifs

    OpenAIRE

    Xu Hong-lin; Yan Han-bing; Gao Cui-fang; Zhu Ping

    2014-01-01

    Based on community structure characteristics and on the theory and methods of frequent subgraph mining, network motif finding is first introduced into social network analysis; a tendentiousness evaluation function and an importance evaluation function are proposed for effectiveness assessment. Compared with the traditional approach based on node centrality degree, the new approach can be used to analyze the properties of a social network more fully and to judge the roles of the nodes effectively. I...

  16. Automated Construction of Node Software Using Attributes in a Ubiquitous Sensor Network Environment

    OpenAIRE

    JangMook Kang; Woojin Lee; Juil Kim

    2010-01-01

    In sensor networks, nodes must often operate in a demanding environment facing restrictions such as restricted computing resources, unreliable wireless communication and power shortages. Such factors make the development of ubiquitous sensor network (USN) applications challenging. To help developers construct a large amount of node software for sensor network applications easily and rapidly, this paper proposes an approach to the automated construction of node software for USN applications us...

  17. Automated Analysis of Source Code Patches using Machine Learning Algorithms

    OpenAIRE

    Castro Lechtaler, Antonio; Liporace, Julio César; Cipriano, Marcelo; García, Edith; Maiorano, Ariel; Malvacio, Eduardo; Tapia, Néstor

    2015-01-01

    An updated version of a tool for automated analysis of source code patches and branch differences is presented. The upgrade involves the use of machine learning techniques on source code, comments, and messages. It aims to help analysts, code reviewers, or auditors perform repetitive tasks continuously. The environment designed encourages collaborative work. It systematizes certain tasks pertaining to reviewing or auditing processes. Currently, the scope of the automated test is limited. C...

  18. AN AUTOMATED TESTING NETWORK SYSTEM FOR PAPER LABORATORY BASED ON CAN BUS

    Institute of Scientific and Technical Information of China (English)

    Xianhui Yi; Dongbo Yan; Huanbin Liu; Jigeng Li

    2004-01-01

    This paper presents an automated testing network system for a paper laboratory based on the CAN bus. The overall architecture, hardware interface and software functions are discussed in detail. Experiments indicate that the system can automatically collect, analyze and store the test results from the various measuring instruments in the paper lab. This simple, reliable, low-cost measurement automation system should find wide application in the paper industry.

  19. Feed forward neural networks and genetic algorithms for automated financial time series modelling

    OpenAIRE

    Kingdon, J. C.

    1995-01-01

    This thesis presents an automated system for financial time series modelling. Formal and applied methods are investigated for combining feed-forward Neural Networks and Genetic Algorithms (GAs) into a single adaptive/learning system for automated time series forecasting. Four important research contributions arise from this investigation: i) novel forms of GAs are introduced which are designed to counter the representational bias associated with the conventional Holland GA, ii) an...

  20. AN AUTOMATED NETWORK SECURITY CHECKING AND ALERT SYSTEM: A NEW FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Vivek Kumar Yadav

    2013-09-01

    Full Text Available Network security checking is a vital process for assessing a network and identifying its weaknesses for security management. Insecure entry points of a network give attackers an easy target to access and compromise it. Open ports of network components such as firewalls, gateways and end systems are analogous to open gates of a building through which anyone can enter. Network scanning is performed to identify insecure entry points in the network components, and vulnerability assessment is performed to find vulnerabilities at these points. Security checking therefore comprises both activities: network scanning and vulnerability assessment. A single tool used for security checking may not give reliable results. This paper presents a framework for assessing the security of a network using multiple network scanning and vulnerability assessment tools. The proposed framework extends the framework given by Jun Yoon and Wontae Sim [1], which performs vulnerability scanning only, by adding network scanning, alerting and reporting. Network scanning and vulnerability tools complement each other and together become amenable to centralized control and management. The reporting system of the framework sends the network administrator an email containing a detailed report (as an attachment) of the security checking process. The alerting system sends the network administrator an SMS message as an alert when severe threats are found in the network. Initial results of the framework are encouraging and further work is in progress.
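
    The workflow this abstract describes can be pictured as a small aggregation-and-alerting loop. The sketch below is illustrative only, not the paper's implementation: the tool names, the 1-5 severity scale, and the notification placeholders are all assumptions.

```python
# Hypothetical sketch: merge findings from multiple scanning/assessment
# tools, then dispatch an email report always and an SMS alert only when
# a severe finding is present, as the framework describes.
from dataclasses import dataclass

@dataclass
class Finding:
    host: str
    issue: str
    severity: int  # 1 (info) .. 5 (critical); scale is an assumption

def merge_findings(*tool_reports):
    """Combine findings from several tools, de-duplicating
    identical (host, issue) pairs."""
    seen, merged = set(), []
    for report in tool_reports:
        for f in report:
            key = (f.host, f.issue)
            if key not in seen:
                seen.add(key)
                merged.append(f)
    return merged

def dispatch(findings, severe_threshold=4):
    """Email a full report; send an SMS only for severe findings."""
    actions = ["email:full-report"]  # placeholder for an SMTP call
    if any(f.severity >= severe_threshold for f in findings):
        actions.append("sms:alert")  # placeholder for an SMS gateway call
    return actions

# Two tools report overlapping results; the merge keeps one copy.
scanner_a = [Finding("10.0.0.5", "open port 23/telnet", 4)]
scanner_b = [Finding("10.0.0.5", "open port 23/telnet", 4),
             Finding("10.0.0.7", "outdated TLS", 2)]
merged = merge_findings(scanner_a, scanner_b)
print(len(merged), dispatch(merged))
```

    The de-duplication step is why combining several tools remains manageable under centralized control: each distinct issue is reported once regardless of how many tools detect it.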

  1. Analysis of computer networks

    CERN Document Server

    Gebali, Fayez

    2015-01-01

    This textbook presents the mathematical theory and techniques necessary for analyzing and modeling high-performance global networks, such as the Internet. The three main building blocks of high-performance networks are links, switching equipment connecting the links together, and software employed at the end nodes and intermediate switches. This book provides the basic techniques for modeling and analyzing these last two components. Topics covered include, but are not limited to: Markov chains and queuing analysis, traffic modeling, interconnection networks and switch architectures and buffering strategies. Key features: provides techniques for modeling and analysis of network software and switching equipment; discusses design options used to build efficient switching equipment; includes many worked examples of the application of discrete-time Markov chains to communication systems; covers the mathematical theory and techniques necessary for ana...

  2. Semantic analysis for system level design automation

    OpenAIRE

    Greenwood, Rob

    1992-01-01

    This thesis describes the design and implementation of a system to extract meaning from natural language specifications of digital systems. This research is part of the ASPIN project which has the long-term goal of providing an automated system for digital system synthesis from informal specifications. This work makes several contributions, one being the application of artificial intelligence techniques to specifications writing. Also, the work deals with the subset of the Engl...

  3. Policy-Based Automation of Dynamique and Multipoint Virtual Private Network Simulation on OPNET Modeler

    Directory of Open Access Journals (Sweden)

    Ayoub BAHNASSE

    2014-12-01

    Full Text Available The simulation of large-scale networks is a challenging task, especially if the network to simulate is a Dynamic Multipoint Virtual Private Network, which requires expert knowledge to properly configure its component technologies. Studying these network architectures in a real environment is almost impossible because it requires a very large number of devices; the task is, however, feasible in a simulation environment like OPNET Modeler, provided one masters both the tool and the different architectures of the Dynamic Multipoint Virtual Private Network. Several research studies have been conducted to automate the generation and simulation of complex networks under various simulators, but to our knowledge no work has dealt with the Dynamic Multipoint Virtual Private Network. In this paper we present a simulation model of the Dynamic Multipoint Virtual Private Network in OPNET Modeler, and a web-based tool for managing projects on the same network.

  4. DESIGN AND IMPLEMENTATION FOR AUTOMATED NETWORK TROUBLESHOOTING USING DATA MINING

    OpenAIRE

    Eleni Rozaki

    2015-01-01

    The efficient and effective monitoring of mobile networks is vital given the number of users who rely on such networks and the importance of those networks. The purpose of this paper is to present a monitoring scheme for mobile networks based on the use of rules and decision tree data mining classifiers to upgrade fault detection and handling. The goal is to have optimisation rules that improve anomaly detection. In addition, a monitoring scheme that relies on Bayesian classifiers...

  5. Design and implementation for automated network troubleshooting using data mining

    OpenAIRE

    Rozaki, Eleni

    2015-01-01

    The efficient and effective monitoring of mobile networks is vital given the number of users who rely on such networks and the importance of those networks. The purpose of this paper is to present a monitoring scheme for mobile networks based on the use of rules and decision tree data mining classifiers to upgrade fault detection and handling. The goal is to have optimisation rules that improve anomaly detection. In addition, a monitoring scheme that relies on Bayesian classifiers was also im...

  6. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies
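
    The core idea of compiler-generated derivatives can be illustrated with forward-mode automatic differentiation via dual numbers. This is only an analogy: GRESS instruments FORTRAN source and also supports adjoint (reverse) sensitivities, whereas the toy class below propagates a single forward derivative.

```python
# Minimal forward-mode automatic differentiation with dual numbers.
# Each Dual carries a value and the derivative of that value with
# respect to the chosen input, propagated exactly (no finite differences).
class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def model(x):
    # An arbitrary model response: y = 3*x^2 + 2*x (illustrative only).
    return 3 * x * x + 2 * x

x = Dual(2.0, 1.0)   # seed dx/dx = 1 at x = 2
y = model(x)
print(y.val, y.der)  # value 16.0, sensitivity dy/dx = 6*x + 2 = 14.0
```

    The appeal of the compiler-based approach in the abstract is that the model code (`model` above) needs no manual differentiation: the derivative arithmetic rides along with the ordinary evaluation.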

  7. Automated Analysis of Fluorescence Microscopy Images to Identify Protein-Protein Interactions

    OpenAIRE

    Morrell-Falvey, J. L.; Qi, H.; Doktycz, M. J.; Venkatraman, S.

    2006-01-01

    The identification of protein interactions is important for elucidating biological networks. One obstacle in comprehensive interaction studies is the analysis of large datasets, particularly those containing images. Development of an automated system to analyze an image-based protein interaction dataset is needed. Such an analysis system is described here; it automatically extracts features from fluorescence microscopy images obtained from a bacterial protein interaction assay. These features ...

  8. Obtaining informedness in collaborative networks through automated information provisioning

    DEFF Research Database (Denmark)

    Thimm, Heiko; Rasmussen, Karsten Boye

    2013-01-01

    Successful collaboration in business networks calls for well-informed network participants. Members who know about the many aspects of the network are an effective vehicle to successfully resolve conflicts, build a prospering collaboration climate and promote trust within the network. The importance of well-informed network participants has led to the concept of network participant informedness, which is derived from existing theories and concepts of firm informedness. It is possible to support and develop well-informed network participants through a specialised IT-based active information provisioning service. This article presents a corresponding modelling framework and a rule-based approach for the required active system capabilities. Details of a prototype implementation building on concepts from the research area of active databases are also reported.

  9. Initial development of an automated task analysis profiling system

    International Nuclear Information System (INIS)

    A program for automated task analysis is described. Called TAPS (task analysis profiling system), the program accepts normal English prose and outputs skills, knowledges, attitudes, and abilities (SKAAs) along with specific guidance and recommended ability measurement tests for nuclear power plant operators. A new method for defining SKAAs is presented along with a sample program output

  10. On Automating and Standardising Corpus Callosum Analysis in Brain MRI

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Skoglund, Karl

    2005-01-01

    Corpus callosum analysis is influenced by many factors. The effort in controlling these has previously been incomplete and scattered. This paper sketches a complete pipeline for automated corpus callosum analysis from magnetic resonance images, with focus on measurement standardisation. The prese...

  11. Image analysis and platform development for automated phenotyping in cytomics

    NARCIS (Netherlands)

    Yan, Kuan

    2013-01-01

    This thesis is dedicated to the empirical study of image analysis in HT/HC screening studies. An HT/HC screening often produces extensive amounts of data that cannot be manually analyzed; thus, an automated image analysis solution is a prerequisite for an objective understanding of the raw image data. Compared to general a

  12. Analysis of Trinity Power Metrics for Automated Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Michalenko, Ashley Christine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-28

    This is a presentation from Los Alamos National Laboratory (LANL) about the analysis of Trinity power metrics for automated monitoring. The following topics are covered: current monitoring efforts, motivation for the analysis, tools used, the methodology, work performed during the summer, and planned future work.

  13. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with 'direct' and 'adjoint' sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs

  14. DESIGN AND IMPLEMENTATION FOR AUTOMATED NETWORK TROUBLESHOOTING USING DATA MINING

    Directory of Open Access Journals (Sweden)

    Eleni Rozaki

    2015-05-01

    Full Text Available The efficient and effective monitoring of mobile networks is vital given the number of users who rely on such networks and the importance of those networks. The purpose of this paper is to present a monitoring scheme for mobile networks based on the use of rules and decision tree data mining classifiers to upgrade fault detection and handling. The goal is to have optimisation rules that improve anomaly detection. In addition, a monitoring scheme that relies on Bayesian classifiers was also implemented for the purpose of fault isolation and localisation. The data mining techniques described in this paper are intended to allow a system to be trained to actually learn network fault rules. The results of the tests that were conducted support the conclusion that the rules were highly effective in improving network troubleshooting.

  15. On-Chip Network Design Automation with Source Routing Switches

    Institute of Scientific and Technical Information of China (English)

    MA Liwei; SUN Yihe

    2007-01-01

    Network-on-chip (NoC) is a new design paradigm for system-on-chip interconnections in the billion-transistor era. Application-specific on-chip network design is essential for NoC success in this new era. This paper presents a class of source routing switches that can be used to efficiently form arbitrary network topologies and that can be optimized for various applications. Hardware description language versions of the networks can be generated automatically for simulation and synthesis. A series of switches and networks has been configured, and their performance, including latency, delay, area, and power, analyzed theoretically and experimentally. The results show that this NoC architecture provides a large design space for application-specific on-chip network designs.

  16. Remotely Managing Operation, Data Collection and processing in Modern Automated ET Networks

    Science.gov (United States)

    Johnson, D.; Xu, L.; Li, J.; Yuan, G.; Sun, X.; Zhu, Z.; Tang, X.; Velgersdyk, M.; Beaty, K.; Fratini, G.; Kathilankal, J. C.; Burba, G. G.

    2014-12-01

    The significant increase in overall data generation and available computing power in the recent years has greatly improved spatial and temporal data coverage of evapotranspiration (ET) measurements on multiple scales, ranging from a single station to continental scale ET networks. With the increased number of ET stations and increased amount of data flowing from each station, modern tools are needed to effectively and efficiently handle the entire infrastructure (hardware, software and data management). These tools can automate key stages of ET network operation, remotely providing real-time ET rates and alerts for the health of the instruments. This can help maximize time dedicated to answering research questions, rather than to station management. This year, the Chinese Ecosystem Research Network (CERN) within the Chinese Academy of Sciences implemented a large-scale 27-station national ET network across China to measure and understand the water cycle from a variety of ecosystems. It includes automated eddy covariance systems, on-site flux computations, wireless communication, and a network server for system, data, and user management. This presentation will discuss the latest information on the CERN network, methods and hardware for ET measurements, tools for automated data collection, data processing and quality control, and data transport and management of the multiple stations. This system description is beneficial for individuals and institutions interested in setting up or modifying present ET networks consisting of single or multiple stations spread over geographic locations ranging from single field site or watershed to national or continental scale.

  17. A catalog of automated analysis methods for enterprise models.

    Science.gov (United States)

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

    Enterprise models are created for documenting and communicating the structure and state of Business and Information Technologies elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills, and due to the size and complexity of the models, this process can be complicated and omissions or miscalculations are very likely. This situation has fostered research on automated analysis methods for supporting analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels; thus, some analysis methods might not be applicable to all enterprise models. This paper presents the work of compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool. PMID:27047732

  18. Comparative analysis of collaboration networks

    International Nuclear Information System (INIS)

    In this paper we carry out a comparative analysis of the word network, treated as a collaboration network based on M. Bulgakov's novel 'Master and Margarita', the synonym network of the Russian language, and the Russian movie actor network. We have constructed one-mode projections of these networks, determined their degree distributions and calculated their main characteristics. We also propose a generation algorithm for collaboration networks that produces networks statistically equivalent to those studied, which allows us to reveal a structural correlation between the word network, the synonym network and the movie actor network. We show that the degree distributions of all the analyzed networks are described by a q-type distribution.
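
    The two basic operations mentioned in this abstract, building a one-mode projection of a bipartite collaboration network and computing its degree distribution, can be sketched in a few lines. The movie/cast data below is purely illustrative, not from the paper.

```python
# One-mode projection of a bipartite network (actors linked through
# shared movies), followed by a degree-distribution tally.
from collections import defaultdict
from itertools import combinations

movies = {              # bipartite data: movie -> cast (illustrative)
    "m1": ["A", "B", "C"],
    "m2": ["B", "C"],
    "m3": ["C", "D"],
}

# Projection: connect two actors whenever they appear in the same movie.
adj = defaultdict(set)
for cast in movies.values():
    for a, b in combinations(cast, 2):
        adj[a].add(b)
        adj[b].add(a)

# Degree distribution: number of nodes having each degree.
dist = defaultdict(int)
for node, neighbours in adj.items():
    dist[len(neighbours)] += 1

print(dict(sorted(dist.items())))  # -> {1: 1, 2: 2, 3: 1}
```

    With real data one would then fit the resulting distribution (e.g. against a q-type form, as the paper does) rather than just tabulate it.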

  19. Multielement and automated radiochemical separation procedures for activation analysis

    International Nuclear Information System (INIS)

    In recent years the demand for information about the distribution of elements at trace concentration levels in high purity materials and in biological, environmental and geological specimens has increased greatly. Neutron activation analysis can play an important role in obtaining the required information. Radiochemical separations are required in many of the applications mentioned. A critical review of the progress made over the last 15 years in the development and application of radiochemical separation schemes for multielement activation analysis and in their automation is presented. About 80 radiochemical separation schemes are reviewed. Advantages and disadvantages of the automation of radiochemical separations are critically analysed. The various machines developed are illustrated and technical suggestions for the development of automated machines are given. (author)

  20. Automated construction of generalized additive neural networks for predictive data mining / Jan Valentine du Toit

    OpenAIRE

    Du Toit, Jan Valentine

    2006-01-01

    In this thesis Generalized Additive Neural Networks (GANNs) are studied in the context of predictive Data Mining. A GANN is a novel neural network implementation of a Generalized Additive Model. Originally GANNs were constructed interactively by considering partial residual plots. This methodology involves subjective human judgment, is time consuming, and can result in suboptimal results. The newly developed automated construction algorithm solves these difficulties by performing mod...

  1. Routes visualization: Automated placement of multiple route symbols along a physical network infrastructure

    OpenAIRE

    Jules Teulade-Denantes; Adrien Maudet; Cécile Duchêne

    2015-01-01

    This paper tackles the representation of routes carried by a physical network infrastructure on a map. In particular, the paper examines the case where each route is represented by a separate colored linear symbol offset from the physical network segments and from other routes—as on public transit maps with bus routes offset from roads. In this study, the objective is to automate the placement of such route symbols while maximizing their legibility, especially at junctions. The problem is mod...

  2. User-friendly establishment of trust in distributed home automation networks

    DEFF Research Database (Denmark)

    Solberg Hjorth, Theis; Madsen, Per Printz; Torbensen, Rune Sonnich

    2012-01-01

    Current wireless technologies use a variety of methods to locally exchange and verify credentials between devices to establish trusted relationships. Scenarios in home automation networks also require this capability over the Internet, but the necessary involvement of non-expert users to setup th...

  3. Modeling Multiple Human-Automation Distributed Systems using Network-form Games

    Science.gov (United States)

    Brat, Guillaume

    2012-01-01

    The paper describes at a high-level the network-form game framework (based on Bayes net and game theory), which can be used to model and analyze safety issues in large, distributed, mixed human-automation systems such as NextGen.

  4. SensorScheme: Supply chain management automation using Wireless Sensor Networks

    NARCIS (Netherlands)

    Evers, L.; Havinga, P.J.M.; Kuper, J.; Lijding, M.E.M.; Meratnia, N.

    2007-01-01

    The supply chain management business can benefit greatly from automation, as recent developments with RFID technology shows. The use of Wireless Sensor Network technology promises to bring the next leap in efficiency and quality of service. However, current WSN system software does not yet provide t

  5. A Holistic Approach to ZigBee Performance Enhancement for Home Automation Networks

    Directory of Open Access Journals (Sweden)

    August Betzler

    2014-08-01

    Full Text Available Wireless home automation networks are gaining importance for smart homes. In this ambit, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected with the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible performance gains by implementing and testing alternative settings. The evaluations are carried out in a testbed of 57 TelosB motes. The results show that considerable performance improvements can be achieved by using alternative protocol stack configurations. From these results, we derive two improved protocol stack configurations for ZigBee wireless home automation networks that are validated in various network scenarios. In our experiments, these improved configurations yield a relative packet delivery ratio increase of up to 33.6%, a delay decrease of up to 66.6% and an improvement of the energy efficiency for battery powered devices of up to 48.7%, obtainable without incurring any overhead to the network.

  6. An overview of the contaminant analysis automation program

    International Nuclear Information System (INIS)

    The Department of Energy (DOE) has significant amounts of radioactive and hazardous wastes stored, buried, and still being generated at many sites within the United States. These wastes must be characterized to determine the elemental, isotopic, and compound content before remediation can begin. In this paper, the authors project that sampling requirements will necessitate generating more than 10 million samples by 1995, which will far exceed the capabilities of our current manual chemical analysis laboratories. The Contaminant Analysis Automation effort (CAA), with Los Alamos National Laboratory (LANL) as the coordinating laboratory, is designing and fabricating robotic systems that will standardize and automate both the hardware and the software of the most common environmental chemical methods. This will be accomplished by designing and producing several unique analysis systems called Standard Analysis Methods (SAM). Each SAM will automate a specific chemical method, including sample preparation, the analytical analysis, and the data interpretation, by using a building block known as the Standard Laboratory Module (SLM). This concept allows the chemist to assemble an automated environmental method using standardized SLMs easily and without the worry of hardware compatibility or the necessity of generating complicated control programs

  7. PollyNET: a global network of automated Raman-polarization lidars for continuous aerosol profiling

    Directory of Open Access Journals (Sweden)

    H. Baars

    2015-10-01

    Full Text Available A global vertically resolved aerosol data set covering more than 10 years of observations at more than 20 measurement sites distributed from 63° N to 52° S and 72° W to 124° E has been achieved within the Raman and polarization lidar network PollyNET. This network consists of portable, remote-controlled multiwavelength-polarization-Raman lidars (Polly) for automated and continuous 24/7 observations of clouds and aerosols. PollyNET is an independent, voluntary, and scientific network. All Polly lidars feature a standardized instrument design and apply unified calibration, quality control, and data analysis. The observations are processed in near-real time without manual intervention, and are presented online at http://polly.tropos.de. The paper gives an overview of the observations on four continents and two research vessels obtained with eight Polly systems. The specific aerosol types at these locations (mineral dust, smoke, dust-smoke and other dusty mixtures, urban haze, and volcanic ash) are identified by their Ångström exponent, lidar ratio, and depolarization ratio. The vertical aerosol distribution at the PollyNET locations is discussed on the basis of more than 55 000 automatically retrieved 30 min particle backscatter coefficient profiles at 532 nm. A seasonal analysis of measurements at selected sites revealed typical and extraordinary aerosol conditions as well as seasonal differences. These studies show the potential of PollyNET to support the establishment of a global aerosol climatology that covers the entire troposphere.

  8. AUTOMATED IRRIGATION SYSTEM USING WIRELESS SENSOR NETWORK AND RFMODULE

    Directory of Open Access Journals (Sweden)

    Nagare Vrushali M

    2015-06-01

    Full Text Available Fresh water is a basic need of living organisms on earth; it is consumed by living beings, including plants and animals, to stay alive. The amount of fresh water available is limited. Moreover, the population has grown relative to the available water and food resources. Agriculture consumes about 85% of the total available fresh water, so there is an urgent need for science- and technology-based strategies for sustainable water use, including technical, agronomic, managerial and institutional improvements. Many systems use various techniques to achieve water savings in agricultural practice; this paper discusses a system using remote access and wireless communication. The system described here is a network of wireless sensors and a wireless base station that processes the sensor data to automate the irrigation system. The sensors are a soil moisture sensor and a soil temperature sensor. The base station microcontroller is programmed so that if either the soil moisture or the temperature crosses a predefined threshold level, the irrigation system is actuated: the relay connected to the water pump motor switches ON, and otherwise OFF.
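
    The control rule in this abstract is a simple threshold comparison. The sketch below shows that logic only; the threshold values are invented for illustration, and a real deployment would drive a relay pin on the base station microcontroller rather than return a string.

```python
# Threshold-based irrigation control, as described in the abstract:
# the pump switches ON when either sensor reading crosses its limit.
MOISTURE_MIN = 30.0   # percent volumetric water content (assumed value)
TEMP_MAX = 35.0       # degrees Celsius (assumed value)

def pump_state(soil_moisture, soil_temp):
    """Return 'ON' if irrigation is needed, else 'OFF'."""
    if soil_moisture < MOISTURE_MIN or soil_temp > TEMP_MAX:
        return "ON"
    return "OFF"

print(pump_state(25.0, 30.0))  # dry soil
print(pump_state(45.0, 35.5))  # hot soil
print(pump_state(45.0, 30.0))  # both readings within limits
```

    Because the two conditions are joined with OR, either a dry or a hot reading is enough to start the pump, which matches the "either soil moisture or temperature" wording above.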

  9. Hierarchical polynomial network approach to automated target recognition

    Science.gov (United States)

    Kim, Richard Y.; Drake, Keith C.; Kim, Tony Y.

    1994-02-01

    A hierarchical recognition methodology using abductive networks at several levels of object recognition is presented. Abductive networks, an innovative numeric modeling technology using networks of polynomial nodes, result from nearly three decades of application research and development in areas including statistical modeling, uncertainty management, genetic algorithms, and traditional neural networks. The system uses pixel-registered multisensor target imagery provided by the Tri-Service Laser Radar sensor. Several levels of recognition are performed, namely detection, classification, and identification, each providing more detailed object information. Advanced feature extraction algorithms are applied at each recognition level for target characterization. Abductive polynomial networks process feature information and situational data at each recognition level, providing input for the next level of processing. An expert system coordinates the activities of individual recognition modules and enables employment of heuristic knowledge to overcome the limitations of a purely numeric processing approach. The approach can potentially overcome limitations of current systems, such as catastrophic degradation during unanticipated operating conditions, while meeting strict processing requirements. These benefits result from implementation of robust feature extraction algorithms that do not take explicit advantage of peculiar characteristics of the sensor imagery, and the compact, real-time processing capability provided by abductive polynomial networks.

  10. Fuzzy Emotional Semantic Analysis and Automated Annotation of Scene Images

    Directory of Open Access Journals (Sweden)

    Jianfang Cao

    2015-01-01

    Full Text Available With the advances in electronic and imaging techniques, the production of digital images has rapidly increased, and the extraction and automated annotation of emotional semantics implied by images have become issues that must be urgently addressed. To better simulate human subjectivity and ambiguity for understanding scene images, the current study proposes an emotional semantic annotation method for scene images based on fuzzy set theory. A fuzzy membership degree was calculated to describe the emotional degree of a scene image and was implemented using the Adaboost algorithm and a back-propagation (BP) neural network. The automated annotation method was trained and tested using scene images from the SUN Database. The annotation results were then compared with those based on artificial annotation. Our method showed an annotation accuracy rate of 91.2% for basic emotional values and 82.4% after extended emotional values were added, which correspond to increases of 5.5% and 8.9%, respectively, compared with the results from using a single BP neural network algorithm. Furthermore, the retrieval accuracy rate based on our method reached approximately 89%. This study attempts to lay a solid foundation for the automated emotional semantic annotation of more types of images and therefore is of practical significance.

  11. Fuzzy emotional semantic analysis and automated annotation of scene images.

    Science.gov (United States)

    Cao, Jianfang; Chen, Lichao

    2015-01-01

    With the advances in electronic and imaging techniques, the production of digital images has rapidly increased, and the extraction and automated annotation of emotional semantics implied by images have become issues that must be urgently addressed. To better simulate human subjectivity and ambiguity for understanding scene images, the current study proposes an emotional semantic annotation method for scene images based on fuzzy set theory. A fuzzy membership degree was calculated to describe the emotional degree of a scene image and was implemented using the Adaboost algorithm and a back-propagation (BP) neural network. The automated annotation method was trained and tested using scene images from the SUN Database. The annotation results were then compared with those based on artificial annotation. Our method showed an annotation accuracy rate of 91.2% for basic emotional values and 82.4% after extended emotional values were added, which correspond to increases of 5.5% and 8.9%, respectively, compared with the results from using a single BP neural network algorithm. Furthermore, the retrieval accuracy rate based on our method reached approximately 89%. This study attempts to lay a solid foundation for the automated emotional semantic annotation of more types of images and therefore is of practical significance. PMID:25838818
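
    The fuzzy-membership idea behind this annotation method can be illustrated independently of the paper's trained Adaboost/BP model: each image receives a degree of membership in every emotion class, and classes whose membership exceeds a cutoff become annotations. The triangular membership function, the class centers, and the cutoff below are all assumptions for illustration.

```python
# Illustrative fuzzy annotation: soft membership degrees instead of one
# hard label, with multi-label output above a cutoff.
def triangular(x, center, width=1.0):
    """Triangular fuzzy membership function, returning a value in [0, 1]."""
    return max(0.0, 1.0 - abs(x - center) / width)

# Hypothetical emotion classes placed along a 1-D "emotional value" axis.
EMOTION_CENTERS = {"calm": 0.0, "joy": 1.0, "excitement": 2.0}

def annotate(emotional_value, cutoff=0.4):
    """Return every emotion label whose membership degree >= cutoff."""
    degrees = {label: triangular(emotional_value, center)
               for label, center in EMOTION_CENTERS.items()}
    return sorted(label for label, d in degrees.items() if d >= cutoff)

print(annotate(0.5))  # an ambiguous image: partly 'calm', partly 'joy'
print(annotate(2.0))  # clearly 'excitement'
```

    The point of the fuzzy formulation is visible in the first call: an ambiguous score yields two labels with partial membership, mirroring how human annotators hedge between adjacent emotions.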

  12. Multifractal analysis of complex networks

    International Nuclear Information System (INIS)

    Complex networks have recently attracted much attention in diverse areas of science and technology. Many networks such as the WWW and biological networks are known to display spatial heterogeneity which can be characterized by their fractal dimensions. Multifractal analysis is a useful way to systematically describe the spatial heterogeneity of both theoretical and experimental fractal patterns. In this paper, we introduce a new box-covering algorithm for multifractal analysis of complex networks. This algorithm is used to calculate the generalized fractal dimensions Dq of some theoretical networks, namely scale-free networks, small world networks, and random networks, and one kind of real network, namely protein-protein interaction networks of different species. Our numerical results indicate the existence of multifractality in scale-free networks and protein-protein interaction networks, while the multifractal behavior is not clear-cut for small world networks and random networks. The possible variation of Dq due to changes in the parameters of the theoretical network models is also discussed. (general)
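The generalized dimensions Dq mentioned in this record come from a partition sum over box masses. The sketch below estimates Dq for an ordinary 1-D point set by box counting; the paper's contribution is a box-*covering* variant for networks (boxes defined by shortest-path distance), which is not reproduced here. The recipe shown is the classical one: Z_q(eps) = sum_i p_i^q over occupied boxes, and D_q = (1/(q-1)) times the slope of log Z_q(eps) versus log eps.

```python
# Minimal box-counting estimate of D_q for points in [0, 1). A uniform set
# is monofractal, so D_q should come out close to 1 for every q != 1.
import math

def box_masses(points, eps):
    """Fraction of points falling in each box of width eps."""
    counts = {}
    for x in points:
        counts[int(x / eps)] = counts.get(int(x / eps), 0) + 1
    n = len(points)
    return [c / n for c in counts.values()]

def generalized_dimension(points, q, epsilons):
    """D_q from the slope of log Z_q(eps) against log eps."""
    xs, ys = [], []
    for eps in epsilons:
        z = sum(p ** q for p in box_masses(points, eps))
        xs.append(math.log(eps))
        ys.append(math.log(z))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope / (q - 1)

uniform = [i / 4096 for i in range(4096)]
print(round(generalized_dimension(uniform, 2.0, [1/8, 1/16, 1/32, 1/64]), 2))
```

For a genuinely multifractal measure (e.g. a binomial cascade), Dq would vary with q; a flat Dq curve like the one above is the monofractal signature.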

  13. Performance Analysis of GAME: A Generic Automated Marking Environment

    Science.gov (United States)

    Blumenstein, Michael; Green, Steve; Fogelman, Shoshana; Nguyen, Ann; Muthukkumarasamy, Vallipuram

    2008-01-01

    This paper describes the Generic Automated Marking Environment (GAME) and provides a detailed analysis of its performance in assessing student programming projects and exercises. GAME has been designed to automatically assess programming assignments written in a variety of languages based on the "structure" of the source code and the correctness…

  14. Automated Analysis of Child Phonetic Production Using Naturalistic Recordings

    Science.gov (United States)

    Xu, Dongxin; Richards, Jeffrey A.; Gilkerson, Jill

    2014-01-01

    Purpose: Conventional resource-intensive methods for child phonetic development studies are often impractical for sampling and analyzing child vocalizations in sufficient quantity. The purpose of this study was to provide new information on early language development by an automated analysis of child phonetic production using naturalistic…

  15. ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wieselquist, William A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Thompson, Adam B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bowman, Stephen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Peterson, Joshua L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.
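The bookkeeping the Automator replaces — turning per-assembly records into thousands of input files — is essentially templating. The sketch below is a generic illustration of that step; the field names and the template text are invented, not actual ORIGAMI input syntax.

```python
# Hypothetical per-assembly templating: one input file per assembly record.
# The template and field names are illustrative, not real ORIGAMI inputs.
ASSEMBLY_TEMPLATE = """\
=origami_like_input
assembly={name}
enrichment={enrich_wt_pct}
heavy_metal_mass={hm_mass_t}
power_history={powers}
end
"""

def render_inputs(assemblies):
    """Return a {filename: file_text} map, one entry per assembly."""
    files = {}
    for a in assemblies:
        files[a["name"] + ".inp"] = ASSEMBLY_TEMPLATE.format(
            name=a["name"],
            enrich_wt_pct=a["enrich_wt_pct"],
            hm_mass_t=a["hm_mass_t"],
            powers=",".join(str(p) for p in a["powers"]),
        )
    return files

inputs = render_inputs([
    {"name": "A01", "enrich_wt_pct": 4.2, "hm_mass_t": 0.45, "powers": [18, 17.5]},
    {"name": "A02", "enrich_wt_pct": 3.8, "hm_mass_t": 0.45, "powers": [16, 0]},
])
print(sorted(inputs))  # prints ['A01.inp', 'A02.inp']
```

Scaling this pattern to thousands of assemblies is exactly the "extensive scripting" the GUI automates, with the added value of validation, execution, and result plotting.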

  16. Automated SEM-EDS GSR Analysis for Turkish Ammunitions

    International Nuclear Information System (INIS)

    In this work, Automated Scanning Electron Microscopy with Energy Dispersive X-ray Spectrometry (SEM-EDS) was used to characterize gunshot residue from 7.65 mm and 9 mm cartridges of Turkish ammunition. All samples were analyzed in a JEOL JSM-5600LV SEM equipped with a BSE detector and a Link ISIS 300 EDS system. A working distance of 20 mm, an accelerating voltage of 20 kV, and gunshot residue software were used in all analyses. The automated search yielded a high number of analyzed particles containing the elements unique to gunshot residue (GSR): Pb, Ba, and Sb. The obtained data on the definition of characteristic GSR particles were concordant with other studies on this topic

  17. Automated procedure for performing computer security risk analysis

    International Nuclear Information System (INIS)

    Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures
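The "linguistic algebra" in this record combines qualitative word-valued scores rather than numbers. Its actual rules are not reproduced here; the sketch below shows only the general shape such a combination can take: a lookup table mapping (vulnerability, impact) word pairs to a risk word for each threat-target pair. The table entries and pair names are invented for illustration.

```python
# Hypothetical qualitative risk combination: the real system's linguistic
# algebra and event-tree linkage are not reproduced, only the shape.
RISK_TABLE = {
    ("low", "low"): "low",          ("low", "medium"): "low",
    ("low", "high"): "medium",      ("medium", "low"): "low",
    ("medium", "medium"): "medium", ("medium", "high"): "high",
    ("high", "low"): "medium",      ("high", "medium"): "high",
    ("high", "high"): "high",
}

def assess(threat_target_pairs):
    """Map each (name, vulnerability, impact) triple to a qualitative risk."""
    return {name: RISK_TABLE[(vuln, impact)]
            for name, vuln, impact in threat_target_pairs}

risks = assess([
    ("insider->accounting-db", "medium", "high"),
    ("outsider->terminal", "low", "medium"),
])
print(risks)
```

Keeping the combination rules in a table, as sketched, is what lets an interactive questionnaire drive the assessment without free-form numeric estimates from the user.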

  18. Volumetric measurements of pulmonary nodules: variability in automated analysis tools

    Science.gov (United States)

    Juluru, Krishna; Kim, Woojin; Boonn, William; King, Tara; Siddiqui, Khan; Siegel, Eliot

    2007-03-01

    Over the past decade, several computerized tools have been developed for detection of lung nodules and for providing volumetric analysis. Incidentally detected lung nodules have traditionally been followed over time by measurements of their axial dimensions on CT scans to ensure stability or document progression. A recently published article by the Fleischner Society offers guidelines on the management of incidentally detected nodules based on size criteria. For this reason, differences in measurements obtained by automated tools from various vendors may have significant implications on management, yet the degree of variability in these measurements is not well understood. The goal of this study is to quantify the differences in nodule maximum diameter and volume among different automated analysis software. Using a dataset of lung scans obtained with both "ultra-low" and conventional doses, we identified a subset of nodules in each of five size-based categories. Using automated analysis tools provided by three different vendors, we obtained size and volumetric measurements on these nodules, and compared these data using descriptive statistics as well as ANOVA and t-test analyses. Results showed significant differences in nodule maximum diameter measurements among the various automated lung nodule analysis tools but no significant differences in nodule volume measurements. These data suggest that when using automated commercial software, volume measurements may be a more reliable marker of tumor progression than maximum diameter. The data also suggest that volumetric nodule measurements may be relatively reproducible among various commercial workstations, in contrast to the variability documented when performing human mark-ups, as seen in the LIDC (Lung Image Database Consortium) study.
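The one-way ANOVA used in such a vendor comparison reduces to a ratio of between-group to within-group mean squares. The sketch below computes the F statistic in pure Python on made-up diameter readings; the study's actual data are not reproduced.

```python
# One-way ANOVA F statistic, computed from scratch. The vendor readings
# below are invented for illustration.
def one_way_anova_f(groups):
    """F = between-group mean square / within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical maximum-diameter readings (mm) of the same four nodules
# reported by three analysis tools.
vendor_a = [6.1, 7.9, 5.2, 8.4]
vendor_b = [6.0, 8.1, 5.0, 8.6]
vendor_c = [7.2, 9.0, 6.3, 9.5]
print(round(one_way_anova_f([vendor_a, vendor_b, vendor_c]), 2))
```

The F value is then compared against the F distribution with (k-1, n-k) degrees of freedom to decide whether the vendors' measurements differ significantly.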

  19. Integrated evolutionary computation neural network quality controller for automated systems

    Energy Technology Data Exchange (ETDEWEB)

    Patro, S.; Kolarik, W.J. [Texas Tech Univ., Lubbock, TX (United States). Dept. of Industrial Engineering

    1999-06-01

    With increasing competition in the global market, more and more stringent quality standards and specifications are being demanded at lower costs. Manufacturing applications of computing power are becoming more common. The application of neural networks to identification and control of dynamic processes has been discussed. The limitations of using neural networks for control purposes have been pointed out, and a different technique, evolutionary computation, has been discussed. The results of identifying and controlling an unstable, dynamic process using evolutionary computation methods have been presented. A framework for an integrated system, using both neural networks and evolutionary computation, has been proposed to identify the process and then control the product quality, in a dynamic, multivariable system, in real time.

  20. AUTOMATED POLICY COMPLIANCE AND CHANGE DETECTION MANAGED SERVICE IN DATA NETWORKS

    Directory of Open Access Journals (Sweden)

    Saeed M. Agbariah

    2013-11-01

    Full Text Available As networks continue to grow in size, speed and complexity, as well as in the diversification of their services, they require many ad-hoc configuration changes. Such changes may lead to potential configuration errors, policy violations, inefficiencies, and vulnerable states. The current network management landscape is in dire need of an automated process to prioritize and manage risk, audit configurations against internal policies or external best practices, and provide centralized reporting for monitoring and regulatory purposes in real time. This paper defines a framework for an automated configuration process with a policy compliance and change detection system, which performs automatic and intelligent network configuration audits by using pre-defined configuration templates and a library of rules that encompass industry standards for various routing and security related guidelines. System administrators and change initiators receive real-time feedback if any of their configuration changes violate any of the policies set for any given device.
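The rule-library idea in this record can be sketched as predicates over a device configuration: each rule checks one policy, and an audit reports every rule a configuration violates. The rule names and the config syntax below are invented for illustration; they are not the paper's actual rule library.

```python
# Minimal policy-audit sketch: each rule is (name, predicate over config text).
import re

RULES = [
    ("no-telnet", lambda cfg: "transport input telnet" not in cfg),
    ("ssh-v2-only", lambda cfg: re.search(r"^ip ssh version 2$", cfg, re.M) is not None),
    ("banner-present", lambda cfg: "banner motd" in cfg),
]

def audit(config_text):
    """Return the names of all rules violated by one device configuration."""
    return [name for name, ok in RULES if not ok(config_text)]

config = """\
ip ssh version 2
line vty 0 4
 transport input telnet
"""
print(audit(config))  # prints ['no-telnet', 'banner-present']
```

Running the audit on every submitted change, before it is pushed to the device, is what gives change initiators the real-time feedback described above.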

  1. Domotics – A Cost Effective Smart Home Automation System Using Wifi as Network Infrastructure

    Directory of Open Access Journals (Sweden)

    Abhinav Talgeri

    2014-08-01

    Full Text Available This paper describes an investigation into the potential for remote controlled operation of home automation (also called domotics) systems. It considers problems with their implementation, discusses possible solutions through various network technologies and indicates how to optimize the use of such systems. This paper emphasizes the design and prototype implementation of a new home automation system that uses WiFi technology as the network infrastructure connecting its parts. The proposed system is twofold: the first part is the software (web server), which presents the system core that manages, controls, and monitors users' homes. Users and the system administrator can manage the system code locally (LAN) or remotely (Internet). The second part is the hardware interface module, which provides an appropriate interface to the sensors and actuators of the home automation system. Unlike most home automation systems available on the market, the proposed system is scalable in that one server can manage many hardware interface modules as long as they exist within WiFi network coverage.

  2. Project Management Phases of a SCADA System for Automation of Electrical Distribution Networks

    Directory of Open Access Journals (Sweden)

    Mohamed Najeh Lakhoua

    2012-03-01

    Full Text Available The aim of this paper is, firstly, to recall the basic concepts of SCADA (Supervisory Control And Data Acquisition) systems and to present the project management phases of SCADA for real-time implementation, and then to show the need for automation for Electricity Distribution Companies (EDC) on their distribution networks and the importance of using computer-based systems towards sustainable development of their services. A proposed computer-based power distribution automation system is then discussed. Finally, some projects of SCADA system implementation in electrical companies around the world are briefly presented.

  3. INSTRUMENTATION AND CONTROL FOR WIRELESS SENSOR NETWORK FOR AUTOMATED IRRIGATION

    Science.gov (United States)

    An in-field sensor-based irrigation system is of benefit to producers in efficient water management. A distributed wireless sensor network eliminates difficulties to wire sensor stations across the field and reduces maintenance cost. Implementing wireless sensor-based irrigation system is challengin...

  4. Computer automated movement detection for the analysis of behavior

    OpenAIRE

    Ramazani, Roseanna B.; Harish R Krishnan; BERGESON, SUSAN E.; Atkinson, Nigel S.

    2007-01-01

    Currently, measuring ethanol behaviors in flies depends on expensive image analysis software or time intensive experimenter observation. We have designed an automated system for the collection and analysis of locomotor behavior data, using the IEEE 1394 acquisition program dvgrab, the image toolkit ImageMagick and the programming language Perl. In the proposed method, flies are placed in a clear container and a computer-controlled camera takes pictures at regular intervals. Digital subtractio...

  5. On Automating and Standardising Corpus Callosum Analysis in Brain MRI

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Skoglund, Karl

    2005-01-01

    Corpus callosum analysis is influenced by many factors. The effort in controlling these has previously been incomplete and scattered. This paper sketches a complete pipeline for automated corpus callosum analysis from magnetic resonance images, with focus on measurement standardisation. The presented pipeline deals with i) estimation of the mid-sagittal plane, ii) localisation and registration of the corpus callosum, iii) parameterisation and representation of its contour, and iv) means of standardising the traditional reference area measurements.

  6. Initial Flight Results for an Automated Satellite Beacon Health Monitoring Network

    OpenAIRE

    Young, Anthony; Kitts, Christopher; Neumann, Michael; Mas, Ignacio; Rasay, Mike

    2010-01-01

    Beacon monitoring is an automated satellite health monitoring architecture that combines telemetry analysis, periodic low data rate message broadcasts by a spacecraft, and automated ground reception and data handling in order to implement a cost-effective anomaly detection and notification capability for spacecraft missions. Over the past two decades, this architecture has been explored and prototyped for a range of spacecraft mission classes to include use on NASA deep space probes, military...

  7. A Knowledge-Based Strategy for the Automated Support to Network Management Tasks

    Science.gov (United States)

    Abar, Sameera; Kinoshita, Tetsuo

    This paper presents a domain-ontology-driven, multi-agent-based scheme for representing the knowledge of a communication network management system. In the proposed knowledge-intensive framework, the static domain-related concepts are articulated as the domain knowledge ontology. The experiential knowledge for managing the network is represented as fault-case reasoning models and is explicitly encoded in the core knowledge of the multi-agent middleware layer as heuristic production-type rules. This task-oriented management expertise manipulates the domain content and structure during diagnostic sessions. The agents' rules, along with the embedded generic Java-based problem-solving algorithms and run-time log information, perform the automated management tasks. For proof of concept, an experimental network system has been implemented in our laboratory, and some test-bed scenarios have been deployed. Experimental results confirm a marked reduction in the management overhead of the network administrator, as compared to manual network management techniques, in terms of the time taken and effort expended during a particular fault-diagnosis session. Validation of the reusability and modifiability aspects of our system illustrates the flexible manipulation of the knowledge fragments within diverse application contexts. The proposed approach can be regarded as one of the pioneering steps towards representing network knowledge via a reusable domain ontology and intelligent agents for automated network management support systems.

  8. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  9. Automated Protein Assay Using Flow Injection Analysis

    Science.gov (United States)

    Wolfe, Carrie A. C.; Oates, Matthew R.; Hage, David S.

    1998-08-01

    The technique of flow injection analysis (FIA) is a common instrumental method used in detecting a variety of chemical and biological agents. This paper describes an undergraduate laboratory that uses FIA to perform a bicinchoninic acid (BCA) colorimetric assay for quantitating protein samples. The method requires less than 2 min per sample injection and gives a response over a broad range of protein concentrations. This method can be used in instrumental analysis labs to illustrate the principles and use of FIA, or as a means for introducing students to common methods employed in the analysis of biological agents.
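The quantitation behind a BCA assay like the one in this record is a calibration curve: fit a straight line of absorbance against the concentrations of protein standards, then invert it for an unknown. The standard values below are illustrative only, not the laboratory's data.

```python
# Least-squares calibration line and its inversion, in pure Python.
def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line y = m*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def concentration(absorbance, slope, intercept):
    """Invert the calibration line for an unknown sample."""
    return (absorbance - intercept) / slope

# Hypothetical BSA standards: concentration (ug/mL) vs. measured absorbance.
std_conc = [0, 125, 250, 500, 1000]
std_abs = [0.02, 0.15, 0.28, 0.54, 1.06]
m, b = fit_line(std_conc, std_abs)
print(round(concentration(0.41, m, b), 1))  # prints 375.0
```

In the FIA setup, each injection supplies one absorbance reading, so the same two functions cover both the standards and the unknowns.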

  10. Automating Deep Space Network scheduling and conflict resolution

    Science.gov (United States)

    Johnston, Mark D.; Clement, Bradley

    2005-01-01

    The Deep Space Network (DSN) is a central part of NASA's infrastructure for communicating with active space missions, from Earth orbit to beyond the solar system. We describe our recent work in modeling the complexities of user requirements, and then scheduling and resolving conflicts on that basis. We emphasize our innovative use of background 'intelligent assistants' that carry out search asynchronously while the user is focusing on various aspects of the schedule.

  11. Automated Synthesis of Skew-Based Clock Distribution Networks

    OpenAIRE

    José Luis Neves; Eby G. Friedman

    1998-01-01

    In this paper a top-down methodology is presented for synthesizing clock distribution networks based on application-dependent localized clock skew. The methodology is divided into four phases: 1) determination of an optimal clock skew schedule for improving circuit performance and reliability; 2) design of the topology of the clock tree based on the circuit hierarchy and minimum clock path delays; 3) design of circuit structures to implement the delay values associated with the branches of th...

  12. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    Science.gov (United States)

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R.; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manual gating, and sample classification to determine if analysis pipelines can identify characteristics that correlate with external variables (e.g., clinical outcome). This analysis presents the results of the first of these challenges. Several methods performed well compared to manual gating or external variables using statistical performance measures, suggesting that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis. PMID:23396282
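Automated "gating" in its simplest form is an unsupervised split of events into populations. The toy 1-D k-means below (k = 2) stands in for the far richer multidimensional methods compared in FlowCAP; the fluorescence values are invented for illustration.

```python
# Toy 1-D k-means with k = 2: the simplest possible automated gate.
def kmeans_1d(values, iters=25):
    lo, hi = min(values), max(values)
    c = [lo, hi]  # initialize the two centers at the extremes
    for _ in range(iters):
        groups = ([], [])
        for v in values:
            # index 1 (True) when v is strictly closer to the second center
            groups[abs(v - c[1]) < abs(v - c[0])].append(v)
        c = [sum(g) / len(g) if g else c[i] for i, g in enumerate(groups)]
    return c

# Hypothetical fluorescence intensities: a dim and a bright population.
events = [0.9, 1.1, 1.0, 0.8, 5.1, 4.9, 5.0, 5.2]
centers = kmeans_1d(events)
print([round(x, 2) for x in sorted(centers)])  # prints [0.95, 5.05]
```

Real FCM data are multidimensional and the populations rarely this well separated, which is exactly why the FlowCAP challenges benchmark methods against expert manual gates.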

  13. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    Directory of Open Access Journals (Sweden)

    Tianhong Song

    2014-10-01

    Full Text Available Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfalls. For example, static analysis before execution can be used to detect the potential problems in a workflow and help the user to improve workflow design. In this paper, we propose a declarative workflow approach that supports semi-automated workflow design, analysis and optimization. We show how the workflow design engine helps users to construct data curation workflows, how the workflow analysis engine detects different design problems of workflows and how workflows can be optimized by exploiting parallelism.

  14. Multifractal analysis of complex networks

    OpenAIRE

    Wang, Dan-Ling; Yu, Zu-Guo; Van Anh, Vo

    2011-01-01

    Complex networks have recently attracted much attention in diverse areas of science and technology. Many networks such as the WWW and biological networks are known to display spatial heterogeneity which can be characterized by their fractal dimensions. Multifractal analysis is a useful way to systematically describe the spatial heterogeneity of both theoretical and experimental fractal patterns. In this paper, we introduce a new box covering algorithm for multifractal analysis of complex netw...

  15. Network topology analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Kalb, Jeffrey L.; Lee, David S.

    2008-01-01

    Emerging high-bandwidth, low-latency network technology has made network-based architectures both feasible and potentially desirable for use in satellite payload architectures. The selection of network topology is a critical component when developing these multi-node or multi-point architectures. This study examines network topologies and their effect on overall network performance. Numerous topologies were reviewed against a number of performance, reliability, and cost metrics. This document identifies a handful of good network topologies for satellite applications and the metrics used to justify them as such. Since often multiple topologies will meet the requirements of the satellite payload architecture under development, the choice of network topology is not easy, and in the end the choice of topology is influenced by both the design characteristics and requirements of the overall system and the experience of the developer.
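The performance metrics in a topology study like this one can be computed directly from the graph. As a sketch, the code below compares two classic candidate topologies, a ring and a star, on one such metric: worst-case hop count (network diameter), via breadth-first search. The node counts are arbitrary examples.

```python
# Network diameter (worst-case hop count) via BFS, for two toy topologies.
from collections import deque

def diameter(adj):
    def eccentricity(src):
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        return max(dist.values())
    return max(eccentricity(n) for n in adj)

def ring(n):
    return {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}

def star(n):  # node 0 is the hub
    adj = {0: list(range(1, n))}
    adj.update({i: [0] for i in range(1, n)})
    return adj

print(diameter(ring(8)), diameter(star(8)))  # prints 4 2
```

The star wins on hop count but concentrates traffic and failure risk at the hub, illustrating why the study weighs reliability and cost metrics alongside raw performance.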

  16. Automating with ROBOCOM. An expert system for complex engineering analysis

    International Nuclear Information System (INIS)

    Nuclear engineering analysis is automated with the help of preprocessors and postprocessors. All the analysis and processing steps are recorded in a form that is reportable and replayable. These recordings serve both as documentation and as robots, for they are capable of performing the analyses they document. Since the processors and robots in ROBOCOM interface with users in a way independent of the analysis program being used, it is now possible to unify input modeling for programs with similar functionality. ROBOCOM will eventually evolve into an encyclopedia of how every nuclear engineering analysis is performed

  17. The automation of analysis of technological process effectiveness

    Directory of Open Access Journals (Sweden)

    B. Krupińska

    2007-10-01

    Full Text Available Purpose: Improvement of technological processes through technological efficiency analysis can create the basis for their optimization. Informatization and computerization of a wider and wider scope of activity is one of the most important current development trends of an enterprise. Design/methodology/approach: Appointing indicators makes it possible to evaluate process efficiency, which can constitute the optimization basis of a particular operation. The model of technological efficiency analysis is based on particular efficiency indicators that characterize an operation, taking into account the following criteria: operation-material, operation-machine, operation-human, operation-technological parameters. Findings: From the point of view of quality and correctness of the chosen technology, comprehensive assessment of technological processes makes up the basis of technological efficiency analysis. Results of the technological efficiency analysis of a technological process prove that the chosen model makes it possible to improve the process continuously through technological analysis; the application of computer assistance makes it possible to automate the efficiency analysis and, finally, achieve controlled improvement of technological processes. Practical implications: Due to the complexity of technological efficiency analysis, an AEPT computer analysis was created, which yields: operation efficiency indicators (with distinguished indicators having minimal acceptable values), efficiency values of the applied samples, and the value of technological process efficiency. Originality/value: The created computer analysis of technological process efficiency (AEPT) makes it possible to automate the process of analysis and optimization.

  18. Tank Farm Operations Surveillance Automation Analysis

    International Nuclear Information System (INIS)

    The Nuclear Operations Project Services identified the need to improve manual tank farm surveillance data collection, review, distribution and storage practices often referred to as Operator Rounds. This document provides the analysis in terms of feasibility to improve the manual data collection methods by using handheld computer units, barcode technology, a database for storage and acquisitions, associated software, and operational procedures to increase the efficiency of Operator Rounds associated with surveillance activities

  19. Tank Farm Operations Surveillance Automation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    MARQUEZ, D.L.

    2000-12-21

    The Nuclear Operations Project Services identified the need to improve manual tank farm surveillance data collection, review, distribution and storage practices often referred to as Operator Rounds. This document provides the analysis in terms of feasibility to improve the manual data collection methods by using handheld computer units, barcode technology, a database for storage and acquisitions, associated software, and operational procedures to increase the efficiency of Operator Rounds associated with surveillance activities.

  20. Micro photometer's automation for quantitative spectrograph analysis

    International Nuclear Information System (INIS)

    A microphotometer is used to increase the sharpness of dark spectral lines. By analyzing these lines, a sample's contents and their concentrations can be determined; this analysis is known as quantitative spectrographic analysis. Quantitative spectrographic analysis is carried out in three steps, as follows. 1. Emulsion calibration. This consists of gauging a photographic emulsion to determine the intensity variations in terms of the incident radiation. In the emulsion calibration procedure, a least-squares fit to the obtained data is applied to obtain a graph, making it possible to determine the density of a dark spectral line as a function of the incident light intensity shown by the microphotometer. 2. Working curves. The values of known concentrations of an element are plotted against incident light intensity. Since the sample contains several elements, it is necessary to find a working curve for each of them. 3. Analytical results. The calibration curve and working curves are compared and the concentration of the studied element is determined. The automatic data acquisition, calculation, and reporting of results are done by means of a computer (PC) and a computer program. The signal-conditioning circuits have the function of delivering TTL (Transistor-Transistor Logic) levels to make communication between the microphotometer and the computer possible. Data calculation is done using a computer program
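The three steps in this record can be sketched end to end with invented numbers: a least-squares emulsion calibration turns measured line density into intensity, and a working curve for one element turns intensity into concentration. All data below are hypothetical.

```python
# End-to-end sketch of quantitative spectrographic analysis with made-up data.
def lstsq(xs, ys):
    """Least-squares line fit: return (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return m, my - m * mx

# Step 1: emulsion calibration (line density measured for known intensities).
dens, inten = [0.2, 0.5, 0.8, 1.1], [10.0, 25.0, 40.0, 55.0]
m1, b1 = lstsq(dens, inten)

# Step 2: working curve for one element (concentration vs. intensity).
w_inten, conc = [12.0, 19.0, 33.0, 61.0], [1.0, 2.0, 4.0, 8.0]
m2, b2 = lstsq(w_inten, conc)

# Step 3: an unknown line of density 0.65 -> intensity -> concentration.
unknown_intensity = m1 * 0.65 + b1
print(round(m2 * unknown_intensity + b2, 2))
```

The automation described in the record wraps exactly this chain: the PC reads densities from the microphotometer over the TTL interface, applies both fitted curves, and reports the concentration.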

  1. Automated optics inspection analysis for NIF

    International Nuclear Information System (INIS)

    The National Ignition Facility (NIF) is a high-energy laser facility comprised of 192 beamlines that house thousands of optics. These optics guide, amplify and tightly focus light onto a tiny target for fusion ignition research and high energy density physics experiments. The condition of these optics is key to the economic, efficient and maximally energetic performance of the laser. Our goal, and novel achievement, is to find on the optics any imperfections while they are tens of microns in size, track them through time to see if they grow and if so, remove the optic and repair the single site so the entire optic can then be re-installed for further use on the laser. This paper gives an overview of the image analysis used for detecting, measuring, and tracking sites of interest on an optic while it is installed on the beamline via in situ inspection and after it has been removed for maintenance. In this way, the condition of each optic is monitored throughout the optic's lifetime. This overview paper will summarize key algorithms and technical developments for custom image analysis and processing and highlight recent improvements. (Associated papers will include more details on these issues.) We will also discuss the use of OI Analysis for daily operation of the NIF laser and its extension to inspection of NIF targets.

  2. User-friendly Establishment of Trust in Distributed Home Automation Networks

    DEFF Research Database (Denmark)

    Solberg Hjorth, Theis; Torbensen, Rune; Madsen, Per Printz

    2014-01-01

    Current wireless technologies use a variety of methods to locally exchange and verify credentials between devices to establish trusted relationships. Scenarios in home automation networks also require this capability over the Internet, but the necessary involvement of non-expert users to set up these relationships can lead to misconfiguration or breaches of security. We outline a security system for home automation called Trusted Domain that can establish and maintain cryptographically secure relationships between devices connected via IP-based networks and the Internet. Trust establishment is... with sequences of pre-defined pictograms. This method is designed to scale from smartphones and tablets down to low-resource embedded systems. The presented approach is supported by an extensive literature study, and the ease of use and feasibility of the method have been investigated via user study and...

  3. Automated simultaneous analysis phylogenetics (ASAP): an enabling tool for phylogenomics

    Directory of Open Access Journals (Sweden)

    Lee Ernest K

    2008-02-01

    Full Text Available Abstract Background The availability of sequences from whole genomes to reconstruct the tree of life has the potential to enable the development of phylogenomic hypotheses in ways not previously possible. A significant bottleneck in the analysis of genomic-scale views of the tree of life is the time required for manual curation of genomic data into multi-gene phylogenetic matrices. Results To keep pace with the exponentially growing volume of molecular data in the genomic era, we have developed an automated technique, ASAP (Automated Simultaneous Analysis Phylogenetics), to assemble these multi-gene/multi-species matrices and to evaluate the significance of individual genes within the context of a given phylogenetic hypothesis. Conclusion Applications of ASAP may enable scientists to re-evaluate species relationships and to develop new phylogenomic hypotheses based on genome-scale data.
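
    The core curation step that ASAP automates, assembling per-gene alignments into one multi-gene/multi-species supermatrix, can be sketched as follows. This is a minimal illustration of supermatrix concatenation, not ASAP's actual implementation; taxa absent from a gene are padded with gap characters.

```python
def build_supermatrix(gene_alignments):
    """Concatenate per-gene alignments into one multi-gene matrix.
    gene_alignments: list of dicts {taxon: aligned_sequence}.
    Taxa missing from a gene are padded with '-' gaps."""
    taxa = sorted({t for aln in gene_alignments for t in aln})
    matrix = {t: [] for t in taxa}
    for aln in gene_alignments:
        length = len(next(iter(aln.values())))  # alignment column count
        for t in taxa:
            matrix[t].append(aln.get(t, "-" * length))
    return {t: "".join(parts) for t, parts in matrix.items()}

gene1 = {"A": "ACGT", "B": "ACGA"}   # taxon C not sampled for gene1
gene2 = {"A": "TT", "C": "TC"}       # taxon B not sampled for gene2
print(build_supermatrix([gene1, gene2]))
```

    Real pipelines would additionally track per-gene partition boundaries so that the contribution of each gene to the resulting topology can be evaluated.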

  4. Automated reasoning applications to design validation and sneak function analysis

    International Nuclear Information System (INIS)

    Argonne National Laboratory (ANL) is actively involved in the LMFBR Man-Machine Integration (MMI) Safety Program. The objective of this program is to enhance the operational safety and reliability of fast-breeder reactors by optimum integration of men and machines through the application of human factors principles and control engineering to the design, operation, and the control environment. ANL is developing methods to apply automated reasoning and computerization in the validation and sneak function analysis process. This project provides the element definitions and relations necessary for an automated reasoner (AR) to reason about design validation and sneak function analysis. This project also provides a demonstration of this AR application on an Experimental Breeder Reactor-II (EBR-II) system, the Argonne Cooling System

  5. Extended -Regular Sequence for Automated Analysis of Microarray Images

    Directory of Open Access Journals (Sweden)

    Jin Hee-Jeong

    2006-01-01

    Full Text Available Microarray study enables us to obtain hundreds of thousands of expressions of genes or genotypes at once, and it is an indispensable technology for genome research. The first step is the analysis of scanned microarray images. This is the most important procedure for obtaining biologically reliable data. Currently most microarray image processing systems require burdensome manual block/spot indexing work. Since the amount of experimental data is increasing very quickly, automated microarray image analysis software becomes important. In this paper, we propose two automated methods for analyzing microarray images. First, we propose the extended -regular sequence to index blocks and spots, which enables a novel automatic gridding procedure. Second, we provide a methodology, hierarchical metagrid alignment, to allow reliable and efficient batch processing for a set of microarray images. Experimental results show that the proposed methods are more reliable and convenient than the commercial tools.

  6. [Automated recognition of quasars based on adaptive radial basis function neural networks].

    Science.gov (United States)

    Zhao, Mei-Fang; Luo, A-Li; Wu, Fu-Chao; Hu, Zhan-Yi

    2006-02-01

    Recognizing and certifying quasars through research on their spectra is an important task in astronomy. This paper presents a novel adaptive method for the automated recognition of quasars based on radial basis function neural networks (RBFN). The proposed method is composed of the following three parts: (1) the feature space is reduced by PCA (principal component analysis) on the normalized input spectra; (2) an adaptive RBFN is constructed and trained in this reduced space. At first, K-means clustering is used for initialization; then, based on the sum of squared errors and a gradient descent optimization technique, the number of neurons in the hidden layer is adaptively increased to improve recognition performance; (3) quasar spectra recognition is then effectively carried out by the trained RBFN. The proposed adaptive RBFN is shown not only to overcome the difficulty of selecting the number of hidden-layer neurons in the traditional RBFN algorithm, but also to increase the stability and accuracy of quasar recognition. Besides, the proposed method is particularly useful for the automatic processing of the voluminous spectra produced by a large-scale sky survey project, such as our LAMOST, due to its efficiency. PMID:16826929
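
    A simplified sketch of this pipeline (PCA reduction, clustering-based center selection, RBF hidden layer, least-squares output weights) on synthetic data is shown below. It omits the paper's adaptive neuron-addition step and uses farthest-point seeding instead of plain K-means initialization; all data and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def pca(X, k):
    """Project X onto its first k principal components."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def pick_centers(X, k, iters=20):
    """Farthest-point seeding followed by Lloyd iterations (simple k-means)."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = ((X[:, None] - np.array(centers)[None]) ** 2).sum(-1).min(1)
        centers.append(X[d.argmax()])
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        lab = ((X[:, None] - centers[None]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if (lab == j).any():
                centers[j] = X[lab == j].mean(0)
    return centers

def rbf_design(X, centers, width):
    """Gaussian activations of each sample at each hidden-layer center."""
    d2 = ((X[:, None] - centers[None]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

# Synthetic "spectra": two well-separated classes in 5 dimensions
X = np.vstack([rng.normal(0, 0.3, (50, 5)), rng.normal(2, 0.3, (50, 5))])
y = np.array([0] * 50 + [1] * 50)

Z = pca(X, 2)                        # reduced feature space
C = pick_centers(Z, 4)               # hidden-layer centers
Phi = rbf_design(Z, C, width=1.0)    # hidden-layer activations
W, *_ = np.linalg.lstsq(Phi, np.eye(2)[y], rcond=None)  # output weights
pred = (Phi @ W).argmax(axis=1)
print("training accuracy:", (pred == y).mean())
```

    On well-separated synthetic clusters like these, the fit should be essentially perfect; real spectra are noisier, which is what motivates the paper's adaptive growth of the hidden layer.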

  7. Automated analysis of Xe-133 pulmonary ventilation (AAPV) in children

    Science.gov (United States)

    Cao, Xinhua; Treves, S. Ted

    2011-03-01

    In this study, an automated analysis of pulmonary ventilation (AAPV) was developed to visualize the ventilation in pediatric lungs using dynamic Xe-133 scintigraphy. AAPV is a software algorithm that converts a dynamic series of Xe-133 images into four functional images: equilibrium, washout halftime, residual, and clearance rate, by analyzing pixel-based activity. Compared to conventional methods of calculating global or regional ventilation parameters, AAPV provides a visual representation of pulmonary ventilation function.
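
    For a mono-exponential washout model A(t) = A0·exp(-λt), the per-pixel washout half-time ln(2)/λ can be estimated by linear regression on log counts. The following is a minimal sketch on synthetic frames, not the AAPV implementation; frame times and the decay constant are illustrative.

```python
import numpy as np

def washout_halftime(series, times):
    """Per-pixel washout half-time from a dynamic frame series, assuming
    mono-exponential clearance A(t) = A0 * exp(-lam * t); lam comes from
    the slope of a per-pixel linear fit on log counts."""
    T, H, W = series.shape
    logs = np.log(series.reshape(T, -1))
    t = np.asarray(times, dtype=float)
    tc = t - t.mean()
    slope = (tc[:, None] * (logs - logs.mean(axis=0))).sum(axis=0) / (tc ** 2).sum()
    return (np.log(2) / -slope).reshape(H, W)

times = np.arange(6)                       # frame times, arbitrary units
frames = np.exp(-0.35 * times)[:, None, None] * np.ones((6, 2, 2))
ht = washout_halftime(frames, times)
print(round(float(ht[0, 0]), 3))           # ln(2)/0.35, roughly 1.98
```

    The other three functional images (equilibrium, residual, clearance rate) would be derived pixel-by-pixel from the same time-activity series in an analogous fashion.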

  8. RFI detection by automated feature extraction and statistical analysis

    OpenAIRE

    Winkel, Benjamin; Kerp, Juergen; Stanko, Stephan

    2006-01-01

    In this paper we present an interference detection toolbox consisting of a high dynamic range Digital Fast-Fourier-Transform spectrometer (DFFT, based on FPGA-technology) and data analysis software for automated radio frequency interference (RFI) detection. The DFFT spectrometer allows high speed data storage of spectra on time scales of less than a second. The high dynamic range of the device assures constant calibration even during extremely powerful RFI events. The software uses an algorit...

  9. Automated analysis of radiation damage on plastic surfaces

    International Nuclear Information System (INIS)

    This work concerns the analysis of damage done by radiation in a polymer such as acrylic, which is characterized by the optical properties of its polished surfaces, its uniformity and its chemical resistance; it is resistant up to temperatures of 150 degrees centigrade and weighs approximately half as much as glass. An objective of this work is the development of a method that analyzes, in an automated fashion, the surface damage induced by radiation in plastic materials by means of an image analyzer. (Author)

  10. Experience based ageing analysis of NPP protection automation in Finland

    International Nuclear Information System (INIS)

    This paper describes three successive studies on ageing of protection automation in nuclear power plants. These studies were aimed at developing a methodology for experience-based ageing analysis and applying it to identify the components most critical from the ageing and safety points of view. The analyses also resulted in suggestions for improving data collection systems for the purpose of further ageing analyses. (author)

  11. A Method of Automated Nonparametric Content Analysis for Social Science

    OpenAIRE

    Hopkins, Daniel J.; King, Gary

    2010-01-01

    The increasing availability of digitized text presents enormous opportunities for social scientists. Yet hand coding many blogs, speeches, government records, newspapers, or other sources of unstructured text is infeasible. Although computer scientists have methods for automated content analysis, most are optimized to classify individual documents, whereas social scientists instead want generalizations about the population of documents, such as the proportion in a given category. Unfortunatel...

  12. Administration of access rights to the corporate network with the integrated automation of the bank

    OpenAIRE

    Chaplyga, V.; Nyemkova, E.; Ivanishin, S.; Shandra, Z.

    2014-01-01

    The article is devoted to the administration of access rights under the Role-Based Access Control model in corporate banking networks. Four main functional roles are offered in accordance with the international standard CobiT. The problem of remote connectivity to information resources of banks, in terms of security, is similar to the BYOD problem. An algorithm for the automated control of remote access is proposed. The algorithm is based on the separation of the work area in ...

  13. Efficient representation for formal verification of time performances of networked automation architectures

    OpenAIRE

    Ruel, Silvain; De Smet, Olivier; Faure, Jean-Marc

    2008-01-01

    Networked automation architectures with Ethernet-based fieldbuses instead of traditional fieldbuses are increasingly used in industry, even for critical systems such as chemical or nuclear power plants. The strong safety requirements of these processes make it necessary to evaluate the time performances of these complex architectures. Formal verification techniques are promising solutions to reach this objective. Hence, this paper focuses on the applicability of formal verification techniques to ...

  14. WEB APPLICATION DEVELOPMENT FOR BUILDING AUTOMATION DEVICE (HEATING SYSTEM) IN LOCAL NETWORK

    OpenAIRE

    Shrestha, Jeveen

    2016-01-01

    ABSTRACT Oulu University of Applied Sciences Degree Programme in Information Technology Author: Jeveen Shrestha Title of the bachelor’s thesis: Web Application Development for Building Automation Device (Heating System) in Local Network Supervisor: Pekka Alaluukas Term and year of completion: Spring 2016 Pages: 37 After completing practical training at Ouman Oy in the summer of 2015, I was given a project to develop a web application that would communicate with their heati...

  15. Automated Cardiac Beat Classification Using RBF Neural Networks

    Directory of Open Access Journals (Sweden)

    Ali Khazaee

    2013-04-01

    Full Text Available This paper proposes a four-stage method (denoising, feature extraction, optimization and classification) for the detection of premature ventricular contractions. In the first stage, we investigate the application of wavelet denoising for noise reduction of multi-channel high resolution ECG signals. In this stage, the Stationary Wavelet Transform is used. The feature extraction module extracts ten ECG morphological features and one timing interval feature. Then a number of radial basis function (RBF) neural networks with different values of the spread parameter are designed, and their ability to classify three different classes of ECG signals is compared. A genetic algorithm is used to find the best values of the RBF parameters. A classification accuracy of 100% for the training dataset, 95.66% for the testing dataset, and an overall detection accuracy of 95.83% were achieved over seven files from the MIT/BIH arrhythmia database.

  16. Automated tumor analysis for molecular profiling in lung cancer.

    Science.gov (United States)

    Hamilton, Peter W; Wang, Yinhai; Boyd, Clinton; James, Jacqueline A; Loughrey, Maurice B; Hougton, Joseph P; Boyle, David P; Kelly, Paul; Maxwell, Perry; McCleary, David; Diamond, James; McArt, Darragh G; Tunstall, Jonathon; Bankhead, Peter; Salto-Tellez, Manuel

    2015-09-29

    The discovery and clinical application of molecular biomarkers in solid tumors increasingly relies on nucleic acid extraction from FFPE tissue sections and subsequent molecular profiling. This in turn requires the pathological review of haematoxylin & eosin (H&E) stained slides to ensure sample quality and tumor DNA sufficiency, by visually estimating the percentage of tumor nuclei, and tumor annotation for manual macrodissection. In this study on NSCLC, we demonstrate considerable variation in tumor nuclei percentage between pathologists, potentially undermining the precision of NSCLC molecular evaluation and emphasising the need for quantitative tumor evaluation. We subsequently describe the development and validation of a system called TissueMark for automated tumor annotation and percentage tumor nuclei measurement in NSCLC using computerized image analysis. Evaluation of 245 NSCLC slides showed precise automated tumor annotation of cases using TissueMark, strong concordance with manually drawn boundaries and identical EGFR mutational status, following manual macrodissection from the image analysis generated tumor boundaries. Automated analysis of cell counts for % tumor measurements by TissueMark showed reduced variability and significant correlation (p tissue samples for molecular profiling in discovery and diagnostics. PMID:26317646

  17. Automated eddy current analysis of materials

    Science.gov (United States)

    Workman, Gary L.

    1991-01-01

    The use of eddy current techniques for characterizing flaws in graphite-based filament-wound cylindrical structures is described. A major emphasis was also placed upon incorporating artificial intelligence techniques into the signal analysis portion of the inspection process. Developing an eddy current scanning system using a commercial robot for inspecting graphite structures (and others) was a goal in the overall concept and is essential for the final implementation for the expert systems interpretation. Manual scans, as performed in the preliminary work here, do not provide sufficiently reproducible eddy current signatures to be easily built into a real time expert system. The expert systems approach to eddy current signal analysis requires that a suitable knowledge base exist in which correct decisions as to the nature of a flaw can be performed. A robotic workcell using eddy current transducers for the inspection of carbon filament materials with improved sensitivity was developed. Improved coupling efficiencies achieved with the E-probes and horseshoe probes are exceptional for graphite fibers. The eddy current supervisory system and expert system was partially developed on a MacIvory system. Continued utilization of finite element models for predetermining eddy current signals was shown to be useful in this work, both for understanding how electromagnetic fields interact with graphite fibers, and also for use in determining how to develop the knowledge base. Sufficient data was taken to indicate that the E-probe and the horseshoe probe can be useful eddy current transducers for inspecting graphite fiber components. The lacking component at this time is a large enough probe to have sensitivity in both the far and near field of a thick graphite epoxy component.

  18. Location-Based Self-Adaptive Routing Algorithm for Wireless Sensor Networks in Home Automation

    OpenAIRE

    Hong SeungHo; Li XiaoHui; Fang KangLing

    2011-01-01

    The use of wireless sensor networks in home automation (WSNHA) is attractive due to their characteristics of self-organization, high sensing fidelity, low cost, and potential for rapid deployment. Although the AODVjr routing algorithm in IEEE 802.15.4/ZigBee and other routing algorithms have been designed for wireless sensor networks, not all are suitable for WSNHA. In this paper, we propose a location-based self-adaptive routing algorithm for WSNHA called WSNHA-LBAR. It confines route disco...

  19. Tourism Destinations Network Analysis, Social Network Analysis Approach

    Directory of Open Access Journals (Sweden)

    2015-09-01

    Full Text Available The tourism industry is becoming one of the world's largest economic sectors, and is expected to become the world's leading industry by 2020. Previous studies have focused on several aspects of this industry, including sociology, geography, and tourism management and development, but have paid less attention to analytical and quantitative approaches. This study introduces some network analysis techniques and measures aimed at studying the structural characteristics of tourism networks. More specifically, it presents a methodology to analyze tourism destination networks. We apply the methodology to analyze Mazandaran's tourism destination network, one of the most famous tourism areas of Iran.
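
    Structural measures of a destination network of this kind can be computed with standard network analysis tooling. The following is a minimal sketch using networkx; the destination names and edges are illustrative, not the study's data.

```python
import networkx as nx

# Hypothetical destination network: nodes are destinations, edges are
# observed tourist flows between them (names are illustrative).
G = nx.Graph()
G.add_edges_from([
    ("Sari", "Babol"), ("Sari", "Ramsar"), ("Sari", "Chalus"),
    ("Babol", "Amol"), ("Amol", "Chalus"), ("Ramsar", "Chalus"),
])

deg = nx.degree_centrality(G)        # how connected each destination is
btw = nx.betweenness_centrality(G)   # brokerage between destinations
hub = max(deg, key=deg.get)
print(hub, round(deg[hub], 2))
```

    Degree centrality highlights hub destinations, while betweenness identifies destinations that bridge otherwise separate parts of the network, both of which bear on tourism planning.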

  20. Prevalence of discordant microscopic changes with automated CBC analysis

    Directory of Open Access Journals (Sweden)

    Fabiano de Jesus Santos

    2014-12-01

    Full Text Available Introduction:The most common cause of diagnostic error is related to errors in laboratory tests as well as errors in the interpretation of results. In order to reduce them, laboratories currently have modern equipment which provides accurate and reliable results. The development of automation has revolutionized laboratory procedures in Brazil and worldwide.Objective:To determine the prevalence of microscopic changes present in blood slides that are concordant or discordant with results obtained by fully automated procedures.Materials and method:From January to July 2013, 1,000 slides of hematological parameters were analyzed. Automated analysis was performed on last-generation equipment, whose methodology is based on electrical impedance and which is able to quantify all the formed elements of the blood in a universe of 22 parameters. The microscopy was performed simultaneously by two experts in microscopy.Results:The data showed that only 42.70% were concordant, compared with 57.30% discordant. The main findings among the discordant were: changes in red blood cells 43.70% (n = 250), white blood cells 38.46% (n = 220), and platelet counts 17.80% (n = 102).Discussion:The data show that some results are not consistent with the clinical or physiological state of an individual and cannot be explained because they have not been investigated, which may compromise the final diagnosis.Conclusion:It was observed that it is of fundamental importance that qualitative microscopic analysis be performed in parallel with automated analysis in order to obtain reliable results, causing a positive impact on prevention, diagnosis, prognosis, and therapeutic follow-up.

  1. Community analysis in social networks

    OpenAIRE

    Arenas, Alex; Danon, Leon; Diaz-Guilera, Albert; Gleiser, Pablo M.; Guimera, Roger

    2003-01-01

    We present an empirical study of different social networks obtained from digital repositories. Our analysis reveals the community structure and provides a useful visualising technique. We investigate the scaling properties of the community size distribution, and find that all the networks exhibit power law scaling in the community size distributions, with exponent either -0.5 or -1. Finally we find that the networks' community structure is topologically self-similar using the Horton-Strahler i...
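
    A common way to estimate such a power-law exponent is a least-squares fit in log-log space. The sketch below, run on synthetic community sizes drawn from P(s) ∝ s^-1, is illustrative only; the paper's actual fitting procedure is not described here.

```python
import numpy as np

def powerlaw_exponent(sizes):
    """Estimate alpha in P(s) ~ s**alpha by least squares on the
    log-log histogram of observed community sizes."""
    vals, counts = np.unique(sizes, return_counts=True)
    x = np.log(vals)
    y = np.log(counts / counts.sum())
    alpha, _ = np.polyfit(x, y, 1)
    return alpha

rng = np.random.default_rng(1)
s = np.arange(1, 65)
p = (1.0 / s) / (1.0 / s).sum()          # P(s) ~ s^-1 over s = 1..64
sizes = rng.choice(s, size=20000, p=p)   # synthetic community sizes
print(round(powerlaw_exponent(sizes), 2))
```

    The recovered exponent should fall close to -1 for this synthetic sample; for heavy-tailed empirical data, maximum-likelihood estimators are generally preferred over naive log-log regression.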

  2. Automated morphological classification of APM galaxies by supervised artificial neural networks

    CERN Document Server

    Naim, A; Lahav, O; Sodré, L; Storrie-Lombardi, M C

    1995-01-01

    We train Artificial Neural Networks to classify galaxies based solely on the morphology of the galaxy images as they appear on blue survey plates. The images are reduced and morphological features such as bulge size and the number of arms are extracted, all in a fully automated manner. The galaxy sample was first classified by 6 independent experts. We use several definitions for the mean type of each galaxy, based on those classifications. We then train and test the network on these features. We find that the rms error of the network classifications, as compared with the mean types of the expert classifications, is 1.8 Revised Hubble Types. This is comparable to the overall rms dispersion between the experts. This result is robust and almost completely independent of the network architecture used.

  3. Design of Networked Home Automation System Based on μCOS-II and AMAZON

    Directory of Open Access Journals (Sweden)

    Liu Jianfeng

    2015-01-01

    Full Text Available In recent years, with the popularity of computers and smart phones and the development of intelligent buildings in the electronics industry, people's requirements for their living environment have gradually changed, and intelligent home buildings have become a new focus for buyers. A networked home automation system relies on advanced network technology to connect air conditioning, lighting, security, curtains, TV, water heaters and other home subsystems into a local area network, forming a networked control system. μC/OS is a real-time operating system with free open-source code, a compact structure and a preemptive real-time kernel. In this paper, the author focuses on the design of a central home controller based on the AMAZON multimedia processor and the μC/OS-II real-time operating system, which achieves remote access and control through Ethernet.

  4. Network Topology Availability Analysis

    Directory of Open Access Journals (Sweden)

    N. Krajnovic

    2011-06-01

    Full Text Available In this paper the availabilities of some network topologies are analyzed and calculated. Software based on an exact all-terminal graph reduction algorithm is developed and applied. Results, conclusions and solutions are presented, demonstrated and discussed.
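
    Exact all-terminal availability can be computed for small topologies by enumerating link states; graph-reduction algorithms like the one mentioned above reach the same values far more efficiently. The brute-force sketch below is illustrative, not the authors' software.

```python
from itertools import product

def all_terminal_availability(nodes, edges, p):
    """Probability that every node can reach every other node when each
    link is independently available with probability p.
    Brute force over all 2^|edges| link states; small topologies only."""
    def connected(up):
        seen, stack = {nodes[0]}, [nodes[0]]
        while stack:
            u = stack.pop()
            for a, b in up:
                for x, y in ((a, b), (b, a)):
                    if x == u and y not in seen:
                        seen.add(y)
                        stack.append(y)
        return len(seen) == len(nodes)

    total = 0.0
    for states in product((True, False), repeat=len(edges)):
        if connected([e for e, s in zip(edges, states) if s]):
            prob = 1.0
            for s in states:
                prob *= p if s else 1.0 - p
            total += prob
    return total

# 3-node ring: closed form is 3*p**2 - 2*p**3
ring = all_terminal_availability(["a", "b", "c"],
                                 [("a", "b"), ("b", "c"), ("c", "a")], 0.9)
print(round(ring, 3))   # 0.972
```

    For a 3-node ring with link availability p = 0.9, the closed form 3p² - 2p³ gives the same 0.972, which is a useful sanity check on the enumeration.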

  5. Automation of Large-scale Computer Cluster Monitoring Information Analysis

    Science.gov (United States)

    Magradze, Erekle; Nadal, Jordi; Quadt, Arnulf; Kawamura, Gen; Musheghyan, Haykuhi

    2015-12-01

    High-throughput computing platforms consist of a complex infrastructure and provide a number of services prone to failures. To mitigate the impact of failures on the quality of the provided services, constant monitoring and timely reaction are required, which is impossible without automation of the system administration processes. This paper introduces a way of automating the analysis of monitoring information to provide long- and short-term predictions of the service response time (SRT) for mass storage and batch systems and to identify the status of a service at a given time. The approach for the SRT predictions is based on the Adaptive Neuro Fuzzy Inference System (ANFIS). An evaluation of the approaches is performed on real monitoring data from the WLCG Tier 2 center GoeGrid. Ten-fold cross validation results demonstrate the high efficiency of both approaches in comparison to known methods.

  6. Automated Construction of Node Software Using Attributes in a Ubiquitous Sensor Network Environment

    Directory of Open Access Journals (Sweden)

    JangMook Kang

    2010-09-01

    Full Text Available In sensor networks, nodes must often operate in a demanding environment facing restrictions such as limited computing resources, unreliable wireless communication and power shortages. Such factors make the development of ubiquitous sensor network (USN applications challenging. To help developers construct a large amount of node software for sensor network applications easily and rapidly, this paper proposes an approach to the automated construction of node software for USN applications using attributes. In the proposed technique, application construction proceeds by first developing a model for the sensor network and then designing node software by setting the values of the predefined attributes. After that, the sensor network model and the design of node software are verified. The final source code of the node software is automatically generated from the sensor network model. We illustrate the efficiency of the proposed technique by using a gas/light monitoring application through a case study of a Gas and Light Monitoring System based on the Nano-Qplus operating system. We evaluate the technique using a quantitative metric—the memory size of execution code for node software. Using the proposed approach, developers are able to easily construct sensor network applications and rapidly generate a large number of node software modules at a time in a ubiquitous sensor network environment.

  7. Automated construction of node software using attributes in a ubiquitous sensor network environment.

    Science.gov (United States)

    Lee, Woojin; Kim, Juil; Kang, JangMook

    2010-01-01

    In sensor networks, nodes must often operate in a demanding environment facing restrictions such as limited computing resources, unreliable wireless communication and power shortages. Such factors make the development of ubiquitous sensor network (USN) applications challenging. To help developers construct a large amount of node software for sensor network applications easily and rapidly, this paper proposes an approach to the automated construction of node software for USN applications using attributes. In the proposed technique, application construction proceeds by first developing a model for the sensor network and then designing node software by setting the values of the predefined attributes. After that, the sensor network model and the design of node software are verified. The final source code of the node software is automatically generated from the sensor network model. We illustrate the efficiency of the proposed technique by using a gas/light monitoring application through a case study of a Gas and Light Monitoring System based on the Nano-Qplus operating system. We evaluate the technique using a quantitative metric: the memory size of execution code for node software. Using the proposed approach, developers are able to easily construct sensor network applications and rapidly generate a large number of node software modules at a time in a ubiquitous sensor network environment. PMID:22163678
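
    The attribute-driven generation step can be illustrated with a simple template renderer: attribute values set at design time are substituted into a code skeleton. The template and attribute names below are hypothetical, not actual Nano-Qplus APIs.

```python
# Hypothetical attribute-to-code template; function names are
# illustrative, not real Nano-Qplus calls.
TEMPLATE = """\
/* auto-generated node software (sketch) */
void node_setup(void) {{
    sensor_init({sensor});
    set_sampling_period_ms({period_ms});
    radio_set_channel({channel});
}}
"""

def generate_node_source(attributes):
    """Render node source code from predefined attribute values."""
    return TEMPLATE.format(**attributes)

src = generate_node_source({"sensor": "GAS_SENSOR",
                            "period_ms": 500,
                            "channel": 11})
print(src)
```

    A production generator would also validate the attribute values against the verified sensor network model before emitting code, as the paper describes.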

  8. Enhancing the Authentication of Bank Cheque Signatures by Implementing Automated System Using Recurrent Neural Network

    Directory of Open Access Journals (Sweden)

    Mukta Rao

    2009-07-01

    Full Text Available The associative memory feature of the Hopfield-type recurrent neural network is used for pattern storage and pattern authentication. This paper outlines an optimization relaxation approach for signature verification based on the Hopfield neural network (HNN), which is a recurrent network. The standard sample signature of the customer is cross-matched with the one supplied on the cheque. The difference percentage is obtained by counting the differing pixels in the two images. The network topology is built so that each pixel in the difference image is a neuron in the network. Each neuron is characterized by its state, which signifies whether the particular pixel has changed. The network converges to a stable condition based on the energy function derived in experiments. Hopfield's model allows each node to take on two binary state values (changed/unchanged) for each pixel. The performance of the proposed technique is evaluated by applying it to various binary and gray scale images. This paper contributes an automated scheme for verification of authentic signatures on bank cheques. The derived energy function allows a trade-off between the influence of a neuron's neighborhood and its own criterion. This device is able to recall as well as complete partially specified inputs. The network is trained via a storage prescription that forces stable states to correspond to (local) minima of a network “energy” function.
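
    The Hebbian storage prescription and relaxation to a stable state can be sketched as follows. The patterns and sizes are illustrative, and the update rule is a standard synchronous Hopfield update rather than the paper's exact formulation.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian weight matrix for bipolar (+1/-1) patterns, zero diagonal."""
    n = patterns.shape[1]
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0.0)
    return W / n

def recall(W, state, steps=10):
    """Synchronous updates until a stable state (or the step limit)."""
    s = state.copy()
    for _ in range(steps):
        new = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(new, s):
            break
        s = new
    return s

stored = np.array([[1, 1, 1, -1, -1, -1],
                   [1, -1, 1, -1, 1, -1]])
W = train_hopfield(stored)
noisy = np.array([1, 1, -1, -1, -1, -1])   # stored[0] with one pixel flipped
print(recall(W, noisy))                    # relaxes back to stored[0]
```

    This recall-from-corruption behavior is what lets the network complete partially specified difference images, as the abstract notes.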

  9. Handbook of Network Analysis [KONECT -- the Koblenz Network Collection

    OpenAIRE

    Kunegis, Jérôme

    2014-01-01

    This is the Handbook of Network Analysis, the companion article to the KONECT (Koblenz Network Collection) project. This project is intended to collect network datasets, analyse them systematically, and provide both datasets and the underlying network analysis code to researchers. This article outlines the project, gives all definitions used within the project, reviews all network statistics used, reviews all network plots used, and gives a brief overview of the API used by KONECT.

  10. Web Services Dependency Networks Analysis

    CERN Document Server

    Cherifi, Chantal; Santucci, Jean-François

    2013-01-01

    Along with a continuously growing number of publicly available Web services (WS), we are witnessing a rapid development in semantic-related web technologies, which has led to the appearance of semantically described WS. In this work, we perform a comparative analysis of the syntactic and semantic approaches used to describe WS, from a complex network perspective. First, we extract syntactic and semantic WS dependency networks from a collection of publicly available WS descriptions. Then, we take advantage of tools from the complex network field to analyze them and determine their topological properties. We show WS dependency networks exhibit some of the typical characteristics observed in real-world networks, such as small world and scale free properties, as well as community structure. By comparing syntactic and semantic networks through their topological properties, we show the introduction of semantics in WS description allows more accurate modeling of the dependencies between parameters, which in turn could l...

  11. Using historical wafermap data for automated yield analysis

    International Nuclear Information System (INIS)

    To be productive and profitable in a modern semiconductor fabrication environment, large amounts of manufacturing data must be collected, analyzed, and maintained. This includes data collected from in- and off-line wafer inspection systems and from the process equipment itself. This data is increasingly being used to design new processes, control and maintain tools, and to provide the information needed for rapid yield learning and prediction. Because of increasing device complexity, the amount of data being generated is outstripping the yield engineer's ability to effectively monitor and correct unexpected trends and excursions. The 1997 SIA National Technology Roadmap for Semiconductors highlights a need to address these issues through "automated data reduction algorithms to source defects from multiple data sources and to reduce defect sourcing time." SEMATECH and the Oak Ridge National Laboratory have been developing new strategies and technologies for providing the yield engineer with higher levels of assisted data reduction for the purpose of automated yield analysis. In this article, we will discuss the current state of the art and trends in yield management automation. copyright 1999 American Vacuum Society

  12. Highway Electrification And Automation Technologies - Regional Impacts Analysis Project: Phase I: Baseline Scenario Data Analysis

    OpenAIRE

    Scag; Path

    1993-01-01

    The Highway Electrification and Automation Technologies Regional Impacts Analysis Project addresses the transportation-related problems of freeway congestion, air pollution, and dependence on fossil fuels in southern California. This report presents a documentation of the basis for the impacts analysis. It contains sections on data collected, baseline forecast for 2025, and electrification and automation specification scenarios. This report constitutes the final report for Phase I of the proj...

  13. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    Science.gov (United States)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages such as: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form, 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards, 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results, 4. providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10 - Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program 2. ASTM F2815 - Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program 3. ASTM E2807 - Standard Specification for 3D Imaging Data Exchange, Version 1.0 The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the validity of solutions for equations included in test standards. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  14. AMDA: an R package for the automated microarray data analysis

    Directory of Open Access Journals (Sweden)

    Foti Maria

    2006-07-01

Full Text Available Abstract Background Microarrays are routinely used to assess mRNA transcript levels on a genome-wide scale. Large amounts of microarray data are now available in several databases, and new experiments are constantly being performed. In spite of this, few and limited tools exist for quickly and easily analyzing the results. Microarray analysis can be challenging for researchers without the necessary training, and it can be time-consuming for service providers with many users. Results To address these problems we have developed an automated microarray data analysis (AMDA) software, which provides scientists with an easy and integrated system for the analysis of Affymetrix microarray experiments. AMDA is free and is available as an R package. It is based on the Bioconductor project, which provides a number of powerful bioinformatics and microarray analysis tools. This automated pipeline integrates different functions available in the R and Bioconductor projects with newly developed functions. AMDA covers all of the steps of a full data analysis, including image analysis, quality control, normalization, selection of differentially expressed genes, clustering, correspondence analysis and functional evaluation. Finally, a LaTeX document is dynamically generated depending on the performed analysis steps. The generated report contains comments and analysis results as well as references to several files for deeper investigation. Conclusion AMDA is freely available as an R package under the GPL license. The package as well as an example analysis report can be downloaded in the Services/Bioinformatics section of the Genopolis http://www.genopolis.it/

  15. Computational Social Network Analysis

    CERN Document Server

    Hassanien, Aboul-Ella

    2010-01-01

Presents insight into the social behaviour of animals (including the study of animal tracks and learning by members of the same species). This book provides web-based evidence of social interaction, perceptual learning, information granulation, the behaviour of humans, and affinities between web-based social networks.

  16. Automation of Some Operations of a Wind Tunnel Using Artificial Neural Networks

    Science.gov (United States)

    Decker, Arthur J.; Buggele, Alvin E.

    1996-01-01

Artificial neural networks were used successfully to sequence operations in a small, recently modernized, supersonic wind tunnel at NASA Lewis Research Center. The neural nets generated correct estimates of shadowgraph patterns, pressure sensor readings and Mach numbers for conditions occurring shortly after startup and extending to fully developed flow. Artificial neural networks were trained and tested for estimating: sensor readings from shadowgraph patterns, shadowgraph patterns from shadowgraph patterns, and sensor readings from sensor readings. The 3.81 by 10 in. (0.0968 by 0.254 m) tunnel was operated with its Mach 2.0 nozzle, and shadowgraph was recorded near the nozzle exit. These results support the thesis that artificial neural networks can be combined with current workstation technology to automate wind tunnel operations.

  17. Routes visualization: Automated placement of multiple route symbols along a physical network infrastructure

    Directory of Open Access Journals (Sweden)

    Jules Teulade-Denantes

    2015-12-01

    Full Text Available This paper tackles the representation of routes carried by a physical network infrastructure on a map. In particular, the paper examines the case where each route is represented by a separate colored linear symbol offset from the physical network segments and from other routes—as on public transit maps with bus routes offset from roads. In this study, the objective is to automate the placement of such route symbols while maximizing their legibility, especially at junctions. The problem is modeled as a constraint optimization problem. Legibility criteria are identified and formalized as constraints to optimize, while focusing on the case of hiking routes in a physical network composed of roads and pedestrian paths. Two solving methods are tested, based on backtracking and simulated annealing meta-heuristics respectively. Encouraging results obtained on real data are presented and discussed.
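The optimization step described above can be sketched with a generic simulated-annealing loop. This is a toy illustration, not the paper's implementation: the cost function simply counts routes that share an edge but were assigned the same offset slot (a crude stand-in for the real legibility constraints), and all names and data below are hypothetical.

```python
import math
import random

def anneal(initial, cost, neighbor, t0=1.0, cooling=0.995, steps=5000, seed=0):
    """Generic simulated annealing: accept worse states with Boltzmann probability."""
    rng = random.Random(seed)
    state, c = initial, cost(initial)
    best, best_c = state, c
    t = t0
    for _ in range(steps):
        cand = neighbor(state, rng)
        cc = cost(cand)
        if cc < c or rng.random() < math.exp((c - cc) / max(t, 1e-12)):
            state, c = cand, cc
            if c < best_c:
                best, best_c = state, c
        t *= cooling
    return best, best_c

# Toy instance: 4 routes; routes sharing a physical edge need distinct offset slots.
shared_edges = [(0, 1), (1, 2), (0, 2), (2, 3)]

def conflicts(slots):
    return sum(1 for a, b in shared_edges if slots[a] == slots[b])

def perturb(slots, rng):
    s = list(slots)
    s[rng.randrange(len(s))] = rng.randrange(3)  # 3 available offset slots
    return s

best, c = anneal([0, 0, 0, 0], conflicts, perturb)
print(c)  # a conflict-free assignment (cost 0) is reachable with 3 slots
```

The same loop works for any cost/neighbor pair, which is why meta-heuristics of this kind suit legibility criteria that are easy to score but hard to optimize directly.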

  18. An automated method for analysis of microcirculation videos for accurate assessment of tissue perfusion

    Directory of Open Access Journals (Sweden)

    Demir Sumeyra U

    2012-12-01

Full Text Available Abstract Background Imaging of the human microcirculation in real-time has the potential to detect injuries and illnesses that disturb the microcirculation at earlier stages and may improve the efficacy of resuscitation. Despite advanced imaging techniques to monitor the microcirculation, there are currently no tools for the near real-time analysis of the videos produced by these imaging systems. An automated tool that can extract microvasculature information and quantitatively monitor changes in tissue perfusion would be invaluable as a diagnostic and therapeutic endpoint for resuscitation. Methods The experimental algorithm automatically extracts the microvascular network and quantitatively measures changes in the microcirculation. There are two main parts in the algorithm: video processing and vessel segmentation. Microcirculatory videos are first stabilized in a video processing step to remove motion artifacts. In the vessel segmentation process, the microvascular network is extracted using multiple level thresholding and pixel verification techniques. Threshold levels are selected using histogram information from a set of training video recordings. Pixel-by-pixel differences are calculated throughout the frames to identify active blood vessels and capillaries with flow. Results Sublingual microcirculatory videos were recorded from anesthetized swine at baseline and during hemorrhage using a hand-held Side-stream Dark Field (SDF) imaging device to track changes in the microvasculature during hemorrhage. Automatically segmented vessels in the recordings were analyzed visually, and the functional capillary density (FCD) values calculated by the algorithm were compared for both healthy baseline and hemorrhagic conditions. These results were compared to independently made FCD measurements using a well-known semi-automated method. Results of the fully automated algorithm demonstrated a significant decrease in FCD values. 
Similar, but more variable FCD
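The thresholding-plus-frame-differencing idea can be illustrated on tiny synthetic frames. This is a minimal sketch under assumed conventions (dark vessels on a bright background, a single global threshold rather than the paper's trained multi-level thresholds, and an invented variation criterion):

```python
def vessel_mask(frame, threshold):
    """Binary mask of candidate vessel pixels (dark vessels fall below threshold)."""
    return [[1 if px < threshold else 0 for px in row] for row in frame]

def active_pixels(frames, threshold, min_variation):
    """Flag vessel pixels whose intensity varies across frames (i.e. with flow)."""
    base = vessel_mask(frames[0], threshold)
    h, w = len(base), len(base[0])
    active = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [f[y][x] for f in frames]
            if base[y][x] and max(vals) - min(vals) >= min_variation:
                active[y][x] = 1
    return active

frames = [
    [[200, 40, 200], [200, 50, 200]],   # vessel column in the middle
    [[200, 90, 200], [200, 52, 200]],   # top vessel pixel fluctuates (flow)
]
act = active_pixels(frames, threshold=100, min_variation=20)
print(act)  # only the fluctuating vessel pixel is marked active
```

A functional-capillary-density proxy would then be the count of active pixels over the imaged area.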

  19. Attempt of automated space network operations at ETS-VI experimental data relay system

    Science.gov (United States)

    Ishihara, Kiyoomi; Sugawara, Masayuki

    1994-01-01

The National Space Development Agency of Japan (NASDA) will perform experimental operations to acquire the technology necessary for future inter-satellite communications configured with a data relay satellite. This paper overviews the functions of the experimental ground system which NASDA has developed for the Engineering Test Satellite VI (ETS-VI) Data Relay and Tracking Experiment, and introduces the Space Network System Operations Procedure (SNSOP) method with an example of a Ka-band Single Access (KSA) acquisition sequence. To reduce operational load, SNSOP is developed with the concept of automated control and monitoring of both the ground terminal and the data relay satellite. To perform acquisition and tracking operations smoothly, the information exchange with user spacecraft controllers is automated by SNSOP functions.

  20. Tourism Destinations Network Analysis, Social Network Analysis Approach

    OpenAIRE

    2015-01-01

The tourism industry is becoming one of the world's largest economic sectors, and is expected to become the world's leading industry by 2020. Previous studies have focused on several aspects of this industry, including sociology, geography, and tourism management and development, but have paid less attention to analytical and quantitative approaches. This study introduces some network analysis techniques and measures aimed at studying the structural characteristics of tourism networks. More speci...

  1. Topological analysis of telecommunications networks

    Directory of Open Access Journals (Sweden)

    Milojko V. Jevtović

    2011-01-01

Full Text Available A topological analysis of the structure of telecommunications networks is a very interesting topic in network research, and a key issue in their design and planning. Satisfying multiple criteria in terms of the locations of switching nodes as well as their connectivity with respect to the requests for capacity, transmission speed, reliability, availability and cost are the main research objectives. There are three ways of presenting the topology of telecommunications networks: the table, matrix or graph method. The table method is suitable for a network with a relatively small number of nodes in relation to the number of links. The matrix method involves the formation of a connection matrix in which columns present source traffic nodes and rows are the switching systems that belong to the destination. The graph method means that the network nodes are connected via bidirectional or unidirectional links. We can thus easily analyze the structural parameters of telecommunications networks. This paper presents the mathematical analysis of the star-, ring-, fully connected loop- and grid (matrix)-shaped topologies as well as the topology based on the shortest path tree. For each of these topologies, the expressions for determining the number of branches, the mean level of reliability, and the mean and average link lengths are given in tables. For the fully connected loop network with five nodes, the values of all topological parameters are calculated. Based on the topological parameters, the relationships that represent integral and distributed indicators of reliability are given in this work, as well as the values for the particular network. The main objectives of the topology optimization of telecommunications networks are: achieving minimum complexity, maximum capacity, the shortest message transfer path, the maximum speed of communication and maximum economy. The performance of telecommunications networks is
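The branch counts tabulated for the basic topologies follow directly from the node count n: a fully connected (complete) network has n(n-1)/2 links, a star has n-1, and a ring has n. A minimal sketch, using the paper's five-node fully connected case as the example:

```python
def complete_links(n):
    """Links in a fully connected (complete) network: every pair of nodes."""
    return n * (n - 1) // 2

def star_links(n):
    """Links in a star: one per non-central node."""
    return n - 1

def ring_links(n):
    """Links in a ring: one per node, closing the loop."""
    return n

# For the five-node fully connected loop network analysed in the paper:
print(complete_links(5), star_links(5), ring_links(5))  # 10 4 5
```

These counts are the starting point for the reliability and average-link-length expressions; the exact formulas for those indicators are in the paper's tables and are not reproduced here.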

  2. Automated differential photometry of TAOS data: preliminary analysis

    CERN Document Server

Ricci, D.; Ayala, C.; Ramón-Fox, F. G.; Michel, R.; Navarro, S.; Wang, S.-Y.; Zhang, Z.-W.; Lehner, M. J.; Nicastro, L.; Reyes-Ruiz, M.

    2014-01-01

A preliminary analysis of the stellar light curves obtained by the robotic telescopes of the TAOS project is presented. We selected a data run relative to one of the stellar fields observed by three of the four TAOS telescopes, and we investigated the common trend and the correlation between the light curves. We propose two ways to remove these trends and show the preliminary results. A project aimed at flagging interesting behaviors, such as stellar variability, and at setting up an automated follow-up with the San Pedro Mártir facilities is under way.
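One common way to remove a trend shared by all light curves is to divide each curve by the per-epoch median over all curves. The paper proposes two methods of its own; this sketch shows a generic median-comparison variant, not necessarily theirs, on invented data:

```python
def detrend(curves):
    """Divide each light curve by the per-epoch median over all curves,
    removing a multiplicative trend (e.g. varying sky transparency)."""
    n_epochs = len(curves[0])
    common = []
    for t in range(n_epochs):
        vals = sorted(c[t] for c in curves)
        m = len(vals)
        common.append(vals[m // 2] if m % 2 else 0.5 * (vals[m // 2 - 1] + vals[m // 2]))
    return [[c[t] / common[t] for t in range(n_epochs)] for c in curves]

# Three constant stars seen through the same varying transparency trend:
trend = [1.0, 0.8, 1.2, 0.9]
curves = [[f * s for f in trend] for s in (10.0, 20.0, 30.0)]
flat = detrend(curves)
print(flat[0])  # the shared trend divides out, leaving a flat curve
```

After detrending, residual variability in one curve but not the others is a candidate for flagging (e.g. genuine stellar variability).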

  3. Analysis and simulation of a torque assist automated manual transmission

    Science.gov (United States)

    Galvagno, E.; Velardocchia, M.; Vigliani, A.

    2011-08-01

The paper presents the kinematic and dynamic analysis of a power-shift automated manual transmission (AMT) characterised by a wet clutch, called the assist clutch (ACL), replacing the fifth gear synchroniser. This torque assist mechanism becomes a torque transfer path during gearshifts, in order to overcome a typical dynamic problem of AMTs, that is, driving force interruption. The mean power contributions during gearshifts are computed for different engine and ACL interventions, allowing conclusions useful for developing the control algorithms to be drawn. The simulation results prove the advantages in terms of gearshift quality and ride comfort of the analysed transmission.

  4. Automated image analysis in the study of collagenous colitis

    DEFF Research Database (Denmark)

    Fiehn, Anne-Marie Kanstrup; Kristensson, Martin; Engel, Ulla;

    2016-01-01

    PURPOSE: The aim of this study was to develop an automated image analysis software to measure the thickness of the subepithelial collagenous band in colon biopsies with collagenous colitis (CC) and incomplete CC (CCi). The software measures the thickness of the collagenous band on microscopic...... agreement between the four pathologists and the VG app was κ=0.71. CONCLUSION: In conclusion, the Visiopharm VG app is able to measure the thickness of a sub-epithelial collagenous band in colon biopsies with an accuracy comparable to the performance of a pathologist and thereby provides a promising...

  5. Automated kymograph analysis for profiling axonal transport of secretory granules.

    Science.gov (United States)

    Mukherjee, Amit; Jenkins, Brian; Fang, Cheng; Radke, Richard J; Banker, Gary; Roysam, Badrinath

    2011-06-01

    This paper describes an automated method to profile the velocity patterns of small organelles (BDNF granules) being transported along a selected section of axon of a cultured neuron imaged by time-lapse fluorescence microscopy. Instead of directly detecting the granules as in conventional tracking, the proposed method starts by generating a two-dimensional spatio-temporal map (kymograph) of the granule traffic along an axon segment. Temporal sharpening during the kymograph creation helps to highlight granule movements while suppressing clutter due to stationary granules. A voting algorithm defined over orientation distribution functions is used to refine the locations and velocities of the granules. The refined kymograph is analyzed using an algorithm inspired from the minimum set cover framework to generate multiple motion trajectories of granule transport paths. The proposed method is computationally efficient, robust to significant levels of noise and clutter, and can be used to capture and quantify trends in transport patterns quickly and accurately. When evaluated on a collection of image sequences, the proposed method was found to detect granule movement events with 94% recall rate and 82% precision compared to a time-consuming manual analysis. Further, we present a study to evaluate the efficacy of velocity profiling by analyzing the impact of oxidative stress on granule transport in which the fully automated analysis correctly reproduced the biological conclusion generated by manual analysis. PMID:21330183
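The kymograph construction itself, stripped of the paper's stabilization, sharpening and voting stages, reduces to sampling each frame's intensity along the chosen axon path so that position runs along one axis and time along the other. A minimal sketch on synthetic frames (all data invented):

```python
def kymograph(frames, path):
    """Rows = time, columns = position along the axon path.
    frames: list of 2D intensity arrays; path: list of (y, x) pixel coords."""
    return [[frame[y][x] for (y, x) in path] for frame in frames]

# A bright granule moving one pixel per frame along a 4-pixel path:
path = [(0, 0), (0, 1), (0, 2), (0, 3)]
frames = [[[255 if x == t else 0 for x in range(4)]] for t in range(3)]
kymo = kymograph(frames, path)
print(kymo)  # the moving granule traces a diagonal streak through the map
```

In such a map a moving granule appears as a sloped line whose slope is its velocity, which is what the later trajectory-extraction stages exploit.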

  6. Automated target recognition and tracking using an optical pattern recognition neural network

    Science.gov (United States)

    Chao, Tien-Hsin

    1991-01-01

The on-going development of an automatic target recognition and tracking system at the Jet Propulsion Laboratory is presented. This system is an optical pattern recognition neural network (OPRNN) that is an integration of an innovative optical parallel processor and a feature-extraction-based neural net training algorithm. The parallel optical processor provides high speed and vast parallelism as well as full shift invariance. The neural network algorithm enables simultaneous discrimination of multiple noisy targets in spite of their scales, rotations, perspectives, and various deformations. This fully developed OPRNN system can be effectively utilized for the automated spacecraft recognition and tracking that will lead to success in the Automated Rendezvous and Capture (AR&C) of the unmanned Cargo Transfer Vehicle (CTV). One of the most powerful optical parallel processors for automatic target recognition is the multichannel correlator. With the inherent advantages of parallel processing capability and shift invariance, multiple objects can be simultaneously recognized and tracked using this multichannel correlator. This target tracking capability can be greatly enhanced by utilizing a powerful feature-extraction-based neural network training algorithm such as the neocognitron. The OPRNN, currently under investigation at JPL, is constructed with an optical multichannel correlator where holographic filters have been prepared using the neocognitron training algorithm. The computation speed of the neocognitron-type OPRNN is up to 10^14 analog connections/sec, enabling the OPRNN to outperform its state-of-the-art electronic counterpart by at least two orders of magnitude.

  7. Components for automated microfluidics sample preparation and analysis

    Science.gov (United States)

    Archer, M.; Erickson, J. S.; Hilliard, L. R.; Howell, P. B., Jr.; Stenger, D. A.; Ligler, F. S.; Lin, B.

    2008-02-01

    The increasing demand for portable devices to detect and identify pathogens represents an interdisciplinary effort between engineering, materials science, and molecular biology. Automation of both sample preparation and analysis is critical for performing multiplexed analyses on real world samples. This paper selects two possible components for such automated portable analyzers: modified silicon structures for use in the isolation of nucleic acids and a sheath flow system suitable for automated microflow cytometry. Any detection platform that relies on the genetic content (RNA and DNA) present in complex matrices requires careful extraction and isolation of the nucleic acids in order to ensure their integrity throughout the process. This sample pre-treatment step is commonly performed using commercially available solid phases along with various molecular biology techniques that require multiple manual steps and dedicated laboratory space. Regardless of the detection scheme, a major challenge in the integration of total analysis systems is the development of platforms compatible with current isolation techniques that will ensure the same quality of nucleic acids. Silicon is an ideal candidate for solid phase separations since it can be tailored structurally and chemically to mimic the conditions used in the laboratory. For analytical purposes, we have developed passive structures that can be used to fully ensheath one flow stream with another. As opposed to traditional flow focusing methods, our sheath flow profile is truly two dimensional, making it an ideal candidate for integration into a microfluidic flow cytometer. Such a microflow cytometer could be used to measure targets captured on either antibody- or DNA-coated beads.

  8. Statistical analysis of brain network

    OpenAIRE

    Sala

    2013-01-01

    Recent developments in the complex networks analysis, based largely on graph theory, have been used to study the brain network organization. The brain is a complex system that can be represented by a graph. A graph is a mathematical representation which can be useful to study the connectivity of the brain. Nodes in the brain can be identified dividing its volume in regions of interest and links can be identified calculating a measure of dependence between pairs of regions whose ac...

  9. Automated quantitative analysis of ventilation-perfusion lung scintigrams

    International Nuclear Information System (INIS)

    An automated computer analysis of ventilation (Kr-81m) and perfusion (Tc-99m) lung images has been devised that produces a graphical image of the distribution of ventilation and perfusion, and of ventilation-perfusion ratios. The analysis has overcome the following problems: the identification of the midline between two lungs and the lung boundaries, the exclusion of extrapulmonary radioactivity, the superimposition of lung images of different sizes, and the format for presentation of the data. Therefore, lung images of different sizes and shapes may be compared with each other. The analysis has been used to develop normal ranges from 55 volunteers. Comparison of younger and older age groups of men and women show small but significant differences in the distribution of ventilation and perfusion, but no differences in ventilation-perfusion ratios
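The core quantity, a pixel-wise ventilation-perfusion ratio on count-normalised images, can be sketched as follows. This is a simplified illustration with invented 2x2 "images"; the actual analysis also handles midline detection, lung boundaries, exclusion of extrapulmonary activity, and superimposition of differently sized images:

```python
def vq_ratio(vent, perf, eps=1e-9):
    """Pixel-wise ventilation/perfusion ratio after normalising each image
    by its total counts, so matched distributions give ratios near 1."""
    v_tot = sum(sum(row) for row in vent)
    q_tot = sum(sum(row) for row in perf)
    return [[(v / v_tot) / (q / q_tot + eps) for v, q in zip(vr, qr)]
            for vr, qr in zip(vent, perf)]

vent = [[4, 4], [2, 2]]   # Kr-81m counts (toy values)
perf = [[2, 2], [1, 1]]   # Tc-99m counts (toy values)
ratios = vq_ratio(vent, perf)
print(ratios[0][0])  # ~1.0: the two distributions match everywhere
```

Normalising by total counts is what makes images from different patients and acquisition times comparable, which is also what permits the construction of normal ranges.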

  10. Automated system for load flow prediction in power substations using artificial neural networks

    Directory of Open Access Journals (Sweden)

    Arlys Michel Lastre Aleaga

    2015-09-01

Full Text Available The load flow is of great importance in assisting the process of decision making and planning of generation, distribution and transmission of electricity. Ignorance of the values of this indicator, as well as their inaccurate prediction, hinders decision making and the efficiency of the electricity service, and can cause undesirable situations such as unmet demand, overheating of the components that make up a substation, and incorrect planning of electricity generation and distribution. Given the need to predict the electric load flow of the substations in Ecuador, this research proposes the concept for the development of an automated prediction system employing Artificial Neural Networks.
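As a hedged illustration of the idea only (the abstract does not specify the system's architecture), even a one-neuron linear "network" fitted by gradient descent can learn a simple load-versus-hour relationship. All data below are invented:

```python
def train_linear(xs, ys, lr=0.1, epochs=500):
    """One-neuron linear model fitted by gradient descent on mean squared
    error -- a minimal stand-in for an ANN load-flow predictor."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        dw = sum((w * x + b - y) * x for x, y in zip(xs, ys)) / n
        db = sum((w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * dw
        b -= lr * db
    return w, b

# Hypothetical hourly loads growing linearly with hour of day (load = 2*hour + 5):
hours = [0.0, 1.0, 2.0, 3.0]
loads = [5.0, 7.0, 9.0, 11.0]
w, b = train_linear(hours, loads)
print(round(w, 2), round(b, 2))  # ≈ 2.0 5.0
```

A real load-flow predictor would add hidden layers and richer inputs (day of week, weather, historical demand), but the fit-by-gradient-descent loop is the same in spirit.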

  11. A New Modular Strategy For Action Sequence Automation Using Neural Networks And Hidden Markov Models

    OpenAIRE

    Mohamed Adel Taher; Mostapha Abdeljawad

    2013-01-01

In this paper, the authors propose a new hybrid strategy (using artificial neural networks and hidden Markov models) for skill automation. The strategy is based on the concept of using an "adaptive desired" that is introduced in the paper. The authors explain how using an adaptive desired can help a system for which an explicit model is not available or is difficult to obtain to smartly cope with environmental disturbances without requiring explicit rules specification (as with fuzzy syste...

  12. Urban Automation Networks: Current and Emerging Solutions for Sensed Data Collection and Actuation in Smart Cities.

    Science.gov (United States)

    Gomez, Carles; Paradells, Josep

    2015-01-01

    Urban Automation Networks (UANs) are being deployed worldwide in order to enable Smart City applications. Given the crucial role of UANs, as well as their diversity, it is critically important to assess their properties and trade-offs. This article introduces the requirements and challenges for UANs, characterizes the main current and emerging UAN paradigms, provides guidelines for their design and/or choice, and comparatively examines their performance in terms of a variety of parameters including coverage, power consumption, latency, standardization status and economic cost. PMID:26378534

  13. Urban Automation Networks: Current and Emerging Solutions for Sensed Data Collection and Actuation in Smart Cities

    Directory of Open Access Journals (Sweden)

    Carles Gomez

    2015-09-01

Full Text Available Urban Automation Networks (UANs) are being deployed worldwide in order to enable Smart City applications. Given the crucial role of UANs, as well as their diversity, it is critically important to assess their properties and trade-offs. This article introduces the requirements and challenges for UANs, characterizes the main current and emerging UAN paradigms, provides guidelines for their design and/or choice, and comparatively examines their performance in terms of a variety of parameters including coverage, power consumption, latency, standardization status and economic cost.

  14. Tri-Band PCB Antenna for Wireless Sensor Network Transceivers in Home Automation Applications

    DEFF Research Database (Denmark)

    Rohde, John; Toftegaard, Thomas Skjødeberg

    2012-01-01

    A novel tri-band antenna design for wireless sensor network devices in home automation applications is proposed. The design is based on a combination of a conventional monopole wire antenna and discrete distributed load impedances. The load impedances are employed to ensure the degrees of freedom...... necessary to obtain a simultaneous optimization of input impedance and current distribution on the antenna structure. A transmission line model is presented together with 3D finite-element simulation results and measurements on the physical antenna....

  15. Analysis of network by generalized mutual entropies

    OpenAIRE

    Gudkov, V.; Montealegre, V.

    2007-01-01

    Generalized mutual entropy is defined for networks and applied for analysis of complex network structures. The method is tested for the case of computer simulated scale free networks, random networks, and their mixtures. The possible applications for real network analysis are discussed.
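As a simpler relative of the generalized mutual entropies used in the paper, the ordinary Shannon entropy of a network's degree distribution already discriminates between structures such as scale-free and random networks. This sketch is illustrative only and is not the paper's measure:

```python
import math
from collections import Counter

def degree_entropy(edges, n_nodes):
    """Shannon entropy (bits) of a network's degree distribution."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    counts = Counter(deg.get(i, 0) for i in range(n_nodes))
    return -sum((c / n_nodes) * math.log2(c / n_nodes) for c in counts.values())

# A star network: one hub (degree 3) and three leaves (degree 1),
# so the two degree classes occur with probabilities 1/4 and 3/4.
star = [(0, 1), (0, 2), (0, 3)]
print(degree_entropy(star, 4))
```

A regular ring (all degrees equal) would score zero, while more heterogeneous networks score higher, which is the basic intuition behind entropy-based network analysis.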

  16. Topology analysis of social networks extracted from literature.

    Directory of Open Access Journals (Sweden)

    Michaël C Waumans

    Full Text Available In a world where complex networks are an increasingly important part of science, it is interesting to question how the new reading of social realities they provide applies to our cultural background and in particular, popular culture. Are authors of successful novels able to reproduce social networks faithful to the ones found in reality? Is there any common trend connecting an author's oeuvre, or a genre of fiction? Such an analysis could provide new insight on how we, as a culture, perceive human interactions and consume media. The purpose of the work presented in this paper is to define the signature of a novel's story based on the topological analysis of its social network of characters. For this purpose, an automated tool was built that analyses the dialogs in novels, identifies characters and computes their relationships in a time-dependent manner in order to assess the network's evolution over the course of the story.
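The network-extraction step, reduced to its essentials, links characters who appear in the same dialog scene and weights each link by the number of shared scenes. A minimal sketch (the actual tool also identifies speakers from raw text and tracks the network's evolution over time; names below are invented):

```python
from collections import defaultdict
from itertools import combinations

def character_network(scenes):
    """Weighted co-occurrence network: characters in the same dialog
    scene get (or strengthen) a link."""
    weights = defaultdict(int)
    for speakers in scenes:
        for a, b in combinations(sorted(set(speakers)), 2):
            weights[(a, b)] += 1
    return dict(weights)

scenes = [
    ["Alice", "Bob"],
    ["Alice", "Bob", "Carol"],
    ["Bob", "Carol"],
]
net = character_network(scenes)
print(net)  # pair weights count shared scenes
```

Topological signatures (degree distribution, clustering, centrality) are then computed on this weighted graph, per story or per author.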

  17. Topology analysis of social networks extracted from literature.

    Science.gov (United States)

    Waumans, Michaël C; Nicodème, Thibaut; Bersini, Hugues

    2015-01-01

    In a world where complex networks are an increasingly important part of science, it is interesting to question how the new reading of social realities they provide applies to our cultural background and in particular, popular culture. Are authors of successful novels able to reproduce social networks faithful to the ones found in reality? Is there any common trend connecting an author's oeuvre, or a genre of fiction? Such an analysis could provide new insight on how we, as a culture, perceive human interactions and consume media. The purpose of the work presented in this paper is to define the signature of a novel's story based on the topological analysis of its social network of characters. For this purpose, an automated tool was built that analyses the dialogs in novels, identifies characters and computes their relationships in a time-dependent manner in order to assess the network's evolution over the course of the story. PMID:26039072

  18. Analysis of neural networks

    CERN Document Server

    Heiden, Uwe

    1980-01-01

The purpose of this work is a unified and general treatment of activity in neural networks from a mathematical point of view. Possible applications of the theory presented are indicated throughout the text. However, they are not explored in detail for two reasons: first, the universal character of neural activity in nearly all animals requires some type of general approach; secondly, the mathematical perspicuity would suffer if too many experimental details and empirical peculiarities were interspersed among the mathematical investigation. A guide to many applications is supplied by the references concerning a variety of specific issues. Of course the theory does not aim at covering all individual problems. Moreover there are other approaches to neural network theory (see e.g. Poggio-Torre, 1978) based on the different levels at which the nervous system may be viewed. The theory is a deterministic one reflecting the average behavior of neurons or neuron pools. In this respect the essay is writt...

  19. AUTOMATED DATA ANALYSIS FOR CONSECUTIVE IMAGES FROM DROPLET COMBUSTION EXPERIMENTS

    Directory of Open Access Journals (Sweden)

    Christopher Lee Dembia

    2012-09-01

    Full Text Available A simple automated image analysis algorithm has been developed that processes consecutive images from high speed, high resolution digital images of burning fuel droplets. The droplets burn under conditions that promote spherical symmetry. The algorithm performs the tasks of edge detection of the droplet’s boundary using a grayscale intensity threshold, and shape fitting either a circle or ellipse to the droplet’s boundary. The results are compared to manual measurements of droplet diameters done with commercial software. Results show that it is possible to automate data analysis for consecutive droplet burning images even in the presence of a significant amount of noise from soot formation. An adaptive grayscale intensity threshold provides the ability to extract droplet diameters for the wide range of noise encountered. In instances where soot blocks portions of the droplet, the algorithm manages to provide accurate measurements if a circle fit is used instead of an ellipse fit, as an ellipse can be too accommodating to the disturbance.
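For near-circular boundaries, the circle-fitting step can be approximated very simply: take the centroid of the boundary points as the centre and the mean distance to it as the radius. This is a rough stand-in for a proper least-squares fit, assumes roughly uniform coverage of the boundary, and uses invented synthetic data:

```python
import math

def fit_circle(points):
    """Estimate a circle: centre = centroid of the boundary points,
    radius = mean distance from the centre."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    r = sum(((x - cx) ** 2 + (y - cy) ** 2) ** 0.5 for x, y in points) / n
    return cx, cy, r

# Synthetic droplet boundary: 36 points on a circle of radius 3 centred at (5, 7).
boundary = [(5 + 3 * math.cos(a), 7 + 3 * math.sin(a))
            for a in (2 * math.pi * k / 36 for k in range(36))]
cx, cy, r = fit_circle(boundary)
print(round(cx), round(cy), round(r, 3))  # 5 7 3.0
```

Averaging distances is also why a circle fit tolerates soot partially blocking the boundary better than an ellipse, which can bend to accommodate the disturbance.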

  20. Quantifying biodiversity using digital cameras and automated image analysis.

    Science.gov (United States)

    Roadknight, C. M.; Rose, R. J.; Barber, M. L.; Price, M. C.; Marshall, I. W.

    2009-04-01

Monitoring the effects on biodiversity of extensive grazing in complex semi-natural habitats is labour intensive. There are also concerns about the standardization of semi-quantitative data collection. We have chosen to focus initially on automating the most time consuming aspect - the image analysis. The advent of cheaper and more sophisticated digital camera technology has led to a sudden increase in the number of habitat monitoring images and information that is being collected. We report on the use of automated trail cameras (designed for the game hunting market) to continuously capture images of grazer activity in a variety of habitats at Moor House National Nature Reserve, which is situated in the North of England at an average altitude of over 600m. Rainfall is high, and in most areas the soil consists of deep peat (1m to 3m), populated by a mix of heather, mosses and sedges. The cameras have been continuously in operation over a 6 month period; daylight images are in full colour and night images (IR flash) are black and white. We have developed artificial intelligence based methods to assist in the analysis of the large number of images collected, generating alert states for new or unusual image conditions. This paper describes the data collection techniques, outlines the quantitative and qualitative data collected and proposes online and offline systems that can reduce the manpower overheads and increase focus on important subsets in the collected data. By converting digital image data into statistical composite data it can be handled in a similar way to other biodiversity statistics, thus improving the scalability of monitoring experiments. Unsupervised feature detection methods and supervised neural methods were tested and offered solutions to simplify the process. Accurate (85 to 95%) categorization of faunal content can be obtained, requiring human intervention for only those images containing rare animals or unusual (undecidable) conditions, and

  1. Automated bony region identification using artificial neural networks: reliability and validation measurements

    International Nuclear Information System (INIS)

    The objective was to develop tools for automating the identification of bony structures, to assess the reliability of this technique against manual raters, and to validate the resulting regions of interest against physical surface scans obtained from the same specimen. Artificial intelligence-based algorithms have been used for image segmentation, specifically artificial neural networks (ANNs). For this study, an ANN was created and trained to identify the phalanges of the human hand. The relative overlap between the ANN and a manual tracer was 0.87, 0.82, and 0.76, for the proximal, middle, and distal index phalanx bones respectively. Compared with the physical surface scans, the ANN-generated surface representations differed on average by 0.35 mm, 0.29 mm, and 0.40 mm for the proximal, middle, and distal phalanges respectively. Furthermore, the ANN proved to segment the structures in less than one-tenth of the time required by a manual rater. The ANN has proven to be a reliable and valid means of segmenting the phalanx bones from CT images. Employing automated methods such as the ANN for segmentation, eliminates the likelihood of rater drift and inter-rater variability. Automated methods also decrease the amount of time and manual effort required to extract the data of interest, thereby making the feasibility of patient-specific modeling a reality. (orig.)
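The relative overlap the authors report can be computed as the Jaccard index of the two binary segmentations, |A ∩ B| / |A ∪ B| (assuming that is the overlap measure intended; the abstract does not spell out the formula). A minimal sketch on toy 2x3 masks:

```python
def relative_overlap(mask_a, mask_b):
    """Jaccard index between two binary segmentation masks."""
    a = {(y, x) for y, row in enumerate(mask_a) for x, v in enumerate(row) if v}
    b = {(y, x) for y, row in enumerate(mask_b) for x, v in enumerate(row) if v}
    return len(a & b) / len(a | b)

ann    = [[1, 1, 0], [0, 1, 0]]   # ANN segmentation (toy)
manual = [[1, 1, 0], [0, 0, 1]]   # manual tracing (toy)
print(relative_overlap(ann, manual))  # 2 shared pixels of 4 total -> 0.5
```

A value of 1.0 means identical segmentations, so the reported 0.76 to 0.87 indicates substantial but not perfect agreement with the manual tracer.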

  2. The space physics analysis network

    Science.gov (United States)

    Green, James L.

    1988-04-01

    The Space Physics Analysis Network, or SPAN, is emerging as a viable method for solving an immediate communication problem for space and Earth scientists and has been operational for nearly 7 years. SPAN, with its extension into Europe, utilizes computer-to-computer communications allowing mail, binary and text file transfer, and remote logon capability to over 1000 space science computer systems. The network has been used to successfully transfer real-time data to remote researchers for rapid data analysis, but its primary function is for non-real-time applications. One of the major advantages of using SPAN is its spacecraft mission independence. Space science researchers using SPAN are located in universities, industries and government institutions all across the United States and Europe. These researchers work in such fields as magnetospheric physics, astrophysics, ionospheric physics, atmospheric physics, climatology, meteorology, oceanography, planetary physics and solar physics. SPAN users have access to space and Earth science databases, mission planning and information systems, and computational facilities for the purposes of facilitating correlative space data exchange, data analysis and space research. For example, the National Space Science Data Center (NSSDC), which manages the network, provides facilities on SPAN such as the Network Information Center (SPAN NIC). SPAN has interconnections with several national and international networks such as HEPNET and TEXNET, forming a transparent DECnet network. The combined total number of computers now reachable over these combined networks is about 2000. In addition, SPAN supports full function capabilities over the international public packet-switched networks (e.g. TELENET) and has mail gateways to ARPANET, BITNET and JANET.

  3. Automated reticle inspection data analysis for wafer fabs

    Science.gov (United States)

    Summers, Derek; Chen, Gong; Reese, Bryan; Hutchinson, Trent; Liesching, Marcus; Ying, Hai; Dover, Russell

    2009-04-01

    To minimize potential wafer yield loss due to mask defects, most wafer fabs implement some form of reticle inspection system to monitor photomask quality in high-volume wafer manufacturing environments. Traditionally, experienced operators review reticle defects found by an inspection tool and then manually classify each defect as 'pass, warn, or fail' based on its size and location. However, in the event reticle defects are suspected of causing repeating wafer defects on a completed wafer, potential defects on all associated reticles must be manually searched on a layer-by-layer basis in an effort to identify the reticle responsible for the wafer yield loss. This 'problem reticle' search process is a tedious and time-consuming task and may cause extended manufacturing line-down situations. Oftentimes, process engineers and other team members must manually investigate several reticle inspection reports to determine whether yield loss can be tied to a specific layer. Because of the very nature of this detailed work, calculation errors may occur, resulting in an incorrect root-cause analysis. These delays waste valuable resources that could be spent on other, more productive activities. This paper examines an automated software solution for converting KLA-Tencor reticle inspection defect maps into a format compatible with KLA-Tencor's Klarity Defect(R) data analysis database. The objective is to use the graphical charting capabilities of Klarity Defect to reveal a clearer understanding of defect trends for individual reticle layers or entire mask sets. Automated analysis features include reticle defect count trend analysis and potentially stacking reticle defect maps for signature analysis against wafer inspection defect data. Other possible benefits include optimizing reticle inspection sample plans in an effort to support "lean manufacturing" initiatives for wafer fabs.

  4. Automated Signature Creator for a Signature Based Intrusion Detection System with Network Attack Detection Capabilities (Pancakes)

    Directory of Open Access Journals (Sweden)

    Frances Bernadette C. De Ocampo

    2015-05-01

    Full Text Available A Signature-based Intrusion Detection System (IDS) helps maintain the integrity of data in a network-controlled environment. Unfortunately, this type of IDS depends on predetermined intrusion patterns that are manually created. If the signature database of the Signature-based IDS is not updated, network attacks simply pass through this type of IDS unnoticed. To avoid this, an Anomaly-based IDS is used to countercheck whether network traffic that is not detected by the Signature-based IDS is truly malicious. In doing so, the Anomaly-based IDS may produce numerous logs of suspected network attacks, many of which could be false positives. This is why the Anomaly-based IDS is not perfect: it readily alarms the system that network traffic is an attack simply because that traffic is not in its baseline. To resolve the problem between these two IDSs, the goal is to correlate the logs of the Anomaly-based IDS with the captured packets in order to determine whether network traffic is really malicious. With the supervision of a security expert, the malicious network traffic is verified as malicious. Using machine learning, the researchers can identify which algorithm best classifies whether certain network traffic is really malicious. Signatures are then created automatically from the detected malicious traffic.
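
    The final step above - deriving a signature from expert-verified malicious traffic - can be made concrete with a toy sketch. The record does not describe the actual signature format, so the approach below (longest byte pattern common to all verified payloads) and all names in it are illustrative assumptions:

```python
# Hypothetical sketch of automated signature creation: given payloads a
# security expert has verified as malicious, derive a byte-pattern signature
# as the longest substring common to all of them. This is NOT the paper's
# actual algorithm, just an illustration of the idea.
def common_signature(payloads):
    """Longest substring shared by every payload (first one found on ties)."""
    base = min(payloads, key=len)
    for length in range(len(base), 0, -1):           # try longest first
        for start in range(len(base) - length + 1):
            candidate = base[start:start + length]
            if all(candidate in p for p in payloads):
                return candidate
    return type(base)()                              # empty: no common pattern

samples = [b"GET /cgi-bin/;rm -rf HTTP", b"POST /x;rm -rf /tmp", b"HEAD /a;rm -rf"]
print(common_signature(samples))  # → b';rm -rf'
```

    A Signature-based IDS could then match this pattern against live traffic, closing the loop between anomaly detection, expert verification, and signature generation.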

  5. StakeSource: harnessing the power of crowdsourcing and social networks in stakeholder analysis

    OpenAIRE

    Lim, S L; Quercia, D.; Finkelstein, A.

    2010-01-01

    Projects often fail because they overlook stakeholders. Unfortunately, existing stakeholder analysis tools only capture stakeholders' information, relying on experts to manually identify them. StakeSource is a web-based tool that automates stakeholder analysis. It "crowdsources" the stakeholders themselves for recommendations about other stakeholders and aggregates their answers using social network analysis.
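
    The aggregation step can be illustrated with a minimal sketch: treat each crowdsourced recommendation as a directed edge and rank stakeholders by how often they are recommended (in-degree centrality). This is an assumption about how StakeSource aggregates answers, shown only to make the social-network-analysis step concrete; the names are invented:

```python
# Illustrative sketch: rank stakeholders by in-degree centrality over the
# directed "recommends" network built from crowdsourced answers.
from collections import Counter

recommendations = [            # (recommender, recommended) pairs
    ("alice", "carol"), ("bob", "carol"),
    ("carol", "dave"), ("alice", "dave"), ("erin", "carol"),
]
in_degree = Counter(rec for _, rec in recommendations)
ranking = [name for name, _ in in_degree.most_common()]
print(ranking)  # → ['carol', 'dave']
```

    Frequently recommended people surface at the top of the list, which is how a tool of this kind can flag stakeholders that experts might otherwise overlook.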

  6. NET-2 Network Analysis Program

    International Nuclear Information System (INIS)

    The NET-2 Network Analysis Program is a general purpose digital computer program which solves the nonlinear time domain response and the linearized small signal frequency domain response of an arbitrary network of interconnected components. NET-2 is capable of handling a variety of components and has been applied to problems in several engineering fields, including electronic circuit design and analysis, missile flight simulation, control systems, heat flow, fluid flow, mechanical systems, structural dynamics, digital logic, communications network design, solid state device physics, fluidic systems, and nuclear vulnerability due to blast, thermal, gamma radiation, neutron damage, and EMP effects. Network components may be selected from a repertoire of built-in models or they may be constructed by the user through appropriate combinations of mathematical, empirical, and topological functions. Higher-level components may be defined by subnetworks composed of any combination of user-defined components and built-in models. The program provides a modeling capability to represent and intermix system components on many levels, e.g., from hole and electron spatial charge distributions in solid state devices through discrete and integrated electronic components to functional system blocks. NET-2 is capable of simultaneous computation in both the time and frequency domain, and has statistical and optimization capability. Network topology may be controlled as a function of the network solution. (U.S.)

  7. PAPNET(TM): an automated cytology screener using image processing and neural networks

    Science.gov (United States)

    Luck, Randall L.; Tjon-Fo-Sang, Robert; Mango, Laurie; Recht, Joel R.; Lin, Eunice; Knapp, James

    1992-04-01

    The Pap smear is the universally accepted test used for cervical cancer screening. In the United States alone, about 50 to 70 million of these tests are done annually. Every one of them is performed manually by a cytotechnologist looking at cells on a glass slide under a microscope. This paper describes PAPNET, an automated microscope system that combines a high-speed image processor and a neural network processor. The image processor performs an algorithmic primary screen of each image. The neural network performs a non-algorithmic secondary classification of candidate cells. The final output of the system is not a diagnosis. Rather, it is a display screen of suspicious cells from which a decision about the status of the case can be made.

  8. Automated High-Dimensional Flow Cytometric Data Analysis

    Science.gov (United States)

    Pyne, Saumyadipta; Hu, Xinli; Wang, Kui; Rossin, Elizabeth; Lin, Tsung-I.; Maier, Lisa; Baecher-Allan, Clare; McLachlan, Geoffrey; Tamayo, Pablo; Hafler, David; de Jager, Philip; Mesirov, Jill

    Flow cytometry is widely used for single cell interrogation of surface and intracellular protein expression by measuring fluorescence intensity of fluorophore-conjugated reagents. We focus on the recently developed procedure of Pyne et al. (2009, Proceedings of the National Academy of Sciences USA 106, 8519-8524) for automated high-dimensional flow cytometric analysis called FLAME (FLow analysis with Automated Multivariate Estimation). It introduced novel finite mixture models of heavy-tailed and asymmetric distributions to identify and model cell populations in a flow cytometric sample. This approach robustly addresses the complexities of flow data without the need for transformation or projection to lower dimensions. It also addresses the critical task of matching cell populations across samples that enables downstream analysis. It thus facilitates application of flow cytometry to new biological and clinical problems. To facilitate pipelining with standard bioinformatic applications such as high-dimensional visualization, subject classification or outcome prediction, FLAME has been incorporated with the GenePattern package of the Broad Institute. Thereby analysis of flow data can be approached similarly as other genomic platforms. We also consider some new work that proposes a rigorous and robust solution to the registration problem by a multi-level approach that allows us to model and register cell populations simultaneously across a cohort of high-dimensional flow samples. This new approach is called JCM (Joint Clustering and Matching). It enables direct and rigorous comparisons across different time points or phenotypes in a complex biological study as well as for classification of new patient samples in a more clinical setting.

  9. Fully automated diabetic retinopathy screening using morphological component analysis.

    Science.gov (United States)

    Imani, Elaheh; Pourreza, Hamid-Reza; Banaee, Touka

    2015-07-01

    Diabetic retinopathy is the major cause of blindness in the world. It has been shown that early diagnosis can play a major role in the prevention of visual loss and blindness. This diagnosis can be made through regular screening and timely treatment. In addition, automation of this process can significantly reduce the workload of ophthalmologists and alleviate inter- and intra-observer variability. This paper presents a fully automated diabetic retinopathy screening system with the ability to assess retinal image quality. The novelty of the proposed method lies in the use of the Morphological Component Analysis (MCA) algorithm to discriminate between normal and pathological retinal structures. To this end, a pre-screening algorithm first assesses the quality of retinal images. If the quality of the image is not satisfactory, it is examined by an ophthalmologist and must be recaptured if necessary. Otherwise, the image is processed for diabetic retinopathy detection. In this stage, normal and pathological structures of the retinal image are separated by the MCA algorithm. Finally, normal and abnormal retinal images are distinguished by statistical features of the retinal lesions. Our proposed system achieved 92.01% sensitivity and 95.45% specificity on the Messidor dataset, a remarkable result in comparison with previous work. PMID:25863517
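
    The sensitivity and specificity figures quoted above are standard confusion-matrix quantities; the sketch below shows how such numbers are computed. The counts are invented for illustration and are not the Messidor results:

```python
# Sensitivity = TP / (TP + FN): fraction of diseased cases correctly flagged.
# Specificity = TN / (TN + FP): fraction of healthy cases correctly cleared.
def sensitivity_specificity(tp, fn, tn, fp):
    """Return (sensitivity, specificity) from confusion-matrix counts."""
    return tp / (tp + fn), tn / (tn + fp)

# Made-up screening counts, chosen to land near the reported percentages:
sens, spec = sensitivity_specificity(tp=46, fn=4, tn=42, fp=2)
print(f"{sens:.2%} {spec:.2%}")  # → 92.00% 95.45%
```

    For a screening system, high sensitivity matters most: a missed case of retinopathy (a false negative) is costlier than referring a healthy patient for review.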

  10. Automated retinal image analysis for diabetic retinopathy in telemedicine.

    Science.gov (United States)

    Sim, Dawn A; Keane, Pearse A; Tufail, Adnan; Egan, Catherine A; Aiello, Lloyd Paul; Silva, Paolo S

    2015-03-01

    There will be an estimated 552 million persons with diabetes globally by the year 2030. Over half of these individuals will develop diabetic retinopathy, representing a nearly insurmountable burden for providing diabetes eye care. Telemedicine programmes have the capability to distribute quality eye care to virtually any location and address the lack of access to ophthalmic services. In most programmes, there is currently a heavy reliance on specially trained retinal image graders, a resource in short supply worldwide. These factors necessitate an image grading automation process to increase the speed of retinal image evaluation while maintaining accuracy and cost effectiveness. Several automatic retinal image analysis systems designed for use in telemedicine have recently become commercially available. Such systems have the potential to substantially improve the manner by which diabetes eye care is delivered by providing automated real-time evaluation to expedite diagnosis and referral if required. Furthermore, integration with electronic medical records may allow a more accurate prognostication for individual patients and may provide predictive modelling of medical risk factors based on broad population data. PMID:25697773

  11. Social Network Analysis and informal trade

    DEFF Research Database (Denmark)

    Walther, Olivier

    This paper shows how the analysis of social networks can be applied to better understand informal trade in developing countries, with a particular focus on Africa. The paper starts by discussing some of the fundamental concepts developed by social network analysis. Through a number of case studies, we show how social network analysis can ... approaches. The paper finally highlights some of the applications of social network analysis and their implications for trade policies.

  12. Software fault tree analysis of an automated control system device written in Ada

    OpenAIRE

    Winter, Mathias William.

    1995-01-01

    Software Fault Tree Analysis (SFTA) is a technique used to analyze software for faults that could lead to hazardous conditions in systems which contain software components. Previous thesis works have developed three Ada-based, semi-automated software analysis tools: the Automated Code Translation Tool (ACm), an Ada statement template generator; the Fault Tree Editor (Fm), a graphical fault tree editor; and the Fault Isolator (Fl), an automated software fault tree isolator. These previous works d...

  13. An Automated Artificial Neural Network System for Land Use/Land Cover Classification from Landsat TM Imagery

    Directory of Open Access Journals (Sweden)

    Siamak Khorram

    2009-07-01

    Full Text Available This paper focuses on an automated ANN classification system consisting of two modules: an unsupervised Kohonen Self-Organizing Map (SOM) neural network module, and a supervised Multilayer Perceptron (MLP) neural network module using the Backpropagation (BP) training algorithm. Two training algorithms were provided for the SOM network module: the standard SOM, and a refined SOM learning algorithm which incorporated Simulated Annealing (SA). The ability of our automated ANN system to perform Land-Use/Land-Cover (LU/LC) classification of a Landsat Thematic Mapper (TM) image was tested using a supervised MLP network, an unsupervised SOM network, and a combined SOM-with-SA network. Our case study demonstrated that the ANN classification system fulfilled the tasks of network training pattern creation, network training, and network generalization. The results from the three networks were assessed via comparison with reference data derived from the high spatial resolution Digital Colour Infrared (CIR) Digital Orthophoto Quarter Quad (DOQQ) data. The supervised MLP network achieved the highest classification accuracy of the three networks. Additionally, the classification performance of the refined SOM network was significantly better than that of the standard SOM network, essentially due to the incorporation of SA and its scheduled cooling scheme. It is concluded that our automated ANN classification system can be utilized for LU/LC applications and will be particularly useful when traditional statistical classification methods are not suitable due to a statistically abnormal distribution of the input data.
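
    The simulated-annealing refinement mentioned above rests on a standard idea: updates that worsen an error measure are still accepted with a probability that shrinks as a temperature parameter is cooled. The geometric schedule and all parameter values below are assumptions for illustration, not the paper's settings:

```python
# Hedged sketch of the SA acceptance rule (Metropolis criterion) with a
# geometric cooling schedule; values are illustrative only.
import math
import random

def accept(delta_error, temperature, rng):
    """Always accept improvements; accept worsenings with prob exp(-d/T)."""
    return delta_error <= 0 or rng.random() < math.exp(-delta_error / temperature)

rng = random.Random(0)
temperature, cooling = 1.0, 0.9
accepted_worsenings = 0
for step in range(50):
    if accept(delta_error=0.5, temperature=temperature, rng=rng):
        accepted_worsenings += 1
    temperature *= cooling          # geometric cooling: T <- 0.9 * T
print(accepted_worsenings < 50)  # → True (cold late steps reject the worsening)
```

    Early, "hot" iterations let the SOM escape poor configurations; as the temperature falls the search settles into a low-error state, which is the mechanism credited for the refined SOM's better accuracy.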

  14. Traffic distribution and network capacity analysis in social opportunistic networks

    OpenAIRE

    Soelistijanto, B; Howarth, MP

    2012-01-01

    Social opportunistic networks are intermittently connected mobile ad hoc networks (ICNs) that exploit human mobility to physically carry messages between disconnected parts of the network. Human mobility thus plays an essential role in the performance of forwarding protocols in the networks, and people's movements are in turn affected by their social interactions with each other. In this paper we present an analysis of the traffic distribution among the nodes of social opportunistic networks ...

  15. Malaria: the value of the automated depolarization analysis.

    Science.gov (United States)

    Josephine, F P; Nissapatorn, V

    2005-01-01

    This retrospective and descriptive study was carried out at the University of Malaya Medical Center (UMMC) from January to September 2004. The study aimed to evaluate the diagnostic utility of the Cell-Dyn 4000 hematology analyzer's depolarization analysis and to determine the sensitivity and specificity of this technique in the context of malaria diagnosis. A total of 889 cases presenting with pyrexia of unknown origin or clinically suspected of malaria were examined. Sixteen of these blood samples were found to be positive: 12 for P. vivax, 3 for P. malariae, and 1 for P. falciparum by peripheral blood smear, the standard technique for parasite detection and species identification. Demographic characteristics showed that the majority of patients were in the age range of 20-57, with a mean (± SD) of 35.9 ± 11.4 years, and were male foreign workers. The 16 positive blood samples were also processed by the Cell-Dyn 4000 analyzer in the normal complete blood count (CBC) operational mode. Malaria parasites produce hemozoin, which depolarizes light, and this allows the automated detection of malaria during routine complete blood count analysis with the Abbott Cell-Dyn CD4000 instrument. The white blood cell (WBC) differential plots of all malaria-positive samples showed abnormal depolarization events in the NEU-EOS and EOS I plots. This was not seen in the negative samples. In 12 patients with P. vivax infection, a cluster pattern in the NEU-EOS and EOS I plots was observed, color-coded green or black. In 3 patients with P. malariae infection, a few random depolarization events in the NEU-EOS and EOS I plots were seen, color-coded green, black or blue. In the patient with P. falciparum infection, the sample was color-coded green with a few random purple depolarizing events in the NEU-EOS and EOS I plots.
This study confirms that automated depolarization analysis is a highly sensitive and specific method to diagnose whether or not a patient

  16. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments

    Science.gov (United States)

    Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein

    2014-01-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing information analysis automation strategy information, task environment information, or both, to human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability-of-horizontal-conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance. PMID:24847184

  17. DEEP-South: Automated Observation Scheduling, Data Reduction and Analysis Software Subsystem

    Science.gov (United States)

    Yim, Hong-Suh; Kim, Myung-Jin; Bae, Young-Ho; Moon, Hong-Kyu; Choi, Young-Jun; Roh, Dong-Goo; Park, Jintae; Moon, Bora

    2016-01-01

    We started the `DEep Ecliptic Patrol of the Southern sky' (DEEP-South, DS) (Moon et al. 2015) in late 2012, and conducted test runs in early 2015 with the first Korea Microlensing Telescope Network (KMTNet) telescope (Park et al. 2012), a 1.6 m telescope with an 18k x 18k CCD stationed at CTIO. While the primary objective of DEEP-South is the physical characterization of small Solar System bodies, it is also expected to discover a large number of such bodies, many of them previously unknown. An automated observation scheduling, data reduction and analysis software subsystem called the `DEEP-South Scheduling and Data reduction System' (DS SDS) is thus being designed and implemented to enable observation planning, data reduction and analysis with minimal human intervention.

  18. Statistical analysis of network data with R

    CERN Document Server

    Kolaczyk, Eric D

    2014-01-01

    Networks have permeated everyday life through realities like the Internet, social networks, and viral marketing. As such, network analysis is an important growth area in the quantitative sciences, with roots in social network analysis going back to the 1930s and graph theory going back centuries. Measurement and analysis are integral components of network research, and statistical methods therefore play a critical role in network analysis. This book is the first of its kind in network research. It can be used as a stand-alone resource in which multiple R packages are used to illustrate how to conduct a wide range of network analyses, from basic manipulation and visualization, to summary and characterization, to modeling of network data. The central package is igraph, which provides extensive capabilities for studying network graphs in R. This text builds on Eric D. Kolaczyk’s book Statistical Analysis of Network Data (Springer, 2009).

  19. Automated generation of burnup chain for reactor analysis applications

    International Nuclear Information System (INIS)

    This paper presents the automated generation of a new burnup chain for reactor analysis applications. The JENDL FP Decay Data File 2011 and Fission Yields Data File 2011 were used as the data sources. The nuclides in the new chain are selected by thresholds on the half-life and cumulative yield of fission products, or from a given list. Decay modes, branching ratios and fission yields are then recalculated taking into account intermediate reactions. The new burnup chain is output according to the format of the SRAC code system. Verification was performed to evaluate the accuracy of the new burnup chain. The results show that the new burnup chain reproduces well the results of a reference chain with 193 fission products used in SRAC. Further development and applications of the burnup chain code are being planned. (author)
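
    The selection step described above can be sketched as a simple filter: keep a fission product in the chain if its half-life or cumulative yield exceeds a threshold. The data records and threshold values below are invented for illustration; real input would come from the JENDL decay and fission-yield files:

```python
# Minimal sketch of nuclide selection for a burnup chain. Nuclides that pass
# neither threshold would typically be lumped into pseudo-nuclides rather
# than tracked explicitly.
nuclides = [
    {"name": "Xe-135", "half_life_s": 3.29e4, "cum_yield": 6.6e-2},
    {"name": "Sm-149", "half_life_s": float("inf"), "cum_yield": 1.1e-2},
    {"name": "Br-85",  "half_life_s": 1.74e2, "cum_yield": 1.3e-2},
]

def select_chain(nuclides, min_half_life_s=3.6e3, min_yield=2e-2):
    """Keep nuclides whose half-life OR cumulative yield exceeds its threshold."""
    return [n["name"] for n in nuclides
            if n["half_life_s"] >= min_half_life_s or n["cum_yield"] >= min_yield]

print(select_chain(nuclides))  # → ['Xe-135', 'Sm-149']
```

    Tightening or loosening the thresholds trades chain size (and computation cost) against fidelity, which is why the verification against a 193-nuclide reference chain matters.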

  20. Analysis of Automated Aircraft Conflict Resolution and Weather Avoidance

    Science.gov (United States)

    Love, John F.; Chan, William N.; Lee, Chu Han

    2009-01-01

    This paper describes an analysis of using trajectory-based automation to resolve both aircraft and weather constraints for near-term air traffic management decision making. The auto resolution algorithm developed and tested at NASA-Ames to resolve aircraft to aircraft conflicts has been modified to mitigate convective weather constraints. Modifications include adding information about the size of a gap between weather constraints to the routing solution. Routes that traverse gaps that are smaller than a specific size are not used. An evaluation of the performance of the modified autoresolver to resolve both conflicts with aircraft and weather was performed. Integration with the Center-TRACON Traffic Management System was completed to evaluate the effect of weather routing on schedule delays.

  1. Knowledge-based requirements analysis for automating software development

    Science.gov (United States)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.

  2. A standard analysis method (SAM) for the automated analysis of polychlorinated biphenyls (PCBs) in soils using the chemical analysis automation (CAA) paradigm: validation and performance

    International Nuclear Information System (INIS)

    The Chemical Analysis Automation (CAA) program is developing a standardized modular automation strategy for chemical analysis. In this automation concept, analytical chemistry is performed with modular building blocks that correspond to individual elements of the steps in the analytical process. With a standardized set of behaviors and interactions, these blocks can be assembled in a 'plug and play' manner into a complete analysis system. These building blocks, which are referred to as Standard Laboratory Modules (SLM), interface to a host control system that orchestrates the entire analytical process, from sample preparation through data interpretation. The integrated system is called a Standard Analysis Method (SAME). A SAME for the automated determination of Polychlorinated Biphenyls (PCBs) in soils, assembled in a mobile laboratory, is undergoing extensive testing and validation. The SAME consists of the following SLMs: a four-channel Soxhlet extractor, a High Volume Concentrator, column clean-up, a gas chromatograph, a PCB data interpretation module, a robot, and a human-computer interface. The SAME is configured to meet the requirements specified in the U.S. Environmental Protection Agency's (EPA) SW-846 Methods 3541/3620A/8082 for the analysis of PCBs in soils. The PCB SAME will be described along with the developmental test plan. Performance data obtained during developmental testing will also be discussed

  3. Automated Large-Scale Shoreline Variability Analysis From Video

    Science.gov (United States)

    Pearre, N. S.

    2006-12-01

    Land-based video has been used to quantify changes in nearshore conditions for over twenty years. By combining the ability to track rapid, short-term shoreline change and changes associated with longer-term or seasonal processes, video has proved to be a cost-effective and versatile tool for coastal science. Previous video-based studies of shoreline change have typically examined the position of the shoreline along a small number of cross-shore lines as a proxy for the continuous coast. The goal of this study is twofold: (1) to further develop automated shoreline extraction algorithms for continuous shorelines, and (2) to track the evolution of a nourishment project at Rehoboth Beach, DE that was concluded in June 2005. Seven cameras are situated approximately 30 meters above mean sea level and 70 meters from the shoreline. Time exposure and variance images are captured hourly during daylight and transferred to a local processing computer. After correcting for lens distortion and geo-rectifying to a shore-normal coordinate system, the images are merged to form a composite planform image of 6 km of coast. Automated extraction algorithms establish shoreline and breaker positions throughout a tidal cycle on a daily basis. Short- and long-term variability in the daily shoreline will be characterized using empirical orthogonal function (EOF) analysis. Periodic sediment volume information will be extracted by incorporating the results of monthly ground-based LIDAR surveys and by correlating the hourly shorelines to the corresponding tide level under conditions with minimal wave activity. The Delaware coast in the area downdrift of the nourishment site is intermittently interrupted by short groins. An even/odd analysis of the shoreline response around these groins will be performed. The impact of groins on the sediment volume transport along the coast during periods of accretive and erosive conditions will be discussed. [This work is being supported by DNREC and the

  4. Automated grading of left ventricular segmental wall motion by an artificial neural network using color kinesis images

    Directory of Open Access Journals (Sweden)

    L.O. Murta Jr.

    2006-01-01

    Full Text Available The present study describes an auxiliary tool for the diagnosis of left ventricular (LV) segmental wall motion (WM) abnormalities based on color-coded echocardiographic WM images. An artificial neural network (ANN) was developed and validated for grading LV segmental WM using data from color kinesis (CK) images, a technique developed to display the timing and magnitude of global and regional WM in real time. We evaluated 21 normal subjects and 20 patients with LV WM abnormalities revealed by two-dimensional echocardiography. CK images were obtained in two sets of viewing planes. A method was developed to analyze CK images, providing quantitation of fractional area change in each of the 16 LV segments. Two experienced observers analyzed LV WM from two-dimensional images and scored them as: 1 = normal, 2 = mild hypokinesia, 3 = moderate hypokinesia, 4 = severe hypokinesia, 5 = akinesia, and 6 = dyskinesia. Based on expert analysis of 10 normal subjects and 10 patients, we trained a multilayer perceptron ANN using a back-propagation algorithm to provide automated grading of LV WM, and this ANN was then tested on the remaining subjects. Excellent concordance between expert and ANN analysis was shown by ROC curve analysis, with a measured area under the curve of 0.975. An excellent correlation was also obtained for the global LV segmental WM index by expert and ANN analysis (R² = 0.99). In conclusion, the ANN showed high accuracy for automated semi-quantitative grading of WM based on CK images. This technique can be an important aid, improving diagnostic accuracy and reducing inter-observer variability in scoring segmental LV WM.

  5. Statistical network analysis for analyzing policy networks

    DEFF Research Database (Denmark)

    Robins, Garry; Lewis, Jenny; Wang, Peng

    2012-01-01

    To analyze social network data using standard statistical approaches is to risk incorrect inference. The dependencies among observations implied in a network conceptualization undermine standard assumptions of the usual general linear models. One of the most quickly expanding areas of social and...... policy network methodology is the development of statistical modeling approaches that can accommodate such dependent data. In this article, we review three network statistical methods commonly used in the current literature: quadratic assignment procedures, exponential random graph models (ERGMs), and...... stochastic actor-oriented models. We focus most attention on ERGMs by providing an illustrative example of a model for a strategic information network within a local government. We draw inferences about the structural role played by individuals recognized as key innovators and conclude that such an approach...
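
Of the three methods reviewed, the quadratic assignment procedure is the easiest to sketch: correlate the tie variables of two networks, then build the null distribution by permuting node labels, which preserves the dependence structure within each network. The matrices below are random stand-ins, not the strategic information network from the article:

```python
import numpy as np

def qap_pvalue(A, B, n_perm=2000, seed=0):
    """QAP test: correlate off-diagonal entries of two adjacency
    matrices, then rebuild the null by permuting node labels of B."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    mask = ~np.eye(n, dtype=bool)

    def corr(X, Y):
        return np.corrcoef(X[mask], Y[mask])[0, 1]

    observed = corr(A, B)
    count = 0
    for _ in range(n_perm):
        p = rng.permutation(n)
        if corr(A, B[np.ix_(p, p)]) >= observed:
            count += 1
    return observed, count / n_perm

# Hypothetical: advice network vs. friendship network among 6 actors.
rng = np.random.default_rng(1)
A = (rng.random((6, 6)) < 0.4).astype(int)
np.fill_diagonal(A, 0)
B = A.copy()  # identical networks -> maximal observed correlation
obs, p = qap_pvalue(A, B)
print(obs, p)
```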

  6. Automated analysis of dUT1 with VieVS using new post-earthquake coordinates for Tsukuba

    Science.gov (United States)

    Kareinen, N.; Uunila, M.

    2013-08-01

    The automated analysis of dUT1 from Intensive sessions performed at Aalto University Metsähovi Radio Observatory includes the IVS-INT2 and IVS-INT3 sessions. These sessions are sensitive to the a priori positions of the stations due to the small number of baselines. We analyze IVS-R1 sessions to estimate new a priori coordinates for Tsukuba, which was affected by the March 2011 Tohoku earthquake, in order to include IVS-INT2 sessions and to improve the accuracy of IVS-INT3 sessions in the analysis. The procedure for utilising the new a priori coordinates is automated and included in the dUT1 analysis, and it can be reused should another event disrupt stations in the observation network.
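
A minimal sketch of what updating an a priori coordinate amounts to numerically, assuming a simple linear extrapolation of one coordinate component from a post-earthquake reference epoch. All numbers are illustrative placeholders, not geodetic results, and real post-seismic motion is usually modeled with additional decay terms:

```python
from datetime import date

# Hypothetical post-earthquake reference values for one station component.
ref_epoch = date(2011, 3, 11)   # Tohoku earthquake
x_ref_mm = 0.0                  # position offset at the reference epoch, mm
rate_mm_per_day = 0.05          # assumed post-seismic drift rate, mm/day

def apriori_position(obs_day: date) -> float:
    """Linear extrapolation from the post-earthquake reference epoch."""
    dt = (obs_day - ref_epoch).days
    return x_ref_mm + rate_mm_per_day * dt

print(apriori_position(date(2011, 3, 21)))  # 10 days later -> 0.5 mm
```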

  7. Transmission analysis in WDM networks

    DEFF Research Database (Denmark)

    Rasmussen, Christian Jørgen

    1999-01-01

    This thesis describes the development of a computer-based simulator for transmission analysis in optical wavelength division multiplexing networks. A great part of the work concerns fundamental optical network simulator issues. Among these issues are identification of the versatility and user...... the different component models are invoked during the simulation of a system. A simple set of rules which makes it possible to simulate any network architectures is laid down. The modelling of the nonlinear fibre and the optical receiver is also treated. The work on the fibre concerns the numerical solution......-friendliness demands which such a simulator must meet, development of the "spectral window representation" for representation of the optical signals and finding an effective way of handling the optical signals in the computer memory. One important issue more is the rules for the determination of the order in which...

  8. Automated seismic event location by waveform coherence analysis

    OpenAIRE

    Grigoli, Francesco

    2014-01-01

    Automated location of seismic events is a very important task in microseismic monitoring operations as well for local and regional seismic monitoring. Since microseismic records are generally characterised by low signal-to-noise ratio, such methods are requested to be noise robust and sufficiently accurate. Most of the standard automated location routines are based on the automated picking, identification and association of the first arrivals of P and S waves and on the minimization of the re...

  9. The Development of the Routing Pattern of the Backbone Data Transmission Network for the Automation of the Krasnoyarsk Railway

    Directory of Open Access Journals (Sweden)

    Sergey Victorovich Makarov

    2016-06-01

    Full Text Available The paper deals with the data transmission network of the Krasnoyarsk Railway, its structure, the topology of data transmission and the routing protocol that supports its operation, as well as the specifics of data transmission networking. The combination of the railway automation applications and the data transmission network makes up the automation systems, making it possible to improve performance, increase the freight traffic volume, and improve the quality of passenger service. The objective of this paper is to study the existing data transmission network of the Krasnoyarsk Railway and to develop ways of modernizing it, in order to improve the reliability of the network and of the automated systems that use it. It was found that the IS-IS and OSPF routing protocols differ in many respects, primarily because the IS-IS protocol does not use IP addresses as paths. On the one hand, this makes it possible to use the IS-IS protocol in IPv6 networks, whereas OSPF version 2 does not provide this capability; OSPF version 3 was developed to solve this problem. On the other hand, when IPv4 is used, configuring routing with the IS-IS protocol requires studying a significant volume of information and using unfamiliar methods of routing configuration.

  10. Granulometric profiling of aeolian dust deposits by automated image analysis

    Science.gov (United States)

    Varga, György; Újvári, Gábor; Kovács, János; Jakab, Gergely; Kiss, Klaudia; Szalai, Zoltán

    2016-04-01

    Determination of granulometric parameters is of growing interest in the Earth sciences. Particle size data of sedimentary deposits provide insights into the physicochemical environment of transport, accumulation and post-depositional alterations of sedimentary particles, and are important proxies applied in paleoclimatic reconstructions. It is especially true for aeolian dust deposits with a fairly narrow grain size range as a consequence of the extremely selective nature of wind sediment transport. Therefore, various aspects of aeolian sedimentation (wind strength, distance to source(s), possible secondary source regions and modes of sedimentation and transport) can be reconstructed only from precise grain size data. As terrestrial wind-blown deposits are among the most important archives of past environmental changes, proper explanation of the proxy data is a mandatory issue. Automated imaging provides a unique technique to gather direct information on granulometric characteristics of sedimentary particles. Granulometric data obtained from automatic image analysis of Malvern Morphologi G3-ID is a rarely applied new technique for particle size and shape analyses in sedimentary geology. Size and shape data of several hundred thousand (or even million) individual particles were automatically recorded in this study from 15 loess and paleosoil samples from the captured high-resolution images. Several size (e.g. circle-equivalent diameter, major axis, length, width, area) and shape parameters (e.g. elongation, circularity, convexity) were calculated by the instrument software. At the same time, the mean light intensity after transmission through each particle is automatically collected by the system as a proxy of optical properties of the material. Intensity values are dependent on chemical composition and/or thickness of the particles. 
    The results of the automated imaging were compared to particle size data determined by three different laser diffraction instruments.
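
The size and shape parameters listed above have simple geometric definitions; a sketch using common image-analysis conventions (the instrument software may define some of them differently, and the particle values here are invented):

```python
import math

def circle_equivalent_diameter(area):
    """Diameter of the circle with the same projected area as the particle."""
    return 2.0 * math.sqrt(area / math.pi)

def circularity(area, perimeter):
    """4*pi*A / P^2: 1.0 for a perfect circle, smaller for rough grains."""
    return 4.0 * math.pi * area / perimeter**2

def convexity(hull_perimeter, perimeter):
    """Convex-hull perimeter over actual perimeter (<= 1 for rough outlines)."""
    return hull_perimeter / perimeter

# Illustrative particle measurements (micrometre units).
area, perim, hull_perim = 100.0, 40.0, 36.0
print(circle_equivalent_diameter(area),
      circularity(area, perim),
      convexity(hull_perim, perim))
```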

  11. BitTorrent Swarm Analysis through Automation and Enhanced Logging

    OpenAIRE

    R˘azvan Deaconescu; Marius Sandu-Popa; Adriana Dr˘aghici; Nicolae T˘apus

    2011-01-01

    Peer-to-Peer protocols currently form the most heavily used protocol class in the Internet, with BitTorrent, the most popular protocol for content distribution, as its flagship. A high number of studies and investigations have been undertaken to measure, analyse and improve the inner workings of the BitTorrent protocol. Approaches such as tracker message analysis, network probing and packet sniffing have been deployed to understand and enhance BitTorrent's internal behaviour. In this paper we...

  12. Development of a software for INAA analysis automation

    International Nuclear Information System (INIS)

    In this work, software has been developed to automate the post-counting tasks in comparative INAA. It aims to be more flexible than the available options, integrating with some of the routines currently in use in the IPEN Activation Analysis Laboratory and allowing the user to choose between a fully automatic analysis and an Excel-oriented one. The software makes use of the Genie 2000 data importing and analysis routines and stores each 'energy-counts-uncertainty' table as a separate ASCII file that can be used later if required by the analyst. Moreover, it generates an Excel-compatible CSV (comma-separated values) file with only the relevant results from the analyses for each sample or comparator, as well as the results of the concentration calculations and the results obtained with four different statistical tools (unweighted average, weighted average, normalized residuals and the Rajeval technique), allowing the analyst to double-check the results. Finally, a 'summary' CSV file is also produced, with the final concentration results obtained for each element in each sample. (author)
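
The first three statistical tools mentioned can be sketched directly (the Rajeval technique adds outlier-adjustment steps not shown here); the replicate concentrations below are invented:

```python
import math

# Hypothetical replicate concentration results (mg/kg) with 1-sigma errors.
values = [10.2, 9.8, 10.5]
sigmas = [0.3, 0.4, 0.5]

# Unweighted average.
unweighted = sum(values) / len(values)

# Inverse-variance weighted average and its uncertainty.
weights = [1.0 / s**2 for s in sigmas]
weighted = sum(w * v for w, v in zip(weights, values)) / sum(weights)
weighted_sigma = math.sqrt(1.0 / sum(weights))

# Normalized residuals: results far from the weighted mean (in sigma
# units) are candidates for re-inspection.
residuals = [(v - weighted) / s for v, s in zip(values, sigmas)]

print(unweighted, weighted, residuals)
```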

  13. Intelligent Control in Automation Based on Wireless Traffic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kurt Derr; Milos Manic

    2007-08-01

    Wireless technology is a central component of many factory automation infrastructures in both the commercial and government sectors, providing connectivity among various components in industrial realms (distributed sensors, machines, mobile process controllers). However, wireless technologies pose more threats to computer security than wired environments. The advantageous features of Bluetooth technology resulted in Bluetooth unit shipments climbing to five million per week at the end of 2005 [1, 2]. This is why the real-time interpretation and understanding of Bluetooth traffic behavior is critical both to maintaining the integrity of computer systems and to increasing the efficient use of this technology in control-type applications. Although neuro-fuzzy approaches have been applied to wireless 802.11 behavior analysis in the past, the significantly different Bluetooth protocol framework has not been extensively explored using this technology. This paper presents a new neuro-fuzzy algorithm for analyzing this still largely unexplored territory of Bluetooth traffic. Further enhancements of this algorithm are presented, along with a comparison against the traditional numerical approach. Through test examples, interesting Bluetooth traffic behavior characteristics were captured, and the comparative elegance of this computationally inexpensive approach was demonstrated. This analysis can be used to provide directions for the future development and use of this prevailing technology in various control-type applications, as well as making its use more secure.

  14. Intelligent Control in Automation Based on Wireless Traffic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kurt Derr; Milos Manic

    2007-09-01

    Wireless technology is a central component of many factory automation infrastructures in both the commercial and government sectors, providing connectivity among various components in industrial realms (distributed sensors, machines, mobile process controllers). However, wireless technologies pose more threats to computer security than wired environments. The advantageous features of Bluetooth technology resulted in Bluetooth unit shipments climbing to five million per week at the end of 2005 [1, 2]. This is why the real-time interpretation and understanding of Bluetooth traffic behavior is critical both to maintaining the integrity of computer systems and to increasing the efficient use of this technology in control-type applications. Although neuro-fuzzy approaches have been applied to wireless 802.11 behavior analysis in the past, the significantly different Bluetooth protocol framework has not been extensively explored using this technology. This paper presents a new neuro-fuzzy algorithm for analyzing this still largely unexplored territory of Bluetooth traffic. Further enhancements of this algorithm are presented, along with a comparison against the traditional numerical approach. Through test examples, interesting Bluetooth traffic behavior characteristics were captured, and the comparative elegance of this computationally inexpensive approach was demonstrated. This analysis can be used to provide directions for the future development and use of this prevailing technology in various control-type applications, as well as making its use more secure.

  15. Automated modelling of complex refrigeration cycles through topological structure analysis

    International Nuclear Information System (INIS)

    We have developed a computational method for analysis of refrigeration cycles. The method is well suited for automated analysis of complex refrigeration systems. The refrigerator is specified through a description of flows representing thermodynamic states at system locations; components that modify the thermodynamic state of a flow; and controls that specify flow characteristics at selected points in the diagram. A system of equations is then established for the refrigerator, based on mass, energy and momentum balances for each of the system components. Controls specify the values of certain system variables, thereby reducing the number of unknowns. It is found that the system of equations for the refrigerator may contain a number of redundant or duplicate equations, and therefore further equations are necessary for a full characterization. The number of additional equations is related to the number of loops in the cycle, and this is calculated by a matrix-based topological method. The methodology is demonstrated through an analysis of a two-stage refrigeration cycle.
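
The loop count that determines the number of additional equations is the cyclomatic number of the flow graph, E − N + C (edges minus nodes plus connected components); a sketch with an invented two-loop flow graph standing in for a two-stage cycle:

```python
def independent_loops(n_nodes, edges):
    """Cyclomatic number E - N + C: the number of independent loops,
    i.e. the extra equations needed beyond the component balances."""
    parent = list(range(n_nodes))

    def find(i):
        # Union-find with path halving to count connected components.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for a, b in edges:
        parent[find(a)] = find(b)

    components = sum(1 for i in range(n_nodes) if find(i) == i)
    return len(edges) - n_nodes + components

# Illustrative flow graph: two refrigerant loops sharing a node.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (2, 4), (4, 0)]
print(independent_loops(5, edges))  # -> 2
```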

  16. Spectral Analysis of Rich Network Topology in Social Networks

    Science.gov (United States)

    Wu, Leting

    2013-01-01

    Social networks have received much attention these days. Researchers have developed different methods to study the structure and characteristics of the network topology. Our focus is on spectral analysis of the adjacency matrix of the underlying network. Recent work showed good properties in the adjacency spectral space but there are few…

  17. A fully automated entanglement-based quantum cryptography system for telecom fiber networks

    International Nuclear Information System (INIS)

    We present in this paper a quantum key distribution (QKD) system based on polarization entanglement for use in telecom fibers. A QKD exchange up to 50 km was demonstrated in the laboratory with a secure key rate of 550 bits s⁻¹. The system is compact and portable, with a fully automated start-up; stabilization modules for polarization, synchronization and photon coupling allow hands-off operation. Stable and reliable key exchange in a deployed optical fiber of 16 km length was demonstrated. In this fiber network, we achieved over 2 weeks an automatic key generation with an average key rate of 2000 bits s⁻¹ without manual intervention. During this period, the system had an average entanglement visibility of 93%, highlighting the technical level and stability achieved for entanglement-based quantum cryptography.
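
For an entanglement-based link, the reported visibility maps to a quantum bit error rate through the standard relation QBER = (1 − V)/2, so the quoted 93% visibility corresponds to roughly 3.5% errors:

```python
def qber_from_visibility(v):
    """Quantum bit error rate implied by two-photon interference
    visibility v, via the standard relation QBER = (1 - v) / 2."""
    return (1.0 - v) / 2.0

print(qber_from_visibility(0.93))  # ~0.035 for 93% visibility
```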

  18. Neural Network Approach to Automated Condition Classification of a Check Valve by Acoustic Emission Signals

    International Nuclear Information System (INIS)

    This paper presents new techniques under development for monitoring the health and vibration of the active components in nuclear power plants. The purpose of this study is to develop an automated system for condition classification of a check valve, one of the components used extensively in the safety systems of nuclear power plants. Acoustic emission testing of a check valve under controlled flow loop conditions was performed to detect and evaluate disc movement and valve failures such as wear and leakage due to foreign object interference in a check valve. It is clearly demonstrated that the evaluation of different failure types, such as disc wear and check valve leakage, was successful when the characteristics of various AE parameters were systematically analyzed. It is also shown that the leak size can be determined with an artificial neural network.

  19. COalitions in COOperation Networks (COCOON): Social Network Analysis and Game Theory to Enhance Cooperation Networks

    NARCIS (Netherlands)

    Sie, Rory

    2012-01-01

    Sie, R. L. L. (2012). COalitions in COOperation Networks (COCOON): Social Network Analysis and Game Theory to Enhance Cooperation Networks (Unpublished doctoral dissertation). September, 28, 2012, Open Universiteit in the Netherlands (CELSTEC), Heerlen, The Netherlands.

  20. Hydraulic Modeling: Pipe Network Analysis

    OpenAIRE

    Datwyler, Trevor T.

    2012-01-01

    Water modeling is becoming an increasingly important part of hydraulic engineering. One application of hydraulic modeling is pipe network analysis. Using programmed algorithms to repeatedly solve continuity and energy equations, computer software can greatly reduce the amount of time required to analyze a closed conduit system. Such hydraulic models can become a valuable tool for cities to maintain their water systems and plan for future growth. The Utah Division of Drinking Water regulations...

  1. Analysis of Semantic Networks using Complex Networks Concepts

    DEFF Research Database (Denmark)

    Ortiz-Arroyo, Daniel

    2013-01-01

    In this paper we perform a preliminary analysis of semantic networks to determine the most important terms that could be used to optimize a summarization task. In our experiments, we measure how the properties of a semantic network change when the terms in the network are removed. Our preliminary results indicate that this approach provides good results on the semantic network analyzed in this paper.
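
One simple way to measure how a semantic network changes when a term is removed is to track the size of its largest connected component; important terms fragment the network most. The toy network below is invented, not the one analyzed in the paper:

```python
from collections import defaultdict, deque

# Toy semantic network: terms linked by co-occurrence (illustrative).
edges = [("network", "analysis"), ("network", "graph"),
         ("graph", "analysis"), ("summary", "text"),
         ("text", "analysis"), ("summary", "analysis")]

def largest_component(edges, removed=frozenset()):
    """Size of the largest connected component after removing terms."""
    adj = defaultdict(set)
    for a, b in edges:
        if a not in removed and b not in removed:
            adj[a].add(b)
            adj[b].add(a)
    best, seen = 0, set()
    for start in adj:
        if start in seen:
            continue
        queue, comp = deque([start]), {start}
        while queue:                     # breadth-first traversal
            for nxt in adj[queue.popleft()]:
                if nxt not in comp:
                    comp.add(nxt)
                    queue.append(nxt)
        seen |= comp
        best = max(best, len(comp))
    return best

# Importance of a term ~ how much the network fragments without it.
for term in ["analysis", "summary"]:
    print(term, largest_component(edges) - largest_component(edges, {term}))
```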

  2. Automated radial basis function neural network based image classification system for diabetic retinopathy detection in retinal images

    Science.gov (United States)

    Anitha, J.; Vijila, C. Kezi Selva; Hemanth, D. Jude

    2010-02-01

    Diabetic retinopathy (DR) is a chronic eye disease for which early detection is essential to avoid fatal results. Image processing of retinal images emerges as a feasible tool for this early diagnosis. Digital image processing techniques involve image classification, which is a significant technique for detecting abnormality in the eye. Various automated classification systems have been developed in recent years, but most of them lack high classification accuracy. Artificial neural networks are the widely preferred artificial intelligence technique since they yield superior results in terms of classification accuracy. In this work, a Radial Basis Function (RBF) neural network based bi-level classification system is proposed to differentiate abnormal DR images from normal retinal images. The results are analyzed in terms of classification accuracy, sensitivity and specificity. A comparative analysis is performed against a probabilistic classifier, namely the Bayesian classifier, to show the superior nature of the neural classifier. Experimental results show promise for the neural classifier in terms of the performance measures.
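
An RBF network of the kind named above can be sketched as a Gaussian hidden layer with fixed centers followed by a linear readout; the two-class data here are synthetic stand-ins for retinal feature vectors, not the study's dataset:

```python
import numpy as np

def rbf_features(X, centers, width):
    """Gaussian radial basis activations for each sample/center pair."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width**2))

# Toy two-class problem standing in for normal vs. DR feature vectors.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (20, 2)),   # class 0 cluster
               rng.normal(2.0, 0.3, (20, 2))])  # class 1 cluster
y = np.array([0] * 20 + [1] * 20)

centers = np.array([[0.0, 0.0], [2.0, 2.0]])    # fixed prototype centers
H = rbf_features(X, centers, width=1.0)

# Output weights by linear least squares on the hidden activations.
design = np.c_[H, np.ones(len(H))]              # add a bias column
w, *_ = np.linalg.lstsq(design, y, rcond=None)
pred = (design @ w > 0.5).astype(int)
print((pred == y).mean())                       # training accuracy
```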

  3. Automated analysis for detecting beams in laser wakefield simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ushizima, Daniela M.; Rubel, Oliver; Prabhat, Mr.; Weber, Gunther H.; Bethel, E. Wes; Aragon, Cecilia R.; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Hamann, Bernd; Messmer, Peter; Hagen, Hans

    2008-07-03

    Laser wakefield particle accelerators have shown the potential to generate electric fields thousands of times higher than those of conventional accelerators. The resulting extremely short particle acceleration distance could yield a potential new compact source of energetic electrons and radiation, with wide applications from medicine to physics. Physicists investigate laser-plasma internal dynamics by running particle-in-cell simulations; however, this generates a large dataset that requires time-consuming, manual inspection by experts in order to detect key features such as beam formation. This paper describes a framework to automate the data analysis and classification of simulation data. First, we propose a new method to identify locations with high density of particles in the space-time domain, based on maximum extremum point detection on the particle distribution. We analyze high density electron regions using a lifetime diagram by organizing and pruning the maximum extrema as nodes in a minimum spanning tree. Second, we partition the multivariate data using fuzzy clustering to detect time steps in an experiment that may contain a high quality electron beam. Finally, we combine results from fuzzy clustering and bunch lifetime analysis to estimate spatially confined beams. We demonstrate our algorithms successfully on four different simulation datasets.
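
The fuzzy-clustering step can be illustrated with a bare-bones fuzzy c-means loop; the two-regime data below are synthetic stand-ins for per-time-step simulation features, not the particle-in-cell output:

```python
import numpy as np

def fuzzy_cmeans(X, c=2, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: returns soft memberships U and centers."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m                                    # fuzzified weights
        centers = (W.T @ X) / W.sum(axis=0)[:, None]  # weighted means
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        inv = (d + 1e-12) ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)      # membership update
    return U, centers

# Toy stand-in for per-time-step features with two distinct regimes.
X = np.vstack([np.random.default_rng(1).normal(0.0, 0.2, (15, 2)),
               np.random.default_rng(2).normal(3.0, 0.2, (15, 2))])
U, centers = fuzzy_cmeans(X)
labels = U.argmax(axis=1)   # hard assignment of each "time step"
print(labels)
```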

  4. Automated analysis for detecting beams in laser wakefield simulations

    International Nuclear Information System (INIS)

    Laser wakefield particle accelerators have shown the potential to generate electric fields thousands of times higher than those of conventional accelerators. The resulting extremely short particle acceleration distance could yield a potential new compact source of energetic electrons and radiation, with wide applications from medicine to physics. Physicists investigate laser-plasma internal dynamics by running particle-in-cell simulations; however, this generates a large dataset that requires time-consuming, manual inspection by experts in order to detect key features such as beam formation. This paper describes a framework to automate the data analysis and classification of simulation data. First, we propose a new method to identify locations with high density of particles in the space-time domain, based on maximum extremum point detection on the particle distribution. We analyze high density electron regions using a lifetime diagram by organizing and pruning the maximum extrema as nodes in a minimum spanning tree. Second, we partition the multivariate data using fuzzy clustering to detect time steps in an experiment that may contain a high quality electron beam. Finally, we combine results from fuzzy clustering and bunch lifetime analysis to estimate spatially confined beams. We demonstrate our algorithms successfully on four different simulation datasets

  5. Automated Imaging and Analysis of the Hemagglutination Inhibition Assay.

    Science.gov (United States)

    Nguyen, Michael; Fries, Katherine; Khoury, Rawia; Zheng, Lingyi; Hu, Branda; Hildreth, Stephen W; Parkhill, Robert; Warren, William

    2016-04-01

    The hemagglutination inhibition (HAI) assay quantifies the level of strain-specific influenza virus antibody present in serum and is the standard by which influenza vaccine immunogenicity is measured. The HAI assay endpoint requires real-time monitoring of rapidly evolving red blood cell (RBC) patterns for signs of agglutination at a rate of potentially thousands of patterns per day to meet the throughput needs for clinical testing. This analysis is typically performed manually through visual inspection by highly trained individuals. However, concordant HAI results across different labs are challenging to demonstrate due to analyst bias and variability in analysis methods. To address these issues, we have developed a bench-top, standalone, high-throughput imaging solution that automatically determines the agglutination states of up to 9600 HAI assay wells per hour and assigns HAI titers to 400 samples in a single unattended 30-min run. Images of the tilted plates are acquired as a function of time and analyzed using algorithms that were developed through comprehensive examination of manual classifications. Concordance testing of the imaging system with eight different influenza antigens demonstrates 100% agreement between automated and manual titer determination with a percent difference of ≤3.4% for all cases. PMID:26464422
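
The titer assignment itself reduces to finding the highest serum dilution that still fully inhibits agglutination; a sketch with an invented well series (the well classifications are placeholders, not instrument output):

```python
# Hypothetical serial dilutions and per-well agglutination calls
# (True = agglutination inhibited at that dilution).
dilutions = [10, 20, 40, 80, 160, 320, 640, 1280]
inhibited = [True, True, True, True, True, False, False, False]

def hai_titer(dilutions, inhibited):
    """Reciprocal of the highest dilution that still inhibits
    agglutination; None if no dilution inhibits."""
    titer = None
    for d, ok in zip(dilutions, inhibited):
        if ok:
            titer = d
        else:
            break
    return titer

print(hai_titer(dilutions, inhibited))  # -> 160
```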

  6. Automated target recognition technique for image segmentation and scene analysis

    Science.gov (United States)

    Baumgart, Chris W.; Ciarcia, Christopher A.

    1994-03-01

    Automated target recognition (ATR) software has been designed to perform image segmentation and scene analysis. Specifically, this software was developed as a package for the Army's Minefield and Reconnaissance and Detector (MIRADOR) program. MIRADOR is an on/off road, remote control, multisensor system designed to detect buried and surface- emplaced metallic and nonmetallic antitank mines. The basic requirements for this ATR software were the following: (1) an ability to separate target objects from the background in low signal-noise conditions; (2) an ability to handle a relatively high dynamic range in imaging light levels; (3) the ability to compensate for or remove light source effects such as shadows; and (4) the ability to identify target objects as mines. The image segmentation and target evaluation was performed using an integrated and parallel processing approach. Three basic techniques (texture analysis, edge enhancement, and contrast enhancement) were used collectively to extract all potential mine target shapes from the basic image. Target evaluation was then performed using a combination of size, geometrical, and fractal characteristics, which resulted in a calculated probability for each target shape. Overall results with this algorithm were quite good, though there is a tradeoff between detection confidence and the number of false alarms. This technology also has applications in the areas of hazardous waste site remediation, archaeology, and law enforcement.

  7. Spectral analysis for automated exploration and sample acquisition

    Science.gov (United States)

    Eberlein, Susan; Yates, Gigi

    1992-05-01

    Future space exploration missions will rely heavily on the use of complex instrument data for determining the geologic, chemical, and elemental character of planetary surfaces. One important instrument is the imaging spectrometer, which collects complete images in multiple discrete wavelengths in the visible and infrared regions of the spectrum. Extensive computational effort is required to extract information from such high-dimensional data. A hierarchical classification scheme allows multispectral data to be analyzed for purposes of mineral classification while limiting the overall computational requirements. The hierarchical classifier exploits the tunability of a new type of imaging spectrometer which is based on an acousto-optic tunable filter. This spectrometer collects a complete image in each wavelength passband without spatial scanning. It may be programmed to scan through a range of wavelengths or to collect only specific bands for data analysis. Spectral classification activities employ artificial neural networks, trained to recognize a number of mineral classes. Analysis of the trained networks has proven useful in determining which subsets of spectral bands should be employed at each step of the hierarchical classifier. The network classifiers are capable of recognizing all mineral types which were included in the training set. In addition, the major components of many mineral mixtures can also be recognized. This capability may prove useful for a system designed to evaluate data in a strange environment where details of the mineral composition are not known in advance.

  8. Networks and network analysis for defence and security

    CERN Document Server

    Masys, Anthony J

    2014-01-01

    Networks and Network Analysis for Defence and Security discusses relevant theoretical frameworks and applications of network analysis in support of the defence and security domains. This book details real world applications of network analysis to support defence and security. Shocks to regional, national and global systems stemming from natural hazards, acts of armed violence, terrorism and serious and organized crime have significant defence and security implications. Today, nations face an uncertain and complex security landscape in which threats impact/target the physical, social, economic

  9. 40 CFR 13.19 - Analysis of costs; automation; prevention of overpayments, delinquencies or defaults.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 1 2010-07-01 2010-07-01 false Analysis of costs; automation; prevention of overpayments, delinquencies or defaults. 13.19 Section 13.19 Protection of Environment...; automation; prevention of overpayments, delinquencies or defaults. (a) The Administrator may...

  10. AUTOMATED SOLID PHASE EXTRACTION GC/MS FOR ANALYSIS OF SEMIVOLATILES IN WATER AND SEDIMENTS

    Science.gov (United States)

    Data is presented on the development of a new automated system combining solid phase extraction (SPE) with GC/MS spectrometry for the single-run analysis of water samples containing a broad range of organic compounds. The system uses commercially available automated in-line sampl...

  11. Automated longitudinal intra-subject analysis (ALISA) for diffusion MRI tractography

    DEFF Research Database (Denmark)

    Aarnink, Saskia H; Vos, Sjoerd B; Leemans, Alexander; Jernigan, Terry L; Madsen, Kathrine Skak; Baaré, William F C

    2014-01-01

    inter-subject and intra-subject automation in this situation are intended for subjects without gross pathology. In this work, we propose such an automated longitudinal intra-subject analysis (dubbed ALISA) approach, and assessed whether ALISA could preserve the same level of reliability as obtained with...

  12. Automated absolute activation analysis with californium-252 sources

    Energy Technology Data Exchange (ETDEWEB)

    MacMurdo, K.W.; Bowman, W.W.

    1978-09-01

    A 100-mg ²⁵²Cf neutron activation analysis facility is used routinely at the Savannah River Laboratory for multielement analysis of many solid and liquid samples. An absolute analysis technique converts counting data directly to elemental concentration without the use of classical comparative standards and flux monitors. With the totally automated pneumatic sample transfer system, cyclic irradiation-decay-count regimes can be pre-selected for up to 40 samples, and samples can be analyzed with the facility unattended. An automatic data control system starts and stops a high-resolution gamma-ray spectrometer and/or a delayed-neutron detector; the system also stores data and controls output modes. Gamma ray data are reduced by three main programs in the IBM 360/195 computer: the 4096-channel spectrum and pertinent experimental timing, counting, and sample data are stored on magnetic tape; the spectrum is then reduced to a list of significant photopeak energies, integrated areas, and their associated statistical errors; and the third program assigns gamma ray photopeaks to the appropriate neutron activation product(s) by comparing photopeak energies to tabulated gamma ray energies. Photopeak areas are then converted to elemental concentration by using experimental timing and sample data, calculated elemental neutron capture rates, absolute detector efficiencies, and absolute spectroscopic decay data. Calculational procedures have been developed so that fissile material can be analyzed by cyclic neutron activation and delayed-neutron counting procedures. These calculations are based on a 6 half-life group model of delayed neutron emission; calculations include corrections for delayed neutron interference from ¹⁷O. Detection sensitivities of ≤400 ppb for natural uranium and 8 ppb (≤0.5 nCi/g) for ²³⁹Pu were demonstrated with 15-g samples at a throughput of up to 140 per day. Over 40 elements can be detected at the sub-ppm level.
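
The structure of the absolute conversion from photopeak counts to an activation rate can be sketched with the usual saturation, decay and counting factors; every number below is an illustrative placeholder, not evaluated nuclear data:

```python
import math

# Placeholder inputs for one photopeak of one activation product.
peak_area = 5.0e4          # net counts in the photopeak
efficiency = 0.02          # absolute detector efficiency at this energy
gamma_yield = 0.85         # gamma emissions per decay
half_life_s = 3600.0
t_irr, t_decay, t_count = 1800.0, 600.0, 900.0  # seconds

lam = math.log(2) / half_life_s
saturation = 1.0 - math.exp(-lam * t_irr)        # build-up during irradiation
decay = math.exp(-lam * t_decay)                 # decay before counting
counting = (1.0 - math.exp(-lam * t_count)) / lam  # integral over count time

# Production (capture) rate inferred from the observed counts; the
# elemental concentration then follows from the calculated capture
# rate per target atom, which is omitted here.
R = peak_area / (efficiency * gamma_yield * saturation * decay * counting)
print(R)
```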

  13. Automated absolute activation analysis with californium-252 sources

    International Nuclear Information System (INIS)

    A 100-mg 252Cf neutron activation analysis facility is used routinely at the Savannah River Laboratory for multielement analysis of many solid and liquid samples. An absolute analysis technique converts counting data directly to elemental concentration without the use of classical comparative standards and flux monitors. With the totally automated pneumatic sample transfer system, cyclic irradiation-decay-count regimes can be pre-selected for up to 40 samples, and samples can be analyzed with the facility unattended. An automatic data control system starts and stops a high-resolution gamma-ray spectrometer and/or a delayed-neutron detector; the system also stores data and controls output modes. Gamma-ray data are reduced by three main programs in the IBM 360/195 computer: the 4096-channel spectrum and pertinent experimental timing, counting, and sample data are stored on magnetic tape; the spectrum is then reduced to a list of significant photopeak energies, integrated areas, and their associated statistical errors; and the third program assigns gamma-ray photopeaks to the appropriate neutron activation product(s) by comparing photopeak energies to tabulated gamma-ray energies. Photopeak areas are then converted to elemental concentration by using experimental timing and sample data, calculated elemental neutron capture rates, absolute detector efficiencies, and absolute spectroscopic decay data. Calculational procedures have been developed so that fissile material can be analyzed by cyclic neutron activation and delayed-neutron counting procedures. These calculations are based on a six half-life group model of delayed neutron emission; calculations include corrections for delayed-neutron interference from 17O. Detection sensitivities of ≤400 ppb for natural uranium and 8 ppb (≤0.5 nCi/g) for 239Pu were demonstrated with 15-g samples at a throughput of up to 140 per day. Over 40 elements can be detected at the sub-ppm level.
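
    The "absolute" conversion described above can be sketched numerically. The following is an illustrative calculation only, not the facility's code: every number (photopeak area, efficiency, gamma intensity, capture rate, half-life, timing, molar mass) is invented. It combines the standard saturation, decay, and counting factors to go from photopeak counts to grams of analyte.

```python
import math

# Hedged numeric sketch of absolute activation analysis:
# photopeak area -> number of target atoms -> mass, with no comparative standard.
# All values below are invented placeholders.
area = 1.2e4                 # net photopeak counts
eff = 0.02                   # absolute detector efficiency at this energy
Igamma = 0.85                # gamma emission probability per decay
R = 3.0e-4                   # calculated neutron captures per target atom per second
lam = math.log(2) / 150.0    # decay constant for an assumed 150 s half-life
t_irr, t_dec, t_cnt = 300.0, 60.0, 300.0   # irradiate / transfer / count times (s)

S = 1 - math.exp(-lam * t_irr)   # saturation factor during irradiation
D = math.exp(-lam * t_dec)       # decay during pneumatic transfer
C = 1 - math.exp(-lam * t_cnt)   # decay during the counting interval

# counts = eff * Igamma * N * R * S * D * C / lam, solved for N:
atoms = area * lam / (eff * Igamma * R * S * D * C)
grams = atoms * 55.0 / 6.022e23  # assumed 55 g/mol target nuclide
print(f"{grams:.2e}")
```

The same timing factors appear however the specific constants are chosen, which is why the method needs accurate absolute efficiencies and decay data rather than standards.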

  14. MORPHY, a program for an automated "atoms in molecules" analysis

    Science.gov (United States)

    Popelier, Paul L. A.

    1996-02-01

    The operating manual for a structured FORTRAN 77 program called MORPHY is presented. This code performs an automated topological analysis of a molecular electron density and its Laplacian. The program is written in a stylistically homogeneous, transparent and modular manner. The input is compact but flexible and allows for multiple jobs in one deck. The output is detailed and has an attractive layout. Critical points in the charge density and its Laplacian can be located in a robust and economical way and are displayed via an external on-line visualisation package. The gradient vector field of the charge density can be traced with great accuracy; planar contour, relief and one-dimensional line plots of many scalar properties can be generated. Non-bonded radii are calculated and analytical expressions for interatomic surfaces are computed (with error estimates) and plotted. MORPHY is interfaced with the AIMPAC suite of programs. The capabilities of the program are illustrated with two test runs and five selected figures.
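
    The critical-point search at the heart of such a topological analysis can be illustrated in miniature. The sketch below is not MORPHY: it builds a toy 2D "density" from two Gaussians (an invented model) and locates the saddle point between them by Newton iteration on the gradient, the same basic idea as an automated critical-point search.

```python
import numpy as np

# Toy critical-point search: rho(x) is a sum of two Gaussians centred at
# (-1, 0) and (1, 0); Newton iteration on grad(rho) converges to the
# saddle point midway between the two maxima.
centers = np.array([[-1.0, 0.0], [1.0, 0.0]])

def rho(x):
    """Model density: sum of unit Gaussians."""
    return sum(np.exp(-np.sum((x - c) ** 2)) for c in centers)

def grad(x):
    return sum(-2.0 * (x - c) * np.exp(-np.sum((x - c) ** 2)) for c in centers)

def hess(x):
    h = np.zeros((2, 2))
    for c in centers:
        d = x - c
        e = np.exp(-np.sum(d ** 2))
        h += e * (4.0 * np.outer(d, d) - 2.0 * np.eye(2))
    return h

x = np.array([0.1, 0.0])   # start between the two maxima
for _ in range(50):
    x = x - np.linalg.solve(hess(x), grad(x))   # Newton step on the gradient
print(np.round(x, 6))      # converges to the midpoint saddle near (0, 0)
```

A production code additionally classifies each critical point by the eigenvalues of the Hessian (its rank and signature), which here would identify the point as a saddle.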

  15. Automating dChip: toward reproducible sharing of microarray data analysis

    Directory of Open Access Journals (Sweden)

    Li Cheng

    2008-05-01

    Full Text Available Abstract Background During the past decade, many software packages have been developed for analysis and visualization of various types of microarrays. We have developed and maintained the widely used dChip as a microarray analysis software package accessible to both biologists and data analysts. However, challenges arise when dChip users want to analyze large numbers of arrays automatically and share data analysis procedures and parameters. Improvement is also needed when the dChip user support team tries to identify the causes of analysis errors or bugs reported by users. Results We report here implementation and application of the dChip automation module. Through this module, dChip automation files can be created to include menu steps, parameters, and data viewpoints to run automatically. A data-packaging function allows convenient transfer from one user to another of the dChip software, microarray data, and analysis procedures, so that the second user can reproduce the entire analysis session of the first user. An analysis report file can also be generated during an automated run, including analysis logs, user comments, and viewpoint screenshots. Conclusion The dChip automation module is a step toward reproducible research, and it can promote a more convenient and reproducible mechanism for sharing microarray software, data, and analysis procedures and results. Automation data packages can also be used as publication supplements. Similar automation mechanisms could be valuable to the research community if implemented in other genomics and bioinformatics software packages.

  16. Structural Analysis of Complex Networks

    CERN Document Server

    Dehmer, Matthias

    2011-01-01

    Filling a gap in literature, this self-contained book presents theoretical and application-oriented results that allow for a structural exploration of complex networks. The work focuses not only on classical graph-theoretic methods, but also demonstrates the usefulness of structural graph theory as a tool for solving interdisciplinary problems. Applications to biology, chemistry, linguistics, and data analysis are emphasized. The book is suitable for a broad, interdisciplinary readership of researchers, practitioners, and graduate students in discrete mathematics, statistics, computer science,

  17. Moving Toward an Optimal and Automated Geospatial Network for CCUS Infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Hoover, Brendan Arthur [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-05

    Modifications in the global climate are being driven by the anthropogenic release of greenhouse gases (GHG) including carbon dioxide (CO2) (Middleton et al. 2014). CO2 emissions have, for example, been directly linked to an increase in total global temperature (Seneviratne et al. 2016). Strategies that limit CO2 emissions—like CO2 capture, utilization, and storage (CCUS) technology—can greatly reduce emissions by capturing CO2 before it is released to the atmosphere. However, to date CCUS technology has not been developed at a large commercial scale despite several promising high profile demonstration projects (Middleton et al. 2015). Current CCUS research has often focused on capturing CO2 emissions from coal-fired power plants, but recent research at Los Alamos National Laboratory (LANL) suggests focusing CCUS CO2 capture research upon industrial sources might better encourage CCUS deployment. To further promote industrial CCUS deployment, this project builds off current LANL research by continuing the development of a software tool called SimCCS, which estimates a regional system of transport to inject CO2 into sedimentary basins. The goal of SimCCS, which was first developed by Middleton and Bielicki (2009), is to output an automated and optimal geospatial industrial CCUS pipeline that accounts for industrial source and sink locations by estimating a Delaunay triangle network which also minimizes topographic and social costs (Middleton and Bielicki 2009). Current development of SimCCS is focused on creating a new version that accounts for spatial arrangements that were not available in the previous version. This project specifically addresses the issue of non-unique Delaunay triangles by adding additional triangles to the network, which can affect how the CCUS network is calculated.

  18. Privacy Analysis in Mobile Social Networks

    DEFF Research Database (Denmark)

    Sapuppo, Antonio

    2012-01-01

    Nowadays, mobile social networks are capable of promoting social networking benefits during physical meetings, in order to leverage interpersonal affinities not only among acquaintances, but also between strangers. Due to their foundation on automated sharing of personal data in the physical surroundings of the user, these networks are subject to crucial privacy threats. Privacy management systems must be capable of accurate selection of data disclosure according to human data sensitivity evaluation. Therefore, it is crucial to research and comprehend an individual's personal information … factors: inquirer, purpose of disclosure, access & control of the disclosed information, location familiarity and current activity of the user. This research can serve as relevant input for the design of privacy management models in mobile social networks.

  19. Topological Analysis of Urban Drainage Networks

    Science.gov (United States)

    Yang, Soohyun; Paik, Kyungrock; McGrath, Gavan; Rao, Suresh

    2016-04-01

    Urban drainage networks are an essential component of infrastructure, and comprise the aggregation of underground pipe networks carrying storm water and domestic waste water for eventual discharge to natural stream networks. Growing urbanization has contributed to rapid expansion of sewer networks, vastly increasing their complexity and scale. Importance of sewer networks has been well studied from an engineering perspective, including resilient management, optimal design, and malfunctioning impact. Yet, analysis of the urban drainage networks using complex networks approach are lacking. Urban drainage networks consist of manholes and conduits, which correspond to nodes and edges, analogous to junctions and streams in river networks. Converging water flows in these two networks are driven by elevation gradient. In this sense, engineered urban drainage networks share several attributes of flows in river networks. These similarities between the two directed, converging flow networks serve the basis for us to hypothesize that the functional topology of sewer networks, like river networks, is scale-invariant. We analyzed the exceedance probability distribution of upstream area for practical sewer networks in South Korea. We found that the exceedance probability distributions of upstream area follow power-law, implying that the sewer networks exhibit topological self-similarity. The power-law exponents for the sewer networks were similar, and within the range reported from analysis of natural river networks. Thus, in line with our hypothesis, these results suggest that engineered urban drainage networks share functional topological attributes regardless of their structural dissimilarity or different underlying network evolution processes (natural vs. engineered). Implications of these findings for optimal design of sewer networks and for modeling sewer flows will be discussed.
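
    The exceedance-probability analysis described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the "upstream areas" are synthetic Pareto-distributed values standing in for real sewer data, and the exponent is estimated with a plain least-squares fit of log P against log A.

```python
import math
import random

def exceedance(values):
    """Return (sorted values, empirical exceedance probabilities P(A >= a))."""
    xs = sorted(values)
    n = len(xs)
    probs = [(n - i) / n for i in range(n)]   # P(A >= i-th smallest value)
    return xs, probs

def powerlaw_slope(xs, probs):
    """Least-squares slope of log P vs log x, i.e. the power-law exponent."""
    pts = [(math.log(x), math.log(p)) for x, p in zip(xs, probs) if x > 0 and p > 0]
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    num = sum((x - mx) * (y - my) for x, y in pts)
    den = sum((x - mx) ** 2 for x, _ in pts)
    return num / den

# Synthetic "upstream areas" drawn from a Pareto law with exponent -1.5
random.seed(0)
areas = [(1.0 - random.random()) ** (-1.0 / 1.5) for _ in range(10000)]
xs, probs = exceedance(areas)
slope = powerlaw_slope(xs, probs)
print(round(slope, 2))   # recovers an exponent close to -1.5
```

A straight line on the log-log plot of this distribution is the topological self-similarity signature the study looks for in real sewer networks.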

  20. Bittorrent Swarm Analysis Through Automation and Enhanced Logging

    Directory of Open Access Journals (Sweden)

    Răzvan Deaconescu

    2011-01-01

    Full Text Available Peer-to-Peer protocols currently form the most heavily used protocol class in the Internet, with BitTorrent, the most popular protocol for content distribution, as its flagship. A high number of studies and investigations have been undertaken to measure, analyse and improve the inner workings of the BitTorrent protocol. Approaches such as tracker message analysis, network probing and packet sniffing have been deployed to understand and enhance BitTorrent's internal behaviour. In this paper we present a novel approach that aims to collect, process and analyse large amounts of local peer information in BitTorrent swarms. We classify the information as periodic status information able to be monitored in real time and as verbose logging information to be used for subsequent analysis. We have designed and implemented a retrieval, storage and presentation infrastructure that enables easy analysis of BitTorrent protocol internals. Our approach can be employed both as a comparison tool, as well as a measurement system of how network characteristics and protocol implementation influence the overall BitTorrent swarm performance. We base our approach on a framework that allows easy swarm creation and control for different BitTorrent clients. With the help of a virtualized infrastructure and a client-server software layer we are able to create, command and manage large sized BitTorrent swarms. The framework allows a user to run, schedule, start, stop clients within a swarm and collect information regarding their behavior.

  1. Automated Design and Analysis Tool for CLV/CEV Composite and Metallic Structural Components Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation of the proposed effort is a unique automated process for the analysis, design, and sizing of CLV/CEV composite and metallic structures. This...

  2. Automated Design and Analysis Tool for CEV Structural and TPS Components Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation of the proposed effort is a unique automated process for the analysis, design, and sizing of CEV structures and TPS. This developed process will...

  3. Development of a fully automated online mixing system for SAXS protein structure analysis

    DEFF Research Database (Denmark)

    Nielsen, Søren Skou; Arleth, Lise

    2010-01-01

    This thesis presents the development of an automated high-throughput mixing and exposure system for Small-Angle Scattering analysis on a synchrotron using polymer microfluidics. Software and hardware for automated mixing, exposure control on a beamline, and automated data reduction and preliminary analysis are presented. Three mixing systems that have been the corner stones of the development process are presented, including a fully functioning high-throughput microfluidic system that is able to produce and expose 36 mixed samples per hour using 30 μL of sample volume. The system is tested...

  4. Google matrix analysis of directed networks

    Science.gov (United States)

    Ermann, Leonardo; Frahm, Klaus M.; Shepelyansky, Dima L.

    2015-10-01

    In the past decade modern societies have developed enormous communication and social networks. Their classification and information retrieval processing has become a formidable task for the society. Because of the rapid growth of the World Wide Web, and social and communication networks, new mathematical methods have been invented to characterize the properties of these networks in a more detailed and precise way. Various search engines extensively use such methods. It is highly important to develop new tools to classify and rank a massive amount of network information in a way that is adapted to internal network structures and characteristics. This review describes the Google matrix analysis of directed complex networks demonstrating its efficiency using various examples including the World Wide Web, Wikipedia, software architectures, world trade, social and citation networks, brain neural networks, DNA sequences, and Ulam networks. The analytical and numerical matrix methods used in this analysis originate from the fields of Markov chains, quantum chaos, and random matrix theory.
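
    The Google matrix construction underlying this review can be shown in a few lines. The sketch below is a generic illustration, not code from the review: it builds G = αS + (1 − α)E/N for a small invented directed graph and extracts the PageRank vector by power iteration.

```python
import numpy as np

# Hypothetical 4-node directed graph: node -> list of nodes it links to.
links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
N = 4
alpha = 0.85   # standard damping factor

# Column-stochastic link matrix S (each column sums to 1).
S = np.zeros((N, N))
for src, dsts in links.items():
    for d in dsts:
        S[d, src] = 1.0 / len(dsts)
# Dangling nodes (no outlinks) would get uniform columns; this graph has none.

# Google matrix: damped link structure plus uniform teleportation.
G = alpha * S + (1 - alpha) / N * np.ones((N, N))

# PageRank = leading eigenvector of G, found by power iteration.
p = np.full(N, 1.0 / N)
for _ in range(100):
    p = G @ p
p /= p.sum()
print(np.round(p, 3))   # stationary PageRank probabilities
```

Node 2, which receives links from three of the four nodes, ends up with the largest PageRank, illustrating how the matrix spectrum encodes the network's internal structure.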

  5. Google matrix analysis of directed networks

    CERN Document Server

    Ermann, Leonardo; Shepelyansky, Dima L

    2014-01-01

    In the past ten years, modern societies have developed enormous communication and social networks. Their classification and information retrieval processing has become a formidable task for society. Due to the rapid growth of the World Wide Web and social and communication networks, new mathematical methods have been invented to characterize the properties of these networks on a more detailed and precise level. Various search engines essentially use such methods. It is highly important to develop new tools to classify and rank the enormous amount of network information in a way adapted to internal network structures and characteristics. This review describes the Google matrix analysis of directed complex networks, demonstrating its efficiency on various examples including the World Wide Web, Wikipedia, software architecture, world trade, social and citation networks, brain neural networks, DNA sequences and Ulam networks. The analytical and numerical matrix methods used in this analysis originate from the fields of Markov chain...

  6. ONLINE SOCIAL NETWORK INTERNETWORKING ANALYSIS

    Directory of Open Access Journals (Sweden)

    Bassant E.Youssef

    2014-10-01

    Full Text Available Online social networks (OSNs) contain data about users, their relations, interests and daily activities, and the great value of this data results in the ever growing popularity of OSNs. There are two types of OSN data, semantic and topological. Both can be used to support decision-making processes in many applications such as information diffusion, viral marketing and epidemiology. Online social network analysis (OSNA) research is used to maximize the benefits gained from OSNs' data. This paper provides a comprehensive study of OSNs and OSNA to provide analysts with the knowledge needed to analyse OSNs. OSNs' internetworking was found to increase the wealth of the analysed data by depending on more than one OSN as the source of the analysed data. The paper proposes a generic model of an OSNs' internetworking system that an analyst can rely on. Two different data sources in OSNs were identified in our efforts to provide a thorough study of OSNs: the OSN user data and the OSN platform data. Additionally, we propose a classification of the OSN user data according to its analysis models for different data types, to shed some light on the currently used OSNA methodologies. We also highlight the different metrics and parameters that analysts can use to evaluate semantic or topological OSN user data. Further, we present a classification of the other data types and OSN platform data that can be used to compare the capabilities of different OSNs, whether separate or in an OSNs' internetworking system. To increase analysts' awareness of the available tools they can use, we overview some of the currently publicly available OSN datasets and simulation tools and identify whether they are capable of being used in semantic OSNA, topological OSNA, or both. The overview identifies that only a few datasets include both data types (semantic and topological), and there are few analysis tools that can perform analysis on both data types. Finally, the paper presents a scenario that

  7. Social network analysis community detection and evolution

    CERN Document Server

    Missaoui, Rokia

    2015-01-01

    This book is devoted to recent progress in social network analysis with a high focus on community detection and evolution. The eleven chapters cover the identification of cohesive groups, core components and key players either in static or dynamic networks of different kinds and levels of heterogeneity. Other important topics in social network analysis such as influential detection and maximization, information propagation, user behavior analysis, as well as network modeling and visualization are also presented. Many studies are validated through real social networks such as Twitter. This edit

  8. Automated Production Flow Line Failure Rate Mathematical Analysis with Probability Theory

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2014-12-01

    Full Text Available Automated lines have been widely used in industry, especially for mass production and product customization. The productivity of an automated line is a crucial indicator of the output and performance of production. Failure or breakdown of stations or mechanisms commonly occurs in automated lines under real conditions due to technological and technical problems, which strongly affects productivity. The failure rates of automated lines are usually not expressed or analysed in mathematical form. This paper presents a mathematical analysis, using probability theory, of failure conditions in automated lines. The mathematical expression for failure rates can produce and forecast the productivity output accurately
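
    A hedged sketch of the kind of probability-based productivity estimate such papers work with (the exact model in the paper is not reproduced here, and all numbers are invented): for stations in series with independent random failures, the per-cycle failure rates add, and productivity is the ideal cycle rate scaled by availability.

```python
# Serial automated line: any station failure stops the whole line,
# so per-cycle failure rates simply sum. All figures are assumed.
station_failure_rates = [0.002, 0.004, 0.001]  # failures per cycle, per station
mean_repair_cycles = 25.0                      # average downtime per failure, in cycles
cycle_time_s = 12.0                            # ideal (failure-free) cycle time

lam = sum(station_failure_rates)               # line failure rate per cycle
# Fraction of time the line is actually producing:
availability = 1.0 / (1.0 + lam * mean_repair_cycles)
# Actual output = ideal cycles per hour * availability
productivity_per_hour = 3600.0 / cycle_time_s * availability
print(round(availability, 3), round(productivity_per_hour, 1))
```

The same structure lets one forecast how adding a station (another term in the sum) or speeding up repairs (smaller mean_repair_cycles) changes throughput.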

  9. Automated shock detection and analysis algorithm for space weather application

    Science.gov (United States)

    Vorotnikov, Vasiliy S.; Smith, Charles W.; Hu, Qiang; Szabo, Adam; Skoug, Ruth M.; Cohen, Christina M. S.

    2008-03-01

    Space weather applications have grown steadily as real-time data have become increasingly available. Numerous industrial applications have arisen, with safeguarding of the power distribution grids being a particular interest. NASA uses short-term and long-term space weather predictions in its launch facilities. Researchers studying ionospheric, auroral, and magnetospheric disturbances use real-time space weather services to determine launch times. Commercial airlines, communication companies, and the military use space weather measurements to manage their resources and activities. As the effects of solar transients upon the Earth's environment and society grow with the increasing complexity of technology, better tools are needed to monitor and evaluate the characteristics of the incoming disturbances. There is a need for automated shock detection and analysis methods that are applicable to in situ measurements upstream of the Earth. Such tools can provide advance warning of approaching disturbances that have significant space weather impacts. Knowledge of the shock strength and speed can also provide insight into the nature of the approaching solar transient prior to arrival at the magnetopause. We report on efforts to develop a tool that can find and analyze shocks in interplanetary plasma data without operator intervention. This method will run with sufficient speed to be a practical space weather tool providing useful shock information within 1 min of having the necessary data to ground. The ability to run without human intervention frees space weather operators to perform other vital services. We describe ways of handling upstream data that minimize the frequency of false positive alerts while providing the most complete description of approaching disturbances that is reasonably possible.
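
    The flavor of such automated detection can be conveyed with a toy sliding-window detector. This is an illustration only, not the paper's algorithm: real shock criteria involve coordinated jumps satisfying the Rankine-Hugoniot conditions, while the sketch below just flags simultaneous abrupt increases in synthetic speed and density series, with invented thresholds.

```python
# Flag indices where the mean over the window after a point exceeds the
# mean over the window before it by a given ratio.
def detect_jumps(series, window=5, ratio=1.2):
    hits = []
    for i in range(window, len(series) - window):
        before = sum(series[i - window:i]) / window
        after = sum(series[i:i + window]) / window
        if after > ratio * before:
            hits.append(i)
    return hits

# Synthetic solar-wind data with a step (a shock-like jump) at index 20.
speed = [400.0] * 20 + [520.0] * 20     # km/s
density = [5.0] * 20 + [9.0] * 20       # cm^-3

# A candidate shock requires the jump in both quantities at once.
candidates = sorted(set(detect_jumps(speed)) & set(detect_jumps(density)))
print(candidates)
```

Requiring coincident jumps in several plasma parameters is one simple way to keep the false-positive rate down, the same concern the paper addresses for operational alerts.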

  10. Isochronous wireless network for real-time communication in industrial automation

    CERN Document Server

    Trsek, Henning

    2016-01-01

    This dissertation proposes and investigates an isochronous wireless network for industrial control applications with guaranteed latencies and jitter. Based on a requirements analysis of real industrial applications and the characterisation of the wireless channel, the solution approach is developed. It consists of a TDMA-based medium access control, a dynamic resource allocation and the provision of a global time base for the wired and the wireless network. Due to the global time base, the solution approach allows a seamless and synchronous integration into existing wired Real-time Ethernet systems.

  11. NOA: a novel Network Ontology Analysis method

    OpenAIRE

    Wang, Jiguang; Huang, Qiang; Liu, Zhi-Ping; Wang, Yong; Wu, Ling-Yun; Chen, Luonan; Zhang, Xiang-Sun

    2011-01-01

    Gene ontology analysis has become a popular and important tool in bioinformatics studies, and current ontology analyses are mainly conducted on an individual gene or a gene list. However, recent molecular network analysis reveals that the same list of genes with different interactions may perform different functions. Therefore, it is necessary to consider molecular interactions to correctly and specifically annotate biological networks. Here, we propose a novel Network Ontology Analysis (NOA) meth...

  12. A simple and robust method for automated photometric classification of supernovae using neural networks

    CERN Document Server

    Karpenka, N V; Hobson, M P

    2012-01-01

    A method is presented for automated photometric classification of supernovae (SNe) as Type-Ia or non-Ia. A two-step approach is adopted in which: (i) the SN lightcurve flux measurements in each observing filter are fitted separately; and (ii) the fitted function parameters and their associated uncertainties, along with the number of flux measurements, the maximum-likelihood value of the fit and Bayesian evidence for the model, are used as the input feature vector to a classification neural network (NN) that outputs the probability that the SN under consideration is of Type-Ia. The method is trained and tested using data released following the SuperNova Photometric Classification Challenge (SNPCC). We consider several random divisions of the data into training and testing sets: for instance, for our sample D_1 (D_4), a total of 10% (40%) of the data are involved in training the algorithm and the remainder used for blind testing of the resulting classifier; we make no selection cuts. Assigning a canonical thres...

  13. BitTorrent Swarm Analysis through Automation and Enhanced Logging

    CERN Document Server

    Deaconescu, Răzvan; Drăghici, Adriana; Tăpus, Nicolae

    2011-01-01

    Peer-to-Peer protocols currently form the most heavily used protocol class in the Internet, with BitTorrent, the most popular protocol for content distribution, as its flagship. A high number of studies and investigations have been undertaken to measure, analyse and improve the inner workings of the BitTorrent protocol. Approaches such as tracker message analysis, network probing and packet sniffing have been deployed to understand and enhance BitTorrent's internal behaviour. In this paper we present a novel approach that aims to collect, process and analyse large amounts of local peer information in BitTorrent swarms. We classify the information as periodic status information able to be monitored in real time and as verbose logging information to be used for subsequent analysis. We have designed and implemented a retrieval, storage and presentation infrastructure that enables easy analysis of BitTorrent protocol internals. Our approach can be employed both as a comparison tool, as well as a measurement syste...

  14. Future Control and Automation : Proceedings of the 2nd International Conference on Future Control and Automation

    CERN Document Server

    2012-01-01

    This volume, Future Control and Automation - Volume 2, includes the best papers from the 2012 2nd International Conference on Future Control and Automation (ICFCA 2012), held on July 1-2, 2012, in Changsha, China. Future control and automation is the use of control systems and information technologies to reduce the need for human work in the production of goods and services. This volume can be divided into six sessions on the basis of the classification of the manuscripts considered, listed as follows: Mathematical Modeling, Analysis and Computation; Control Engineering; Reliable Networks Design; Vehicular Communications and Networking; Automation and Mechatronics.

  15. Quantitive and sociological analysis of blog networks

    CERN Document Server

    Bachnik, Wiktor; Szymczyk, Stanislaw; Leszczynski, Piotr; Podsiadlo, Rafal; Rymszewicz, Ewa; Kurylo, Lukasz; Makowiec, Danuta; Bykowska, Beata

    2005-01-01

    This paper examines the emerging phenomenon of blogging, using three different Polish blogging services as the base of the research. The authors show that blog networks share the characteristics of complex networks (gamma coefficients, small worlds, cliques, etc.). Elements of sociometric analysis were used to prove the existence of social structures in the blog networks.

  16. Network stratification analysis for identifying function-specific network layers.

    Science.gov (United States)

    Zhang, Chuanchao; Wang, Jiguang; Zhang, Chao; Liu, Juan; Xu, Dong; Chen, Luonan

    2016-04-22

    A major challenge of systems biology is to capture the rewiring of biological functions (e.g. signaling pathways) in a molecular network. To address this problem, we proposed a novel computational framework, namely network stratification analysis (NetSA), to stratify the whole biological network into various function-specific network layers corresponding to particular functions (e.g. KEGG pathways), which transform the network analysis from the gene level to the functional level by integrating expression data, the gene/protein network and gene ontology information altogether. The application of NetSA in yeast and its comparison with a traditional network-partition both suggest that NetSA can more effectively reveal functional implications of network rewiring and extract significant phenotype-related biological processes. Furthermore, for time-series or stage-wise data, the function-specific network layer obtained by NetSA is also shown to be able to characterize the disease progression in a dynamic manner. In particular, when applying NetSA to hepatocellular carcinoma and type 1 diabetes, we can derive functional spectra regarding the progression of the disease, and capture active biological functions (i.e. active pathways) in different disease stages. The additional comparison between NetSA and SPIA illustrates again that NetSA could discover more complete biological functions during disease progression. Overall, NetSA provides a general framework to stratify a network into various layers of function-specific sub-networks, which can not only analyze a biological network on the functional level but also investigate gene rewiring patterns in biological processes. PMID:26879865
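
    The core stratification idea, restricting a molecular network to the subnetwork induced by the genes annotated with each function, can be sketched in a few lines. This is a schematic illustration of the concept only, not the NetSA implementation; the gene names, edges, and annotation sets below are invented placeholders.

```python
# Toy gene network and two invented function annotations (e.g. KEGG-style).
edges = [("A", "B"), ("B", "C"), ("C", "D"), ("A", "D"), ("D", "E")]
annot = {"apoptosis": {"A", "B", "D"}, "cell_cycle": {"C", "D", "E"}}

# Each function-specific "layer" keeps only edges whose two endpoints
# both carry that function's annotation.
layers = {
    fn: [(u, v) for u, v in edges if u in genes and v in genes]
    for fn, genes in annot.items()
}
print(layers)
```

Comparing how a layer's edges change between conditions or disease stages is then a functional-level view of network rewiring, rather than a gene-level one.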

  17. Prototype Software for Automated Structural Analysis of Systems

    DEFF Research Database (Denmark)

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

    In this paper we present a prototype software tool that is developed to analyse the structural model of automated systems in order to identify redundant information that is then utilized for fault detection and isolation (FDI) purposes. The dedicated algorithms in this software tool use a tri...

  18. Analysis by fracture network modelling

    International Nuclear Information System (INIS)

    This report describes the Fracture Network Modelling and Performance Assessment Support performed by Golder Associates Inc. during the Heisei-11 (1999-2000) fiscal year. The primary objective of the Golder Associates work scope during HY-11 was to provide theoretical and review support to the JNC HY-12 performance assessment effort. In addition, Golder Associates provided technical support to JNC for the Aespoe Project. Major efforts for performance assessment support included analysis of PAWorks pathways and software documentation, verification, and performance assessment visualization. Support for the Aespoe Project included 'Task 4' predictive modelling of sorbing tracer transport in the TRUE-1 rock block, and integrated hydrogeological and geochemical modelling of Aespoe island for 'Task 5'. Technical information about Golder Associates HY-11 support to JNC is provided in the appendices to this report. (author)

  19. Statistical Analysis of Bus Networks in India

    CERN Document Server

    Chatterjee, Atanu; Ramadurai, Gitakrishnan

    2015-01-01

    Through the past decade the field of network science has established itself as a common ground for the cross-fertilization of exciting inter-disciplinary studies which has motivated researchers to model almost every physical system as an interacting network consisting of nodes and links. Although public transport networks such as airline and railway networks have been extensively studied, the status of bus networks still remains in obscurity. In developing countries like India, where bus networks play an important role in day-to-day commutation, it is of significant interest to analyze its topological structure and answer some of the basic questions on its evolution, growth, robustness and resiliency. In this paper, we model the bus networks of major Indian cities as graphs in L-space, and evaluate their various statistical properties using concepts from network science. Our analysis reveals a wide spectrum of network topology with the common underlying feature of small-world property. We observe tha...
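
    The kind of L-space analysis described, where nodes are bus stops and edges join consecutive stops on a route, reduces to standard graph statistics. The sketch below is illustrative only, on an invented six-stop network: it computes the two small-world ingredients, average shortest path length (via BFS) and average clustering coefficient, in pure Python.

```python
from collections import deque

# Hypothetical L-space bus network: stops 0-5, edges between consecutive stops.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2), (3, 4), (4, 5)]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def bfs_dists(src):
    """Hop distances from src to every reachable node."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    return dist

nodes = sorted(adj)
pairs, total = 0, 0
for s in nodes:
    d = bfs_dists(s)
    for t in nodes:
        if t != s:
            pairs += 1
            total += d[t]
avg_path = total / pairs            # average shortest path length

def clustering(u):
    """Fraction of a node's neighbour pairs that are themselves linked."""
    nbrs = list(adj[u])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in adj[nbrs[i]])
    return 2.0 * links / (k * (k - 1))

avg_clust = sum(clustering(u) for u in nodes) / len(nodes)
print(round(avg_path, 3), round(avg_clust, 3))
```

A short average path combined with clustering well above that of a random graph of the same size is the small-world signature the paper reports for Indian bus networks.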

  20. Multilayer motif analysis of brain networks

    OpenAIRE

    Battiston, Federico; Nicosia, Vincenzo; Chavez, Mario; Latora, Vito

    2016-01-01

    In the last decade network science has shed new light on the anatomical connectivity and on correlations in the activity of different areas of the human brain. The study of brain networks has in fact made it possible to detect the central areas of a neural system, and to identify its building blocks by looking at overabundant small subgraphs, known as motifs. However, network analysis of the brain has so far mainly focused on structural and functional networks as separate entities. The recently ...

  1. Analysis of the Features of Network Words

    Institute of Scientific and Technical Information of China (English)

    阳艳萍

    2015-01-01

    The information society has made people's lives increasingly digital, and the popularity of the Internet has given rise to the unique phenomenon of network words. What impact does the combination of the network and language bring about? This article explores the relation between the phenomenon of network words and social context from the angle of sociolinguistics, through an analysis of network words and their grammatical features.

  2. DeepOrgan: Multi-level Deep Convolutional Networks for Automated Pancreas Segmentation

    OpenAIRE

    Roth, Holger R.; Lu, Le; Farag, Amal; Shin, Hoo-Chang; Liu, Jiamin; Turkbey, Evrim; Summers, Ronald M.

    2015-01-01

    Automatic organ segmentation is an important yet challenging problem for medical image analysis. The pancreas is an abdominal organ with very high anatomical variability. This inhibits previous segmentation methods from achieving high accuracies, especially compared to other organs such as the liver, heart or kidneys. In this paper, we present a probabilistic bottom-up approach for pancreas segmentation in abdominal computed tomography (CT) scans, using multi-level deep convolutional networks...

  3. Extending and automating a Systems-Theoretic hazard analysis for requirements generation and analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, John (Massachusetts Institute of Technology)

    2012-05-01

    Systems Theoretic Process Analysis (STPA) is a powerful new hazard analysis method designed to go beyond traditional safety techniques - such as Fault Tree Analysis (FTA) - that overlook important causes of accidents like flawed requirements, dysfunctional component interactions, and software errors. While proving to be very effective on real systems, no formal structure has been defined for STPA and its application has been ad-hoc with no rigorous procedures or model-based design tools. This report defines a formal mathematical structure underlying STPA and describes a procedure for systematically performing an STPA analysis based on that structure. A method for using the results of the hazard analysis to generate formal safety-critical, model-based system and software requirements is also presented. Techniques to automate both the analysis and the requirements generation are introduced, as well as a method to detect conflicts between the safety and other functional model-based requirements during early development of the system.

  4. Social network analysis and supply chain management

    Directory of Open Access Journals (Sweden)

    Raúl Rodríguez Rodríguez

    2016-01-01

    Full Text Available This paper deals with social network analysis and how it could be integrated within supply chain management from a decision-making point of view. Even though the benefits of using social network analysis are widely accepted in both academic and industry/services contexts, there is still a lack of solid frameworks that allow decision-makers to connect the usage and results of social network analysis (mainly information and knowledge flows and derived results) with supply chain management objectives and goals. This paper gives an overview of social network analysis and the main social network analysis metrics, discusses supply chain performance and, finally, identifies how future frameworks could close the gap and link the results of social network analysis with supply chain management decision-making processes.
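
    As a concrete illustration of the metrics mentioned above, the sketch below builds a toy information-flow network between supply chain actors and computes two common SNA metrics with networkx. The actors and links are invented for illustration.

```python
import networkx as nx

# Toy information-flow network between supply chain actors (invented names).
G = nx.Graph([
    ("supplier_1", "manufacturer"), ("supplier_2", "manufacturer"),
    ("manufacturer", "distributor"), ("distributor", "retailer_1"),
    ("distributor", "retailer_2"), ("retailer_1", "customer_hub"),
    ("retailer_2", "customer_hub"),
])

# Degree centrality: share of other actors each node exchanges information with.
degree = nx.degree_centrality(G)
# Betweenness centrality: how often an actor lies on shortest information
# paths between others, a proxy for brokers and potential bottlenecks.
betweenness = nx.betweenness_centrality(G)

for actor in G:
    print(f"{actor:13s} degree={degree[actor]:.2f} "
          f"betweenness={betweenness[actor]:.2f}")
```

    Here the distributor scores highest on betweenness, flagging it as the actor whose failure would most disrupt information flow through the chain.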

  5. Automated red blood cell analysis compared with routine red blood cell morphology by smear review

    OpenAIRE

    Dr.Poonam Radadiya; Dr.Nandita Mehta; Dr.Hansa Goswami; Dr.R.N.Gonsai

    2015-01-01

    The RBC histogram is an integral part of automated haematology analysis and is now routinely available on all automated cell counters. This histogram and other associated complete blood count (CBC) parameters have been found abnormal in various haematological conditions and may provide major clues in the diagnosis and management of significant red cell disorders. Performing manual blood smears is important to ensure the quality of blood count results an...

  6. Kinetics analysis and automated online screening of aminocarbonylation of aryl halides in flow

    OpenAIRE

    Moore, Jason S.; Smith, Christopher D; Jensen, Klavs F.

    2016-01-01

    Temperature, pressure, gas stoichiometry, and residence time were varied to control the yield and product distribution of the palladium-catalyzed aminocarbonylation of aromatic bromides in both a silicon microreactor and a packed-bed tubular reactor. Automation of the system set points and product sampling enabled facile and repeatable reaction analysis with minimal operator supervision. It was observed that the reaction was divided into two temperature regimes. An automated system was used t...

  7. Automated Source Code Analysis to Identify and Remove Software Security Vulnerabilities: Case Studies on Java Programs

    OpenAIRE

    2013-01-01

    The high-level contribution of this paper is to illustrate the development of generic solution strategies to remove software security vulnerabilities that could be identified using automated tools for source code analysis on software programs (developed in Java). We use the Source Code Analyzer and Audit Workbench automated tools, developed by HP Fortify Inc., for our testing purposes. We present case studies involving a file writer program embedded with features for password validation, and ...

  8. Social Network Analysis and informal trade

    OpenAIRE

    Walther, Olivier

    2015-01-01

    The objective of this article is to show how a formal approach to social networks can be applied to better understand informal trade in developing countries, with a particular focus on Africa. The paper starts by discussing some of the fundamental concepts developed by social network analysis. Through a number of case studies, we show how social network analysis can illuminate the relevant causes of social patterns, the impact of social ties on economic performance, the diffusion of resources...

  9. Cross-Disciplinary Detection and Analysis of Network Motifs

    OpenAIRE

    Ngoc Tam L. Tran; Luke DeLuccia; McDonald, Aidan F; Chun-Hsi Huang

    2015-01-01

    The detection of network motifs has recently become an important part of network analysis across all disciplines. In this work, we detected and analyzed network motifs in undirected and directed networks from several different disciplines, including biological, social, and ecological networks, as well as other networks such as airline, power grid, and political-book co-purchase networks. Our analysis revealed that undirected networks are similar at the basic three and four nod...
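
    For undirected networks, the smallest motif census distinguishes the two connected three-node subgraphs: open triads (paths of length 2) and triangles. A minimal sketch with networkx, using its bundled karate club graph as a stand-in dataset:

```python
import networkx as nx

# Smallest undirected motif census: connected three-node subgraphs are either
# open triads (paths of length 2) or triangles.
G = nx.karate_club_graph()  # classic test network bundled with networkx

triangles = sum(nx.triangles(G).values()) // 3
# Each node with degree d anchors d*(d-1)/2 paths of length 2;
# every triangle accounts for exactly 3 of those paths.
paths2 = sum(d * (d - 1) // 2 for _, d in G.degree())
open_triads = paths2 - 3 * triangles
print(f"triangles={triangles} open_triads={open_triads} "
      f"transitivity={3 * triangles / paths2:.3f}")
```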

  10. Automated Tetrahedral Mesh Generation for CFD Analysis of Aircraft in Conceptual Design

    Science.gov (United States)

    Ordaz, Irian; Li, Wu; Campbell, Richard L.

    2014-01-01

    The paper introduces an automation process of generating a tetrahedral mesh for computational fluid dynamics (CFD) analysis of aircraft configurations in early conceptual design. The method was developed for CFD-based sonic boom analysis of supersonic configurations, but can be applied to aerodynamic analysis of aircraft configurations in any flight regime.

  11. Methods of automated cell analysis and their application in radiation biology

    International Nuclear Information System (INIS)

    The present review is concerned with methods for the automated analysis of biological microobjects and covers the two groups into which all automated analysis systems can be divided: flow-type systems (flow cytometry) and scanning-type systems (image analysis systems). Particular emphasis is placed on their use in radiobiological studies, namely in the micronucleus test, a cytogenetic assay commonly used at present for monitoring the clastogenic action of ionizing radiation. Examples of using the described methods and actual setups in other biomedical research are given. An analysis of the advantages and disadvantages of the methods of automated cell analysis makes it possible to choose more deliberately between flow-type and scanning-type systems for use in a particular study.

  12. An investigation and comparison on network performance analysis

    OpenAIRE

    2012-01-01

    This thesis is generally about network performance analysis. It contains two parts. The theory part summarizes what network performance is and introduces methods for carrying out network performance analysis. To answer what network performance is, a study of network services is conducted. Based on this background research, two important network performance metrics, network delay and throughput, should be included in network performance analysis. Among the methods of network a...

  13. SOCIAL NETWORK ANALYSIS ON GRAPH THEORY

    Directory of Open Access Journals (Sweden)

    DHARMAIAH GURRAM AND N.VEDAVATHI

    2013-02-01

    Full Text Available Although graph theory is one of the younger branches of mathematics, it is fundamental to a number of applied fields, including operations research, computer science, and social network analysis. In this paper we discuss the basic concepts of graph theory from the point of view of social network analysis.

  14. A novel automated image analysis method for accurate adipocyte quantification

    OpenAIRE

    Osman, Osman S.; Selway, Joanne L; Kępczyńska, Małgorzata A; Stocker, Claire J.; O’Dowd, Jacqueline F; Cawthorne, Michael A.; Arch, Jonathan RS; Jassim, Sabah; Langlands, Kenneth

    2013-01-01

    Increased adipocyte size and number are associated with many of the adverse effects observed in metabolic disease states. While methods to quantify such changes in the adipocyte are of scientific and clinical interest, manual methods to determine adipocyte size are both laborious and intractable to large scale investigations. Moreover, existing computational methods are not fully automated. We, therefore, developed a novel automatic method to provide accurate measurements of the cross-section...

  15. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    OpenAIRE

    Aghaeepour, Nima; Finak, Greg; ,; Hoos, Holger; Mosmann, Tim R; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manu...

  16. Object Type Recognition for Automated Analysis of Protein Subcellular Location

    OpenAIRE

    Zhao, Ting; Velliste, Meel; Boland, Michael V.; Murphy, Robert F.

    2005-01-01

    The new field of location proteomics seeks to provide a comprehensive, objective characterization of the subcellular locations of all proteins expressed in a given cell type. Previous work has demonstrated that automated classifiers can recognize the patterns of all major subcellular organelles and structures in fluorescence microscope images with high accuracy. However, since some proteins may be present in more than one organelle, this paper addresses a more difficult task: recognizing a pa...

  17. Automated forensic extraction of encryption keys using behavioural analysis

    OpenAIRE

    Owen, Gareth

    2012-01-01

    In this paper we describe a technique for automatic algorithm identification and information extraction from unknown binaries. We emulate the binary using PyEmu forcing complete code coverage whilst simultaneously examining its behavior. Our behavior matcher then identifies specific algorithmic behavior and extracts information. We demonstrate the use of this technique for automated extraction of encryption keys from an unseen program with no prior knowledge about its implementation. Our tech...

  18. AutoGate: automating analysis of flow cytometry data

    OpenAIRE

    Meehan, Stephen; Walther, Guenther; Moore, Wayne; Orlova, Darya; Meehan, Connor; Parks, David; Ghosn, Eliver; Philips, Megan; Mitsunaga, Erin; Waters, Jeffrey; Kantor, Aaron; Okamura, Ross; Owumi, Solomon; Yang, Yang; Herzenberg, Leonard A.

    2014-01-01

    Nowadays, one can hardly imagine biology and medicine without flow cytometry to measure CD4 T cell counts in HIV, follow bone marrow transplant patients, characterize leukemias, etc. Similarly, without flow cytometry, there would be a bleak future for stem cell deployment, HIV drug development and full characterization of the cells and cell interactions in the immune system. But while flow instruments have improved markedly, the development of automated tools for processing and analyzing flow...

  19. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    OpenAIRE

    Tianhong Song; Sven Köhler; Bertram Ludäscher; James Hanken; Maureen Kelly; David Lowery; Macklin, James A.; Morris, Paul J.; Morris, Robert A.

    2014-01-01

    Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfal...

  20. 3D Assembly Group Analysis for Cognitive Automation

    OpenAIRE

    Christian Brecher; Thomas Breitbach; Simon Müller; Marcel Ph. Mayer; Barbara Odenthal; Schlick, Christopher M.; Werner Herfs

    2012-01-01

    A concept that allows the cognitive automation of robotic assembly processes is introduced. An assembly cell comprised of two robots was designed to verify the concept. For the purpose of validation a customer-defined part group consisting of Hubelino bricks is assembled. One of the key aspects for this process is the verification of the assembly group. Hence a software component was designed that utilizes the Microsoft Kinect to perceive both depth and color data in the assembly area. This i...

  1. Analysis of Defense Language Institute automated student questionnaire data

    OpenAIRE

    Strycharz, Theodore M.

    1996-01-01

    This thesis explores the dimensionality of the Defense Language Institute's (DLI) primary student feedback tool, the Automated Student Questionnaire (ASQ). In addition, a data set from ASQ 2.0 (the newest version) is analyzed for trends in student satisfaction across the sub-scales of sex, pay grade, and Defense Language Proficiency Test (DLPT) results. The method of principal components is used to derive initial factors. Although an interpretation of those factors seems plausible, these are...

  2. Library Automation, Networking, and Other Online and New Technology Costs in Academic Libraries.

    Science.gov (United States)

    Pastine, Maureen; Kacena, Carolyn

    1994-01-01

    Presents budgeting requirements that are needed to meet the electronic library needs in small- and medium-sized academic libraries based on library automation activities at Southern Methodist University (Texas) and a review of the literature. Highlights include cooperative ventures; fund raising; personnel needs; benefits of automation; strategic…

  3. Constructing an Intelligent Patent Network Analysis Method

    OpenAIRE

    Wu, Chao-Chan; Yao, Ching-Bang

    2012-01-01

    Patent network analysis, an advanced method of patent analysis, is a useful tool for technology management. This method visually displays all the relationships among the patents and enables the analysts to intuitively comprehend the overview of a set of patents in the field of the technology being studied. Although patent network analysis possesses relative advantages different from traditional methods of patent analysis, it is subject to several crucial limitations. To overcome the drawbacks...

  4. Dynamic Analysis of Structures Using Neural Networks

    Directory of Open Access Journals (Sweden)

    N. Ahmadi

    2008-01-01

    Full Text Available In recent years, neural networks have come to be considered the best candidates for fast approximation with arbitrary accuracy in time-consuming problems. Dynamic analysis of structures under earthquake loading is such a time-consuming process. We employed two kinds of neural networks, the Generalized Regression neural network (GR) and the Back-Propagation Wavenet neural network (BPW), for approximating the dynamic time-history response of frame structures. GR is a traditional radial basis function neural network, while BPW is categorized as a wavelet neural network. In BPW, the sigmoid activation functions of the hidden-layer neurons are substituted with wavelets, and weight training is performed using the Scaled Conjugate Gradient (SCG) algorithm. Comparing the results of BPW with those of GR in the dynamic analysis of an eight-story steel frame indicates that the accuracy of a properly trained BPW was better than that of GR; therefore, BPW can be efficiently used for approximate dynamic analysis of structures.
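
    A GR network of the kind compared above is essentially Nadaraya-Watson kernel regression with one Gaussian RBF unit per training sample. The sketch below illustrates the idea on a synthetic signal; the data and bandwidth are invented, not taken from the paper.

```python
import numpy as np

def grnn_predict(x_train, y_train, x_query, sigma=0.3):
    """Generalized Regression NN: kernel-weighted mean of training targets."""
    d2 = (x_query[:, None] - x_train[None, :]) ** 2   # squared distances
    w = np.exp(-d2 / (2 * sigma ** 2))                # one RBF unit per sample
    return (w @ y_train) / w.sum(axis=1)

# Synthetic stand-in for a structural response history.
x = np.linspace(0, 2 * np.pi, 40)
y = np.sin(x)
x_new = np.array([1.0, 2.5, 4.0])
print(grnn_predict(x, y, x_new))  # close to sin at the query points
```

    Unlike BPW, no iterative weight training is needed: the training samples themselves are the hidden units, which is why GR networks train instantly but generalize less sharply.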

  5. Development of an automated technique for failure modes and effect analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Bagnoli, F.;

    Advances in automation have provided integration of monitoring and control functions to enhance the operator's overview and ability to take remedy actions when faults occur. Automation in plant supervision is technically possible with integrated automation systems as platforms, but new design methods are needed to cope efficiently with the complexity and to ensure that the functionality of a supervisor is correct and consistent. In particular these methods are expected to significantly improve fault tolerance of the designed systems. The purpose of this work is to develop a software module implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As main result, this technique will provide the design engineer with decision tables for fault handling...
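
    The matrix formulation mentioned above can be illustrated with a small reachability sketch: a propagation matrix is iterated to its transitive closure, yielding the end effects of each single failure mode. The components and links here are invented for illustration, not taken from the report.

```python
import numpy as np

# Hypothetical propagation matrix for a four-component chain:
# P[i, j] = 1 means a failure in component j propagates to component i.
components = ["sensor", "controller", "actuator", "plant"]
P = np.array([
    [0, 0, 0, 0],   # sensor: no upstream influences modelled
    [1, 0, 0, 0],   # controller is affected by the sensor
    [0, 1, 0, 0],   # actuator is affected by the controller
    [0, 0, 1, 0],   # plant is affected by the actuator
])

# Transitive closure by repeated Boolean-style matrix products.
reach = P.copy()
for _ in components:
    reach = np.minimum(reach + reach @ P, 1)

for j, source in enumerate(components):
    effects = [components[i] for i in range(len(components)) if reach[i, j]]
    print(f"failure in {source!r} propagates to: {effects}")
```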

  6. Development of an Automated Technique for Failure Modes and Effect Analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Allasia, G.;

    1999-01-01

    Advances in automation have provided integration of monitoring and control functions to enhance the operator's overview and ability to take remedy actions when faults occur. Automation in plant supervision is technically possible with integrated automation systems as platforms, but new design methods are needed to cope efficiently with the complexity and to ensure that the functionality of a supervisor is correct and consistent. In particular these methods are expected to significantly improve fault tolerance of the designed systems. The purpose of this work is to develop a software module implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As main result, this technique will provide the design engineer with decision tables for fault handling...

  7. Engineering Mathematical Analysis Method for Productivity Rate in Linear Arrangement Serial Structure Automated Flow Assembly Line

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2015-01-01

    Full Text Available Productivity rate (Q, or production rate) is one of the important indicator criteria for industrial engineers seeking to improve the system and the finished-goods output of a production or assembly line. Mathematical and statistical analysis methods need to be applied to the productivity rate in industry to give visual overviews of the failure factors and to guide further improvement within the production line, especially for automated flow lines, which are complicated. A mathematical model of the productivity rate in a linear-arrangement, serial-structure automated flow line with different failure rates and bottleneck machining time parameters is the basic model for this productivity analysis. This paper presents an engineering mathematical analysis method applied in an automotive company in Malaysia that operates an automated flow assembly line in final assembly to produce motorcycles. The DCAS engineering and mathematical analysis method, which consists of four stages known as data collection, calculation and comparison, analysis, and sustainable improvement, is used to analyze productivity in the automated flow assembly line based on the particular mathematical model. The various failure rates that cause loss of productivity and the bottleneck machining time are shown specifically in mathematical figures, and a sustainable solution for productivity improvement of this final assembly automated flow line is presented.
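
    The structure of such a model can be sketched with a generic serial-line calculation. Both the model form and the numbers below are assumptions for illustration only, not the paper's actual model: throughput is taken as availability divided by cycle time, with availability shrinking as station failure rates and repair times grow.

```python
# Illustrative calculation only: a generic serial-line productivity model of
# the form Q = availability / cycle_time. The model form and all numbers are
# invented for illustration, not taken from the paper.
t_bottleneck = 0.8                    # slowest station time, min per part
t_aux = 0.2                           # transport/auxiliary time, min per part
failure_rates = [0.01, 0.02, 0.005]   # failures per minute, one per station
mean_repair_time = 5.0                # minutes per failure

cycle_time = t_bottleneck + t_aux
availability = 1.0 / (1.0 + sum(failure_rates) * mean_repair_time)
Q = availability / cycle_time
print(f"cycle={cycle_time:.2f} min  availability={availability:.3f}  "
      f"Q={Q:.3f} parts/min")
```

    Even this toy version shows why the bottleneck station and the stations with the highest failure rates are the natural targets for improvement: both terms enter the denominator of Q.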

  8. Physical explosion analysis in heat exchanger network design

    Science.gov (United States)

    Pasha, M.; Zaini, D.; Shariff, A. M.

    2016-06-01

    The failure of shell and tube heat exchangers is extensively experienced by the chemical process industries. This failure can cause a loss of production for a long duration. Moreover, loss of containment through a heat exchanger could potentially lead to a credible event such as fire, explosion, or toxic release. There is a need to analyse the possible worst-case effects originating from the loss of containment of a heat exchanger at the early design stage. Physical explosion analysis during heat exchanger network design is presented in this work. The Baker and Prugh explosion models are deployed for assessing the explosion effect. Microsoft Excel was integrated with a process design simulator through object linking and embedding (OLE) automation for this analysis, with Aspen HYSYS V8.0 used as the simulation platform. A typical heat exchanger network of a steam reforming and shift conversion process is presented as a case study. This analysis shows that the overpressure generated from the physical explosion of each heat exchanger can be estimated more precisely using the Prugh model. The present work could potentially assist the design engineer in identifying the critical heat exchanger in the network at the preliminary design stage.

  9. Application of quantum dots as analytical tools in automated chemical analysis: A review

    Energy Technology Data Exchange (ETDEWEB)

    Frigerio, Christian; Ribeiro, David S.M.; Rodrigues, S. Sofia M.; Abreu, Vera L.R.G.; Barbosa, Joao A.C.; Prior, Joao A.V.; Marques, Karine L. [REQUIMTE, Laboratory of Applied Chemistry, Department of Chemical Sciences, Faculty of Pharmacy of Porto University, Rua Jorge Viterbo Ferreira, 228, 4050-313 Porto (Portugal); Santos, Joao L.M., E-mail: joaolms@ff.up.pt [REQUIMTE, Laboratory of Applied Chemistry, Department of Chemical Sciences, Faculty of Pharmacy of Porto University, Rua Jorge Viterbo Ferreira, 228, 4050-313 Porto (Portugal)

    2012-07-20

    Highlights: ► Review on quantum dots application in automated chemical analysis. ► Automation by using flow-based techniques. ► Quantum dots in liquid chromatography and capillary electrophoresis. ► Detection by fluorescence and chemiluminescence. ► Electrochemiluminescence and radical generation. - Abstract: Colloidal semiconductor nanocrystals or quantum dots (QDs) are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding new important fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis and food quality control. Despite the enormous variety of applications that have been developed, the automation of QDs-based analytical methodologies by resorting to automation tools such as continuous flow analysis and related techniques, which would allow to take advantage of particular features of the nanocrystals such as the versatile surface chemistry and ligand binding ability, the aptitude to generate reactive species, the possibility of encapsulation in different materials while retaining native luminescence providing the means for the implementation of renewable chemosensors or even the utilisation of more drastic and even stability impairing reaction conditions, is hitherto very limited. In this review, we provide insights into the analytical potential of quantum dots focusing on prospects of their utilisation in automated flow-based and flow-related approaches and the future outlook of QDs applications in chemical analysis.

  10. Application of quantum dots as analytical tools in automated chemical analysis: A review

    International Nuclear Information System (INIS)

    Highlights: ► Review on quantum dots application in automated chemical analysis. ► Automation by using flow-based techniques. ► Quantum dots in liquid chromatography and capillary electrophoresis. ► Detection by fluorescence and chemiluminescence. ► Electrochemiluminescence and radical generation. - Abstract: Colloidal semiconductor nanocrystals or quantum dots (QDs) are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding new important fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis and food quality control. Despite the enormous variety of applications that have been developed, the automation of QDs-based analytical methodologies by resorting to automation tools such as continuous flow analysis and related techniques, which would allow to take advantage of particular features of the nanocrystals such as the versatile surface chemistry and ligand binding ability, the aptitude to generate reactive species, the possibility of encapsulation in different materials while retaining native luminescence providing the means for the implementation of renewable chemosensors or even the utilisation of more drastic and even stability impairing reaction conditions, is hitherto very limited. In this review, we provide insights into the analytical potential of quantum dots focusing on prospects of their utilisation in automated flow-based and flow-related approaches and the future outlook of QDs applications in chemical analysis.

  11. Sharing Feelings Online: Studying Emotional Well-Being via Automated Text Analysis of Facebook Posts

    Directory of Open Access Journals (Sweden)

    Michele eSettanni

    2015-07-01

    Full Text Available Digital traces of activity on social network sites represent a vast source of ecological data with potential connections with individual behavioral and psychological characteristics. The present study investigates the relationship between user-generated textual content shared on Facebook and emotional well-being. Self-report measures of depression, anxiety and stress were collected from 201 adult Facebook users from North Italy. Emotion-related textual indicators, including emoticon use, were extracted from users’ Facebook posts via automated text analysis. Correlation analyses revealed that individuals with higher levels of depression and anxiety expressed negative emotions on Facebook more frequently. In addition, use of emoticons expressing positive emotions correlated negatively with stress level. When comparing age groups, younger users reported higher frequency of both emotion-related words and emoticon use in their posts. Also, the relationship between online emotional expression and self-report emotional well-being was generally stronger in the younger group. Overall, findings support the feasibility and validity of studying individual emotional well-being by means of examination of Facebook profiles. Implications for online screening purposes and future research directions are discussed.
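
    The kind of emotion-related textual indicators used in the study can be sketched as simple counts over a lexicon. The word and emoticon lists below are tiny invented stand-ins, not the study's actual dictionaries.

```python
import re

# Tiny invented stand-ins for real emotion lexica.
NEGATIVE_WORDS = {"sad", "angry", "tired", "alone", "worried"}
POSITIVE_EMOTICONS = (":)", ":-)", ":D", "<3")

def post_indicators(post):
    """Count negative-emotion words and positive emoticons in one post."""
    words = re.findall(r"[a-z']+", post.lower())
    return {
        "neg_words": sum(w in NEGATIVE_WORDS for w in words),
        "pos_emoticons": sum(post.count(e) for e in POSITIVE_EMOTICONS),
    }

print(post_indicators("Feeling sad and tired today :("))
print(post_indicators("Great day with friends :) :D"))
```

    Per-user frequencies of such indicators are what get correlated with the self-report scales in analyses like the one reported here.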

  12. Sharing feelings online: studying emotional well-being via automated text analysis of Facebook posts.

    Science.gov (United States)

    Settanni, Michele; Marengo, Davide

    2015-01-01

    Digital traces of activity on social network sites represent a vast source of ecological data with potential connections with individual behavioral and psychological characteristics. The present study investigates the relationship between user-generated textual content shared on Facebook and emotional well-being. Self-report measures of depression, anxiety, and stress were collected from 201 adult Facebook users from North Italy. Emotion-related textual indicators, including emoticon use, were extracted from users' Facebook posts via automated text analysis. Correlation analyses revealed that individuals with higher levels of depression and anxiety expressed negative emotions on Facebook more frequently. In addition, use of emoticons expressing positive emotions correlated negatively with stress level. When comparing age groups, younger users reported higher frequency of both emotion-related words and emoticon use in their posts. Also, the relationship between online emotional expression and self-report emotional well-being was generally stronger in the younger group. Overall, findings support the feasibility and validity of studying individual emotional well-being by means of examination of Facebook profiles. Implications for online screening purposes and future research directions are discussed. PMID:26257692

  13. Sharing feelings online: studying emotional well-being via automated text analysis of Facebook posts

    Science.gov (United States)

    Settanni, Michele; Marengo, Davide

    2015-01-01

    Digital traces of activity on social network sites represent a vast source of ecological data with potential connections with individual behavioral and psychological characteristics. The present study investigates the relationship between user-generated textual content shared on Facebook and emotional well-being. Self-report measures of depression, anxiety, and stress were collected from 201 adult Facebook users from North Italy. Emotion-related textual indicators, including emoticon use, were extracted from users’ Facebook posts via automated text analysis. Correlation analyses revealed that individuals with higher levels of depression and anxiety expressed negative emotions on Facebook more frequently. In addition, use of emoticons expressing positive emotions correlated negatively with stress level. When comparing age groups, younger users reported higher frequency of both emotion-related words and emoticon use in their posts. Also, the relationship between online emotional expression and self-report emotional well-being was generally stronger in the younger group. Overall, findings support the feasibility and validity of studying individual emotional well-being by means of examination of Facebook profiles. Implications for online screening purposes and future research directions are discussed. PMID:26257692

  14. Automated Microfluidic Platform for Serial Polymerase Chain Reaction and High-Resolution Melting Analysis.

    Science.gov (United States)

    Cao, Weidong; Bean, Brian; Corey, Scott; Coursey, Johnathan S; Hasson, Kenton C; Inoue, Hiroshi; Isano, Taisuke; Kanderian, Sami; Lane, Ben; Liang, Hongye; Murphy, Brian; Owen, Greg; Shinoda, Nobuhiko; Zeng, Shulin; Knight, Ivor T

    2016-06-01

    We report the development of an automated genetic analyzer for human sample testing based on microfluidic rapid polymerase chain reaction (PCR) with high-resolution melting analysis (HRMA). The integrated DNA microfluidic cartridge was used on a platform designed with a robotic pipettor system that works by sequentially picking up different test solutions from a 384-well plate, mixing them in the tips, and delivering mixed fluids to the DNA cartridge. A novel image feedback flow control system based on a Canon 5D Mark II digital camera was developed for controlling fluid movement through a complex microfluidic branching network without the use of valves. The same camera was used for measuring the high-resolution melt curve of DNA amplicons that were generated in the microfluidic chip. Owing to fast heating and cooling as well as sensitive temperature measurement in the microfluidic channels, the time frame for PCR and HRMA was dramatically reduced from hours to minutes. Preliminary testing results demonstrated that rapid serial PCR and HRMA are possible while still achieving high data quality that is suitable for human sample testing. PMID:25827436

  15. Social network analysis and supply chain management

    OpenAIRE

    Raúl Rodríguez Rodríguez

    2016-01-01

    This paper deals with social network analysis and how it could be integrated within supply chain management from a decision-making point of view. Even though the benefits of using social network analysis are widely accepted in both academic and industry/services contexts, there is still a lack of solid frameworks that allow decision-makers to connect the usage and obtained results of social network analysis – mainly information and knowledge flows and derived results – with supply chain manag...

  16. Network analysis of Zentralblatt MATH data

    OpenAIRE

    Cerinšek, Monika; Batagelj, Vladimir

    2014-01-01

    We analyze the data about works (papers, books) from the time period 1990-2010 that are collected in the Zentralblatt MATH database. The data were converted into four 2-mode networks (works × authors, works × journals, works × keywords and works × MSCs) and into a partition of works by publication year. The networks were analyzed using Pajek, a program for analysis and visualization of large networks. We explore the distributions of some properties of works and the c...

  17. Network Analysis for the Industrial Company

    OpenAIRE

    Batiha, Tarek

    2014-01-01

    The main purpose of this thesis was to analyze the network environment of the MPS Mont company and then, based on the analysis, to propose some improvements. Another important goal was to gain knowledge of processes in a real company. The improvements in the network environment had to be made without any expenditure for the company. The solution was based on open-source software. The final result of this thesis was documentation of the company's network and deployment of a honeypot based...

  18. Clustered Numerical Data Analysis Using Markov Lie Monoid Based Networks

    Science.gov (United States)

    Johnson, Joseph

    2016-03-01

    We have designed and built an optimal numerical standardization algorithm that links numerical values with their associated units, error level, and defining metadata, thus supporting automated data exchange and new levels of artificial intelligence (AI). The software manages all dimensional and error analysis and computational tracing. Tables of entities versus properties of these generalized numbers (called "metanumbers") support a transformation of each table into a network among the entities and another network among their properties, where the network connection matrix is based upon a proximity metric between the two items. We previously proved that every network is isomorphic to the Lie algebra that generates continuous Markov transformations. We have also shown that the eigenvectors of these Markov matrices provide an agnostic clustering of the underlying patterns. We will present this methodology and show how our new work on conversion of scientific numerical data through this process can reveal underlying information clusters ordered by the eigenvalues. We will also show how the linking of clusters from different tables can be used to form a "supernet" of all numerical information, supporting new initiatives in AI.
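    A minimal, dependency-free sketch of the clustering idea above — eigenvectors of a Markov-style normalized connection matrix revealing groups of entities — follows. The 6-entity proximity matrix is an invented toy example, and spectral bisection by the sign pattern of the second eigenvector stands in for the full Lie-algebra machinery described in the abstract.

```python
from math import sqrt

# Toy proximity (connection) matrix for 6 entities forming two obvious
# clusters {0, 1, 2} and {3, 4, 5}; the weights are assumptions.
A = [
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
]
n = len(A)
deg = [sum(row) for row in A]

# Symmetrically normalized matrix N = D^{-1/2} A D^{-1/2}; its spectrum
# matches that of the row-stochastic Markov matrix D^{-1} A.
N = [[A[i][j] / sqrt(deg[i] * deg[j]) for j in range(n)] for i in range(n)]

# The leading eigenvector of N is known in closed form: proportional to
# sqrt(deg), with eigenvalue 1.
top = [sqrt(d) for d in deg]
norm = sqrt(sum(x * x for x in top))
top = [x / norm for x in top]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]

# Power iteration with deflation of the leading direction converges to the
# second eigenvector, whose sign pattern bisects the entities into clusters.
v = [1.0, -1.0, 0.5, -0.5, 0.25, -0.25]
for _ in range(200):
    proj = sum(a * b for a, b in zip(v, top))
    v = [a - proj * b for a, b in zip(v, top)]   # remove leading component
    v = matvec(N, v)
    s = sqrt(sum(x * x for x in v))
    v = [x / s for x in v]

clusters = [0 if x > 0 else 1 for x in v]
```

    Power iteration with deflation is one simple way to reach the second eigenvector; a production version would use a linear-algebra library's symmetric eigensolver instead.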

  19. Nuclear spectral analysis via artificial neural networks for waste handling

    Energy Technology Data Exchange (ETDEWEB)

    Keller, P.E.; Kangas, L.J.; Hashem, S.; Kouzes, R.T. [Pacific Northwest Lab., Richland, WA (United States). Environmental Molecular Sciences Lab.; Troyer, G.L. [Westinghouse Hanford Co., Richland, WA (United States)

    1995-08-01

    Enormous amounts of hazardous waste were generated by more than 40 years of plutonium production at the US Department of Energy's Hanford site. A major national and international mission is to manage the existing waste and to restore the surrounding environment in a cost-effective manner. The objective of this research is to demonstrate the information processing capabilities of the neural network paradigm in real-time, automated identification of contaminants. In this paper two applications of artificial neural networks (ANNs) in nuclear spectroscopy analysis are discussed. In the first application, an ANN assigns quality coefficients to alpha particle energy spectra. These spectra are used to detect plutonium contamination in the work environment. The quality coefficients represent the levels of spectral degradation caused by miscalibration and foreign matter affecting the instruments. A set of spectra was labeled with quality coefficients by an expert and used to train the ANN expert system. The investigation shows that the expert knowledge of spectral quality can be transferred to an ANN system. The second application combines a portable gamma-ray spectrometer with an ANN to automatically identify radioactive isotopes in real-time. Two neural network paradigms are examined and compared: the linear perceptron and the optimal linear associative memory (OLAM). Both networks have a linear response and are useful in determining the composition of an unknown sample when the spectrum of the unknown is a linear superposition of known spectra.
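    In the simplest case, the linear-superposition idea behind the perceptron/OLAM approach reduces to a least-squares fit of the unknown spectrum against the known reference spectra. The two 5-channel reference spectra below are invented toy templates, not real isotope data; the sketch only shows how mixing coefficients are recovered from the normal equations.

```python
# Known reference spectra (toy 5-channel examples; the values are assumptions).
s1 = [10.0, 40.0, 10.0, 2.0, 1.0]   # "isotope A" template
s2 = [1.0, 3.0, 9.0, 30.0, 8.0]     # "isotope B" template

# Unknown spectrum constructed as 2*s1 + 0.5*s2, so the right answer is known.
unknown = [2 * a + 0.5 * b for a, b in zip(s1, s2)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Least-squares coefficients from the 2x2 normal equations:
#   [s1.s1  s1.s2] [c1]   [s1.x]
#   [s2.s1  s2.s2] [c2] = [s2.x]
a11, a12, a22 = dot(s1, s1), dot(s1, s2), dot(s2, s2)
b1, b2 = dot(s1, unknown), dot(s2, unknown)
det = a11 * a22 - a12 * a12
c1 = (a22 * b1 - a12 * b2) / det
c2 = (a11 * b2 - a12 * b1) / det
```

    Because the toy unknown lies exactly in the span of the two templates, the recovered coefficients equal the mixing weights; with measurement noise they would be least-squares estimates.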

  20. Nuclear spectral analysis via artificial neural networks for waste handling

    International Nuclear Information System (INIS)

    Enormous amounts of hazardous waste were generated by more than 40 years of plutonium production at the US Department of Energy's Hanford site. A major national and international mission is to manage the existing waste and to restore the surrounding environment in a cost-effective manner. The objective of this research is to demonstrate the information processing capabilities of the neural network paradigm in real-time, automated identification of contaminants. In this paper two applications of artificial neural networks (ANNs) in nuclear spectroscopy analysis are discussed. In the first application, an ANN assigns quality coefficients to alpha particle energy spectra. These spectra are used to detect plutonium contamination in the work environment. The quality coefficients represent the levels of spectral degradation caused by miscalibration and foreign matter affecting the instruments. A set of spectra was labeled with quality coefficients by an expert and used to train the ANN expert system. The investigation shows that the expert knowledge of spectral quality can be transferred to an ANN system. The second application combines a portable gamma-ray spectrometer with an ANN to automatically identify radioactive isotopes in real-time. Two neural network paradigms are examined and compared: the linear perceptron and the optimal linear associative memory (OLAM). Both networks have a linear response and are useful in determining the composition of an unknown sample when the spectrum of the unknown is a linear superposition of known spectra.

  1. Multilayer motif analysis of brain networks

    CERN Document Server

    Battiston, Federico; Chavez, Mario; Latora, Vito

    2016-01-01

    In the last decade network science has shed new light on the anatomical connectivity and on correlations in the activity of different areas of the human brain. The study of brain networks has in fact made it possible to detect the central areas of a neural system, and to identify its building blocks by looking at overabundant small subgraphs, known as motifs. However, network analysis of the brain has so far mainly focused on structural and functional networks as separate entities. The recently developed mathematical framework of multi-layer networks makes it possible to perform a multiplex analysis of the human brain where the structural and functional layers are considered at the same time. In this work we describe how to classify subgraphs in multiplex networks, and we extend motif analysis to networks with many layers. We then extract multi-layer motifs in brain networks of healthy subjects by considering networks with two layers, respectively obtained from diffusion and functional magnetic resonance imaging. Results i...

  2. 3rd International Conference on Network Analysis

    CERN Document Server

    Kalyagin, Valery; Pardalos, Panos

    2014-01-01

    This volume compiles the major results of conference participants from the "Third International Conference in Network Analysis" held at the Higher School of Economics, Nizhny Novgorod in May 2013, with the aim to initiate further joint research among different groups. The contributions in this book cover a broad range of topics relevant to the theory and practice of network analysis, including the reliability of complex networks, software, theory, methodology, and applications. Network analysis has become a major research topic over the last several years. The broad range of applications that can be described and analyzed by means of a network has brought together researchers and practitioners from numerous fields such as operations research, computer science, transportation, energy, biomedicine, computational neuroscience and social sciences. In addition, new approaches and computer environments such as parallel computing, grid computing, cloud computing, and quantum computing have helped to solve large scale...

  3. Conceptual design for comprehensive automation in radiochemical analysis of bioassay samples

    International Nuclear Information System (INIS)

    The Bioassay Laboratory of the Health Physics Division is entrusted with the task of carrying out the bioassay monitoring of occupational workers from various plants/divisions of BARC for various radionuclides like Pu, U, Th, 90Sr, 3H etc. On average, about 1400-1500 analyses are performed on 700-800 urine samples collected annually from radiation workers. The workload has increased by 1.5 to 2.0 times in the recent past and is expected to increase further due to the expanding nuclear programmes of the Department. Therefore, it was planned to carry out automation of various stages of bioassay sample handling, processing and analysis under the XI plan programme. Automation work in the Bioassay Lab. is planned to be taken up in three stages: (i) automation in the initial processing of urine samples, (ii) automation in the initial processing of fecal samples, and (iii) automation in the radiochemical analysis of bioassay samples. In the initial phase, automation in the radiochemical analysis of bioassay samples has been taken up.

  4. Object-oriented database design for the contaminant analysis automation project

    International Nuclear Information System (INIS)

    The Contaminant Analysis Automation project's automated soil analysis laboratory uses an Object-Oriented database for storage of runtime and archive information. Data which is generated by the processing of a sample, and is relevant for verification of the specifics of that process, is retained in the database. The database also contains intermediate results which are generated by one step of the process and used for decision making by later steps. The description of this database reveals design considerations of the objects used to model the behavior of the chemical laboratory and its components

  5. Automation of finite element analysis in pressure equipments design. Applications in energy and petrochemistry industries

    International Nuclear Information System (INIS)

    Pressure equipments are used in different petrochemistry and energy industries, such as thermal or nuclear power plants, oil refineries, and chemical plants. These industries have to ensure safety and environmental conditions. The design of pressure equipments has to respect Codes and Standards. For those pressure vessels with neighbouring nozzles, or under load cases other than pressure loadings (seismic, concentrated, etc.), finite element simulation remains the only accepted method for verifying equipment design according to Codes and Standards. This paper presents the application of an automated procedure for stress and criteria verification in the energy and petrochemistry industries. This automated procedure makes it possible to ensure analysis quality and to reduce analysis time. (author)

  6. Analysis of complex networks using aggressive abstraction.

    Energy Technology Data Exchange (ETDEWEB)

    Colbaugh, Richard; Glass, Kristin. [New Mexico Institute of Mining and Technology, Socorro, NM; Willard, Gerald [Department of Defense, Ft. Meade, MD

    2008-10-01

    This paper presents a new methodology for analyzing complex networks in which the network of interest is first abstracted to a much simpler (but equivalent) representation, the required analysis is performed using the abstraction, and analytic conclusions are then mapped back to the original network and interpreted there. We begin by identifying a broad and important class of complex networks which admit abstractions that are simultaneously dramatically simplifying and property preserving - we call these aggressive abstractions - and which can therefore be analyzed using the proposed approach. We then introduce and develop two forms of aggressive abstraction: (1) finite state abstraction, in which dynamical networks with uncountable state spaces are modeled using finite state systems, and (2) one-dimensional abstraction, whereby high dimensional network dynamics are captured in a meaningful way using a single scalar variable. In each case, the property preserving nature of the abstraction process is rigorously established and efficient algorithms are presented for computing the abstraction. The considerable potential of the proposed approach to complex network analysis is illustrated through case studies involving vulnerability analysis of technological networks and predictive analysis for social processes.

  7. Automated Image Processing for the Analysis of DNA Repair Dynamics

    CERN Document Server

    Riess, Thorsten; Tomas, Martin; Ferrando-May, Elisa; Merhof, Dorit

    2011-01-01

    The efficient repair of cellular DNA is essential for the maintenance and inheritance of genomic information. In order to cope with the high frequency of spontaneous and induced DNA damage, a multitude of repair mechanisms have evolved. These are enabled by a wide range of protein factors specifically recognizing different types of lesions and finally restoring the normal DNA sequence. This work focuses on the repair factor XPC (xeroderma pigmentosum complementation group C), which identifies bulky DNA lesions and initiates their removal via the nucleotide excision repair pathway. The binding of XPC to damaged DNA can be visualized in living cells by following the accumulation of a fluorescent XPC fusion at lesions induced by laser microirradiation in a fluorescence microscope. In this work, an automated image processing pipeline is presented which makes it possible to identify and quantify the accumulation reaction without any user interaction. The image processing pipeline comprises a preprocessing stage where the ima...

  8. 3D Assembly Group Analysis for Cognitive Automation

    Directory of Open Access Journals (Sweden)

    Christian Brecher

    2012-01-01

    A concept that allows the cognitive automation of robotic assembly processes is introduced. An assembly cell comprising two robots was designed to verify the concept. For the purpose of validation a customer-defined part group consisting of Hubelino bricks is assembled. One of the key aspects for this process is the verification of the assembly group. Hence a software component was designed that utilizes the Microsoft Kinect to perceive both depth and color data in the assembly area. This information is used to determine the current state of the assembly group and is compared to a CAD model for validation purposes. In order to efficiently resolve erroneous situations, the results are interactively accessible to a human expert. The implications for an industrial application are demonstrated by transferring the developed concepts to an assembly scenario for switch-cabinet systems.

  9. Automation of Morphometric Measurements for Planetary Surface Analysis and Cartography

    Science.gov (United States)

    Kokhanov, A. A.; Bystrov, A. Y.; Kreslavsky, M. A.; Matveev, E. V.; Karachevtseva, I. P.

    2016-06-01

    For the automation of measurements of morphometric parameters of surface relief, various tools were developed and integrated into a GIS. We have created a tool which calculates statistical characteristics of the surface: the interquartile range of heights and slopes, as well as second derivatives of height fields as measures of topographic roughness. Other tools were created for morphological studies of craters. One of them allows automatic placing of topographic profiles through the geometric center of a crater. Another tool was developed for the calculation of small crater depths and shape estimation, using the C++ programming language. Additionally, we have prepared a tool for calculating volumes of relief features from DTM rasters. The created software modules and models will be available in a newly developed web-GIS system operating in a distributed cloud environment.
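    The roughness statistic named above — the interquartile range of heights — is straightforward to sketch. The 4x4 height grid is an invented toy DTM, and the linear-interpolation quantile scheme is one common convention, not necessarily the one used in the authors' GIS tools.

```python
def interquartile_range(values):
    """IQR via linear interpolation between order statistics."""
    xs = sorted(values)
    def quantile(q):
        pos = q * (len(xs) - 1)
        lo = int(pos)
        hi = min(lo + 1, len(xs) - 1)
        return xs[lo] + (xs[hi] - xs[lo]) * (pos - lo)
    return quantile(0.75) - quantile(0.25)

# Toy 4x4 height grid in metres: a smooth half next to a rough half.
dtm = [
    [100.0, 100.1, 100.0, 100.1],
    [100.1, 100.0, 100.1, 100.0],
    [120.0, 95.0, 130.0, 90.0],
    [85.0, 125.0, 80.0, 135.0],
]
smooth = [h for row in dtm[:2] for h in row]
rough = [h for row in dtm[2:] for h in row]
```

    As expected, the IQR over the rough window is far larger than over the smooth one, which is what makes it usable as a per-window roughness map.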

  10. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design--a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  11. Boolean Factor Analysis by Attractor Neural Network

    Czech Academy of Sciences Publication Activity Database

    Frolov, A. A.; Húsek, Dušan; Muraviev, I. P.; Polyakov, P.Y.

    2007-01-01

    Roč. 18, č. 3 (2007), s. 698-707. ISSN 1045-9227 R&D Projects: GA AV ČR 1ET100300419; GA ČR GA201/05/0079 Institutional research plan: CEZ:AV0Z10300504 Keywords : recurrent neural network * Hopfield-like neural network * associative memory * unsupervised learning * neural network architecture * neural network application * statistics * Boolean factor analysis * dimensionality reduction * features clustering * concepts search * information retrieval Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 2.769, year: 2007

  12. Bayesian networks with applications in reliability analysis

    OpenAIRE

    Langseth, Helge

    2002-01-01

    A common goal of the papers in this thesis is to propose, formalize and exemplify the use of Bayesian networks as a modelling tool in reliability analysis. The papers span work in which Bayesian networks are merely used as a modelling tool (Paper I), work where models are specially designed to utilize the inference algorithms of Bayesian networks (Paper II and Paper III), and work where the focus has been on extending the applicability of Bayesian networks to very large domains (Paper IV and ...

  13. Application of local computer networks to nuclear power plant problems at the level of automated control of the production process

    International Nuclear Information System (INIS)

    The requirements placed on the hardware and software for the automated system of production control at a nuclear power plant are summarized. In the design of the Temelin nuclear power plant, this system is local network-based, with two-way communication between the local stations. The control functions are distributed among 6 local stations, viz. the file server, interactive terminal, information terminal, data storage terminal, calculation terminal and communication terminal. A brief characterization is given of the necessary software, including both the system software and the basic and applications user software. (Z.M.)

  14. Networked Software's Performance Metrics and Analysis with IBM SVC Config Advisor

    Directory of Open Access Journals (Sweden)

    Mr. Kushal S. Patel

    2014-06-01

    IBM SVC is a storage virtualization product by IBM. In IBM SVC, a number of heterogeneous hosts are connected by means of a high-speed communication network. Managing the configuration of these networked components is a very difficult task, and there is a clear need to automate configuration checking. Hence the IBM SVC Config Advisor tool was developed by IBM. This tool performs remote configuration checks for storage systems including IBM SVC and Storwize products. This paper introduces the IBM SVC Config Advisor tool along with performance statistics. It mainly deals with the collection and analysis of the networked tool's performance statistics, with IBM SVC Config Advisor used as the networked tool under analysis. This paper can be useful for analyzing software that is highly network-dependent in nature.

  15. Automated reduction and interpretation of multidimensional mass spectra for analysis of complex peptide mixtures

    Science.gov (United States)

    Gambin, Anna; Dutkowski, Janusz; Karczmarski, Jakub; Kluge, Boguslaw; Kowalczyk, Krzysztof; Ostrowski, Jerzy; Poznanski, Jaroslaw; Tiuryn, Jerzy; Bakun, Magda; Dadlez, Michal

    2007-01-01

    Here we develop a fully automated procedure for the analysis of liquid chromatography-mass spectrometry (LC-MS) datasets collected during the analysis of complex peptide mixtures. We present the underlying algorithm and outcomes of several experiments justifying its applicability. The novelty of our approach is to exploit the multidimensional character of the datasets. It is common knowledge that highly complex peptide mixtures can be analyzed by liquid chromatography coupled with mass spectrometry, but we are not aware of any existing automated MS spectra interpretation procedure designed to take into account the multidimensional character of the data. Our work fills this gap by providing an effective algorithm for this task, allowing for automated conversion of raw data to the list of masses of peptides.

  16. Automated analysis for scintigraphic evaluation of gastric emptying using invariant moments.

    Science.gov (United States)

    Abutaleb, A; Delalic, Z J; Ech, R; Siegel, J A

    1989-01-01

    This study introduces a method for automated analysis of the standard solid-meal gastric emptying test. The purpose was to develop a diagnostic tool to characterize abnormalities of solid-phase gastric emptying more reproducibly. The processing of gastric emptying is automated using geometrical moments that are invariant to scaling, rotation, and shift. Twenty subjects were studied. The first step was to obtain images of the stomach using a nuclear gamma camera immediately after the subject had eaten a radio-labeled meal. The second step was to process and analyze the images by a recently developed automated gastric emptying analysis (AGEA) method, which determines the gastric contour and the geometrical properties, including such parameters as area, centroid, orientation, and moments of inertia. Statistical tests showed that some of the moments were sensitive to the patient's gastric status (normal versus abnormal). The difference between the normal and abnormal patients became noticeable approximately 1 h after meal ingestion. PMID:18230536
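    The geometric quantities listed above (area, centroid, orientation) can be computed directly from raw image moments. The binary mask below is an invented toy blob, not a gastric image; the orientation formula is the standard one from central second moments and is only meant to illustrate the kind of features such an analysis computes.

```python
from math import atan2, degrees

# Toy binary mask on an 8x8 grid (an assumed shape, for illustration):
# an elongated blob running along the main diagonal.
mask = [[1 if abs(r - c) <= 1 and 1 <= r <= 6 else 0
         for c in range(8)] for r in range(8)]

# Raw image moments M_pq = sum over pixels of x^p * y^q * I(x, y),
# with x = column index and y = row index.
def moment(p, q):
    return sum((c ** p) * (r ** q) * mask[r][c]
               for r in range(8) for c in range(8))

m00 = moment(0, 0)                                # area (pixel count)
cx, cy = moment(1, 0) / m00, moment(0, 1) / m00   # centroid

# Central second moments give the orientation of the principal axis.
mu20 = moment(2, 0) / m00 - cx * cx
mu02 = moment(0, 2) / m00 - cy * cy
mu11 = moment(1, 1) / m00 - cx * cy
theta = 0.5 * atan2(2 * mu11, mu20 - mu02)        # radians
```

    For the diagonal blob above, the principal axis comes out tilted close to the diagonal, as expected; higher-order combinations of these moments yield the scale- and rotation-invariant features mentioned in the abstract.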

  17. Computer-automated multi-disciplinary analysis and design optimization of internally cooled turbine blades

    Science.gov (United States)

    Martin, Thomas Joseph

    This dissertation presents the theoretical methodology, organizational strategy, conceptual demonstration and validation of a fully automated computer program for the multi-disciplinary analysis, inverse design and optimization of convectively cooled axial gas turbine blades and vanes. Parametric computer models of the three-dimensional cooled turbine blades and vanes were developed, including the automatic generation of discretized computational grids. Several new analysis programs were written and incorporated with existing computational tools to provide computer models of the engine cycle, aero-thermodynamics, heat conduction and thermofluid physics of the internally cooled turbine blades and vanes. A generalized information transfer protocol was developed to provide the automatic mapping of geometric and boundary condition data between the parametric design tool and the numerical analysis programs. A constrained hybrid optimization algorithm controlled the overall operation of the system and guided the multi-disciplinary internal turbine cooling design process towards the objectives and constraints of engine cycle performance, aerodynamic efficiency, cooling effectiveness and turbine blade and vane durability. Several boundary element computer programs were written to solve the steady-state non-linear heat conduction equation inside the internally cooled and thermal barrier-coated turbine blades and vanes. The boundary element method (BEM) did not require grid generation inside the internally cooled turbine blades and vanes, so the parametric model was very robust. Implicit differentiations of the BEM thermal and thermo-elastic analyses were done to compute design sensitivity derivatives faster and more accurately than via explicit finite differencing. A factor of three savings of computer processing time was realized for two-dimensional thermal optimization problems, and a factor of twenty was obtained for three-dimensional thermal optimization problems.

  18. Analysis and Testing of Mobile Wireless Networks

    Science.gov (United States)

    Alena, Richard; Evenson, Darin; Rundquist, Victor; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Wireless networks are being used to connect mobile computing elements in more applications as the technology matures. There are now many products (such as 802.11 and 802.11b) which run in the ISM frequency band and comply with wireless network standards. They are being used increasingly to link mobile intranets into wired networks. Standard methods of analyzing and testing their performance and compatibility are needed to determine the limits of the technology. This paper presents analytical and experimental methods of determining network throughput, range and coverage, and interference sources. Both radio frequency (RF) domain and network domain analysis have been applied to determine wireless network throughput and range in the outdoor environment. Comparison of field test data taken under optimal conditions with performance predicted from RF analysis yielded quantitative results applicable to future designs. Layering multiple wireless network systems can increase performance. Wireless network components can be set to different radio frequency-hopping sequences or spreading functions, allowing more than one system to coexist. Therefore, we ran multiple 802.11-compliant systems concurrently in the same geographical area to determine interference effects and scalability. The results can be used in the design of more robust networks which have multiple layers of wireless data communication paths and provide increased throughput overall.

  19. Extending Stochastic Network Calculus to Loss Analysis

    Directory of Open Access Journals (Sweden)

    Chao Luo

    2013-01-01

    Loss is an important parameter of Quality of Service (QoS). Though stochastic network calculus is a very useful tool for performance evaluation of computer networks, existing studies on stochastic service guarantees have mainly focused on delay and backlog. Some efforts have been made to analyse loss by deterministic network calculus, but there are few results extending stochastic network calculus to loss analysis. In this paper, we introduce a new parameter named loss factor into stochastic network calculus and then derive the loss bound through the existing arrival curve and service curve via this parameter. We then prove that our result is suitable for networks with multiple input flows. Simulations show the impact of buffer size, arrival traffic, and service on the loss factor.

  20. Computer network environment planning and analysis

    Science.gov (United States)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  1. Topological analysis of urban street networks

    OpenAIRE

    Bin Jiang; Christophe Claramunt

    2004-01-01

    The authors propose a topological analysis of large urban street networks based on a computational and functional graph representation. This representation gives a functional view in which vertices represent named streets and edges represent street intersections. A range of graph measures, including street connectivity, average path length, and clustering coefficient, are computed for structural analysis. In order to characterise different clustering degrees of streets in a street network the...
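    The graph measures named above can be sketched on a toy dual graph, where vertices are named streets and edges are intersections. The street names and topology are invented for illustration; average path length and clustering coefficient follow their standard definitions.

```python
from collections import deque

# Toy dual graph of a street network: vertices are named streets, an edge
# means the two streets intersect (the names are assumptions).
graph = {
    "Main St": {"Oak Ave", "Elm St", "High St"},
    "Oak Ave": {"Main St", "Elm St"},
    "Elm St":  {"Main St", "Oak Ave", "High St"},
    "High St": {"Main St", "Elm St", "Mill Rd"},
    "Mill Rd": {"High St"},
}

def average_path_length(g):
    """Mean shortest-path length over all ordered vertex pairs, via BFS."""
    total = pairs = 0
    for src in g:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in g[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for v, d in dist.items():
            if v != src:
                total += d
                pairs += 1
    return total / pairs

def clustering(g, v):
    """Fraction of a vertex's neighbour pairs that are themselves linked."""
    nbrs = list(g[v])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in g[nbrs[i]])
    return 2 * links / (k * (k - 1))
```

    Street connectivity in this representation is simply the vertex degree, `len(graph[street])`; a graph library would provide the same measures, but the definitions fit in a few lines.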

  2. Automated striatal uptake analysis of 18F-FDOPA PET images applied to Parkinson's disease patients

    International Nuclear Information System (INIS)

6-[18F]Fluoro-L-DOPA (FDOPA) is a radiopharmaceutical valuable for assessing the presynaptic dopaminergic function when used with positron emission tomography (PET). More specifically, the striatal-to-occipital ratio (SOR) of FDOPA uptake images has been extensively used as a quantitative parameter in these PET studies. Our aim was to develop an easy, automated method capable of performing objective analysis of SOR in FDOPA PET images of Parkinson's disease (PD) patients. Brain images from FDOPA PET studies of 21 patients with PD and 6 healthy subjects were included in our automated striatal analyses. Images of each individual were spatially normalized onto an FDOPA template. Subsequently, the image slice with the highest level of basal ganglia activity was chosen among the series of normalized images. The immediately preceding and following slices of the chosen image were then also selected. Finally, the summation of these three images was used to quantify and calculate the SOR values. The results obtained by automated analysis were compared with manual analysis by a trained and experienced image processing technologist. The SOR values obtained from the automated analysis showed good agreement and high correlation with manual analysis. The differences in caudate, putamen, and striatum were -0.023, -0.029, and -0.025, respectively; the correlation coefficients were 0.961, 0.957, and 0.972, respectively. We have successfully developed a method for automated striatal uptake analysis of FDOPA PET images. There was no significant difference between the SOR values obtained from this method and those from manual analysis. Moreover, it is an unbiased, time-saving and cost-effective program that is easy to implement on a personal computer. (author)
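The slice-selection and ratio scheme described above can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the authors' implementation: the region masks, array layout, and function name are assumptions.

```python
import numpy as np

def striatal_occipital_ratio(volume, striatal_mask, occipital_mask):
    """Compute the SOR from a spatially normalized FDOPA PET volume:
    pick the slice with the highest striatal activity, sum it with its
    immediate neighbours, then ratio the mean counts in the two regions.

    volume and masks are 3-D arrays indexed (slice, y, x); the masks are
    hypothetical region definitions, not the authors' template regions.
    """
    # Slice with the highest summed activity inside the striatal mask.
    striatal_activity = (volume * striatal_mask).sum(axis=(1, 2))
    k = int(np.argmax(striatal_activity))
    # Three-slice summation image (chosen slice plus its neighbours).
    lo, hi = max(k - 1, 0), min(k + 2, volume.shape[0])
    summed = volume[lo:hi].sum(axis=0)
    striatum = summed[striatal_mask[k] > 0].mean()
    occipital = summed[occipital_mask[k] > 0].mean()
    return striatum / occipital
```

In practice the masks would come from the pre-defined template regions after spatial normalization.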

  3. Semi-automated analysis of EEG spikes in the preterm fetal sheep using wavelet analysis

    International Nuclear Information System (INIS)

Full text: Perinatal hypoxia plays a key role in the cause of brain injury in premature infants. Cerebral hypothermia commenced in the latent phase of evolving injury (first 6-8 h post hypoxic-ischemic insult) is the lead candidate for treatment; however, there is currently no means to identify which infants can benefit from treatment. Recent studies suggest that epileptiform transients in the latent phase are predictive of neural outcome. To quantify this, an automated means of EEG analysis is required, as EEG monitoring produces vast amounts of data that are too time-consuming to analyse manually. We have developed a semi-automated EEG spike detection method which employs a discretized version of the continuous wavelet transform (CWT). EEG data were obtained from a fetal sheep at approximately 0.7 of gestation. Fetal asphyxia was maintained for 25 min and the EEG recorded for 8 h before and after asphyxia. The CWT was calculated, followed by the power of the wavelet transform coefficients. Areas of high power corresponded to spike waves, so thresholding was employed to identify the spikes. The method was found to have good sensitivity and selectivity, demonstrating that it is a simple, robust and potentially effective spike detection algorithm.
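The pipeline above (discretized CWT, coefficient power, thresholding) can be sketched with a hand-rolled Ricker-wavelet convolution. The wavelet widths and threshold rule here are illustrative assumptions, not the authors' parameters.

```python
import numpy as np

def ricker(points, a):
    # Ricker ("Mexican hat") wavelet, the kernel commonly used by
    # discretized CWT implementations.
    t = np.arange(points) - (points - 1) / 2.0
    amp = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
    return amp * (1 - (t / a) ** 2) * np.exp(-(t ** 2) / (2 * a ** 2))

def detect_spikes(signal, widths=(2, 4, 8), threshold=5.0):
    """Sketch of the wavelet-power spike detector described above:
    convolve the EEG with wavelets at several scales, sum the squared
    coefficients, and threshold the resulting power trace.
    Returns sample indices flagged as spikes."""
    power = np.zeros(len(signal))
    for a in widths:
        kernel = ricker(min(10 * a, len(signal)), a)
        coeffs = np.convolve(signal, kernel, mode="same")
        power += coeffs ** 2
    # Threshold relative to the median power, which is robust to the
    # rare, large spike events themselves.
    return np.flatnonzero(power > threshold * np.median(power))
```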

  4. Performance Metrics Analysis of Torus Embedded Hypercube Interconnection Network

    Directory of Open Access Journals (Sweden)

    N. Gopalakrishna Kini

    2009-09-01

    Full Text Available Advantages of hypercube network and torus topology are used to derive an embedded architecture for product network known as torus embedded hypercube scalable interconnection network. This paper analyzes torus embedded hypercube network pertinent to parallel architecture. The network metrics are used to show how good embedded network can be designed for parallel computation. Network parameter analysis and comparison of embedded network with basic networks is presented.
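The network metrics referred to above follow from the standard formulas for the constituent topologies. As a minimal sketch (for the basic hypercube and 2-D torus only, since the abstract does not give the embedded network's formulas):

```python
def hypercube_metrics(n):
    """Basic metrics of an n-dimensional hypercube Q_n, one building
    block of the torus-embedded hypercube: 2^n nodes, every node has
    degree n, and the diameter (maximum Hamming distance) is n."""
    nodes = 2 ** n
    degree = n
    edges = nodes * degree // 2   # each edge counted once
    return {"nodes": nodes, "degree": degree, "edges": edges, "diameter": n}

def torus_metrics(rows, cols):
    """Metrics of a rows x cols 2-D torus: constant degree 4 (wraparound
    mesh) and diameter floor(rows/2) + floor(cols/2)."""
    nodes = rows * cols
    return {"nodes": nodes, "degree": 4, "edges": 2 * nodes,
            "diameter": rows // 2 + cols // 2}
```

Comparing such per-topology figures is how one argues that an embedded product network trades node degree against diameter.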

  5. Automating Mid- and Long-Range Scheduling for the NASA Deep Space Network

    Science.gov (United States)

    Johnston, Mark D.; Tran, Daniel

    2012-01-01

NASA has recently deployed a new mid-range scheduling system for the antennas of the Deep Space Network (DSN), called Service Scheduling Software, or S(sup 3). This system was designed and deployed as a modern web application containing a central scheduling database integrated with a collaborative environment, exploiting the same technologies as social web applications but applied to a space operations context. This is highly relevant to the DSN domain since the network schedule of operations is developed in a peer-to-peer negotiation process among all users of the DSN. These users represent not only NASA's deep space missions, but also international partners and ground-based science and calibration users. The initial implementation of S(sup 3) is complete and the system has been operational since July 2011. This paper describes some key aspects of the S(sup 3) system and the challenges of modeling complex scheduling requirements, as well as the ongoing extension of S(sup 3) to encompass long-range planning, downtime analysis, and forecasting, as the next step in developing a single integrated DSN scheduling tool suite to cover all time ranges.

  6. An Integrated Solution for both Monitoring and Controlling for Automization Using Wireless Sensor Networks: A Case Study

    Directory of Open Access Journals (Sweden)

    M Gnana Seelan

    2013-02-01

Full Text Available Temperature monitoring plays a major role in controlling temperature under varied conditions. This process is common in critical areas such as data centres, server rooms, grid rooms and other rooms housing data-communication equipment. Imparting such a process is mandatory for every organization or industry, as most critical data reside in the data centre along with the network infrastructure, which involves various electronic, electrical and mechanical devices for data transmission. These devices depend heavily on environmental factors such as temperature, moisture and humidity, and also emit heat in the form of thermal energy when operating. To remove this heat, server and data centre rooms are equipped with multiple (distributed) air-conditioning (AC) systems that provide a cooling environment and maintain the temperature level of the room. The present paper studies the automation of monitoring and controlling temperature as per desired requirements with a WSN (wireless sensor network).

  7. Automated analysis of small animal PET studies through deformable registration to an atlas

    Energy Technology Data Exchange (ETDEWEB)

    Gutierrez, Daniel F. [Geneva University Hospital, Division of Nuclear Medicine and Molecular Imaging, Geneva 4 (Switzerland); Zaidi, Habib [Geneva University Hospital, Division of Nuclear Medicine and Molecular Imaging, Geneva 4 (Switzerland); Geneva University, Geneva Neuroscience Center, Geneva (Switzerland); University of Groningen, Department of Nuclear Medicine and Molecular Imaging, University Medical Center Groningen, Groningen (Netherlands)

    2012-11-15

    This work aims to develop a methodology for automated atlas-guided analysis of small animal positron emission tomography (PET) data through deformable registration to an anatomical mouse model. A non-rigid registration technique is used to put into correspondence relevant anatomical regions of rodent CT images from combined PET/CT studies to corresponding CT images of the Digimouse anatomical mouse model. The latter provides a pre-segmented atlas consisting of 21 anatomical regions suitable for automated quantitative analysis. Image registration is performed using a package based on the Insight Toolkit allowing the implementation of various image registration algorithms. The optimal parameters obtained for deformable registration were applied to simulated and experimental mouse PET/CT studies. The accuracy of the image registration procedure was assessed by segmenting mouse CT images into seven regions: brain, lungs, heart, kidneys, bladder, skeleton and the rest of the body. This was accomplished prior to image registration using a semi-automated algorithm. Each mouse segmentation was transformed using the parameters obtained during CT to CT image registration. The resulting segmentation was compared with the original Digimouse atlas to quantify image registration accuracy using established metrics such as the Dice coefficient and Hausdorff distance. PET images were then transformed using the same technique and automated quantitative analysis of tracer uptake performed. The Dice coefficient and Hausdorff distance show fair to excellent agreement and a mean registration mismatch distance of about 6 mm. The results demonstrate good quantification accuracy in most of the regions, especially the brain, but not in the bladder, as expected. Normalized mean activity estimates were preserved between the reference and automated quantification techniques with relative errors below 10 % in most of the organs considered. The proposed automated quantification technique is
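The two registration-accuracy metrics named above, the Dice coefficient and the Hausdorff distance, have compact definitions on binary masks. A minimal NumPy sketch (brute-force Hausdorff, fine for small illustrative masks; function names are ours):

```python
import numpy as np

def dice(a, b):
    """Dice coefficient between two binary masks (1 = inside region):
    2|A intersect B| / (|A| + |B|), 1.0 for perfect overlap."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def hausdorff(a, b):
    """Symmetric Hausdorff distance between the voxel coordinates of
    two binary masks: the worst-case nearest-neighbour distance,
    computed here by an all-pairs distance matrix."""
    pa = np.argwhere(a)
    pb = np.argwhere(b)
    d = np.sqrt(((pa[:, None, :] - pb[None, :, :]) ** 2).sum(-1))
    return max(d.min(axis=1).max(), d.min(axis=0).max())
```

Applied to a transformed segmentation and the atlas, these give exactly the kind of agreement figures the abstract reports.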

  8. Automated analysis of small animal PET studies through deformable registration to an atlas

    International Nuclear Information System (INIS)

    This work aims to develop a methodology for automated atlas-guided analysis of small animal positron emission tomography (PET) data through deformable registration to an anatomical mouse model. A non-rigid registration technique is used to put into correspondence relevant anatomical regions of rodent CT images from combined PET/CT studies to corresponding CT images of the Digimouse anatomical mouse model. The latter provides a pre-segmented atlas consisting of 21 anatomical regions suitable for automated quantitative analysis. Image registration is performed using a package based on the Insight Toolkit allowing the implementation of various image registration algorithms. The optimal parameters obtained for deformable registration were applied to simulated and experimental mouse PET/CT studies. The accuracy of the image registration procedure was assessed by segmenting mouse CT images into seven regions: brain, lungs, heart, kidneys, bladder, skeleton and the rest of the body. This was accomplished prior to image registration using a semi-automated algorithm. Each mouse segmentation was transformed using the parameters obtained during CT to CT image registration. The resulting segmentation was compared with the original Digimouse atlas to quantify image registration accuracy using established metrics such as the Dice coefficient and Hausdorff distance. PET images were then transformed using the same technique and automated quantitative analysis of tracer uptake performed. The Dice coefficient and Hausdorff distance show fair to excellent agreement and a mean registration mismatch distance of about 6 mm. The results demonstrate good quantification accuracy in most of the regions, especially the brain, but not in the bladder, as expected. Normalized mean activity estimates were preserved between the reference and automated quantification techniques with relative errors below 10 % in most of the organs considered. The proposed automated quantification technique is

  9. Artificial neural networks for plasma spectroscopy analysis

    International Nuclear Information System (INIS)

    Artificial neural networks have been applied to a variety of signal processing and image recognition problems. Of the several common neural models the feed-forward, back-propagation network is well suited for the analysis of scientific laboratory data, which can be viewed as a pattern recognition problem. The authors present a discussion of the basic neural network concepts and illustrate its potential for analysis of experiments by applying it to the spectra of laser produced plasmas in order to obtain estimates of electron temperatures and densities. Although these are high temperature and density plasmas, the neural network technique may be of interest in the analysis of the low temperature and density plasmas characteristic of experiments and devices in gaseous electronics
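The feed-forward mapping described above, from spectral features to plasma parameters, reduces to a couple of matrix products. A minimal sketch (the layer sizes, weight names, and tanh activation are illustrative assumptions; the weights would come from back-propagation training):

```python
import numpy as np

def forward(x, W1, b1, W2, b2):
    """One-hidden-layer feed-forward pass of the kind described above,
    mapping a spectral feature vector x to estimates such as
    [electron temperature, electron density]."""
    h = np.tanh(W1 @ x + b1)   # hidden layer with nonlinear activation
    return W2 @ h + b2          # linear output layer
```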

  10. Visualization and Analysis of Complex Covert Networks

    DEFF Research Database (Denmark)

    Memon, Bisharat

This report discusses and summarizes the results of my work so far in relation to my Ph.D. project entitled "Visualization and Analysis of Complex Covert Networks". The focus of my research is primarily on development of methods and supporting tools for visualization and analysis of networked systems that are covert and hence inherently complex. My Ph.D. is positioned within the wider framework of the CrimeFighter project. The framework envisions a number of key knowledge management processes that are involved in the workflow, and the toolbox provides supporting tools to assist human end-users (intelligence analysts) in harvesting, filtering, storing, managing, structuring, mining, analyzing, interpreting, and visualizing data about offensive networks. The methods and tools proposed and discussed in this work can also be applied to analysis of more generic complex networks.

  11. 1st International Conference on Network Analysis

    CERN Document Server

    Kalyagin, Valery; Pardalos, Panos

    2013-01-01

    This volume contains a selection of contributions from the "First International Conference in Network Analysis," held at the University of Florida, Gainesville, on December 14-16, 2011. The remarkable diversity of fields that take advantage of Network Analysis makes the endeavor of gathering up-to-date material in a single compilation a useful, yet very difficult, task. The purpose of this volume is to overcome this difficulty by collecting the major results found by the participants and combining them in one easily accessible compilation. Network analysis has become a major research topic over the last several years. The broad range of applications that can be described and analyzed by means of a network is bringing together researchers, practitioners and other scientific communities from numerous fields such as Operations Research, Computer Science, Transportation, Energy, Social Sciences, and more. The contributions not only come from different fields, but also cover a broad range of topics relevant to the...

  12. Miniaturized Mass-Spectrometry-Based Analysis System for Fully Automated Examination of Conditioned Cell Culture Media

    NARCIS (Netherlands)

    Weber, E.; Pinkse, M.W.H.; Bener-Aksam, E.; Vellekoop, M.J.; Verhaert, P.D.E.M.

    2012-01-01

    We present a fully automated setup for performing in-line mass spectrometry (MS) analysis of conditioned media in cell cultures, in particular focusing on the peptides therein. The goal is to assess peptides secreted by cells in different culture conditions. The developed system is compatible with M

  13. Scanning probe image wizard: A toolbox for automated scanning probe microscopy data analysis

    Science.gov (United States)

    Stirling, Julian; Woolley, Richard A. J.; Moriarty, Philip

    2013-11-01

    We describe SPIW (scanning probe image wizard), a new image processing toolbox for SPM (scanning probe microscope) images. SPIW can be used to automate many aspects of SPM data analysis, even for images with surface contamination and step edges present. Specialised routines are available for images with atomic or molecular resolution to improve image visualisation and generate statistical data on surface structure.

  14. 14 CFR 1261.413 - Analysis of costs; automation; prevention of overpayments, delinquencies, or defaults.

    Science.gov (United States)

    2010-01-01

    ... costs incurred and amounts collected. Data on costs and corresponding recovery rates for debts of... efforts are likely to exceed recoveries, and assist in evaluating offers in compromise. (b) Consider the... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Analysis of costs; automation;...

  15. Development of a novel and automated fluorescent immunoassay for the analysis of beta-lactam antibiotics

    NARCIS (Netherlands)

    Benito-Pena, E.; Moreno-Bondi, M.C.; Orellana, G.; Maquieira, K.; Amerongen, van A.

    2005-01-01

An automated immunosensor for the rapid and sensitive analysis of penicillin-type β-lactam antibiotics has been developed and optimized. An immunogen was prepared by coupling the common structure of the penicillanic β-lactam antibiotics, i.e., 6-aminopenicillanic acid, to keyhole limpet hemocyanin. Pol

  16. Automated data acquisition and analysis system for inventory verification

    International Nuclear Information System (INIS)

A real-time system is proposed which would allow CLO Safeguards Branch to conduct a meaningful inventory verification using a variety of NDA instruments. The overall system would include the NDA instruments, automated data handling equipment, and a vehicle to house and transport the instruments and equipment. For the purpose of the preliminary cost estimate a specific data handling system and vehicle were required. A Tracor Northern TN-11 data handling system including a PDP-11 minicomputer and a measurement vehicle similar to the Commission's Regulatory Region I van were used. The basic system is currently estimated to cost about $100,000, and future add-ons which would expand the system's capabilities are estimated to cost about $40,000. The concept of using a vehicle in order to permanently rack mount the data handling equipment offers a number of benefits such as control of equipment environment and allowance for improvements, expansion, and flexibility in the system. Justification is also presented for local design and assembly of the overall system. A summary of the demonstration system which illustrates the advantages and feasibility of the overall system is included in this discussion. Two ideas are discussed which are not considered to be viable alternatives to the proposed system: addition of the data handling capabilities to the semiportable ''cart'' and use of a telephone link to a large computer center

  17. Automated Dsm Extraction from Uav Images and Performance Analysis

    Science.gov (United States)

    Rhee, S.; Kim, T.

    2015-08-01

As technology evolves, unmanned aerial vehicle (UAV) imagery is being used for applications ranging from simple image acquisition to complicated tasks such as 3D spatial information extraction. Spatial information is usually provided in the form of a DSM or point cloud. It is important to generate very dense tie points automatically from stereo images. In this paper, we applied a stereo-image matching technique developed for satellite/aerial images to UAV images, propose processing steps for automated DSM generation, and analyse the feasibility of DSM generation. For DSM generation from UAV images, firstly, exterior orientation parameters (EOPs) for each dataset were adjusted. Secondly, optimum matching pairs were determined. Thirdly, stereo image matching was performed with each pair. The matching algorithm is based on grey-level correlation of pixels along epipolar lines. Finally, the extracted match results were merged into one result and the final DSM was made. The generated DSM was compared with a reference DSM from Lidar. Overall accuracy was 1.5 m in NMAD. However, several problems remain to be solved, including obtaining precise EOPs and handling occlusion and image blurring. A more effective interpolation technique also needs to be developed in the future.
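The grey-level correlation step described above can be illustrated with normalized cross-correlation along a (here assumed horizontal and rectified) epipolar line. Window size, search range, and the function name are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def match_along_epipolar(left, right, row, col, win=3, search=10):
    """For a win x win window around (row, col) in the left image, find
    the disparity d in [0, search] whose window in the right image
    maximizes normalized cross-correlation (NCC) along the same row."""
    h = win // 2
    ref = left[row - h:row + h + 1, col - h:col + h + 1].astype(float)
    ref = (ref - ref.mean()) / (ref.std() + 1e-9)   # zero-mean, unit-norm
    best_d, best_ncc = 0, -np.inf
    for d in range(search + 1):
        c = col - d                                  # candidate column
        if c - h < 0:
            break                                    # out of image bounds
        cand = right[row - h:row + h + 1, c - h:c + h + 1].astype(float)
        cand = (cand - cand.mean()) / (cand.std() + 1e-9)
        ncc = (ref * cand).mean()
        if ncc > best_ncc:
            best_d, best_ncc = d, ncc
    return best_d
```

Dense matching repeats this for every pixel of every stereo pair; the resulting disparities are triangulated into the point cloud behind the DSM.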

  18. An Analysis of Intelligent Automation Demands in Taiwanese Firms

    Directory of Open Access Journals (Sweden)

    Ying-Mei Tai

    2016-03-01

Full Text Available To accurately elucidate the production deployment, process intelligent automation (IA), and production bottlenecks of Taiwanese companies, as well as the IA application status, this research conducted a structured questionnaire survey among the participants of the IA promotion activities arranged by the Industrial Development Bureau, Ministry of Economic Affairs. A total of 35 valid questionnaires were recovered. Research findings indicated that the majority of participants were large-scale enterprises. These enterprises anticipated adding production bases in Taiwan and China to transition and upgrade their operations or strengthen their influence in the domestic market. The degrees of various process IA and production bottlenecks were relatively low, which was associated with the tendency toward small-volume production of diversified products. The majority of sub-categories of hardware equipment and simulation technologies have reached maturity, and the effective application of these technologies can enhance production efficiency. Intelligent software technologies remain immature and need further development and application. More importantly, they can meet customer values and create new business models, so as to satisfy the purpose of sustainable development.

  19. Automated analysis of sleep-wake state in rats.

    Science.gov (United States)

    Stephenson, Richard; Caron, Aimee M; Cassel, Daniel B; Kostela, Jennifer C

    2009-11-15

A fully automated computer-based sleep scoring system is described and validated for use in rats. The system was designed to emulate visual sleep scoring by using the same basic features of the electroencephalogram (EEG) and electromyogram (EMG), and a similar set of decision-making rules. State indices are calculated for each 5 s epoch by combining the amplitudes (µVrms) of 6 filtered EEG frequency bands (EEGlo, d.c.-1.5 Hz; delta, 1.5-6 Hz; theta, 6-9 Hz; alpha, 10.5-15 Hz; beta, 22-30 Hz; gamma, 35-45 Hz; ΣEEG = delta + theta + alpha + beta + gamma) and EMG (10-100 Hz), yielding dimensionless ratios: WAKE-index = (EMG × gamma)/theta; NREM-index = (delta × alpha)/gamma²; REM-index = theta³/(delta × alpha × EMG); artifact-index = [(2 × EEGlo) + beta] × (gamma/ΣEEG). The index values are re-scaled and normalized, thereby dispensing with the need for animal-specific threshold values. The system was validated by direct comparison with visually scored data in 9 rats. Overall, the computer and visual scores were 96% concordant, which is similar to inter-rater agreement in visual scoring. The false-positive error rate was also evaluated in studies lasting 5 weeks. The system was implemented and further validated in a study of sleep architecture in 7 rats under a 12:12 h LD cycle. PMID:19703489
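The per-epoch index formulas above translate directly into code. This sketch computes the raw (un-rescaled) indices from band amplitudes; the cross-epoch re-scaling and normalization steps of the published system are omitted for brevity.

```python
def state_indices(eeg_lo, delta, theta, alpha, beta, gamma, emg):
    """Compute the per-epoch state indices defined above from filtered
    band amplitudes (all in microvolts rms for one 5 s epoch)."""
    wake = (emg * gamma) / theta
    nrem = (delta * alpha) / gamma ** 2
    rem = theta ** 3 / (delta * alpha * emg)
    sigma_eeg = delta + theta + alpha + beta + gamma
    artifact = (2 * eeg_lo + beta) * (gamma / sigma_eeg)
    return {"WAKE": wake, "NREM": nrem, "REM": rem, "ARTIFACT": artifact}
```

With slow-wave-dominated input (high delta and alpha, low gamma and EMG) the NREM index dominates, as the decision rules intend.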

  20. Histogram analysis with automated extraction of brain-tissue region from whole-brain CT images

    OpenAIRE

    Kondo, Masatoshi; Yamashita, Koji; Yoshiura, Takashi; Hiwatash, Akio; Shirasaka, Takashi; Arimura, Hisao; Nakamura, Yasuhiko; Honda, Hiroshi

    2015-01-01

We studied whether automated extraction of the brain-tissue region from whole-brain CT images is useful for histogram analysis of the brain-tissue region. We used the CT images of 11 patients. We developed an automatic brain-tissue extraction algorithm. We evaluated the similarity index of this automated extraction method relative to manual extraction, and we compared the mean CT number of all extracted pixels and the kurtosis and skewness of the distribution of CT numbers of all ext...

  1. Analysis of natural waters with an automated inductively-coupled plasma spectrometer system

    International Nuclear Information System (INIS)

A commercial ICP spectrometer system has been automated to provide unattended operation and data collection following initializing commands and loading of the sample changer. Automation is provided by a microcomputer which permits interconnection of a sample changer, a card reader, a high-speed printer terminal, a dual floppy disk drive, and the spectrometer's basic computer. Application of the system to the analysis of natural water samples is described. Accuracy and precision data, both for short and long periods, as determined with standard and reference water samples, are presented. Analytical data presentation formats can be altered with the system. Some aspects of data handling and manipulation external to the system are outlined

  2. Detailed interrogation of trypanosome cell biology via differential organelle staining and automated image analysis

    Directory of Open Access Journals (Sweden)

    Wheeler Richard J

    2012-01-01

Full Text Available Abstract Background Many trypanosomatid protozoa are important human or animal pathogens. The well defined morphology and precisely choreographed division of trypanosomatid cells makes morphological analysis a powerful tool for analyzing the effect of mutations, chemical insults and changes between lifecycle stages. High-throughput image analysis of micrographs has the potential to accelerate collection of quantitative morphological data. Trypanosomatid cells have two large DNA-containing organelles, the kinetoplast (mitochondrial DNA) and nucleus, which provide useful markers for morphometric analysis; however they need to be accurately identified and often lie in close proximity. This presents a technical challenge. Accurate identification and quantitation of the DNA content of these organelles is a central requirement of any automated analysis method. Results We have developed a technique based on double staining of the DNA with a minor-groove-binding stain (4′,6-diamidino-2-phenylindole; DAPI) and a base-pair-intercalating stain (propidium iodide; PI, or SYBR green) and color deconvolution. This allows the identification of kinetoplast and nuclear DNA in the micrograph based on whether the organelle has DNA with a more A-T or G-C rich composition. Following unambiguous identification of the kinetoplasts and nuclei the resulting images are amenable to quantitative automated analysis of kinetoplast and nucleus number and DNA content. On this foundation we have developed a demonstrative analysis tool capable of measuring kinetoplast and nucleus DNA content, size and position and cell body shape, length and width automatically. Conclusions Our approach to DNA staining and automated quantitative analysis of trypanosomatid morphology accelerated analysis of trypanosomatid protozoa. We have validated this approach using Leishmania mexicana, Crithidia fasciculata and wild-type and mutant Trypanosoma brucei. Automated analysis of T. brucei
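The discrimination principle above, that kinetoplast DNA is A-T rich and so stains relatively brighter with a minor-groove binder such as DAPI than with an intercalator, can be caricatured in a one-line classifier. The threshold and function name are illustrative assumptions; the published method uses color deconvolution on the full micrograph, not per-organelle scalar intensities.

```python
def classify_organelle(dapi_intensity, pi_intensity, ratio_threshold=1.0):
    """Toy version of the two-stain discrimination described above:
    a high DAPI/intercalator intensity ratio flags the A-T rich
    kinetoplast; otherwise the organelle is called a nucleus."""
    ratio = dapi_intensity / pi_intensity
    return "kinetoplast" if ratio > ratio_threshold else "nucleus"
```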

  3. Automated interpretation of ventilation-perfusion lung scintigrams for the diagnosis of pulmonary embolism using artificial neural networks

    International Nuclear Information System (INIS)

    The purpose of this study was to develop a completely automated method for the interpretation of ventilation-perfusion (V-P) lung scintigrams used in the diagnosis of pulmonary embolism. An artificial neural network was trained for the diagnosis of pulmonary embolism using 18 automatically obtained features from each set of V-P scintigrams. The techniques used to process the images included their alignment to templates, the construction of quotient images based on the ventilation and perfusion images, and the calculation of measures describing V-P mismatches in the quotient images. The templates represented lungs of normal size and shape without any pathological changes. Images that could not be properly aligned to the templates were detected and excluded automatically. After exclusion of those V-P scintigrams not properly aligned to the templates, 478 V-P scintigrams remained in a training group of consecutive patients with suspected pulmonary embolism, and a further 87 V-P scintigrams formed a separate test group comprising patients who had undergone pulmonary angiography. The performance of the neural network, measured as the area under the receiver operating characteristic curve, was 0.87 (95% confidence limits 0.82-0.92) in the training group and 0.79 (0.69-0.88) in the test group. It is concluded that a completely automated method can be used for the interpretation of V-P scintigrams. The performance of this method is similar to others previously presented, whereby features were extracted manually. (orig.)
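The performance figure quoted above, area under the receiver operating characteristic curve, equals the Mann-Whitney statistic: the probability that a randomly chosen positive case scores higher than a randomly chosen negative one. A brute-force sketch (our own illustration, not the study's evaluation code):

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic,
    counting ties as half a win.  scores_pos/scores_neg are the
    classifier outputs for positive and negative cases."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

An AUC of 0.87 therefore means an 87% chance that a patient with embolism receives a higher network output than one without.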

  4. Automated interpretation of ventilation-perfusion lung scintigrams for the diagnosis of pulmonary embolism using artificial neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Holst, H.; Jaerund, A.; Traegil, K.; Evander, E.; Edenbrandt, L. [Department of Clinical Physiology, Lund University, Lund (Sweden); Aastroem, K.; Heyden, A.; Kahl, F.; Sparr, G. [Department of Mathematics, Lund Institute of Technology, Lund (Sweden); Palmer, J. [Department of Radiation Physics, Lund University, Lund (Sweden)

    2000-04-01

    The purpose of this study was to develop a completely automated method for the interpretation of ventilation-perfusion (V-P) lung scintigrams used in the diagnosis of pulmonary embolism. An artificial neural network was trained for the diagnosis of pulmonary embolism using 18 automatically obtained features from each set of V-P scintigrams. The techniques used to process the images included their alignment to templates, the construction of quotient images based on the ventilation and perfusion images, and the calculation of measures describing V-P mismatches in the quotient images. The templates represented lungs of normal size and shape without any pathological changes. Images that could not be properly aligned to the templates were detected and excluded automatically. After exclusion of those V-P scintigrams not properly aligned to the templates, 478 V-P scintigrams remained in a training group of consecutive patients with suspected pulmonary embolism, and a further 87 V-P scintigrams formed a separate test group comprising patients who had undergone pulmonary angiography. The performance of the neural network, measured as the area under the receiver operating characteristic curve, was 0.87 (95% confidence limits 0.82-0.92) in the training group and 0.79 (0.69-0.88) in the test group. It is concluded that a completely automated method can be used for the interpretation of V-P scintigrams. The performance of this method is similar to others previously presented, whereby features were extracted manually. (orig.)

  5. Automated red blood cell analysis compared with routine red blood cell morphology by smear review

    Directory of Open Access Journals (Sweden)

    Dr.Poonam Radadiya

    2015-01-01

Full Text Available The RBC histogram is an integral part of automated haematology analysis and is now routinely available on all automated cell counters. This histogram and other associated complete blood count (CBC) parameters have been found to be abnormal in various haematological conditions and may provide major clues in the diagnosis and management of significant red cell disorders. Performing manual blood smears is important to ensure the quality of blood count results and to make a presumptive diagnosis. In this article, we compared RBC histograms obtained by an automated haematology analyzer with peripheral blood smear review in 100 samples. This article discusses some morphological features of dimorphism and the ensuing characteristic changes in their RBC histograms.

  6. An automated analysis of wide area motion imagery for moving subject detection

    Science.gov (United States)

    Tahmoush, Dave

    2015-05-01

Automated analysis of wide area motion imagery (WAMI) can significantly reduce the effort required for converting data into reliable decisions. We register consecutive WAMI frames and use false-color frame comparisons to enhance the visual detection of possible subjects in the imagery. The large number of WAMI detections produces the need for a prioritization of detections for further inspection. We create a priority queue of detections for automated revisit with smaller field-of-view assets based on the locations of the movers as well as the probability of the detection. This automated queue works within an operator's preset prioritizations but also allows the flexibility to dynamically respond to new events as well as incorporating additional information into the surveillance tasking.
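The revisit queue described above can be sketched with Python's `heapq`. The tuple layout, class-priority table, and function names are illustrative assumptions about how operator presets and detection probability might be combined.

```python
import heapq

def build_revisit_queue(detections, priority_of_class):
    """Order candidate movers for revisit by an operator-preset class
    priority, breaking ties by detection probability.  `detections` is
    a list of (location, detection_class, probability) tuples."""
    heap = []
    for i, (loc, det_class, prob) in enumerate(detections):
        # heapq is a min-heap, so negate both keys so that higher
        # priority and higher probability pop first; the index i makes
        # the tuples totally ordered without comparing locations.
        heapq.heappush(heap, (-priority_of_class.get(det_class, 0), -prob, i, loc))
    while heap:
        _, neg_prob, _, loc = heapq.heappop(heap)
        yield loc, -neg_prob
```

New events can be pushed onto the heap at any time, which is what lets the queue respond dynamically while respecting the preset prioritizations.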

  7. Trend Analysis on the Automation of the Notebook PC Production Process

    Directory of Open Access Journals (Sweden)

    Chin-Ching Yeh

    2012-09-01

    Notebook PCs are among the Taiwanese electronic products that generate the highest production value and market share. According to ITRI IEK statistics, the domestic notebook PC production value in 2011 was about NT$2.3 trillion. Of the roughly 200 million notebook PCs sold in global markets in 2011, Taiwan's output accounted for more than 90%, meaning that nine out of every ten notebook PCs in the world are manufactured by Taiwanese companies. For an industry of such output value and volume, the degree of automation in its production processes is not high. This suggests either that there is still great room for automating the notebook PC production process, or that the degree of automation of that process cannot easily be enhanced further. This paper presents an analysis of the situation.

  8. Automated identification of mitochondrial regions in complex intracellular space by texture analysis

    Science.gov (United States)

    Pham, Tuan D.

    2014-01-01

    Automated processing and quantification of biological images have been rapidly gaining attention from researchers in image processing and pattern recognition, because computerized image and pattern analyses play critical roles in new biological findings and in drug discovery based on modern high-throughput and high-content image screening. This paper presents a study of the automated detection of regions of mitochondria, a subcellular structure of eukaryotic cells, in microscopy images. The automated identification of mitochondria in intracellular space captured by the state-of-the-art combination of focused ion beam and scanning electron microscope imaging reported here is the first of its type. Existing methods and a proposed algorithm for texture analysis were tested with real intracellular images. The high rate of correctly detecting the locations of the mitochondria in a complex environment suggests the effectiveness of the proposed approach.
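    The record does not disclose the exact texture algorithm, but a generic texture feature of the kind used in such work — grey-level co-occurrence matrix (GLCM) contrast — separates a flat patch from a striped one. The two synthetic patches below are illustrative, not electron-microscopy data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two synthetic 32x32 patches quantized to 8 grey levels: a smooth,
# cytoplasm-like region versus a striped, membrane-like texture.
smooth = (rng.normal(size=(32, 32)) * 0.5 + 4).clip(0, 7).astype(int)
stripes = np.tile(np.array([0, 7]), (32, 16))   # alternating dark/bright columns

def glcm_contrast(img, levels=8):
    """Contrast of the horizontal-neighbour grey-level co-occurrence matrix."""
    glcm = np.zeros((levels, levels))
    for a, b in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
        glcm[a, b] += 1
    glcm /= glcm.sum()
    i, j = np.indices(glcm.shape)
    return ((i - j) ** 2 * glcm).sum()

c_smooth = glcm_contrast(smooth)
c_stripes = glcm_contrast(stripes)   # every neighbour pair differs by 7 -> contrast 49
```

    Thresholding such features per image block is one simple way to mark candidate textured regions for further inspection.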

  9. The International Trade Network: weighted network analysis and modelling

    International Nuclear Information System (INIS)

    Tools of the theory of critical phenomena, namely scaling analysis and universality, are argued to be applicable to large complex web-like network structures. Using a detailed analysis of real data from the International Trade Network (ITN), we argue that the scaled link weight distribution is approximately log-normal and remains robust over a period of 53 years. Another universal feature is observed in the power-law growth of trade strength with gross domestic product, the exponent being similar for all countries. Using the 'rich-club' coefficient measure of weighted networks, it is shown that the size of the rich club controlling half of the world's trade is actually shrinking. While the gravity law is known to describe well the social interactions in static networks of population migration, international trade, etc., here for the first time we study a non-conservative dynamical model based on the gravity law, which reproduces many empirical features of the ITN.
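    A minimal gravity-law trade model of the kind alluded to above can be written down directly. The GDPs, distances, and inverse-square exponent below are toy values for illustration, not fitted ITN data:

```python
import numpy as np

# Toy gravity-law trade model: flux F_ij ~ G * GDP_i * GDP_j / d_ij^2.
# GDPs and pairwise distances are in arbitrary illustrative units.
gdp = np.array([21.0, 14.0, 5.0, 3.0])
dist = np.array([
    [0, 2, 3, 4],
    [2, 0, 4, 6],
    [3, 4, 0, 3],
    [4, 6, 3, 0],
], dtype=float)

G = 1.0
n = len(gdp)
flux = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if i != j:
            flux[i, j] = G * gdp[i] * gdp[j] / dist[i, j] ** 2

# Node "trade strength" = total flux handled by each country; in this toy
# setting the largest economy also carries the largest strength.
strength = flux.sum(axis=1)
```
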

  10. Automating case reports for the analysis of digital evidence

    OpenAIRE

    Cassidy, Regis H. Friend

    2005-01-01

    The reporting process during computer analysis is critical in the practice of digital forensics. Case reports are used to review the process and results of an investigation and serve multiple purposes. The investigator may refer to these reports to monitor the progress of his analysis throughout the investigation. When acting as an expert witness, the investigator will refer to organized documentation to recall past analysis. A lot of time can elapse between the analysis and the actual testimony...

  11. Policy-Based Automation of Dynamic Multipoint Virtual Private Network Simulation on OPNET Modeler

    OpenAIRE

    Ayoub BAHNASSE; Najib EL KAMOUN

    2014-01-01

    The simulation of large-scale networks is a challenging task, especially if the network to simulate is a Dynamic Multipoint Virtual Private Network, which requires expert knowledge to properly configure its component technologies. The study of these network architectures in a real environment is almost impossible because it requires a very large amount of equipment; however, this task is feasible in a simulation environment such as OPNET Modeler, provided one masters both the tool and the different ...

  12. Comparison of semi-automated image analysis and manual methods for tissue quantification in pancreatic carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Sims, A.J. [Regional Medical Physics Department, Freeman Hospital, Newcastle upon Tyne (United Kingdom)]. E-mail: a.j.sims@newcastle.ac.uk; Murray, A. [Regional Medical Physics Department, Freeman Hospital, Newcastle upon Tyne (United Kingdom); Bennett, M.K. [Department of Histopathology, Newcastle upon Tyne Hospitals NHS Trust, Newcastle upon Tyne (United Kingdom)

    2002-04-21

    Objective measurements of tissue area during histological examination of carcinoma can yield valuable prognostic information. However, such measurements are not made routinely because the current manual approach is time consuming and subject to large statistical sampling error. In this paper, a semi-automated image analysis method for measuring tissue area in histological samples is applied to the measurement of stromal tissue, cell cytoplasm and lumen in samples of pancreatic carcinoma and compared with the standard manual point counting method. Histological samples from 26 cases of pancreatic carcinoma were stained using the sirius red, light-green method. Images from each sample were captured using two magnifications. Image segmentation based on colour cluster analysis was used to subdivide each image into representative colours which were classified manually into one of three tissue components. Area measurements made using this technique were compared to corresponding manual measurements and used to establish the comparative accuracy of the semi-automated image analysis technique, with a quality assurance study to measure the repeatability of the new technique. For both magnifications and for each tissue component, the quality assurance study showed that the semi-automated image analysis algorithm had better repeatability than its manual equivalent. No significant bias was detected between the measurement techniques for any of the comparisons made using the 26 cases of pancreatic carcinoma. The ratio of manual to semi-automatic repeatability errors varied from 2.0 to 3.6. Point counting would need to be increased to be between 400 and 1400 points to achieve the same repeatability as for the semi-automated technique. The results demonstrate that semi-automated image analysis is suitable for measuring tissue fractions in histological samples prepared with coloured stains and is a practical alternative to manual point counting. (author)
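    The colour-cluster segmentation step can be sketched with a plain k-means clustering of pixel colours, after which each cluster's pixel share gives a tissue area fraction. The pixel distributions and seed colours below are synthetic stand-ins for stained-section data (the actual study classified the resulting colour clusters into tissue components manually):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stained-section pixels in RGB: three colour populations standing
# in for stroma (reddish), cytoplasm (greenish) and lumen (near-white).
# Purely illustrative values, not real sirius red / light-green data.
stroma = rng.normal([200, 60, 60], 8.0, size=(500, 3))
cyto = rng.normal([70, 180, 90], 8.0, size=(300, 3))
lumen = rng.normal([240, 240, 240], 8.0, size=(200, 3))
pixels = np.vstack([stroma, cyto, lumen])

def kmeans(X, centres, iters=20):
    """Plain k-means from given seed colours: assign, re-estimate, repeat."""
    centres = centres.astype(float)
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1), axis=1)
        centres = np.array([X[labels == k].mean(axis=0) for k in range(len(centres))])
    return labels, centres

seed_colours = np.array([[255, 0, 0], [0, 255, 0], [255, 255, 255]])
labels, centres = kmeans(pixels, seed_colours)

# Tissue area fraction of each colour class = share of pixels in its cluster.
fractions = np.bincount(labels, minlength=3) / len(pixels)
```

    Because every pixel is counted, the sampling error of such fractions is far smaller than that of manual point counting over a sparse grid, which is the repeatability advantage the study measures.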

  14. Network analysis of GWAS data

    OpenAIRE

    Leiserson, Mark DM; Eldridge, Jonathan V.; Ramachandran, Sohini; Raphael, Benjamin J

    2013-01-01

    Genome-wide association studies (GWAS) identify genetic variants that distinguish a control population from a population with a specific trait. Two challenges in GWAS are: (1) identification of the causal variant within a longer haplotype that is associated with the trait; (2) identification of causal variants for polygenic traits that are caused by variants in multiple genes within a pathway. We review recent methods that use information in protein–protein and protein–DNA interaction network...

  15. Economic Perspectives on Automated Demand Responsive Transportation and Shared Taxi Services - Analytical models and simulations for policy analysis

    OpenAIRE

    Jokinen, Jani-Pekka

    2016-01-01

    The automated demand responsive transportation (DRT) and modern shared taxi services provide shared trips for passengers, adapting dynamically to trip requests by routing a fleet of vehicles operating without any fixed routes or schedules. Compared with traditional public transportation, these new services provide trips without transfers and free passengers from the necessity of using timetables and maps of route networks. Furthermore, automated DRT applies real-time traffic information in ve...

  16. Network Anomaly Detection Based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Ali A. Ghorbani

    2008-11-01

    Signal processing techniques have recently been applied to analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining wavelet approximation and system identification theory. To characterize network traffic behaviour, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network where five attack types are successfully detected from over 30 million flows.
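    The wavelet-approximation idea can be illustrated in miniature: smooth one traffic feature with a coarse Haar average, then flag samples whose residual from that approximation is anomalously large. The signal, injected burst, and threshold rule below are invented for illustration and are far simpler than the paper's system-identification approach:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic per-interval flow counts with a burst anomaly injected --
# an illustrative stand-in for one of the fifteen traffic features.
t = np.arange(256)
traffic = 100 + 10 * np.sin(2 * np.pi * t / 64) + rng.normal(0, 2, 256)
traffic[140:148] += 100   # simulated attack burst

def haar_approx(x, levels=3):
    """Coarse Haar-wavelet approximation: average pairs, then upsample back."""
    a = x.astype(float)
    for _ in range(levels):
        a = (a[0::2] + a[1::2]) / 2.0
    for _ in range(levels):
        a = np.repeat(a, 2)
    return a

# Flag samples whose deviation from the smooth approximation is extreme.
residual = np.abs(traffic - haar_approx(traffic))
threshold = residual.mean() + 3 * residual.std()
anomalies = np.where(residual > threshold)[0]
```

    The flagged indices cluster around the injected burst (the coarse approximation smears it slightly across its 8-sample blocks), while the normal sinusoidal variation stays below threshold.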

  18. Social network analysis applied to team sports analysis

    CERN Document Server

    Clemente, Filipe Manuel; Mendes, Rui Sousa

    2016-01-01

    Explaining how graph theory and social network analysis can be applied to team sports analysis, this book presents useful approaches, models and methods that can be used to characterise the overall properties of team networks and identify the prominence of each team player. Exploring the different network metrics that can be utilised in sports analysis, their possible applications and variances from situation to situation, the respective chapters present an array of illustrative case studies. Introducing the general concepts of social network analysis and network centrality metrics, the book shows readers how to generate a methodological protocol for data collection. As such, it provides a valuable resource for students of the sport sciences, sports engineering, applied computation and the social sciences.

  19. Multimodal microscopy for automated histologic analysis of prostate cancer

    Directory of Open Access Journals (Sweden)

    Sinha Saurabh

    2011-02-01

    Background: Prostate cancer is the single most prevalent cancer in US men, and the gold standard of diagnosis is histologic assessment of biopsies. Manual assessment of stained tissue of all biopsies limits speed and accuracy in clinical practice and research of prostate cancer diagnosis. We sought to develop a fully-automated multimodal microscopy method to distinguish cancerous from non-cancerous tissue samples. Methods: We recorded chemical data from an unstained tissue microarray (TMA) using Fourier transform infrared (FT-IR) spectroscopic imaging. Using pattern recognition, we identified epithelial cells without user input. We fused the cell type information with the corresponding stained images commonly used in clinical practice. Extracted morphological features, optimized by a two-stage feature selection method using a minimum-redundancy-maximum-relevance (mRMR) criterion and sequential floating forward selection (SFFS), were applied to classify tissue samples as cancer or non-cancer. Results: We achieved high accuracy (area under the ROC curve (AUC) >0.97) in cross-validations on each of two data sets that were stained under different conditions. When the classifier was trained on one data set and tested on the other, an AUC value of ~0.95 was observed. In the absence of IR data, the performance of the same classification system dropped for both data sets and between data sets. Conclusions: We were able to achieve very effective fusion of the information from two different images that provide very different types of data with different characteristics. The method is entirely transparent to a user and does not involve any adjustment or decision-making based on spectral data. By combining the IR and optical data, we achieved highly accurate classification.
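    The mRMR criterion mentioned above greedily picks features that correlate strongly with the label but weakly with already-selected features. The sketch below uses absolute Pearson correlation as a simple stand-in for the mutual-information measures often used in mRMR, on synthetic features (one informative, one redundant copy, three noise):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic morphological features: f0 informative, f1 a near-duplicate
# of f0 (redundant), f2-f4 pure noise. Illustrative data only.
n = 400
y = rng.integers(0, 2, n)
f0 = y + rng.normal(0, 0.5, n)
f1 = f0 + rng.normal(0, 0.05, n)        # nearly duplicates f0
noise = rng.normal(size=(n, 3))
X = np.column_stack([f0, f1, noise])

def abs_corr(a, b):
    return abs(np.corrcoef(a, b)[0, 1])

def mrmr(X, y, k):
    """Greedy minimum-redundancy-maximum-relevance selection using
    absolute Pearson correlation as the relevance/redundancy proxy."""
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < k:
        def gain(j):
            rel = abs_corr(X[:, j], y)
            red = (np.mean([abs_corr(X[:, j], X[:, s]) for s in selected])
                   if selected else 0.0)
            return rel - red
        best = max(remaining, key=gain)
        selected.append(best)
        remaining.remove(best)
    return selected

chosen = mrmr(X, y, 2)
```

    The redundancy penalty keeps the near-duplicate feature out of the selected pair, which is exactly the behaviour that distinguishes mRMR from plain relevance ranking.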

  20. Radial basis function (RBF) neural network control for mechanical systems design, analysis and Matlab simulation

    CERN Document Server

    Liu, Jinkun

    2013-01-01

    Radial Basis Function (RBF) Neural Network Control for Mechanical Systems is motivated by the need for systematic design approaches to stable adaptive control system design using neural network approximation-based techniques. The main objectives of the book are to introduce the concrete design methods and MATLAB simulation of stable adaptive RBF neural control strategies. In this book, a broad range of implementable neural network control design methods for mechanical systems are presented, such as robot manipulators, inverted pendulums, single link flexible joint robots, motors, etc. Advanced neural network controller design methods and their stability analysis are explored. The book provides readers with the fundamentals of neural network control system design. This book is intended for researchers in the fields of neural adaptive control, mechanical systems, Matlab simulation, engineering design, robotics and automation. Jinkun Liu is a professor at Beijing University of Aeronautics and Astronautics...
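    The building block of such controllers — a Gaussian RBF layer with linear output weights — can be shown as a function approximator. The centres, width, and target function below are arbitrary illustrative choices, and the weights are fitted by least squares rather than by the adaptive laws the book develops:

```python
import numpy as np

rng = np.random.default_rng(6)

# Approximate an unknown scalar mapping with an RBF network: Gaussian
# hidden units on a fixed grid, linear output weights fitted by least
# squares. A minimal sketch of RBF approximation, not a control law.
x = np.linspace(-3, 3, 200)
y = np.sin(x) + 0.05 * rng.normal(size=x.size)   # target with mild noise

centers = np.linspace(-3, 3, 15)
width = 0.6

def rbf_features(x):
    """Gaussian activations of each hidden unit for every input sample."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

Phi = rbf_features(x)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

y_hat = Phi @ w
rmse = np.sqrt(np.mean((y_hat - np.sin(x)) ** 2))
```

    In adaptive control the same structure appears with the output weights updated online by a Lyapunov-based law instead of batch least squares.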

  1. Social Networks Analysis in Discovering the Narrative Structure of Literary Fiction

    CERN Document Server

    Jarynowski, Andrzej

    2016-01-01

    In our paper we would like to make a cross-disciplinary leap and use the tools of network theory to understand and explore narrative structure in literary fiction, an approach that is still underestimated. However, the systems in fiction are sensitive to reader subjectivity, and attention must be paid to the different methods of extracting networks. The project aims to investigate the different ways social interactions are read in texts by comparing networks produced by automated algorithms (natural language processing) with those created by surveying more subjective human responses. Conversation networks from fiction have already been extracted by scientists, but the more general framework surrounding these interactions was missing. We propose several NLP methods for detecting interactions and test them against a range of human perceptions. In doing so, we have pointed to some limitations of using network analysis to test literary theory, e.g. interaction, which corresponds to the plot, does not form a climax.
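    One of the simplest automated extraction heuristics such a project can compare against human readings — linking characters who co-occur in the same sentence — can be sketched directly. The text and character list here are invented:

```python
from itertools import combinations
from collections import Counter

# Tiny invented "novel": characters interact when named in the same sentence.
text = ("Anna met Boris at the station. Boris argued with Clara. "
        "Anna wrote to Clara. Boris met Anna again.")
characters = {"Anna", "Boris", "Clara"}

# Weighted interaction network: edge weight = number of co-occurrences.
edges = Counter()
for sentence in text.split("."):
    present = sorted(name for name in characters if name in sentence)
    for pair in combinations(present, 2):
        edges[pair] += 1
```

    More elaborate NLP detectors (dialogue attribution, coreference resolution) produce different edge sets from the same text, which is precisely the variation the project compares against surveyed human perceptions.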

  2. Historical Network Analysis of the Web

    DEFF Research Database (Denmark)

    Brügger, Niels

    2013-01-01

    This article discusses some of the fundamental methodological challenges related to doing historical network analyses of the web based on material in web archives. Since the late 1990s many countries have established extensive national web archives, and software supported network analysis...... revolve around the specific nature of archived web material. On the basis of an introduction to the processes involved in web archiving as well as of the characteristics of archived web material, the article outlines and scrutinizes some of the major challenges which may arise when doing network analysis...... in web archives, among others such issues as completeness, construction of a corpus, temporal and spatial inconsistencies, and cross-archive analyses. The article uses an ongoing case study to feed the methodological discussion, namely the political network on the web which was available to a voter...

  3. Cross-disciplinary detection and analysis of network motifs.

    Science.gov (United States)

    Tran, Ngoc Tam L; DeLuccia, Luke; McDonald, Aidan F; Huang, Chun-Hsi

    2015-01-01

    The detection of network motifs has recently become an important part of network analysis across all disciplines. In this work, we detected and analyzed network motifs from undirected and directed networks of several different disciplines, including biological, social, and ecological networks, as well as other networks such as airline, power grid, and co-purchase of political books networks. Our analysis revealed that undirected networks are similar at the level of basic three- and four-node motifs, while the analysis of directed networks revealed distinctions between networks of different disciplines. The study showed that larger motifs contained the three-node motif as a subgraph. Topological analysis revealed that similar networks have similar small motifs, but as the motif size increases, differences arise. The Pearson correlation coefficient showed a strong positive relationship between some undirected networks but an inverse relationship between some directed networks. The study suggests that the three-node motif is a building block of larger motifs. It also suggests that undirected networks share similar low-level structures. Moreover, similar networks share similar small motifs, but larger motifs define the unique structure of individuals. The Pearson correlation coefficient suggests that the protein structure networks, dolphin social network, and co-authorships in network science belong to a superfamily. In addition, the yeast protein-protein interaction network, primary school contact network, Zachary's karate club network, and co-purchase of political books network can be classified into a superfamily. PMID:25983553
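    Counting the basic three-node motifs of an undirected graph — triangles versus open two-edge paths — only requires checking each node triple. The edge list below is a made-up example, not one of the paper's networks:

```python
from itertools import combinations

# Small undirected example network (hypothetical edges).
edges = {(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (2, 4), (4, 5)}
nodes = {u for e in edges for u in e}
adj = {u: set() for u in nodes}
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

def count_three_node_motifs(nodes, adj):
    """Classify every connected 3-node subgraph as a triangle or an open path."""
    triangles = paths = 0
    for a, b, c in combinations(sorted(nodes), 3):
        k = (b in adj[a]) + (c in adj[a]) + (c in adj[b])
        if k == 3:
            triangles += 1
        elif k == 2:
            paths += 1
    return triangles, paths

triangles, paths = count_three_node_motifs(nodes, adj)
```

    Real motif detection compares these counts against randomized networks with the same degree sequence; the enumeration above is just the counting core.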

  4. An automated method for estimating reliability of grid systems using Bayesian networks

    International Nuclear Information System (INIS)

    Grid computing has become relevant due to its applications to large-scale resource sharing, wide-area information transfer, and multi-institutional collaboration. In general, in grid computing a service requests the use of a set of resources, available in a grid, to complete certain tasks. Although analysis tools and techniques for these types of systems have been studied, grid reliability analysis is generally computation-intensive due to the complexity of the system. Moreover, conventional reliability models rest on common assumptions that cannot be applied to grid systems. Therefore, new analytical methods are needed for effective and accurate assessment of grid reliability. This study presents a new method for estimating grid service reliability which, unlike previous studies, does not require prior knowledge about the grid system structure. Moreover, the proposed method does not rely on any assumptions about the link and node failure rates. The approach uses a data-mining algorithm, the K2 algorithm, to discover the grid system structure from raw historical system data, which makes it possible to find minimum resource spanning trees (MRST) within the grid; it then uses Bayesian networks (BN) to model the MRST and estimate grid service reliability.
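    To make the target quantity concrete, grid service reliability under independent link failures can also be estimated by brute-force Monte Carlo simulation — a far simpler baseline than the K2/Bayesian-network method above. The topology and failure probabilities here are hypothetical:

```python
import random

random.seed(5)

# Toy grid: links with independent failure probabilities (hypothetical).
links = {
    ("user", "router"): 0.05,
    ("router", "siteA"): 0.10,
    ("router", "siteB"): 0.10,
    ("siteA", "resource"): 0.02,
    ("siteB", "resource"): 0.02,
}

def connected(up_links, src, dst):
    """BFS over the links that survived this trial."""
    adj = {}
    for u, v in up_links:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    seen, frontier = {src}, [src]
    while frontier:
        node = frontier.pop()
        for nxt in adj.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return dst in seen

# Service reliability = P(user can still reach the resource).
trials = 20000
ok = sum(
    connected([l for l, p in links.items() if random.random() > p],
              "user", "resource")
    for _ in range(trials)
)
reliability = ok / trials
```

    For this topology the exact value is 0.95 × (1 − (1 − 0.9 × 0.98)²) ≈ 0.937; the simulation converges to it, but scales poorly, which is why analytical BN-based methods matter for real grids.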

  5. Kinetic analysis of complex metabolic networks

    Energy Technology Data Exchange (ETDEWEB)

    Stephanopoulos, G. [MIT, Cambridge, MA (United States)

    1996-12-31

    A new methodology is presented for the analysis of complex metabolic networks with the goal of metabolite overproduction. The objective is to locate a small number of reaction steps in a network that have maximum impact on network flux amplification and whose rate can also be increased without functional network derangement. This method extends the concepts of Metabolic Control Analysis to groups of reactions and offers the means for calculating group control coefficients as measures of the control exercised by groups of reactions on the overall network fluxes and intracellular metabolite pools. It is further demonstrated that the optimal strategy for the effective increase of network fluxes, while maintaining an uninterrupted supply of intermediate metabolites, is through the coordinated amplification of multiple (as opposed to single) reaction steps. Satisfying this requirement invokes the concept of the concentration control coefficient, which emerges as a critical parameter in the identification of feasible enzymatic modifications with maximal impact on the network flux. A case study of aromatic amino acid production is provided to illustrate these concepts.
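    Flux control coefficients — the single-step quantities that group control coefficients generalize — can be computed numerically for a toy two-step pathway. The rate laws and constants below are illustrative; the summation theorem (coefficients summing to 1) serves as a sanity check:

```python
# Toy two-step linear pathway: v1 = k1*e1*(X0 - S), v2 = k2*e2*S.
# At steady state v1 = v2, giving the closed-form flux J(e1, e2) below.
# Flux control coefficients C_i = d ln J / d ln e_i are estimated by
# small relative perturbations of each enzyme level.
k1, k2, X0 = 2.0, 1.0, 10.0

def flux(e1, e2):
    return k1 * e1 * k2 * e2 * X0 / (k1 * e1 + k2 * e2)

def control_coeff(i, e1, e2, h=1e-6):
    base = flux(e1, e2)
    bumped = flux(e1 * (1 + h), e2) if i == 1 else flux(e1, e2 * (1 + h))
    return (bumped - base) / (base * h)

e1, e2 = 1.0, 1.0
C1 = control_coeff(1, e1, e2)   # analytically k2*e2/(k1*e1 + k2*e2) = 1/3
C2 = control_coeff(2, e1, e2)   # analytically k1*e1/(k1*e1 + k2*e2) = 2/3
```

    Because C1 + C2 = 1, amplifying only the "controlling" step shifts control to the other step, which is why coordinated amplification of multiple steps is the effective strategy described above.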

  6. Automating Mid- and Long-Range Scheduling for NASA's Deep Space Network

    Science.gov (United States)

    Johnston, Mark D.; Tran, Daniel; Arroyo, Belinda; Sorensen, Sugi; Tay, Peter; Carruth, Butch; Coffman, Adam; Wallace, Mike

    2012-01-01

    NASA has recently deployed a new mid-range scheduling system for the antennas of the Deep Space Network (DSN), called Service Scheduling Software, or S³. This system is architected as a modern web application containing a central scheduling database integrated with a collaborative environment, exploiting the same technologies as social web applications but applied to a space operations context. This is highly relevant to the DSN domain since the network schedule of operations is developed in a peer-to-peer negotiation process among all users who utilize the DSN (representing 37 projects including international partners and ground-based science and calibration users). The initial implementation of S³ is complete and the system has been operational since July 2011. S³ has been used for negotiating schedules since April 2011, including the baseline schedules for three launching missions in late 2011. S³ supports a distributed scheduling model, in which changes can potentially be made by multiple users based on multiple schedule "workspaces" or versions of the schedule. This has led to several challenges in the design of the scheduling database, and of a change proposal workflow that allows users to concur with or to reject proposed schedule changes, and then counter-propose with alternative or additional suggested changes. This paper describes some key aspects of the S³ system and lessons learned from its operational deployment to date, focusing on the challenges of multi-user collaborative scheduling in a practical and mission-critical setting. We will also describe the ongoing project to extend S³ to encompass long-range planning, downtime analysis, and forecasting, as the next step in developing a single integrated DSN scheduling tool suite to cover all time ranges.

  7. Medical image analysis with artificial neural networks.

    Science.gov (United States)

    Jiang, J; Trundle, P; Ren, J

    2010-12-01

    Given that neural networks have been widely reported in the research community of medical imaging, we provide a focused literature survey on recent neural network developments in computer-aided diagnosis, medical image segmentation and edge detection towards visual content analysis, and medical image registration for its pre-processing and post-processing, with the aims of increasing awareness of how neural networks can be applied to these areas and to provide a foundation for further research and practical development. Representative techniques and algorithms are explained in detail to provide inspiring examples illustrating: (i) how a known neural network with fixed structure and training procedure could be applied to resolve a medical imaging problem; (ii) how medical images could be analysed, processed, and characterised by neural networks; and (iii) how neural networks could be expanded further to resolve problems relevant to medical imaging. In the concluding section, a highlight of comparisons among many neural network applications is included to provide a global view on computational intelligence with neural networks in medical imaging. PMID:20713305

  8. Automated Asteroseismic Analysis of Solar-type Stars

    DEFF Research Database (Denmark)

    Karoff, Christoffer; Campante, T.L.; Chaplin, W.J.

    2010-01-01

    the possibility to do population studies on large samples of stars and such population studies demand a consistent analysis. By consistent analysis we understand an analysis that can be performed without the need to make any subjective choices on e.g. mode identification and an analysis where the uncertainties......, radius, luminosity, effective temperature, surface gravity and age based on grid modeling. All the tools take into account the window function of the observations which means that they work equally well for space-based photometry observations from e.g. the NASA Kepler satellite and ground-based velocity...

  9. Crawling Facebook for Social Network Analysis Purposes

    OpenAIRE

    Catanese, Salvatore A.; De Meo, Pasquale; Ferrara, Emilio; Fiumara, Giacomo; Provetti, Alessandro

    2011-01-01

    We describe our work in the collection and analysis of massive data describing the connections between participants to online social networks. Alternative approaches to social network data collection are defined and evaluated in practice, against the popular Facebook Web site. Thanks to our ad-hoc, privacy-compliant crawlers, two large samples, comprising millions of connections, have been collected; the data is anonymous and organized as an undirected graph. We describe a set of tools that w...

  10. Information Flow Analysis of Interactome Networks

    OpenAIRE

    Missiuro, Patrycja Vasilyev; Liu, Kesheng; Zou, Lihua; Zhao, Guoyan; Ge, Hui; Ross, Brian C.; Liu, Jun

    2009-01-01

    Recent studies of cellular networks have revealed modular organizations of genes and proteins. For example, in interactome networks, a module refers to a group of interacting proteins that form molecular complexes and/or biochemical pathways and together mediate a biological process. However, it is still poorly understood how biological information is transmitted between different modules. We have developed information flow analysis, a new computational approach that identifies proteins centr...

  11. Multilayer Analysis and Visualization of Networks

    CERN Document Server

    De Domenico, Manlio; Arenas, Alex

    2014-01-01

    Multilayer relationships among and information about biological entities must be accompanied by the means to analyze, visualize, and obtain insights from such data. We report a methodology and a collection of algorithms for the analysis of multilayer networks in our new open-source software (muxViz). We demonstrate the ability of muxViz to analyze and interactively visualize multilayer data using empirical genetic and neuronal networks.

  12. Social network analysis of study environment

    OpenAIRE

    Blaženka Divjak; Petra Peharda

    2010-01-01

    Student working environment influences student learning and achievement level. In this respect the social aspects of students' formal and non-formal learning play a special role in the learning environment. The main research problem of this paper is to find out whether students' academic performance influences their position in different students' social networks. Further, there is a need to identify other predictors of this position. In the process of problem solving we use Social Network Analysis...
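    Degree centrality, one of the standard SNA measures of a node's position that such a study might relate to academic performance, takes only a few lines to compute. The student network below is fictional:

```python
# Toy student collaboration network (hypothetical study groups).
edges = [("ana", "ben"), ("ana", "cara"), ("ben", "cara"),
         ("cara", "dan"), ("dan", "eva")]

# Count ties per student.
degree = {}
for u, v in edges:
    degree[u] = degree.get(u, 0) + 1
    degree[v] = degree.get(v, 0) + 1

# Normalised degree centrality: ties divided by the (n - 1) possible ties.
n = len(degree)
centrality = {node: d / (n - 1) for node, d in degree.items()}
most_central = max(centrality, key=centrality.get)
```

    Betweenness or eigenvector centrality would capture brokerage and prestige positions that plain degree misses; which measure correlates with performance is an empirical question for the study.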

  13. Density functional and neural network analysis

    DEFF Research Database (Denmark)

    Jalkanen, K. J.; Suhai, S.; Bohr, Henrik

    1997-01-01

    dichroism (VCD) intensities. The large changes due to hydration in the structures, the relative stability of conformers, and the VA and VCD spectra observed experimentally are reproduced by the DFT calculations. Furthermore a neural network was constructed for reproducing the inverse scattering data (inferring the structural coordinates from spectroscopic data) that the DFT method could produce. Finally the neural network performances are used to monitor a sensitivity or dependence analysis of the importance of secondary structures....

  14. Performance Analysis of Asynchronous Multicarrier Wireless Networks

    OpenAIRE

    Lin, Xingqin; Jiang, Libin; Andrews, Jeffrey G.

    2014-01-01

    This paper develops a novel analytical framework for asynchronous wireless networks deploying multicarrier transmission. Nodes in the network have different notions of timing, so from the viewpoint of a typical receiver, the received signals from different transmitters are asynchronous, leading to a loss of orthogonality between subcarriers. We first develop a detailed link-level analysis based on OFDM, from which we propose a tractable system-level signal-to-interference-plus-noise ratio...

  15. An automated software for analysis of experimental data on decay heat from spent nuclear fuel

    OpenAIRE

    Llerena Herrera, Isbel

    2012-01-01

    The Swedish Nuclear Fuel and Waste Management Company (SKB) has developed a method for final disposal of spent nuclear fuel. This technique requires accurate measurement of the residual decay heat of every assembly. For this purpose, depletion codes as well as calorimetric and gamma-ray spectroscopy experimental methods have been developed and evaluated. In this work a prototype analysis tool has been developed to automate the analysis of both calorimetric and gamma-ray spectroscopy measureme...

  16. Automated analysis of background EEG and reactivity during therapeutic hypothermia in comatose patients after cardiac arrest

    OpenAIRE

    Noirhomme, Quentin; Lehembre, Rémy; Lugo, Zulay; Lesenfants, Damien; Luxen, André; Laureys, Steven; Oddo, Mauro; Rossetti, Andrea

    2014-01-01

    Visual analysis of electroencephalography (EEG) background and reactivity during therapeutic hypothermia provides important outcome information, but is time-consuming and not always consistent between reviewers. Automated EEG analysis may help quantify the brain damage. Forty-six comatose patients in therapeutic hypothermia, after cardiac arrest, were included in the study. EEG background was quantified with burst-suppression ratio (BSR) and approximate entropy, both used to monitor anesthesi...

  17. GenePublisher: automated analysis of DNA microarray data

    DEFF Research Database (Denmark)

    Knudsen, Steen; Workman, Christopher; Sicheritz-Ponten, T.; Friis, Carsten

    2003-01-01

    , statistical analysis and visualization of the data. The results are run against databases of signal transduction pathways, metabolic pathways and promoter sequences in order to extract more information. The results of the entire analysis are summarized in report form and returned to the user....

  18. An independent evaluation of a new method for automated interpretation of lung scintigrams using artificial neural networks

    International Nuclear Information System (INIS)

The purpose of this study was to evaluate a new automated method for the interpretation of lung perfusion scintigrams using patients from a hospital other than that where the method was developed, and then to compare the performance of the technique against that of experienced physicians. A total of 1,087 scintigrams from patients with suspected pulmonary embolism comprised the training group. The test group consisted of scintigrams from 140 patients collected in a hospital different to that from which the training group had been drawn. An artificial neural network was trained using 18 automatically obtained features from each set of perfusion scintigrams. The image processing techniques included alignment to templates, construction of quotient images based on the perfusion/template images, and finally calculation of features describing segmental perfusion defects in the quotient images. The templates represented lungs of normal size and shape without any pathological changes. The performance of the neural network was compared with that of three experienced physicians who read the same test scintigrams according to the modified PIOPED criteria using, in addition to perfusion images, ventilation images when available and chest radiographs for all patients. Performances were measured as area under the receiver operating characteristic curve. The performance of the neural network evaluated in the test group was 0.88 (95% confidence limits 0.81-0.94). The performance of the three experienced experts was in the range 0.87-0.93 when using the perfusion images, chest radiographs and ventilation images when available. Perfusion scintigrams can thus be interpreted for the diagnosis of pulmonary embolism by an automated method, even in a hospital other than the one where the method was developed. The performance of this method is similar to that of experienced physicians even though the physicians, in addition to perfusion images, also had access to ventilation images for

  19. Quality assurance of automated gamma-ray spectrometric analysis

    International Nuclear Information System (INIS)

Fully automatic gamma-ray spectrometric analysis procedures perform complete processing of the spectrum without intervention of the operator. In order to maintain the reliability of the final results, the analysis checks the intermediate results automatically. When a disagreement is identified by such a check, the uncertainty of the intermediate results is increased in order to accommodate the disagreement. The increased uncertainty is then propagated into the uncertainty of the final results so that the disagreement is taken into account. This approach was implemented in Canberra's Genie ESP gamma-ray spectrometry package for examining the results of the peak analysis. In addition to this intermediate check, a posteriori checks of the final results can be performed by statistical analysis. Such analysis shows whether the results are under statistical control and can discover sources of variability that are not taken into account in the uncertainty budget.
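The check-and-inflate strategy described in this record can be illustrated with a small sketch. This is a hypothetical simplification for illustration, not Genie ESP's actual algorithm; the weighted-mean combination and the coverage factor k are assumptions:

```python
import math

def reconcile(x1, u1, x2, u2, k=1.96):
    """Combine two intermediate results (value, standard uncertainty).

    If the two values disagree by more than k times their combined
    uncertainty, the output uncertainty is inflated until the
    disagreement is accommodated (a simplified stand-in for the
    automatic checks described in the abstract; k is an assumption).
    """
    w1, w2 = 1 / u1**2, 1 / u2**2
    mean = (w1 * x1 + w2 * x2) / (w1 + w2)       # weighted mean
    u = math.sqrt(1 / (w1 + w2))                 # its standard uncertainty
    diff = abs(x1 - x2)
    if diff > k * math.sqrt(u1**2 + u2**2):      # disagreement detected
        u = max(u, diff / k)                     # inflate, then propagate
    return mean, u

print(reconcile(100.0, 1.0, 101.0, 1.0))  # consistent: u ~ 0.71
print(reconcile(100.0, 1.0, 110.0, 1.0))  # inconsistent: u inflated to ~5.1
```

The inflated uncertainty then flows into the final result exactly as any other uncertainty component would.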

  20. Fully Automated Sample Preparation for Ultrafast N-Glycosylation Analysis of Antibody Therapeutics.

    Science.gov (United States)

    Szigeti, Marton; Lew, Clarence; Roby, Keith; Guttman, Andras

    2016-04-01

There is a growing demand in the biopharmaceutical industry for high-throughput, large-scale N-glycosylation profiling of therapeutic antibodies in all phases of product development, but especially during clone selection, when hundreds of samples should be analyzed in a short period of time to assure their glycosylation-based biological activity. Our group has recently developed a magnetic bead-based protocol for N-glycosylation analysis of glycoproteins to alleviate the hard-to-automate centrifugation and vacuum-centrifugation steps of the currently used protocols. Glycan release, fluorophore labeling, and cleanup were all optimized, resulting in a process with excellent yield and good repeatability. This article demonstrates the next level of this work by automating all steps of the optimized magnetic bead-based protocol, from endoglycosidase digestion through fluorophore labeling and cleanup, with high-throughput sample processing in 96-well plate format using an automated laboratory workstation. Capillary electrophoresis analysis of the fluorophore-labeled glycans was also optimized for rapid processing of the automated sample preparation workflow. Ultrafast N-glycosylation analyses of several commercially relevant antibody therapeutics are also shown and compared to their biosimilar counterparts, addressing the biological significance of the differences. PMID:26429557

  1. Automated acquisition and analysis of small angle X-ray scattering data

    International Nuclear Information System (INIS)

    Small Angle X-ray Scattering (SAXS) is a powerful tool in the study of biological macromolecules providing information about the shape, conformation, assembly and folding states in solution. Recent advances in robotic fluid handling make it possible to perform automated high throughput experiments including fast screening of solution conditions, measurement of structural responses to ligand binding, changes in temperature or chemical modifications. Here, an approach to full automation of SAXS data acquisition and data analysis is presented, which advances automated experiments to the level of a routine tool suitable for large scale structural studies. The approach links automated sample loading, primary data reduction and further processing, facilitating queuing of multiple samples for subsequent measurement and analysis and providing means of remote experiment control. The system was implemented and comprehensively tested in user operation at the BioSAXS beamlines X33 and P12 of EMBL at the DORIS and PETRA storage rings of DESY, Hamburg, respectively, but is also easily applicable to other SAXS stations due to its modular design.

  2. GenePublisher: automated analysis of DNA microarray data

    DEFF Research Database (Denmark)

Knudsen, Steen; Workman, Christopher; Sicheritz-Ponten, T.; Friis, Carsten

    2003-01-01

GenePublisher, a system for automatic analysis of data from DNA microarray experiments, has been implemented with a web interface at http://www.cbs.dtu.dk/services/GenePublisher. Raw data are uploaded to the server together with a specification of the data. The server performs normalization......, statistical analysis and visualization of the data. The results are run against databases of signal transduction pathways, metabolic pathways and promoter sequences in order to extract more information. The results of the entire analysis are summarized in report form and returned to the user....

  3. Carotid artery stenosis: reproducibility of automated 3D CT angiography analysis method

    International Nuclear Information System (INIS)

The aim of this study was to assess the reproducibility and anatomical accuracy of automated 3D CT angiography analysis software in the evaluation of carotid artery stenosis with reference to rotational DSA (rDSA). Seventy-two vessels in 36 patients with symptomatic carotid stenosis were evaluated by 3D CT angiography and conventional DSA (cDSA). Thirty-one patients also underwent rotational 3D DSA (rDSA). Multislice CT was performed with bolus tracking and slice thickness of 1.5 mm (1-mm collimation, table feed 5 mm/s) and reconstruction interval of 1.0 mm. Two observers independently performed the stenosis measurements on 3D CTA and on MPR rDSA according to the NASCET criteria. The first measurements on CTA utilized an analysis program with automatic stenosis recognition and quantitation. In the subsequent measurements, manual corrections were applied when necessary. Interfering factors for stenosis quantitation, such as calcifications, ulcerations, and adjacent vessels, were registered. Intraobserver and interobserver correlation for CTA were 0.89 and 0.90, respectively (p<0.001). The interobserver correlation between two observers for MPR rDSA was 0.90 (p<0.001). The intertechnique correlation between CTA and rDSA was 0.69 (p<0.001) using automated measurements but increased to 0.81 (p<0.001) with the manually corrected measurements. Automated stenosis recognition achieved a markedly poorer correlation with MPR rDSA in carotids with interfering factors than in cases where there were no such factors. Automated 3D CT angiography analysis methods are highly reproducible. Manually corrected measurements facilitated avoidance of the interfering factors, such as ulcerations, calcifications, and adjacent vessels, and thus increased the anatomical accuracy of arterial delineation by automated CT angiography with reference to MPR rDSA. (orig.)

  4. Automation of Safety Analysis with SysML Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project was a small proof-of-concept case study, generating SysML model information as a side effect of safety analysis. A prototype FMEA Assistant was...

  5. Infrascope: Full-Spectrum Phonocardiography with Automated Signal Analysis Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Using digital signal analysis tools, we will generate a repeatable output from the infrascope and compare it to the output of a traditional electrocardiogram, and...

  6. Implicit media frames: Automated analysis of public debate on artificial sweeteners

    CERN Document Server

    Hellsten, Iina; Leydesdorff, Loet

    2010-01-01

    The framing of issues in the mass media plays a crucial role in the public understanding of science and technology. This article contributes to research concerned with diachronic analysis of media frames by making an analytical distinction between implicit and explicit media frames, and by introducing an automated method for analysing diachronic changes of implicit frames. In particular, we apply a semantic maps method to a case study on the newspaper debate about artificial sweeteners, published in The New York Times (NYT) between 1980 and 2006. Our results show that the analysis of semantic changes enables us to filter out the dynamics of implicit frames, and to detect emerging metaphors in public debates. Theoretically, we discuss the relation between implicit frames in public debates and codification of information in scientific discourses, and suggest further avenues for research interested in the automated analysis of frame changes and trends in public debates.

  7. NAPS: Network Analysis of Protein Structures.

    Science.gov (United States)

    Chakrabarty, Broto; Parekh, Nita

    2016-07-01

Traditionally, protein structures have been analysed by their secondary structure architecture and fold arrangement. An alternative approach that has shown promise is modelling proteins as a network of non-covalent interactions between amino acid residues. The network representation of proteins provides a systems approach to topological analysis of complex three-dimensional structures, irrespective of secondary structure and fold type, and provides insights into the structure-function relationship. We have developed a web server for network-based analysis of protein structures, NAPS, that facilitates quantitative and qualitative (visual) analysis of residue-residue interactions in single chains, protein complexes, modelled protein structures and trajectories (e.g., from molecular dynamics simulations). The user can specify the atom type for network construction, the distance range (in Å) and the minimal amino acid separation along the sequence. NAPS lets users select node(s) and their neighbourhood based on centrality measures, physicochemical properties of amino acids or clusters of well-connected residues (k-cliques) for further analysis. Visual analysis of interacting domains and protein chains, and shortest path lengths between pairs of residues, are additional features that aid functional analysis. NAPS supports various analyses and visualization views for identifying functional residues, and provides insight into mechanisms of protein folding, domain-domain and protein-protein interactions for understanding communication within and between proteins. URL:http://bioinf.iiit.ac.in/NAPS/. PMID:27151201
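The cutoff-based residue network construction that tools of this kind perform can be sketched with networkx. The coordinates and the 5 Å cutoff below are toy values for illustration, not NAPS code or defaults:

```python
import itertools
import math
import networkx as nx

# Toy C-alpha coordinates (residue index -> (x, y, z)); a real tool
# such as NAPS would take these from an uploaded structure file.
coords = {1: (0, 0, 0), 2: (3, 0, 0), 3: (6, 0, 0),
          4: (6, 3, 0), 5: (0, 3, 0)}

def residue_network(coords, cutoff=7.0, min_seq_sep=1):
    """Nodes are residues; edges join residue pairs closer than `cutoff`
    (in angstroms) and at least `min_seq_sep` apart along the sequence."""
    g = nx.Graph()
    g.add_nodes_from(coords)
    for i, j in itertools.combinations(coords, 2):
        if abs(i - j) < min_seq_sep:
            continue
        d = math.dist(coords[i], coords[j])
        if d <= cutoff:
            g.add_edge(i, j, distance=d)
    return g

g = residue_network(coords, cutoff=5.0)   # tighter toy cutoff
central = nx.degree_centrality(g)         # candidate "hub" residues
print(max(central, key=central.get))      # residue 2 is best connected
```

Centrality measures on such a graph (degree, closeness, betweenness) are the basis for picking out candidate functional residues.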

  8. Automative Multi Classifier Framework for Medical Image Analysis

    Directory of Open Access Journals (Sweden)

    R. Edbert Rajan

    2015-04-01

Medical image processing is the technique of creating images of the human body for medical purposes, and it now plays a major and challenging role at critical stages of clinical work. Several studies have sought to enhance techniques for medical image processing, but shortcomings of some advanced technologies leave many aspects in need of further development. A previous study evaluated medical image analysis using level-set shape features along with fractal texture and intensity features to discriminate PF (posterior fossa) tumor from other tissues in brain images. To advance medical image analysis and disease diagnosis, that work devised an automated subjective-optimality model for image segmentation based on different sets of features selected from an unsupervised learning model of extracted features; after segmentation, the images are classified by adapting a multiple-classifier framework based on the mutual information coefficient of the features selected for segmentation. In this study, we plan to implement an enhanced multi-classifier framework for the analysis of medical images and disease diagnosis. The performance parameters used for the analysis of the proposed enhanced multi-classifier framework are multiple-class intensity, image quality, and time consumption.

  9. Performance Analysis of 3G Communication Network

    Directory of Open Access Journals (Sweden)

    Toni Anwar

    2008-11-01

In this project, research on third-generation (3G) technologies was carried out to design and optimize conditions for a 3G network. 3G wireless mobile communication networks are growing at an ever faster rate, and this is likely to continue in the foreseeable future. Services such as e-mail and web browsing allow the transition of the network from circuit-switched to packet-switched operation, resulting in increased overall network performance. Higher reliability, better coverage and services, higher capacity, mobility management, and wireless multimedia are all parts of network performance. Throughput and spectral efficiency are fundamental parameters in capacity planning for 3G cellular network deployments. The project investigated the downlink (DL) and uplink (UL) throughput and spectral efficiency of the standard Universal Mobile Telecommunications System (UMTS) for different user scenarios and different technologies, then compiled and analyzed the results. This analysis can significantly help system engineers obtain crucial performance characteristics of a 3G network.
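The spectral-efficiency figure used in such capacity planning is simply throughput per hertz of channel bandwidth. A minimal helper makes the relation concrete; the 2 Mbit/s and 5 MHz figures are illustrative, not results from the paper:

```python
def spectral_efficiency(throughput_bps, bandwidth_hz):
    """Spectral efficiency in bit/s/Hz: achieved throughput per hertz
    of channel bandwidth (hypothetical helper for illustration)."""
    return throughput_bps / bandwidth_hz

# e.g. 2 Mbit/s of cell throughput on a 5 MHz UMTS carrier
print(spectral_efficiency(2e6, 5e6))  # 0.4 bit/s/Hz
```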

  10. Social network analysis of study environment

    Directory of Open Access Journals (Sweden)

    Blaženka Divjak

    2010-06-01

Student working environment influences student learning and achievement level. In this respect, social aspects of students' formal and non-formal learning play a special role in the learning environment. The main research problem of this paper is to find out if students' academic performance influences their position in different students' social networks, and to identify other predictors of this position. In the process of problem solving we use Social Network Analysis (SNA), based on data collected from students at the Faculty of Organization and Informatics, University of Zagreb. There are two data samples: a basic sample (N=27) and an extended sample (N=52). We collected data on social-demographic position, academic performance, learning and motivation styles, student status (full-time/part-time), and attitudes towards individual and team work, as well as informal cooperation. Five different networks (exchange of learning materials, teamwork, informal communication, and basic and aggregated social networks) were then constructed and analyzed with different metrics, the most important being betweenness, closeness and degree centrality. The main results are, firstly, that a student's position in a social network cannot be forecast from academic success alone and, secondly, that part-time students tend to form separate groups that are poorly connected with full-time students. In general, a student's position in the social networks of the study environment can influence her/his learning as well as future employability, and is therefore worth investigating.

  11. Network analysis of eight industrial symbiosis systems

    Science.gov (United States)

    Zhang, Yan; Zheng, Hongmei; Shi, Han; Yu, Xiangyi; Liu, Gengyuan; Su, Meirong; Li, Yating; Chai, Yingying

    2016-06-01

    Industrial symbiosis is the quintessential characteristic of an eco-industrial park. To divide parks into different types, previous studies mostly focused on qualitative judgments, and failed to use metrics to conduct quantitative research on the internal structural or functional characteristics of a park. To analyze a park's structural attributes, a range of metrics from network analysis have been applied, but few researchers have compared two or more symbioses using multiple metrics. In this study, we used two metrics (density and network degree centralization) to compare the degrees of completeness and dependence of eight diverse but representative industrial symbiosis networks. Through the combination of the two metrics, we divided the networks into three types: weak completeness, and two forms of strong completeness, namely "anchor tenant" mutualism and "equality-oriented" mutualism. The results showed that the networks with a weak degree of completeness were sparse and had few connections among nodes; for "anchor tenant" mutualism, the degree of completeness was relatively high, but the affiliated members were too dependent on core members; and the members in "equality-oriented" mutualism had equal roles, with diverse and flexible symbiotic paths. These results revealed some of the systems' internal structure and how different structures influenced the exchanges of materials, energy, and knowledge among members of a system, thereby providing insights into threats that may destabilize the network. Based on this analysis, we provide examples of the advantages and effectiveness of recent improvement projects in a typical Chinese eco-industrial park (Shandong Lubei).
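The two metrics this study combines, density and (Freeman) degree centralization, are straightforward to compute with networkx. The sketch below contrasts a star-shaped "anchor tenant" network with a ring-shaped "equality-oriented" one; the toy graphs are illustrative, not the eight parks studied:

```python
import networkx as nx

def degree_centralization(g):
    """Freeman degree centralization of an undirected graph: 1.0 when all
    connections concentrate on one node (a star), 0.0 when every node has
    the same degree (a ring)."""
    n = g.number_of_nodes()
    degs = [d for _, d in g.degree()]
    dmax = max(degs)
    return sum(dmax - d for d in degs) / ((n - 1) * (n - 2))

star = nx.star_graph(5)    # "anchor tenant": one core firm linked to 5 others
ring = nx.cycle_graph(6)   # "equality-oriented": all members have equal roles

print(nx.density(star), degree_centralization(star))  # 0.333... 1.0
print(nx.density(ring), degree_centralization(ring))  # 0.4 0.0
```

Low density flags a sparse, weakly complete network; high centralization flags dependence on a core member even when density is high.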

  12. Automated Dermoscopy Image Analysis of Pigmented Skin Lesions

    Directory of Open Access Journals (Sweden)

    Alfonso Baldi

    2010-03-01

    Full Text Available Dermoscopy (dermatoscopy, epiluminescence microscopy is a non-invasive diagnostic technique for the in vivo observation of pigmented skin lesions (PSLs, allowing a better visualization of surface and subsurface structures (from the epidermis to the papillary dermis. This diagnostic tool permits the recognition of morphologic structures not visible by the naked eye, thus opening a new dimension in the analysis of the clinical morphologic features of PSLs. In order to reduce the learning-curve of non-expert clinicians and to mitigate problems inherent in the reliability and reproducibility of the diagnostic criteria used in pattern analysis, several indicative methods based on diagnostic algorithms have been introduced in the last few years. Recently, numerous systems designed to provide computer-aided analysis of digital images obtained by dermoscopy have been reported in the literature. The goal of this article is to review these systems, focusing on the most recent approaches based on content-based image retrieval systems (CBIR.

  13. Automated development of artificial neural networks for clinical purposes: Application for predicting the outcome of choledocholithiasis surgery.

    Science.gov (United States)

    Vukicevic, Arso M; Stojadinovic, Miroslav; Radovic, Milos; Djordjevic, Milena; Cirkovic, Bojana Andjelkovic; Pejovic, Tomislav; Jovicic, Gordana; Filipovic, Nenad

    2016-08-01

Among various expert systems (ES), the Artificial Neural Network (ANN) has been shown to be suitable for the diagnosis of concurrent common bile duct stones (CBDS) in patients undergoing elective cholecystectomy. However, their application in practice remains limited, since the development of ANNs is a slow process that requires additional expertise from potential users. The aim of this study was to propose an ES for the automated development of ANNs and validate its performance on the problem of predicting CBDS. Automated development of the ANN was achieved by applying an evolutionary assembling approach, which optimally configures the ANN parameters using a genetic algorithm. Automated selection of optimal features for ANN training was performed using a backward sequential feature selection algorithm. The assessment of the developed ANN included the evaluation of predictive ability and clinical utility. For these purposes, we collected data from 303 patients who underwent surgery in the period from 2008 to 2014. The results showed that total bilirubin, alanine aminotransferase, common bile duct diameter, number of stones, size of the smallest calculus, biliary colic, acute cholecystitis and pancreatitis had the best prognostic value for CBDS. Compared to the alternative approaches, the ANN obtained by the proposed ES had better sensitivity and clinical utility, which are considered the most important for this particular problem. Besides enabling the development of ANNs with better performance, the proposed ES significantly reduced the complexity of ANN development compared to previous studies, which required manual selection of optimal features and/or ANN configuration. It is therefore concluded that the proposed ES represents a robust and user-friendly framework that, apart from the prediction of CBDS, could advance and simplify the application of ANNs to a wider range of problems. PMID:27261565

  14. Information flow analysis of interactome networks.

    Directory of Open Access Journals (Sweden)

    Patrycja Vasilyev Missiuro

    2009-04-01

Recent studies of cellular networks have revealed modular organizations of genes and proteins. For example, in interactome networks, a module refers to a group of interacting proteins that form molecular complexes and/or biochemical pathways and together mediate a biological process. However, it is still poorly understood how biological information is transmitted between different modules. We have developed information flow analysis, a new computational approach that identifies proteins central to the transmission of biological information throughout the network. In the information flow analysis, we represent an interactome network as an electrical circuit, where interactions are modeled as resistors and proteins as interconnecting junctions. Construing the propagation of biological signals as flow of electrical current, our method calculates an information flow score for every protein. Unlike previous metrics of network centrality such as degree or betweenness that only consider topological features, our approach incorporates confidence scores of protein-protein interactions and automatically considers all possible paths in a network when evaluating the importance of each protein. We apply our method to the interactome networks of Saccharomyces cerevisiae and Caenorhabditis elegans. We find that the likelihood of observing lethality and pleiotropy when a protein is eliminated is positively correlated with the protein's information flow score. Even among proteins of low degree or low betweenness, high information scores serve as a strong predictor of loss-of-function lethality or pleiotropy. The correlation between information flow scores and phenotypes supports our hypothesis that proteins of high information flow reside in central positions in interactome networks. We also show that the ranks of information flow scores are more consistent than those of betweenness when a large amount of noisy data is added to an interactome. Finally, we
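The electrical-circuit view described here corresponds closely to current-flow betweenness centrality, which networkx implements directly. The toy interactome and confidence weights below are illustrative assumptions, not data from the paper:

```python
import networkx as nx

# Toy interactome: two "modules" (triangles) bridged by proteins C and D;
# edge weights stand in for interaction confidence scores (made-up values).
g = nx.Graph()
g.add_weighted_edges_from([
    ("A", "B", 0.9), ("A", "C", 0.8), ("B", "C", 0.7),   # module 1
    ("C", "D", 0.9),                                     # bridge
    ("D", "E", 0.8), ("D", "F", 0.9), ("E", "F", 0.6),   # module 2
])

# Current-flow betweenness treats the graph as a resistor network and
# averages the current through each node over all source/sink pairs,
# implicitly counting every path rather than only shortest ones -- the
# same idea as the information flow score described in the abstract.
scores = nx.current_flow_betweenness_centrality(g, weight="weight")
print(max(scores, key=scores.get))  # one of the bridging proteins, C or D
```

All current between the two modules must pass through C and D, so they score highest even though their plain degree is unremarkable.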

  15. Complex networks analysis of language complexity

    CERN Document Server

    Amancio, Diego R; Oliveira, Osvaldo N; Costa, Luciano da F; 10.1209/0295-5075/100/58002

    2013-01-01

    Methods from statistical physics, such as those involving complex networks, have been increasingly used in quantitative analysis of linguistic phenomena. In this paper, we represented pieces of text with different levels of simplification in co-occurrence networks and found that topological regularity correlated negatively with textual complexity. Furthermore, in less complex texts the distance between concepts, represented as nodes, tended to decrease. The complex networks metrics were treated with multivariate pattern recognition techniques, which allowed us to distinguish between original texts and their simplified versions. For each original text, two simplified versions were generated manually with increasing number of simplification operations. As expected, distinction was easier for the strongly simplified versions, where the most relevant metrics were node strength, shortest paths and diversity. Also, the discrimination of complex texts was improved with higher hierarchical network metrics, thus point...
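A word co-occurrence network of the kind used in this study can be sketched in a few lines; the window size, tokenization, and sample sentence below are illustrative assumptions, not the authors' pipeline:

```python
import re
from collections import Counter
import networkx as nx

def cooccurrence_network(text, window=2):
    """Nodes are word types; an edge's weight counts how often its two
    words occur within `window` positions of each other (window=2 means
    adjacent words only)."""
    tokens = re.findall(r"[a-z]+", text.lower())
    pairs = Counter()
    for i, w in enumerate(tokens):
        for v in tokens[i + 1:i + window]:
            if v != w:
                pairs[frozenset((w, v))] += 1
    g = nx.Graph()
    for pair, count in pairs.items():
        u, v = sorted(pair)
        g.add_edge(u, v, weight=count)
    return g

g = cooccurrence_network("the cat sat on the mat and the cat slept")
# Node strength (weighted degree) was among the most relevant metrics above
strength = {n: sum(d["weight"] for _, _, d in g.edges(n, data=True))
            for n in g.nodes}
print(strength["the"])  # 5
```

Metrics such as node strength, shortest path lengths, and degree diversity computed on this graph are the features fed to the pattern recognition step.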

  16. Design of Intelligent Network Performance Analysis Forecast Support System

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

A system designed to support network performance analysis and forecasting is presented, based on the combination of off-line network analysis and online real-time performance forecasting. The off-line analysis performs analysis of specific network node performance, correlation analysis of the performance of related network nodes, and evolutionary mathematical modeling of long-term network performance measurements. The online real-time network performance forecast is based on a hybrid prediction modeling approach for short-term network performance prediction and trend analysis. Based on the modular design, the proposed system has good intelligence, scalability and self-adaptability, offering highly effective network performance analysis and forecast tools for network managers, and is an ideal support platform for network performance analysis and forecast efforts.

  17. Analysis of the automated systems of planning of spatial constructions

    Directory of Open Access Journals (Sweden)

    М.С. Барабаш

    2004-04-01

The article is devoted to the analysis of existing CAD (SAPR) systems and to the development of new information technologies for design based on the integration of software packages using a unified information-logical model of the object.

  18. Automated analysis of security requirements through risk-based argumentation

    NARCIS (Netherlands)

    Yu, Yijun; Franqueira, Virginia N.L.; Tun, Thein Tan; Wieringa, Roel J.; Nuseibeh, Bashar

    2015-01-01

    Computer-based systems are increasingly being exposed to evolving security threats, which often reveal new vulnerabilities. A formal analysis of the evolving threats is difficult due to a number of practical considerations such as incomplete knowledge about the design, limited information about atta

  19. Semi-automated recognition of protozoa by image analysis

    OpenAIRE

    A.L. Amaral; Baptiste, C; Pons, M. N.; Nicolau, Ana; Lima, Nelson; Ferreira, E. C.; Mota, M.; H. Vivier

    1999-01-01

A programme was created to semi-automatically analyse digitised images of protozoa. The Principal Component Analysis technique was used for species identification. After data collection and mathematical treatment, a three-dimensional representation was generated and several protozoan species (Opercularia, Colpidium, Tetrahymena, Prorodon, Glaucoma and Trachelophyllum) could be positively identified.

  20. Towards an Automated Semiotic Analysis of the Romanian Political Discourse

    Directory of Open Access Journals (Sweden)

    Daniela Gifu

    2013-04-01

As is known, on the political scene the success of a speech can be measured by the degree to which the speaker is able to change attitudes, opinions, feelings and political beliefs in the audience. We suggest a range of analysis tools, all belonging to semiotics, from the lexical-semantic to the syntactic and rhetorical, which, integrated into the exploratory panoply of discursive weapons of a political speaker, could increase the impact of her/his speeches on a receptive audience. Our approach is based on the assumption that semiotics, in its quality of methodology and meta-language, can support a situational analysis of political discourse. Such an analysis assumes establishing the communication situation, in our case the Parliament's vote in favour of suspending the Romanian President, through which we can describe an act of communication. We present a platform, the Discourse Analysis Tool (DAT), which integrates a range of natural language processing tools with the intent of identifying significant characteristics of political discourse. The tool is able to produce comparative diagrams between the speeches of two or more subjects, or to analyse the same subject in different contexts. Only the lexical-semantic methods are operational in the platform today, but our investigation suggests new dimensions covering the syntactic, rhetorical and coherence perspectives.

  1. Automated Frequency Domain Decomposition for Operational Modal Analysis

    DEFF Research Database (Denmark)

    Brincker, Rune; Andersen, Palle; Jacobsen, Niels-Jørgen

    2007-01-01

    The Frequency Domain Decomposition (FDD) technique is known as one of the most user friendly and powerful techniques for operational modal analysis of structures. However, the classical implementation of the technique requires some user interaction. The present paper describes an algorithm for...

  2. Molecular Detection of Bladder Cancer by Fluorescence Microsatellite Analysis and an Automated Genetic Analyzing System

    Directory of Open Access Journals (Sweden)

    Sarel Halachmi

    2007-01-01

    To investigate the ability of an automated fluorescent analyzing system to detect microsatellite alterations in patients with bladder cancer, we investigated 11 patients with pathology-proven bladder Transitional Cell Carcinoma (TCC) for microsatellite alterations in blood, urine, and tumor biopsies. DNA was prepared by standard methods from blood, urine and resected tumor specimens, and was used for microsatellite analysis. After the primers were fluorescently labeled, amplification of the DNA was performed with PCR. The PCR products were placed into the automated genetic analyser (ABI Prism 310, Perkin Elmer, USA) and were subjected to fluorescent scanning with argon-ion laser beams. The fluorescent signal intensity measured by the genetic analyzer determined the product size in terms of base pairs. Using fluorescent microsatellite analysis and the automated analyzing system, we found loss of heterozygosity (LOH) or microsatellite alterations (a loss or gain of nucleotides which alters the original normal locus size) in all the patients. In each case the genetic changes found in urine samples were identical to those found in the resected tumor sample. The studies demonstrated the ability to detect bladder tumors non-invasively by fluorescent microsatellite analysis of urine samples. Our study supports the worldwide trend in the search for non-invasive methods to detect bladder cancer. We have overcome major obstacles that prevented the clinical use of an experimental system. With our newly tested system, microsatellite analysis can be done more cheaply, faster, more easily and with higher scientific accuracy.

  3. Automated Performance Monitoring Data Analysis and Reporting within the Open Source R Environment

    Science.gov (United States)

    Kennel, J.; Tonkin, M. J.; Faught, W.; Lee, A.; Biebesheimer, F.

    2013-12-01

    Environmental scientists encounter quantities of data at a rate that in many cases outpaces our ability to appropriately store, visualize and convey the information. The free software environment, R, provides a framework for efficiently processing, analyzing, depicting and reporting on data from a multitude of formats in the form of traceable and quality-assured data summary reports. Automated data summary reporting leverages document markup languages such as markdown, HTML, or LaTeX using R-scripts capable of completing a variety of simple or sophisticated data processing, analysis and visualization tasks. Automated data summary reports seamlessly integrate analysis into report production with calculation outputs - such as plots, maps and statistics - included alongside report text. Once a site-specific template is set up, including data types, geographic base data and reporting requirements, reports can be (re-)generated trivially as the data evolve. The automated data summary report can be a stand-alone report, or it can be incorporated as an attachment to an interpretive report prepared by a subject-matter expert, thereby providing the technical basis to report on and efficiently evaluate large volumes of data resulting in a concise interpretive report. Hence, the data summary report does not replace the scientist, but relieves them of repetitive data processing tasks, facilitating a greater level of analysis. This is demonstrated using an implementation developed for monthly groundwater data reporting for a multi-constituent contaminated site, highlighting selected analysis techniques that can be easily incorporated in a data summary report.
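
    The core reporting pattern, statistics computed inline so the rendered report always matches the data it describes, can be sketched outside R as well. The sketch below is a hypothetical Python analogue, not the implementation described in the abstract; the site name, analytes and values are invented:

```python
import io
import statistics

def summary_report(site, series):
    """Render a tiny markdown data-summary report; the statistics are
    computed at render time, so the report tracks the data as it evolves."""
    buf = io.StringIO()
    buf.write(f"# Monthly groundwater summary: {site}\n\n")
    buf.write("| analyte | n | mean | max |\n|---|---|---|---|\n")
    for analyte, values in series.items():
        buf.write(f"| {analyte} | {len(values)} | "
                  f"{statistics.mean(values):.2f} | {max(values):.2f} |\n")
    return buf.getvalue()

report = summary_report("Well MW-01", {"arsenic": [1.2, 1.5, 1.1],
                                       "nitrate": [8.0, 7.4, 9.1]})
```

    Regenerating the report after new monthly data arrive is then a single function call, which is the "(re-)generated trivially as the data evolve" property the abstract emphasises.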

  4. Differential Network Analysis in Human Cancer Research

    Science.gov (United States)

    Gill, Ryan; Datta, Somnath; Datta, Susmita

    2016-01-01

    A complex disease like cancer is hardly caused by one gene or one protein singly. It is usually caused by the perturbation of the network formed by several genes or proteins. In the last decade several research teams have attempted to construct interaction maps of genes and proteins either experimentally or by reverse engineering interaction maps using computational techniques. These networks were usually created under a certain condition such as an environmental condition, a particular disease, or a specific tissue type. Lately, however, there has been greater emphasis on finding the differential structure of the existing network topology under a novel condition or disease status to elucidate the perturbation in a biological system. In this review/tutorial article we briefly mention some of the research done in this area; we mainly illustrate the computational/statistical methods developed by our team in recent years for differential network analysis using publicly available gene expression data collected from a well-known cancer study. The data include a group of patients with acute lymphoblastic leukemia and a group with acute myeloid leukemia. In particular, we describe the statistical tests to detect the change in the network topology based on connectivity scores which measure the association or interaction between pairs of genes. The tests under various scores are applied to this data set to perform a differential network analysis on gene expression for human leukemia. We believe that, in the future, differential network analysis will be a standard way to view the changes in gene expression and protein expression data globally and these types of tests could be useful in analyzing the complex differential signatures. PMID:23530503
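
    A minimal connectivity-score comparison can be sketched as follows. This is an illustrative toy, not the authors' statistical tests; the connectivity score here is absolute Pearson correlation, and the expression matrices are synthetic:

```python
import numpy as np

def connectivity(expr):
    """Connectivity scores: absolute Pearson correlation between gene pairs."""
    return np.abs(np.corrcoef(expr, rowvar=False))

def diff_network_stat(expr_a, expr_b):
    """Overall change in network topology between two conditions."""
    return np.abs(connectivity(expr_a) - connectivity(expr_b)).sum()

rng = np.random.default_rng(1)
expr_a = rng.normal(size=(30, 5))            # samples x genes, condition A
expr_b = rng.normal(size=(30, 5))            # condition B ...
expr_b[:, 0] = expr_b[:, 1] + 0.1 * rng.normal(size=30)  # ... with one rewired edge
stat = diff_network_stat(expr_a, expr_b)
```

    In the methods the article reviews, a statistic of this kind would be compared against a permutation null distribution to decide whether the topology change is significant.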

  5. Phylodynamic analysis of a viral infection network

    Directory of Open Access Journals (Sweden)

    Teiichiro Shiino

    2012-07-01

    Viral infections transmitted by sexual and droplet routes typically spread through a complex host-to-host contact network. Clarifying the transmission network and the epidemiological parameters affecting the variation and dynamics of a specific pathogen is a major issue in the control of infectious diseases. However, conventional methods such as interviews and/or classical phylogenetic analysis of viral gene sequences have inherent limitations and often fail to detect infectious clusters and transmission connections. Recent improvements in computational environments now permit the analysis of large datasets. In addition, novel analytical methods have been developed to infer the evolutionary dynamics of virus genetic diversity using sample date information and sequence data. This type of framework, termed "phylodynamics", helps connect some of the missing links in viral transmission networks, which are often hard to detect by conventional methods of epidemiology. With a sufficient number of sequences available, one can use this inference method to estimate theoretical epidemiological parameters such as the temporal distribution of primary infections, fluctuations in the pathogen population size, the basic reproductive number, and the mean time span of disease infectiousness. Transmission networks estimated by this framework often have the properties of a scale-free network, which are characteristic of infectious and social communication processes. Network analysis based on phylodynamics has led to various suggestions concerning the infection dynamics associated with a given community and/or risk behavior. In this review, I summarize the current methods available for identifying the transmission network using phylogeny, and present an argument on the possibilities of applying the scale-free properties to these existing frameworks.

  6. Automated analysis of protein subcellular location in time series images

    OpenAIRE

    Hu, Yanhua; Osuna-Highley, Elvira; Hua, Juchang; Nowicki, Theodore Scott; Stolz, Robert; McKayle, Camille; Murphy, Robert F.

    2010-01-01

    Motivation: Image analysis, machine learning and statistical modeling have become well established for the automatic recognition and comparison of the subcellular locations of proteins in microscope images. By using a comprehensive set of features describing static images, major subcellular patterns can be distinguished with near perfect accuracy. We now extend this work to time series images, which contain both spatial and temporal information. The goal is to use temporal features to improve...

  7. Automated analysis of image mammogram for breast cancer diagnosis

    Science.gov (United States)

    Nurhasanah; Sampurno, Joko; Faryuni, Irfana Diah; Ivansyah, Okto

    2016-03-01

    Medical imaging helps doctors diagnose and detect diseases inside the body without surgery. A mammogram is a medical image of the inner breast. Diagnosis of breast cancer needs to be done in detail and as soon as possible to determine the next medical treatment. The aim of this work is to increase the objectivity of clinical diagnosis by using fractal analysis. This study applies a fractal method based on 2D Fourier analysis to determine the density of normal and abnormal tissue, and applies a segmentation technique based on the K-Means clustering algorithm to abnormal images in order to determine the boundary of the organ and calculate the area of the segmented regions. The results show that the fractal method based on 2D Fourier analysis can be used to distinguish between normal and abnormal breasts, and that the segmentation technique with the K-Means clustering algorithm is able to generate the boundaries of normal and abnormal tissue organs, so the area of the abnormal tissue can be determined.
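
    The K-Means segmentation step can be sketched on a synthetic image (the fractal/2D Fourier density analysis is omitted). This is an illustrative reconstruction, not the study's code; the "mammogram" below is a toy array with one bright abnormal region:

```python
import numpy as np

def kmeans_1d(values, k=2, iters=20):
    """Plain Lloyd's algorithm on a 1-D array of pixel intensities."""
    centers = np.quantile(values, np.linspace(0, 1, k))   # spread initial centres
    for _ in range(iters):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        centers = np.array([values[labels == j].mean() if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

# synthetic "mammogram": dark tissue (0.1) with one bright abnormal region (0.9)
img = np.full((32, 32), 0.1)
img[10:18, 10:18] = 0.9
labels, centers = kmeans_1d(img.ravel())
abnormal_cluster = int(np.argmax(centers))
area = int(np.sum(labels == abnormal_cluster))   # segmented area in pixels
```

    The cluster with the brighter centre marks the abnormal tissue, and counting its pixels gives the segmented area the abstract mentions.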

  8. Traffic Analysis for Real-Time Communication Networks onboard Ships

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Nielsen, Jens Frederik Dalsgaard; Jørgensen, N.;

    The paper presents a novel method for establishing worst-case estimates of queue lengths and transmission delays in networks of interconnected segments, each of ring topology, as defined by the ATOMOS project for marine automation. A non-probabilistic model for describing traffic is introduced as well...

  9. Traffic Analysis for Real-Time Communication Networks onboard Ships

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Nielsen, Jens Frederik Dalsgaard; Jørgensen, N.;

    1998-01-01

    The paper presents a novel method for establishing worst-case estimates of queue lengths and transmission delays in networks of interconnected segments, each of ring topology, as defined by the ATOMOS project for marine automation. A non-probabilistic model for describing traffic is introduced as well...

  10. Automated counting and analysis of etched tracks in CR-39 plastic

    International Nuclear Information System (INIS)

    An image analysis system has been set up which is capable of automated counting and analysis of etched nuclear particle tracks in plastic. The system is composed of an optical microscope, CCD camera, frame grabber, personal computer, monitor, and printer. The frame grabber acquires and displays images at video rate. It has a spatial resolution of 512 x 512 pixels with 8 bits of digitisation corresponding to 256 grey levels. The software has been developed for general image processing and adapted for the present purpose. Comparisons of automated and visual microscope counting of tracks in chemically etched CR-39 detectors are presented with emphasis on results of interest for practical radon measurements or neutron dosimetry, e.g. calibration factors, background track densities and variations in background. (author)
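
    The core counting step, binarise the grey-level image at a threshold and count connected blobs, can be sketched as follows. This is an illustrative reconstruction, not the system's software; the frame and the three "tracks" are synthetic:

```python
import numpy as np

def count_tracks(image, threshold=128):
    """Count etched tracks: binarise at a grey-level threshold, then count
    4-connected dark blobs with an iterative flood fill."""
    dark = image < threshold
    seen = np.zeros_like(dark, dtype=bool)
    h, w = dark.shape
    count = 0
    for i in range(h):
        for j in range(w):
            if dark[i, j] and not seen[i, j]:
                count += 1                       # new blob found
                stack = [(i, j)]
                seen[i, j] = True
                while stack:                     # flood-fill the whole blob
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and dark[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            stack.append((ny, nx))
    return count

# synthetic 8-bit frame: bright background (200) with three dark track pits (30)
frame = np.full((64, 64), 200, dtype=np.uint8)
for cy, cx in [(10, 10), (30, 40), (50, 20)]:
    frame[cy - 2:cy + 3, cx - 2:cx + 3] = 30
n_tracks = count_tracks(frame)
```

    Track density then follows from the count divided by the field-of-view area, which is the quantity compared against visual microscope counting in the abstract.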

  11. Automated Runtime Risk Management for Voice over IP Networks and Services

    OpenAIRE

    Dabbebi, Oussema; Badonnel, Rémi; Festor, Olivier

    2010-01-01

    Voice over IP (VoIP) has become a major paradigm for providing telephony services at a lower cost and with a higher flexibility. VoIP infrastructures are however exposed to multiple security issues both inherited from the IP layer and specific to the application layer. In the meantime, protection mechanisms are available but may seriously impact on the continuity and quality of such critical services. We propose in this paper an automated risk management schema for continuously adapting VoIP eq...

  12. Instrumentation, Field Network and Process Automation for the Cryogenic System of the LHC Test String

    OpenAIRE

    Suraci, A.; Bager, T.; Balle, Ch.; Blanco, E.; Casas, J.; Gomes, P.; Pelletier, S.; Serio, L.; Vauthier, N.

    2001-01-01

    CERN is now setting up String 2, a full-size prototype of a regular cell of the LHC arc. It is composed of two quadrupole magnets, six dipole magnets, and a separate cryogenic distribution line (QRL) for the supply and recovery of the cryogen. An electrical feed box (DFB), with up to 38 High Temperature Superconducting (HTS) leads, powers the magnets. About 700 sensors and actuators are distributed along four Profibus DP and two Profibus PA field buses. The process automation is handled by two contro...

  13. An Automated Platform for Analysis of Phosphoproteomic Datasets: Application to Kidney Collecting Duct Phosphoproteins

    OpenAIRE

    Jason D. Hoffert; Wang, Guanghui; Pisitkun, Trairak; Shen, Rong-Fong; Knepper, Mark A.

    2007-01-01

    Large-scale phosphoproteomic analysis employing liquid chromatography-tandem mass spectrometry (LC–MS/MS) often requires a significant amount of manual manipulation of phosphopeptide datasets in the postacquisition phase. To assist in this process, we have created software, PhosphoPIC (PhosphoPeptide Identification and Compilation), which can perform a variety of useful functions including automated selection and compilation of phosphopeptide identifications from multiple MS levels, estimatio...

  14. Automated analysis of digital oximetry in the diagnosis of obstructive sleep apnoea

    OpenAIRE

    Vazquez, J.; Tsai, W; Flemons, W; Masuda, A; Brant, R.; Hajduk, E.; Whitelaw, W.; J. Remmers

    2000-01-01

    BACKGROUND—The gold standard diagnostic test for obstructive sleep apnoea (OSA) is overnight polysomnography (PSG) which is costly in terms of time and money. Consequently, a number of alternatives to PSG have been proposed. Oximetry is appealing because of its widespread availability and ease of application. The diagnostic performance of an automated analysis algorithm based on falls and recovery of digitally recorded oxygen saturation was compared with PSG.
METHODS—Two ...

  15. Pharmacokinetic analysis of topical tobramycin in equine tears by automated immunoassay

    OpenAIRE

    Czerwinski Sarah L; Lyon Andrew W; Skorobohach Brian; Léguillette Renaud

    2012-01-01

    Abstract Background Ophthalmic antibiotic therapy in large animals is often used empirically because of the lack of pharmacokinetics studies. The purpose of the study was to determine the pharmacokinetics of topical tobramycin 0.3% ophthalmic solution in the tears of normal horses using an automated immunoassay analysis. Results The mean tobramycin concentrations in the tears at 5, 10, 15, 30 minutes and 1, 2, 4, 6 hours after administration were 759 (±414), 489 (±237), 346 (±227), 147 (±264)...

  16. Automating HIV Drug Resistance Genotyping with RECall, a Freely Accessible Sequence Analysis Tool

    OpenAIRE

    Woods, Conan K.; Chanson J Brumme; Liu, Tommy F; Chui, Celia K. S.; Chu, Anna L.; Wynhoven, Brian; Hall, Tom A.; Trevino, Christina; Shafer, Robert W; Harrigan, P. Richard

    2012-01-01

    Genotypic HIV drug resistance testing is routinely used to guide clinical decisions. While genotyping methods can be standardized, a slow, labor-intensive, and subjective manual sequence interpretation step is required. We therefore performed external validation of our custom software RECall, a fully automated sequence analysis pipeline. HIV-1 drug resistance genotyping was performed on 981 clinical samples at the Stanford Diagnostic Virology Laboratory. Sequencing trace files were first inte...

  17. Characterization of the microstructure of dairy systems using automated image analysis

    OpenAIRE

    Silva, Juliana V.C.; Legland, David; Cauty, Chantal; Kolotuev, Irina; Floury, Juliane

    2015-01-01

    A sound understanding of the microstructure of dairy products is of great importance in order to predict and control their properties and final quality. The aim of this study was to develop an automated image analysis procedure to characterize the microstructure of different dairy systems. A high pressure freezing coupled with freeze-substitution (HPF-FS) protocol was applied prior to transmission electron microscopy (TEM) in order to minimize any modification of the microstructure of the dair...

  18. Combined eukaryotic and bacterial community fingerprinting of natural freshwater biofilms using automated ribosomal intergenic spacer analysis

    OpenAIRE

    2010-01-01

    Biofilms are complex communities playing an important role in aquatic ecosystems. Automated ribosomal intergenic spacer analysis (ARISA) has been used successfully to explore biofilm bacterial diversity. However, a gap remains to be filled as regards its application to biofilm eukaryotic populations. The aim of this study is to use ARISA to detect eukaryotic population shifts in biofilm. We designed a new set of primers to focus specifically on the ITS1-5.8S-ITS2 region of diatoms and tested ...

  19. An automated system for liquid-liquid extraction in monosegmented flow analysis

    OpenAIRE

    Facchin, Ileana; Jarbas J. R. Rohwedder; Pasquini, Celio

    1997-01-01

    An automated system to perform liquid-liquid extraction in monosegmented flow analysis is described. The system is controlled by a microcomputer that can track the localization of the aqueous monosegmented sample in the manifold. Optical switches are employed to sense the gas-liquid interface of the air bubbles that define the monosegment. The logical level changes, generated by the switches, are flagged by the computer through a home-made interface that also contains the analogue-to-digital ...

  20. SCHUBOT: Machine Learning Tools for the Automated Analysis of Schubert’s Lieder

    OpenAIRE

    Nagler, Dylan Jeremy

    2014-01-01

    This paper compares various methods for automated musical analysis, applying machine learning techniques to gain insight about the Lieder (art songs) of composer Franz Schubert (1797-1828). Known as a rule-breaking, individualistic, and adventurous composer, Schubert produced hundreds of emotionally-charged songs that have challenged music theorists to this day. The algorithms presented in this paper analyze the harmonies, melodies, and texts of these songs. This paper begins with an explor...

  1. An automated method for analysis of microcirculation videos for accurate assessment of tissue perfusion

    OpenAIRE

    Demir Sumeyra U; Hakimzadeh Roya; Hargraves Rosalyn Hobson; Ward Kevin R; Myer Eric V; Najarian Kayvan

    2012-01-01

    Abstract Background Imaging of the human microcirculation in real-time has the potential to detect injuries and illnesses that disturb the microcirculation at earlier stages and may improve the efficacy of resuscitation. Despite advanced imaging techniques to monitor the microcirculation, there are currently no tools for the near real-time analysis of the videos produced by these imaging systems. An automated system tool that can extract microvasculature information and monitor changes in tis...

  2. The impact of air pollution on the level of micronuclei measured by automated image analysis

    Czech Academy of Sciences Publication Activity Database

    Rössnerová, Andrea; Špátová, Milada; Rossner, P.; Solanský, I.; Šrám, Radim

    2009-01-01

    Roč. 669, 1-2 (2009), s. 42-47. ISSN 0027-5107 R&D Projects: GA AV ČR 1QS500390506; GA MŠk 2B06088; GA MŠk 2B08005 Institutional research plan: CEZ:AV0Z50390512 Keywords : micronuclei * binucleated cells * automated image analysis Subject RIV: DN - Health Impact of the Environment Quality Impact factor: 3.556, year: 2009

  3. A Multi-Wavelength Analysis of Active Regions and Sunspots by Comparison of Automated Detection Algorithms

    OpenAIRE

    Verbeeck, Cis; Higgins, Paul A.; Colak, Tufan; Watson, Fraser T.; Delouille, Veronique; Mampaey, Benjamin; Qahwaji, Rami

    2011-01-01

    Since the Solar Dynamics Observatory (SDO) began recording ~ 1 TB of data per day, there has been an increased need to automatically extract features and events for further analysis. Here we compare the overall detection performance, correlations between extracted properties, and usability for feature tracking of four solar feature-detection algorithms: the Solar Monitor Active Region Tracker (SMART) detects active regions in line-of-sight magnetograms; the Automated Solar Activity Prediction...

  4. Automated economic analysis model for hazardous waste minimization

    International Nuclear Information System (INIS)

    The US Army has established a policy of achieving a 50 percent reduction in hazardous waste generation by the end of 1992. To assist the Army in reaching this goal, the Environmental Division of the US Army Construction Engineering Research Laboratory (USACERL) designed the Economic Analysis Model for Hazardous Waste Minimization (EAHWM). The EAHWM was designed to allow the user to evaluate the life cycle costs for various techniques used in hazardous waste minimization and to compare them to the life cycle costs of current operating practices. The program was developed in C language on an IBM-compatible PC and is consistent with other pertinent models for performing economic analyses. The potential hierarchical minimization categories used in EAHWM include source reduction, recovery and/or reuse, and treatment. Although treatment is no longer an acceptable minimization option, its use is widespread and has therefore been addressed in the model. The model allows for economic analysis of minimization of the Army's six most important hazardous waste streams. These include solvents, paint stripping wastes, metal plating wastes, industrial waste sludges, used oils, and batteries and battery electrolytes. The EAHWM also includes a general application which can be used to calculate and compare the life cycle costs for minimization alternatives of any waste stream, hazardous or non-hazardous. The EAHWM has been fully tested and implemented in more than 60 Army installations in the United States
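
    The kind of life-cycle-cost comparison such a model performs can be sketched as a present-value calculation. This is a hypothetical illustration, not the EAHWM's actual formula; the discount rate and cost figures are invented:

```python
def life_cycle_cost(capital, annual_cost, years, rate):
    """Present-value life cycle cost: capital outlay now plus discounted
    annual operating costs over the evaluation period."""
    return capital + sum(annual_cost / (1 + rate) ** t
                         for t in range(1, years + 1))

# hypothetical comparison: keep current practice vs. install solvent recovery
current = life_cycle_cost(0, 120_000, years=10, rate=0.05)
recovery = life_cycle_cost(250_000, 60_000, years=10, rate=0.05)
cheaper = recovery < current   # does the minimization option pay off?
```

    With these invented numbers the recovery option's higher capital cost is outweighed by its lower discounted operating costs, which is exactly the trade-off the model lets a user evaluate per waste stream.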

  5. Mixed Methods Analysis of Enterprise Social Networks

    DEFF Research Database (Denmark)

    Behrendt, Sebastian; Richter, Alexander; Trier, Matthias

    2014-01-01

    The increasing use of enterprise social networks (ESN) generates vast amounts of data, giving researchers and managerial decision makers unprecedented opportunities for analysis. However, more transparency about the available data dimensions and how these can be combined is needed to yield accurate...

  6. Nonlinear Time Series Analysis via Neural Networks

    Science.gov (United States)

    Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin

    This article deals with a time series analysis based on neural networks in order to perform effective pattern recognition in the forex market [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)]. Our goal is to find and recognize important patterns which repeatedly appear in the market history and to adapt our trading system's behaviour based on them.

  7. INVESTIGATION OF NEURAL NETWORK ALGORITHM FOR DETECTION OF NETWORK HOST ANOMALIES IN THE AUTOMATED SEARCH FOR XSS VULNERABILITIES AND SQL INJECTIONS

    Directory of Open Access Journals (Sweden)

    Y. D. Shabalin

    2016-03-01

    The problem of aberrant behavior detection for a network-communicating computer is discussed. A novel approach based on the dynamic response of the computer is introduced. The computer is treated as a multiple-input multiple-output (MIMO) plant. To characterize the dynamic response of the computer to incoming requests, a correlation between the input data rate and the observed output response (outgoing data rate and performance metrics) is used. To distinguish normal and aberrant behavior of the computer, a one-class neural network classifier is used. The general idea of the algorithm is described briefly. The configuration of the network testbed for experiments with real attacks and their detection (the automated search for XSS vulnerabilities and SQL injections) is presented. Real XSS and SQL injection attack software was used to model the intrusion scenario. It is to be expected that aberrant behavior of the server will reveal itself through an instantaneous correlation response that differs significantly from any of the normal ones. It is evident that the correlation picture of attacks from different malware running, of the site homepage being overridden on the server (so-called defacing), and of hardware and software failures will differ from the correlation picture of normal functioning. The intrusion detection algorithm is investigated to estimate false positive and false negative rates in relation to the algorithm parameters. The importance of selecting the correlation width and threshold values is emphasized. The false positive rate was estimated along the time series of experimental data. Some ideas for enhancing the quality and robustness of the algorithm are mentioned.
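
    The one-class idea, fit an envelope around normal correlation responses and flag anything outside it, can be sketched with a Mahalanobis-distance detector standing in for the paper's one-class neural network classifier. All feature vectors and numbers below are synthetic:

```python
import numpy as np

class OneClassDetector:
    """Fit an envelope around normal-behaviour feature vectors; anything
    falling outside the envelope is flagged as aberrant."""
    def __init__(self, quantile=0.99):
        self.quantile = quantile

    def fit(self, normal):
        self.mean = normal.mean(axis=0)
        cov = np.cov(normal, rowvar=False) + 1e-6 * np.eye(normal.shape[1])
        self.inv_cov = np.linalg.inv(cov)
        # threshold: the chosen quantile of distances seen on normal data
        self.threshold = np.quantile(self._dist(normal), self.quantile)
        return self

    def _dist(self, X):
        d = X - self.mean
        return np.einsum('ij,jk,ik->i', d, self.inv_cov, d)  # squared Mahalanobis

    def predict(self, X):
        return self._dist(X) > self.threshold   # True = aberrant

rng = np.random.default_rng(2)
normal = rng.normal(0.0, 1.0, size=(500, 3))   # correlation responses, normal load
attack = rng.normal(6.0, 1.0, size=(10, 3))    # e.g. an injection burst
detector = OneClassDetector().fit(normal)
flags = detector.predict(attack)
```

    The `quantile` parameter plays the role of the threshold-selection step whose importance the abstract emphasizes: raising it lowers the false positive rate at the cost of missing weaker anomalies.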

  8. Reliability Analysis Techniques for Communication Networks in Nuclear Power Plant

    International Nuclear Information System (INIS)

    The objective of this project is to investigate and study existing reliability analysis techniques for communication networks in order to develop reliability analysis models for nuclear power plants' safety-critical networks. It is necessary to make a comprehensive survey of current methodologies for communication network reliability. The major outputs of this study are design characteristics of safety-critical communication networks, efficient algorithms for quantifying the reliability of communication networks, and preliminary models for assessing the reliability of safety-critical communication networks

  9. Diversity Performance Analysis on Multiple HAP Networks.

    Science.gov (United States)

    Dong, Feihong; Li, Min; Gong, Xiangwu; Li, Hongjun; Gao, Fengyue

    2015-01-01

    One of the main design challenges in wireless sensor networks (WSNs) is achieving a high-data-rate transmission for individual sensor devices. The high altitude platform (HAP) is an important communication relay platform for WSNs and next-generation wireless networks. Multiple-input multiple-output (MIMO) techniques provide the diversity and multiplexing gain, which can improve the network performance effectively. In this paper, a virtual MIMO (V-MIMO) model is proposed by networking multiple HAPs with the concept of multiple assets in view (MAV). In a shadowed Rician fading channel, the diversity performance is investigated. The probability density function (PDF) and cumulative distribution function (CDF) of the received signal-to-noise ratio (SNR) are derived. In addition, the average symbol error rate (ASER) with BPSK and QPSK is given for the V-MIMO model. The system capacity is studied for both perfect channel state information (CSI) and unknown CSI individually. The ergodic capacity with various SNR and Rician factors for different network configurations is also analyzed. The simulation results validate the effectiveness of the performance analysis. It is shown that the performance of the HAPs network in WSNs can be significantly improved by utilizing the MAV to achieve overlapping coverage, with the help of the V-MIMO techniques. PMID:26134102
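
    The diversity gain from combining several HAP branches can be illustrated by Monte Carlo simulation. This sketch uses plain Rician fading (no shadowing) and maximal-ratio combining, so it is a simplified illustration rather than the paper's analytical derivation; the K-factor, branch count and outage threshold are invented:

```python
import numpy as np

def rician_power(rng, K, n):
    """Channel power gains under Rician fading with K-factor K,
    mean power normalised to 1."""
    s = np.sqrt(K / (K + 1))              # line-of-sight amplitude
    sigma = np.sqrt(1.0 / (2 * (K + 1)))  # scatter std per dimension
    h = (s + sigma * rng.normal(size=n)) + 1j * sigma * rng.normal(size=n)
    return np.abs(h) ** 2

rng = np.random.default_rng(3)
n, K, threshold = 200_000, 3.0, 0.5
single = rician_power(rng, K, n)                       # one HAP branch
mrc4 = sum(rician_power(rng, K, n) for _ in range(4))  # MRC over 4 HAP branches
outage_single = float(np.mean(single < threshold))     # empirical CDF at threshold
outage_mrc4 = float(np.mean(mrc4 < threshold))
```

    The empirical CDF of the combined SNR sits far below the single-branch CDF at low thresholds, which is the diversity improvement the multi-HAP V-MIMO analysis quantifies.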

  10. Diversity Performance Analysis on Multiple HAP Networks

    Directory of Open Access Journals (Sweden)

    Feihong Dong

    2015-06-01

    One of the main design challenges in wireless sensor networks (WSNs) is achieving a high-data-rate transmission for individual sensor devices. The high altitude platform (HAP) is an important communication relay platform for WSNs and next-generation wireless networks. Multiple-input multiple-output (MIMO) techniques provide the diversity and multiplexing gain, which can improve the network performance effectively. In this paper, a virtual MIMO (V-MIMO) model is proposed by networking multiple HAPs with the concept of multiple assets in view (MAV). In a shadowed Rician fading channel, the diversity performance is investigated. The probability density function (PDF) and cumulative distribution function (CDF) of the received signal-to-noise ratio (SNR) are derived. In addition, the average symbol error rate (ASER) with BPSK and QPSK is given for the V-MIMO model. The system capacity is studied for both perfect channel state information (CSI) and unknown CSI individually. The ergodic capacity with various SNR and Rician factors for different network configurations is also analyzed. The simulation results validate the effectiveness of the performance analysis. It is shown that the performance of the HAPs network in WSNs can be significantly improved by utilizing the MAV to achieve overlapping coverage, with the help of the V-MIMO techniques.

  11. An overview of the first decade of PollyNET: an emerging network of automated Raman-polarization lidars for continuous aerosol profiling

    Science.gov (United States)

    Baars, Holger; Kanitz, Thomas; Engelmann, Ronny; Althausen, Dietrich; Heese, Birgit; Komppula, Mika; Preißler, Jana; Tesche, Matthias; Ansmann, Albert; Wandinger, Ulla; Lim, Jae-Hyun; Ahn, Joon Young; Stachlewska, Iwona S.; Amiridis, Vassilis; Marinou, Eleni; Seifert, Patric; Hofer, Julian; Skupin, Annett; Schneider, Florian; Bohlmann, Stephanie; Foth, Andreas; Bley, Sebastian; Pfüller, Anne; Giannakaki, Eleni; Lihavainen, Heikki; Viisanen, Yrjö; Hooda, Rakesh Kumar; Nepomuceno Pereira, Sérgio; Bortoli, Daniele; Wagner, Frank; Mattis, Ina; Janicka, Lucja; Markowicz, Krzysztof M.; Achtert, Peggy; Artaxo, Paulo; Pauliquevis, Theotonio; Souza, Rodrigo A. F.; Prakesh Sharma, Ved; Gideon van Zyl, Pieter; Beukes, Johan Paul; Sun, Junying; Rohwer, Erich G.; Deng, Ruru; Mamouri, Rodanthi-Elisavet; Zamorano, Felix

    2016-04-01

    A global vertically resolved aerosol data set covering more than 10 years of observations at more than 20 measurement sites distributed from 63° N to 52° S and 72° W to 124° E has been achieved within the Raman and polarization lidar network PollyNET. This network consists of portable, remote-controlled multiwavelength-polarization-Raman lidars (Polly) for automated and continuous 24/7 observations of clouds and aerosols. PollyNET is an independent, voluntary, and scientific network. All Polly lidars feature a standardized instrument design with different capabilities ranging from single wavelength to multiwavelength systems, and now apply unified calibration, quality control, and data analysis. The observations are processed in near-real time without manual intervention, and are presented online at http://polly.tropos.de/. The paper gives an overview of the observations on four continents and two research vessels obtained with eight Polly systems. The specific aerosol types at these locations (mineral dust, smoke, dust-smoke and other dusty mixtures, urban haze, and volcanic ash) are identified by their Ångström exponent, lidar ratio, and depolarization ratio. The vertical aerosol distribution at the PollyNET locations is discussed on the basis of more than 55 000 automatically retrieved 30 min particle backscatter coefficient profiles at 532 nm as this operating wavelength is available for all Polly lidar systems. A seasonal analysis of measurements at selected sites revealed typical and extraordinary aerosol conditions as well as seasonal differences. These studies show the potential of PollyNET to support the establishment of a global aerosol climatology that covers the entire troposphere.

  12. Automated analysis technique developed for detection of ODSCC on the tubes of OPR1000 steam generator

    International Nuclear Information System (INIS)

    A steam generator (SG) tube is an important component of a nuclear power plant (NPP). It works as a pressure boundary between the primary and secondary systems. The integrity of SG tubes is assessed by eddy current testing at every outage. The eddy current technique (using a bobbin probe) is currently the main method used to assess the integrity of steam generator tubing. An eddy current signal analyst for steam generator tubes continuously analyzes data over a long period of time. The analyst may therefore tire and make mistakes, such as missing indications or failing to separate a true defect signal from a more complicated one. Such errors can lead to confusion and an improper interpretation of the signal analysis. To avoid these possibilities, many countries have opted for automated analysis. Axial ODSCC (outside diameter stress corrosion cracking) defects on the tubes of OPR1000 steam generators have been found on tubes that are in contact with the tube support plates. In this study, automated analysis software called CDS (computer data screening), made by Zetec, was used. This paper discusses the results of introducing an automated analysis system for axial ODSCC on the tubes of an OPR1000 steam generator.

  13. Who On Earth Is "Mr. Cypher": Automated Friend Injection Attacks on Social Networking Sites

    OpenAIRE

    Huber, Markus; Mulazzani, Martin; Weippl, Edgar

    2010-01-01

    In this paper we present a novel friend injection attack which exploits the fact that the great majority of social networking sites fail to protect the communication between their users and their services. In a practical evaluation using public wireless access points, we furthermore demonstrate the feasibility of our attack. The friend injection attack enables a stealthy infiltration of social networks and thus outlines the devastating consequences of active eavesdropping attack...

  14. Analysis of SSR Using Artificial Neural Networks

    OpenAIRE

    Nagabhushana, BS; Chandrasekharaiah, HS

    1996-01-01

    Artificial neural networks (ANNs) are being advantageously applied to power system analysis problems. They possess the ability to establish complicated input-output mappings through a learning process, without any explicit programming. In this paper, an ANN based method for subsynchronous resonance (SSR) analysis is presented. The designed ANN outputs a measure of the possibility of the occurrence of SSR and is fully trained to accommodate the variations of power system parameters over the en...

  15. Automated Fetal Heart Rate Analysis in Labor: Decelerations and Overshoots

    International Nuclear Information System (INIS)

    Electronic fetal heart rate (FHR) recording is a standard way of monitoring fetal health in labor. Decelerations and accelerations usually indicate fetal distress and normality, respectively. But one type of acceleration may differ, namely an overshoot that may atypically reflect fetal stress. Here we describe a new method for detecting decelerations, accelerations and overshoots as part of a novel system for computerized FHR analysis (OxSyS). There was poor agreement between clinicians when identifying these FHR features visually, which precluded setting a gold standard of interpretation. We therefore introduced 'modified' Sensitivity (SE°) and 'modified' Positive Predictive Value (PPV°) as appropriate performance measures with which the algorithm was optimized. The relation between overshoots and fetal compromise in labor was studied in 15 cases and 15 controls. Overshoots showed promise as an indicator of fetal compromise. Unlike ordinary accelerations, overshoots cannot be considered to be reassuring features of fetal health.
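
    The 'modified' SE° and PPV° are specific to this paper; for orientation, the ordinary sensitivity and positive predictive value they build on are sketched below. The detector counts used here are hypothetical:

```python
def sensitivity(true_pos, false_neg):
    """Fraction of real events the detector found: TP / (TP + FN)."""
    return true_pos / (true_pos + false_neg)

def positive_predictive_value(true_pos, false_pos):
    """Fraction of detections that were real: TP / (TP + FP)."""
    return true_pos / (true_pos + false_pos)

# Hypothetical counts for a deceleration detector evaluated against
# expert annotations: 42 true detections, 8 missed events, 14 false alarms.
se = sensitivity(42, 8)                   # 0.84
ppv = positive_predictive_value(42, 14)   # 0.75
```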

  16. Tool for automated method design in activation analysis

    International Nuclear Information System (INIS)

    A computational approach to the optimization of the adjustable parameters of nuclear activation analysis has been developed for use in comprehensive method design calculations. An estimate of sample composition is used to predict the gamma-ray spectra to be expected for given sets of values of experimental parameters. These spectra are used to evaluate responses such as detection limits and measurement precision for application to optimization by the simplex method. This technique has been successfully implemented for the simultaneous determination of sample size and irradiation, decay and counting times by the optimization of either detection limit or precision. Both single-element and multielement determinations can be designed with the aid of these calculations. The combination of advance prediction and simplex optimization is both flexible and efficient and produces numerical results suitable for use in further computations
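
    The simplex optimization described above can be illustrated with SciPy's Nelder-Mead implementation. The objective below is a hypothetical stand-in for a predicted detection-limit response over irradiation and counting times, not the paper's actual spectrum-based model:

```python
import math
from scipy.optimize import minimize

# Hypothetical stand-in response: a "detection limit" figure of merit that
# improves with irradiation time t_irr and counting time t_count, but is
# penalized by total analysis time (the real objective would come from
# predicted gamma-ray spectra).
def detection_limit(x):
    t_irr, t_count = x
    signal = t_irr * (1.0 - math.exp(-t_count / 10.0))
    return 1.0 / max(signal, 1e-9) + 0.01 * (t_irr + t_count)

# Simplex (Nelder-Mead) search over the two adjustable time parameters.
res = minimize(detection_limit, x0=[1.0, 1.0], method="Nelder-Mead")
```

    Nelder-Mead needs only objective values, no derivatives, which is why it suits responses evaluated from predicted spectra.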

  17. Enhancing the Authentication of Bank Cheque Signatures by Implementing Automated System Using Recurrent Neural Network

    CERN Document Server

    Rao, Mukta; Dhaka, Vijaypal Singh

    2010-01-01

    The associative memory feature of the Hopfield-type recurrent neural network is used for pattern storage and pattern authentication. This paper outlines an optimization relaxation approach for signature verification based on the Hopfield neural network (HNN), which is a recurrent network. The standard sample signature of the customer is cross-matched with the one supplied on the cheque. The difference percentage is obtained by counting the differing pixels in the two images. The network topology is built so that each pixel in the difference image is a neuron in the network. Each neuron is characterized by its state, which in turn signifies whether the particular pixel has changed. The network converges to a stable condition based on the energy function derived in the experiments. The Hopfield model allows each node to take on one of two binary state values (changed/unchanged) for each pixel. The performance of the proposed technique is evaluated by applying it to various binary and grayscale images. This paper con...
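
    The difference-percentage step described above (counting differing pixels between the reference signature and the one on the cheque) can be sketched as follows; the arrays here are toy binary images, not real signatures:

```python
import numpy as np

def difference_percentage(img_a, img_b):
    """Percentage of pixels that differ between two equal-size binary images."""
    a = np.asarray(img_a, dtype=bool)
    b = np.asarray(img_b, dtype=bool)
    # XOR marks exactly the pixels that changed between the two images.
    return 100.0 * np.count_nonzero(a ^ b) / a.size

ref = np.array([[1, 0], [0, 1]])
test = np.array([[1, 1], [0, 1]])
pct = difference_percentage(ref, test)  # one of four pixels differs -> 25.0
```

    In the paper's formulation, each pixel of this difference image then becomes one neuron of the Hopfield network.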

  18. Trends and applications of integrated automated ultra-trace sample handling and analysis (T9)

    International Nuclear Information System (INIS)

    Full text: Automated analysis, sub-ppt detection limits, and the trend toward speciated analysis (rather than just elemental analysis) force the innovation of sophisticated and integrated sample preparation and analysis techniques. Traditionally, the ability to handle samples at ppt and sub-ppt levels has been limited to clean laboratories and special sample handling techniques and equipment. The world of sample handling has passed a threshold where older, traditional techniques no longer provide the ability to see the sample, due to the influence of the analytical blank and the fragile nature of the analyte. When samples require decomposition, extraction, separation, and manipulation, newer, more sophisticated sample handling systems are emerging that enable ultra-trace analysis and species manipulation. In addition, new instrumentation has emerged that integrates sample preparation and analysis to enable on-line, near-real-time analysis. Examples of these newer sample handling methods will be discussed, and current examples provided as alternatives to traditional sample handling. Two new techniques applying ultra-trace, microwave-energy-enhanced sample handling have been developed that permit sample separation and refinement while performing species manipulation during decomposition. A demonstration that applies to semiconductor materials will be presented. Next, a new approach to the old problem of sample evaporation without losses will be demonstrated that is capable of retaining all elements and species tested. Both of these methods require microwave energy manipulation in specialized systems and are not accessible through convection, conduction, or other traditional energy applications. A new automated, integrated method for handling samples for ultra-trace analysis has been developed. An on-line, near-real-time measurement system will be described that enables many new automated sample handling and measurement capabilities. This

  19. A new web-based method for automated analysis of muscle histology

    Directory of Open Access Journals (Sweden)

    Pertl Cordula

    2013-01-01

    Full Text Available. Abstract. Background: Duchenne Muscular Dystrophy is an inherited degenerative neuromuscular disease characterised by rapidly progressive muscle weakness. Currently, curative treatment is not available. Approaches to new treatments that improve muscle strength and quality of life depend on preclinical testing in animal models. The mdx mouse model is the most frequently used animal model for preclinical studies in muscular dystrophy research. Standardised, pathology-relevant parameters of dystrophic muscle in mdx mice for histological analysis have been developed in international collaborative efforts, but automation has not been accessible to most research groups. A standardised and mainly automated quantitative assessment of histopathological parameters in the mdx mouse model is desirable to allow an objective comparison between laboratories. Methods: Immunological and histochemical reactions were used to obtain a double staining for fast and slow myosin. Additionally, fluorescence staining of the myofibre membranes allows the minimal Feret's diameter to be defined. Staining of myonuclei with the fluorescent dye bisbenzimide H was utilised to identify nuclei located internally within myofibres. Relevant structures were extracted from the image as single objects and assigned to different object classes using web-based image analysis (MyoScan). Quantitative and morphometric data were analysed, e.g. the number of nuclei per fibre and the minimal Feret's diameter in 6-month-old wild-type C57BL/10 mice and mdx mice. Results: In the current version of the module "MyoScan", essential parameters for histological analysis of muscle sections were implemented, including the minimal Feret's diameter of the myofibres and the automated calculation of the percentage of internally nucleated myofibres. Morphometric data obtained in the present study were in good agreement with previously reported data in the literature and with data obtained from manual
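
    The minimal Feret's diameter used above is the smallest caliper width of a myofibre cross-section. A brute-force rotating-projection approximation over boundary points is sketched below; this is an illustration of the geometric idea, not MyoScan's actual implementation:

```python
import numpy as np

def min_feret_diameter(points, n_angles=360):
    """Approximate the minimal Feret (caliper) diameter of a 2-D point set
    by measuring the projected width over a sweep of directions."""
    pts = np.asarray(points, dtype=float)
    angles = np.linspace(0.0, np.pi, n_angles, endpoint=False)
    dirs = np.stack([np.cos(angles), np.sin(angles)], axis=1)
    proj = pts @ dirs.T                       # shape (n_points, n_angles)
    widths = proj.max(axis=0) - proj.min(axis=0)
    return widths.min()

# For a unit square the minimal caliper width is 1.0 (side-to-side).
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
```

    The minimal Feret's diameter is preferred over area-derived diameters because it is insensitive to the oblique sectioning angle of the fibre.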

  20. Automation of Extraction Chromatographic and Ion Exchange Separations for Radiochemical Analysis and Monitoring

    International Nuclear Information System (INIS)

    Radiochemical analysis, complete with the separation of radionuclides of interest from the sample matrix and from other interfering radionuclides, is often an essential step in the determination of the radiochemical composition of a nuclear sample or process stream. Although some radionuclides can be determined nondestructively by gamma spectroscopy, where the gamma rays penetrate significant distances in condensed media and the gamma ray energies are diagnostic for specific radionuclides, other radionuclides that may be of interest emit only alpha or beta particles. For these, samples must be taken for destructive analysis and radiochemical separations are required. For process monitoring purposes, the radiochemical separation and detection methods must be rapid so that the results will be timely. These results could be obtained by laboratory analysis or by radiochemical process analyzers operating on-line or at-site. In either case, there is a need for automated radiochemical analysis methods to provide speed, throughput, safety, and consistent analytical protocols. Classical methods of separation used during the development of nuclear technologies, namely manual precipitations, solvent extractions, and ion exchange, are slow and labor intensive. Fortunately, the convergence of digital instrumentation for preprogrammed fluid manipulation and the development of new separation materials for column-based isolation of radionuclides has enabled the development of automated radiochemical analysis methodology. The primary means for separating radionuclides in solution are liquid-liquid extraction and ion exchange. These processes are well known and have been reviewed in the past [1]. Ion exchange is readily employed in column formats. Liquid-liquid extraction can also be implemented in column formats using solvent-impregnated resins as extraction chromatographic materials. The organic liquid extractant is immobilized in the pores of a microporous polymer material. Under