WorldWideScience

Sample records for automated network analysis

  1. Automated Analysis of Security in Networking Systems

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2004-01-01

    such networking systems are modelled in the process calculus LySa. On top of this programming language based formalism an analysis is developed, which relies on techniques from data and control flow analysis. These are techniques that can be fully automated, which makes them an ideal basis for tools targeted at non...

  2. Driver-centred vehicle automation: using network analysis for agent-based modelling of the driver in highly automated driving systems.

    Science.gov (United States)

    Banks, Victoria A; Stanton, Neville A

    2016-11-01

    To the average driver, the concept of automation in driving suggests that they can become completely 'hands and feet free'. This is a common misconception, however, as has been shown through the application of Network Analysis to new Cruise Assist technologies that may feature on our roads by 2020. Through the adoption of a Systems Theoretic approach, this paper introduces the concept of driver-initiated automation, which reflects the role of the driver in highly automated driving systems. Using a combination of traditional task analysis and quantitative network metrics, this agent-based modelling paper shows how the role of the driver remains an integral part of the driving system, implying that designers need to provide drivers with the tools necessary to remain actively in-the-loop despite increasing opportunities to delegate control to the automated subsystems. Practitioner Summary: This paper describes and analyses a driver-initiated command and control system of automation using representations afforded by task and social networks to understand how drivers remain actively involved in the task. A network analysis of different driver commands suggests that such a strategy does maintain the driver in the control loop.
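
    As a rough illustration of the quantitative network metrics this record refers to, the sketch below computes degree and betweenness centrality for an invented driver-automation task network with Python's networkx; the agents and links are hypothetical stand-ins, not the paper's actual task or social networks.

        # Hypothetical driver-automation task network; the centrality metrics are
        # standard, but the nodes and edges here are invented for illustration.
        import networkx as nx

        edges = [
            ("driver", "cruise_assist"), ("driver", "steering_assist"),
            ("cruise_assist", "radar"), ("steering_assist", "camera"),
            ("driver", "instrument_cluster"), ("instrument_cluster", "cruise_assist"),
        ]
        G = nx.Graph(edges)

        # Centrality indicates how integral each agent is to the control loop.
        for name, metric in [("degree", nx.degree_centrality(G)),
                             ("betweenness", nx.betweenness_centrality(G))]:
            print(name, {n: round(v, 2) for n, v in metric.items()})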

  3. Artificial neural networks for automation of Rutherford backscattering spectroscopy experiments and data analysis

    International Nuclear Information System (INIS)

    Barradas, N.P.; Vieira, A.; Patricio, R.

    2002-01-01

    We present an algorithm based on artificial neural networks able to determine optimized experimental conditions for Rutherford backscattering measurements of Ge-implanted Si. The algorithm can be implemented for any other element implanted into a lighter substrate, and it is foreseeable that the method developed in this work can be applied to many other systems. The algorithm presented is a push-button black box and does not require any human intervention. It is thus suited for automated control of an experimental setup, given an interface to the relevant hardware. Once the experimental conditions are optimized, the algorithm analyzes the final data obtained and determines the desired parameters; the method is thus also suited for automated analysis of the data. The algorithm presented can be easily extended to other ion beam analysis techniques. Finally, it is suggested how the artificial neural networks required for automated control and analysis of experiments could be automatically generated, which would be suited for automated generation of the required computer code. RBS could thus be done without experimentalists, data analysts, or programmers, with only technicians to keep the machines running
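
    A minimal sketch of the general idea, not the authors' network: an off-the-shelf multilayer perceptron (scikit-learn) is trained to recover an implant parameter from synthetic toy "spectra"; the data model and the depth parameter below are invented for illustration.

        # Toy stand-in: map simulated backscattering spectra to a hypothetical
        # implant depth with a small MLP (synthetic data, invented signal model).
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        depth = rng.uniform(50, 500, size=(200, 1))      # hypothetical implant depth (nm)
        channels = np.linspace(0, 1, 64)
        # Toy "spectra": a Gaussian feature whose position shifts with depth, plus noise.
        spectra = (np.exp(-((channels - depth / 500.0) ** 2) / 0.01)
                   + rng.normal(0, 0.02, (200, 64)))

        model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
        model.fit(spectra, depth.ravel())
        print("recovered depth for first spectrum:", model.predict(spectra[:1]))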

  4. Network-based automation for SMEs

    DEFF Research Database (Denmark)

    Parizi, Mohammad Shahabeddini; Radziwon, Agnieszka

    2017-01-01

    The implementation of appropriate automation concepts which increase productivity in Small and Medium Sized Enterprises (SMEs) requires a lot of effort, due to their limited resources. Therefore, it is strongly recommended for small firms to open up to external sources of knowledge, which...... could be obtained through network interaction. Based on two extreme cases of SMEs representing low-tech industry and an in-depth analysis of their manufacturing facilities, this paper presents how collaboration between firms embedded in a regional ecosystem could result in implementation of new...... with other members of the same regional ecosystem. The findings highlight two main automation-related areas where manufacturing SMEs could leverage external sources of knowledge: assistance in defining the automation problem, and selection of an appropriate solution and provider. Consequently...

  5. Logistic control in automated transportation networks

    NARCIS (Netherlands)

    Ebben, Mark

    2001-01-01

    Increasing congestion problems lead to a search for alternative transportation systems. Automated transportation networks, possibly underground, are an option. Logistic control systems are essential for future implementations of such automated transportation networks. This book contributes to the

  6. Automated Analysis of Infinite Scenarios

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2005-01-01

    The security of a network protocol crucially relies on the scenario in which the protocol is deployed. This paper describes syntactic constructs for modelling network scenarios and presents an automated analysis tool, which can guarantee that security properties hold in all of the (infinitely many...

  7. Automated analysis of information processing, kinetic independence and modular architecture in biochemical networks using MIDIA.

    Science.gov (United States)

    Bowsher, Clive G

    2011-02-15

    Understanding the encoding and propagation of information by biochemical reaction networks and the relationship of such information processing properties to modular network structure is of fundamental importance in the study of cell signalling and regulation. However, a rigorous, automated approach for general biochemical networks has not been available, and high-throughput analysis has therefore been out of reach. Modularization Identification by Dynamic Independence Algorithms (MIDIA) is a user-friendly, extensible R package that performs automated analysis of how information is processed by biochemical networks. An important component is the algorithm's ability to identify exact network decompositions based on both the mass action kinetics and informational properties of the network. These modularizations are visualized using a tree structure from which important dynamic conditional independence properties can be directly read. Only partial stoichiometric information needs to be used as input to MIDIA, and neither simulations nor knowledge of rate parameters are required. When applied to a signalling network, for example, the method identifies the routes and species involved in the sequential propagation of information between its multiple inputs and outputs. These routes correspond to the relevant paths in the tree structure and may be further visualized using the Input-Output Path Matrix tool. MIDIA remains computationally feasible for the largest network reconstructions currently available and is straightforward to use with models written in Systems Biology Markup Language (SBML). The package is distributed under the GNU General Public License and is available, together with a link to browsable Supplementary Material, at http://code.google.com/p/midia. Further information is at www.maths.bris.ac.uk/~macgb/Software.html.

  8. Application of neural network and pattern recognition software to the automated analysis of continuous nuclear monitoring of on-load reactors

    Energy Technology Data Exchange (ETDEWEB)

    Howell, J.A.; Eccleston, G.W.; Halbig, J.K.; Klosterbuer, S.F. [Los Alamos National Lab., NM (United States); Larson, T.W. [California Polytechnic State Univ., San Luis Obispo, CA (US)

    1993-08-01

    Automated analysis using pattern recognition and neural network software can help interpret data, call attention to potential anomalies, and improve safeguards effectiveness. Automated software analysis, based on pattern recognition and neural networks, was applied to data collected from a radiation core discharge monitor system located adjacent to an on-load reactor core. Unattended radiation sensors continuously collect data to monitor on-line refueling operations in the reactor. The huge volume of data collected from a number of radiation channels makes it difficult for a safeguards inspector to review it all, check for consistency among the measurement channels, and find anomalies. Pattern recognition and neural network software can analyze large volumes of data from continuous, unattended measurements, thereby improving and automating the detection of anomalies. The authors developed a prototype pattern recognition program that determines the reactor power level and identifies the times when fuel bundles are pushed through the core during on-line refueling. Neural network models were also developed to predict fuel bundle burnup and to calculate the region on the on-load reactor face from which fuel bundles were discharged, based on the radiation signals. In the preliminary data set, which was limited and consisted of four distinct burnup regions, the neural network model correctly predicted the burnup region with an accuracy of 92%.
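
    A toy stand-in for the pattern-recognition step described above: flag push-like transients as excursions above a rolling baseline in a synthetic count-rate series. The signal model and thresholds are assumptions for illustration, not the authors' method.

        # Detect fuel-bundle "push" candidates as spikes above a rolling baseline.
        import numpy as np

        rng = np.random.default_rng(1)
        signal = rng.normal(100.0, 1.0, 3600)        # hypothetical count-rate series (1 Hz)
        signal[[600, 1800, 2950]] += 25.0            # injected push-like transients

        window = 60
        baseline = np.convolve(signal, np.ones(window) / window, mode="same")
        residual = signal - baseline
        threshold = 5.0 * residual.std()
        events = np.flatnonzero(residual > threshold)
        print("candidate push times (s):", events)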

  9. Research of the self-healing technologies in the optical communication network of distribution automation

    Science.gov (United States)

    Wang, Hao; Zhong, Guoxin

    2018-03-01

    Optical communication networks are the mainstream technique for the communication networks of distribution automation, and self-healing technologies can significantly improve their reliability. This paper discusses the technical characteristics and application scenarios of several network self-healing technologies in the access layer, the backbone layer and the core layer of optical communication networks for distribution automation. Based on a comparative analysis, the paper gives application suggestions for these self-healing technologies.
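
    One property self-healing relies on can be checked directly: the illustrative snippet below (not from the paper) verifies that every substation on a hypothetical fibre ring keeps two edge-disjoint paths to the master station, which is what ring-protection switching exploits.

        # On an intact ring, each node has two edge-disjoint routes to the master.
        import networkx as nx

        ring = nx.cycle_graph(["master", "s1", "s2", "s3", "s4", "s5"])
        for node in ["s1", "s2", "s3", "s4", "s5"]:
            paths = list(nx.edge_disjoint_paths(ring, "master", node))
            print(node, "disjoint paths:", len(paths))   # 2 while the ring is intact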

  10. Optimization-based Method for Automated Road Network Extraction

    International Nuclear Information System (INIS)

    Xiong, D

    2001-01-01

    Automated road information extraction has significant applicability in transportation. It provides a means for creating, maintaining, and updating transportation network databases that are needed for purposes ranging from traffic management to automated vehicle navigation and guidance. This paper reviews the literature on road extraction and describes a study of an optimization-based method for automated road network extraction

  11. Automated quantification of neuronal networks and single-cell calcium dynamics using calcium imaging.

    Science.gov (United States)

    Patel, Tapan P; Man, Karen; Firestein, Bonnie L; Meaney, David F

    2015-03-30

    Recent advances in genetically engineered calcium and membrane potential indicators provide the potential to estimate the activation dynamics of individual neurons within larger, mesoscale networks (100s to 1000+ neurons). However, a fully integrated automated workflow for the analysis and visualization of neural microcircuits from high-speed fluorescence imaging data is lacking. Here we introduce FluoroSNNAP, Fluorescence Single Neuron and Network Analysis Package. FluoroSNNAP is an open-source, interactive software developed in MATLAB for automated quantification of numerous biologically relevant features of both the calcium dynamics of single-cells and network activity patterns. FluoroSNNAP integrates and improves upon existing tools for spike detection, synchronization analysis, and inference of functional connectivity, making it most useful to experimentalists with little or no programming knowledge. We apply FluoroSNNAP to characterize the activity patterns of neuronal microcircuits undergoing developmental maturation in vitro. Separately, we highlight the utility of single-cell analysis for phenotyping a mixed population of neurons expressing a human mutant variant of the microtubule associated protein tau and wild-type tau. We show the performance of semi-automated cell segmentation using spatiotemporal independent component analysis and significant improvement in detecting calcium transients using a template-based algorithm in comparison to peak-based or wavelet-based detection methods. Our software further enables automated analysis of microcircuits, which is an improvement over existing methods. We expect the dissemination of this software will facilitate a comprehensive analysis of neuronal networks, promoting the rapid interrogation of circuits in health and disease.
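
    The template-matching idea can be sketched independently of the package (this is not FluoroSNNAP code): correlate a ΔF/F trace against a canonical fast-rise, exponential-decay transient and threshold the correlation; all amplitudes, rates and thresholds below are invented.

        # Template-based transient detection on a synthetic ΔF/F trace.
        import numpy as np

        rng = np.random.default_rng(2)
        t = np.arange(0, 60, 0.1)                          # 10 Hz trace, hypothetical
        trace = rng.normal(0, 0.02, t.size)
        template = np.exp(-np.arange(0, 3, 0.1) / 1.0)     # canonical decay template
        for onset in (120, 340):                           # injected transients
            trace[onset:onset + template.size] += 0.3 * template

        def sliding_corr(x, w):
            # Pearson correlation of each window of x against the template w.
            out = np.zeros(x.size - w.size + 1)
            for i in range(out.size):
                out[i] = np.corrcoef(x[i:i + w.size], w)[0, 1]
            return out

        corr = sliding_corr(trace, template)
        print("candidate onset samples:", np.flatnonzero(corr > 0.85))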

  12. A computational framework for the automated construction of glycosylation reaction networks.

    Science.gov (United States)

    Liu, Gang; Neelamegham, Sriram

    2014-01-01

    Glycosylation is among the most common and complex post-translational modifications identified to date. It proceeds through the catalytic action of multiple enzyme families that include the glycosyltransferases that add monosaccharides to growing glycans, and glycosidases which remove sugar residues to trim glycans. The expression level and specificity of these enzymes, in part, regulate the glycan distribution or glycome of specific cell/tissue systems. Currently, there is no systematic method to describe the enzymes and cellular reaction networks that catalyze glycosylation. To address this limitation, we present a streamlined machine-readable definition for the glycosylating enzymes and additional methodologies to construct and analyze glycosylation reaction networks. In this computational framework, the enzyme class is systematically designed to store detailed specificity data such as enzymatic functional group, linkage and substrate specificity. The new classes and their associated functions enable both single-reaction inference and automated full network reconstruction, when given a list of reactants and/or products along with the enzymes present in the system. In addition, graph theory is used to support functions that map the connectivity between two or more species in a network, and that generate subset models to identify rate-limiting steps regulating glycan biosynthesis. Finally, this framework allows the synthesis of biochemical reaction networks using mass spectrometry (MS) data. The features described above are illustrated using three case studies that examine: i) O-linked glycan biosynthesis during the construction of functional selectin-ligands; ii) automated N-linked glycosylation pathway construction; and iii) the handling and analysis of glycomics based MS data. Overall, the new computational framework enables automated glycosylation network model construction and analysis by integrating knowledge of glycan structure and enzyme biochemistry. All
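
    A hedged toy of rule-based network construction, not the paper's enzyme classes: enzymes act as rewrite rules on glycan strings, and breadth-first application from a seed glycan enumerates the reaction network. The two "enzymes" and the glycan notation are hypothetical.

        # Breadth-first expansion of a reaction network from enzyme rewrite rules.
        from collections import deque

        # Hypothetical enzymes: (name, required terminal residue, extended terminus).
        enzymes = [("GalT", "GlcNAc", "GlcNAc-Gal"), ("SiaT", "Gal", "Gal-Sia")]

        def products(glycan):
            for name, req, new in enzymes:
                if glycan.endswith(req):
                    yield name, glycan[:-len(req)] + new

        network, frontier = set(), deque(["Man-GlcNAc"])
        while frontier:
            g = frontier.popleft()
            for enzyme, p in products(g):
                if (g, enzyme, p) not in network:
                    network.add((g, enzyme, p))
                    frontier.append(p)

        for substrate, enzyme, product in sorted(network):
            print("%s --%s--> %s" % (substrate, enzyme, product))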

  13. A computational framework for the automated construction of glycosylation reaction networks.

    Directory of Open Access Journals (Sweden)

    Gang Liu

    Glycosylation is among the most common and complex post-translational modifications identified to date. It proceeds through the catalytic action of multiple enzyme families that include the glycosyltransferases that add monosaccharides to growing glycans, and glycosidases which remove sugar residues to trim glycans. The expression level and specificity of these enzymes, in part, regulate the glycan distribution or glycome of specific cell/tissue systems. Currently, there is no systematic method to describe the enzymes and cellular reaction networks that catalyze glycosylation. To address this limitation, we present a streamlined machine-readable definition for the glycosylating enzymes and additional methodologies to construct and analyze glycosylation reaction networks. In this computational framework, the enzyme class is systematically designed to store detailed specificity data such as enzymatic functional group, linkage and substrate specificity. The new classes and their associated functions enable both single-reaction inference and automated full network reconstruction, when given a list of reactants and/or products along with the enzymes present in the system. In addition, graph theory is used to support functions that map the connectivity between two or more species in a network, and that generate subset models to identify rate-limiting steps regulating glycan biosynthesis. Finally, this framework allows the synthesis of biochemical reaction networks using mass spectrometry (MS) data. The features described above are illustrated using three case studies that examine: i) O-linked glycan biosynthesis during the construction of functional selectin-ligands; ii) automated N-linked glycosylation pathway construction; and iii) the handling and analysis of glycomics based MS data. Overall, the new computational framework enables automated glycosylation network model construction and analysis by integrating knowledge of glycan structure and enzyme

  14. Network meta-analysis using R: a review of currently available automated packages.

    Directory of Open Access Journals (Sweden)

    Binod Neupane

    Network meta-analysis (NMA), a statistical technique that allows simultaneous comparison of multiple treatments in the same meta-analysis, has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and the software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA, with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R package to use depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined they provide users with nearly all the functionality that might be desired when conducting an NMA.
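
    For orientation, a minimal fixed-effect NMA can be written in a few lines outside R (the packages reviewed here do far more, e.g. random effects, inconsistency checks and treatment ranking); the trial contrasts below are invented.

        # Contrast-based fixed-effect NMA via weighted least squares (toy data).
        import numpy as np

        treatments = ["A", "B", "C"]            # "A" taken as reference
        # Hypothetical trials: (treat1, treat2, effect d = t2 - t1, variance of d).
        trials = [("A", "B", 0.5, 0.04), ("A", "C", 0.8, 0.05), ("B", "C", 0.2, 0.06)]

        idx = {t: i for i, t in enumerate(treatments[1:])}   # columns: non-reference
        X = np.zeros((len(trials), len(treatments) - 1))
        y = np.array([d for _, _, d, _ in trials])
        W = np.diag([1.0 / v for _, _, _, v in trials])
        for row, (t1, t2, _, _) in enumerate(trials):
            if t1 in idx: X[row, idx[t1]] -= 1.0
            if t2 in idx: X[row, idx[t2]] += 1.0

        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)     # effects versus A
        print(dict(zip(treatments[1:], beta.round(3))))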

  15. Content-driven analysis of an online community for smoking cessation: integration of qualitative techniques, automated text analysis, and affiliation networks.

    Science.gov (United States)

    Myneni, Sahiti; Fujimoto, Kayo; Cobb, Nathan; Cohen, Trevor

    2015-06-01

    We identified content-specific patterns of network diffusion underlying smoking cessation in the context of online platforms, with the aim of generating targeted intervention strategies. QuitNet is an online social network for smoking cessation. We analyzed 16,492 de-identified peer-to-peer messages from 1423 members, posted between March 1 and April 30, 2007. Our mixed-methods approach comprised qualitative coding, automated text analysis, and affiliation network analysis to identify, visualize, and analyze content-specific communication patterns underlying smoking behavior. Themes we identified in QuitNet messages included relapse, QuitNet-specific traditions, and cravings. QuitNet members who were exposed to other abstinent members by exchanging content related to interpersonal themes (e.g., social support, traditions, progress) tended to abstain. Themes found in other types of content did not show significant correlation with abstinence. Modeling health-related affiliation networks through content-driven methods can enable the identification of specific content related to higher abstinence rates, which facilitates targeted health promotion.
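
    The affiliation-network step can be sketched as follows (invented members and themes, not QuitNet data): members are linked to message themes, and a weighted one-mode projection shows which themes co-involve the same members.

        # Bipartite member-theme affiliation network and its theme projection.
        import networkx as nx
        from networkx.algorithms import bipartite

        B = nx.Graph()
        members = ["m1", "m2", "m3"]
        themes = ["social_support", "traditions", "cravings"]
        B.add_nodes_from(members, bipartite=0)
        B.add_nodes_from(themes, bipartite=1)
        B.add_edges_from([("m1", "social_support"), ("m1", "traditions"),
                          ("m2", "traditions"), ("m3", "cravings")])

        theme_net = bipartite.weighted_projected_graph(B, themes)
        print(list(theme_net.edges(data=True)))   # themes linked via shared members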

  16. Automated experimentation in ecological networks.

    Science.gov (United States)

    Lurgi, Miguel; Robertson, David

    2011-05-09

    In ecological networks, natural communities are studied from a complex systems perspective by representing interactions among species within them in the form of a graph, which is in turn analysed using mathematical tools. Topological features encountered in complex networks have been shown to provide the systems they represent with interesting attributes such as robustness and stability, which in ecological systems translates into the ability of communities to resist perturbations of different kinds. A focus of research in community ecology is on understanding the mechanisms by which these complex networks of interactions among species in a community arise. We employ an agent-based approach to model ecological processes operating at the species interaction level for the study of the emergence of organisation in ecological networks. We have designed protocols of interaction among agents in a multi-agent system based on ecological processes occurring at the interaction level between species in plant-animal mutualistic communities. Interaction models for agent coordination engineered in this way facilitate the emergence, in our artificial societies of agents, of network features such as those found in ecological networks of interacting species. Agent-based models developed in this way facilitate the automation of the design and execution of simulation experiments that allow for the exploration of diverse behavioural mechanisms believed to be responsible for community organisation in ecological communities. This automated way of conducting experiments empowers the study of ecological networks by exploiting the expressive power of interaction model specification in agent systems.

  17. Statistical modelling of networked human-automation performance using working memory capacity.

    Science.gov (United States)

    Ahmed, Nisar; de Visser, Ewart; Shaw, Tyler; Mohamed-Ameen, Amira; Campbell, Mark; Parasuraman, Raja

    2014-01-01

    This study examines the challenging problem of modelling the interaction between individual attentional limitations and decision-making performance in networked human-automation system tasks. Analysis of real experimental data from a task involving networked supervision of multiple unmanned aerial vehicles by human participants shows that both task load and network message quality affect performance, but that these effects are modulated by individual differences in working memory (WM) capacity. These insights were used to assess three statistical approaches for modelling and making predictions with real experimental networked supervisory performance data: classical linear regression, non-parametric Gaussian processes and probabilistic Bayesian networks. It is shown that each of these approaches can help designers of networked human-automated systems cope with various uncertainties in order to accommodate future users by linking expected operating conditions and performance from real experimental data to observable cognitive traits like WM capacity. Practitioner Summary: Working memory (WM) capacity helps account for inter-individual variability in operator performance in networked unmanned aerial vehicle supervisory tasks. This is useful for reliable performance prediction near experimental conditions via linear models; robust statistical prediction beyond experimental conditions via Gaussian process models and probabilistic inference about unknown task conditions/WM capacities via Bayesian network models.
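
    A hedged sketch of the modelling comparison on synthetic stand-in data: predict a performance score from task load and working-memory capacity with a linear model and a Gaussian process, as the study does with real experimental data; the data-generating process below is invented.

        # Linear regression vs Gaussian process on synthetic performance data.
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.gaussian_process import GaussianProcessRegressor

        rng = np.random.default_rng(3)
        X = rng.uniform([1, 2], [8, 9], size=(100, 2))   # columns: task load, WM capacity
        y = 0.9 * X[:, 1] - 0.6 * X[:, 0] + rng.normal(0, 0.3, 100)

        lin = LinearRegression().fit(X, y)
        gp = GaussianProcessRegressor().fit(X, y)
        probe = np.array([[6.0, 4.0]])                   # high load, modest WM capacity
        print("linear:", lin.predict(probe))
        print("GP (mean, std):", gp.predict(probe, return_std=True))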

  18. Realtime Automation Networks in moVing industrial Environments

    Directory of Open Access Journals (Sweden)

    Rafael Leidinger

    2012-04-01

    Radio-based wireless data communication has made new technical solutions possible in many fields of automation technology (AT). For about ten years, a constant, disproportionate growth of wireless technologies has been observed in automation technology. However, it has become clear that conventional office-automation technologies in particular are unsuitable or unmanageable for the AT. The employment of mobile services in industrial automation technology has the potential for significant cost and time savings. This leads to increased productivity in various fields of the AT, for example in factory and process automation or in production logistics. In this paper, technologies and solutions for an automation-suited supply of mobile wireless services are introduced under the criteria of real-time suitability, IT security and service orientation. Emphasis is put on the investigation and development of wireless convergence layers for different radio technologies, on the central provision of support services for an easy-to-use, central, backup-enabled management of combined wired/wireless networks, and on a study of integrability in a Profinet real-time Ethernet network [1].

  19. Automated minimax design of networks

    DEFF Research Database (Denmark)

    Madsen, Kaj; Schjær-Jacobsen, Hans; Voldby, J

    1975-01-01

    A new gradient algorithm for the solution of nonlinear minimax problems has been developed. The algorithm is well suited for automated minimax design of networks and it is very simple to use. It compares favorably with recent minimax and least-pth algorithms. General convergence problems related...
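
    The 1975 algorithm itself is not reproduced here, but the minimax design problem can be illustrated with the standard epigraph reformulation solved by a modern solver; the "network" below is a toy three-parameter response fitted to an arbitrary target.

        # Minimax fit via epigraph form: minimise t s.t. -t <= e_i <= t (SLSQP).
        import numpy as np
        from scipy.optimize import minimize

        freq = np.linspace(0.0, 1.0, 11)
        target = 1.0 / (1.0 + freq)                  # arbitrary toy target response

        def response(x, f):
            return x[0] + x[1] * f + x[2] * f ** 2   # toy 3-parameter "network"

        cons = []
        for f, d in zip(freq, target):
            cons.append({"type": "ineq",
                         "fun": lambda z, f=f, d=d: z[3] - (response(z[:3], f) - d)})
            cons.append({"type": "ineq",
                         "fun": lambda z, f=f, d=d: z[3] + (response(z[:3], f) - d)})

        z0 = np.array([0.0, 0.0, 0.0, 1.0])          # feasible start: t = 1 bounds errors
        res = minimize(lambda z: z[3], z0, constraints=cons, method="SLSQP")
        print("design:", res.x[:3].round(4), "worst-case error:", res.x[3].round(5))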

  20. Technological Developments in Networking, Education and Automation

    CERN Document Server

    Elleithy, Khaled; Iskander, Magued; Kapila, Vikram; Karim, Mohammad A; Mahmood, Ausif

    2010-01-01

    "Technological Developments in Networking, Education and Automation" includes a set of rigorously reviewed world-class manuscripts addressing and detailing state-of-the-art research projects in the following areas: Computer Networks: Access Technologies, Medium Access Control, Network architectures and Equipment, Optical Networks and Switching, Telecommunication Technology, and Ultra Wideband Communications. Engineering Education and Online Learning: including development of courses and systems for engineering, technical and liberal studies programs; online laboratories; intelligent

  1. A Systematic, Automated Network Planning Method

    DEFF Research Database (Denmark)

    Holm, Jens Åge; Pedersen, Jens Myrup

    2006-01-01

    This paper describes a case study conducted to evaluate the viability of a systematic, automated network planning method. The motivation for developing the network planning method was that many data networks are planned in an ad-hoc manner with no assurance of quality of the solution with respect...... structures, that are ready to implement in a real world scenario, are discussed at the end of the paper. These are in the areas of ensuring line independence and the complexity of the design rules for the planning method....

  2. Library Automation and Networking in India: Problems and Prospects.

    Science.gov (United States)

    Vyas, S. D.

    1997-01-01

    Examines the information infrastructure and the impact of information technology in India. Highlights include attempts toward automation; library networking at the national and local level; descriptions of four major networks; library software; and constraints of networking in academic libraries. (LRW)

  3. The automated ground network system

    Science.gov (United States)

    Smith, Miles T.; Militch, Peter N.

    1993-01-01

    The primary goal of the Automated Ground Network System (AGNS) project is to reduce Ground Network (GN) station life-cycle costs. To accomplish this goal, the AGNS project will employ an object-oriented approach to develop a new infrastructure that will permit continuous application of new technologies and methodologies to the Ground Network's class of problems. The AGNS project is a Total Quality (TQ) project. Through use of an open collaborative development environment, developers and users will have equal input into the end-to-end design and development process. This will permit direct user input and feedback and will enable rapid prototyping for requirements clarification. This paper describes the AGNS objectives, operations concept, and proposed design.

  4. Automated tool for virtual screening and pharmacology-based pathway prediction and analysis

    Directory of Open Access Journals (Sweden)

    Sugandh Kumar

    2017-10-01

    Virtual screening is an effective tool for lead identification in drug discovery. However, there are limited numbers of crystal structures available compared to the number of biological sequences, which makes Structure-Based Drug Discovery (SBDD) a difficult choice. The current tool is an attempt to automate protein structure modelling and virtual screening, followed by pharmacology-based prediction and analysis. Starting from sequence(s), this tool automates protein structure modelling, binding site identification, automated docking, ligand preparation, post-docking analysis and identification of hits in the biological pathways that can be modulated by a group of ligands. This automation helps in the characterization of ligand selectivity and the action of ligands on a complex biological molecular network as well as on individual receptors. The judicious combination of ligands binding different receptors can be used to inhibit selective biological pathways in a disease. This tool also allows the user to systematically investigate network-dependent effects of a drug or drug candidate.

  5. Operational experiences with automated acoustic burst classification by neural networks

    International Nuclear Information System (INIS)

    Olma, B.; Ding, Y.; Enders, R.

    1996-01-01

    Monitoring of Loose Parts Monitoring System sensors for signal bursts associated with metallic impacts of loose parts has proved to be a useful methodology for on-line assessment of the mechanical integrity of components in the primary circuit of nuclear power plants. With the availability of neural networks, new powerful possibilities for the classification and diagnosis of burst signals can be realized for acoustic monitoring with the on-line system RAMSES. In order to look for relevant burst signals, an automated classification is needed; that is, acoustic signature analysis and assessment have to be performed automatically on-line. A back-propagation neural network based on five pre-calculated signal parameter values has been set up for the identification of different signal types. During a three-month monitoring program of medium-operated check valves, burst signals were measured and classified separately according to their cause. The successful results of the three measurement campaigns with an automated burst type classification are presented. (author)
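
    A toy stand-in for the back-propagation classifier described above: five pre-computed burst parameters (invented here, loosely named after plausible features) are mapped to a burst-type label with an off-the-shelf MLP.

        # Five-feature burst classification with a small MLP (synthetic data).
        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(4)
        # Hypothetical features: rise time, duration, peak amplitude, energy, dominant freq.
        impacts = rng.normal([1.0, 5.0, 8.0, 40.0, 2.0], 0.5, size=(50, 5))
        other = rng.normal([3.0, 20.0, 2.0, 10.0, 6.0], 0.5, size=(50, 5))
        X = np.vstack([impacts, other])
        y = np.array([0] * 50 + [1] * 50)            # 0 = metallic impact, 1 = other burst

        clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                            random_state=0).fit(X, y)
        print("training accuracy:", clf.score(X, y))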

  6. Capacity analysis of an automated kit transportation system

    NARCIS (Netherlands)

    Zijm, W.H.M.; Adan, I.J.B.F.; Buitenhek, R.; Houtum, van G.J.J.A.N.

    2000-01-01

    In this paper, we present a capacity analysis of an automated transportation system in a flexible assembly factory. The transportation system, together with the workstations, is modeled as a network of queues with multiple job classes. Due to its complex nature, the steady-state behavior of this
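
    As a back-of-the-envelope companion to the paper's queueing model (which covers a full multi-class network, not a single station), the Erlang C formula gives the probability that a transport request must wait for a vehicle; the rates below are hypothetical.

        # M/M/c waiting probability (Erlang C) for a single toy station.
        from math import factorial

        def erlang_c(arrival_rate, service_rate, servers):
            """Probability that an arriving job must wait in an M/M/c queue."""
            a = arrival_rate / service_rate              # offered load (Erlangs)
            rho = a / servers
            if rho >= 1.0:
                return 1.0                               # unstable: everyone waits
            s = sum(a ** k / factorial(k) for k in range(servers))
            top = a ** servers / (factorial(servers) * (1 - rho))
            return top / (s + top)

        # Hypothetical vehicles acting as "servers" moving kits between stations.
        print("P(wait) with 4 vehicles:", round(erlang_c(3.0, 1.0, 4), 3))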

  7. A fully-automated neural network analysis of AFM force-distance curves for cancer tissue diagnosis

    Science.gov (United States)

    Minelli, Eleonora; Ciasca, Gabriele; Sassun, Tanya Enny; Antonelli, Manila; Palmieri, Valentina; Papi, Massimiliano; Maulucci, Giuseppe; Santoro, Antonio; Giangaspero, Felice; Delfini, Roberto; Campi, Gaetano; De Spirito, Marco

    2017-10-01

    Atomic Force Microscopy (AFM) has the unique capability of probing the nanoscale mechanical properties of biological systems that affect and are affected by the occurrence of many pathologies, including cancer. This capability has triggered growing interest in the translational process of AFM from physics laboratories to clinical practice. A factor still hindering the current use of AFM in diagnostics is related to the complexity of AFM data analysis, which is time-consuming and needs highly specialized personnel with a strong physical and mathematical background. In this work, we demonstrate an operator-independent neural-network approach for the analysis of surgically removed brain cancer tissues. This approach allowed us to distinguish—in a fully automated fashion—cancer from healthy tissues with high accuracy, also highlighting the presence and the location of infiltrating tumor cells.

  8. Toward the automated generation of genome-scale metabolic networks in the SEED.

    Science.gov (United States)

    DeJongh, Matthew; Formsma, Kevin; Boillot, Paul; Gould, John; Rycenga, Matthew; Best, Aaron

    2007-04-26

    Current methods for the automated generation of genome-scale metabolic networks focus on genome annotation and preliminary biochemical reaction network assembly, but do not adequately address the process of identifying and filling gaps in the reaction network, and verifying that the network is suitable for systems level analysis. Thus, current methods are only sufficient for generating draft-quality networks, and refinement of the reaction network is still largely a manual, labor-intensive process. We have developed a method for generating genome-scale metabolic networks that produces substantially complete reaction networks, suitable for systems level analysis. Our method partitions the reaction space of central and intermediary metabolism into discrete, interconnected components that can be assembled and verified in isolation from each other, and then integrated and verified at the level of their interconnectivity. We have developed a database of components that are common across organisms, and have created tools for automatically assembling appropriate components for a particular organism based on the metabolic pathways encoded in the organism's genome. This focuses manual efforts on that portion of an organism's metabolism that is not yet represented in the database. We have demonstrated the efficacy of our method by reverse-engineering and automatically regenerating the reaction network from a published genome-scale metabolic model for Staphylococcus aureus. Additionally, we have verified that our method capitalizes on the database of common reaction network components created for S. aureus, by using these components to generate substantially complete reconstructions of the reaction networks from three other published metabolic models (Escherichia coli, Helicobacter pylori, and Lactococcus lactis). We have implemented our tools and database within the SEED, an open-source software environment for comparative genome annotation and analysis. Our method sets the
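
    One step the method automates, gap identification, can be illustrated on a toy bipartite reaction graph (the reactions are invented): dead-end metabolites that are produced but never consumed, or consumed but never produced, mark candidate gaps to fill.

        # Find dead-end metabolites in a toy metabolite/reaction digraph.
        import networkx as nx

        G = nx.DiGraph()
        reactions = {"r1": (["glc"], ["g6p"]),
                     "r2": (["g6p"], ["f6p"]),
                     "r3": (["f6p"], ["pyr"])}
        for rxn, (subs, prods) in reactions.items():
            for s in subs:
                G.add_edge(s, rxn)
            for p in prods:
                G.add_edge(rxn, p)

        metabolites = {n for n in G if n not in reactions}
        never_consumed = [m for m in metabolites if G.out_degree(m) == 0]
        never_produced = [m for m in metabolites if G.in_degree(m) == 0]
        print("dead ends:", never_consumed, "| root inputs:", never_produced)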

  9. Toward the automated generation of genome-scale metabolic networks in the SEED

    Directory of Open Access Journals (Sweden)

    Gould John

    2007-04-01

    Background: Current methods for the automated generation of genome-scale metabolic networks focus on genome annotation and preliminary biochemical reaction network assembly, but do not adequately address the process of identifying and filling gaps in the reaction network, and verifying that the network is suitable for systems level analysis. Thus, current methods are only sufficient for generating draft-quality networks, and refinement of the reaction network is still largely a manual, labor-intensive process. Results: We have developed a method for generating genome-scale metabolic networks that produces substantially complete reaction networks, suitable for systems level analysis. Our method partitions the reaction space of central and intermediary metabolism into discrete, interconnected components that can be assembled and verified in isolation from each other, and then integrated and verified at the level of their interconnectivity. We have developed a database of components that are common across organisms, and have created tools for automatically assembling appropriate components for a particular organism based on the metabolic pathways encoded in the organism's genome. This focuses manual efforts on that portion of an organism's metabolism that is not yet represented in the database. We have demonstrated the efficacy of our method by reverse-engineering and automatically regenerating the reaction network from a published genome-scale metabolic model for Staphylococcus aureus. Additionally, we have verified that our method capitalizes on the database of common reaction network components created for S. aureus, by using these components to generate substantially complete reconstructions of the reaction networks from three other published metabolic models (Escherichia coli, Helicobacter pylori, and Lactococcus lactis). We have implemented our tools and database within the SEED, an open-source software environment for comparative

  10. Automated analysis of Physarum network structure and dynamics

    Science.gov (United States)

    Fricker, Mark D.; Akita, Dai; Heaton, Luke LM; Jones, Nick; Obara, Boguslaw; Nakagaki, Toshiyuki

    2017-06-01

    We evaluate different ridge-enhancement and segmentation methods to automatically extract the network architecture from time-series of Physarum plasmodia withdrawing from an arena via a single exit. Whilst all methods gave reasonable results, judged by precision-recall analysis against a ground-truth skeleton, the mean phase angle (Feature Type) from intensity-independent, phase-congruency edge enhancement and watershed segmentation was the most robust to variation in threshold parameters. The resultant single pixel-wide segmented skeleton was converted to a graph representation as a set of weighted adjacency matrices containing the physical dimensions of each vein, and the inter-vein regions. We encapsulate the complete image processing and network analysis pipeline in a downloadable software package, and provide an extensive set of metrics that characterise the network structure, including hierarchical loop decomposition to analyse the nested structure of the developing network. In addition, the change in volume for each vein and intervening plasmodial sheet was used to predict the net flow across the network. The scaling relationships between predicted current, speed and shear force with vein radius were consistent with predictions from Murray’s law. This work was presented at PhysNet 2015.

  11. Automated analysis of Physarum network structure and dynamics

    International Nuclear Information System (INIS)

    Fricker, Mark D; Heaton, Luke LM; Akita, Dai; Jones, Nick; Obara, Boguslaw; Nakagaki, Toshiyuki

    2017-01-01

    We evaluate different ridge-enhancement and segmentation methods to automatically extract the network architecture from time-series of Physarum plasmodia withdrawing from an arena via a single exit. Whilst all methods gave reasonable results, judged by precision-recall analysis against a ground-truth skeleton, the mean phase angle (Feature Type) from intensity-independent, phase-congruency edge enhancement and watershed segmentation was the most robust to variation in threshold parameters. The resultant single pixel-wide segmented skeleton was converted to a graph representation as a set of weighted adjacency matrices containing the physical dimensions of each vein, and the inter-vein regions. We encapsulate the complete image processing and network analysis pipeline in a downloadable software package, and provide an extensive set of metrics that characterise the network structure, including hierarchical loop decomposition to analyse the nested structure of the developing network. In addition, the change in volume for each vein and intervening plasmodial sheet was used to predict the net flow across the network. The scaling relationships between predicted current, speed and shear force with vein radius were consistent with predictions from Murray’s law. This work was presented at PhysNet 2015. (paper)

  12. A Method for Automated Planning of FTTH Access Network Infrastructures

    DEFF Research Database (Denmark)

    Riaz, Muhammad Tahir; Pedersen, Jens Myrup; Madsen, Ole Brun

    2005-01-01

    In this paper a method for automated planning of Fiber to the Home (FTTH) access networks is proposed. We introduce a systematic approach for planning access network infrastructure. GIS data and a set of algorithms are employed to make the planning process more automatic. The method explains...... method. The method, however, does not fully automate the planning but makes the planning process significantly faster. The results and discussion are presented and a conclusion is given at the end....
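
    A first-cut sketch of one planning step, not the authors' algorithm: connect premises to a central office with a minimum spanning tree over GIS-like coordinates as a rough duct layout; the points and the MST criterion are illustrative assumptions.

        # Minimum spanning tree over hypothetical premise coordinates.
        import itertools
        import math
        import networkx as nx

        points = {"CO": (0, 0), "h1": (2, 1), "h2": (3, 4), "h3": (5, 2), "h4": (1, 5)}
        G = nx.Graph()
        for (a, pa), (b, pb) in itertools.combinations(points.items(), 2):
            G.add_edge(a, b, weight=math.dist(pa, pb))

        mst = nx.minimum_spanning_tree(G)
        print(sorted(mst.edges(data="weight")))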

  13. EddyOne automated analysis of PWR/WWER steam generator tubes eddy current data

    International Nuclear Information System (INIS)

    Nadinic, B.; Vanjak, Z.

    2004-01-01

    The INETEC Institute for Nuclear Technology developed a software package called EddyOne, which has an option for automated analysis of bobbin coil eddy current data. During its development and on-site use, many valuable lessons were learned, which are described in this article. Accordingly, the following topics are covered: general requirements for automated analysis of bobbin coil eddy current data; main approaches to automated analysis; multi-rule algorithms for data screening; landmark detection algorithms as a prerequisite for automated analysis (threshold algorithms and algorithms based on neural network principles); field experience with the EddyOne software; development directions (use of artificial intelligence with self-learning abilities for indication detection and sizing); automated analysis software qualification; and conclusions. Special emphasis is given to results obtained on different types of steam generators, condensers and heat exchangers. Such results are then compared with results obtained by other automated software vendors, giving a clear advantage to the INETEC approach. It has to be pointed out that INETEC field experience was also collected on WWER steam generators, which is so far a unique experience. (author)

  14. Automated software analysis of nuclear core discharge data

    International Nuclear Information System (INIS)

    Larson, T.W.; Halbig, J.K.; Howell, J.A.; Eccleston, G.W.; Klosterbuer, S.F.

    1993-03-01

    Monitoring the fueling process of an on-load nuclear reactor is a full-time job for nuclear safeguarding agencies. Nuclear core discharge monitors (CDMs) can provide continuous, unattended recording of the reactor's fueling activity for later, qualitative review by a safeguards inspector. A quantitative analysis of this collected data could prove to be a great asset to inspectors because more information can be extracted from the data and the analysis time can be reduced considerably. This paper presents a prototype for an automated software analysis system capable of identifying when fuel bundle pushes occurred and of monitoring the power level of the reactor. Neural network models were developed for calculating the region on the reactor face from which the fuel was discharged and for predicting the burnup. These models were created and tested using actual data collected from a CDM system at an on-load reactor facility. Collectively, these automated quantitative analysis programs could help safeguarding agencies to gain a better perspective on the complete picture of the fueling activity of an on-load nuclear reactor. This type of system can provide a cost-effective solution for automated monitoring of on-load reactors, significantly reducing time and effort.

  15. User-friendly Establishment of Trust in Distributed Home Automation Networks

    DEFF Research Database (Denmark)

    Solberg Hjorth, Theis; Torbensen, Rune; Madsen, Per Printz

    2014-01-01

    Current wireless technologies use a variety of methods to locally exchange and verify credentials between devices to establish trusted relationships. Scenarios in home automation networks also require this capability over the Internet, but the necessary involvement of non-expert users to set up...... these relationships can lead to misconfiguration or breaches of security. We outline a security system for Home Automation called Trusted Domain that can establish and maintain cryptographically secure relationships between devices connected via IP-based networks and the Internet. Trust establishment is presented...
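
    Conceptually (this is not the Trusted Domain protocol), once two devices share a secret, either side can verify the other over IP with an HMAC challenge-response; the sketch below uses only the Python standard library.

        # HMAC challenge-response between two devices sharing a secret key.
        import hashlib
        import hmac
        import secrets

        shared_key = secrets.token_bytes(32)       # established during trust setup

        # Verifier side: issue a fresh random challenge.
        challenge = secrets.token_bytes(16)

        # Prover side: answer with HMAC(key, challenge).
        answer = hmac.new(shared_key, challenge, hashlib.sha256).digest()

        # Verifier side: recompute and compare in constant time.
        expected = hmac.new(shared_key, challenge, hashlib.sha256).digest()
        print("device trusted:", hmac.compare_digest(answer, expected))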

  16. SWOT Analysis of Automation for Cash and Accounts Control in Construction

    OpenAIRE

    Mariya Deriy

    2013-01-01

    The possibility of computerized control over accounting and information system data on cash and payments in a company's practical activity has been analyzed, provided that the problem of establishing a well-functioning single computer network between the different units of a developing company is solved. The current state of control organization and the possibility of its automation have been reviewed. A SWOT analysis of control automation to identify its strengths and weaknesses, obstac...

  17. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  18. A Holistic Approach to ZigBee Performance Enhancement for Home Automation Networks

    Directory of Open Access Journals (Sweden)

    August Betzler

    2014-08-01

    Wireless home automation networks are gaining importance for smart homes. In this ambit, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected with the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible performance gains by implementing and testing alternative settings. The evaluations are carried out in a testbed of 57 TelosB motes. The results show that considerable performance improvements can be achieved by using alternative protocol stack configurations. From these results, we derive two improved protocol stack configurations for ZigBee wireless home automation networks that are validated in various network scenarios. In our experiments, these improved configurations yield a relative packet delivery ratio increase of up to 33.6%, a delay decrease of up to 66.6% and an improvement of the energy efficiency for battery powered devices of up to 48.7%, obtainable without incurring any overhead to the network.

  19. A Holistic Approach to ZigBee Performance Enhancement for Home Automation Networks

    Science.gov (United States)

    Betzler, August; Gomez, Carles; Demirkol, Ilker; Paradells, Josep

    2014-01-01

    Wireless home automation networks are gaining importance for smart homes. In this ambit, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected with the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible performance gains by implementing and testing alternative settings. The evaluations are carried out in a testbed of 57 TelosB motes. The results show that considerable performance improvements can be achieved by using alternative protocol stack configurations. From these results, we derive two improved protocol stack configurations for ZigBee wireless home automation networks that are validated in various network scenarios. In our experiments, these improved configurations yield a relative packet delivery ratio increase of up to 33.6%, a delay decrease of up to 66.6% and an improvement of the energy efficiency for battery powered devices of up to 48.7%, obtainable without incurring any overhead to the network. PMID:25196004

  20. A holistic approach to ZigBee performance enhancement for home automation networks.

    Science.gov (United States)

    Betzler, August; Gomez, Carles; Demirkol, Ilker; Paradells, Josep

    2014-08-14

    Wireless home automation networks are gaining importance for smart homes. In this ambit, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected with the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible performance gains by implementing and testing alternative settings. The evaluations are carried out in a testbed of 57 TelosB motes. The results show that considerable performance improvements can be achieved by using alternative protocol stack configurations. From these results, we derive two improved protocol stack configurations for ZigBee wireless home automation networks that are validated in various network scenarios. In our experiments, these improved configurations yield a relative packet delivery ratio increase of up to 33.6%, a delay decrease of up to 66.6% and an improvement of the energy efficiency for battery powered devices of up to 48.7%, obtainable without incurring any overhead to the network.

  1. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  2. Automated Library Networking in American Public Community College Learning Resources Centers.

    Science.gov (United States)

    Miah, Adbul J.

    1994-01-01

    Discusses the need for community colleges to assess their participation in automated library networking systems (ALNs). Presents results of questionnaires sent to 253 community college learning resource center directors to determine their use of ALNs. Reviews benefits of automation and ALN activities, planning and communications, institution size,…

  3. Steam generator automated eddy current data analysis: A benchmarking study. Final report

    International Nuclear Information System (INIS)

    Brown, S.D.

    1998-12-01

    The eddy current examination of steam generator tubes is a very demanding process. Challenges include: complex signal analysis, massive amount of data to be reviewed quickly with extreme precision and accuracy, shortages of data analysts during peak periods, and the desire to reduce examination costs. One method to address these challenges is by incorporating automation into the data analysis process. Specific advantages, which automated data analysis has the potential to provide, include the ability to analyze data more quickly, consistently and accurately than can be performed manually. Also, automated data analysis can potentially perform the data analysis function with significantly smaller levels of analyst staffing. Despite the clear advantages that an automated data analysis system has the potential to provide, no automated system has been produced and qualified that can perform all of the functions that utility engineers demand. This report investigates the current status of automated data analysis, both at the commercial and developmental level. A summary of the various commercial and developmental data analysis systems is provided which includes the signal processing methodologies used and, where available, the performance data obtained for each system. Also, included in this report is input from seventeen research organizations regarding the actions required and obstacles to be overcome in order to bring automatic data analysis from the laboratory into the field environment. In order to provide assistance with ongoing and future research efforts in the automated data analysis arena, the most promising approaches to signal processing are described in this report. These approaches include: wavelet applications, pattern recognition, template matching, expert systems, artificial neural networks, fuzzy logic, case based reasoning and genetic algorithms. Utility engineers and NDE researchers can use this information to assist in developing automated data

  4. Contaminant analysis automation, an overview

    International Nuclear Information System (INIS)

    Hollen, R.; Ramos, O. Jr.

    1996-01-01

    To meet the environmental restoration and waste minimization goals of government and industry, several government laboratories, universities, and private companies have formed the Contaminant Analysis Automation (CAA) team. The goal of this consortium is to design and fabricate robotics systems that standardize and automate the hardware and software of the most common environmental chemical methods. In essence, the CAA team takes conventional, regulatory-approved (EPA Methods) chemical analysis processes and automates them. The automation consists of standard laboratory modules (SLMs) that perform the work in a much more efficient, accurate, and cost-effective manner.

  5. Automated and comprehensive link engineering supporting branched, ring, and mesh network topologies

    Science.gov (United States)

    Farina, J.; Khomchenko, D.; Yevseyenko, D.; Meester, J.; Richter, A.

    2016-02-01

    Link design, while relatively easy in the past, can become quite cumbersome with complex channel plans and equipment configurations. The task of designing optical transport systems and selecting equipment is often performed by an applications or sales engineer using simple tools, such as custom Excel spreadsheets. Eventually, every individual has their own version of the spreadsheet as well as their own methodology for building the network. This approach becomes unmanageable very quickly and leads to mistakes, bending of the engineering rules and installations that do not perform as expected. We demonstrate a comprehensive planning environment, which offers an efficient approach to unify, control and expedite the design process by controlling libraries of equipment and engineering methodologies, automating the process and providing the analysis tools necessary to predict system performance throughout the system and for all channels. In addition to the placement of EDFAs and DCEs, performance analysis metrics are provided at every step of the way. Metrics that can be tracked include power, CD and OSNR, SPM, XPM, FWM and SBS. Automated routine steps assist in design aspects such as equalization, padding and gain setting for EDFAs, the placement of ROADMs and transceivers, and creating regeneration points. DWDM networks consisting of a large number of nodes and repeater huts, interconnected in linear, branched, mesh and ring network topologies, can be designed much faster when compared with conventional design methods. Using flexible templates for all major optical components, our technology-agnostic planning approach supports the constant advances in optical communications.
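
    The per-span OSNR tracking mentioned above can be approximated with the textbook formula OSNR_dB ~ 58 + P_launch - L_span - NF - 10*log10(N) for N identical spans whose EDFAs exactly compensate the span loss; the powers, loss and noise figure below are hypothetical, and this is not the described tool's engine.

        # Rough OSNR accumulation over N identical amplified spans.
        import math

        p_launch_dbm = 0.0      # hypothetical launch power per channel
        span_loss_db = 22.0     # hypothetical span loss
        nf_db = 5.5             # hypothetical EDFA noise figure

        for n_spans in (1, 5, 10, 20):
            osnr = 58.0 + p_launch_dbm - span_loss_db - nf_db - 10 * math.log10(n_spans)
            print(f"{n_spans:2d} spans -> OSNR ~ {osnr:.1f} dB")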

  6. Future Control and Automation : Proceedings of the 2nd International Conference on Future Control and Automation

    CERN Document Server

    2012-01-01

    This volume, Future Control and Automation (Volume 2), includes the best papers from the 2012 2nd International Conference on Future Control and Automation (ICFCA 2012), held on July 1-2, 2012, in Changsha, China. Future control and automation is the use of control systems and information technologies to reduce the need for human work in the production of goods and services. This volume can be divided into six sessions on the basis of the classification of manuscripts considered, listed as follows: Mathematical Modeling, Analysis and Computation; Control Engineering; Reliable Networks Design; Vehicular Communications and Networking; Automation and Mechatronics.

  7. Distributed microprocessor automation network for synthesizing radiotracers used in positron emission tomography

    International Nuclear Information System (INIS)

    Russell, J.A.G.; Alexoff, D.L.; Wolf, A.P.

    1984-01-01

    This presentation describes an evolving distributed microprocessor network for automating the routine production synthesis of radiotracers used in Positron Emission Tomography. We first present a brief overview of the PET method for measuring biological function, and then outline the general procedure for producing a radiotracer. The paper identifies several reasons for our automating the syntheses of these compounds. There is a description of the distributed microprocessor network architecture chosen and the rationale for that choice. Finally, we speculate about how this network may be exploited to extend the power of the PET method from the large university or National Laboratory to the biomedical research and clinical community at large. 20 refs. (DT)

  8. A framework for automated service composition in collaborative networks

    NARCIS (Netherlands)

    Afsarmanesh, H.; Sargolzaei, M.; Shadi, M.

    2012-01-01

    This paper proposes a novel framework for automated software service composition that can significantly support and enhance collaboration among enterprises in the service provision industry, such as in tourism, insurance, and e-commerce collaborative networks (CNs). Our proposed framework is founded on

  9. Problems and Prospects in Automation and Networking in Libraries in India

    OpenAIRE

    Pradip, Joshi; Nikose, S.M.

    2010-01-01

    This article presents the scenario of automation and networking of academic libraries in India, which are still in their formative stages. The reasons for, prerequisites of, and benefits of networking are given. Networking systems at the national and local levels are described, as are the salient features of INFLIBNET, which has been functioning since 1988. There are also three metropolitan networks, viz., DELNET, CALIBNET, and BONET. The libraries of the three metropolitan cities are already reaping the ben...

  10. Constructing an Intelligent Patent Network Analysis Method

    Directory of Open Access Journals (Sweden)

    Chao-Chan Wu

    2012-11-01

    Patent network analysis, an advanced method of patent analysis, is a useful tool for technology management. This method visually displays all the relationships among the patents and enables the analysts to intuitively comprehend the overview of a set of patents in the field of the technology being studied. Although patent network analysis possesses relative advantages different from traditional methods of patent analysis, it is subject to several crucial limitations. To overcome the drawbacks of the current method, this study proposes a novel patent analysis method, called the intelligent patent network analysis method, to make a visual network with great precision. Based on artificial intelligence techniques, the proposed method provides an automated procedure for searching patent documents, extracting patent keywords, and determining the weight of each patent keyword in order to generate a sophisticated visualization of the patent network. This study proposes a detailed procedure for generating an intelligent patent network that is helpful for improving the efficiency and quality of patent analysis. Furthermore, patents in the field of Carbon Nanotube Backlight Unit (CNT-BLU) were analyzed to verify the utility of the proposed method.
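
    The abstract does not specify the weighting or network-construction algorithms, so the sketch below is only a plausible stand-in: TF-IDF keyword weights plus a keyword co-occurrence network, in plain Python. The toy patent token lists are invented.

        import math
        from collections import Counter
        from itertools import combinations

        def tfidf(docs):
            # docs: list of token lists; returns one {term: weight} dict per patent.
            df = Counter(t for doc in docs for t in set(doc))
            n = len(docs)
            return [{t: (c / len(doc)) * math.log(n / df[t])
                     for t, c in Counter(doc).items()} for doc in docs]

        def cooccurrence_edges(docs, min_count=2):
            # Edge weight = number of patents in which two keywords co-occur.
            pairs = Counter()
            for doc in docs:
                for a, b in combinations(sorted(set(doc)), 2):
                    pairs[(a, b)] += 1
            return {e: w for e, w in pairs.items() if w >= min_count}

        patents = [["cnt", "backlight", "cathode"], ["cnt", "backlight", "phosphor"],
                   ["cnt", "emitter", "cathode"]]
        weights = tfidf(patents)
        print("heaviest keyword in patent 0:", max(weights[0], key=weights[0].get))
        print("network edges:", cooccurrence_edges(patents))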

  11. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
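
    The abstract names identification and mitigation trees without giving their structure, so the following is a hedged toy rendering of the idea: rules that match properties of data-flow-diagram elements and pair each identified threat with an advised mitigation. All element names, properties, and rules are invented.

        from dataclasses import dataclass, field

        @dataclass
        class Element:
            # A node in a data-flow diagram, in the spirit of the Microsoft SDL notation.
            name: str
            kind: str                      # "process" | "data_store" | "data_flow"
            properties: set = field(default_factory=set)

        @dataclass
        class ThreatRule:
            # One branch of a (simplified) identification tree: if the element
            # matches, the threat applies and the paired mitigation is advised.
            threat: str
            kind: str
            requires: set
            mitigation: str

            def matches(self, e):
                return e.kind == self.kind and self.requires <= e.properties

        RULES = [
            ThreatRule("credential sniffing", "data_flow",
                       {"crosses_trust_boundary", "plaintext"}, "encrypt the channel (TLS)"),
            ThreatRule("tampering at rest", "data_store",
                       {"unauthenticated_writes"}, "require authenticated, authorized writes"),
        ]

        dfd = [Element("login flow", "data_flow", {"crosses_trust_boundary", "plaintext"}),
               Element("user db", "data_store", {"unauthenticated_writes"})]
        for e in dfd:
            for r in RULES:
                if r.matches(e):
                    print(f"{e.name}: {r.threat} -> {r.mitigation}")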

  12. Distributed Microprocessor Automation Network for Synthesizing Radiotracers Used in Positron Emission Tomography [PET

    Science.gov (United States)

    Russell, J. A. G.; Alexoff, D. L.; Wolf, A. P.

    1984-09-01

    This presentation describes an evolving distributed microprocessor network for automating the routine production synthesis of radiotracers used in Positron Emission Tomography. We first present a brief overview of the PET method for measuring biological function, and then outline the general procedure for producing a radiotracer. The paper identifies several reasons for our automating the syntheses of these compounds. There is a description of the distributed microprocessor network architecture chosen and the rationale for that choice. Finally, we speculate about how this network may be exploited to extend the power of the PET method from the large university or National Laboratory to the biomedical research and clinical community at large. (DT)

  13. User-friendly establishment of trust in distributed home automation networks

    DEFF Research Database (Denmark)

    Hjorth, Theis Solberg; Madsen, Per Printz; Torbensen, Rune

    2012-01-01

    Current wireless technologies use a variety of methods to locally exchange and verify credentials between devices to establish trusted relationships. Scenarios in home automation networks also require this capability over the Internet, but the necessary involvement of non-expert users to set up these relationships can lead to misconfiguration or breaches of security. We outline a security system for Home Automation called Trusted Domain that can establish and maintain cryptographically secure relationships between devices connected via IP-based networks and the Internet. Trust establishment is presented … of predefined pictograms. This method is designed to scale from smart-phones and tablets down to low-resource embedded systems. The presented approach is supported by an extensive literature study, and the ease of use and feasibility of the method have been indicated through a preliminary user study.

  14. Automated X-ray image analysis for cargo security: Critical review and future promise.

    Science.gov (United States)

    Rogers, Thomas W; Jaccard, Nicolas; Morton, Edward J; Griffin, Lewis D

    2017-01-01

    We review the relatively immature field of automated image analysis for X-ray cargo imagery. There is increasing demand for automated analysis methods that can assist in the inspection and selection of containers, due to the ever-growing volumes of traded cargo and the increasing concerns that customs- and security-related threats are being smuggled across borders by organised crime and terrorist networks. We split the field into the classical pipeline of image preprocessing and image understanding. Preprocessing includes: image manipulation; quality improvement; Threat Image Projection (TIP); and material discrimination and segmentation. Image understanding includes: Automated Threat Detection (ATD); and Automated Contents Verification (ACV). We identify several gaps in the literature that need to be addressed and propose ideas for future research. Where the current literature is sparse we borrow from the single-view, multi-view, and CT X-ray baggage domains, which have some characteristics in common with X-ray cargo.

  15. Optimizing a Drone Network to Deliver Automated External Defibrillators.

    Science.gov (United States)

    Boutilier, Justin J; Brooks, Steven C; Janmohamed, Alyf; Byers, Adam; Buick, Jason E; Zhan, Cathy; Schoellig, Angela P; Cheskes, Sheldon; Morrison, Laurie J; Chan, Timothy C Y

    2017-06-20

    Public access defibrillation programs can improve survival after out-of-hospital cardiac arrest, but automated external defibrillators (AEDs) are rarely available for bystander use at the scene. Drones are an emerging technology that can deliver an AED to the scene of an out-of-hospital cardiac arrest for bystander use. We hypothesize that a drone network designed with the aid of a mathematical model combining both optimization and queuing can reduce the time to AED arrival. We applied our model to 53 702 out-of-hospital cardiac arrests that occurred in the 8 regions of the Toronto Regional RescuNET between January 1, 2006, and December 31, 2014. Our primary analysis quantified the drone network size required to deliver an AED 1, 2, or 3 minutes faster than historical median 911 response times for each region independently. A secondary analysis quantified the reduction in drone resources required if RescuNET was treated as a large coordinated region. The region-specific analysis determined that 81 bases and 100 drones would be required to deliver an AED ahead of median 911 response times by 3 minutes. In the most urban region, the 90th percentile of the AED arrival time was reduced by 6 minutes and 43 seconds relative to historical 911 response times in the region. In the most rural region, the 90th percentile was reduced by 10 minutes and 34 seconds. A single coordinated drone network across all regions required 39.5% fewer bases and 30.0% fewer drones to achieve similar AED delivery times. An optimized drone network designed with the aid of a novel mathematical model can substantially reduce the AED delivery time to an out-of-hospital cardiac arrest event. © 2017 American Heart Association, Inc.
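
    The paper's model combines facility-location optimization with queuing; as a much smaller stand-in, the sketch below greedily picks candidate drone bases until every historical arrest location can be reached within a response-time budget. Coordinates, drone speed, and the budget are all invented for illustration.

        import math

        def covers(base, arrest, speed_mps=25.0, budget_s=180.0):
            # A base covers an arrest if straight-line flight time fits the budget.
            dist = math.hypot(base[0] - arrest[0], base[1] - arrest[1])
            return dist / speed_mps <= budget_s

        def greedy_bases(candidates, arrests):
            uncovered, chosen = set(range(len(arrests))), []
            while uncovered:
                best = max(candidates,
                           key=lambda b: sum(covers(b, arrests[i]) for i in uncovered))
                gained = {i for i in uncovered if covers(best, arrests[i])}
                if not gained:
                    break  # remaining arrests are unreachable from any candidate
                chosen.append(best)
                uncovered -= gained
            return chosen

        arrests = [(0, 0), (1000, 4000), (3000, 500), (4000, 4200)]     # metres
        candidates = [(0, 0), (2000, 2000), (4000, 4000)]
        print("chosen bases:", greedy_bases(candidates, arrests))

    The published model additionally uses queuing to decide how many drones each base needs so that one is available when a call arrives; the heuristic above addresses coverage only.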

  16. Automated road network extraction from high spatial resolution multi-spectral imagery

    Science.gov (United States)

    Zhang, Qiaoping

    For the last three decades, the Geomatics Engineering and Computer Science communities have considered automated road network extraction from remotely-sensed imagery to be a challenging and important research topic. The main objective of this research is to investigate the theory and methodology of automated feature extraction for image-based road database creation, refinement or updating, and to develop a series of algorithms for road network extraction from high resolution multi-spectral imagery. The proposed framework for road network extraction from multi-spectral imagery begins with an image segmentation using the k-means algorithm. This step mainly concerns the exploitation of the spectral information for feature extraction. The road cluster is automatically identified using a fuzzy classifier based on a set of predefined road surface membership functions. These membership functions are established based on the general spectral signature of road pavement materials and the corresponding normalized digital numbers on each multi-spectral band. Shape descriptors of the Angular Texture Signature are defined and used to reduce the misclassifications between roads and other spectrally similar objects (e.g., crop fields, parking lots, and buildings). An iterative and localized Radon transform is developed for the extraction of road centerlines from the classified images. The purpose of the transform is to accurately and completely detect the road centerlines. It is able to find short, long, and even curvilinear lines. The input image is partitioned into a set of subset images called road component images. An iterative Radon transform is locally applied to each road component image. At each iteration, road centerline segments are detected based on an accurate estimation of the line parameters and line widths. Three localization approaches are implemented and compared using qualitative and quantitative methods. Finally, the road centerline segments are grouped into a
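
    As an illustration of the first stage only, the sketch below runs a plain k-means over per-pixel spectra with numpy; the band values are synthetic, and the subsequent fuzzy road-surface classification and Radon-transform centerline extraction are not reproduced.

        import numpy as np

        def kmeans(X, k, iters=50, seed=0):
            # Plain k-means on rows of X (pixels x spectral bands).
            rng = np.random.default_rng(seed)
            centers = X[rng.choice(len(X), size=k, replace=False)]
            for _ in range(iters):
                labels = ((X[:, None, :] - centers[None]) ** 2).sum(-1).argmin(axis=1)
                for j in range(k):
                    if np.any(labels == j):
                        centers[j] = X[labels == j].mean(axis=0)
            return labels, centers

        rng = np.random.default_rng(1)
        pixels = np.vstack([rng.normal(0.2, 0.02, (50, 4)),    # vegetation-like spectra
                            rng.normal(0.6, 0.02, (50, 4))])   # pavement-like spectra
        labels, centers = kmeans(pixels, k=2)
        # A fuzzy road classifier would now score each cluster mean against
        # predefined road-surface membership functions, band by band.
        print("cluster means per band:\n", centers)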

  17. An automated approach to network features of protein structure ensembles

    Science.gov (United States)

    Bhattacharyya, Moitrayee; Bhat, Chanda R; Vishveshwara, Saraswathi

    2013-01-01

    Network theory applied to protein structures provides insights into numerous problems of biological relevance. The explosion in structural data available from PDB and simulations establishes a need to introduce a standalone-efficient program that assembles network concepts/parameters under one hood in an automated manner. Herein, we discuss the development/application of an exhaustive, user-friendly, standalone program package named PSN-Ensemble, which can handle structural ensembles generated through molecular dynamics (MD) simulation/NMR studies or from multiple X-ray structures. The novelty in network construction lies in the explicit consideration of side-chain interactions among amino acids. The program evaluates network parameters dealing with topological organization and long-range allosteric communication. The introduction of a flexible weighing scheme in terms of residue pairwise cross-correlation/interaction energy in PSN-Ensemble brings in dynamical/chemical knowledge into the network representation. Also, the results are mapped on a graphical display of the structure, allowing an easy access of network analysis to a general biological community. The potential of PSN-Ensemble toward examining structural ensemble is exemplified using MD trajectories of an ubiquitin-conjugating enzyme (UbcH5b). Furthermore, insights derived from network parameters evaluated using PSN-Ensemble for single-static structures of active/inactive states of β2-adrenergic receptor and the ternary tRNA complexes of tyrosyl tRNA synthetases (from organisms across kingdoms) are discussed. PSN-Ensemble is freely available from http://vishgraph.mbu.iisc.ernet.in/PSN-Ensemble/psn_index.html. PMID:23934896
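
    PSN-Ensemble's own side-chain interaction weighting is not reproduced here; the following is only a generic residue-contact network built with numpy and networkx, with random coordinates standing in for a structure, to show the flavor of the graph quantities involved.

        import numpy as np
        import networkx as nx

        def residue_network(coords, cutoff=6.5):
            # coords: (n_residues, 3) representative-atom positions (angstroms).
            # Connect residues whose representative atoms lie within `cutoff`.
            g = nx.Graph()
            g.add_nodes_from(range(len(coords)))
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            for i in range(len(coords)):
                for j in range(i + 1, len(coords)):
                    if d[i, j] <= cutoff:
                        # networkx treats `weight` as a path length in centrality code
                        g.add_edge(i, j, weight=float(d[i, j]))
            return g

        coords = np.random.default_rng(0).uniform(0, 20, size=(30, 3))
        g = residue_network(coords)
        bc = nx.betweenness_centrality(g, weight="weight")
        print("top communication residues:", sorted(bc, key=bc.get, reverse=True)[:3])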

  18. SONG-China Project: A Global Automated Observation Network

    Science.gov (United States)

    Yang, Z. Z.; Lu, X. M.; Tian, J. F.; Zhuang, C. G.; Wang, K.; Deng, L. C.

    2017-09-01

    Driven by advancements in technology and scientific objectives, data acquisition in observational astronomy has changed greatly in recent years. Fully automated or even autonomous ground-based networks of telescopes have now become a trend in time-domain observational projects. The Stellar Observations Network Group (SONG) is an international collaboration with the participation and contribution of the Chinese astronomy community. The scientific goal of SONG is time-domain astrophysics such as asteroseismology and open cluster research. The SONG project aims to build a global network of 1 m telescopes equipped with high-precision and high-resolution spectrographs and two-channel lucky-imaging cameras. The Chinese initiative is to install a 50 cm binocular photometry telescope at each SONG node, sharing the network platform and infrastructure. This work focuses on the design and implementation, in technology and methodology, of SONG/50BiN, a typical ground-based network composed of multiple sites and a variety of instruments.

  19. Modeling Multiple Human-Automation Distributed Systems using Network-form Games

    Science.gov (United States)

    Brat, Guillaume

    2012-01-01

    The paper describes at a high level the network-form game framework (based on Bayesian networks and game theory), which can be used to model and analyze safety issues in large, distributed, mixed human-automation systems such as NextGen.

  20. Survey on Wireless Sensor Network Technologies for Industrial Automation: The Security and Quality of Service Perspectives

    Directory of Open Access Journals (Sweden)

    Delphine Christin

    2010-04-01

    Wireless Sensor Networks (WSNs) are gradually adopted in the industrial world due to their advantages over wired networks. In addition to saving cabling costs, WSNs widen the realm of environments feasible for monitoring. They thus add sensing and acting capabilities to objects in the physical world and allow for communication among these objects or with services in the future Internet. However, the acceptance of WSNs by the industrial automation community is impeded by open issues, such as security guarantees and provision of Quality of Service (QoS). To examine both of these perspectives, we select and survey relevant WSN technologies dedicated to industrial automation. We determine QoS requirements and carry out a threat analysis, which act as the basis of our evaluation of the current state of the art. According to the results of this evaluation, we identify and discuss open research issues.

  1. Building Automation Systems Using Wireless Sensor Networks: Radio Characteristics and Energy Efficient Communication Protocols

    NARCIS (Netherlands)

    Shu, F.; Halgamuge, M.N.; Chen, W.

    2009-01-01

    Building automation systems (BAS) are typically used to monitor and control heating, ventilation, and air conditioning (HVAC) systems, manage building facilities (e.g., lighting, safety, and security), and automate meter reading. In recent years, the technology of wireless sensor network (WSN) has

  2. SensorScheme: Supply Chain Management Automation using Wireless Sensor Networks

    NARCIS (Netherlands)

    Evers, L.; Havinga, Paul J.M.; Kuper, Jan; Lijding, M.E.M.; Meratnia, Nirvana

    2007-01-01

    The supply chain management business can benefit greatly from automation, as recent developments with RFID technology show. The use of Wireless Sensor Network technology promises to bring the next leap in efficiency and quality of service. However, current WSN system software does not yet provide

  3. A Study on Integrated Control Network for Multiple Automation Services-1st year report

    Energy Technology Data Exchange (ETDEWEB)

    Hyun, D.H.; Park, B.S.; Kim, M.S.; Lim, Y.H.; Ahn, S.K. [Korea Electric Power Research Institute, Taejon (Korea)

    2002-07-01

    This report describes the development of the Integrated and Intelligent Gateway (IIG), which is under development. The network operating technique described in this report can identify the causes of communication faults and can avoid communication network faults in advance. Utility companies invest large amounts of money and time in supplying stable power. Since this is deeply related to the reliability of automation systems, it is natural to employ a fault-tolerant communication network for automation systems. Use of the network system developed in this report is not limited to DAS; it can be extended to many kinds of data services for customers. The report thus suggests a direction for communication network development. This 1st year report comprises the following: 1) the introduction and problems of DAS; 2) the configuration and functions of the IIG; 3) the protocols. (author). 27 refs., 73 figs., 6 tabs.

  4. Automated sensor networks to advance ocean science

    Science.gov (United States)

    Schofield, O.; Orcutt, J. A.; Arrott, M.; Vernon, F. L.; Peach, C. L.; Meisinger, M.; Krueger, I.; Kleinert, J.; Chao, Y.; Chien, S.; Thompson, D. R.; Chave, A. D.; Balasuriya, A.

    2010-12-01

    The National Science Foundation has funded the Ocean Observatories Initiative (OOI), which over the next five years will deploy infrastructure to expand scientists' ability to remotely study the ocean. The deployed infrastructure will be linked by a robust cyberinfrastructure (CI) that will integrate marine observatories into a coherent system-of-systems. OOI is committed to engaging the ocean sciences community during the construction phase. For the CI, this is being enabled by using a “spiral design strategy” allowing for input throughout the construction phase. In Fall 2009, the OOI CI development team used an existing ocean observing network in the Mid-Atlantic Bight (MAB) to test OOI CI software. The objective of this CI test was to aggregate data from ships, autonomous underwater vehicles (AUVs), shore-based radars, and satellites and make it available to five different data-assimilating ocean forecast models. Scientists used these multi-model forecasts to automate future glider missions in order to demonstrate the feasibility of two-way interactivity between the sensor web and predictive models. The CI software coordinated and prioritized the shared resources, allowed for the semi-automated reconfiguration of asset tasking, and thus enabled an autonomous execution of observation plans for the fixed and mobile observation platforms. Efforts were coordinated through a web portal that provided an access point for the observational data and model forecasts. Researchers could use the CI software in tandem with the web data portal to assess the performance of individual numerical model results, or multi-model ensembles, through real-time comparisons with satellite, shore-based radar, and in situ robotic measurements. The resulting sensor net will enable a new means to explore and study the world's oceans by providing scientists a responsive network that can be accessed via any wireless connection.

  5. Tele command and network automation: strategy and results; Telecomando e automacao de redes: estrategia e resultados

    Energy Technology Data Exchange (ETDEWEB)

    Bargigia, Angelo; Cerreti, Alberto; Lembo, Giorgio di; Rogai, Sergio; Veglio, Gianfranco [Enel Distribuzione Spa, Rome (Italy)

    2004-02-01

    This article presents the strategy adopted by ENEL Distribuzione (Italy) for telecommand and automation of the distribution network. The article describes the medium-term implementation program, based on the installation of remote terminal units that communicate over the GSM cellular network to transmit the collected data to the network control centers. A cost-versus-benefit analysis was conducted, and the results obtained are also evaluated.

  6. Automated Stellar Classification for Large Surveys with EKF and RBF Neural Networks

    Institute of Scientific and Technical Information of China (English)

    Ling Bai; Ping Guo; Zhan-Yi Hu

    2005-01-01

    An automated classification technique for large stellar surveys is proposed. It uses the extended Kalman filter as a feature selector and pre-classifier of the data, and radial basis function neural networks for the classification. Experiments with real data have shown that the correct classification rate can reach as high as 93%, which is quite satisfactory. When different system models are selected for the extended Kalman filter, the classification results are relatively stable. It is shown that, for this particular case, the result using the extended Kalman filter is better than that using principal component analysis.
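
    A minimal sketch of the second stage only: a radial basis function network with Gaussian units on randomly chosen prototype centers and a least-squares linear readout. The two-feature "spectra" are synthetic, and the EKF feature-selection stage is omitted.

        import numpy as np

        def rbf_features(X, centers, gamma=1.0):
            # Gaussian radial basis activations for each (sample, center) pair.
            d2 = ((X[:, None, :] - centers[None]) ** 2).sum(-1)
            return np.exp(-gamma * d2)

        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0, 0.3, (40, 2)),    # class-0 "spectra"
                       rng.normal(2, 0.3, (40, 2))])   # class-1 "spectra"
        y = np.array([0] * 40 + [1] * 40)
        centers = X[rng.choice(len(X), 8, replace=False)]
        Phi = rbf_features(X, centers)
        W, *_ = np.linalg.lstsq(Phi, np.eye(2)[y], rcond=None)  # one-hot targets
        pred = (Phi @ W).argmax(axis=1)
        print("training accuracy:", (pred == y).mean())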

  7. DESIGN OF BUILDING AUTOMATION BASED ON PROFIBUS-DP NETWORK

    Directory of Open Access Journals (Sweden)

    Cemal YILMAZ

    2006-02-01

    In this study, a building automation system has been designed using the Profibus-DP (Process Field Bus - Decentralized Periphery) network. In the study, fire alarm, intruder alarm, lighting, power, humidity, and temperature control have been implemented. The data from the building are transmitted to the Profibus-DP network via control points located in the flats and collected in the main control unit to achieve overall control of the system. The work provides optimum efficiency in energy consumption and in the control of power, security, temperature, and humidity.

  8. Artificial neural network-aided image analysis system for cell counting.

    Science.gov (United States)

    Sjöström, P J; Frydel, B R; Wahlberg, L U

    1999-05-01

    In histological preparations containing debris and synthetic materials, it is difficult to automate cell counting using standard image analysis tools, i.e., systems that rely on boundary contours, histogram thresholding, etc. In an attempt to mimic manual cell recognition, an automated cell counter was constructed using a combination of artificial intelligence and standard image analysis methods. Artificial neural network (ANN) methods were applied to digitized microscopy fields without pre-ANN feature extraction. A three-layer feed-forward network with extensive weight sharing in the first hidden layer was employed and trained on 1,830 examples using the error back-propagation algorithm on a Power Macintosh 7300/180 desktop computer. The optimal number of hidden neurons was determined and the trained system was validated by comparison with blinded human counts. System performance at 50x and 100x magnification was evaluated. The correlation index at 100x magnification neared person-to-person variability, while 50x magnification was not useful. The system was approximately six times faster than an experienced human. ANN-based automated cell counting in noisy histological preparations is feasible. Consistent histology and computer power are crucial for system performance. The system provides several benefits, such as speed of analysis and consistency, and frees up personnel for other tasks.

  9. Automated Negotiation for Resource Assignment in Wireless Surveillance Sensor Networks

    Directory of Open Access Journals (Sweden)

    Enrique de la Hoz

    2015-11-01

    Due to the low cost of CMOS IP-based cameras, wireless surveillance sensor networks have emerged as a new application of sensor networks, able to monitor public or private areas or even country borders. Since these networks are bandwidth-intensive and the radioelectric spectrum is limited, especially in unlicensed bands, it is mandatory to assign frequency channels in a smart manner. In this work, we propose the application of automated negotiation techniques for frequency assignment. Results show that these techniques are very suitable for the problem, being able to obtain the best solutions among the techniques with which we have compared them.

  10. The optimal number, type and location of devices in automation of electrical distribution networks

    Directory of Open Access Journals (Sweden)

    Popović Željko N.

    2015-01-01

    This paper presents a mixed-integer linear programming (MILP) model for determining the optimal number, type, and location of remotely controlled and supervised devices in distribution networks in the presence of distributed generators. The proposed model takes into consideration a number of different devices simultaneously (remotely controlled circuit breakers/reclosers, sectionalizing switches, remotely supervised and local fault passage indicators) along with the following: the expected outage cost to consumers and producers due to momentary and long-term interruptions; automated device expenses (capital investment, installation, and annual operation and maintenance costs); and the number and expenses of crews involved in the isolation and restoration process. Furthermore, other possible benefits of each automated device are also taken into account (e.g., benefits due to decreasing the cost of switching operations in normal conditions). The obtained numerical results emphasize the importance of considering different types of automation devices simultaneously. They also show that the proposed approach has the potential to improve the process of determining the best automation strategy in real-life distribution networks.
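
    To show the flavor of such a placement model, here is a deliberately tiny MILP in Python with the PuLP library: binary variables choose at most one device per feeder section, minimizing device cost minus avoided outage cost. All sections, costs, and benefit figures are invented; the published model is far richer.

        from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary, PULP_CBC_CMD

        sections = ["s1", "s2", "s3", "s4"]
        device_cost = {"recloser": 12.0, "switch": 4.0, "indicator": 1.5}   # k$/yr
        # invented avoided outage cost (k$/yr) if a device type sits on a section
        benefit = {("s1", "recloser"): 20, ("s1", "switch"): 8, ("s2", "switch"): 6,
                   ("s3", "indicator"): 3, ("s4", "recloser"): 15, ("s4", "indicator"): 2}

        prob = LpProblem("device_placement", LpMinimize)
        x = {(s, d): LpVariable(f"x_{s}_{d}", cat=LpBinary)
             for s in sections for d in device_cost}
        # objective: device expenses minus avoided outage cost
        prob += lpSum(device_cost[d] * x[s, d] for (s, d) in x) \
              - lpSum(benefit.get((s, d), 0) * x[s, d] for (s, d) in x)
        for s in sections:                      # at most one device per section
            prob += lpSum(x[s, d] for d in device_cost) <= 1
        prob.solve(PULP_CBC_CMD(msg=False))
        print([(s, d) for (s, d) in x if x[s, d].value() == 1])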

  11. Automated sample analysis and remediation

    International Nuclear Information System (INIS)

    Hollen, R.; Settle, F.

    1995-01-01

    The Contaminant Analysis Automation Project is developing an automated chemical analysis system to address the current needs of the US Department of Energy (DOE). These needs focus on the remediation of large amounts of radioactive and chemically hazardous wastes stored, buried and still being processed at numerous DOE sites. This paper outlines the advantages of the system under development, and details the hardware and software design. A prototype system for characterizing polychlorinated biphenyls in soils is also described

  12. NAP: The Network Analysis Profiler, a web tool for easier topological analysis and comparison of medium-scale biological networks.

    Science.gov (United States)

    Theodosiou, Theodosios; Efstathiou, Georgios; Papanikolaou, Nikolas; Kyrpides, Nikos C; Bagos, Pantelis G; Iliopoulos, Ioannis; Pavlopoulos, Georgios A

    2017-07-14

    Nowadays, due to the technological advances of high-throughput techniques, Systems Biology has seen a tremendous growth of data generation. With network analysis, looking at biological systems at a higher level in order to better understand a system, its topology, and the relationships between its components is of great importance. Gene expression, signal transduction, protein/chemical interactions, and biomedical literature co-occurrences are a few of the examples captured in biological network representations, where nodes represent certain bioentities and edges represent the connections between them. Today, many tools for network visualization and analysis are available. Nevertheless, most of them are standalone applications that often (i) burden users with computing and calculation time depending on the network's size and (ii) focus on handling, editing, and exploring a network interactively. While such functionality is of great importance, limited efforts have been made towards the comparison of the topological analysis of multiple networks. Network Analysis Provider (NAP) is a comprehensive web tool to automate network profiling and intra/inter-network topology comparison. It is designed to bridge the gap between network analysis, statistics, graph theory, and partially visualization in a user-friendly way. It is freely available and aims to become a very appealing tool for the broader community. It hosts a great plethora of topological analysis methods, such as node and edge rankings. A few of its powerful characteristics are: its ability to enable easy profile comparisons across multiple networks, find their intersection, and provide users with simplified, high-quality plots of any of the offered topological characteristics against any other within the same network. It is written in R and Shiny, it is based on the igraph library, and it is able to handle medium-scale weighted/unweighted, directed/undirected, and bipartite graphs. NAP is available at http://bioinformatics.med.uoc.gr/NAP.
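
    NAP itself is built on R/Shiny and igraph; purely as an illustration of the kind of cross-network topological profiling it automates, here is a small Python/networkx equivalent that tabulates a few descriptors for two synthetic graphs and intersects their edge sets.

        import networkx as nx

        def profile(g):
            # A few of the topological descriptors such a tool tabulates per network.
            return {
                "nodes": g.number_of_nodes(),
                "edges": g.number_of_edges(),
                "density": round(nx.density(g), 4),
                "avg_clustering": round(nx.average_clustering(g), 4),
                "top_degrees": sorted(dict(g.degree()).values(), reverse=True)[:3],
            }

        nets = {"net_A": nx.barabasi_albert_graph(200, 2, seed=1),
                "net_B": nx.erdos_renyi_graph(200, 0.02, seed=1)}
        for name, g in nets.items():
            print(name, profile(g))
        # intersection across networks, one of NAP's comparison features
        shared = set(nets["net_A"].edges()) & set(nets["net_B"].edges())
        print("shared edges:", len(shared))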

  13. Vibration analysis in nuclear power plant using neural networks

    International Nuclear Information System (INIS)

    Loskiewicz-Buczak, A.; Alguindigue, I.E.

    1993-01-01

    Vibration monitoring of components in nuclear power plants has been used for a number of years. This technique involves the analysis of vibration data coming from vital components of the plant to detect features which reflect the operational state of machinery. The analysis leads to the identification of potential failures and their causes, and makes it possible to perform efficient preventive maintenance. This paper documents the authors' work on the design of a vibration monitoring methodology enhanced by neural network technology. This technology provides an attractive complement to traditional vibration analysis because of the potential of neural networks to handle data which may be distorted or noisy. This paper describes three neural-network-based methods for the automation of some of the activities related to motion and vibration monitoring in engineering systems.

  14. Concept of a computer network architecture for complete automation of nuclear power plants

    International Nuclear Information System (INIS)

    Edwards, R.M.; Ray, A.

    1990-01-01

    The state of the art in automation of nuclear power plants has been largely limited to computerized data acquisition, monitoring, display, and recording of process signals. Complete automation of nuclear power plants, which would include plant operations, control, and management, fault diagnosis, and system reconfiguration with efficient and reliable man/machine interactions, has been projected as a realistic goal. This paper presents the concept of a computer network architecture that would use a high-speed optical data highway to integrate diverse, interacting, and spatially distributed functions that are essential for a fully automated nuclear power plant

  15. Application of neural networks to quantitative spectrometry analysis

    International Nuclear Information System (INIS)

    Pilato, V.; Tola, F.; Martinez, J.M.; Huver, M.

    1999-01-01

    Accurate quantitative analysis of complex spectra (fission and activation products) relies upon experts' knowledge. In some cases several hours, even days, of tedious calculations are needed. This is because current software is unable to solve deconvolution problems when several rays overlap. We have shown that such analysis can be correctly handled by a neural network, and the procedure can be automated with minimum laboratory measurements for network training, as long as all the elements of the analysed solution figure in the training set and provided that adequate scaling of input data is performed. Once the network has been trained, analysis is carried out in a few seconds. When submitted to an intercomparison test between several well-known laboratories, in which unknown quantities of 57Co, 58Co, 85Sr, 88Y, 131I, 139Ce, and 141Ce present in a sample had to be determined, the results yielded by our network placed it among the best. The method is described, including the experimental device and measurements, training set design, definition of the relevant input parameters, input data scaling, and network training. The main results are presented together with a statistical model allowing prediction of the network error.

  16. Hybrid digital signal processing and neural networks for automated diagnostics using NDE methods

    International Nuclear Information System (INIS)

    Upadhyaya, B.R.; Yan, W.

    1993-11-01

    The primary purpose of the current research was to develop an integrated approach combining information compression methods and artificial neural networks for the monitoring of plant components using nondestructive examination data. Specifically, data from eddy current inspection of heat exchanger tubing were utilized to evaluate this technology. The focus of the research was to develop and test various data compression methods (for eddy current data) and the performance of different neural network paradigms for defect classification and defect parameter estimation. Feedforward, fully connected neural networks, which use the back-propagation algorithm for network training, were implemented for defect classification and defect parameter estimation using a modular network architecture. A large eddy current tube inspection database was acquired from the Metals and Ceramics Division of ORNL. These data were used to study the performance of artificial neural networks for defect type classification and for estimating defect parameters. A PC-based data preprocessing and display program was also developed as part of an expert system for data management and decision making. The results of the analysis showed that for effective (low-error) defect classification and estimation of parameters, it is necessary to identify proper feature vectors using different data representation methods. The integration of data compression and artificial neural networks for information processing was established as an effective technique for automation of diagnostics using nondestructive examination methods.

  17. Neural network expert system for X-ray analysis of welded joints

    Science.gov (United States)

    Kozlov, V. V.; Lapik, N. V.; Popova, N. V.

    2018-03-01

    The use of intelligent technologies for the automated analysis of product quality is one of the main trends in modern machine building. At the same time, methods associated with artificial neural networks, the basis for building automated intelligent diagnostic systems, are undergoing rapid development in various spheres of human activity. Machine vision technologies make it possible to effectively detect certain regularities in the analyzed images, including defects of welded joints in radiographic data.

  18. Automation of activation analysis

    International Nuclear Information System (INIS)

    Ivanov, I.N.; Ivanets, V.N.; Filippov, V.V.

    1985-01-01

    The basic data on the methods and equipment of activation analysis are presented. Recommendations on the selection of activation analysis techniques, especially techniques envisaging the use of short-lived isotopes, are given. The possibilities of increasing data-path throughput by using modern computers to automate the analysis and data-processing procedures are shown.

  19. Anomaly detection in an automated safeguards system using neural networks

    International Nuclear Information System (INIS)

    Whiteson, R.; Howell, J.A.

    1992-01-01

    An automated safeguards system must be able to detect an anomalous event, identify the nature of the event, and recommend a corrective action. Neural networks represent a new way of thinking about basic computational mechanisms for intelligent information processing. In this paper, we discuss the issues involved in applying a neural network model to the first step of this process: anomaly detection in materials accounting systems. We extend our previous model to a 3-tank problem and compare different neural network architectures and algorithms. We evaluate the computational difficulties in training neural networks and explore how certain design principles affect the problems. The issues involved in building a neural network architecture include how the information flows, how the network is trained, how the neurons in a network are connected, how the neurons process information, and how the connections between neurons are modified. Our approach is based on the demonstrated ability of neural networks to model complex, nonlinear, real-time processes. By modeling the normal behavior of the processes, we can predict how a system should be behaving and, therefore, detect when an abnormality occurs
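
    The paper models normal process behavior and flags departures from it; as a hedged, much simpler stand-in for its neural network, the sketch below fits a least-squares autoregressive predictor to a "normal" tank-level series and flags time steps whose prediction residual exceeds a threshold. All signals and the injected anomaly are synthetic.

        import numpy as np

        def fit_ar(series, order=3):
            # Least-squares autoregressive model of normal behavior
            # (standing in for the neural network used in the paper).
            X = np.column_stack([series[i:len(series) - order + i] for i in range(order)])
            w, *_ = np.linalg.lstsq(X, series[order:], rcond=None)
            return w

        def residuals(series, w):
            order = len(w)
            X = np.column_stack([series[i:len(series) - order + i] for i in range(order)])
            return series[order:] - X @ w

        rng = np.random.default_rng(0)
        normal = np.sin(np.linspace(0, 20, 300)) + rng.normal(0, 0.02, 300)  # tank level
        w = fit_ar(normal)
        test = normal.copy()
        test[200] += 0.5                       # simulated diversion / sensor anomaly
        r = residuals(test, w)
        threshold = 5 * residuals(normal, w).std()
        print("anomalies at steps:", np.nonzero(np.abs(r) > threshold)[0] + len(w))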

  20. Malware Classification Based on the Behavior Analysis and Back Propagation Neural Network

    Directory of Open Access Journals (Sweden)

    Pan Zhi-Peng

    2016-01-01

    With the development of the Internet, malware has also spread rapidly across network systems. To deal with the diversity and number of variants, a number of automated behavior analysis tools have emerged as the times require. Although these tools produce detailed behavior reports of the malware, its category and criticality still have to be specified manually. In this paper, we propose an automated malware classification approach based on behavior analysis. We first perform dynamic analyses to obtain the detailed behavior profiles of the malware, which are then used to abstract the main features of the malware and serve as the inputs of the Back Propagation (BP) Neural Network model. The experimental results demonstrate that our classification technique is able to classify the malware variants effectively and detect malware accurately.
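
    A hedged numpy miniature of such a pipeline's final stage: a one-hidden-layer network trained by plain back-propagation on binary behavior-profile features. The features, labels, and the rule generating them are invented; a real system would derive them from sandbox reports.

        import numpy as np

        rng = np.random.default_rng(0)
        # rows: behavior-profile feature vectors (e.g. registry writes, process
        # injection, beaconing ...); label 1 marks a hypothetical malicious family
        X = rng.integers(0, 2, size=(200, 16)).astype(float)
        y = ((X[:, 0] * X[:, 3] + X[:, 7]) > 1).astype(float).reshape(-1, 1)

        W1, b1 = rng.normal(0, 0.5, (16, 8)), np.zeros(8)
        W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)
        sigmoid = lambda z: 1 / (1 + np.exp(-z))

        for _ in range(2000):                  # plain batch back-propagation
            h = sigmoid(X @ W1 + b1)
            p = sigmoid(h @ W2 + b2)
            d2 = (p - y) / len(X)              # cross-entropy + sigmoid gradient
            d1 = (d2 @ W2.T) * h * (1 - h)
            W2 -= h.T @ d2; b2 -= d2.sum(0)
            W1 -= X.T @ d1; b1 -= d1.sum(0)

        print("training accuracy:", ((p > 0.5) == y).mean())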

  1. Improvement of Binary Analysis Components in Automated Malware Analysis Framework

    Science.gov (United States)

    2017-02-21

    AFRL-AFOSR-JP-TR-2017-0018. Improvement of Binary Analysis Components in Automated Malware Analysis Framework, Keiji Takeda, Keio University; final report covering 26 May 2015 to 25 Nov 2016. The project improved the binary analysis components of a framework that analyzes malicious software (malware) with minimum human interaction; the system autonomously analyzes malware samples by analyzing the malware binary program.

  2. Automation of multi-agent control for complex dynamic systems in heterogeneous computational network

    Science.gov (United States)

    Oparin, Gennady; Feoktistov, Alexander; Bogdanova, Vera; Sidorov, Ivan

    2017-01-01

    The rapid progress of high-performance computing entails new challenges related to solving large scientific problems for various subject domains in a heterogeneous distributed computing environment (e.g., a network, Grid system, or Cloud infrastructure). Specialists in the field of parallel and distributed computing pay special attention to the scalability of applications for problem solving. Effective management of a scalable application in a heterogeneous distributed computing environment is still a non-trivial issue; control systems that operate in networks especially relate to this issue. We propose a new approach to multi-agent management of scalable applications in a heterogeneous computational network. The fundamentals of our approach are the integrated use of conceptual programming, simulation modeling, network monitoring, multi-agent management, and service-oriented programming. We developed a special framework for automation of the problem solving. The advantages of the proposed approach are demonstrated on an example: parametric synthesis of a static linear regulator for complex dynamic systems. Benefits of the scalable application for solving this problem include automation of multi-agent control of such systems in parallel mode at various degrees of detail.

  3. Home Network Technologies and Automating Demand Response

    Energy Technology Data Exchange (ETDEWEB)

    McParland, Charles

    2009-12-01

    sophisticated energy consumers, it has been possible to improve the DR 'state of the art' with a manageable commitment of technical resources on both the utility and consumer side. Although numerous C & I DR applications of a DRAS infrastructure are still in either prototype or early production phases, these early attempts at automating DR have been notably successful for both utilities and C & I customers. Several factors have strongly contributed to this success and will be discussed below. These successes have motivated utilities and regulators to look closely at how DR programs can be expanded to encompass the remaining (roughly) half of the state's energy load - the light commercial and, in numerical terms, the more important residential customer market. This survey examines technical issues facing the implementation of automated DR in the residential environment. In particular, we will look at the potential role of home automation networks in implementing wide-scale DR systems that communicate directly to individual residences.

  4. Bluetooth Low Power Modes Applied to the Data Transportation Network in Home Automation Systems

    Directory of Open Access Journals (Sweden)

    Josu Etxaniz

    2017-04-01

    Even though home automation is a well-known research and development area, recent technological improvements in different areas such as context recognition, sensing, wireless communications, and embedded systems have boosted wireless smart homes. This paper focuses on some of those areas related to home automation. The paper draws attention to wireless communications issues on embedded systems. Specifically, the paper discusses multi-hop networking together with Bluetooth technology and latency, as a quality of service (QoS) metric. Bluetooth is a worldwide standard that provides low-power multi-hop networking. It is a radio-license-free technology and establishes point-to-point and point-to-multipoint links, known as piconets, or multi-hop networks, known as scatternets. This way, many Bluetooth nodes can be interconnected to deploy ambient intelligent networks. This paper introduces the research on multi-hop latency done with the park and sniff low-power modes of Bluetooth over the test platform developed. Besides, an empirical model is obtained to calculate the latency of Bluetooth multi-hop communications over asynchronous links when links in scatternets are always in sniff or park mode. Smart home device and network designers can take advantage of the model and the delay estimates it provides for communications along Bluetooth multi-hop networks.
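
    The paper's empirical latency model is not reproduced in the abstract, so the sketch below shows only a generic illustrative form such a model might take: each sniff-mode hop waits, on average, half the sniff interval before the next anchor point, plus a fixed per-hop transmission/processing cost. The default interval and per-hop cost are invented.

        def multihop_latency_ms(hops, t_sniff_ms=100.0, t_tx_ms=5.0):
            # Illustrative sniff-mode model: mean wait of half a sniff interval
            # per hop, plus a fixed transmission/processing cost per hop.
            return hops * (t_sniff_ms / 2.0 + t_tx_ms)

        for n in (1, 2, 4, 8):
            print(f"{n} hops -> ~{multihop_latency_ms(n):.0f} ms")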

  5. Bluetooth Low Power Modes Applied to the Data Transportation Network in Home Automation Systems

    Science.gov (United States)

    Etxaniz, Josu; Aranguren, Gerardo

    2017-01-01

    Even though home automation is a well-known research and development area, recent technological improvements in different areas such as context recognition, sensing, wireless communications, and embedded systems have boosted wireless smart homes. This paper focuses on some of those areas related to home automation. The paper draws attention to wireless communications issues on embedded systems. Specifically, the paper discusses multi-hop networking together with Bluetooth technology and latency, as a quality of service (QoS) metric. Bluetooth is a worldwide standard that provides low-power multi-hop networking. It is a radio-license-free technology and establishes point-to-point and point-to-multipoint links, known as piconets, or multi-hop networks, known as scatternets. This way, many Bluetooth nodes can be interconnected to deploy ambient intelligent networks. This paper introduces the research on multi-hop latency done with the park and sniff low-power modes of Bluetooth over the test platform developed. Besides, an empirical model is obtained to calculate the latency of Bluetooth multi-hop communications over asynchronous links when links in scatternets are always in sniff or park mode. Smart home device and network designers can take advantage of the model and the delay estimates it provides for communications along Bluetooth multi-hop networks. PMID:28468294

  6. Bluetooth Low Power Modes Applied to the Data Transportation Network in Home Automation Systems.

    Science.gov (United States)

    Etxaniz, Josu; Aranguren, Gerardo

    2017-04-30

    Even though home automation is a well-known research and development area, recent technological improvements in different areas such as context recognition, sensing, wireless communications, and embedded systems have boosted wireless smart homes. This paper focuses on some of those areas related to home automation. The paper draws attention to wireless communications issues on embedded systems. Specifically, the paper discusses multi-hop networking together with Bluetooth technology and latency, as a quality of service (QoS) metric. Bluetooth is a worldwide standard that provides low-power multi-hop networking. It is a radio-license-free technology and establishes point-to-point and point-to-multipoint links, known as piconets, or multi-hop networks, known as scatternets. This way, many Bluetooth nodes can be interconnected to deploy ambient intelligent networks. This paper introduces the research on multi-hop latency done with the park and sniff low-power modes of Bluetooth over the test platform developed. Besides, an empirical model is obtained to calculate the latency of Bluetooth multi-hop communications over asynchronous links when links in scatternets are always in sniff or park mode. Smart home device and network designers can take advantage of the model and the delay estimates it provides for communications along Bluetooth multi-hop networks.

  7. Automation of seismic network signal interpretation: an artificial intelligence approach

    International Nuclear Information System (INIS)

    Chiaruttini, C.; Roberto, V.

    1988-01-01

    After discussing the current status of automation in signal interpretation from seismic networks, a new approach, based on artificial-intelligence techniques, is proposed. The knowledge of the human expert analyst is examined, with emphasis on its objects, strategies, and reasoning techniques. It is argued that knowledge-based systems (or expert systems) provide the most appropriate tools for designing an automatic system modelled on the expert's behaviour.

  8. Deep multi-scale location-aware 3D convolutional neural networks for automated detection of lacunes of presumed vascular origin

    Directory of Open Access Journals (Sweden)

    Mohsen Ghafoorian

    2017-01-01

    In this paper, we propose an automated two-stage method using deep convolutional neural networks (CNNs). We show that this method has good performance and can considerably benefit readers. We first use a fully convolutional neural network to detect initial candidates. In the second step, we employ a 3D CNN as a false-positive reduction tool. As location information is important to the analysis of candidate structures, we further equip the network with contextual information using multi-scale analysis and integration of explicit location features. We trained, validated, and tested our networks on a large dataset of 1075 cases obtained from two different studies. Subsequently, we conducted an observer study with four trained observers and compared our method with them using a free-response operating characteristic analysis. On a test set of 111 cases, the resulting CAD system exhibits performance similar to the trained human observers and achieves a sensitivity of 0.974 with 0.13 false positives per slice. A feasibility study also showed that a trained human observer would considerably benefit once aided by the CAD system.

  9. Neural network approach in multichannel auditory event-related potential analysis.

    Science.gov (United States)

    Wu, F Y; Slater, J D; Ramsay, R E

    1994-04-01

    Even though there are presently no clearly defined criteria for the assessment of P300 event-related potential (ERP) abnormality, it is strongly indicated through statistical analysis that such criteria exist for classifying control subjects and patients with diseases resulting in neuropsychological impairment such as multiple sclerosis (MS). We have demonstrated the feasibility of artificial neural network (ANN) methods in classifying ERP waveforms measured at a single channel (Cz) from control subjects and MS patients. In this paper, we report the results of multichannel ERP analysis and a modified network analysis methodology to enhance automation of the classification rule extraction process. The proposed methodology significantly reduces the work of statistical analysis. It also helps to standardize the criteria of P300 ERP assessment and facilitate the computer-aided analysis on neuropsychological functions.

  10. Automated activation-analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Garcia, S.R.; Denton, M.M.

    1982-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day

  11. Alternative approach to automated management of load flow in engineering networks considering functional reliability

    Directory of Open Access Journals (Sweden)

    Ирина Александровна Гавриленко

    2016-02-01

    An approach to automated management of load flow in engineering networks that takes functional reliability into account is proposed in the article. An improvement of the concept of operational and strategic management of load flow in engineering networks is considered. The verbal statement of the research problem is defined, namely, the development of information technology for exact calculation of the functional reliability of the network, i.e., of the risk of short delivery of the purpose-oriented product to consumers.

  12. Management issues in automated audit analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, K.A.; Hochberg, J.G.; Wilhelmy, S.K.; McClary, J.F.; Christoph, G.G.

    1994-03-01

    This paper discusses management issues associated with the design and implementation of an automated audit analysis system that we use to detect security events. It gives the viewpoint of a team directly responsible for developing and managing such a system. We use Los Alamos National Laboratory's Network Anomaly Detection and Intrusion Reporter (NADIR) as a case in point. We examine issues encountered at Los Alamos, detail our solutions to them, and where appropriate suggest general solutions. After providing an introduction to NADIR, we explore four general management issues: cost-benefit questions, privacy considerations, legal issues, and system integrity. Our experiences are of general interest both to security professionals and to anyone who may wish to implement a similar system. While NADIR investigates security events, the methods used and the management issues are potentially applicable to a broad range of complex systems. These include those used to audit credit card transactions, medical care payments, and procurement systems.

  13. Semi-automated tabulation of the 3D topology and morphology of branching networks using CT: application to the airway tree

    International Nuclear Information System (INIS)

    Sauret, V.; Bailey, A.G.

    1999-01-01

    Detailed information on biological branching networks (optical nerves, airways or blood vessels) is often required to improve the analysis of 3D medical imaging data. A semi-automated algorithm has been developed to obtain the full 3D topology and dimensions (direction cosine, length, diameter, branching and gravity angles) of branching networks using their CT images. It has been tested using CT images of a simple Perspex branching network and applied to the CT images of a human cast of the airway tree. The morphology and topology of the computer-derived network were compared with the manually measured dimensions, and good agreement was found. The airway dimensions also compared well with values previously quoted in the literature. This algorithm can provide a complete data set analysis much more quickly than manual measurements. Its use is limited by the CT resolution, which means that very small branches are not visible. New data are presented on the branching angles of the airway tree. (author)
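
    The geometric part of such tabulation is compact; the hedged numpy sketch below derives direction cosines, branch length, and the branching angle of a child branch from three centerline points. The coordinates are invented, and diameter estimation from the CT data is not shown.

        import numpy as np

        def branch_geometry(parent_start, junction, child_end):
            # Direction cosines and length of the child branch, plus the branching
            # angle between parent and child directions at the junction.
            v_parent = np.asarray(junction, float) - np.asarray(parent_start, float)
            v_child = np.asarray(child_end, float) - np.asarray(junction, float)
            length = np.linalg.norm(v_child)
            cosines = v_child / length
            cosang = v_child @ v_parent / (length * np.linalg.norm(v_parent))
            angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
            return cosines, length, angle

        # toy trachea-to-main-bronchus segment, coordinates in mm
        cosines, length, angle = branch_geometry((0, 0, 40), (0, 0, 0), (15, 0, -25))
        print(f"length {length:.1f} mm, branching angle {angle:.1f} deg")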

  14. PollyNET - an emerging network of automated Raman-polarization lidars for continuous aerosol profiling

    Science.gov (United States)

    Baars, Holger; Althausen, Dietrich; Engelmann, Ronny; Heese, Birgit; Ansmann, Albert; Wandinger, Ulla; Hofer, Julian; Skupin, Annett; Komppula, Mika; Giannakaki, Eleni; Filioglou, Maria; Bortoli, Daniele; Silva, Ana Maria; Pereira, Sergio; Stachlewska, Iwona S.; Kumala, Wojciech; Szczepanik, Dominika; Amiridis, Vassilis; Marinou, Eleni; Kottas, Michail; Mattis, Ina; Müller, Gerhard

    2018-04-01

    PollyNET is a network of portable, automated, and continuously measuring Raman-polarization lidars of type Polly operated by several institutes worldwide. The data from permanent and temporary measurement sites are automatically processed in terms of optical aerosol profiles and displayed in near-real time at polly.tropos.de. According to current schedules, the network will grow by 3-4 systems during the upcoming 2-3 years and will then comprise 11 permanent stations and 2 mobile platforms.

  15. Tri-Band PCB Antenna for Wireless Sensor Network Transceivers in Home Automation Applications

    DEFF Research Database (Denmark)

    Rohde, John; Toftegaard, Thomas Skjødeberg

    2012-01-01

    A novel tri-band antenna design for wireless sensor network devices in home automation applications is proposed. The design is based on a combination of a conventional monopole wire antenna and discrete distributed load impedances. The load impedances are employed to ensure the degrees of freedom...

  16. Automated diagnosis of rolling bearings using MRA and neural networks

    Science.gov (United States)

    Castejón, C.; Lara, O.; García-Prada, J. C.

    2010-01-01

    Any industry needs an efficient predictive plan in order to optimize the management of resources and improve the economy of the plant by reducing unnecessary costs and increasing the level of safety. A great percentage of breakdowns in productive processes are caused by bearings. They begin to deteriorate from early stages of their functional life, also called the incipient level. This manuscript develops an automated diagnosis of rolling bearings based on the analysis and classification of signature vibrations. The novelty of this work is the application of the methodology proposed for data collected from a quasi-real industrial machine, where rolling bearings support the radial and axial loads the bearings are designed for. Multiresolution analysis (MRA) is used in a first stage in order to extract the most interesting features from signals. Features will be used in a second stage as inputs of a supervised neural network (NN) for classification purposes. Experimental results carried out in a real system show the soundness of the method which detects four bearing conditions (normal, inner race fault, outer race fault and ball fault) in a very incipient stage.
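
    As a hedged sketch of the first stage, the code below computes a multiresolution feature vector with the PyWavelets library: the relative energy of each wavelet band, which would then feed a supervised classifier. The bearing signals are synthetic stand-ins, not measured data.

        import numpy as np
        import pywt

        def mra_features(signal, wavelet="db4", level=4):
            # Relative energy per multiresolution band (approximation + details);
            # this short vector is what the neural network stage would consume.
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            energies = np.array([np.sum(c ** 2) for c in coeffs])
            return energies / energies.sum()

        fs = 12_000
        t = np.arange(fs) / fs
        rng = np.random.default_rng(0)
        healthy = np.sin(2 * np.pi * 30 * t) + 0.05 * rng.normal(size=t.size)
        # incipient fault stand-in: short bursts of a high-frequency tone
        bursts = (np.sin(2 * np.pi * 7 * t) > 0.95) * np.sin(2 * np.pi * 1600 * t)
        faulty = healthy + 0.4 * bursts
        print("healthy:", np.round(mra_features(healthy), 3))
        print("faulty :", np.round(mra_features(faulty), 3))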

  17. Automated Technology for Verification and Analysis

    DEFF Research Database (Denmark)

    This volume contains the papers presented at the 7th International Symposium on Automated Technology for Verification and Analysis held during October 13-16 in Macao SAR, China. The primary objective of the ATVA conferences remains the same: to exchange and promote the latest advances of state-of-the-art research on theoretical and practical aspects of automated analysis, verification, and synthesis. Among 74 research papers and 10 tool papers submitted to ATVA 2009, the Program Committee accepted 23 as regular papers and 3 as tool papers. In all, 33 experts from 17 countries worked hard to make sure...

  18. An overview of the contaminant analysis automation program

    International Nuclear Information System (INIS)

    Hollen, R.M.; Erkkila, T.; Beugelsdijk, T.J.

    1992-01-01

    The Department of Energy (DOE) has significant amounts of radioactive and hazardous wastes stored, buried, and still being generated at many sites within the United States. These wastes must be characterized to determine the elemental, isotopic, and compound content before remediation can begin. In this paper, the authors project that sampling requirements will necessitate generating more than 10 million samples by 1995, which will far exceed the capabilities of our current manual chemical analysis laboratories. The Contaminant Analysis Automation effort (CAA), with Los Alamos National Laboratory (LANL) as the coordinating laboratory, is designing and fabricating robotic systems that will standardize and automate both the hardware and the software of the most common environmental chemical methods. This will be accomplished by designing and producing several unique analysis systems called Standard Analysis Methods (SAM). Each SAM will automate a specific chemical method, including sample preparation, analysis, and data interpretation, by using a building block known as the Standard Laboratory Module (SLM). This concept allows the chemist to assemble an automated environmental method using standardized SLMs easily and without the worry of hardware compatibility or the necessity of generating complicated control programs.

  19. Automation of the Analysis of Moessbauer Spectra

    International Nuclear Information System (INIS)

    Souza, Paulo A. de Jr.; Garg, R.; Garg, V. K.

    1998-01-01

    In the present report we propose the automation of least-squares fitting of Moessbauer spectra, the identification of the substance, its crystal structure and the access to the references with the help of a genetic algorithm, fuzzy logic, and an artificial neural network associated with a databank of Moessbauer parameters and references. This system could be useful for specialists and non-specialists, in industry as well as in research laboratories.

  20. Future Computer, Communication, Control and Automation

    CERN Document Server

    2011 International Conference on Computer, Communication, Control and Automation

    2012-01-01

    The volume includes a set of selected papers extended and revised from the 2011 International Conference on Computer, Communication, Control and Automation (3CA 2011), held in Zhuhai, China, November 19-20, 2011. Topics covered include wireless communications, advances in wireless video, wireless sensor networking, security in wireless networks, network measurement and management, hybrid and discrete-event systems, internet analytics and automation, robotic systems and applications, reconfigurable automation systems, and machine vision in automation. We hope that researchers, graduate students and other interested readers benefit scientifically from the proceedings and also find the process stimulating.

  1. Automating dChip: toward reproducible sharing of microarray data analysis

    Directory of Open Access Journals (Sweden)

    Li Cheng

    2008-05-01

    Full Text Available Abstract Background During the past decade, many software packages have been developed for analysis and visualization of various types of microarrays. We have developed and maintained the widely used dChip as a microarray analysis software package accessible to both biologists and data analysts. However, challenges arise when dChip users want to analyze large numbers of arrays automatically and share data analysis procedures and parameters. Improvement is also needed when the dChip user support team tries to identify the causes of analysis errors or bugs reported by users. Results We report here the implementation and application of the dChip automation module. Through this module, dChip automation files can be created to include menu steps, parameters, and data viewpoints to run automatically. A data-packaging function allows convenient transfer from one user to another of the dChip software, microarray data, and analysis procedures, so that the second user can reproduce the entire analysis session of the first user. An analysis report file can also be generated during an automated run, including analysis logs, user comments, and viewpoint screenshots. Conclusion The dChip automation module is a step toward reproducible research, and it can prompt a more convenient and reproducible mechanism for sharing microarray software, data, and analysis procedures and results. Automation data packages can also be used as publication supplements. Similar automation mechanisms could be valuable to the research community if implemented in other genomics and bioinformatics software packages.

  2. SPECIAL LIBRARIES OF FRAGMENTS OF ALGORITHMIC NETWORKS TO AUTOMATE THE DEVELOPMENT OF ALGORITHMIC MODELS

    Directory of Open Access Journals (Sweden)

    V. E. Marley

    2015-01-01

    Full Text Available Summary. The concept of algorithmic models arose from the algorithmic approach, in which the simulated object or phenomenon is represented as a process governed by the strict rules of an algorithm describing the operation of the facility. An algorithmic model is a formalized description of a subject specialist's scenario for the simulated process, whose structure mirrors the causal and temporal relationships between events of the process being modeled, together with all information necessary for its software implementation. Algorithmic networks are used to represent the structure of algorithmic models. They are normally defined as loaded finite directed graphs whose vertices are mapped to operators and whose arcs are the variables bound by those operators. The language of algorithmic networks is highly expressive, and the class of algorithms it can represent is very broad. Existing modeling-automation systems based on algorithmic networks mainly use operators working with real numbers. Although this reduces their expressive power, it is sufficient for modeling a wide class of problems related to the economy, the environment, transport and technical processes. The task of modeling the execution of schedules and network diagrams is relevant and useful. Many systems can compute network graphs; however, monitoring based on the analysis of gaps and deadlines in such graphs provides no prediction of schedule execution. The library described here is designed to build such predictive models: from the specified source data a set of projections is obtained, from which one is chosen and taken as the new plan.
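
    A minimal sketch of the algorithmic-network idea described above, assuming a toy network: vertices are operators, arcs carry real-valued variables, and evaluation follows the causal ordering. The operators and variable names are hypothetical, not from the library itself.

        import math

        # Each node: (operator, list of input variable names, output variable name).
        network = [
            (lambda a, b: a + b, ["x", "y"], "s"),
            (lambda s: math.sqrt(s), ["s"], "r"),
            (lambda r, y: r * y, ["r", "y"], "out"),
        ]

        def run(network, inputs):
            """Evaluate operators in causal order, binding each output variable."""
            env = dict(inputs)
            for op, in_names, out_name in network:
                env[out_name] = op(*(env[n] for n in in_names))
            return env

        print(run(network, {"x": 9.0, "y": 16.0})["out"])  # sqrt(9 + 16) * 16 = 80.0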

  3. An optimized method for automated analysis of algal pigments by HPLC

    NARCIS (Netherlands)

    van Leeuwe, M. A.; Villerius, L. A.; Roggeveld, J.; Visser, R. J. W.; Stefels, J.

    2006-01-01

    A recent development in algal pigment analysis by high-performance liquid chromatography (HPLC) is the application of automation. An optimization of a complete sampling and analysis protocol applied specifically in automation has not yet been performed. In this paper we show that automation can only

  4. Supporting Control Room Operators in Highly Automated Future Power Networks

    DEFF Research Database (Denmark)

    Chen, Minjiang; Catterson, Victoria; Syed, Mazheruddin

    2017-01-01

    Operating power systems is an extremely challenging task, not least because power systems have become highly interconnected and a wide range of network issues can occur. It is therefore a necessity to develop decision support systems and visualisation that can effectively support the human operators for decision-making in the complex and dynamic environment of the future highly automated power system. This paper aims to investigate the decision support functions associated with frequency deviation events for the proposed Web of Cells concept.

  5. Automated activation-analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Hensley, W.K.; Denton, M.M.; Garcia, S.R.

    1981-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. The system and its mode of operation for a large reconnaissance survey are described

  6. MONGKIE: an integrated tool for network analysis and visualization for multi-omics data.

    Science.gov (United States)

    Jang, Yeongjun; Yu, Namhee; Seo, Jihae; Kim, Sun; Lee, Sanghyuk

    2016-03-18

    Network-based integrative analysis is a powerful technique for extracting biological insights from multilayered omics data such as somatic mutations, copy number variations, and gene expression data. However, integrated analysis of multi-omics data is quite complicated and can hardly be done in an automated way. Thus, a powerful interactive visual mining tool supporting diverse analysis algorithms for identification of driver genes and regulatory modules is much needed. Here, we present a software platform that integrates network visualization with omics data analysis tools seamlessly. The visualization unit supports various options for displaying multi-omics data as well as unique network models for describing sophisticated biological networks such as complex biomolecular reactions. In addition, we implemented diverse in-house algorithms for network analysis including network clustering and over-representation analysis. Novel functions include facile definition and optimized visualization of subgroups, comparison of a series of data sets in an identical network by data-to-visual mapping and subsequent overlaying function, and management of custom interaction networks. Utility of MONGKIE for network-based visual data mining of multi-omics data was demonstrated by analysis of the TCGA glioblastoma data. MONGKIE was developed in Java based on the NetBeans plugin architecture, thus being OS-independent with intrinsic support of module extension by third-party developers. We believe that MONGKIE would be a valuable addition to network analysis software by supporting many unique features and visualization options, especially for analysing multi-omics data sets in cancer and other diseases.

  7. A standard analysis method (SAM) for the automated analysis of polychlorinated biphenyls (PCBs) in soils using the chemical analysis automation (CAA) paradigm: validation and performance

    International Nuclear Information System (INIS)

    Rzeszutko, C.; Johnson, C.R.; Monagle, M.; Klatt, L.N.

    1997-10-01

    The Chemical Analysis Automation (CAA) program is developing a standardized modular automation strategy for chemical analysis. In this automation concept, analytical chemistry is performed with modular building blocks that correspond to individual elements of the steps in the analytical process. With a standardized set of behaviors and interactions, these blocks can be assembled in a 'plug and play' manner into a complete analysis system. These building blocks, which are referred to as Standard Laboratory Modules (SLM), interface to a host control system that orchestrates the entire analytical process, from sample preparation through data interpretation. The integrated system is called a Standard Analysis Method (SAME). A SAME for the automated determination of Polychlorinated Biphenyls (PCB) in soils, assembled in a mobile laboratory, is undergoing extensive testing and validation. The SAME consists of the following SLMs: a four-channel Soxhlet extractor, a High Volume Concentrator, column clean up, a gas chromatograph, a PCB data interpretation module, a robot, and a human-computer interface. The SAME is configured to meet the requirements specified in the U.S. Environmental Protection Agency's (EPA) SW-846 Methods 3541/3620A/8082 for the analysis of PCBs in soils. The PCB SAME will be described along with the developmental test plan. Performance data obtained during developmental testing will also be discussed.

  8. Research of the application of the new communication technologies for distribution automation

    Science.gov (United States)

    Zhong, Guoxin; Wang, Hao

    2018-03-01

    The communication network is a key factor in distribution automation. In recent years, new communication technologies for distribution automation have developed rapidly in China. This paper introduces the traditional communication technologies for distribution automation and analyses their defects. It then gives a detailed analysis of some new communication technologies for distribution automation, both wired and wireless, and offers suggestions for their application.

  9. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1986-01-01

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies.
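
    GRESS itself instruments FORTRAN source; as a language-neutral illustration of the underlying computer-calculus idea (derivatives generated mechanically alongside values rather than coded by hand), here is a minimal forward-mode automatic-differentiation sketch with a toy model function, not the GRESS implementation.

        class Dual:
            """Number carrying a value and its derivative w.r.t. one chosen input."""
            def __init__(self, val, der=0.0):
                self.val, self.der = val, der
            def __add__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                return Dual(self.val + o.val, self.der + o.der)
            __radd__ = __add__
            def __mul__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
            __rmul__ = __mul__

        def model(k):
            # Toy response: y = 3*k*k + 2*k, so dy/dk = 6*k + 2.
            return 3 * k * k + 2 * k

        k = Dual(1.5, 1.0)      # seed the derivative of the input of interest
        y = model(k)
        print(y.val, y.der)     # 9.75 and 11.0: the sensitivity dy/dk at k = 1.5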

  10. Fuzzy Emotional Semantic Analysis and Automated Annotation of Scene Images

    Directory of Open Access Journals (Sweden)

    Jianfang Cao

    2015-01-01

    Full Text Available With the advances in electronic and imaging techniques, the production of digital images has rapidly increased, and the extraction and automated annotation of emotional semantics implied by images have become issues that must be urgently addressed. To better simulate human subjectivity and ambiguity for understanding scene images, the current study proposes an emotional semantic annotation method for scene images based on fuzzy set theory. A fuzzy membership degree was calculated to describe the emotional degree of a scene image and was implemented using the Adaboost algorithm and a back-propagation (BP) neural network. The automated annotation method was trained and tested using scene images from the SUN Database. The annotation results were then compared with those based on artificial annotation. Our method showed an annotation accuracy rate of 91.2% for basic emotional values and 82.4% after extended emotional values were added, which correspond to increases of 5.5% and 8.9%, respectively, compared with the results from using a single BP neural network algorithm. Furthermore, the retrieval accuracy rate based on our method reached approximately 89%. This study attempts to lay a solid foundation for the automated emotional semantic annotation of more types of images and therefore is of practical significance.

  11. Automated target recognition and tracking using an optical pattern recognition neural network

    Science.gov (United States)

    Chao, Tien-Hsin

    1991-01-01

    The on-going development of an automatic target recognition and tracking system at the Jet Propulsion Laboratory is presented. This system is an optical pattern recognition neural network (OPRNN) that is an integration of an innovative optical parallel processor and a feature extraction based neural net training algorithm. The parallel optical processor provides high speed and vast parallelism as well as full shift invariance. The neural network algorithm enables simultaneous discrimination of multiple noisy targets in spite of their scales, rotations, perspectives, and various deformations. This fully developed OPRNN system can be effectively utilized for the automated spacecraft recognition and tracking that will lead to success in the Automated Rendezvous and Capture (AR&C) of the unmanned Cargo Transfer Vehicle (CTV). One of the most powerful optical parallel processors for automatic target recognition is the multichannel correlator. With the inherent advantages of parallel processing capability and shift invariance, multiple objects can be simultaneously recognized and tracked using this multichannel correlator. This target tracking capability can be greatly enhanced by utilizing a powerful feature extraction based neural network training algorithm such as the neocognitron. The OPRNN, currently under investigation at JPL, is constructed with an optical multichannel correlator where holographic filters have been prepared using the neocognitron training algorithm. The computation speed of the neocognitron-type OPRNN is up to 10^14 analog connections/sec, enabling the OPRNN to outperform its state-of-the-art electronic counterpart by at least two orders of magnitude.

  12. Automated electric valve for electrokinetic separation in a networked microfluidic chip.

    Science.gov (United States)

    Cui, Huanchun; Huang, Zheng; Dutta, Prashanta; Ivory, Cornelius F

    2007-02-15

    This paper describes an automated electric valve system designed to reduce dispersion and sample loss into a side channel when an electrokinetically mobilized concentration zone passes a T-junction in a networked microfluidic chip. One way to reduce dispersion is to control current streamlines since charged species are driven along them in the absence of electroosmotic flow. Computer simulations demonstrate that dispersion and sample loss can be reduced by applying a constant additional electric field in the side channel to straighten current streamlines in linear electrokinetic flow (zone electrophoresis). This additional electric field was provided by a pair of platinum microelectrodes integrated into the chip in the vicinity of the T-junction. Both simulations and experiments of this electric valve with constant valve voltages were shown to provide unsatisfactory valve performance during nonlinear electrophoresis (isotachophoresis). On the basis of these results, however, an automated electric valve system was developed with improved valve performance. Experiments conducted with this system showed decreased dispersion and increased reproducibility as protein zones isotachophoretically passed the T-junction. Simulations of the automated electric valve offer further support that the desired shape of current streamlines was maintained at the T-junction during isotachophoresis. Valve performance was evaluated at different valve currents based on statistical variance due to dispersion. With the automated control system, two integrated microelectrodes provide an effective way to manipulate current streamlines, thus acting as an electric valve for charged species in electrokinetic separations.

  13. Team performance in networked supervisory control of unmanned air vehicles: effects of automation, working memory, and communication content.

    Science.gov (United States)

    McKendrick, Ryan; Shaw, Tyler; de Visser, Ewart; Saqer, Haneen; Kidwell, Brian; Parasuraman, Raja

    2014-05-01

    Assess team performance within a networked supervisory control setting while manipulating automated decision aids and monitoring team communication and working memory ability. Networked systems such as multi-unmanned air vehicle (UAV) supervision have complex properties that make prediction of human-system performance difficult. Automated decision aids can provide valuable information to operators, individual abilities can limit or facilitate team performance, and team communication patterns can alter how effectively individuals work together. We hypothesized that reliable automation, higher working memory capacity, and increased communication rates of task-relevant information would offset performance decrements attributed to high task load. Two-person teams performed a simulated air defense task with two levels of task load and three levels of automated aid reliability. Teams communicated and received decision aid messages via chat window text messages. Task Load x Automation effects were significant across all performance measures. Reliable automation limited the decline in team performance with increasing task load. Average team spatial working memory was a stronger predictor than other measures of team working memory. Frequency of team rapport and enemy location communications was positively related to team performance, and word count was negatively related to team performance. Reliable decision aiding mitigated team performance decline during increased task load during multi-UAV supervisory control. Team spatial working memory, communication of spatial information, and team rapport predicted team success. An automated decision aid can improve team performance under high task load. Assessment of spatial working memory and the communication of task-relevant information can help in operator and team selection in supervisory control systems.

  14. Assessment of Automated Data Analysis Application on VVER Steam Generator Tubing

    International Nuclear Information System (INIS)

    Picek, E.; Barilar, D.

    2006-01-01

    INETEC - Institute for Nuclear Technology has developed a software package named EddyOne that offers an option for automated analysis of bobbin coil eddy current data. During its development and site use, some features were noticed that prevent the wide use of automatic analysis on VVER SG data. This article discusses these specific problems and evaluates possible solutions. With regard to the current state of automated analysis technology, an overview of the advantages and disadvantages of automated analysis on VVER SG is summarized as well. (author)

  15. An efficient automated parameter tuning framework for spiking neural networks.

    Science.gov (United States)

    Carlson, Kristofor D; Nageswaran, Jayram Moorkanikara; Dutt, Nikil; Krichmar, Jeffrey L

    2014-01-01

    As the desire for biologically realistic spiking neural networks (SNNs) increases, tuning the enormous number of open parameters in these models becomes a difficult challenge. SNNs have been used to successfully model complex neural circuits that explore various neural phenomena such as neural plasticity, vision systems, auditory systems, neural oscillations, and many other important topics of neural function. Additionally, SNNs are particularly well-adapted to run on neuromorphic hardware that will support biological brain-scale architectures. Although the inclusion of realistic plasticity equations, neural dynamics, and recurrent topologies has increased the descriptive power of SNNs, it has also made the task of tuning these biologically realistic SNNs difficult. To meet this challenge, we present an automated parameter tuning framework capable of tuning SNNs quickly and efficiently using evolutionary algorithms (EA) and inexpensive, readily accessible graphics processing units (GPUs). A sample SNN with 4104 neurons was tuned to give V1 simple cell-like tuning curve responses and produce self-organizing receptive fields (SORFs) when presented with a random sequence of counterphase sinusoidal grating stimuli. A performance analysis comparing the GPU-accelerated implementation to a single-threaded central processing unit (CPU) implementation was carried out and showed a speedup of 65× of the GPU implementation over the CPU implementation, or 0.35 h per generation for GPU vs. 23.5 h per generation for CPU. Additionally, the parameter value solutions found in the tuned SNN were studied and found to be stable and repeatable. The automated parameter tuning framework presented here will be of use to both the computational neuroscience and neuromorphic engineering communities, making the process of constructing and tuning large-scale SNNs much quicker and easier.
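
    The framework described above relies on an evolutionary algorithm; the bare EA loop can be sketched as below, with a stand-in fitness function replacing the GPU-accelerated SNN simulation. The population size, mutation scale and target vector are illustrative assumptions, not the authors' values.

        import numpy as np

        rng = np.random.default_rng(0)
        target = np.array([0.1, 0.5, 2.0])           # hypothetical ideal parameters

        def fitness(params):
            # Stand-in for simulating the SNN and scoring its tuning curves.
            return -np.sum((params - target) ** 2)   # higher is better

        pop = rng.uniform(0.0, 3.0, size=(40, 3))    # 40 candidate parameter sets
        for generation in range(100):
            scores = np.array([fitness(p) for p in pop])
            parents = pop[np.argsort(scores)[-10:]]  # keep the 10 fittest
            children = parents[rng.integers(0, 10, 30)] + rng.normal(0.0, 0.1, (30, 3))
            pop = np.vstack([parents, children])     # elitism plus mutated offspring

        best = pop[np.argmax([fitness(p) for p in pop])]
        print("best parameter set:", best)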

  16. Physical explosion analysis in heat exchanger network design

    Science.gov (United States)

    Pasha, M.; Zaini, D.; Shariff, A. M.

    2016-06-01

    The failure of shell-and-tube heat exchangers is extensively experienced by the chemical process industries. Such a failure can cause a loss of production for a long duration. Moreover, loss of containment through a heat exchanger could potentially lead to a credible event such as fire, explosion or toxic release. There is a need to analyse the possible worst-case effect originating from the loss of containment of a heat exchanger at the early design stage. Physical explosion analysis during heat exchanger network design is presented in this work. The Baker and Prugh explosion models are deployed for assessing the explosion effect. Microsoft Excel was integrated with the process design simulator through object linking and embedding (OLE) automation for this analysis, with Aspen HYSYS V8.0 used as the simulation platform. A typical heat exchanger network of a steam reforming and shift conversion process is presented as a case study. The analysis shows that the overpressure generated by the physical explosion of each heat exchanger can be estimated more precisely using the Prugh model. The present work could potentially assist the design engineer in identifying the critical heat exchanger in the network at the preliminary design stage.
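
    As a hedged illustration of sizing a physical (pressure-burst) explosion, the sketch below uses Brode's stored-energy expression with a TNT-equivalence conversion and a scaled distance. This is a generic textbook route, not necessarily the exact Baker or Prugh formulation applied in the paper, and the vessel data are invented.

        gamma = 1.4             # ratio of specific heats of the gas (assumption)
        p1 = 15e5               # burst pressure, Pa (hypothetical exchanger shell)
        p0 = 1.013e5            # ambient pressure, Pa
        V = 2.0                 # pressurized volume, m^3

        E = (p1 - p0) * V / (gamma - 1)     # Brode stored-energy estimate, J
        W_tnt = E / 4.68e6                  # TNT equivalent, kg (~4.68 MJ/kg TNT)
        R = 10.0                            # distance from the vessel, m
        Z = R / W_tnt ** (1 / 3)            # scaled distance, m/kg^(1/3)

        print(f"E = {E / 1e6:.2f} MJ, W_TNT = {W_tnt:.2f} kg, Z = {Z:.1f} m/kg^(1/3)")
        # The overpressure at distance R is then read from a blast chart or
        # correlation (e.g. Kingery-Bulmash) as a function of Z.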

  17. Automated analysis of slitless spectra. II. Quasars

    International Nuclear Information System (INIS)

    Edwards, G.; Beauchemin, M.; Borra, F.

    1988-01-01

    Automated software has been developed to process slitless spectra. The software, described in a previous paper, automatically separates stars from extended objects and quasars from stars. This paper describes the quasar search techniques and discusses the results. The performance of the software is compared and calibrated with a plate taken in a region of SA 57 that has been extensively surveyed by others using a variety of techniques: the proposed automated software performs very well. It is found that an eye search of the same plate is less complete than the automated search: surveys that rely on eye searches suffer from incompleteness starting at least a magnitude brighter than the plate limit. It is shown how the complete automated analysis of a plate and computer simulations are used to calibrate and understand the characteristics of the present data. 20 references

  18. Undelivered electricity as an indicator of the effects of automation in the 10 kV network PD ED Belgrade

    Directory of Open Access Journals (Sweden)

    Vrcelj Nada

    2013-01-01

    Full Text Available The paper discusses the effects of automation in the 10 kV network of PD ED Belgrade, valorized through undelivered electricity. The parts of the network for which past events could be reconstructed were considered. Undelivered electricity was calculated for the period prior to the integration of reclosers into the remote control system and for the system's trial period. A significant reduction in fault duration, and thus in undelivered electricity, in the automated network indicates an increase in the reliability level after implementation of the SCADA SN system.
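
    The indicator itself is simple to compute: undelivered electricity (energy not supplied, ENS) is the interrupted load multiplied by the outage duration, summed over faults. The sketch below uses invented loads and durations merely to show the before/after comparison the paper performs on real event records.

        def ens(faults):
            """Energy not supplied: sum of load (kW) x outage duration (h)."""
            return sum(kw * h for kw, h in faults)

        # Hypothetical fault records as (interrupted load kW, outage hours).
        faults_manual = [(800, 2.5), (500, 4.0), (1200, 1.5)]     # before automation
        faults_automated = [(800, 0.4), (500, 0.6), (1200, 0.3)]  # reclosers + SCADA

        print("ENS before automation:", ens(faults_manual), "kWh")
        print("ENS after automation :", ens(faults_automated), "kWh")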

  19. Combined process automation for large-scale EEG analysis.

    Science.gov (United States)

    Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E

    2012-01-01

    Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
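
    Two of the automated steps listed above, user-defined band waveform generation and power spectral density analysis, can be sketched with standard SciPy calls on a synthetic trace. The sampling rate, band edges and window length are placeholder choices, not the authors' parameters.

        import numpy as np
        from scipy.signal import butter, filtfilt, welch

        fs = 1000.0                                   # sampling rate, Hz (assumed)
        t = np.arange(0.0, 10.0, 1.0 / fs)
        eeg = np.sin(2 * np.pi * 6 * t) + 0.5 * np.random.randn(t.size)  # fake EEG

        def bandpass(x, lo, hi, fs, order=4):
            """Zero-phase Butterworth band-pass for a user-defined frequency band."""
            b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
            return filtfilt(b, a, x)

        theta = bandpass(eeg, 4.0, 8.0, fs)           # band-limited waveform
        f, psd = welch(eeg, fs=fs, nperseg=2048)      # power spectral density
        print("peak PSD frequency:", f[np.argmax(psd)], "Hz")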

  20. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with "direct" and "adjoint" sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs

  1. Content-specific network analysis of peer-to-peer communication in an online community for smoking cessation.

    Science.gov (United States)

    Myneni, Sahiti; Cobb, Nathan K; Cohen, Trevor

    2016-01-01

    Analysis of user interactions in online communities could improve our understanding of health-related behaviors and inform the design of technological solutions that support behavior change. However, to achieve this we would need methods that provide granular perspective, yet are scalable. In this paper, we present a methodology for high-throughput semantic and network analysis of large social media datasets, combining semi-automated text categorization with social network analytics. We apply this method to derive content-specific network visualizations of 16,492 user interactions in an online community for smoking cessation. Performance of the categorization system was reasonable (average F-measure of 0.74, with system-rater reliability approaching rater-rater reliability). The resulting semantically specific network analysis of user interactions reveals content- and behavior-specific network topologies. Implications for socio-behavioral health and wellness platforms are also discussed.
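
    A minimal sketch of the content-specific network idea, assuming invented messages and a trivial keyword rule standing in for the semi-automated text classifier: only interactions carrying a given category label contribute edges to that category's network.

        import networkx as nx

        # Hypothetical peer-to-peer messages: (sender, receiver, text).
        messages = [
            ("ann", "bob", "stay strong, you can quit"),
            ("bob", "cara", "day 3 without a cigarette"),
            ("cara", "ann", "cravings are worse at night"),
        ]

        def category(text):
            # Toy classifier; the paper uses a trained semi-automated categorizer.
            return "support" if "you can" in text or "strong" in text else "other"

        g = nx.DiGraph()
        for sender, receiver, text in messages:
            if category(text) == "support":          # content-specific filtering
                g.add_edge(sender, receiver)

        print("support-network edges:", list(g.edges))
        print("out-degree:", dict(g.out_degree))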

  2. Automated migration analysis based on cell texture: method & reliability

    Directory of Open Access Journals (Sweden)

    Chittenden Thomas W

    2005-03-01

    Full Text Available Abstract Background In this paper, we present and validate a way to measure automatically the extent of cell migration based on automated examination of a series of digital photographs. It was designed specifically to identify the impact of Second Hand Smoke (SHS) on endothelial cell migration but has broader applications. The analysis has two stages: (1) preprocessing of image texture, and (2) migration analysis. Results The output is a graphic overlay that indicates the front lines of cell migration superimposed on each original image, with automated reporting of the distance traversed vs. time. Expert comparison with manual placement of the leading edge shows complete equivalence of automated vs. manual leading-edge definition for cell migration measurement. Conclusion Our method is indistinguishable from careful manual determinations of cell front lines, with the advantages of full automation, objectivity, and speed.

  3. Location-Based Self-Adaptive Routing Algorithm for Wireless Sensor Networks in Home Automation

    Directory of Open Access Journals (Sweden)

    Hong SeungHo

    2011-01-01

    Full Text Available The use of wireless sensor networks in home automation (WSNHA) is attractive due to their characteristics of self-organization, high sensing fidelity, low cost, and potential for rapid deployment. Although the AODVjr routing algorithm in IEEE 802.15.4/ZigBee and other routing algorithms have been designed for wireless sensor networks, not all are suitable for WSNHA. In this paper, we propose a location-based self-adaptive routing algorithm for WSNHA called WSNHA-LBAR. It confines route discovery flooding to a cylindrical request zone, which reduces the routing overhead and decreases broadcast storm problems in the MAC layer. It also automatically adjusts the size of the request zone using a self-adaptive algorithm based on Bayes' theorem. This makes WSNHA-LBAR more adaptable to the changes of the network state and easier to implement. Simulation results show improved network reliability as well as reduced routing overhead.

  4. Design of Networked Home Automation System Based on μCOS-II and AMAZON

    Directory of Open Access Journals (Sweden)

    Liu Jianfeng

    2015-01-01

    Full Text Available In recent years, with the popularity of computers and smart phones and the development of intelligent buildings in the electronics industry, people's requirements for their living environment have been gradually changing, and intelligent homes have become a new focus for buyers. A networked home automation system relies on advanced network technology to connect air conditioning, lighting, security, curtains, TV, water heaters and other home subsystems into a local area network, forming a networked control system. μC/OS is a real-time operating system with free open-source code, a compact structure and a preemptive real-time kernel. In this paper, the author focuses on the design of a central home controller based on the AMAZON multimedia processor and the μC/OS-II real-time operating system, and achieves remote access and control through Ethernet.

  5. Evaluation of an automated karyotyping system for chromosome aberration analysis

    International Nuclear Information System (INIS)

    Prichard, H.M.

    1987-01-01

    Chromosome aberration analysis is a promising complement to conventional radiation dosimetry, particularly in the complex radiation fields encountered in the space environment. A recently developed automated karyotyping system was evaluated, both to determine its current capabilities and limitations and to suggest areas where future development should be emphasized. Cells exposed to radiomimetic chemicals and to photon and particulate radiation were evaluated by manual inspection and by automated karyotyping. It was demonstrated that the evaluated programs were appropriate for image digitization, storage, and transmission. However, automated and semi-automated scoring techniques must be advanced significantly if in-flight chromosome aberration analysis is to be practical. A degree of artificial intelligence may be necessary to realize this goal.

  6. Automated spectral and timing analysis of AGNs

    Science.gov (United States)

    Munz, F.; Karas, V.; Guainazzi, M.

    2006-12-01

    We have developed an autonomous script that helps the user to automate the XMM-Newton data analysis for the purposes of extensive statistical investigations. We test this approach by examining X-ray spectra of bright AGNs pre-selected from the public database. The event lists extracted in this process were studied further by constructing their energy-resolved Fourier power-spectrum density. This analysis combines energy distributions, light-curves, and their power-spectra, and it proves useful to assess the variability patterns present in the data. As another example, an automated search based on the XSPEC package was used to reveal the emission features in the 2-8 keV range.

  7. Analysis Of Packets Delay In Wireless Data Networks

    Directory of Open Access Journals (Sweden)

    Krivchenkov Aleksandr

    2015-12-01

    Full Text Available Networks with wireless links used to transmit automation-control traffic, in which packets are small and the application payload is predictable, are under consideration. An analytical model for packet delay along the propagation path through the network is proposed. Estimates are made for network architectures based on WiFi and Bluetooth wireless technologies; the physical-layer specifications 802.11 a/b/g/n and 802.15.1 are considered. Analytical and experimental results for the delivered network bandwidth under different network architectures, traffic structures and wireless technologies were compared to validate that the basic mechanisms are correctly taken into account in the model. It is shown that the basic effects are captured and that further "improvement" of the model's accuracy would yield no more than 5%. As a result, which is important for automation control applications, we have reliably established the lowest possible packet delay over a single wireless link: of the order of 0.2 ms for 802.11, and 1.25 ms for 802.15.1 when the application packet can be transferred in one data frame.
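
    A back-of-envelope check of the order of magnitude quoted for a single 802.11 link: the one-link delay is roughly the frame transmission time plus fixed MAC overheads. The timing constants below are nominal 802.11g values and the payload size is assumed, so this is a sanity check rather than the paper's model.

        payload_bits = 100 * 8      # small automation-control packet (assumed)
        mac_phy_bits = 34 * 8       # MAC header + FCS, approximate
        rate_bps = 54e6             # 802.11g data rate
        preamble = 20e-6            # PHY preamble/header duration, s
        difs, sifs = 28e-6, 10e-6   # interframe spaces, s
        ack_time = 30e-6            # ACK frame incl. preamble, approximate

        t_data = preamble + (payload_bits + mac_phy_bits) / rate_bps
        delay = difs + t_data + sifs + ack_time
        print(f"one-link delay ~ {delay * 1e6:.0f} microseconds "
              f"(same order as the 0.2 ms quoted above)")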

  8. The contaminant analysis automation robot implementation for the automated laboratory

    International Nuclear Information System (INIS)

    Younkin, J.R.; Igou, R.E.; Urenda, T.D.

    1995-01-01

    The Contaminant Analysis Automation (CAA) project defines the automated laboratory as a series of standard laboratory modules (SLM) serviced by a robotic standard support module (SSM). These SLMs are designed to allow plug-and-play integration into automated systems that perform standard analysis methods (SAM). While the SLMs are autonomous in the execution of their particular chemical processing task, the SAM concept relies on a high-level task sequence controller (TSC) to coordinate the robotic delivery of materials requisite for SLM operations, initiate an SLM operation with the chemical method dependent operating parameters, and coordinate the robotic removal of materials from the SLM when its operation is complete. A set of commands and events has been established to ready the SLMs for transport operations; the Supervisor and Subsystems (GENISAS) software governs events from the SLMs and robot. The Intelligent System Operating Environment (ISOE) enables the inter-process communications used by GENISAS. CAA selected the Hewlett-Packard Optimized Robot for Chemical Analysis (ORCA) and its associated Windows-based Methods Development Software (MDS) as the robot SSM. The MDS software is used to teach the robot each SLM position and required material port motions. To allow the TSC to command these SLM motions, a hardware and software implementation was required that allowed message passing between different operating systems. This implementation involved the use of a Virtual Memory Extended (VME) rack with a Force CPU-30 computer running VxWorks, a real-time multitasking operating system, and a RadiSys PC-compatible VME computer running MDS. A GENISAS server on the Force computer accepts a transport command from the TSC, a GENISAS supervisor, over Ethernet and notifies software on the RadiSys PC of the pending command through VMEbus shared memory. The command is then delivered to the MDS robot control software using a Windows Dynamic Data Exchange conversation.

  9. Wireless Android Based Home Automation System

    Directory of Open Access Journals (Sweden)

    Muhammad Tanveer Riaz

    2017-01-01

    Full Text Available This manuscript presents a prototype and design implementation of an advanced home automation system that uses Wi-Fi technology as the network infrastructure connecting its parts. The proposed system consists of two main components. The first is the server, which forms the system core and manages and controls the user's home; users and the system administrator can manage and control the system locally (via the local area network) or remotely (via the Internet). The second is the hardware interface module, which provides an appropriate interface to the sensors and actuators of the home automation system. Unlike most home automation systems available on the market, the proposed system is scalable in that one server can manage many hardware interface modules as long as they exist within the network coverage. The system supports a wide range of home automation devices such as appliances, power management components, and security components. The proposed system is better in terms of flexibility and scalability than the commercially available home automation systems.

  10. Calibration-measurement unit for the automation of vector network analyzer measurements

    Directory of Open Access Journals (Sweden)

    I. Rolfes

    2008-05-01

    Full Text Available With the availability of multi-port vector network analyzers, the need for automated, calibrated measurement facilities increases. In this contribution, a calibration-measurement unit is presented which realizes a repeatable automated calibration of the measurement setup as well as a user-friendly measurement of the device under test (DUT). In contrast to commercially available calibration units, which are connected to the ports of the vector network analyzer before a measurement and then removed so that the DUT can be connected, the presented calibration-measurement unit is permanently connected to the ports of the VNA for the calibration as well as for the measurement of the DUT. This helps to simplify the calibrated measurement of complex scattering parameters. Moreover, a full integration of the calibration unit into the analyzer setup becomes possible. The calibration-measurement unit is based on a multiport switch setup of e.g. electromechanical relays. Under the assumption of symmetry of a switch, the unit realizes on the one hand the connection of calibration standards like one-port reflection standards and two-port through connections between different ports, and on the other hand it enables the connection of the DUT. The calibration-measurement unit is applicable to two-port VNAs as well as multiport VNAs. For the calibration of the unit, methods with completely known calibration standards like SOLT (short, open, load, through) as well as self-calibration procedures like TMR or TLR can be applied.

  11. Automated Interpretation of Blood Culture Gram Stains by Use of a Deep Convolutional Neural Network.

    Science.gov (United States)

    Smith, Kenneth P; Kang, Anthony D; Kirby, James E

    2018-03-01

    Microscopic interpretation of stained smears is one of the most operator-dependent and time-intensive activities in the clinical microbiology laboratory. Here, we investigated application of an automated image acquisition and convolutional neural network (CNN)-based approach for automated Gram stain classification. Using an automated microscopy platform, uncoverslipped slides were scanned with a 40× dry objective, generating images of sufficient resolution for interpretation. We collected 25,488 images from positive blood culture Gram stains prepared during routine clinical workup. These images were used to generate 100,213 crops containing Gram-positive cocci in clusters, Gram-positive cocci in chains/pairs, Gram-negative rods, or background (no cells). These categories were targeted for proof-of-concept development as they are associated with the majority of bloodstream infections. Our CNN model achieved a classification accuracy of 94.9% on a test set of image crops. Receiver operating characteristic (ROC) curve analysis indicated a robust ability to differentiate between categories with an area under the curve of >0.98 for each. After training and validation, we applied the classification algorithm to new images collected from 189 whole slides without human intervention. Sensitivity and specificity were 98.4% and 75.0% for Gram-positive cocci in chains and pairs, 93.2% and 97.2% for Gram-positive cocci in clusters, and 96.3% and 98.1% for Gram-negative rods. Taken together, our data support a proof of concept for a fully automated classification methodology for blood-culture Gram stains. Importantly, the algorithm was highly adept at identifying image crops with organisms and could be used to present prescreened, classified crops to technologists to accelerate smear review. This concept could potentially be extended to all Gram stain interpretive activities in the clinical laboratory. Copyright © 2018 American Society for Microbiology.
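
    A minimal PyTorch sketch of a four-class crop classifier in the spirit of the study (Gram-positive cocci in clusters, Gram-positive cocci in chains/pairs, Gram-negative rods, background). The architecture and the 64x64 crop size are assumptions for illustration; the published network differs.

        import torch
        import torch.nn as nn

        class CropClassifier(nn.Module):
            def __init__(self, n_classes=4):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                )
                self.head = nn.Linear(32 * 16 * 16, n_classes)

            def forward(self, x):               # x: (batch, 3, 64, 64) image crops
                h = self.features(x)
                return self.head(h.flatten(1))  # logits over the 4 categories

        model = CropClassifier()
        crops = torch.randn(8, 3, 64, 64)       # stand-in for scanned Gram-stain crops
        print(model(crops).shape)               # torch.Size([8, 4])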

  12. Using deep neural networks to augment NIF post-shot analysis

    Science.gov (United States)

    Humbird, Kelli; Peterson, Luc; McClarren, Ryan; Field, John; Gaffney, Jim; Kruse, Michael; Nora, Ryan; Spears, Brian

    2017-10-01

    Post-shot analysis of National Ignition Facility (NIF) experiments is the process of determining which simulation inputs yield results consistent with experimental observations. This analysis is typically accomplished by running suites of manually adjusted simulations, or Monte Carlo sampling surrogate models that approximate the response surfaces of the physics code. These approaches are expensive and often find simulations that match only a small subset of observables simultaneously. We demonstrate an alternative method for performing post-shot analysis using inverse models, which map directly from experimental observables to simulation inputs with quantified uncertainties. The models are created using a novel machine learning algorithm which automates the construction and initialization of deep neural networks to optimize predictive accuracy. We show how these neural networks, trained on large databases of post-shot simulations, can rigorously quantify the agreement between simulation and experiment. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  13. Automated implementation of rule-based expert systems with neural networks for time-critical applications

    Science.gov (United States)

    Ramamoorthy, P. A.; Huang, Song; Govind, Girish

    1991-01-01

    In fault diagnosis, control and real-time monitoring, both timing and accuracy are critical for operators or machines to reach proper solutions or appropriate actions. Expert systems are becoming more popular in the manufacturing community for dealing with such problems. In recent years, neural networks have seen a revival and their applications have spread to many areas of science and engineering. A method of using neural networks to implement rule-based expert systems for time-critical applications is discussed here. This method can convert a given rule-based system into a neural network with fixed weights and thresholds. The rules governing the translation are presented along with some examples. We also present the results of automated machine implementation of such networks from the given rule-base. This significantly simplifies the translation from conventional rule-based systems to neural network expert systems. Results comparing the performance of the proposed approach based on neural networks vs. the classical approach are given. The possibility of very large scale integration (VLSI) realization of such neural network expert systems is also discussed.
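
    The core translation idea can be sketched directly: a rule such as IF A AND B THEN C becomes a threshold unit whose fixed weights and threshold make it fire only when all antecedents hold. The rule names below are hypothetical, not taken from the paper.

        import numpy as np

        def threshold_unit(inputs, weights, theta):
            """Fires (1) when the weighted sum of inputs reaches the threshold."""
            return int(np.dot(inputs, weights) >= theta)

        # Rule: IF high_temp AND high_pressure THEN shutdown.
        # AND of n antecedents: all weights 1, threshold n (here n = 2).
        for a in (0, 1):
            for b in (0, 1):
                c = threshold_unit([a, b], [1, 1], theta=2)
                print(f"high_temp={a} high_pressure={b} -> shutdown={c}")

        # An OR rule would use the same weights with theta = 1; a negated
        # antecedent would use weight -1 with the threshold lowered accordingly.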

  14. Automated Analysis of Corpora Callosa

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Davies, Rhodri H.

    2003-01-01

    This report describes and evaluates the steps needed to perform modern model-based interpretation of the corpus callosum in MRI. The process is discussed from the initial landmark-free contours to full-fledged statistical models based on the Active Appearance Models framework. Topics treated include landmark placement, background modelling and multi-resolution analysis. Preliminary quantitative and qualitative validation in a cross-sectional study show that fully automated analysis and segmentation of the corpus callosum are feasible.

  15. The Spiral Discovery Network as an Automated General-Purpose Optimization Tool

    Directory of Open Access Journals (Sweden)

    Adam B. Csapo

    2018-01-01

    Full Text Available The Spiral Discovery Method (SDM) was originally proposed as a cognitive artifact for dealing with black-box models that are dependent on multiple inputs with nonlinear and/or multiplicative interaction effects. Besides directly helping to identify functional patterns in such systems, SDM also simplifies their control through its characteristic spiral structure. In this paper, a neural network-based formulation of SDM is proposed together with a set of automatic update rules that makes it suitable for both semiautomated and automated forms of optimization. The behavior of the generalized SDM model, referred to as the Spiral Discovery Network (SDN), and its applicability to nondifferentiable nonconvex optimization problems are elucidated through simulation. Based on the simulation, the case is made that its applicability would be worth investigating in all areas where the default approach of gradient-based backpropagation is used today.

  16. Automated Construction of Node Software Using Attributes in a Ubiquitous Sensor Network Environment

    Science.gov (United States)

    Lee, Woojin; Kim, Juil; Kang, JangMook

    2010-01-01

    In sensor networks, nodes must often operate in a demanding environment facing restrictions such as restricted computing resources, unreliable wireless communication and power shortages. Such factors make the development of ubiquitous sensor network (USN) applications challenging. To help developers construct a large amount of node software for sensor network applications easily and rapidly, this paper proposes an approach to the automated construction of node software for USN applications using attributes. In the proposed technique, application construction proceeds by first developing a model for the sensor network and then designing node software by setting the values of the predefined attributes. After that, the sensor network model and the design of node software are verified. The final source codes of the node software are automatically generated from the sensor network model. We illustrate the efficiency of the proposed technique by using a gas/light monitoring application through a case study of a Gas and Light Monitoring System based on the Nano-Qplus operating system. We evaluate the technique using a quantitative metric—the memory size of execution code for node software. Using the proposed approach, developers are able to easily construct sensor network applications and rapidly generate a large amount of node software at a time in a ubiquitous sensor network environment. PMID:22163678

  17. Automated Construction of Node Software Using Attributes in a Ubiquitous Sensor Network Environment

    Directory of Open Access Journals (Sweden)

    JangMook Kang

    2010-09-01

    Full Text Available In sensor networks, nodes must often operate in a demanding environment facing restrictions such as restricted computing resources, unreliable wireless communication and power shortages. Such factors make the development of ubiquitous sensor network (USN) applications challenging. To help developers construct a large amount of node software for sensor network applications easily and rapidly, this paper proposes an approach to the automated construction of node software for USN applications using attributes. In the proposed technique, application construction proceeds by first developing a model for the sensor network and then designing node software by setting the values of the predefined attributes. After that, the sensor network model and the design of node software are verified. The final source codes of the node software are automatically generated from the sensor network model. We illustrate the efficiency of the proposed technique by using a gas/light monitoring application through a case study of a Gas and Light Monitoring System based on the Nano-Qplus operating system. We evaluate the technique using a quantitative metric—the memory size of execution code for node software. Using the proposed approach, developers are able to easily construct sensor network applications and rapidly generate a large amount of node software at a time in a ubiquitous sensor network environment.

  18. Automated construction of node software using attributes in a ubiquitous sensor network environment.

    Science.gov (United States)

    Lee, Woojin; Kim, Juil; Kang, JangMook

    2010-01-01

    In sensor networks, nodes must often operate in a demanding environment facing restrictions such as restricted computing resources, unreliable wireless communication and power shortages. Such factors make the development of ubiquitous sensor network (USN) applications challenging. To help developers construct a large amount of node software for sensor network applications easily and rapidly, this paper proposes an approach to the automated construction of node software for USN applications using attributes. In the proposed technique, application construction proceeds by first developing a model for the sensor network and then designing node software by setting the values of the predefined attributes. After that, the sensor network model and the design of node software are verified. The final source codes of the node software are automatically generated from the sensor network model. We illustrate the efficiency of the proposed technique by using a gas/light monitoring application through a case study of a Gas and Light Monitoring System based on the Nano-Qplus operating system. We evaluate the technique using a quantitative metric-the memory size of execution code for node software. Using the proposed approach, developers are able to easily construct sensor network applications and rapidly generate a large amount of node software at a time in a ubiquitous sensor network environment.

  19. Automated analysis of high-content microscopy data with deep learning.

    Science.gov (United States)

    Kraus, Oren Z; Grys, Ben T; Ba, Jimmy; Chong, Yolanda; Frey, Brendan J; Boone, Charles; Andrews, Brenda J

    2017-04-18

    Existing computational pipelines for quantitative analysis of high-content microscopy data rely on traditional machine learning approaches that fail to accurately classify more than a single dataset without substantial tuning and training, requiring extensive analysis. Here, we demonstrate that the application of deep learning to biological image data can overcome the pitfalls associated with conventional machine learning classifiers. Using a deep convolutional neural network (DeepLoc) to analyze yeast cell images, we show improved performance over traditional approaches in the automated classification of protein subcellular localization. We also demonstrate the ability of DeepLoc to classify highly divergent image sets, including images of pheromone-arrested cells with abnormal cellular morphology, as well as images generated in different genetic backgrounds and in different laboratories. We offer an open-source implementation that enables updating DeepLoc on new microscopy datasets. This study highlights deep learning as an important tool for the expedited analysis of high-content microscopy data. © 2017 The Authors. Published under the terms of the CC BY 4.0 license.

  20. Automating Trend Analysis for Spacecraft Constellations

    Science.gov (United States)

    Davis, George; Cooter, Miranda; Updike, Clark; Carey, Everett; Mackey, Jennifer; Rykowski, Timothy; Powers, Edward I. (Technical Monitor)

    2001-01-01

Spacecraft trend analysis is a vital mission operations function performed by satellite controllers and engineers, who perform detailed analyses of engineering telemetry data to diagnose subsystem faults and to detect trends that may potentially lead to degraded subsystem performance or failure in the future. It is this latter function that is of greatest importance, for careful trending can often predict or detect events that may lead to a spacecraft's entry into safe-hold. Early prediction and detection of such events could result in the avoidance of, or rapid return to service from, spacecraft safing, which not only results in reduced recovery costs but also in a higher overall level of service for the satellite system. Contemporary spacecraft trending activities are manually intensive and are primarily performed diagnostically after a fault occurs, rather than proactively to predict its occurrence. They also tend to rely on information systems and software that are outdated when compared to current technologies. When coupled with the fact that flight operations teams often have limited resources, proactive trending opportunities are limited, and detailed trend analysis is often reserved for critical responses to safe holds or other on-orbit events such as maneuvers. While the contemporary trend analysis approach has sufficed for current single-spacecraft operations, it will be unfeasible for NASA's planned and proposed space science constellations. Missions such as the Dynamics, Reconnection and Configuration Observatory (DRACO), for example, are planning to launch as many as 100 'nanospacecraft' to form a homogenous constellation. A simple extrapolation of resources and manpower based on single-spacecraft operations suggests that trending for such a large spacecraft fleet will be unmanageable, unwieldy, and cost-prohibitive. It is therefore imperative that an approach to automating the spacecraft trend analysis function be studied, developed, and applied to

  1. Automated x-ray fluorescence analysis

    International Nuclear Information System (INIS)

    O'Connell, A.M.

    1977-01-01

    A fully automated x-ray fluorescence analytical system is described. The hardware is based on a Philips PW1220 sequential x-ray spectrometer. Software for on-line analysis of a wide range of sample types has been developed for the Hewlett-Packard 9810A programmable calculator. Routines to test the system hardware are also described. (Author)

  2. Computer-automated neutron activation analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Garcia, S.R.

    1983-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. 5 references

  3. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Havlůj, F.; Hejzlar, J.; Vočka, R.

    2013-01-01

Performing core physics calculations for the sake of reload safety analysis is a very demanding and time consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces the user's task to defining fuel pin types, enrichments, assembly maps and operational parameters, all through a user-friendly GUI. The second part of the reload safety analysis calculations is done by CycleKit, a code which is linked with our core physics code ANDREA. Through CycleKit, large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable html format. Using this set of tools for reload safety analysis simplifies

  4. Automated quantitative cytological analysis using portable microfluidic microscopy.

    Science.gov (United States)

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Automated radial basis function neural network based image classification system for diabetic retinopathy detection in retinal images

    Science.gov (United States)

    Anitha, J.; Vijila, C. Kezi Selva; Hemanth, D. Jude

    2010-02-01

Diabetic retinopathy (DR) is a chronic eye disease for which early detection is essential to avoid fatal results. Image processing of retinal images emerges as a feasible tool for this early diagnosis. Digital image processing techniques involve image classification, a significant technique for detecting abnormalities in the eye. Various automated classification systems have been developed in recent years, but most of them lack high classification accuracy. Artificial neural networks are the widely preferred artificial intelligence technique, since they yield superior results in terms of classification accuracy. In this work, a radial basis function (RBF) neural network based bi-level classification system is proposed to differentiate abnormal DR images from normal retinal images. The results are analyzed in terms of classification accuracy, sensitivity and specificity. A comparative analysis is performed with the results of a probabilistic classifier, namely the Bayesian classifier, to show the superior nature of the neural classifier. Experimental results are promising for the neural classifier in terms of the performance measures.

  6. Automated collection and dissemination of ionospheric data from the digisonde network

    Directory of Open Access Journals (Sweden)

    B.W. Reinisch

    2004-01-01

    Full Text Available The growing demand for fast access to accurate ionospheric electron density profiles and ionospheric characteristics calls for efficient dissemination of data from the many ionosondes operating around the globe. The global digisonde network with over 70 stations takes advantage of the Internet to make many of these sounders remotely accessible for data transfer and control. Key elements of the digisonde system data management are the visualization and editing tool SAO Explorer, the digital ionogram database DIDBase, holding raw and derived digisonde data under an industrial-strength database management system, and the automated data request execution system ADRES.

  7. Automated Identification of Core Regulatory Genes in Human Gene Regulatory Networks.

    Directory of Open Access Journals (Sweden)

    Vipin Narang

Full Text Available Human gene regulatory networks (GRN) can be difficult to interpret due to a tangle of edges interconnecting thousands of genes. We constructed a general human GRN from extensive transcription factor and microRNA target data obtained from public databases. In a subnetwork of this GRN that is active during estrogen stimulation of MCF-7 breast cancer cells, we benchmarked automated algorithms for identifying core regulatory genes (transcription factors and microRNAs). Among these algorithms, we identified the K-core decomposition, pagerank and betweenness centrality algorithms as the most effective for discovering core regulatory genes in the network, evaluated based on previously known roles of these genes in MCF-7 biology as well as on their ability to explain the up or down expression status of up to 70% of the remaining genes. Finally, we validated the use of the K-core algorithm for organizing the GRN in an easier-to-interpret layered hierarchy, where more influential regulatory genes percolate towards the inner layers. The integrated human gene and miRNA network and software used in this study are provided as supplementary materials (S1 Data) accompanying this manuscript.
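
    The three algorithms singled out above are available in standard graph libraries. A minimal sketch with networkx on a toy directed regulatory network (the edge list is an illustrative stand-in, not the paper's MCF-7 subnetwork):

        # K-core, pagerank and betweenness centrality on a toy directed GRN.
        import networkx as nx

        G = nx.DiGraph([("ESR1", "MYC"), ("ESR1", "GREB1"), ("MYC", "CCND1"),
                        ("miR-21", "PTEN"), ("MYC", "miR-21"), ("CCND1", "GREB1")])

        pagerank = nx.pagerank(G)
        betweenness = nx.betweenness_centrality(G)
        core_number = nx.core_number(G)   # K-core layer of each gene

        for gene in G:
            print(gene, round(pagerank[gene], 3),
                  round(betweenness[gene], 3), core_number[gene])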

  8. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments

    Science.gov (United States)

    Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein

    2014-01-01

Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance. PMID:24847184

  9. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments.

    Science.gov (United States)

    Bass, Ellen J; Baumgart, Leigh A; Shepley, Kathryn Klein

    2013-03-01

Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance.

  10. Flow injection analysis: Emerging tool for laboratory automation in radiochemistry

    International Nuclear Information System (INIS)

    Egorov, O.; Ruzicka, J.; Grate, J.W.; Janata, J.

    1996-01-01

Automation of routine and serial assays is common practice in the modern analytical laboratory, while it is virtually nonexistent in the field of radiochemistry. Flow injection analysis (FIA) is a general solution handling methodology that has been extensively used for automation of routine assays in many areas of analytical chemistry. Reproducible automated solution handling and on-line separation capabilities are among several distinctive features that make FI a very promising, yet underutilized, tool for automation in analytical radiochemistry. The potential of the technique is demonstrated through the development of an automated 90Sr analyzer and its application in the analysis of tank waste samples from the Hanford site. Sequential injection (SI), the latest generation of FIA, is used to rapidly separate 90Sr from interfering radionuclides and deliver the separated Sr zone to a flow-through liquid scintillation detector. The separation is performed on a mini column containing Sr-specific sorbent extraction material, which selectively retains Sr under acidic conditions. The 90Sr is eluted with water, mixed with scintillation cocktail, and sent through the flow cell of a flow-through counter, where 90Sr radioactivity is detected as a transient signal. Both peak area and peak height can be used for quantification of sample radioactivity. Alternatively, stopped-flow detection can be performed to improve detection precision for low-activity samples. The authors' current research activities are focused on expanding the radiochemical applications of the FIA methodology, with the ultimate goal of creating a set of automated methods that will cover the basic needs of radiochemical analysis at the Hanford site. The results of preliminary experiments indicate that FIA is a highly suitable technique for the automation of chemically more challenging separations, such as the separation of actinide elements

  11. Cybersecurity and Network Forensics: Analysis of Malicious Traffic towards a Honeynet with Deep Packet Inspection

    Directory of Open Access Journals (Sweden)

    Gabriel Arquelau Pimenta Rodrigues

    2017-10-01

Full Text Available Any network connected to the Internet is subject to cyber attacks. Strong security measures, forensic tools, and investigators together contribute to detecting and mitigating those attacks, reducing the damage and enabling the network to be re-established to its normal operation, thus increasing the cybersecurity of the networked environment. This paper addresses the use of a forensic approach with Deep Packet Inspection to detect anomalies in network traffic. As cyber attacks may occur on any layer of the TCP/IP networking model, Deep Packet Inspection is an effective way to reveal suspicious content in the headers or the payloads in any packet processing layer, excepting of course situations where the payload is encrypted. Although efficient, this technique still faces big challenges. The contributions of this paper rely on the association of Deep Packet Inspection with forensics analysis to evaluate different attacks towards a Honeynet operating in a network laboratory at the University of Brasilia. In this perspective, this work could identify and map the content and behavior of attacks such as the Mirai botnet and brute-force attacks targeting various network services. The obtained results demonstrate the behavior of automated attacks (such as worms and bots) and non-automated attacks (brute-force) conducted with different tools. The data collected and analyzed are then used to generate statistics of used usernames and passwords, IP and service distributions, among other elements. This paper also discusses the importance of network forensics and Chain of Custody procedures in conducting investigations, and shows the effectiveness of the mentioned techniques in evaluating different attacks in networks.
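
    A toy sketch of the deep-packet-inspection idea: scan captured payloads for byte signatures and tally brute-forced usernames. The signatures and log format below are hypothetical illustrations, not the study's Honeynet data.

        # Toy DPI pass: flag Mirai-style signatures and count brute-forced
        # usernames (hypothetical signatures and payloads).
        import re
        from collections import Counter

        SIGNATURES = {b"/bin/busybox": "botnet-like shell probe",
                      b"admin": "default-credential attempt"}

        def inspect(payload: bytes):
            return [label for sig, label in SIGNATURES.items() if sig in payload]

        payloads = [b"USER admin PASS admin", b"GET / HTTP/1.1",
                    b"enable system shell /bin/busybox MIRAI"]
        hits = Counter(label for p in payloads for label in inspect(p))

        users = Counter(m.group(1).decode()
                        for p in payloads
                        for m in re.finditer(rb"USER (\w+)", p))
        print(hits.most_common(), users.most_common())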

  12. Automated image analysis in the study of collagenous colitis

    DEFF Research Database (Denmark)

    Fiehn, Anne-Marie Kanstrup; Kristensson, Martin; Engel, Ulla

    2016-01-01

PURPOSE: The aim of this study was to develop automated image analysis software to measure the thickness of the subepithelial collagenous band in colon biopsies with collagenous colitis (CC) and incomplete CC (CCi). The software measures the thickness of the collagenous band on microscopic slides stained with Van Gieson (VG). PATIENTS AND METHODS: A training set consisting of ten biopsies diagnosed as CC, CCi, and normal colon mucosa was used to develop the automated image analysis (VG app) to match the assessment by a pathologist. The study set consisted of biopsies from 75 patients

  13. Advances in Computer, Communication, Control and Automation

    CERN Document Server

2011 International Conference on Computer, Communication, Control and Automation

    2012-01-01

The volume includes a set of selected papers extended and revised from the 2011 International Conference on Computer, Communication, Control and Automation (3CA 2011), held in Zhuhai, China, November 19-20, 2011. Topics covered include signal and image processing, speech and audio processing, video processing and analysis, artificial intelligence, computing and intelligent systems, machine learning, sensor and neural networks, knowledge discovery and data mining, fuzzy mathematics and applications, knowledge-based systems, hybrid systems modeling and design, risk analysis and management, and system modeling and simulation. We hope that researchers, graduate students and other interested readers benefit scientifically from the proceedings and also find it stimulating in the process.

  14. Automated bony region identification using artificial neural networks: reliability and validation measurements

    International Nuclear Information System (INIS)

    Gassman, Esther E.; Kallemeyn, Nicole A.; DeVries, Nicole A.; Shivanna, Kiran H.; Powell, Stephanie M.; Magnotta, Vincent A.; Ramme, Austin J.; Adams, Brian D.; Grosland, Nicole M.

    2008-01-01

The objective was to develop tools for automating the identification of bony structures, to assess the reliability of this technique against manual raters, and to validate the resulting regions of interest against physical surface scans obtained from the same specimen. Artificial intelligence-based algorithms have been used for image segmentation, specifically artificial neural networks (ANNs). For this study, an ANN was created and trained to identify the phalanges of the human hand. The relative overlap between the ANN and a manual tracer was 0.87, 0.82, and 0.76 for the proximal, middle, and distal index phalanx bones, respectively. Compared with the physical surface scans, the ANN-generated surface representations differed on average by 0.35 mm, 0.29 mm, and 0.40 mm for the proximal, middle, and distal phalanges, respectively. Furthermore, the ANN proved to segment the structures in less than one-tenth of the time required by a manual rater. The ANN has proven to be a reliable and valid means of segmenting the phalanx bones from CT images. Employing automated methods such as the ANN for segmentation eliminates the likelihood of rater drift and inter-rater variability. Automated methods also decrease the amount of time and manual effort required to extract the data of interest, thereby making patient-specific modeling feasible. (orig.)
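
    A minimal sketch of the reported overlap measurement, here computed as intersection over union of binary masks; the paper's exact definition of relative overlap may differ.

        # Relative overlap between an automated and a manual segmentation
        # (defined here as intersection over union; illustrative masks).
        import numpy as np

        def relative_overlap(auto_mask: np.ndarray, manual_mask: np.ndarray) -> float:
            a, m = auto_mask.astype(bool), manual_mask.astype(bool)
            union = np.logical_or(a, m).sum()
            return np.logical_and(a, m).sum() / union if union else 1.0

        auto = np.zeros((64, 64), bool); auto[10:40, 10:40] = True
        manual = np.zeros((64, 64), bool); manual[12:42, 12:42] = True
        print(f"relative overlap: {relative_overlap(auto, manual):.2f}")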

  15. Automated bony region identification using artificial neural networks: reliability and validation measurements

    Energy Technology Data Exchange (ETDEWEB)

    Gassman, Esther E.; Kallemeyn, Nicole A.; DeVries, Nicole A.; Shivanna, Kiran H. [The University of Iowa, Department of Biomedical Engineering, Seamans Center for the Engineering Arts and Sciences, Iowa City, IA (United States); The University of Iowa, Center for Computer-Aided Design, Iowa City, IA (United States); Powell, Stephanie M. [The University of Iowa, Department of Biomedical Engineering, Seamans Center for the Engineering Arts and Sciences, Iowa City, IA (United States); University of Iowa Hospitals and Clinics, The University of Iowa, Department of Radiology, Iowa City, IA (United States); Magnotta, Vincent A. [The University of Iowa, Department of Biomedical Engineering, Seamans Center for the Engineering Arts and Sciences, Iowa City, IA (United States); The University of Iowa, Center for Computer-Aided Design, Iowa City, IA (United States); University of Iowa Hospitals and Clinics, The University of Iowa, Department of Radiology, Iowa City, IA (United States); Ramme, Austin J. [University of Iowa Hospitals and Clinics, The University of Iowa, Department of Radiology, Iowa City, IA (United States); Adams, Brian D. [The University of Iowa, Department of Biomedical Engineering, Seamans Center for the Engineering Arts and Sciences, Iowa City, IA (United States); University of Iowa Hospitals and Clinics, The University of Iowa, Department of Orthopaedics and Rehabilitation, Iowa City, IA (United States); Grosland, Nicole M. [The University of Iowa, Department of Biomedical Engineering, Seamans Center for the Engineering Arts and Sciences, Iowa City, IA (United States); University of Iowa Hospitals and Clinics, The University of Iowa, Department of Orthopaedics and Rehabilitation, Iowa City, IA (United States); The University of Iowa, Center for Computer-Aided Design, Iowa City, IA (United States)

    2008-04-15

The objective was to develop tools for automating the identification of bony structures, to assess the reliability of this technique against manual raters, and to validate the resulting regions of interest against physical surface scans obtained from the same specimen. Artificial intelligence-based algorithms have been used for image segmentation, specifically artificial neural networks (ANNs). For this study, an ANN was created and trained to identify the phalanges of the human hand. The relative overlap between the ANN and a manual tracer was 0.87, 0.82, and 0.76 for the proximal, middle, and distal index phalanx bones, respectively. Compared with the physical surface scans, the ANN-generated surface representations differed on average by 0.35 mm, 0.29 mm, and 0.40 mm for the proximal, middle, and distal phalanges, respectively. Furthermore, the ANN proved to segment the structures in less than one-tenth of the time required by a manual rater. The ANN has proven to be a reliable and valid means of segmenting the phalanx bones from CT images. Employing automated methods such as the ANN for segmentation eliminates the likelihood of rater drift and inter-rater variability. Automated methods also decrease the amount of time and manual effort required to extract the data of interest, thereby making patient-specific modeling feasible. (orig.)

  16. ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wieselquist, William A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Thompson, Adam B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bowman, Stephen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Peterson, Joshua L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.
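
    A minimal sketch of the bookkeeping such a tool automates: stamping one input file per assembly from tracked history data. The input keywords below are hypothetical placeholders, not actual ORIGAMI syntax; consult the SCALE documentation for the real format.

        # Generate one (hypothetical) depletion input file per assembly
        # from tracked operating-history records.
        from pathlib import Path

        assemblies = [
            {"id": "A01", "enrich": 4.2, "burnup": 41.3, "pool_years": 7.5},
            {"id": "B07", "enrich": 3.6, "burnup": 35.8, "pool_years": 12.0},
        ]

        outdir = Path("origami_inputs"); outdir.mkdir(exist_ok=True)
        for a in assemblies:
            text = (f"=origami\n  asm={a['id']} enrich={a['enrich']}\n"
                    f"  burnup={a['burnup']} cool={a['pool_years']}y\nend\n")
            (outdir / f"{a['id']}.inp").write_text(text)
        print(f"wrote {len(assemblies)} input files to {outdir}/")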

  17. Multichannel Mars Organic Analyzer (McMOA): Microfluidic Networks for the Automated In Situ Microchip Electrophoretic Analysis of Organic Biomarkers on Mars

    Science.gov (United States)

    Chiesl, T. N.; Benhabib, M.; Stockton, A. M.; Mathies, R. A.

    2010-04-01

We present the Multichannel Mars Organic Analyzer (McMOA) for the analysis of amino acids, PAHs, and oxidized carbon. Microfluidic architectures integrating automated metering, mixing, on-chip reactions, and serial dilutions are also discussed.

  18. Automated analysis of objective-prism spectra

    International Nuclear Information System (INIS)

    Hewett, P.C.; Irwin, M.J.; Bunclark, P.; Bridgeland, M.T.; Kibblewhite, E.J.; Smith, M.G.

    1985-01-01

A fully automated system for the location, measurement and analysis of large numbers of low-resolution objective-prism spectra is described. The system is based on the APM facility at the University of Cambridge, and allows processing of objective-prism, grens or grism data. Particular emphasis is placed on techniques to obtain the maximum signal-to-noise ratio from the data, both in the initial spectral estimation procedure and for subsequent feature identification. Comparison of a high-quality visual catalogue of faint quasar candidates with an equivalent automated sample demonstrates the ability of the APM system to identify all the visually selected quasar candidates. In addition, a large population of new, faint (m_J ≈ 20) candidates is identified. (author)

  19. A catalog of automated analysis methods for enterprise models.

    Science.gov (United States)

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

Enterprise models are created for documenting and communicating the structure and state of the Business and Information Technology elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills; due to the size and complexity of the models, this process can be complicated, and omissions or miscalculations are very likely. This situation has fostered research into automated analysis methods for supporting analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels, so some analysis methods might not be applicable to all enterprise models. This paper presents the work of compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool.

  20. Automated generation of node-splitting models for assessment of inconsistency in network meta-analysis.

    Science.gov (United States)

    van Valkenhoef, Gert; Dias, Sofia; Ades, A E; Welton, Nicky J

    2016-03-01

    Network meta-analysis enables the simultaneous synthesis of a network of clinical trials comparing any number of treatments. Potential inconsistencies between estimates of relative treatment effects are an important concern, and several methods to detect inconsistency have been proposed. This paper is concerned with the node-splitting approach, which is particularly attractive because of its straightforward interpretation, contrasting estimates from both direct and indirect evidence. However, node-splitting analyses are labour-intensive because each comparison of interest requires a separate model. It would be advantageous if node-splitting models could be estimated automatically for all comparisons of interest. We present an unambiguous decision rule to choose which comparisons to split, and prove that it selects only comparisons in potentially inconsistent loops in the network, and that all potentially inconsistent loops in the network are investigated. Moreover, the decision rule circumvents problems with the parameterisation of multi-arm trials, ensuring that model generation is trivial in all cases. Thus, our methods eliminate most of the manual work involved in using the node-splitting approach, enabling the analyst to focus on interpreting the results. © 2015 The Authors Research Synthesis Methods Published by John Wiley & Sons Ltd.
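
    A simplified version of the selection idea can be sketched with networkx: a direct comparison is worth splitting only if indirect evidence also exists, i.e. the two treatments remain connected after all direct edges between them are removed. The published rule additionally handles multi-arm trial parameterisation, which is omitted in this illustration.

        # Select comparisons lying on a loop of the evidence network
        # (toy trial network; each edge is one trial comparing two treatments).
        import networkx as nx

        evidence = nx.MultiGraph()
        evidence.add_edges_from([("A", "B"), ("B", "C"), ("A", "C"), ("C", "D")])

        def comparisons_to_split(G):
            """Return treatment pairs with both direct and indirect evidence."""
            out = []
            for u, v in {tuple(sorted(e)) for e in G.edges()}:
                H = G.copy()
                H.remove_edges_from([(u, v)] * H.number_of_edges(u, v))
                if nx.has_path(H, u, v):      # an indirect route remains
                    out.append((u, v))
            return sorted(out)

        print(comparisons_to_split(evidence))  # [('A','B'), ('A','C'), ('B','C')]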

  1. Automated Detection of Clinically Significant Prostate Cancer in mp-MRI Images Based on an End-to-End Deep Neural Network.

    Science.gov (United States)

    Wang, Zhiwei; Liu, Chaoyue; Cheng, Danpeng; Wang, Liang; Yang, Xin; Cheng, Kwang-Ting

    2018-05-01

    Automated methods for detecting clinically significant (CS) prostate cancer (PCa) in multi-parameter magnetic resonance images (mp-MRI) are of high demand. Existing methods typically employ several separate steps, each of which is optimized individually without considering the error tolerance of other steps. As a result, they could either involve unnecessary computational cost or suffer from errors accumulated over steps. In this paper, we present an automated CS PCa detection system, where all steps are optimized jointly in an end-to-end trainable deep neural network. The proposed neural network consists of concatenated subnets: 1) a novel tissue deformation network (TDN) for automated prostate detection and multimodal registration and 2) a dual-path convolutional neural network (CNN) for CS PCa detection. Three types of loss functions, i.e., classification loss, inconsistency loss, and overlap loss, are employed for optimizing all parameters of the proposed TDN and CNN. In the training phase, the two nets mutually affect each other and effectively guide registration and extraction of representative CS PCa-relevant features to achieve results with sufficient accuracy. The entire network is trained in a weakly supervised manner by providing only image-level annotations (i.e., presence/absence of PCa) without exact priors of lesions' locations. Compared with most existing systems which require supervised labels, e.g., manual delineation of PCa lesions, it is much more convenient for clinical usage. Comprehensive evaluation based on fivefold cross validation using 360 patient data demonstrates that our system achieves a high accuracy for CS PCa detection, i.e., a sensitivity of 0.6374 and 0.8978 at 0.1 and 1 false positives per normal/benign patient.

  2. Analysis of Trinity Power Metrics for Automated Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Michalenko, Ashley Christine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-28

This is a presentation from Los Alamos National Laboratory (LANL) about the analysis of Trinity power metrics for automated monitoring. The following topics are covered: current monitoring efforts, motivation for the analysis, tools used, the methodology, work performed during the summer, and planned future work.

  3. Automated water analyser computer supported system (AWACSS) Part I: Project objectives, basic technology, immunoassay development, software design and networking.

    Science.gov (United States)

    Tschmelak, Jens; Proll, Guenther; Riedt, Johannes; Kaiser, Joachim; Kraemmer, Peter; Bárzaga, Luis; Wilkinson, James S; Hua, Ping; Hole, J Patrick; Nudd, Richard; Jackson, Michael; Abuknesha, Ram; Barceló, Damià; Rodriguez-Mozaz, Sara; de Alda, Maria J López; Sacher, Frank; Stien, Jan; Slobodník, Jaroslav; Oswald, Peter; Kozmenko, Helena; Korenková, Eva; Tóthová, Lívia; Krascsenits, Zoltan; Gauglitz, Guenter

    2005-02-15

A novel analytical system AWACSS (automated water analyser computer-supported system) based on immunochemical technology has been developed that can measure several organic pollutants at the low nanogram per litre level in a single few-minutes analysis without any prior sample pre-concentration or pre-treatment steps. Having in mind the actual needs of water-sector managers related to the implementation of the Drinking Water Directive (DWD) (98/83/EC, 1998) and the Water Framework Directive (WFD) (2000/60/EC, 2000), drinking, ground, surface, and waste waters were the major media used for the evaluation of the system performance. The instrument was equipped with remote control and surveillance facilities. The system's software allows for internet-based networking between the measurement and control stations, global management, trend analysis, and early-warning applications. The experience of water laboratories was utilised in the design of the instrument's hardware and software in order to make the system rugged and user-friendly. Several market surveys were conducted during the project to assess the applicability of the final system. A web-based AWACSS database was created for automated evaluation and storage of the obtained data in a format compatible with major databases of environmental organic pollutants in Europe. This first article gives the reader an overview of the aims and scope of the AWACSS project, as well as details about the basic technology, immunoassays, software, and networking developed and utilised within the research project. The second article reports on the system performance, first real-sample measurements, and an international collaborative trial (inter-laboratory tests) comparing the biosensor with conventional analytical methods.

  4. Understanding the context of network traffic alerts

    NARCIS (Netherlands)

    Cappers, B.C.M.; van Wijk, J.J.; Best, D.M.; Staheli, D.; Prigent, N.; Engle, S.; Harrison, L.

    2016-01-01

    For the protection of critical infrastructures against complex virus attacks, automated network traffic analysis and deep packet inspection are unavoidable. However, even with the use of network intrusion detection systems, the number of alerts is still too large to analyze manually. In addition,

  5. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis

    Directory of Open Access Journals (Sweden)

    Qu Lijia

    2009-03-01

Full Text Available Abstract Background Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. Results In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses, including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Conclusion Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases

  6. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis.

    Science.gov (United States)

    Wang, Tao; Shao, Kang; Chu, Qinying; Ren, Yanfei; Mu, Yiming; Qu, Lijia; He, Jie; Jin, Changwen; Xia, Bin

    2009-03-16

    Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases. Moreover, with its open source architecture, interested
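
    A minimal sketch of the kind of statistical pipeline such a platform bundles, using scikit-learn on synthetic binned 1D-NMR intensities (PCA reduction, PLS-DA and KNN); this is an illustration, not the Automics implementation itself.

        # PCA + PLS-DA + KNN on synthetic binned NMR spectra.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.decomposition import PCA
        from sklearn.neighbors import KNeighborsClassifier

        rng = np.random.default_rng(0)
        X = rng.normal(size=(40, 200))            # 40 spectra x 200 bins
        y = np.repeat([0, 1], 20)                 # control vs. type 2 diabetes
        X[y == 1, :10] += 1.0                     # class-separating bins

        scores = PCA(n_components=5).fit_transform(X)
        plsda = PLSRegression(n_components=2).fit(X, y)
        knn = KNeighborsClassifier(n_neighbors=3).fit(scores, y)
        print("PLS-DA R^2:", round(plsda.score(X, y), 2),
              "| KNN accuracy:", knn.score(scores, y))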

  7. Automated Asteroseismic Analysis of Solar-type Stars

    DEFF Research Database (Denmark)

    Karoff, Christoffer; Campante, T.L.; Chaplin, W.J.

    2010-01-01

The rapidly increasing volume of asteroseismic observations on solar-type stars has revealed a need for automated analysis tools. The reason for this is not only that individual analyses of single stars are rather time consuming, but more importantly that these large volumes of observations open...... are calculated in a consistent way. Here we present a set of automated asteroseismic analysis tools. The main engine of this set of tools is an algorithm for modelling the autocovariance spectra of the stellar acoustic spectra, allowing us to measure not only the frequency of maximum power and the large......, radius, luminosity, effective temperature, surface gravity and age based on grid modeling. All the tools take into account the window function of the observations, which means that they work equally well for space-based photometry observations from e.g. the NASA Kepler satellite and ground-based velocity
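
    The core of an autocovariance approach can be illustrated in a few lines: the autocorrelation of a solar-like power spectrum peaks at the large frequency separation. The comb spectrum below is synthetic, and real pipelines additionally model the observational window function.

        # Recover the large frequency separation from the autocorrelation
        # of a synthetic solar-like p-mode power spectrum (1 muHz bins).
        import numpy as np

        dnu_true = 135.0                                   # muHz, solar-like
        freq = np.arange(0.0, 5000.0, 1.0)
        power = np.exp(-0.5 * ((freq - 3050) / 400) ** 2)  # p-mode envelope
        power *= 1 + np.cos(2 * np.pi * freq / dnu_true)   # mode comb

        centered = power - power.mean()
        acf = np.correlate(centered, centered, "full")[centered.size - 1:]
        lag = np.argmax(acf[50:500]) + 50                  # skip the zero-lag peak
        print(f"recovered large separation ~ {lag} muHz")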

  8. Troubleshooting of Software Defined Networks

    OpenAIRE

    Sistek, Haris

    2015-01-01

Network troubleshooting is a field where automation is sorely needed. While the network has grown in many other ways since the 1960s, the tools we use to troubleshoot and manage it have stayed very much the same. Could we use the programmability of SDN to automate this task? In this thesis work, we developed a prototype that systematically troubleshoots the network with automation. The prototype automatically captures network behaviour, matches it against a network policy and creates its ...

  9. The effects of local street network characteristics on the positional accuracy of automated geocoding for geographic health studies

    Directory of Open Access Journals (Sweden)

    Zimmerman Dale L

    2010-02-01

    Full Text Available Abstract Background Automated geocoding of patient addresses for the purpose of conducting spatial epidemiologic studies results in positional errors. It is well documented that errors tend to be larger in rural areas than in cities, but possible effects of local characteristics of the street network, such as street intersection density and street length, on errors have not yet been documented. Our study quantifies effects of these local street network characteristics on the means and the entire probability distributions of positional errors, using regression methods and tolerance intervals/regions, for more than 6000 geocoded patient addresses from an Iowa county. Results Positional errors were determined for 6376 addresses in Carroll County, Iowa, as the vector difference between each 100%-matched automated geocode and its ground-truthed location. Mean positional error magnitude was inversely related to proximate street intersection density. This effect was statistically significant for both rural and municipal addresses, but more so for the former. Also, the effect of street segment length on geocoding accuracy was statistically significant for municipal, but not rural, addresses; for municipal addresses mean error magnitude increased with length. Conclusion Local street network characteristics may have statistically significant effects on geocoding accuracy in some places, but not others. Even in those locales where their effects are statistically significant, street network characteristics may explain a relatively small portion of the variability among geocoding errors. It appears that additional factors besides rurality and local street network characteristics affect accuracy in general.
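
    A minimal sketch of the error computation and regression described above, on synthetic coordinates: the positional error is the vector difference between geocode and ground truth, and its magnitude is regressed on local intersection density.

        # Positional error magnitudes vs. street-intersection density
        # (synthetic data standing in for the geocoded addresses).
        import numpy as np

        rng = np.random.default_rng(1)
        n = 500
        density = rng.uniform(1, 30, n)                  # intersections per km^2
        truth = rng.uniform(0, 1000, (n, 2))             # ground truth (m)
        geocode = truth + rng.normal(0, 200 / density[:, None], (n, 2))

        error = np.linalg.norm(geocode - truth, axis=1)  # error magnitude (m)
        slope, intercept = np.polyfit(density, error, 1)
        print(f"mean error {error.mean():.1f} m; "
              f"slope {slope:.2f} m per unit density")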

  10. Automated Image Analysis Corrosion Working Group Update: February 1, 2018

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, James G. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-01

These are slides for the automated image analysis corrosion working group update. The overall goals were to automate the detection and quantification of features in images (faster, more accurate) and to establish how to do this (obtain data, analyze data), with a focus on Laser Scanning Confocal Microscope (LCM) data (laser intensity, laser height/depth, optical RGB, optical plus laser RGB).

  11. Automated Microfluidic Platform for Serial Polymerase Chain Reaction and High-Resolution Melting Analysis.

    Science.gov (United States)

    Cao, Weidong; Bean, Brian; Corey, Scott; Coursey, Johnathan S; Hasson, Kenton C; Inoue, Hiroshi; Isano, Taisuke; Kanderian, Sami; Lane, Ben; Liang, Hongye; Murphy, Brian; Owen, Greg; Shinoda, Nobuhiko; Zeng, Shulin; Knight, Ivor T

    2016-06-01

    We report the development of an automated genetic analyzer for human sample testing based on microfluidic rapid polymerase chain reaction (PCR) with high-resolution melting analysis (HRMA). The integrated DNA microfluidic cartridge was used on a platform designed with a robotic pipettor system that works by sequentially picking up different test solutions from a 384-well plate, mixing them in the tips, and delivering mixed fluids to the DNA cartridge. A novel image feedback flow control system based on a Canon 5D Mark II digital camera was developed for controlling fluid movement through a complex microfluidic branching network without the use of valves. The same camera was used for measuring the high-resolution melt curve of DNA amplicons that were generated in the microfluidic chip. Owing to fast heating and cooling as well as sensitive temperature measurement in the microfluidic channels, the time frame for PCR and HRMA was dramatically reduced from hours to minutes. Preliminary testing results demonstrated that rapid serial PCR and HRMA are possible while still achieving high data quality that is suitable for human sample testing. © 2015 Society for Laboratory Automation and Screening.
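
    A minimal sketch of the melt-curve step of HRMA: the melting temperature Tm is located at the peak of the negative derivative -dF/dT of fluorescence versus temperature. The sigmoid melt data below are synthetic; camera calibration and curve normalisation are omitted.

        # Locate Tm as the peak of -dF/dT on a synthetic melt curve.
        import numpy as np

        temp = np.linspace(70.0, 95.0, 251)                   # degrees C
        tm_true = 84.3
        fluor = 1.0 / (1.0 + np.exp((temp - tm_true) / 0.6))  # melt transition
        fluor += np.random.default_rng(2).normal(0, 0.002, temp.size)

        melt_curve = -np.gradient(fluor, temp)                # -dF/dT
        tm_est = temp[np.argmax(melt_curve)]
        print(f"estimated Tm = {tm_est:.1f} C")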

  12. Unified Tractable Model for Large-Scale Networks Using Stochastic Geometry: Analysis and Design

    KAUST Repository

    Afify, Laila H.

    2016-12-01

The ever-growing demands for wireless technologies necessitate the evolution of next-generation wireless networks that fulfill the diverse wireless users' requirements. However, upscaling existing wireless networks implies upscaling an intrinsic component of the wireless domain: the aggregate network interference. Being the main performance-limiting factor, it becomes crucial to develop a rigorous analytical framework to accurately characterize the out-of-cell interference, to reap the benefits of emerging networks. Due to the different network setups and key performance indicators, it is essential to conduct a comprehensive study that unifies the various network configurations together with the different tangible performance metrics. In that regard, the focus of this thesis is to present a unified mathematical paradigm, based on Stochastic Geometry, for large-scale networks with different antenna/network configurations. By exploiting such a unified study, we propose an efficient automated network design strategy to satisfy the desired network objectives. First, this thesis studies the exact aggregate network interference characterization, by accounting for each of the interferers' signals in the large-scale network. Second, we show that the information about the interferers' symbols can be approximated via the Gaussian signaling approach. The developed mathematical model presents a twofold analysis unification for the uplink and downlink cellular network literature. It aligns the tangible decoding error probability analysis with the abstract outage probability and ergodic rate analysis. Furthermore, it unifies the analysis for different antenna configurations, i.e., various multiple-input multiple-output (MIMO) systems. Accordingly, we propose a novel reliable network design strategy that is capable of appropriately adjusting the network parameters to meet desired design criteria. In addition, we discuss the diversity-multiplexing tradeoffs imposed by differently favored
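
    The kind of quantity such a framework derives in closed form can be checked by Monte Carlo: aggregate interference from a Poisson field of interferers and the resulting SIR outage probability. The parameters and the singular path-loss model below are illustrative assumptions.

        # Monte Carlo SIR outage for a Poisson field of interferers.
        import numpy as np

        rng = np.random.default_rng(3)
        lam = 1e-4        # interferer density (nodes per m^2)
        radius = 3000.0   # network region radius (m)
        alpha = 4.0       # path-loss exponent
        d0 = 100.0        # intended link distance (m)
        theta = 1.0       # SIR threshold

        trials, outages = 2000, 0
        for _ in range(trials):
            n = rng.poisson(lam * np.pi * radius ** 2)
            r = radius * np.sqrt(rng.uniform(size=n))   # uniform points in a disc
            sir = d0 ** -alpha / np.sum(r ** -alpha)    # unit transmit powers
            outages += sir < theta
        print(f"outage probability ~ {outages / trials:.3f}")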

  13. An Immune-inspired Adaptive Automated Intrusion Response System Model

    Directory of Open Access Journals (Sweden)

    Ling-xi Peng

    2012-09-01

Full Text Available An immune-inspired adaptive automated intrusion response system model is proposed. The descriptions of self, non-self, immunocyte, memory detector, mature detector and immature detector of the network transactions, and the real-time network danger evaluation equations, are given. The automated response policies are then adaptively performed or adjusted according to the real-time network danger. Thus, the model not only accurately evaluates network attacks, but also greatly reduces response times and response costs.

  14. The Deep Space Network information system in the year 2000

    Science.gov (United States)

    Markley, R. W.; Beswick, C. A.

    1992-01-01

The Deep Space Network (DSN), the largest, most sensitive scientific communications and radio navigation network in the world, is considered. Focus is placed on the telemetry processing, monitor and control, and ground data transport architectures of the DSN ground information system envisioned for the year 2000. The telemetry architecture will be unified from the front-end area to the end user. It will provide highly automated monitoring and control of the DSN, automated configuration of support activities, and a vastly improved human interface. Automated decision support systems will be in place for DSN resource management, performance analysis, fault diagnosis, and contingency management.

  15. Automated multivariate analysis of comprehensive two-dimensional gas chromatograms of petroleum

    DEFF Research Database (Denmark)

    Skov, Søren Furbo

of separated compounds makes the analysis of GC×GC chromatograms tricky, as there is too much data for manual analysis, and automated analysis is not always trouble-free: manual checking of the results is often necessary. In this work, I will investigate the possibility of another approach to the analysis of GC×GC... impossible to find it. For a special class of models, multi-way models, unique solutions often exist, meaning that the underlying phenomena can be found. I have tested this class of models on GC×GC data from petroleum and conclude that more work is needed before they can be automated. I demonstrate how
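
    A sketch of the multi-way idea under stated assumptions: a trilinear PARAFAC decomposition of a (sample x first-dimension retention time x second-dimension retention time) data cube recovers per-compound profiles with an essentially unique solution. The cube below is a synthetic rank-2 tensor, not real chromatographic data, and the tensorly API is assumed.

        # PARAFAC on a synthetic rank-2 "chromatographic" cube.
        import numpy as np
        import tensorly as tl
        from tensorly.decomposition import parafac

        rng = np.random.default_rng(4)
        A, B, C = rng.random((6, 2)), rng.random((50, 2)), rng.random((40, 2))
        cube = tl.cp_to_tensor((np.ones(2), [A, B, C]))   # 2 "compounds"
        cube = cube + rng.normal(0, 1e-3, cube.shape)     # measurement noise

        weights, factors = parafac(tl.tensor(cube), rank=2)
        print([f.shape for f in factors])                 # [(6, 2), (50, 2), (40, 2)]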

  16. Analysis And Control System For Automated Welding

    Science.gov (United States)

    Powell, Bradley W.; Burroughs, Ivan A.; Kennedy, Larry Z.; Rodgers, Michael H.; Goode, K. Wayne

    1994-01-01

Automated variable-polarity plasma arc (VPPA) welding apparatus operates under electronic supervision by a welding analysis and control system. The system performs all major monitoring and controlling functions. It acquires, analyzes, and displays weld-quality data in real time and adjusts process parameters accordingly. It also records pertinent data for use in post-weld analysis and documentation of quality. The system includes optoelectronic sensors and data processors that provide feedback control of the welding process.

  17. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments

    OpenAIRE

    Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein

    2012-01-01

Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noi...

  18. Prevalence of discordant microscopic changes with automated CBC analysis

    Directory of Open Access Journals (Sweden)

    Fabiano de Jesus Santos

    2014-12-01

Full Text Available Introduction: The most common cause of diagnostic error is related to errors in laboratory tests as well as errors in the interpretation of results. In order to reduce them, the modern laboratory has equipment which provides accurate and reliable results. The development of automation has revolutionized laboratory procedures in Brazil and worldwide. Objective: To determine the prevalence of microscopic changes present in blood slides that are concordant and discordant with results obtained using fully automated procedures. Materials and method: From January to July 2013, 1,000 slides of hematological parameters were analyzed. Automated analysis was performed on latest-generation equipment, whose methodology is based on electrical impedance and which is able to quantify all the formed elements of the blood across 22 parameters. Microscopy was performed simultaneously by two expert microscopists. Results: The data showed that only 42.70% were concordant, compared with 57.30% discordant. The main findings among the discordant were: changes in red blood cells, 43.70% (n = 250); changes in white blood cells, 38.46% (n = 220); and changes in platelet counts, 17.80% (n = 102). Discussion: The data show that some results are not consistent with the clinical or physiological state of an individual and cannot be explained if they are not investigated, which may compromise the final diagnosis. Conclusion: It was observed that it is of fundamental importance that qualitative microscopic analysis be performed in parallel with automated analysis in order to obtain reliable results, with a positive impact on prevention, diagnosis, prognosis, and therapeutic follow-up.
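
    The arithmetic behind the reported breakdown can be reproduced from the counts quoted in the abstract (573 of 1,000 slides discordant; the three classified categories sum to 572):

        # Reproduce the quoted category percentages from the raw counts.
        counts = {"red blood cell changes": 250,
                  "white blood cell changes": 220,
                  "platelet count changes": 102}
        classified = sum(counts.values())        # 572 of the 573 discordant
        for finding, n in counts.items():
            print(f"{finding}: {100 * n / classified:.2f}%")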

  19. Automated Motivic Analysis

    DEFF Research Database (Denmark)

    Lartillot, Olivier

    2016-01-01

Motivic analysis provides very detailed understanding of musical compositions, but is also particularly difficult to formalize and systematize. A computational automation of the discovery of motivic patterns cannot be reduced to a mere extraction of all possible sequences of descriptions...... for lossless compression. The structural complexity resulting from successive repetitions of patterns can be controlled through a simple modelling of cycles. Generally, motivic patterns cannot always be defined solely as sequences of descriptions in a fixed set of dimensions: throughout the descriptions...... of the successive notes and intervals, various sets of musical parameters may be invoked. In this chapter, a method is presented that allows for these heterogeneous patterns to be discovered. Motivic repetition with local ornamentation is detected by reconstructing, on top of “surface-level” monodic voices, longer...

  20. Automated longitudinal intra-subject analysis (ALISA) for diffusion MRI tractography

    DEFF Research Database (Denmark)

    Aarnink, Saskia H; Vos, Sjoerd B; Leemans, Alexander

    2014-01-01

    the inter-subject and intra-subject automation in this situation are intended for subjects without gross pathology. In this work, we propose such an automated longitudinal intra-subject analysis (dubbed ALISA) approach, and assessed whether ALISA could preserve the same level of reliability as obtained....... The major disadvantage of manual FT segmentations, unfortunately, is that placing regions-of-interest for tract selection can be very labor-intensive and time-consuming. Although there are several methods that can identify specific WM fiber bundles in an automated way, manual FT segmentations across...... multiple subjects performed by a trained rater with neuroanatomical expertise are generally assumed to be more accurate. However, for longitudinal DTI analyses it may still be beneficial to automate the FT segmentation across multiple time points, but then for each individual subject separately. Both...

  1. Automated procedure for performing computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1984-05-01

    Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures

  2. Evaluation of an automated analysis for pain-related evoked potentials

    Directory of Open Access Journals (Sweden)

    Wulf Michael

    2017-09-01

    Full Text Available This paper presents initial steps towards an automated analysis for pain-related evoked potentials (PREP), aiming at higher objectivity and non-biased examination as well as a reduction in the time expended during clinical daily routines. In manual examination, each epoch of an ensemble of stimulus-locked EEG signals, elicited by electrical stimulation of predominantly intra-epidermal small nerve fibers and recorded over the central electrode (Cz), is inspected for artifacts before the PREP is calculated by averaging the artifact-free epochs. Afterwards, specific peak latencies (such as the P0, N1 and P1 latencies) are identified as certain extrema in the PREP's waveform. The proposed automated analysis uses Pearson's correlation and low-pass differentiation to perform these tasks. To evaluate the automated analysis' accuracy, its results on 232 datasets were compared to the results of the manually performed examination. Results of the automated artifact rejection were comparable to the manual examination. Detection of peak latencies was more heterogeneous, indicating some sensitivity of the detected events to the criteria used during data examination.
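
    The two techniques this record names, Pearson correlation for artifact rejection and low-pass differentiation for peak-latency detection, can be sketched in a few lines. The Python fragment below is an illustration only, not the authors' implementation; the correlation threshold, filter settings and array layout are assumptions.

```python
import numpy as np
from scipy.signal import savgol_filter

def average_prep(epochs, r_min=0.3):
    """Reject artifact epochs by Pearson correlation with the grand average,
    then re-average the surviving epochs (r_min is an assumed threshold)."""
    template = epochs.mean(axis=0)
    r = np.array([np.corrcoef(e, template)[0, 1] for e in epochs])
    return epochs[r >= r_min].mean(axis=0)

def peak_latencies(prep, fs):
    """Locate extrema as zero-crossings of a low-pass-differentiated waveform
    (a Savitzky-Golay derivative acts as the low-pass differentiator here)."""
    d = savgol_filter(prep, window_length=11, polyorder=3, deriv=1)
    idx = np.where(np.diff(np.sign(d)) != 0)[0]
    return idx / fs  # candidate latencies in seconds
```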

  3. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    Science.gov (United States)

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  4. Instrumentation, Field Network And Process Automation for the LHC Cryogenic Line Tests

    CERN Document Server

    Bager, T; Bertrand, G; Casas-Cubillos, J; Gomes, P; Parente, C; Riddone, G; Suraci, A

    2000-01-01

    This paper describes the cryogenic control system and associated instrumentation of the test facility for 3 pre-series units of the LHC Cryogenic Distribution Line. For each unit, the process automation is based on a Programmable Logic Controller implementing more than 30 closed control loops and handling alarms, interlocks and overall process management. More than 160 sensors and actuators are distributed over 150 m on a Profibus DP/PA network. Parameterization, calibration and diagnosis are remotely available through the bus. Considering the diversity, amount and geographical distribution of the instrumentation involved, this is a representative approach to the cryogenic control system for CERN's next accelerator.

  5. Library Automation.

    Science.gov (United States)

    Husby, Ole

    1990-01-01

    The challenges and potential benefits of automating university libraries are reviewed, with special attention given to cooperative systems. Aspects discussed include database size, the role of the university computer center, storage modes, multi-institutional systems, resource sharing, cooperative system management, networking, and intelligent…

  6. Radial basis function (RBF) neural network control for mechanical systems design, analysis and Matlab simulation

    CERN Document Server

    Liu, Jinkun

    2013-01-01

    Radial Basis Function (RBF) Neural Network Control for Mechanical Systems is motivated by the need for systematic design approaches to stable adaptive control system design using neural network approximation-based techniques. The main objectives of the book are to introduce the concrete design methods and MATLAB simulation of stable adaptive RBF neural control strategies. In this book, a broad range of implementable neural network control design methods for mechanical systems are presented, such as robot manipulators, inverted pendulums, single link flexible joint robots, motors, etc. Advanced neural network controller design methods and their stability analysis are explored. The book provides readers with the fundamentals of neural network control system design.   This book is intended for the researchers in the fields of neural adaptive control, mechanical systems, Matlab simulation, engineering design, robotics and automation. Jinkun Liu is a professor at Beijing University of Aeronautics and Astronauti...
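
    To make the book's core object concrete, here is a minimal Gaussian RBF approximator with an LMS-style weight update of the kind used in adaptive RBF control laws. It is a generic sketch, not code from the book; the centre grid, width and learning rate are arbitrary choices.

```python
import numpy as np

class RBFNet:
    """Gaussian RBF approximator: f(x) = sum_j w_j * exp(-||x - c_j||^2 / b^2)."""
    def __init__(self, centers, width):
        self.c = np.asarray(centers)    # (n_basis, n_in) centre grid
        self.b = width
        self.w = np.zeros(len(self.c))  # output weights, adapted online

    def phi(self, x):
        return np.exp(-np.sum((x - self.c) ** 2, axis=1) / self.b ** 2)

    def predict(self, x):
        return self.w @ self.phi(x)

    def adapt(self, x, err, gamma=0.05):
        # gradient-style weight update, as used in adaptive RBF control laws
        self.w += gamma * err * self.phi(x)

# Fit f(x) = sin(x) online over a grid of Gaussian centres
net = RBFNet(centers=np.linspace(-3, 3, 15)[:, None], width=0.8)
for x in np.random.default_rng(0).uniform(-3, 3, 2000):
    net.adapt(np.array([x]), np.sin(x) - net.predict(np.array([x])))
print(net.predict(np.array([1.0])), np.sin(1.0))
```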

  7. Standard IEC 61850 substation automation

    Energy Technology Data Exchange (ETDEWEB)

    Bricchi, A.; Mezzadri, D. [Selta, Tortoreto (Italy)

    2008-07-01

    The International Electrotechnical Commission (IEC) 61850 standard is the reference communication protocol for all electrical substations protection and control systems. It creates models of all the elements and functionalities of an electrical substation, including physical elements such as switches or circuit breakers, as well as protection, control and monitoring functionalities. Network managers need to renew power substation automation and control systems in order to improve the efficiency and quality of services offered by electric utilities. Selta has proposed a new integrated solution for the automation of power substations which is fully compliant with the IEC 61850 norms. The solution involves the integration of control, automation, protection, monitoring and maintenance functions and applies leading edge technology to its systems, particularly for the TERNA network. The system is based on the use of many electronic devices at a power plant, each one with a specialized function, and all interconnected via a Station LAN. This solution, was tested on the TERNA network in Italy, in VHV and HV stations. It was shown to offer many advantages, such as an architecture based on full interoperability between control, monitoring and protection equipment; centralized and distributed automation; a LAN station that allows full interoperability between different bay units and protection relays in order to integrate equipment from various suppliers; the integration of automation systems in existing bay units and protection relays equipped with standard communication buses or with proprietary interfaces; and time synchronization for the entire system through a station GPS reception system. 10 refs., 1 tab., 7 figs.

  8. Network analysis literacy a practical approach to the analysis of networks

    CERN Document Server

    Zweig, Katharina A

    2014-01-01

    Network Analysis Literacy focuses on design principles for network analytics projects. The text enables readers to: pose a defined network analytic question; build a network to answer the question; choose or design the right network analytic methods for a particular purpose, and more.

  9. Analyzing security protocols in hierarchical networks

    DEFF Research Database (Denmark)

    Zhang, Ye; Nielson, Hanne Riis

    2006-01-01

    Validating security protocols is a well-known hard problem even in the simple setting of a single global network. But a real network often consists of, besides the publicly accessible part, several sub-networks, and thereby forms a hierarchical structure. In this paper we first present a process calculus...... capturing the characteristics of hierarchical networks and describe the behavior of protocols on such networks. We then develop a static analysis to automate the validation. Finally we demonstrate how the technique can benefit the protocol development and the design of network systems by presenting a series...

  10. Development of a fully automated online mixing system for SAXS protein structure analysis

    DEFF Research Database (Denmark)

    Nielsen, Søren Skou; Arleth, Lise

    2010-01-01

    This thesis presents the development of an automated high-throughput mixing and exposure system for Small-Angle Scattering analysis on a synchrotron using polymer microfluidics. Software and hardware for both automated mixing, exposure control on a beamline and automated data reduction...... and preliminary analysis is presented. Three mixing systems that have been the cornerstones of the development process are presented, including a fully functioning high-throughput microfluidic system that is able to produce and expose 36 mixed samples per hour using 30 μL of sample volume. The system is tested...

  11. Deep multi-scale location-aware 3D convolutional neural networks for automated detection of lacunes of presumed vascular origin.

    Science.gov (United States)

    Ghafoorian, Mohsen; Karssemeijer, Nico; Heskes, Tom; Bergkamp, Mayra; Wissink, Joost; Obels, Jiri; Keizer, Karlijn; de Leeuw, Frank-Erik; Ginneken, Bram van; Marchiori, Elena; Platel, Bram

    2017-01-01

    Lacunes of presumed vascular origin (lacunes) are associated with an increased risk of stroke, gait impairment, and dementia, and are a primary imaging feature of small vessel disease. Quantification of lacunes may be of great importance to elucidate the mechanisms behind neurodegenerative disorders and is recommended as part of study standards for small vessel disease research. However, due to the different appearance of lacunes in various brain regions and the existence of other similar-looking structures, such as perivascular spaces, manual annotation is a difficult, laborious and subjective task, which can potentially be greatly improved by reliable and consistent computer-aided detection (CAD) routines. In this paper, we propose an automated two-stage method using deep convolutional neural networks (CNN). We show that this method has good performance and can considerably benefit readers. We first use a fully convolutional neural network to detect initial candidates. In the second step, we employ a 3D CNN as a false positive reduction tool. As location information is important to the analysis of candidate structures, we further equip the network with contextual information using multi-scale analysis and integration of explicit location features. We trained, validated and tested our networks on a large dataset of 1075 cases obtained from two different studies. Subsequently, we conducted an observer study with four trained observers and compared our method with them using a free-response operating characteristic analysis. On a test set of 111 cases, the resulting CAD system exhibits performance similar to the trained human observers and achieves a sensitivity of 0.974 with 0.13 false positives per slice. A feasibility study also showed that a trained human observer would considerably benefit once aided by the CAD system.
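
    The second-stage network described here, a 3D CNN over a candidate patch with explicit location features fused before the classifier, can be outlined roughly as below. This PyTorch sketch only mirrors the general idea; the patch size, channel counts and layer layout are invented, not the authors' architecture.

```python
import torch
import torch.nn as nn

class FPReduction3DCNN(nn.Module):
    """Sketch of a two-input 3D CNN: an intensity patch around a candidate plus
    explicit (x, y, z) location features concatenated before the classifier."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, 3), nn.ReLU(), nn.MaxPool3d(2),
            nn.Conv3d(16, 32, 3), nn.ReLU(), nn.MaxPool3d(2),
        )
        # 24^3 input patch assumed -> 32 channels of 4^3 after the conv stack
        self.classifier = nn.Linear(32 * 4 * 4 * 4 + 3, 2)

    def forward(self, patch, loc):
        h = self.features(patch).flatten(1)  # (B, 2048)
        h = torch.cat([h, loc], dim=1)       # append normalized location
        return self.classifier(h)

net = FPReduction3DCNN()
logits = net(torch.randn(2, 1, 24, 24, 24), torch.rand(2, 3))
print(logits.shape)  # torch.Size([2, 2])
```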

  12. Automated Big Traffic Analytics for Cyber Security

    OpenAIRE

    Miao, Yuantian; Ruan, Zichan; Pan, Lei; Wang, Yu; Zhang, Jun; Xiang, Yang

    2018-01-01

    Network traffic analytics technology is a cornerstone of cyber security systems. We demonstrate its use through three popular and contemporary cyber security applications: intrusion detection, malware analysis and botnet detection. However, automated traffic analytics faces the challenges raised by big traffic data. In terms of big data's three characteristics --- volume, variety and velocity, we review three state-of-the-art techniques to mitigate the key challenges including real-time tr...

  13. A5: Automated Analysis of Adversarial Android Applications

    Science.gov (United States)

    2014-06-03

    A5: Automated Analysis of Adversarial Android Applications Timothy Vidas, Jiaqi Tan, Jay Nahata, Chaur Lih Tan, Nicolas Christin...detecting, on the device itself, that an application is malicious is much more complex without elevated privileges. In other words, given the...interface via website. Blasing et al. [7] describe another dynamic analysis system for Android. Their system focuses on classifying input applications as

  14. Planning representation for automated exploratory data analysis

    Science.gov (United States)

    St. Amant, Robert; Cohen, Paul R.

    1994-03-01

    Igor is a knowledge-based system for exploratory statistical analysis of complex systems and environments. Igor has two related goals: to help automate the search for interesting patterns in data sets, and to help develop models that capture significant relationships in the data. We outline a language for Igor, based on techniques of opportunistic planning, which balances control and opportunism. We describe the application of Igor to the analysis of the behavior of Phoenix, an artificial intelligence planning system.

  15. Meteorological and oceanographic data collected from the National Data Buoy Center Coastal-Marine Automated Network (C-MAN) and moored (weather) buoys

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Data Buoy Center (NDBC) established the Coastal-Marine Automated Network (C-MAN) for the National Weather Service in the early 1980's. NDBC has...

  16. Automated species-level identification and segmentation of planktonic foraminifera using convolutional neural networks

    Science.gov (United States)

    Marchitto, T. M., Jr.; Mitra, R.; Zhong, B.; Ge, Q.; Kanakiya, B.; Lobaton, E.

    2017-12-01

    Identification and picking of foraminifera from sediment samples is often a laborious and repetitive task. Previous attempts to automate this process have met with limited success, but we show that recent advances in machine learning can be brought to bear on the problem. As a `proof of concept' we have developed a system that is capable of recognizing six species of extant planktonic foraminifera that are commonly used in paleoceanographic studies. Our pipeline begins with digital photographs taken under 16 different illuminations using an LED ring, which are then fused into a single 3D image. Labeled image sets were used to train various types of image classification algorithms, and performance on unlabeled image sets was measured in terms of precision (whether IDs are correct) and recall (what fraction of the target species are found). We find that Convolutional Neural Network (CNN) approaches achieve precision and recall values between 80 and 90%, which is similar precision and better recall than human expert performance using the same type of photographs. We have also trained a CNN to segment the 3D images into individual chambers and apertures, which can not only improve identification performance but also automate the measurement of foraminifera for morphometric studies. Given that there are only 35 species of extant planktonic foraminifera larger than 150 μm, we suggest that a fully automated characterization of this assemblage is attainable. This is the first step toward the realization of a foram picking robot.
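
    Precision and recall as defined in this record (whether IDs are correct, and what fraction of the target specimens are found) are straightforward to compute per species. A small sketch with scikit-learn follows; the labels below are placeholders, since the abstract does not name the six species used.

```python
from sklearn.metrics import precision_recall_fscore_support

# Placeholder predictions for a toy test set (species names illustrative only)
y_true = ["G. ruber", "G. ruber", "N. dutertrei", "G. sacculifer",
          "N. dutertrei", "G. ruber"]
y_pred = ["G. ruber", "N. dutertrei", "N. dutertrei", "G. sacculifer",
          "N. dutertrei", "G. ruber"]

species = sorted(set(y_true))
precision, recall, _, _ = precision_recall_fscore_support(
    y_true, y_pred, labels=species, zero_division=0)
for sp, p, r in zip(species, precision, recall):
    print(f"{sp}: precision={p:.2f} recall={r:.2f}")
```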

  17. Automated acquisition and analysis of small angle X-ray scattering data

    International Nuclear Information System (INIS)

    Franke, Daniel; Kikhney, Alexey G.; Svergun, Dmitri I.

    2012-01-01

    Small Angle X-ray Scattering (SAXS) is a powerful tool in the study of biological macromolecules providing information about the shape, conformation, assembly and folding states in solution. Recent advances in robotic fluid handling make it possible to perform automated high throughput experiments including fast screening of solution conditions, measurement of structural responses to ligand binding, changes in temperature or chemical modifications. Here, an approach to full automation of SAXS data acquisition and data analysis is presented, which advances automated experiments to the level of a routine tool suitable for large scale structural studies. The approach links automated sample loading, primary data reduction and further processing, facilitating queuing of multiple samples for subsequent measurement and analysis and providing means of remote experiment control. The system was implemented and comprehensively tested in user operation at the BioSAXS beamlines X33 and P12 of EMBL at the DORIS and PETRA storage rings of DESY, Hamburg, respectively, but is also easily applicable to other SAXS stations due to its modular design.

  18. Automated voxel-based analysis of brain perfusion SPECT for vasospasm after subarachnoid haemorrhage

    International Nuclear Information System (INIS)

    Iwabuchi, S.; Yokouchi, T.; Hayashi, M.; Kimura, H.; Tomiyama, A.; Hirata, Y.; Saito, N.; Harashina, J.; Nakayama, H.; Sato, K.; Aoki, K.; Samejima, H.; Ueda, M.; Terada, H.; Hamazaki, K.

    2008-01-01

    We evaluated regional cerebral blood flow (rCBF) during vasospasm after subarachnoid haemorrhage (SAH) using automated voxel-based analysis of brain perfusion single-photon emission computed tomography (SPECT). Brain perfusion SPECT was performed 7 to 10 days after onset of SAH. Automated voxel-based analysis of SPECT used a Z-score map calculated by comparing the patient's data with a control database. In cases where computed tomography (CT) scans detected an ischemic region due to vasospasm, automated voxel-based analysis of brain perfusion SPECT revealed dramatically reduced rCBF (Z-score ≤ -4). No patients with mildly or moderately diminished rCBF (Z-score > -3) progressed to cerebral infarction. Some patients with a Z-score < -4 did not progress to cerebral infarction after active treatment with angioplasty. Three-dimensional images provided detailed anatomical information and helped us to distinguish surgical sequelae from vasospasm. In conclusion, automated voxel-based analysis of brain perfusion SPECT using a Z-score map is helpful in evaluating decreased rCBF due to vasospasm. (author)
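
    The Z-score criterion used here is the standard voxelwise comparison against a control database, Z = (patient - control mean) / control SD. A minimal numpy sketch, with array names and the small epsilon guard assumed:

```python
import numpy as np

def z_score_map(patient, control_stack, eps=1e-6):
    """Voxelwise Z-score of a patient perfusion volume against a stack of
    co-registered control volumes. Negative Z marks hypoperfusion."""
    mu = control_stack.mean(axis=0)
    sigma = control_stack.std(axis=0, ddof=1)
    return (patient - mu) / (sigma + eps)

# Voxels with Z <= -4 flagged as severely hypoperfused, per the criterion above:
# mask = z_score_map(spect_volume, control_volumes) <= -4
```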

  19. Stochastic Petri nets for the reliability analysis of communication network applications with alternate-routing

    International Nuclear Information System (INIS)

    Balakrishnan, Meera; Trivedi, Kishor S.

    1996-01-01

    In this paper, we present a comparative reliability analysis of an application on a corporate B-ISDN network under various alternate-routing protocols. For simple cases, the reliability problem can be cast into fault-tree models and solved rapidly by means of known methods. For more complex scenarios, state-space (Markov) models are required. However, generation of large state-space models can get very labor-intensive and error-prone. We advocate the use of stochastic reward nets (a variant of stochastic Petri nets) for the concise specification, automated generation and solution of alternate-routing protocols in networks. This paper is written in a tutorial style so as to make it accessible to a large audience
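
    The paper itself advocates stochastic reward nets, but the state-space (Markov) models it refers to can be shown directly on a toy case: a continuous-time Markov chain for a connection with one alternate route, solved for steady-state availability. The states and rates below are illustrative stand-ins, not taken from the paper.

```python
import numpy as np

# Three-state availability model: 0 = primary route up, 1 = running on the
# alternate route, 2 = both paths failed. Rates per hour are assumed values.
lam, mu = 1e-3, 1e-1  # failure and repair rates
Q = np.array([[-lam,        lam,  0.0],
              [  mu, -(mu + lam), lam],
              [ 0.0,         mu,  -mu]])

# Steady-state probabilities: solve pi @ Q = 0 subject to sum(pi) = 1
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("P(connection available) =", pi[0] + pi[1])
```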

  20. Structural brain network analysis in families multiply affected with bipolar I disorder.

    Science.gov (United States)

    Forde, Natalie J; O'Donoghue, Stefani; Scanlon, Cathy; Emsell, Louise; Chaddock, Chris; Leemans, Alexander; Jeurissen, Ben; Barker, Gareth J; Cannon, Dara M; Murray, Robin M; McDonald, Colm

    2015-10-30

    Disrupted structural connectivity is associated with psychiatric illnesses including bipolar disorder (BP). Here we use structural brain network analysis to investigate connectivity abnormalities in multiply affected BP type I families, to assess the utility of dysconnectivity as a biomarker and its endophenotypic potential. Magnetic resonance diffusion images for 19 BP type I patients in remission, 21 of their first-degree unaffected relatives, and 18 unrelated healthy controls underwent tractography. With the automated anatomical labelling atlas used to define nodes, a connectivity matrix was generated for each subject. Network metrics were extracted with the Brain Connectivity Toolbox and then analysed for group differences, accounting for potential confounding effects of age, gender and familial association. Whole-brain analysis revealed no differences between groups. Analysis of specific, mainly frontal regions, previously implicated as potentially endophenotypic by functional magnetic resonance imaging analysis of the same cohort, revealed a significant effect of group in the right medial superior frontal gyrus and left middle frontal gyrus, driven by reduced organisation in patients compared with controls. The organisation of whole-brain networks of those affected with BP I does not differ from their unaffected relatives or healthy controls. In discrete frontal regions, however, anatomical connectivity is disrupted in patients but not in their unaffected relatives. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  1. Traffic Analysis for Real-Time Communication Networks onboard Ships

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Nielsen, Jens Frederik Dalsgaard; Jørgensen, N.

    1998-01-01

    The paper presents a novel method for establishing worst-case estimates of queue lengths and transmission delays in networks of interconnected segments, each of ring topology, as defined by the ATOMOS project for marine automation. A non-probabilistic model for describing traffic is introduced as well...

  3. 14 CFR 1261.413 - Analysis of costs; automation; prevention of overpayments, delinquencies, or defaults.

    Science.gov (United States)

    2010-01-01

    Title 14, Aeronautics and Space, § 1261.413: Analysis of costs; automation; prevention of overpayments, delinquencies, or defaults. The...

  4. Artificial Neural Network for Total Laboratory Automation to Improve the Management of Sample Dilution.

    Science.gov (United States)

    Ialongo, Cristiano; Pieri, Massimo; Bernardini, Sergio

    2017-02-01

    Diluting a sample to obtain a measure within the analytical range is a common task in clinical laboratories. However, for urgent samples, it can cause delays in test reporting, which can put patients' safety at risk. The aim of this work is to show a simple artificial neural network that can be used to make it unnecessary to predilute a sample, using the information available through the laboratory information system. In particular, a multilayer perceptron neural network built on a data set of 16,106 cardiac troponin I test records produced a correct inference rate of 100% for samples not requiring predilution and 86.2% for those requiring predilution. With respect to inference reliability, the most relevant inputs were the presence of a cardiac event or surgery and the result of the previous assay. Therefore, such an artificial neural network can be easily implemented into a total automation framework to appreciably reduce the turnaround time of critical orders delayed by the operations required to retrieve, dilute, and retest the sample.
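
    The abstract identifies the previous assay result and a cardiac event/surgery flag as the most informative inputs. A minimal multilayer-perceptron sketch in scikit-learn along those lines is shown below; the third feature, the toy data and the network size are assumptions, not the published model.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical LIS-derived features per sample: [previous troponin result,
# hours since last test, cardiac event/surgery flag]; target: 1 = predilute.
X = np.array([[0.02, 48, 0], [35.0, 6, 1], [0.10, 24, 0], [80.0, 3, 1]])
y = np.array([0, 1, 0, 1])

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                                  random_state=0))
clf.fit(X, y)
print(clf.predict([[60.0, 4, 1]]))  # 1 -> dilute this sample up front
```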

  5. Automated detection of masses on whole breast volume ultrasound scanner: false positive reduction using deep convolutional neural network

    Science.gov (United States)

    Hiramatsu, Yuya; Muramatsu, Chisako; Kobayashi, Hironobu; Hara, Takeshi; Fujita, Hiroshi

    2017-03-01

    Breast cancer screening with mammography and ultrasonography is expected to improve sensitivity compared with mammography alone, especially for women with dense breasts. An automated breast volume scanner (ABVS) provides operator-independent whole-breast data which facilitate double reading and comparison with past exams, the contralateral breast, and multimodality images. However, large volumetric data in screening practice increase radiologists' workload. Therefore, our goal is to develop a computer-aided detection scheme for breast masses in ABVS data to assist radiologists' diagnosis and comparison with mammographic findings. In this study, a false positive (FP) reduction scheme using a deep convolutional neural network (DCNN) was investigated. For training the DCNN, true positive and FP samples were obtained from the result of our initial mass detection scheme using the vector convergence filter. Regions of interest including the detected regions were extracted from the multiplanar reconstruction slices. We investigated methods to select effective FP samples for training the DCNN. Based on the free-response receiver operating characteristic analysis, simple random sampling from the entire candidates was most effective in this study. Using the DCNN, the number of FPs could be reduced by 60%, while retaining 90% of true masses. The result indicates the potential usefulness of DCNN for FP reduction in automated mass detection on ABVS images.

  6. SAHM - Simplification of one-dimensional hydraulic networks by automated processes evaluated on 1D/2D deterministic flood models

    DEFF Research Database (Denmark)

    Löwe, Roland; Davidsen, Steffen; Thrysøe, Cecilie

    We present an algorithm for automated simplification of 1D pipe network models. The impact of the simplifications on the flooding simulated by coupled 1D-2D models is evaluated in an Australian case study. Significant reductions of the simulation time of the coupled model are achieved by reducing...... the 1D network model. The simplifications lead to an underestimation of flooded area because interaction points between network and surface are removed and because water is transported downstream faster. These effects can be mitigated by maintaining nodes in flood-prone areas in the simplification...... and by adjusting pipe roughness to increase transport times....

  7. Automated aerosol sampling and analysis for the Comprehensive Test Ban Treaty

    International Nuclear Information System (INIS)

    Miley, H.S.; Bowyer, S.M.; Hubbard, C.W.; McKinnon, A.D.; Perkins, R.W.; Thompson, R.C.; Warner, R.A.

    1998-01-01

    Detecting nuclear debris from a nuclear weapon exploded in or substantially vented to the Earth's atmosphere constitutes the most certain indication that a violation of the Comprehensive Test Ban Treaty has occurred. For this reason, a radionuclide portion of the International Monitoring System is being designed and implemented. The IMS will monitor aerosols and gaseous xenon isotopes to detect atmospheric and underground tests, respectively. An automated system, the Radionuclide Aerosol Sampler/Analyzer (RASA), has been developed at Pacific Northwest National Laboratory to meet CTBT aerosol measurement requirements. This is achieved by the use of a novel sampling apparatus, a high-resolution germanium detector, and very sophisticated software. This system draws a large volume of air (~20,000 m³/day), performs automated gamma-ray spectral measurements (MDC(¹⁴⁰Ba) ...), and communicates this and other data to a central data facility. Automated systems offer the added benefit of rigid controls, easily implemented QA/QC procedures, and centralized depot maintenance and operation. Other types of automated communication include pull or push transmission of State-Of-Health data, commands, and configuration data. In addition, a graphical user interface, Telnet, and other interactive communications are supported over ordinary phone or network lines. This system has been the subject of a USAF commercialization effort to meet US CTBT monitoring commitments. It will also be available to other CTBT signatories and the monitoring community for various governmental, environmental, or commercial needs. The current status of the commercialization is discussed

  8. AmeriFlux Data Processing: Integrating automated and manual data management across software technologies and an international network to generate timely data products

    Science.gov (United States)

    Christianson, D. S.; Beekwilder, N.; Chan, S.; Cheah, Y. W.; Chu, H.; Dengel, S.; O'Brien, F.; Pastorello, G.; Sandesh, M.; Torn, M. S.; Agarwal, D.

    2017-12-01

    AmeriFlux is a network of scientists who independently collect eddy covariance and related environmental observations at over 250 locations across the Americas. As part of the AmeriFlux Management Project, the AmeriFlux Data Team manages standardization, collection, quality assurance / quality control (QA/QC), and distribution of data submitted by network members. To generate data products that are timely, QA/QC'd, and repeatable, and have traceable provenance, we developed a semi-automated data processing pipeline. The new pipeline consists of semi-automated format and data QA/QC checks. Results are communicated via on-line reports as well as an issue-tracking system. Data processing time has been reduced from 2-3 days to a few hours of manual review time, resulting in faster data availability from the time of data submission. The pipeline is scalable to the network level and has the following key features. (1) On-line results of the format QA/QC checks are available immediately for data provider review. This enables data providers to correct and resubmit data quickly. (2) The format QA/QC assessment includes an automated attempt to fix minor format errors. Data submissions that are formatted in the new AmeriFlux FP-In standard can be queued for the data QA/QC assessment, often with minimal delay. (3) Automated data QA/QC checks identify and communicate potentially erroneous data via online, graphical quick views that highlight observations with unexpected values, incorrect units, time drifts, invalid multivariate correlations, and/or radiation shadows. (4) Progress through the pipeline is integrated with an issue-tracking system that facilitates communications between data providers and the data processing team in an organized and searchable fashion. Through development of these and other features of the pipeline, we present solutions to challenges that include optimizing automated with manual processing, bridging legacy data management infrastructure with

  9. Automated daily quality control analysis for mammography in a multi-unit imaging center.

    Science.gov (United States)

    Sundell, Veli-Matti; Mäkelä, Teemu; Meaney, Alexander; Kaasalainen, Touko; Savolainen, Sauli

    2018-01-01

    Background: The high requirements for mammography image quality necessitate a systematic quality assurance process. Digital imaging allows automation of the image quality analysis, which can potentially improve repeatability and objectivity compared to a visual evaluation made by the users. Purpose: To develop automatic image quality analysis software for daily mammography quality control in a multi-unit imaging center. Material and Methods: Automated image quality analysis software using the discrete wavelet transform and multiresolution analysis was developed for the American College of Radiology accreditation phantom. The software was validated by analyzing 60 randomly selected phantom images from six mammography systems and 20 phantom images with different dose levels from one mammography system. The results were compared to a visual analysis made by four reviewers. Additionally, long-term image quality trends of a full-field digital mammography system and a computed radiography mammography system were investigated. Results: The automated software produced feature detection levels comparable to visual analysis. The agreement was good in the case of fibers, while the software detected somewhat more microcalcifications and characteristic masses. Long-term follow-up via a quality assurance web portal demonstrated the feasibility of using the software for monitoring the performance of mammography systems in a multi-unit imaging center. Conclusion: Automated image quality analysis enables monitoring the performance of digital mammography systems in an efficient, centralized manner.
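
    As a rough illustration of the discrete-wavelet/multiresolution idea the study builds on, the sketch below decomposes an image with PyWavelets and reports detail-band energies, a crude stand-in for the phantom-feature detectability scores; the wavelet choice, level count and placeholder image are assumptions.

```python
import numpy as np
import pywt

def detail_energy(image, wavelet="db2", level=3):
    """Multiresolution sketch: energy of the detail coefficients at each scale.
    Fine scales respond to microcalcification-like detail, coarser scales to
    fiber- and mass-like structures (a simplification, not the paper's scoring)."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    return [sum(float(np.sum(d ** 2)) for d in detail) for detail in coeffs[1:]]

phantom = np.random.default_rng(0).normal(size=(256, 256))  # placeholder image
print(detail_energy(phantom))
```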

  10. An Automation System for Optimizing a Supply Chain Network Design under the Influence of Demand Uncertainty

    OpenAIRE

    Polany, Rany

    2012-01-01

    This research develops and applies an integrated hierarchical framework for modeling a multi-echelon supply chain network design, under the influence of demand uncertainty. The framework is a layered integration of two levels: macro, high-level scenario planning combined with micro, low-level Monte Carlo simulation of uncertainties in demand. To facilitate rapid simulation of the effects of demand uncertainty, the integrated framework was implemented as a dashboard automation system using Mic...

  11. Networks and network analysis for defence and security

    CERN Document Server

    Masys, Anthony J

    2014-01-01

    Networks and Network Analysis for Defence and Security discusses relevant theoretical frameworks and applications of network analysis in support of the defence and security domains. This book details real world applications of network analysis to support defence and security. Shocks to regional, national and global systems stemming from natural hazards, acts of armed violence, terrorism and serious and organized crime have significant defence and security implications. Today, nations face an uncertain and complex security landscape in which threats impact/target the physical, social, economic

  12. Automated Groundwater Screening

    International Nuclear Information System (INIS)

    Taylor, Glenn A.; Collard, Leonard B.

    2005-01-01

    The Automated Intruder Analysis has been extended to include an Automated Ground Water Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension has the promise of reducing the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing rigorous calculation of the screening factor thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application

  13. Development of an Automated Technique for Failure Modes and Effect Analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Allasia, G.

    1999-01-01

    Advances in automation have provided integration of monitoring and control functions to enhance the operator's overview and ability to take remedy actions when faults occur. Automation in plant supervision is technically possible with integrated automation systems as platforms, but new design...... methods are needed to cope efficiently with the complexity and to ensure that the functionality of a supervisor is correct and consistent. In particular these methods are expected to significantly improve fault tolerance of the designed systems. The purpose of this work is to develop a software module...... implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As main result, this technique will provide the design engineer with decision tables for fault handling...

  14. Development of an automated technique for failure modes and effect analysis

    DEFF Research Database (Denmark)

    Blanke, Mogens; Borch, Ole; Bagnoli, F.

    1999-01-01

    Advances in automation have provided integration of monitoring and control functions to enhance the operator's overview and ability to take remedy actions when faults occur. Automation in plant supervision is technically possible with integrated automation systems as platforms, but new design...... methods are needed to cope efficiently with the complexity and to ensure that the functionality of a supervisor is correct and consistent. In particular these methods are expected to significantly improve fault tolerance of the designed systems. The purpose of this work is to develop a software module...... implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As main result, this technique will provide the design engineer with decision tables for fault handling...

  15. Automated Information System (AIS) Alarm System

    International Nuclear Information System (INIS)

    Hunteman, W.

    1997-01-01

    The Automated Information Alarm System is a joint effort between Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratory to demonstrate and implement, on a small-to-medium sized local area network, an automated system that detects and automatically responds to attacks that use readily available tools and methodologies. The Alarm System will sense or detect, assess, and respond to suspicious activities that may be detrimental to information on the network or to continued operation of the network. The responses will allow stopping, isolating, or ejecting the suspicious activities. The number of sensors, the sensitivity of the sensors, the assessment criteria, and the desired responses may be set by the using organization to meet their local security policies

  16. Automated Information System (AIS) Alarm System

    Energy Technology Data Exchange (ETDEWEB)

    Hunteman, W.

    1997-05-01

    The Automated Information Alarm System is a joint effort between Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratory to demonstrate and implement, on a small-to-medium sized local area network, an automated system that detects and automatically responds to attacks that use readily available tools and methodologies. The Alarm System will sense or detect, assess, and respond to suspicious activities that may be detrimental to information on the network or to continued operation of the network. The responses will allow stopping, isolating, or ejecting the suspicious activities. The number of sensors, the sensitivity of the sensors, the assessment criteria, and the desired responses may be set by the using organization to meet their local security policies.

  17. Automated system for load flow prediction in power substations using artificial neural networks

    Directory of Open Access Journals (Sweden)

    Arlys Michel Lastre Aleaga

    2015-09-01

    Full Text Available Load flow is of great importance in assisting decision making and the planning of electricity generation, distribution and transmission. Not knowing the values of this indicator, or predicting them inappropriately, hampers decision making and the efficiency of the electricity service, and can cause undesirable situations such as unmet demand, overheating of the components that make up a substation, and incorrect planning of electricity generation and distribution processes. Given the need to predict the load flow of substations in Ecuador, this research proposes the concept for the development of an automated prediction system employing artificial neural networks.
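
    A minimal illustration of the proposed idea, predicting the next load value from recent history with a small neural network, might look like the following scikit-learn sketch; the toy load curve, lag features and network size are invented for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Toy hourly load series; predict the next hour from the previous three hours
# (the lag count is an arbitrary choice for this sketch).
load = 50 + 10 * np.sin(np.arange(200) * 2 * np.pi / 24)
X = np.column_stack([load[i:i - 3] for i in range(3)])  # lags t-3, t-2, t-1
y = load[3:]

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
model.fit(X, y)
print(model.predict(X[-1:]))  # next-hour load flow estimate
```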

  18. Comparative performance of some popular artificial neural network ...

    Indian Academy of Sciences (India)

    ...artificial neural network domain (viz., local search algorithms, global search ... branches of astronomy for automated data analysis and other applications like ... such as standard backpropagation, fuzzy logic, genetic algorithms, fractals etc. ...

  19. Automated Diatom Analysis Applied to Traditional Light Microscopy: A Proof-of-Concept Study

    Science.gov (United States)

    Little, Z. H. L.; Bishop, I.; Spaulding, S. A.; Nelson, H.; Mahoney, C.

    2017-12-01

    Diatom identification and enumeration by high-resolution light microscopy is required for many areas of research and water quality assessment. Such analyses, however, are both expertise- and labor-intensive. These challenges motivate the need for an automated process to efficiently and accurately identify and enumerate diatoms. Improvements in particle analysis software have increased the likelihood that diatom enumeration can be automated. VisualSpreadsheet software provides a possible solution for automated particle analysis of high-resolution light microscope diatom images. We applied the software, independent of its complementary FlowCam hardware, to automated analysis of light microscope images containing diatoms. Through numerous trials, we arrived at threshold settings to correctly segment 67% of the total possible diatom valves and fragments from broad fields of view. (183 light microscope images were examined, containing 255 diatom particles. Of the 255 diatom particles present, 216 diatom valves and fragments of valves were processed, with 170 properly analyzed and focused upon by the software.) Manual analysis of the images yielded 255 particles in 400 seconds, whereas the software yielded a total of 216 particles in 68 seconds, thus highlighting that the software has an approximate five-fold efficiency advantage in particle analysis time. As in past efforts, incomplete or incorrect recognition was found for images with multiple valves in contact or valves with little contrast. The software has potential to be an effective tool in assisting taxonomists with diatom enumeration by completing a large portion of analyses. Benefits and limitations of the approach are presented to allow for development of future work in image analysis and automated enumeration of traditional light microscope images containing diatoms.

  20. Automated image analysis for quantification of filamentous bacteria

    DEFF Research Database (Denmark)

    Fredborg, Marlene; Rosenvinge, Flemming Schønning; Spillum, Erik

    2015-01-01

    in systems relying on colorimetry or turbidometry (such as Vitek-2, Phoenix, MicroScan WalkAway). The objective was to examine an automated image analysis algorithm for quantification of filamentous bacteria using the 3D digital microscopy imaging system, oCelloScope. Results Three E. coli strains displaying...

  1. Development of a robotics system for automated chemical analysis of sediments, sludges, and soils

    International Nuclear Information System (INIS)

    McGrail, B.P.; Dodson, M.G.; Skorpik, J.R.; Strachan, D.M.; Barich, J.J.

    1989-01-01

    Adaptation and use of a high-reliability robot to conduct a standard laboratory procedure for soil chemical analysis are reported. Results from a blind comparative test were used to obtain a quantitative measure of the improvement in precision possible with the automated test method. Results from the automated chemical analysis procedure were compared with values obtained from an EPA-certified lab and with results from a more extensive interlaboratory round robin conducted by the EPA. For several elements, up to fivefold improvement in precision was obtained with the automated test method

  2. Application Filters for TCP/IP Industrial Automation Protocols

    Science.gov (United States)

    Batista, Aguinaldo B.; Kobayashi, Tiago H.; Medeiros, João Paulo S.; Brito, Agostinho M.; Motta Pires, Paulo S.

    The use of firewalls is a common approach usually meant to secure Automation Technology (AT) from Information Technology (TI) networks. This work proposes a filtering system for TCP/IP-based automation networks in which only certain kind of industrial traffic is permitted. All network traffic which does not conform with a proper industrial protocol pattern or with specific rules for its actions is supposed to be abnormal and must be blocked. As a case study, we developed a seventh layer firewall application with the ability of blocking spurious traffic, using an IP packet queueing engine and a regular expression library.
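
    In the spirit of the regular-expression filtering described above, the sketch below whitelists Modbus/TCP register-read requests with a byte-level pattern (the Modbus/TCP header layout is standard: protocol id at bytes 2-3 must be zero, function code at byte 7). The whitelist itself is an assumed site policy, and this is an illustration rather than the authors' filter.

```python
import re
import struct

# Match a Modbus/TCP frame whose function code is 0x03 or 0x04 (register reads).
MODBUS_READ = re.compile(rb"^.{2}\x00\x00.{3}[\x03\x04]", re.DOTALL)

def allowed(payload: bytes) -> bool:
    """Permit only traffic matching the industrial-protocol pattern; anything
    that does not conform is treated as abnormal and blocked."""
    return len(payload) >= 8 and MODBUS_READ.match(payload) is not None

# Example frame: read 2 holding registers starting at address 0 (function 0x03)
frame = struct.pack(">HHHBBHH", 1, 0, 6, 1, 3, 0, 2)
print(allowed(frame))          # True
print(allowed(b"GET / HTTP"))  # False: not a Modbus/TCP register read
```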

  3. Automated metabolic gas analysis systems: a review.

    Science.gov (United States)

    Macfarlane, D J

    2001-01-01

    The use of automated metabolic gas analysis systems or metabolic measurement carts (MMC) in exercise studies is common throughout the industrialised world. They have become essential tools for diagnosing many hospital patients, especially those with cardiorespiratory disease. Moreover, the measurement of maximal oxygen uptake (VO2max) is routine for many athletes in fitness laboratories and has become a de facto standard in spite of its limitations. The development of metabolic carts has also facilitated the noninvasive determination of the lactate threshold and cardiac output, respiratory gas exchange kinetics, as well as studies of outdoor activities via small portable systems that often use telemetry. Although the fundamental principles behind the measurement of oxygen uptake (VO2) and carbon dioxide production (VCO2) have not changed, the techniques used have, and indeed, some have almost turned through a full circle. Early scientists often employed a manual Douglas bag method together with separate chemical analyses, but the need for faster and more efficient techniques fuelled the development of semi- and fully automated systems by private and commercial institutions. Yet recently some scientists are returning to the traditional Douglas bag or Tissot-spirometer methods, or are using less complex automated systems, not only to save capital costs but also to have greater control over the measurement process. Over the last 40 years, a considerable number of automated systems have been developed, with over a dozen commercial manufacturers producing in excess of 20 different automated systems. The validity and reliability of all these different systems is not well known, with relatively few independent studies having been published in this area. For comparative studies to be possible and to facilitate greater consistency of measurements in test-retest or longitudinal studies of individuals, further knowledge about the performance characteristics of these

  4. Initial development of an automated task analysis profiling system

    International Nuclear Information System (INIS)

    Jorgensen, C.C.

    1984-01-01

    A program for automated task analysis is described. Called TAPS (task analysis profiling system), the program accepts normal English prose and outputs skills, knowledges, attitudes, and abilities (SKAAs) along with specific guidance and recommended ability measurement tests for nuclear power plant operators. A new method for defining SKAAs is presented along with a sample program output

  5. Application of quantum dots as analytical tools in automated chemical analysis: A review

    International Nuclear Information System (INIS)

    Frigerio, Christian; Ribeiro, David S.M.; Rodrigues, S. Sofia M.; Abreu, Vera L.R.G.; Barbosa, João A.C.; Prior, João A.V.; Marques, Karine L.; Santos, João L.M.

    2012-01-01

    Highlights: ► Review on quantum dots application in automated chemical analysis. ► Automation by using flow-based techniques. ► Quantum dots in liquid chromatography and capillary electrophoresis. ► Detection by fluorescence and chemiluminescence. ► Electrochemiluminescence and radical generation. - Abstract: Colloidal semiconductor nanocrystals or quantum dots (QDs) are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding new important fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis and food quality control. Despite the enormous variety of applications that have been developed, the automation of QDs-based analytical methodologies by resorting to automation tools such as continuous flow analysis and related techniques is hitherto very limited. Such automation would make it possible to exploit particular features of the nanocrystals, such as their versatile surface chemistry and ligand-binding ability, their aptitude to generate reactive species, and the possibility of encapsulation in different materials while retaining native luminescence, providing the means for implementing renewable chemosensors or even for using more drastic, stability-impairing reaction conditions. In this review, we provide insights into the analytical potential of quantum dots, focusing on prospects for their utilisation in automated flow-based and flow-related approaches and the future outlook of QDs applications in chemical analysis.

  6. Cost and Benefit Analysis of an Automated Nursing Administration System: A Methodology*

    OpenAIRE

    Rieder, Karen A.

    1984-01-01

    In order for a nursing service administration to select the appropriate automated system for its requirements, a systematic process of evaluating alternative approaches must be completed. This paper describes a methodology for evaluating and comparing alternative automated systems based upon an economic analysis which includes two major categories of criteria: costs and benefits.

  7. A Fully Automated High-Throughput Zebrafish Behavioral Ototoxicity Assay.

    Science.gov (United States)

    Todd, Douglas W; Philip, Rohit C; Niihori, Maki; Ringle, Ryan A; Coyle, Kelsey R; Zehri, Sobia F; Zabala, Leanne; Mudery, Jordan A; Francis, Ross H; Rodriguez, Jeffrey J; Jacob, Abraham

    2017-08-01

    Zebrafish animal models lend themselves to behavioral assays that can facilitate rapid screening of ototoxic, otoprotective, and otoregenerative drugs. Structurally similar to human inner ear hair cells, the mechanosensory hair cells on their lateral line allow the zebrafish to sense water flow and orient head-to-current in a behavior called rheotaxis. This rheotaxis behavior deteriorates in a dose-dependent manner with increased exposure to the ototoxin cisplatin, thereby establishing itself as an excellent biomarker for anatomic damage to lateral line hair cells. Building on work by our group and others, we have built a new, fully automated high-throughput behavioral assay system that uses automated image analysis techniques to quantify rheotaxis behavior. This novel system consists of a custom-designed swimming apparatus and imaging system consisting of network-controlled Raspberry Pi microcomputers capturing infrared video. Automated analysis techniques detect individual zebrafish, compute their orientation, and quantify the rheotaxis behavior of a zebrafish test population, producing a powerful, high-throughput behavioral assay. Using our fully automated biological assay to test a standardized ototoxic dose of cisplatin against varying doses of compounds that protect or regenerate hair cells may facilitate rapid translation of candidate drugs into preclinical mammalian models of hearing loss.
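
    The abstract does not spell out how orientation is computed, but a standard way to score head-to-current orientation from a segmented fish silhouette is via second-order image moments. The numpy sketch below is a generic illustration of that approach, not the authors' pipeline; the rheotaxis scoring threshold would be an additional assumption.

```python
import numpy as np

def blob_orientation(mask):
    """Orientation of a segmented blob from second-order central moments:
    theta = 0.5 * atan2(2*mu11, mu20 - mu02). Rheotaxis can then be scored
    by comparing theta against the known flow direction."""
    ys, xs = np.nonzero(mask)
    x0, y0 = xs.mean(), ys.mean()
    mu20 = ((xs - x0) ** 2).mean()
    mu02 = ((ys - y0) ** 2).mean()
    mu11 = ((xs - x0) * (ys - y0)).mean()
    return 0.5 * np.arctan2(2 * mu11, mu20 - mu02)

# Elongated diagonal blob -> orientation close to 45 degrees
mask = np.eye(50, dtype=bool)
print(np.degrees(blob_orientation(mask)))
```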

  8. Self-Powered Wireless Sensor Network for Automated Corrosion Prediction of Steel Reinforcement

    Directory of Open Access Journals (Sweden)

    Dan Su

    2018-01-01

    Full Text Available Corrosion is one of the key issues that affect the service life and hinder the wide application of steel reinforcement. Moreover, corrosion is a long-term process and is not visible for embedded reinforcement. Thus, this research aims at developing a self-powered smart sensor system with an integrated innovative prediction module for forecasting the corrosion process of embedded steel reinforcement. A vibration-based energy harvester is used to harvest energy for continuous corrosion data collection. A spatial interpolation module was developed to interpolate corrosion data at unmonitored locations. A dynamic prediction module is used to predict long-term corrosion based on the collected data. Using this new sensor network, the corrosion process can be automatically predicted and appropriate mitigation actions recommended accordingly.
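
    One common choice for the spatial interpolation module mentioned above is inverse-distance weighting; the sketch below is an assumed, simplified stand-in for whatever scheme the authors actually used, with invented sensor positions and readings.

```python
import numpy as np

def idw(sensor_xy, sensor_vals, query_xy, power=2.0):
    """Inverse-distance-weighted estimate of a corrosion indicator (e.g. a
    half-cell potential) at unmonitored points; the exponent is an assumption."""
    d = np.linalg.norm(sensor_xy[None, :, :] - query_xy[:, None, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    return (w @ sensor_vals) / w.sum(axis=1)

sensors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
readings = np.array([-0.20, -0.35, -0.25])  # illustrative potentials (V)
print(idw(sensors, readings, np.array([[5.0, 5.0]])))
```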

  9. Network-Based Real-time Integrated Fire Detection and Alarm (FDA) System with Building Automation

    Science.gov (United States)

    Anwar, F.; Boby, R. I.; Rashid, M. M.; Alam, M. M.; Shaikh, Z.

    2017-11-01

    Fire alarm systems have become an increasingly important lifesaving technology in many respects, with applications to detect, monitor and control any fire hazard. A large sum of money is spent annually to install and maintain fire alarm systems in buildings to protect property and lives from the unexpected spread of fire. Several methods have already been developed, and they are improving daily, to reduce cost as well as increase quality. An integrated Fire Detection and Alarm (FDA) system with building automation was studied to reduce cost and improve reliability by preventing false alarms. This work proposes an improved framework for an FDA system to ensure a robust intelligent network of FDA control panels in real time. A shortest-path algorithm was chosen for a series of buildings connected by a fiber-optic network. The framework shares information and communicates with each fire alarm panel connected in a peer-to-peer configuration, and declares the network state using network address declaration from any building connected to the network. The fiber-optic connection was proposed to reduce signal noise, thus enabling large area coverage, real-time communication and long-term safety. Based on this proposed method, an experimental setup was designed and a prototype system was developed to validate the performance in practice. A distributed network system was also proposed to connect with an optional remote monitoring terminal panel to validate the proposed network performance and ensure fire survivability where information is sequentially transmitted. The proposed FDA system differs from traditional fire alarm and detection systems in terms of topology, as it manages a group of buildings in an optimal and efficient manner.
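
    For the shortest-path computation over the panel network, Dijkstra's algorithm is the standard choice; the paper does not name its algorithm, so the following sketch, with an invented three-building topology and fibre lengths as edge weights, is only illustrative.

```python
import heapq

def dijkstra(graph, source):
    """Shortest paths over the panel network; edge weights could be fibre
    lengths or hop costs (the topology below is invented for illustration)."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

panels = {"A": [("B", 120), ("C", 300)], "B": [("C", 90)], "C": []}
print(dijkstra(panels, "A"))  # {'A': 0.0, 'B': 120.0, 'C': 210.0}
```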

  10. Automated analysis of autoradiographic imagery

    International Nuclear Information System (INIS)

    Bisignani, W.T.; Greenhouse, S.C.

    1975-01-01

    A research programme is described which has as its objective the automated characterization of neurological tissue regions from autoradiographs by utilizing hybrid-resolution image processing techniques. An experimental system is discussed which includes raw imagery, scanning and digitizing equipment, feature-extraction algorithms, and regional characterization techniques. The parameters extracted by these algorithms are presented as well as the regional characteristics which are obtained by operating on the parameters with statistical sampling techniques. An approach is presented for validating the techniques and initial experimental results are obtained from an analysis of an autoradiograph of a region of the hypothalamus. An extension of these automated techniques to other biomedical research areas is discussed as well as the implications of applying automated techniques to biomedical research problems. (author)

  11. Automated vessel segmentation using cross-correlation and pooled covariance matrix analysis.

    Science.gov (United States)

    Du, Jiang; Karimi, Afshin; Wu, Yijing; Korosec, Frank R; Grist, Thomas M; Mistretta, Charles A

    2011-04-01

    Time-resolved contrast-enhanced magnetic resonance angiography (CE-MRA) provides contrast dynamics in the vasculature and allows vessel segmentation based on temporal correlation analysis. Here we present an automated vessel segmentation algorithm including automated generation of regions of interest (ROIs), cross-correlation and pooled sample covariance matrix analysis. The dynamic images are divided into multiple equal-sized regions. In each region, ROIs for artery, vein and background are generated using an iterative thresholding algorithm based on the contrast arrival time map and contrast enhancement map. Region-specific multi-feature cross-correlation analysis and pooled covariance matrix analysis are performed to calculate the Mahalanobis distances (MDs), which are used to automatically separate arteries from veins. This segmentation algorithm is applied to a dual-phase dynamic imaging acquisition scheme where low-resolution time-resolved images are acquired during the dynamic phase followed by high-frequency data acquisition at the steady-state phase. The segmented low-resolution arterial and venous images are then combined with the high-frequency data in k-space and inverse Fourier transformed to form the final segmented arterial and venous images. Results from volunteer and patient studies demonstrate the advantages of this automated vessel segmentation and dual phase data acquisition technique. Copyright © 2011 Elsevier Inc. All rights reserved.
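
    A minimal sketch of the pooled-covariance Mahalanobis classification step is given below, assuming two-dimensional temporal features (e.g., contrast arrival time and peak enhancement). It illustrates the general technique, not the authors' implementation.

        import numpy as np

        def pooled_covariance(X_art, X_vein):
            """Pooled sample covariance of artery and vein feature vectors."""
            n_a, n_v = len(X_art), len(X_vein)
            S_a = np.cov(X_art, rowvar=False)
            S_v = np.cov(X_vein, rowvar=False)
            return ((n_a - 1) * S_a + (n_v - 1) * S_v) / (n_a + n_v - 2)

        def mahalanobis_label(x, X_art, X_vein):
            """Assign a voxel's temporal feature vector to artery or vein by smaller MD."""
            S_inv = np.linalg.inv(pooled_covariance(X_art, X_vein))
            def md2(v, mu):
                d = v - mu
                return float(d @ S_inv @ d)
            mu_a, mu_v = X_art.mean(axis=0), X_vein.mean(axis=0)
            return "artery" if md2(x, mu_a) < md2(x, mu_v) else "vein"

        # Example with synthetic 2-D features (arrival time, peak enhancement)
        rng = np.random.default_rng(0)
        arteries = rng.normal([8.0, 1.0], 0.5, size=(50, 2))
        veins = rng.normal([14.0, 0.7], 0.5, size=(50, 2))
        print(mahalanobis_label(np.array([9.0, 0.95]), arteries, veins))  # artery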

  12. Semi-automated retinal vessel analysis in nonmydriatic fundus photography.

    Science.gov (United States)

    Schuster, Alexander Karl-Georg; Fischer, Joachim Ernst; Vossmerbaeumer, Urs

    2014-02-01

    Funduscopic assessment of the retinal vessels may be used to assess the health status of microcirculation and as a component in the evaluation of cardiovascular risk factors. Typically, the evaluation is restricted to morphological appreciation without strict quantification. Our purpose was to develop and validate a software tool for semi-automated quantitative analysis of retinal vasculature in nonmydriatic fundus photography. MATLAB software was used to develop a semi-automated image recognition and analysis tool for the determination of the arterial-venous (A/V) ratio in the central vessel equivalent on 45° digital fundus photographs. Validity and reproducibility of the results were ascertained using nonmydriatic photographs of 50 eyes from 25 subjects recorded with a 3D OCT device (Topcon Corp.). Two hundred and thirty-three eyes of 121 healthy subjects were evaluated to define normative values. A software tool was developed using image thresholds for vessel recognition and vessel width calculation in a semi-automated three-step procedure: vessel recognition on the photograph and artery/vein designation, width measurement, and calculation of central retinal vessel equivalents. The mean vessel recognition rate was 78%, the vessel class designation rate 75%, and reproducibility between 0.78 and 0.91. The mean A/V ratio was 0.84. Application to a healthy normative cohort showed high congruence with previously published manual methods. Processing time per image was one minute. Quantitative geometrical assessment of the retinal vasculature may be performed in a semi-automated manner using dedicated software tools. Yielding reproducible numerical data within a short time, this may add value to mere morphological estimates in the clinical evaluation of fundus photographs. © 2013 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
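
    The record does not state which vessel-equivalent formula the tool uses; the sketch below illustrates one widely used choice, the revised Knudtson pairing formulas (constants 0.88 for arterioles and 0.95 for venules), applied with a simplified pairing loop to assumed width measurements.

        import numpy as np

        def central_equivalent(widths, k):
            """Simplified pairing loop: repeatedly combine the widest and narrowest
            vessel as w = k * sqrt(w1**2 + w2**2) until one equivalent remains."""
            w = sorted(widths)
            while len(w) > 1:
                combined = k * np.hypot(w.pop(0), w.pop(-1))  # narrowest + widest
                w.append(combined)
                w.sort()
            return w[0]

        def av_ratio(artery_widths, vein_widths):
            crae = central_equivalent(artery_widths, k=0.88)  # revised arteriolar constant
            crve = central_equivalent(vein_widths, k=0.95)    # revised venular constant
            return crae / crve

        # Example widths (pixels) for the six largest arterioles and venules
        print(round(av_ratio([12, 11, 10, 9, 9, 8], [15, 14, 13, 12, 11, 10]), 2))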

  13. Automated reasoning applications to design validation and sneak function analysis

    International Nuclear Information System (INIS)

    Stratton, R.C.

    1984-01-01

    Argonne National Laboratory (ANL) is actively involved in the LMFBR Man-Machine Integration (MMI) Safety Program. The objective of this program is to enhance the operational safety and reliability of fast-breeder reactors by optimum integration of men and machines through the application of human factors principles and control engineering to the design, operation, and the control environment. ANL is developing methods to apply automated reasoning and computerization in the validation and sneak function analysis process. This project provides the element definitions and relations necessary for an automated reasoner (AR) to reason about design validation and sneak function analysis. This project also provides a demonstration of this AR application on an Experimental Breeder Reactor-II (EBR-II) system, the Argonne Cooling System

  14. Automated analysis of gastric emptying

    International Nuclear Information System (INIS)

    Abutaleb, A.; Frey, D.; Spicer, K.; Spivey, M.; Buckles, D.

    1986-01-01

    The authors devised a novel method to automate the analysis of nuclear gastric emptying studies. Many methods have previously been used to measure gastric emptying, but they are cumbersome and require continual operator intervention. Two specific problems that occur are related to patient movement between images and changes in the location of the radioactive material within the stomach. The method can be used with either dual or single phase studies. For dual phase studies the authors use In-111 labeled water and Tc-99m sulfur colloid (SC) labeled scrambled eggs. For single phase studies either the liquid or solid phase material is used

  15. Retina Image Analysis and Ocular Telehealth: The Oak Ridge National Laboratory-Hamilton Eye Institute Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Karnowski, Thomas Paul [ORNL; Giancardo, Luca [ORNL; Li, Yaquin [University of Tennessee, Knoxville (UTK); Tobin Jr, Kenneth William [ORNL; Chaum, Edward [University of Tennessee, Knoxville (UTK)

    2013-01-01

    Automated retina image analysis has reached a high level of maturity in recent years, and thus the question of how validation is performed in these systems is beginning to grow in importance. One application of retina image analysis is in telemedicine, where an automated system could enable the automated detection of diabetic retinopathy and other eye diseases as a low-cost method for broad-based screening. In this work we discuss our experiences in developing a telemedical network for retina image analysis, including our progression from a manual diagnosis network to a more fully automated one. We pay special attention to how validations of our algorithm steps are performed, both using data from the telemedicine network and other public databases.

  16. Cyber Security Research Frameworks For Coevolutionary Network Defense

    Energy Technology Data Exchange (ETDEWEB)

    Rush, George D. [Missouri Univ. of Science and Technology, Rolla, MO (United States); Tauritz, Daniel Remy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-12-03

    Several architectures have been created for developing and testing systems used in network security, but most are meant to provide a platform for running cyber security experiments as opposed to automating experiment processes. In the first paper, we propose a framework termed Distributed Cyber Security Automation Framework for Experiments (DCAFE) that enables experiment automation and control in a distributed environment. Predictive analysis of adversaries is another thorny issue in cyber security. Game theory can be used to mathematically analyze adversary models, but its scalability limitations restrict its use. Computational game theory allows us to scale classical game theory to larger, more complex systems. In the second paper, we propose a framework termed Coevolutionary Agent-based Network Defense Lightweight Event System (CANDLES) that can coevolve attacker and defender agent strategies and capabilities and evaluate potential solutions with a custom network defense simulation. The third paper is a continuation of the CANDLES project in which we rewrote key parts of the framework. Attackers and defenders have been redesigned to evolve pure strategy, and a new network security simulation is devised which specifies network architecture and adds a temporal aspect. We also add a hill climber algorithm to evaluate the search space and justify the use of a coevolutionary algorithm.

  17. SPICODYN: A Toolbox for the Analysis of Neuronal Network Dynamics and Connectivity from Multi-Site Spike Signal Recordings.

    Science.gov (United States)

    Pastore, Vito Paolo; Godjoski, Aleksandar; Martinoia, Sergio; Massobrio, Paolo

    2018-01-01

    We implemented an automated and efficient open-source software package for the analysis of multi-site neuronal spike signals. The software package, named SPICODYN, has been developed as a standalone Windows GUI application, using the C# programming language with Microsoft Visual Studio based on the .NET Framework 4.5 development environment. Accepted input data formats are HDF5, level 5 MAT and text files containing recorded or generated time series of spike signal data. SPICODYN processes such electrophysiological signals focusing on spiking and bursting dynamics and functional-effective connectivity analysis. In particular, for inferring network connectivity, a new implementation of the transfer entropy method is presented, dealing with multiple time delays (temporal extension) and with multiple binary patterns (high order extension). SPICODYN is specifically tailored to process data coming from different multi-electrode array setups, guaranteeing, in those specific cases, automated processing. The optimized implementation of the Delayed Transfer Entropy and the High-Order Transfer Entropy algorithms allows performing accurate and rapid analysis on multiple spike trains from thousands of electrodes.
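
    As an illustration of the core measure (not SPICODYN's C# code), the following Python sketch estimates first-order transfer entropy between two binary spike trains at a chosen delay; the lag convention is an assumption of the example.

        import numpy as np

        def delayed_transfer_entropy(x, y, delay=1):
            """First-order transfer entropy (bits) from binary train x to y,
            with x lagged by `delay` samples:
            TE = sum p(y_next, y_past, x_past) * log2[p(y_next|y_past, x_past) / p(y_next|y_past)].
            """
            x, y = np.asarray(x, int), np.asarray(y, int)
            n = len(y)
            y_next, y_past, x_past = y[delay:], y[delay - 1:n - 1], x[:n - delay]
            # Joint counts over the 8 possible (y_next, y_past, x_past) binary states
            joint = np.zeros((2, 2, 2))
            for a, b, c in zip(y_next, y_past, x_past):
                joint[a, b, c] += 1
            p = joint / joint.sum()
            p_bc = p.sum(axis=0)          # p(y_past, x_past)
            p_ab = p.sum(axis=2)          # p(y_next, y_past)
            p_b = p.sum(axis=(0, 2))      # p(y_past)
            te = 0.0
            for a in (0, 1):
                for b in (0, 1):
                    for c in (0, 1):
                        if p[a, b, c] > 0:
                            te += p[a, b, c] * np.log2(
                                p[a, b, c] * p_b[b] / (p_bc[b, c] * p_ab[a, b]))
            return te

        # Example: y echoes x two samples later, so TE should peak at delay=2
        rng = np.random.default_rng(1)
        x = (rng.random(5000) < 0.2).astype(int)
        y = np.roll(x, 2)
        print(delayed_transfer_entropy(x, y, delay=1),
              delayed_transfer_entropy(x, y, delay=2))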

  18. Digital coal mine integrated automation system based on Controlnet

    Energy Technology Data Exchange (ETDEWEB)

    Jin-yun Chen; Shen Zhang; Wei-ran Zuo [China University of Mining and Technology, Xuzhou (China). School of Chemical Engineering and Technology

    2007-06-15

    A three-layer model for digital communication in a mine is proposed. Two basic platforms are discussed: a uniform transmission network and a uniform data warehouse. An actual, ControlNet based, transmission network platform suitable for the Jining No.3 coal mine in China is presented. This network is an information superhighway intended to integrate all existing and new automation subsystems. Its standard interface can be used with future subsystems. The network, data structure and management decision-making all employ this uniform hardware and software. This effectively avoids the problems of system and information islands seen in traditional mine-automation systems. The construction of the network provides a stable foundation for digital communication in the Jining No.3 coal mine. 9 refs., 5 figs.

  19. Automated information retrieval system for radioactivation analysis

    International Nuclear Information System (INIS)

    Lambrev, V.G.; Bochkov, P.E.; Gorokhov, S.A.; Nekrasov, V.V.; Tolstikova, L.I.

    1981-01-01

    An automated information retrieval system for radioactivation analysis has been developed. An ES-1022 computer and problem-oriented software, 'The description information search system', were used for the purpose. The main aspects and sources used to form the system's information fund and the characteristics of the system's information retrieval language are reported, and examples of question-answer dialogue are given. Two modes can be used: selective information distribution and retrospective search [ru]

  20. Automated mineralogy and petrology - applications of TESCAN Integrated Mineral Analyzer (TIMA)

    Czech Academy of Sciences Publication Activity Database

    Hrstka, Tomáš; Gottlieb, P.; Skála, Roman; Breiter, Karel; Motl, D.

    2018-01-01

    Roč. 63, č. 1 (2018), s. 47-63 ISSN 1802-6222 Grant - others:AV ČR(CZ) StrategieAV21/4 Program:StrategieAV Institutional support: RVO:67985831 Keywords : TIMA * Automated SEM/EDS * applied mineralogy * modal analysis * artificial intelligence * neural networks Subject RIV: DB - Geology ; Mineralogy OBOR OECD: Mineralogy Impact factor: 0.609, year: 2016

  1. Network Analysis Tools: from biological networks to clusters and pathways.

    Science.gov (United States)

    Brohée, Sylvain; Faust, Karoline; Lima-Mendez, Gipsi; Vanderstocken, Gilles; van Helden, Jacques

    2008-01-01

    Network Analysis Tools (NeAT) is a suite of computer tools that integrate various algorithms for the analysis of biological networks: comparison between graphs, between clusters, or between graphs and clusters; network randomization; analysis of degree distribution; network-based clustering and path finding. The tools are interconnected to enable a stepwise analysis of the network through a complete analytical workflow. In this protocol, we present a typical case of utilization, where the tasks above are combined to decipher a protein-protein interaction network retrieved from the STRING database. The results returned by NeAT are typically subnetworks, networks enriched with additional information (i.e., clusters or paths) or tables displaying statistics. Typical networks comprising several thousands of nodes and arcs can be analyzed within a few minutes. The complete protocol can be read and executed in approximately 1 h.
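
    NeAT itself is a web suite; for readers who want to reproduce comparable steps locally, the sketch below runs an analogous degree-distribution, clustering, path-finding and randomization workflow with the networkx library on a toy interaction network (networkx as a stand-in is an assumption, not part of NeAT).

        import networkx as nx
        from networkx.algorithms import community

        # Toy protein-protein interaction network (edges = interactions)
        g = nx.Graph([("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"),
                      ("D", "E"), ("E", "F"), ("D", "F"), ("F", "G")])

        # Degree distribution
        print("degrees:", sorted(dict(g.degree()).values(), reverse=True))

        # Network-based clustering (greedy modularity as a stand-in method)
        print("clusters:", [sorted(c) for c in community.greedy_modularity_communities(g)])

        # Path finding between two proteins of interest
        print("path A->G:", nx.shortest_path(g, "A", "G"))

        # Degree-preserving randomization for significance testing
        rand = g.copy()
        nx.double_edge_swap(rand, nswap=4, max_tries=200, seed=42)
        print("randomized edges:", sorted(rand.edges()))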

  2. Pore network quantification of sandstones under experimental CO2 injection using image analysis

    Science.gov (United States)

    Berrezueta, Edgar; González-Menéndez, Luís; Ordóñez-Casado, Berta; Olaya, Peter

    2015-04-01

    Automated image identification and quantification of minerals, pores and textures, together with petrographic analysis, can be applied to improve pore system characterization in sedimentary rocks. Our case study focuses on applying these techniques to study the evolution of the rock pore network when subjected to supercritical CO2 injection. We have proposed a Digital Image Analysis (DIA) protocol that guarantees measurement reproducibility and reliability. This can be summarized in the following stages: (i) detailed description of mineralogy and texture (before and after CO2 injection) by optical and scanning electron microscopy (SEM) techniques using thin sections; (ii) adjustment and calibration of DIA tools; (iii) a data acquisition protocol based on image capture under different polarization conditions (synchronized movement of polarizers); (iv) study and quantification by DIA that allow (a) identification and isolation of pixels that belong to the same category, minerals vs. pores, in each sample and (b) measurement of changes in the pore network after the samples have been exposed to new conditions (in our case, supercritical CO2 injection). Finally, the petrography and the measured data were interpreted using an automated approach. In our applied study, the DIA results highlight the changes observed by SEM and microscopic techniques, which consisted of a porosity increase after CO2 treatment. Other additional changes were minor: variations in the roughness and roundness of pore edges and in pore aspect ratio, seen mainly in the larger pore population. Additionally, statistical tests of the measured pore parameters were applied to verify that the differences observed between samples before and after CO2 injection were significant.
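
    As a toy illustration of stage (iv) (not the authors' calibrated protocol), the sketch below classifies pixels as pore vs. mineral by a simple intensity threshold and compares porosity before and after treatment; the threshold and the synthetic images are assumptions.

        import numpy as np

        def porosity_percent(gray, pore_threshold):
            """Classify pixels as pore (dark in the chosen channel) vs. mineral
            and return the pore area as a percentage of the analysed field."""
            pores = np.asarray(gray) < pore_threshold
            return 100.0 * pores.sum() / pores.size

        # Synthetic before/after images of the same field of view
        rng = np.random.default_rng(3)
        before = rng.integers(60, 255, size=(512, 512))   # mineral grey levels
        before[200:230, 200:230] = 20                     # pre-existing pore patch
        after = before.copy()
        after[100:140, 100:180] = 20                      # pore space enlarged by SC-CO2
        print(f"porosity before: {porosity_percent(before, 50):.2f} %")
        print(f"porosity after:  {porosity_percent(after, 50):.2f} %")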

  3. Space Environment Automated Alerts and Anomaly Analysis Assistant (SEA^5) for NASA

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to develop a comprehensive analysis and dissemination system (Space Environment Automated Alerts & Anomaly Analysis Assistant: SEA5) that will...

  4. Automated road marking recognition system

    Science.gov (United States)

    Ziyatdinov, R. R.; Shigabiev, R. R.; Talipov, D. N.

    2017-09-01

    Development of automated road marking recognition systems for existing and future vehicle control systems is an urgent task. One way to implement such systems is the use of neural networks. To test this possibility, software based on a single-layer perceptron was developed. The resulting neural-network-based system successfully coped with the task both when driving in the daytime and at night.
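
    A minimal sketch of a single-layer perceptron of the kind described, trained with the classic perceptron rule on invented two-feature patches (marking vs. asphalt), is given below; features and data are illustrative only.

        import numpy as np

        def train_perceptron(X, y, epochs=50, lr=0.1):
            """Classic perceptron learning rule: w += lr * (target - prediction) * x."""
            Xb = np.hstack([X, np.ones((len(X), 1))])   # append bias input
            w = np.zeros(Xb.shape[1])
            for _ in range(epochs):
                for xi, target in zip(Xb, y):
                    pred = 1 if xi @ w > 0 else 0
                    w += lr * (target - pred) * xi
            return w

        def predict(w, X):
            Xb = np.hstack([X, np.ones((len(X), 1))])
            return (Xb @ w > 0).astype(int)

        # Toy features per image patch: (mean brightness, edge contrast), label 1 = marking
        X = np.array([[0.9, 0.8], [0.8, 0.7], [0.85, 0.9],   # bright, high-contrast
                      [0.2, 0.1], [0.3, 0.2], [0.1, 0.15]])  # dark asphalt
        y = np.array([1, 1, 1, 0, 0, 0])
        w = train_perceptron(X, y)
        print(predict(w, X))   # expect [1 1 1 0 0 0]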

  5. Using historical wafermap data for automated yield analysis

    International Nuclear Information System (INIS)

    Tobin, K.W.; Karnowski, T.P.; Gleason, S.S.; Jensen, D.; Lakhani, F.

    1999-01-01

    To be productive and profitable in a modern semiconductor fabrication environment, large amounts of manufacturing data must be collected, analyzed, and maintained. This includes data collected from in- and off-line wafer inspection systems and from the process equipment itself. This data is increasingly being used to design new processes, control and maintain tools, and to provide the information needed for rapid yield learning and prediction. Because of increasing device complexity, the amount of data being generated is outstripping the yield engineer's ability to effectively monitor and correct unexpected trends and excursions. The 1997 SIA National Technology Roadmap for Semiconductors highlights a need to address these issues through "automated data reduction algorithms to source defects from multiple data sources and to reduce defect sourcing time." SEMATECH and the Oak Ridge National Laboratory have been developing new strategies and technologies for providing the yield engineer with higher levels of assisted data reduction for the purpose of automated yield analysis. In this article, we will discuss the current state of the art and trends in yield management automation. copyright 1999 American Vacuum Society

  6. DESIGN AND ENGINEERING BACKGROUND FOR STATION NETWORKS OF VERTICAL IONOSPHERE SOUNDING

    Directory of Open Access Journals (Sweden)

    A. Y. Grishentsev

    2013-05-01

    Full Text Available The paper deals with an analysis of the structure of station networks for vertical ionosphere sounding. Design features and principles for creating program complexes for the automated processing, analysis and storage of ionosphere sounding data are considered. A conceptual model of the complex's database control system is created. The results of this work are used in the research practice of leading national organizations studying the ionosphere. Application results of the suggested algorithms and programs for the automated processing and analysis of vertical ionosphere sounding data are shown.

  7. IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.

    Science.gov (United States)

    Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M

    2016-04-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.

  8. Extended -Regular Sequence for Automated Analysis of Microarray Images

    Directory of Open Access Journals (Sweden)

    Jin Hee-Jeong

    2006-01-01

    Full Text Available Microarray study enables us to obtain hundreds of thousands of expressions of genes or genotypes at once, and it is an indispensable technology for genome research. The first step is the analysis of scanned microarray images. This is the most important procedure for obtaining biologically reliable data. Currently most microarray image processing systems require burdensome manual block/spot indexing work. Since the amount of experimental data is increasing very quickly, automated microarray image analysis software becomes important. In this paper, we propose two automated methods for analyzing microarray images. First, we propose the extended -regular sequence to index blocks and spots, which enables a novel automatic gridding procedure. Second, we provide a methodology, hierarchical metagrid alignment, to allow reliable and efficient batch processing for a set of microarray images. Experimental results show that the proposed methods are more reliable and convenient than the commercial tools.

  9. Digital image analysis applied to industrial nondestructive evaluation and automated parts assembly

    International Nuclear Information System (INIS)

    Janney, D.H.; Kruger, R.P.

    1979-01-01

    Many ideas of image enhancement and analysis are relevant to the needs of the nondestructive testing engineer. These ideas not only aid the engineer in the performance of his current responsibilities, they also open to him new areas of industrial development and automation which are logical extensions of classical testing problems. The paper begins with a tutorial on the fundamentals of computerized image enhancement as applied to nondestructive testing, then progresses through pattern recognition and automated inspection to automated, or robotic, assembly procedures. It is believed that such procedures are cost-effective in many instances, and are but the logical extension of those techniques now commonly used, but often limited to analysis of data from quality-assurance images. Many references are given in order to help the reader who wishes to pursue a given idea further

  10. Automated firewall analytics design, configuration and optimization

    CERN Document Server

    Al-Shaer, Ehab

    2014-01-01

    This book provides a comprehensive and in-depth study of automated firewall policy analysis for designing, configuring and managing distributed firewalls in large-scale enterprise networks. It presents methodologies, techniques and tools for researchers as well as professionals to understand the challenges and improve the state of the art of managing firewalls systematically in both research and application domains. Chapters explore set theory, managing firewall configurations globally and consistently, access control lists with encryption, and authentication such as IPSec policies. The author

  11. Ecological network analysis for a virtual water network.

    Science.gov (United States)

    Fang, Delin; Chen, Bin

    2015-06-02

    The notion of virtual water flows provides important indicators of the water consumption and allocation between different sectors via product transactions. However, the configuration of a virtual water network (VWN) still needs further investigation to identify the water interdependency among different sectors as well as the network efficiency and stability in a socio-economic system. Ecological network analysis is chosen as a useful tool to examine the structure and function of a VWN and the interactions among its sectors. A balance analysis of efficiency and redundancy is also conducted to describe the robustness (RVWN) of the VWN. Then, network control analysis and network utility analysis are performed to investigate the dominant sectors and pathways for virtual water circulation and the mutual relationships between pairwise sectors. A case study of the Heihe River Basin in China shows that the balance between efficiency and redundancy is situated on the left side of the robustness curve, with less efficiency and higher redundancy. The forestation, herding and fishing sectors and the industrial sectors are found to be the main controllers. The network tends to be mutualistic and synergic, though some competitive relationships that weaken the virtual water circulation still exist.

  12. Discrimination between smiling faces: Human observers vs. automated face analysis.

    Science.gov (United States)

    Del Líbano, Mario; Calvo, Manuel G; Fernández-Martín, Andrés; Recio, Guillermo

    2018-05-11

    This study investigated (a) how prototypical happy faces (with happy eyes and a smile) can be discriminated from blended expressions with a smile but non-happy eyes, depending on type and intensity of the eye expression; and (b) how smile discrimination differs for human perceivers versus automated face analysis, depending on affective valence and morphological facial features. Human observers categorized faces as happy or non-happy, or rated their valence. Automated analysis (FACET software) computed seven expressions (including joy/happiness) and 20 facial action units (AUs). Physical properties (low-level image statistics and visual saliency) of the face stimuli were controlled. Results revealed, first, that some blended expressions (especially, with angry eyes) had lower discrimination thresholds (i.e., they were identified as "non-happy" at lower non-happy eye intensities) than others (especially, with neutral eyes). Second, discrimination sensitivity was better for human perceivers than for automated FACET analysis. As an additional finding, affective valence predicted human discrimination performance, whereas morphological AUs predicted FACET discrimination. FACET can be a valid tool for categorizing prototypical expressions, but is currently more limited than human observers for discrimination of blended expressions. Configural processing facilitates detection of in/congruence(s) across regions, and thus detection of non-genuine smiling faces (due to non-happy eyes). Copyright © 2018 Elsevier B.V. All rights reserved.

  13. 2015 Chinese Intelligent Automation Conference

    CERN Document Server

    Li, Hongbo

    2015-01-01

    Proceedings of the 2015 Chinese Intelligent Automation Conference presents selected research papers from the CIAC’15, held in Fuzhou, China. The topics include adaptive control, fuzzy control, neural network based control, knowledge based control, hybrid intelligent control, learning control, evolutionary mechanism based control, multi-sensor integration, failure diagnosis, reconfigurable control, etc. Engineers and researchers from academia, industry and the government can gain valuable insights into interdisciplinary solutions in the field of intelligent automation.

  14. Machine learning for network-based malware detection

    DEFF Research Database (Denmark)

    Stevanovic, Matija

    and based on different, mutually complementary, principles of traffic analysis. The proposed approaches rely on machine learning algorithms (MLAs) for automated and resource-efficient identification of the patterns of malicious network traffic. We evaluated the proposed methods through extensive evaluations...

  15. Automated reasoning applications to design analysis

    International Nuclear Information System (INIS)

    Stratton, R.C.

    1984-01-01

    Given the necessary relationships and definitions of design functions and components, validation of system incarnation (the physical product of design) and sneak function analysis can be achieved via automated reasoners. The relationships and definitions must define the design specification and incarnation functionally. For the design specification, the hierarchical functional representation is based on physics and engineering principles and bounded by design objectives and constraints. The relationships and definitions of the design incarnation are manifested as element functional definitions, state relationship to functions, functional relationship to direction, element connectivity, and functional hierarchical configuration

  16. NEXCADE: perturbation analysis for complex networks.

    Directory of Open Access Journals (Sweden)

    Gitanjali Yadav

    Full Text Available Recent advances in network theory have led to considerable progress in our understanding of complex real world systems and their behavior in response to external threats or fluctuations. Much of this research has been invigorated by demonstration of the 'robust, yet fragile' nature of cellular and large-scale systems transcending biology, sociology, and ecology, through application of the network theory to diverse interactions observed in nature such as plant-pollinator, seed-dispersal agent and host-parasite relationships. In this work, we report the development of NEXCADE, an automated and interactive program for inducing disturbances into complex systems defined by networks, focusing on the changes in global network topology and connectivity as a function of the perturbation. NEXCADE uses a graph theoretical approach to simulate perturbations in a user-defined manner, singly, in clusters, or sequentially. To demonstrate the promise it holds for broader adoption by the research community, we provide pre-simulated examples from diverse real-world networks including eukaryotic protein-protein interaction networks, fungal biochemical networks, a variety of ecological food webs in nature as well as social networks. NEXCADE not only enables network visualization at every step of the targeted attacks, but also allows risk assessment, i.e. identification of nodes critical for the robustness of the system of interest, in order to devise and implement context-based strategies for restructuring a network, or to achieve resilience against link or node failures. Source code and license for the software, designed to work on a Linux-based operating system (OS), can be downloaded at http://www.nipgr.res.in/nexcade_download.html. In addition, we have developed NEXCADE as an OS-independent online web server freely available to the scientific community without any login requirement at http://www.nipgr.res.in/nexcade.html.
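
    As a small illustration of the kind of perturbation experiment NEXCADE automates, the sketch below removes nodes from a toy scale-free network, hubs first, and tracks the size of the largest connected component; it uses networkx and is not NEXCADE's own code.

        import random
        import networkx as nx

        def attack_profile(g, order):
            """Remove nodes one by one and record the largest connected component
            size, the usual robustness curve for targeted attacks."""
            g = g.copy()
            sizes = [len(max(nx.connected_components(g), key=len))]
            for node in order:
                g.remove_node(node)
                sizes.append(len(max(nx.connected_components(g), key=len)) if g else 0)
            return sizes

        g = nx.barabasi_albert_graph(60, 2, seed=7)   # scale-free toy network
        hubs_first = sorted(g.nodes, key=g.degree, reverse=True)[:10]
        random.seed(0)
        random_order = random.sample(list(g.nodes), 10)
        print("targeted:", attack_profile(g, hubs_first))
        print("random:  ", attack_profile(g, random_order))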

  17. First Annual Workshop on Space Operations Automation and Robotics (SOAR 87)

    Science.gov (United States)

    Griffin, Sandy (Editor)

    1987-01-01

    Several topics relative to automation and robotics technology are discussed. Automation of checkout, ground support, and logistics; automated software development; man-machine interfaces; neural networks; systems engineering and distributed/parallel processing architectures; and artificial intelligence/expert systems are among the topics covered.

  18. A fully automated entanglement-based quantum cryptography system for telecom fiber networks

    International Nuclear Information System (INIS)

    Treiber, Alexander; Ferrini, Daniele; Huebel, Hannes; Zeilinger, Anton; Poppe, Andreas; Loruenser, Thomas; Querasser, Edwin; Matyus, Thomas; Hentschel, Michael

    2009-01-01

    We present in this paper a quantum key distribution (QKD) system based on polarization entanglement for use in telecom fibers. A QKD exchange up to 50 km was demonstrated in the laboratory with a secure key rate of 550 bits/s. The system is compact and portable with a fully automated start-up, and stabilization modules for polarization, synchronization and photon coupling allow hands-off operation. Stable and reliable key exchange in a deployed optical fiber of 16 km length was demonstrated. In this fiber network, we achieved over 2 weeks an automatic key generation with an average key rate of 2000 bits/s without manual intervention. During this period, the system had an average entanglement visibility of 93%, highlighting the technical level and stability achieved for entanglement-based quantum cryptography.

  19. OpenComet: An automated tool for comet assay image analysis

    Directory of Open Access Journals (Sweden)

    Benjamin M. Gyori

    2014-01-01

    Full Text Available Reactive species such as free radicals are constantly generated in vivo and DNA is the most important target of oxidative stress. Oxidative DNA damage is used as a predictive biomarker to monitor the risk of development of many diseases. The comet assay is widely used for measuring oxidative DNA damage at a single cell level. The analysis of comet assay output images, however, poses considerable challenges. Commercial software is costly and restrictive, while free software generally requires laborious manual tagging of cells. This paper presents OpenComet, an open-source software tool providing automated analysis of comet assay images. It uses a novel and robust method for finding comets based on geometric shape attributes and segmenting the comet heads through image intensity profile analysis. Due to automation, OpenComet is more accurate, less prone to human bias, and faster than manual analysis. A live analysis functionality also allows users to analyze images captured directly from a microscope. We have validated OpenComet on both alkaline and neutral comet assay images as well as sample images from existing software packages. Our results show that OpenComet achieves high accuracy with significantly reduced analysis time.

  20. Automation for System Safety Analysis

    Science.gov (United States)

    Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul

    2009-01-01

    This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  1. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    Science.gov (United States)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: (1) allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; (2) eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; (3) lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; and (4) providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: (1) ASTM C1340/C1340M-10, Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; (2) ASTM F 2815, Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; and (3) ASTM E2807, Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in a test standard. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  2. Automated processing of thermal infrared images of Osservatorio Vesuviano permanent surveillance network by using Matlab code

    Science.gov (United States)

    Sansivero, Fabio; Vilardo, Giuseppe; Caputo, Teresa

    2017-04-01

    The permanent thermal infrared surveillance network of Osservatorio Vesuviano (INGV) is composed of 6 stations which acquire IR frames of fumarole fields in the Campi Flegrei caldera and inside the Vesuvius crater (Italy). The IR frames are uploaded to a dedicated server in the Surveillance Center of Osservatorio Vesuviano in order to process the infrared data and to extract all the information they contain. In a first phase the infrared data are processed by an automated system (A.S.I.R.A. Acq, Automated System of IR Analysis and Acquisition) developed in the Matlab environment with a user-friendly graphical user interface (GUI). ASIRA daily generates time series of residual temperature values of the maximum temperatures observed in the IR scenes after the removal of seasonal effects. These time series are displayed in the Surveillance Room of Osservatorio Vesuviano and provide information about the evolution of the shallow temperature field of the observed areas. In particular, the features of ASIRA Acq include: (a) efficient quality selection of IR scenes, (b) IR image co-registration with respect to a reference frame, (c) seasonal correction using a background-removal methodology, and (d) filing of IR matrices and of the processed data in shared archives accessible to interrogation. The daily archived records can also be processed by ASIRA Plot (Matlab code with GUI) to visualize IR data time series and to help in evaluating input parameters for further data processing and analysis. Additional processing features are accomplished in a second phase by ASIRA Tools, a Matlab code with GUI developed to extract further information from the dataset in an automated way. The main functions of ASIRA Tools are: (a) the analysis of temperature variations of each pixel of the IR frame in a given time interval, and (b) the removal of seasonal effects from the temperature of every pixel in the IR frames by using an analytic approach (removal of the sinusoidal long-term seasonal component by using a
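
    As a rough sketch of the analytic seasonal correction (the record is truncated, so the exact method is unknown), the code below removes a least-squares-fitted annual sinusoid from a synthetic temperature series, leaving the residual anomaly; all numbers are invented for the example.

        import numpy as np

        def remove_annual_cycle(t_days, temps):
            """Least-squares fit of y = a*sin(2*pi*t/365) + b*cos(2*pi*t/365) + c,
            returning the residual after subtracting the fitted seasonal component."""
            t = np.asarray(t_days, float)
            y = np.asarray(temps, float)
            w = 2.0 * np.pi * t / 365.0
            A = np.column_stack([np.sin(w), np.cos(w), np.ones_like(w)])
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            return y - A @ coef

        # Synthetic two-year series: 10 degC annual swing plus a slow anomalous rise
        t = np.arange(730)
        rng = np.random.default_rng(4)
        temps = 60 + 10 * np.sin(2 * np.pi * t / 365) + 0.01 * t + rng.normal(0, 0.5, t.size)
        residual = remove_annual_cycle(t, temps)
        print(residual[:3].round(2), residual[-3:].round(2))  # trend survives, cycle removed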

  3. Urban Automation Networks: Current and Emerging Solutions for Sensed Data Collection and Actuation in Smart Cities.

    Science.gov (United States)

    Gomez, Carles; Paradells, Josep

    2015-09-10

    Urban Automation Networks (UANs) are being deployed worldwide in order to enable Smart City applications. Given the crucial role of UANs, as well as their diversity, it is critically important to assess their properties and trade-offs. This article introduces the requirements and challenges for UANs, characterizes the main current and emerging UAN paradigms, provides guidelines for their design and/or choice, and comparatively examines their performance in terms of a variety of parameters including coverage, power consumption, latency, standardization status and economic cost.

  4. Urban Automation Networks: Current and Emerging Solutions for Sensed Data Collection and Actuation in Smart Cities

    Directory of Open Access Journals (Sweden)

    Carles Gomez

    2015-09-01

    Full Text Available Urban Automation Networks (UANs) are being deployed worldwide in order to enable Smart City applications. Given the crucial role of UANs, as well as their diversity, it is critically important to assess their properties and trade-offs. This article introduces the requirements and challenges for UANs, characterizes the main current and emerging UAN paradigms, provides guidelines for their design and/or choice, and comparatively examines their performance in terms of a variety of parameters including coverage, power consumption, latency, standardization status and economic cost.

  5. Using satellite communications for a mobile computer network

    Science.gov (United States)

    Wyman, Douglas J.

    1993-01-01

    The topics discussed include the following: patrol car automation, mobile computer network, network requirements, network design overview, MCN mobile network software, MCN hub operation, mobile satellite software, hub satellite software, the benefits of patrol car automation, the benefits of satellite mobile computing, and national law enforcement satellite.

  6. Robotics/Automated Systems Task Analysis and Description of Required Job Competencies Report. Task Analysis and Description of Required Job Competencies of Robotics/Automated Systems Technicians.

    Science.gov (United States)

    Hull, Daniel M.; Lovett, James E.

    This task analysis report for the Robotics/Automated Systems Technician (RAST) curriculum project first provides a RAST job description. It then discusses the task analysis, including the identification of tasks, the grouping of tasks according to major areas of specialty, and the comparison of the competencies to existing or new courses to…

  7. Co-creative design developments for accessibility and home automation

    OpenAIRE

    Taib, SM; De Coster, R; Sabri Tekantape, E

    2017-01-01

    The term “Home Automation” refers to a networked home, which provides electronically controlled security and convenience for its users. Home automation is also defined as the integration of home-based technology and services for a better quality of living (Quynh et al., 2012). The main purpose of home automation technologies is to enhance home comfort for everyone through higher security, the automation of domestic tasks and easier communication. Home automation should be able to enha...

  8. An analysis of communications and networking technologies for the smart grid

    Energy Technology Data Exchange (ETDEWEB)

    Garcia Hernandez, Joaquin [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)

    2013-03-01

    The Smart Grid concept has been foreseen as the integration of the electrical generation, transmission and distribution network with the data communications network. Although traditional communications interfaces, protocols and standards have been used in the electrical grid in an isolated manner, a modern communications network is considered the fundamental enabling technology within the future Smart Grid. Modern communications technologies, protocol architectures and standards can help to build a common communications network infrastructure for data transport between customer premises, power substations, power distribution systems, utility control centers and utility data centers. The Smart Grid will support traditional applications such as SCADA, distribution automation (DA), energy management systems (EMS), demand side management (DSM) and automatic meter reading (AMR), etc., as well as new applications like advanced metering infrastructure (AMI), substation automation (SA), microgrids, distributed generation, grid monitoring and control, and data storage and analysis, among others. To make this possible, the Smart Grid requires a two-way wide area communications network between different dispersed areas, from generation to distribution to consumer premises. In fact, it will consist of many different types of communications networks such as wide area networks, local area networks, home area networks, etc. This requires a new architectural approach to implement a common communications infrastructure that provides the reliability, scalability, security and interoperability to support multiple applications. In addition, open standards addressing interoperability are key for the development and deployment of the Smart Grid as a true integrated network. A communications backbone is necessary to provide interoperability. To achieve the level of networking, interoperability and security that meets the technical requirements of the Smart Grid, its data communications

  9. D-MSR: A Distributed Network Management Scheme for Real-Time Monitoring and Process Control Applications in Wireless Industrial Automation

    Science.gov (United States)

    Zand, Pouria; Dilo, Arta; Havinga, Paul

    2013-01-01

    Current wireless technologies for industrial applications, such as WirelessHART and ISA100.11a, use a centralized management approach where a central network manager handles the requirements of the static network. However, such a centralized approach has several drawbacks. For example, it cannot cope with dynamicity/disturbance in large-scale networks in a real-time manner and it incurs a high communication overhead and latency for exchanging management traffic. In this paper, we therefore propose a distributed network management scheme, D-MSR. It enables the network devices to join the network, schedule their communications, establish end-to-end connections by reserving the communication resources for addressing real-time requirements, and cope with network dynamicity (e.g., node/edge failures) in a distributed manner. To our knowledge, this is the first distributed management scheme based on the IEEE 802.15.4e standard, which guides the nodes in different phases from joining until publishing their sensor data in the network. We demonstrate via simulation that D-MSR can address real-time and reliable communication as well as the high throughput requirements of industrial automation wireless networks, while also achieving higher efficiency in network management than WirelessHART, in terms of delay and overhead. PMID:23807687

  10. D-MSR: a distributed network management scheme for real-time monitoring and process control applications in wireless industrial automation.

    Science.gov (United States)

    Zand, Pouria; Dilo, Arta; Havinga, Paul

    2013-06-27

    Current wireless technologies for industrial applications, such as WirelessHART and ISA100.11a, use a centralized management approach where a central network manager handles the requirements of the static network. However, such a centralized approach has several drawbacks. For example, it cannot cope with dynamicity/disturbance in large-scale networks in a real-time manner and it incurs a high communication overhead and latency for exchanging management traffic. In this paper, we therefore propose a distributed network management scheme, D-MSR. It enables the network devices to join the network, schedule their communications, establish end-to-end connections by reserving the communication resources for addressing real-time requirements, and cope with network dynamicity (e.g., node/edge failures) in a distributed manner. To our knowledge, this is the first distributed management scheme based on the IEEE 802.15.4e standard, which guides the nodes in different phases from joining until publishing their sensor data in the network. We demonstrate via simulation that D-MSR can address real-time and reliable communication as well as the high throughput requirements of industrial automation wireless networks, while also achieving higher efficiency in network management than WirelessHART, in terms of delay and overhead.

  11. Social network analysis applied to team sports analysis

    CERN Document Server

    Clemente, Filipe Manuel; Mendes, Rui Sousa

    2016-01-01

    Explaining how graph theory and social network analysis can be applied to team sports analysis, this book presents useful approaches, models and methods that can be used to characterise the overall properties of team networks and identify the prominence of each team player. Exploring the different network metrics that can be utilised in sports analysis, their possible applications and variances from situation to situation, the respective chapters present an array of illustrative case studies. Identifying the general concepts of social network analysis and network centrality metrics, readers are shown how to generate a methodological protocol for data collection. As such, the book provides a valuable resource for students of the sport sciences, sports engineering, applied computation and the social sciences.
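
    As a small worked example in the spirit of the book (not taken from it), the sketch below builds a toy weighted passing network with networkx and computes two common prominence measures; note that betweenness interprets weights as distances, so pass counts are inverted first.

        import networkx as nx

        # Toy passing network: directed edges weighted by completed passes
        passes = [("GK", "CB1", 12), ("GK", "CB2", 10), ("CB1", "CM", 18),
                  ("CB2", "CM", 15), ("CM", "LW", 9), ("CM", "RW", 11),
                  ("CM", "ST", 7), ("LW", "ST", 6), ("RW", "ST", 8)]
        g = nx.DiGraph()
        g.add_weighted_edges_from(passes)

        # Prominence of each player in ball circulation
        print("degree centrality:", nx.degree_centrality(g))

        # Betweenness treats weights as distances, so invert pass counts first
        for u, v, d in g.edges(data=True):
            d["distance"] = 1.0 / d["weight"]
        print("betweenness:", nx.betweenness_centrality(g, weight="distance"))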

  12. 40 CFR 13.19 - Analysis of costs; automation; prevention of overpayments, delinquencies or defaults.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 1 2010-07-01 2010-07-01 false Analysis of costs; automation; prevention of overpayments, delinquencies or defaults. 13.19 Section 13.19 Protection of Environment...; automation; prevention of overpayments, delinquencies or defaults. (a) The Administrator may periodically...

  13. Engineering Mathematical Analysis Method for Productivity Rate in Linear Arrangement Serial Structure Automated Flow Assembly Line

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2015-01-01

    Full Text Available Productivity rate (Q, or production rate, is one of the important indicator criteria for industrial engineers seeking to improve the system and the finished-goods output of a production or assembly line. Mathematical and statistical analysis methods need to be applied to the productivity rate to give a visual overview of the failure factors and to guide further improvement within the production line, especially for an automated flow line, since it is complicated. A mathematical model of the productivity rate in a linear arrangement serial structure automated flow line, with failure rate and bottleneck machining time as parameters, is the basic model for this productivity analysis. This paper presents an engineering mathematical analysis method applied in an automotive company in Malaysia that operates an automated flow assembly line in final assembly to produce motorcycles. The DCAS engineering and mathematical analysis method, which consists of four stages known as data collection, calculation and comparison, analysis, and sustainable improvement, is used to analyze productivity in the automated flow assembly line based on this mathematical model. The variety of failure rates causing loss of productivity and the bottleneck machining time are quantified explicitly, and a sustainable solution for productivity improvement of this final assembly automated flow line is presented.
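
    The record gives the model only in words; the sketch below encodes one plausible reading, productivity as the reciprocal of the bottleneck cycle time plus auxiliary time plus expected downtime from station failures. All symbols and numbers are assumptions for illustration, not the paper's exact formula.

        def productivity_rate(t_bottleneck, t_aux, failure_rates, mean_repair_time):
            """Illustrative productivity model for a serial automated flow line:
            Q = 1 / (t_bottleneck + t_aux + sum(lambda_i) * m_r),
            where each station i fails at rate lambda_i (failures per cycle) and
            each failure stops the whole serial line for m_r time units on average."""
            downtime_per_cycle = sum(failure_rates) * mean_repair_time
            return 1.0 / (t_bottleneck + t_aux + downtime_per_cycle)

        # Example: bottleneck machining 0.8 min, transport 0.2 min, five stations
        lams = [0.004, 0.006, 0.003, 0.005, 0.002]   # failures per cycle per station
        print(f"{productivity_rate(0.8, 0.2, lams, 15.0):.3f} parts/min")  # ~0.769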

  14. Automated Tracking of Cell Migration with Rapid Data Analysis.

    Science.gov (United States)

    DuChez, Brian J

    2017-09-01

    Cell migration is essential for many biological processes including development, wound healing, and metastasis. However, studying cell migration often requires the time-consuming and labor-intensive task of manually tracking cells. To accelerate the task of obtaining coordinate positions of migrating cells, we have developed a graphical user interface (GUI) capable of automating the tracking of fluorescently labeled nuclei. This GUI provides an intuitive user interface that makes automated tracking accessible to researchers with no image-processing experience or familiarity with particle-tracking approaches. Using this GUI, users can interactively determine a minimum of four parameters to identify fluorescently labeled cells and automate acquisition of cell trajectories. Additional features allow for batch processing of numerous time-lapse images, curation of unwanted tracks, and subsequent statistical analysis of tracked cells. Statistical outputs allow users to evaluate migratory phenotypes, including cell speed, distance, displacement, and persistence, as well as measures of directional movement, such as forward migration index (FMI) and angular displacement. © 2017 by John Wiley & Sons, Inc. Copyright © 2017 John Wiley & Sons, Inc.
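
    A minimal sketch of the statistical outputs described, computed from a single (n, 2) trajectory of nucleus positions, is given below; the GUI's actual definitions may differ in detail, and all names are illustrative.

        import numpy as np

        def migration_stats(track, dt=1.0, fmi_axis=0):
            """Per-cell migration metrics from an (n, 2) array of nucleus positions.

            Returns total path length, net displacement, mean speed, persistence
            (displacement / path length) and forward migration index along one axis.
            """
            track = np.asarray(track, float)
            steps = np.diff(track, axis=0)
            path = np.linalg.norm(steps, axis=1).sum()
            disp = float(np.linalg.norm(track[-1] - track[0]))
            return {
                "path_length": path,
                "displacement": disp,
                "speed": path / (dt * len(steps)),
                "persistence": disp / path if path > 0 else 0.0,
                "fmi": (track[-1, fmi_axis] - track[0, fmi_axis]) / path if path > 0 else 0.0,
            }

        # Example: a cell drifting rightward with some wobble, frames 10 min apart
        track = [(0, 0), (2, 1), (4, -1), (6, 0), (9, 1)]
        print(migration_stats(track, dt=10.0))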

  15. Automated analysis of small animal PET studies through deformable registration to an atlas

    International Nuclear Information System (INIS)

    Gutierrez, Daniel F.; Zaidi, Habib

    2012-01-01

    This work aims to develop a methodology for automated atlas-guided analysis of small animal positron emission tomography (PET) data through deformable registration to an anatomical mouse model. A non-rigid registration technique is used to put into correspondence relevant anatomical regions of rodent CT images from combined PET/CT studies to corresponding CT images of the Digimouse anatomical mouse model. The latter provides a pre-segmented atlas consisting of 21 anatomical regions suitable for automated quantitative analysis. Image registration is performed using a package based on the Insight Toolkit allowing the implementation of various image registration algorithms. The optimal parameters obtained for deformable registration were applied to simulated and experimental mouse PET/CT studies. The accuracy of the image registration procedure was assessed by segmenting mouse CT images into seven regions: brain, lungs, heart, kidneys, bladder, skeleton and the rest of the body. This was accomplished prior to image registration using a semi-automated algorithm. Each mouse segmentation was transformed using the parameters obtained during CT to CT image registration. The resulting segmentation was compared with the original Digimouse atlas to quantify image registration accuracy using established metrics such as the Dice coefficient and Hausdorff distance. PET images were then transformed using the same technique and automated quantitative analysis of tracer uptake performed. The Dice coefficient and Hausdorff distance show fair to excellent agreement and a mean registration mismatch distance of about 6 mm. The results demonstrate good quantification accuracy in most of the regions, especially the brain, but not in the bladder, as expected. Normalized mean activity estimates were preserved between the reference and automated quantification techniques with relative errors below 10 % in most of the organs considered. The proposed automated quantification technique is
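
    The two overlap metrics used here are standard; a compact sketch using numpy and scipy's directed_hausdorff on toy 2-D masks is given below (the toy masks and the 2-voxel shift are invented for the example).

        import numpy as np
        from scipy.spatial.distance import directed_hausdorff

        def dice(mask_a, mask_b):
            """Dice coefficient between two binary segmentation masks."""
            a, b = np.asarray(mask_a, bool), np.asarray(mask_b, bool)
            return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

        def hausdorff(mask_a, mask_b):
            """Symmetric Hausdorff distance between the masks' voxel sets."""
            pts_a = np.argwhere(mask_a)
            pts_b = np.argwhere(mask_b)
            return max(directed_hausdorff(pts_a, pts_b)[0],
                       directed_hausdorff(pts_b, pts_a)[0])

        # Example: atlas organ mask vs. a registered segmentation shifted by 2 voxels
        atlas = np.zeros((40, 40), bool); atlas[10:20, 10:20] = True
        registered = np.zeros((40, 40), bool); registered[12:22, 10:20] = True
        print(f"Dice = {dice(atlas, registered):.2f}, HD = {hausdorff(atlas, registered):.1f}")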

  16. Network protocols. Special issue; Netwerkprotocollen. Themanummer

    Energy Technology Data Exchange (ETDEWEB)

    Jansen, G.A. [RTB Van Heugten, Nijmegen (Netherlands); Rooijakkers, G.W.J. [GTI Building Automation, Amsterdam (Netherlands); Peterse, A. [Regel Partners, Hoevelaken (Netherlands); Smits, P. [Konnex Nederland, Valkenswaard (Netherlands); Hamers, E.P. [Van Dorp Installaties, Breda (Netherlands); Van der Velden, J.A.J. [Kropman, Rijswijk (Netherlands); Van Lingen, G.; Wijn, D.M. [Engineer Johnson Controls, Gorinchem (Netherlands); Deckere, W.J.M.A. [Deerns raadgevende ingenieurs, Rijswijk (Netherlands); Driessen, B. [Saia Burgess, Gouda (Netherlands); Van Olst, K. [K en R Consultants, Deventer (Netherlands); Mosterman, F. [Wago Building Technology, Harderwijk (Netherlands); Staub, R. [BUS-House, Zuerich (Switzerland); Meiring, O.B.; Hut, W.H. [Sauter Building Control Nederland, Amsterdam (Netherlands); Tukker, A. [Webeasy Products, Sliedrecht (Netherlands); Bakker, L.G.; Soethout, L.L.; Elkhuizen, P.A. [TNO Bouw en Ondergrond, Delft (Netherlands); Haeseler, U. [TAC GmbH, Berlin (Germany); Kerdel, J.F. [Siemens Building Technologies, Zoetermeer (Netherlands); Lugt, G.L.; Draijer, G.W.

    2007-11-15

    In 20 articles attention is paid to several aspects of network protocols by means of which building automation systems can exchange data: building automation and management, the history of technical installations management, the open communication standard BACnet (Building Automation and Control network), the ISO/IEC domotics and communication standard KNX (Konnex), the integration of electrotechnical and mechanical installations by means of LonWorks technology, other standard protocols such as Modbus, M-bus and OPC (OLE for Process Control), an outline of TCP/IP, smart design of networks, building owners on automation and networks, the use of BACnet and Ethernet in a monumental building renovated into offices, the use of an open management network in buildings, wireless open integrated systems, terminology in network communication, the use of BACnet in combination with KNX, the impact of BACnet on building automation, the role of the installation sector in the ICT environment, knowledge of building automation and management, regulations with respect to building automation, and BACnet MS/TP (Master-Slave/Token-Passing).

  17. Multifractal analysis of complex networks

    International Nuclear Information System (INIS)

    Wang Dan-Ling; Yu Zu-Guo; Anh V

    2012-01-01

    Complex networks have recently attracted much attention in diverse areas of science and technology. Many networks such as the WWW and biological networks are known to display spatial heterogeneity which can be characterized by their fractal dimensions. Multifractal analysis is a useful way to systematically describe the spatial heterogeneity of both theoretical and experimental fractal patterns. In this paper, we introduce a new box-covering algorithm for multifractal analysis of complex networks. This algorithm is used to calculate the generalized fractal dimensions D_q of some theoretical networks, namely scale-free networks, small-world networks, and random networks, and one kind of real network, namely protein-protein interaction networks of different species. Our numerical results indicate the existence of multifractality in scale-free networks and protein-protein interaction networks, while the multifractal behavior is not clear-cut for small-world networks and random networks. The possible variation of D_q due to changes in the parameters of the theoretical network models is also discussed. (general)
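
    For reference, the generalized fractal dimensions D_q estimated by such a box-covering algorithm follow the standard multifractal definition. With p_i(ε) the fraction of nodes falling in box i at box size ε (on a network, ε is a shortest-path diameter rather than a Euclidean length):

      D_q = \lim_{\varepsilon \to 0} \frac{1}{q-1}
            \frac{\ln \sum_i p_i(\varepsilon)^{q}}{\ln \varepsilon}, \qquad q \neq 1,
      \qquad
      D_1 = \lim_{\varepsilon \to 0}
            \frac{\sum_i p_i(\varepsilon) \ln p_i(\varepsilon)}{\ln \varepsilon}.

    A monofractal has a D_q spectrum that is constant in q, whereas a genuinely multifractal network exhibits a non-trivial, decreasing D_q curve, which is the signature reported here for scale-free and protein-protein interaction networks.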

  18. Failure mode and effects analysis of software-based automation systems

    International Nuclear Information System (INIS)

    Haapanen, P.; Helminen, A.

    2002-08-01

    Failure mode and effects analysis (FMEA) is a well-known analysis method with an established position in traditional reliability analysis. The purpose of FMEA is to identify possible failure modes of the system components, evaluate their influence on system behaviour and propose proper countermeasures to suppress these effects. The generic nature of FMEA has enabled its wide use in various branches of industry, ranging from business management to the design of spacecraft. The popularity and diverse use of the analysis method have led to multiple interpretations, practices and standards presenting the same analysis method. FMEA is well understood at the system and hardware levels, where the potential failure modes usually are known and the task is to analyse their effects on system behaviour. Nowadays, more and more system functions are realised at the software level, which has created a need to apply the FMEA methodology to software-based systems as well. Software failure modes generally are unknown - 'software modules do not fail, they only display incorrect behaviour' - and depend on the dynamic behaviour of the application. These facts set special requirements on the FMEA of software-based systems and make it difficult to carry out. In this report failure mode and effects analysis is studied for use in the reliability analysis of software-based systems. More precisely, the target system of the FMEA is defined to be a safety-critical software-based automation application in a nuclear power plant, implemented on an industrial automation system platform. Through a literature study the report tries to clarify the intriguing questions related to the practical use of software failure mode and effects analysis. The study is part of the research project 'Programmable Automation System Safety Integrity assessment (PASSI)', belonging to the Finnish Nuclear Safety Research Programme (FINNUS, 1999-2002). In the project various safety assessment methods and tools for

  19. Ecological network analysis: network construction

    NARCIS (Netherlands)

    Fath, B.D.; Scharler, U.M.; Ulanowicz, R.E.; Hannon, B.

    2007-01-01

    Ecological network analysis (ENA) is a systems-oriented methodology to analyze within system interactions used to identify holistic properties that are otherwise not evident from the direct observations. Like any analysis technique, the accuracy of the results is as good as the data available, but

  20. Trends and applications of integrated automated ultra-trace sample handling and analysis (T9)

    International Nuclear Information System (INIS)

    Kingston, H.M.S.; Ye Han; Stewart, L.; Link, D.

    2002-01-01

    Full text: Automated analysis, sub-ppt detection limits, and the trend toward speciated analysis (rather than just elemental analysis) force the innovation of sophisticated and integrated sample preparation and analysis techniques. Traditionally, the ability to handle samples at ppt and sub-ppt levels has been limited to clean laboratories and special sample handling techniques and equipment. Sample handling has passed a threshold where older, traditional techniques no longer provide the ability to see the sample, owing to the influence of the analytical blank and the fragile nature of the analyte. When samples require decomposition, extraction, separation and manipulation, newer and more sophisticated sample-handling systems are emerging that enable ultra-trace analysis and species manipulation. In addition, new instrumentation has emerged which integrates sample preparation and analysis to enable on-line, near real-time analysis. Examples of these newer sample-handling methods will be discussed and current examples provided as alternatives to traditional sample handling. Two new ultra-trace techniques applying microwave-energy-enhanced sample handling have been developed that permit sample separation and refinement while performing species manipulation during decomposition. A demonstration applied to semiconductor materials will be presented. Next, a new approach to the old problem of sample evaporation without losses will be demonstrated that is capable of retaining all elements and species tested. Both of these methods require microwave energy manipulation in specialized systems and are not accessible through convection, conduction, or other traditional energy applications. A new automated, integrated method for handling samples for ultra-trace analysis has been developed. An on-line, near real-time measurement system will be described that enables many new automated sample handling and measurement capabilities. This

  1. Linking Library Automation Systems in the Internet: Functional Requirements, Planning, and Policy Issues.

    Science.gov (United States)

    Lynch, Clifford A.

    1989-01-01

    This guide to functions to consider in selecting an academic library automation system to operate in a networked environment covers (1) the current academic networking environment; (2) library automation hardware and software platforms; (3) user interface requirements for public access; and (4) security and authentication. (10 references) (MES)

  2. Video and accelerometer-based motion analysis for automated surgical skills assessment.

    Science.gov (United States)

    Zia, Aneeq; Sharma, Yachna; Bettadapura, Vinay; Sarin, Eric L; Essa, Irfan

    2018-03-01

    Basic surgical skills of suturing and knot tying are an essential part of medical training. Having an automated system for surgical skills assessment could help save experts time and improve training efficiency. There have been some recent attempts at automated surgical skills assessment using either video analysis or acceleration data. In this paper, we present a novel approach for automated assessment of OSATS-like surgical skills and provide an analysis of different features on multi-modal data (video and accelerometer data). We conduct a large study for basic surgical skill assessment on a dataset that contains video and accelerometer data for suturing and knot-tying tasks. We introduce two "entropy-based" features, approximate entropy and cross-approximate entropy, which quantify the amount of predictability and regularity of fluctuations in time-series data. The proposed features are compared to the existing methods of Sequential Motion Texture, Discrete Cosine Transform and Discrete Fourier Transform for surgical skills assessment. We report the average performance of different features across all applicable OSATS-like criteria for suturing and knot-tying tasks. Our analysis shows that the proposed entropy-based features outperform previous state-of-the-art methods using video data, achieving average classification accuracies of 95.1 and 92.2% for suturing and knot tying, respectively. For accelerometer data, our method performs better for suturing, achieving 86.8% average accuracy. We also show that fusion of video and acceleration features can improve overall performance for skill assessment. Automated surgical skills assessment can be achieved with high accuracy using the proposed entropy features. Such a system can significantly improve the efficiency of surgical training in medical schools and teaching hospitals.
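
    Approximate entropy has a compact standard definition; the sketch below (an illustrative reimplementation, not the authors' code) counts, for each length-m template, how many templates stay within a tolerance r under the Chebyshev distance, and compares m with m+1.

      import numpy as np

      def approx_entropy(x, m=2, r=None):
          """Approximate entropy ApEn(m, r) of a 1-D time series:
          low values indicate regular, predictable fluctuations."""
          x = np.asarray(x, dtype=float)
          if r is None:
              r = 0.2 * x.std()          # common default tolerance

          def phi(m):
              n = len(x) - m + 1
              templates = np.array([x[i:i + m] for i in range(n)])
              # Fraction of templates within tolerance r of each template
              # (self-matches included, which avoids log(0))
              counts = [np.sum(np.max(np.abs(templates - t), axis=1) <= r) / n
                        for t in templates]
              return np.mean(np.log(counts))

          return phi(m) - phi(m + 1)

      rng = np.random.default_rng(0)
      print(approx_entropy(np.sin(np.linspace(0, 8 * np.pi, 300))))  # regular -> low
      print(approx_entropy(rng.standard_normal(300)))                # noisy   -> high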

  3. Automated striatal uptake analysis of 18F-FDOPA PET images applied to Parkinson's disease patients

    International Nuclear Information System (INIS)

    Chang Icheng; Lue Kunhan; Hsieh Hungjen; Liu Shuhsin; Kao, Chinhao K.

    2011-01-01

    6-[18F]Fluoro-L-DOPA (FDOPA) is a radiopharmaceutical valuable for assessing presynaptic dopaminergic function when used with positron emission tomography (PET). More specifically, the striatal-to-occipital ratio (SOR) of FDOPA uptake images has been extensively used as a quantitative parameter in these PET studies. Our aim was to develop an easy, automated method capable of performing objective analysis of SOR in FDOPA PET images of Parkinson's disease (PD) patients. Brain images from FDOPA PET studies of 21 patients with PD and 6 healthy subjects were included in our automated striatal analyses. Images of each individual were spatially normalized into an FDOPA template. Subsequently, the image slice with the highest level of basal ganglia activity was chosen among the series of normalized images, and the immediately preceding and following slices were also selected. Finally, the summation of these three images was used to quantify and calculate the SOR values. The results obtained by automated analysis were compared with manual analysis by a trained and experienced image-processing technologist. The SOR values obtained from the automated analysis showed good agreement and high correlation with manual analysis. The differences in caudate, putamen, and striatum were -0.023, -0.029, and -0.025, respectively; the correlation coefficients were 0.961, 0.957, and 0.972, respectively. We have successfully developed a method for automated striatal uptake analysis of FDOPA PET images. There was no significant difference between the SOR values obtained from this method and those from manual analysis, yet it is an unbiased, time-saving and cost-effective program that is easy to implement on a personal computer. (author)
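
    A minimal sketch of the slice-selection and ratio step described above, assuming spatially normalized data and template-space ROI masks; the variable names and the plain-ratio definition are assumptions of the example, not the authors' published code.

      import numpy as np

      def striatal_occipital_ratio(volume, striatum_mask, occipital_mask):
          """volume: (Z, Y, X) normalized FDOPA PET image; masks: 2-D boolean
          ROIs on the template grid. Returns the striatal-to-occipital ratio."""
          # Slice with the highest mean striatal activity
          z = int(np.argmax([s[striatum_mask].mean() for s in volume]))
          # Sum the chosen slice with its immediate neighbours
          lo, hi = max(z - 1, 0), min(z + 2, volume.shape[0])
          summed = volume[lo:hi].sum(axis=0)
          # Some studies instead report (striatum - occipital) / occipital
          return summed[striatum_mask].mean() / summed[occipital_mask].mean()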

  4. Manual versus Automated Narrative Analysis of Agrammatic Production Patterns: The Northwestern Narrative Language Analysis and Computerized Language Analysis

    Science.gov (United States)

    Hsu, Chien-Ju; Thompson, Cynthia K.

    2018-01-01

    Purpose: The purpose of this study is to compare the outcomes of the manually coded Northwestern Narrative Language Analysis (NNLA) system, which was developed for characterizing agrammatic production patterns, and the automated Computerized Language Analysis (CLAN) system, which has recently been adopted to analyze speech samples of individuals…

  5. ROBOCAL: An automated NDA [nondestructive analysis] calorimetry and gamma isotopic system

    International Nuclear Information System (INIS)

    Hurd, J.R.; Powell, W.D.; Ostenak, C.A.

    1989-01-01

    ROBOCAL, which is presently being developed and tested at Los Alamos National Laboratory, is a full-scale, prototype robotic system for remote calorimetric and gamma-ray analysis of special nuclear materials. It integrates a fully automated, multidrawer, vertical stacker-retriever system for staging unmeasured nuclear materials, and a fully automated gantry robot for computer-based selection and transfer of nuclear materials to calorimetric and gamma-ray measurement stations. Since ROBOCAL is designed for minimal operator intervention, a completely programmed user interface is provided to interact with the automated mechanical and assay systems. The assay system is designed to completely integrate calorimetric and gamma-ray data acquisition and to perform state-of-the-art analyses on both homogeneous and heterogeneous distributions of nuclear materials in a wide variety of matrices

  6. Performance of wavelet analysis and neural networks for pathological voices identification

    Science.gov (United States)

    Salhi, Lotfi; Talbi, Mourad; Abid, Sabeur; Cherif, Adnane

    2011-09-01

    Within the medical environment, diverse techniques exist to assess the state of a patient's voice. The inspection technique is inconvenient for a number of reasons, such as its high cost, the duration of the inspection and, above all, the fact that it is an invasive technique. This study focuses on a robust, rapid and accurate system for automatic identification of pathological voices. The system employs a non-invasive, inexpensive and fully automated method based on a hybrid approach: wavelet transform analysis and a neural network classifier. First, we present the results obtained in our previous study using classic feature parameters. These results allow visual identification of pathological voices. Second, quantified parameters derived from the wavelet analysis are proposed to characterise the speech sample. In addition, a system of multilayer neural networks (MNNs) has been developed which carries out the automatic detection of pathological voices. The developed method was evaluated using a voice database composed of recorded voice samples (continuous speech) from normophonic and dysphonic speakers. The dysphonic speakers were patients of the RABTA National Hospital of Tunis, Tunisia, and a university hospital in Brussels, Belgium. Experimental results indicate a success rate ranging between 75% and 98.61% for discrimination of normal and pathological voices using the proposed parameters and neural network classifier. We also compared the average classification rate based on the MNN, a Gaussian mixture model and support vector machines.
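
    The hybrid pipeline, wavelet-derived features feeding a multilayer neural network, can be sketched as follows. This is a toy illustration with synthetic "voices" and assumed parameter choices (db4 wavelet, normalized sub-band energies), not the authors' system.

      import numpy as np
      import pywt
      from sklearn.neural_network import MLPClassifier

      def wavelet_features(signal, wavelet="db4", level=4):
          """Normalized energy of each wavelet sub-band of the signal."""
          coeffs = pywt.wavedec(signal, wavelet, level=level)
          energies = np.array([np.sum(c ** 2) for c in coeffs])
          return energies / energies.sum()     # remove overall loudness

      def make_voice(pathological, rng, n=2048):
          t = np.linspace(0, 1, n)
          tone = np.sin(2 * np.pi * 120 * t)   # idealized glottal tone
          noise = rng.standard_normal(n)       # breathiness / irregularity
          return tone + (1.0 if pathological else 0.1) * noise

      rng = np.random.default_rng(1)
      labels = np.array([0, 1] * 40)
      X = np.array([wavelet_features(make_voice(y, rng)) for y in labels])

      clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                          random_state=0).fit(X, labels)
      print(clf.score(X, labels))              # accuracy on the toy data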

  7. Connecting Land-Based Networks to Ships

    Science.gov (United States)

    2013-06-01

    …multipoint wireless broadband systems, and WiMAX networks were initially deployed for fixed and nomadic (portable) applications. These standards… CAPABILITIES OF SHIP-TO-SHORE COMMUNICATIONS: A. US Navy Automated Digital Network System (ADNS). The U.S. Navy's Automated Digital Network System (ADNS)… submit digitally any necessary documents to the terminal operators, contact their logistics providers, access tidal information and receive…

  8. Modeling Increased Complexity and the Reliance on Automation: FLightdeck Automation Problems (FLAP) Model

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.
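
    The FLAP model's actual structure and conditional probabilities come from SME elicitation; the fragment below only illustrates the mechanics of a Bayesian Belief Network in Python with the pgmpy library, using two invented causal factors and made-up probabilities.

      # Requires pgmpy (recent versions; older releases name the class BayesianModel)
      from pgmpy.models import BayesianNetwork
      from pgmpy.factors.discrete import TabularCPD
      from pgmpy.inference import VariableElimination

      # Toy fragment: pilot over-reliance influences loss of situational
      # awareness, which in turn influences an automation-related error.
      model = BayesianNetwork([("OverReliance", "LowAwareness"),
                               ("LowAwareness", "AutomationError")])
      model.add_cpds(
          TabularCPD("OverReliance", 2, [[0.7], [0.3]]),
          TabularCPD("LowAwareness", 2, [[0.9, 0.4], [0.1, 0.6]],
                     evidence=["OverReliance"], evidence_card=[2]),
          TabularCPD("AutomationError", 2, [[0.95, 0.5], [0.05, 0.5]],
                     evidence=["LowAwareness"], evidence_card=[2]),
      )
      assert model.check_model()

      # P(AutomationError | OverReliance = 1): the kind of query used to
      # estimate risk reduction when a mitigation lowers P(OverReliance).
      posterior = VariableElimination(model).query(
          ["AutomationError"], evidence={"OverReliance": 1})
      print(posterior)

    Inserting a technology into such a model then amounts to adjusting the relevant prior or conditional table and re-running the query to read off the change in risk.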

  9. FULLY AUTOMATED IMAGE ORIENTATION IN THE ABSENCE OF TARGETS

    Directory of Open Access Journals (Sweden)

    C. Stamatopoulos

    2012-07-01

    Full Text Available Automated close-range photogrammetric network orientation has traditionally been associated with the use of coded targets in the object space to allow for an initial relative orientation (RO) and subsequent spatial resection of the images. Over the past decade, automated orientation via feature-based matching (FBM) techniques has attracted renewed research attention in both the photogrammetry and computer vision (CV) communities. This is largely due to advances made towards the goal of automated relative orientation of multi-image networks covering untargetted (markerless) objects. There are now a number of CV-based algorithms, with accompanying open-source software, that can achieve multi-image orientation within narrow-baseline networks. From a photogrammetric standpoint, the results are typically disappointing as the metric integrity of the resulting models is generally poor, or even unknown, while the number of outliers within the image matching and triangulation is large, and generally too large to allow relative orientation (RO) via the commonly used coplanarity equations. On the other hand, there are few examples within the photogrammetric research field of automated markerless camera calibration to metric tolerances, and these too are restricted to narrow-baseline, low-convergence imaging geometry. The objective addressed in this paper is markerless automatic multi-image orientation, maintaining metric integrity, within networks that incorporate wide-baseline imagery. By wide-baseline we imply convergent multi-image configurations with convergence angles of up to around 90°. An associated aim is provision of a fast, fully automated process, which can be performed without user intervention. For this purpose, various algorithms require optimisation to allow parallel processing utilising multiple PC cores and graphics processing units (GPUs).
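
    For reference, the coplanarity condition underpinning the RO step can be written in essential-matrix form as

      \mathbf{x}_2^{\top} \mathbf{E}\, \mathbf{x}_1 = 0, \qquad
      \mathbf{E} = [\mathbf{t}]_{\times} \mathbf{R},

    where x_1 and x_2 are homogeneous image coordinates of the same object point in the two images, R and t are the relative rotation and base vector between the cameras, and [t]_x is the skew-symmetric matrix of t. Every mismatched feature pair violates this constraint, which is why the large outlier fractions produced by FBM on wide-baseline imagery defeat naive RO estimation.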

  10. MOST-visualization: software for producing automated textbook-style maps of genome-scale metabolic networks.

    Science.gov (United States)

    Kelley, James J; Maor, Shay; Kim, Min Kyung; Lane, Anatoliy; Lun, Desmond S

    2017-08-15

    Visualization of metabolites, reactions and pathways in genome-scale metabolic networks (GEMs) can assist in understanding cellular metabolism. Three attributes are desirable in software used for visualizing GEMs: (i) automation, since GEMs can be quite large; (ii) production of understandable maps that provide ease in identification of pathways, reactions and metabolites; and (iii) visualization of the entire network to show how pathways are interconnected. No software currently exists for visualizing GEMs that satisfies all three characteristics, but MOST-Visualization, an extension of the software package MOST (Metabolic Optimization and Simulation Tool), satisfies (i), and by using a pre-drawn overview map of metabolism based on the Roche map satisfies (ii) and comes close to satisfying (iii). MOST is distributed for free under the GNU General Public License. The software and full documentation are available at http://most.ccib.rutgers.edu/. dslun@rutgers.edu. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  11. Specdata: Automated Analysis Software for Broadband Spectra

    Science.gov (United States)

    Oliveira, Jasmine N.; Martin-Drumel, Marie-Aline; McCarthy, Michael C.

    2017-06-01

    With the advancement of chirped-pulse techniques, broadband rotational spectra with a few tens to several hundred GHz of spectral coverage are now routinely recorded. When studying multi-component mixtures that might result, for example, with the use of an electrical discharge, lines of new chemical species are often obscured by those of known compounds, and analysis can be laborious. To address this issue, we have developed SPECdata, an open source, interactive tool which is designed to simplify and greatly accelerate the spectral analysis and discovery. Our software tool combines both automated and manual components that free the user from computation, while giving him/her considerable flexibility to assign, manipulate, interpret and export their analysis. The automated - and key - component of the new software is a database query system that rapidly assigns transitions of known species in an experimental spectrum. For each experiment, the software identifies spectral features, and subsequently assigns them to known molecules within an in-house database (Pickett .cat files, list of frequencies...), or those catalogued in Splatalogue (using automatic on-line queries). With suggested assignments, the control is then handed over to the user who can choose to accept, decline or add additional species. Data visualization, statistical information, and interactive widgets assist the user in making decisions about their data. SPECdata has several other useful features intended to improve the user experience. Exporting a full report of the analysis, or a peak file in which assigned lines are removed are among several options. A user may also save their progress to continue at another time. Additional features of SPECdata help the user to maintain and expand their database for future use. A user-friendly interface allows one to search, upload, edit or update catalog or experiment entries.
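
    The key automated step, assigning observed peaks to catalogued transitions within a frequency tolerance, reduces to a nearest-neighbour lookup. A minimal sketch follows; the catalog contents and tolerance are illustrative stand-ins, not SPECdata's actual database.

      import numpy as np

      def assign_lines(peaks_mhz, catalog, tol_mhz=0.1):
          """Match experimental peak frequencies to catalogued transitions.
          catalog maps species name -> array of rest frequencies (MHz)."""
          assignments = {}
          for f in peaks_mhz:
              best = None
              for species, freqs in catalog.items():
                  i = np.argmin(np.abs(freqs - f))
                  d = abs(freqs[i] - f)
                  if d <= tol_mhz and (best is None or d < best[2]):
                      best = (species, freqs[i], d)
              assignments[f] = best    # None -> unassigned, candidate new species
          return assignments

      # Illustrative two-species catalog
      catalog = {"OCS": np.array([12162.98, 24325.92]),
                 "HC3N": np.array([9098.33, 18196.31])}
      print(assign_lines([12163.02, 15000.00], catalog))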

  12. IMAGE CONSTRUCTION TO AUTOMATION OF PROJECTIVE TECHNIQUES FOR PSYCHOPHYSIOLOGICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Natalia Pavlova

    2018-04-01

    Full Text Available This article presents an approach to automating the assessment step of projective drawing techniques, in which a person creates drawings from an available set of templates. Automation will make it possible to reveal disturbances of a person's mentality more effectively. In particular, such a solution can be used in work with children, who possess well-developed figurative thinking but are not yet capable of accurately articulating their thoughts and experiences. To automate testing with a projective method, we construct an interactive environment for the visualization of compositions of several images and then analyse

  13. The control network of air quality in the Lorraine steel industry country: an example of a specific steel industry network

    International Nuclear Information System (INIS)

    Poncin, G.

    1991-01-01

    This network for air quality control, specific to a steel industry region, mainly measures concentrations of sulfur dioxide, airborne dust and particle fallout. The recent automation of this network required a preliminary optimization study, which consisted of a statistical analysis of the large amounts of data collected by many hand-operated sensors. The installation and operating conditions of the new equipment have required the use of air-conditioned monoblock metallic cabins.

  14. Network Analysis, Architecture, and Design

    CERN Document Server

    McCabe, James D

    2007-01-01

    Traditionally, networking has had little or no basis in analysis or architectural development, with designers relying on technologies they are most familiar with or being influenced by vendors or consultants. However, the landscape of networking has changed so that network services have now become one of the most important factors to the success of many third generation networks. It has become an important feature of the designer's job to define the problems that exist in his network, choose and analyze several optimization parameters during the analysis process, and then prioritize and evalua

  15. Automated Analysis of Accountability

    DEFF Research Database (Denmark)

    Bruni, Alessandro; Giustolisi, Rosario; Schürmann, Carsten

    2017-01-01

    that the system can detect the misbehaving parties who caused that failure. Accountability is an intuitively stronger property than verifiability as the latter only rests on the possibility of detecting the failure of a goal. A plethora of accountability and verifiability definitions have been proposed...... in the literature. Those definitions are either very specific to the protocols in question, hence not applicable in other scenarios, or too general and widely applicable but requiring complicated and hard to follow manual proofs. In this paper, we advance formal definitions of verifiability and accountability...... that are amenable to automated verification. Our definitions are general enough to be applied to different classes of protocols and different automated security verification tools. Furthermore, we point out formally the relation between verifiability and accountability. We validate our definitions...

  16. Automated Classification and Analysis of Non-metallic Inclusion Data Sets

    Science.gov (United States)

    Abdulsalam, Mohammad; Zhang, Tongsheng; Tan, Jia; Webler, Bryan A.

    2018-05-01

    The aim of this study is to utilize principal component analysis (PCA), clustering methods, and correlation analysis to condense and examine large, multivariate data sets produced by automated analysis of non-metallic inclusions. Non-metallic inclusions play a major role in defining the properties of steel, and their examination has been greatly aided by automated analysis in scanning electron microscopes equipped with energy dispersive X-ray spectroscopy. The methods were applied to analyze inclusions in two sets of samples: two laboratory-scale samples and four industrial samples from near-finished 4140 alloy steel components with varying machinability. The laboratory samples had well-defined inclusion chemistries, composed of MgO-Al2O3-CaO, spinel (MgO-Al2O3), and calcium aluminate inclusions. The industrial samples contained MnS inclusions as well as (Ca,Mn)S + calcium aluminate oxide inclusions. PCA could be used to reduce the inclusion chemistry variables to a 2D plot, which revealed inclusion chemistry groupings in the samples. Clustering methods were used to automatically classify inclusion chemistry measurements into groups, i.e., no user-defined rules were required.
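
    A minimal sketch of the PCA-plus-clustering workflow on made-up inclusion chemistry data; the element fractions, compositions and cluster count below are assumptions of the example, not the paper's data.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans
      from sklearn.preprocessing import StandardScaler

      # Toy stand-in for automated SEM/EDS output: each row is one inclusion,
      # columns are element fractions (e.g. Mg, Al, Ca, Mn, S).
      rng = np.random.default_rng(0)
      spinel = rng.normal([0.30, 0.60, 0.02, 0.04, 0.04], 0.03, (60, 5))
      mns    = rng.normal([0.02, 0.03, 0.05, 0.45, 0.45], 0.03, (40, 5))
      X = StandardScaler().fit_transform(np.vstack([spinel, mns]))

      # Reduce the chemistry variables to two principal components for a 2D plot
      scores = PCA(n_components=2).fit_transform(X)

      # Group measurements into chemistry classes without user-defined rules
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
      print(np.bincount(labels))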

  17. Development of automated system based on neural network algorithm for detecting defects on molds installed on casting machines

    Science.gov (United States)

    Bazhin, V. Yu; Danilov, I. V.; Petrov, P. A.

    2018-05-01

    During the casting of light alloys and ligatures based on aluminum and magnesium, problems arise with the distribution of the metal and the quality of its crystallization in the mold. To monitor mold defects on the casting conveyor, a camera with a resolution of 780 x 580 pixels and a frame rate of 75 frames per second was selected. Images of molds from casting machines were used as input data for the neural network algorithm. At the stage of preparing the digital database and its analytical evaluation, a convolutional neural network architecture was chosen for the algorithm. The information flow from the local controller is transferred to the OPC server and then to the SCADA system of the foundry. After training, the defect-recognition accuracy of the neural network was about 95.1% on a validation split. The trained weight coefficients were then applied to the testing split, where the algorithm achieved the same accuracy as on the validation images. The proposed technical solutions make it possible to increase the efficiency of the automated process control system in the foundry by expanding the digital database.
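
    The paper does not give the network definition; as an illustration only, a small convolutional classifier of the kind described could be assembled in Keras as follows. The input size (frames downsampled from the 780 x 580 camera), layer counts and training call are assumptions of the sketch.

      import tensorflow as tf
      from tensorflow.keras import layers

      # Binary classifier: P(mold frame shows a defect)
      model = tf.keras.Sequential([
          layers.Conv2D(16, 3, activation="relu", input_shape=(192, 144, 1)),
          layers.MaxPooling2D(),
          layers.Conv2D(32, 3, activation="relu"),
          layers.MaxPooling2D(),
          layers.Conv2D(64, 3, activation="relu"),
          layers.MaxPooling2D(),
          layers.Flatten(),
          layers.Dense(64, activation="relu"),
          layers.Dense(1, activation="sigmoid"),
      ])
      model.compile(optimizer="adam", loss="binary_crossentropy",
                    metrics=["accuracy"])
      model.summary()
      # model.fit(train_images, train_labels, validation_split=0.2, epochs=20)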

  18. Review Essay: Does Qualitative Network Analysis Exist?

    Directory of Open Access Journals (Sweden)

    Rainer Diaz-Bone

    2007-01-01

    Full Text Available Social network analysis was formed and established in the 1970s as a way of analyzing systems of social relations. In this review the theoretical-methodological standpoint of social network analysis ("structural analysis") is introduced and the different forms of social network analysis are presented. Structural analysis argues that social actors and social relations are embedded in social networks, meaning that action and perception of actors as well as the performance of social relations are influenced by the network structure. Since the 1990s structural analysis has integrated concepts such as agency, discourse and symbolic orientation and in this way structural analysis has opened itself. Since then there has been increasing use of qualitative methods in network analysis. They are used to include the perspective of the analyzed actors, to explore networks, and to understand network dynamics. In the reviewed book, edited by Betina HOLLSTEIN and Florian STRAUS, the twenty predominantly empirically orientated contributions demonstrate the possibilities of combining quantitative and qualitative methods in network analyses in different research fields. In this review we examine how the contributions succeed in applying and developing the structural analysis perspective, and the self-positioning of "qualitative network analysis" is evaluated. URN: urn:nbn:de:0114-fqs0701287

  19. Statistical analysis of network data with R

    CERN Document Server

    Kolaczyk, Eric D

    2014-01-01

    Networks have permeated everyday life through everyday realities like the Internet, social networks, and viral marketing. As such, network analysis is an important growth area in the quantitative sciences, with roots in social network analysis going back to the 1930s and graph theory going back centuries. Measurement and analysis are integral components of network research. As a result, statistical methods play a critical role in network analysis. This book is the first of its kind in network research. It can be used as a stand-alone resource in which multiple R packages are used to illustrate how to conduct a wide range of network analyses, from basic manipulation and visualization, to summary and characterization, to modeling of network data. The central package is igraph, which provides extensive capabilities for studying network graphs in R. This text builds on Eric D. Kolaczyk’s book Statistical Analysis of Network Data (Springer, 2009).

  20. Development of Process Automation in the Neutron Activation Analysis Facility in Malaysian Nuclear Agency

    International Nuclear Information System (INIS)

    Yussup, N.; Azman, A.; Ibrahim, M.M.; Rahman, N.A.A.; Che Sohashaari, S.; Atan, M.N.; Hamzah, M.A.; Mokhtar, M.; Khalid, M.A.; Salim, N.A.A.; Hamzah, M.S.

    2018-01-01

    Neutron Activation Analysis (NAA) has been established in the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s. Most of the established procedures, from sample registration to analysis, are performed manually. These manual procedures carried out by the NAA laboratory personnel are time-consuming and inefficient. Hence, system automation was developed in order to provide an effective method to replace redundant manual data entries and produce a faster sample analysis and calculation process. This report explains the NAA process in Nuclear Malaysia and describes the automation development in detail, which includes sample registration software; an automatic sample changer system, which consists of hardware and software; and sample analysis software. (author)

  1. Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0

    Directory of Open Access Journals (Sweden)

    Kevin A. Huck

    2008-01-01

    Full Text Available The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments presents a challenge to manage and process the information. Simply to characterize the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we will discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We will give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.

  2. Analysis of Semantic Networks using Complex Networks Concepts

    DEFF Research Database (Denmark)

    Ortiz-Arroyo, Daniel

    2013-01-01

    In this paper we perform a preliminary analysis of semantic networks to determine the most important terms that could be used to optimize a summarization task. In our experiments, we measure how the properties of a semantic network change, when the terms in the network are removed. Our preliminar...

  3. Co-trained convolutional neural networks for automated detection of prostate cancer in multi-parametric MRI.

    Science.gov (United States)

    Yang, Xin; Liu, Chaoyue; Wang, Zhiwei; Yang, Jun; Min, Hung Le; Wang, Liang; Cheng, Kwang-Ting Tim

    2017-12-01

    Multi-parametric magnetic resonance imaging (mp-MRI) is increasingly popular for prostate cancer (PCa) detection and diagnosis. However, interpreting mp-MRI data which typically contains multiple unregistered 3D sequences, e.g. apparent diffusion coefficient (ADC) and T2-weighted (T2w) images, is time-consuming and demands special expertise, limiting its usage for large-scale PCa screening. Therefore, solutions to computer-aided detection of PCa in mp-MRI images are highly desirable. Most recent advances in automated methods for PCa detection employ a handcrafted feature based two-stage classification flow, i.e. voxel-level classification followed by a region-level classification. This work presents an automated PCa detection system which can concurrently identify the presence of PCa in an image and localize lesions based on deep convolutional neural network (CNN) features and a single-stage SVM classifier. Specifically, the developed co-trained CNNs consist of two parallel convolutional networks for ADC and T2w images respectively. Each network is trained using images of a single modality in a weakly-supervised manner by providing a set of prostate images with image-level labels indicating only the presence of PCa without priors of lesions' locations. Discriminative visual patterns of lesions can be learned effectively from clutters of prostate and surrounding tissues. A cancer response map with each pixel indicating the likelihood to be cancerous is explicitly generated at the last convolutional layer of the network for each modality. A new back-propagated error E is defined to enforce both optimized classification results and consistent cancer response maps for different modalities, which help capture highly representative PCa-relevant features during the CNN feature learning process. The CNN features of each modality are concatenated and fed into a SVM classifier. For images which are classified to contain cancers, non-maximum suppression and adaptive

  4. Automated Prediction of Catalytic Mechanism and Rate Law Using Graph-Based Reaction Path Sampling.

    Science.gov (United States)

    Habershon, Scott

    2016-04-12

    In a recent article [J. Chem. Phys. 2015, 143, 094106], we introduced a novel graph-based sampling scheme which can be used to generate chemical reaction paths in many-atom systems in an efficient and highly automated manner. The main goal of this work is to demonstrate how this approach, when combined with direct kinetic modeling, can be used to determine the mechanism and phenomenological rate law of a complex catalytic cycle, namely cobalt-catalyzed hydroformylation of ethene. Our graph-based sampling scheme generates 31 unique chemical products and 32 unique chemical reaction pathways; these sampled structures and reaction paths enable automated construction of a kinetic network model of the catalytic system when combined with density functional theory (DFT) calculations of free energies and resultant transition-state theory rate constants. Direct simulations of this kinetic network across a range of initial reactant concentrations enables determination of both the reaction mechanism and the associated rate law in an automated fashion, without the need for either presupposing a mechanism or making steady-state approximations in kinetic analysis. Most importantly, we find that the reaction mechanism which emerges from these simulations is exactly that originally proposed by Heck and Breslow; furthermore, the simulated rate law is also consistent with previous experimental and computational studies, exhibiting a complex dependence on carbon monoxide pressure. While the inherent errors of using DFT simulations to model chemical reactivity limit the quantitative accuracy of our calculated rates, this work confirms that our automated simulation strategy enables direct analysis of catalytic mechanisms from first principles.
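
    Once species, reaction paths and rate constants are in hand, "direct kinetic modeling" amounts to numerical integration of mass-action rate equations. A toy sketch with an invented two-step network (not the hydroformylation cycle itself) shows the mechanics:

      import numpy as np
      from scipy.integrate import solve_ivp

      # Toy kinetic network: A + B <-> I, then I -> P, with TST-style
      # rate constants (made-up values for illustration).
      k1f, k1r, k2 = 5.0, 1.0, 0.5

      def rhs(t, c):
          a, b, i, p = c
          r1 = k1f * a * b - k1r * i     # reversible association
          r2 = k2 * i                    # product-forming step
          return [-r1, -r1, r1 - r2, r2]

      sol = solve_ivp(rhs, (0.0, 50.0), [1.0, 2.0, 0.0, 0.0], rtol=1e-8)
      print(sol.y[3, -1])                # product formed by t = 50

      # Repeating the integration over a grid of initial concentrations and
      # fitting d[P]/dt against them yields the phenomenological rate law
      # without presupposing a mechanism or a steady-state approximation.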

  5. Performance of an Artificial Multi-observer Deep Neural Network for Fully Automated Segmentation of Polycystic Kidneys.

    Science.gov (United States)

    Kline, Timothy L; Korfiatis, Panagiotis; Edwards, Marie E; Blais, Jaime D; Czerwiec, Frank S; Harris, Peter C; King, Bernard F; Torres, Vicente E; Erickson, Bradley J

    2017-08-01

    Deep learning techniques are being rapidly applied to medical imaging tasks, from organ and lesion segmentation to tissue and tumor classification. These techniques are becoming the leading algorithmic approaches to solve inherently difficult image processing tasks. Currently, the most critical requirement for successful implementation lies in the need for relatively large datasets that can be used for training the deep learning networks. Based on our initial studies of MR imaging examinations of the kidneys of patients affected by polycystic kidney disease (PKD), we have generated a unique database of imaging data and corresponding reference standard segmentations of polycystic kidneys. In the study of PKD, segmentation of the kidneys is needed in order to measure total kidney volume (TKV). Automated methods to segment the kidneys and measure TKV are needed to increase measurement throughput and alleviate the inherent variability of human-derived measurements. We hypothesize that deep learning techniques can be leveraged to perform fast, accurate, reproducible, and fully automated segmentation of polycystic kidneys. Here, we describe a fully automated approach for segmenting PKD kidneys within MR images that simulates a multi-observer approach in order to create an accurate and robust method for the task of segmentation and computation of TKV for PKD patients. A total of 2000 cases were used for training and validation, and 400 cases were used for testing. The multi-observer ensemble method had mean ± SD percent volume difference of 0.68 ± 2.2% compared with the reference standard segmentations. The complete framework performs fully automated segmentation at a level comparable with interobserver variability and could be considered as a replacement for the task of segmentation of PKD kidneys by a human.

  6. Security-Enhanced Autonomous Network Management

    Science.gov (United States)

    Zeng, Hui

    2015-01-01

    Ensuring reliable communication in next-generation space networks requires a novel network management system to support greater levels of autonomy and greater awareness of the environment and assets. Intelligent Automation, Inc., has developed a security-enhanced autonomous network management (SEANM) approach for space networks through cross-layer negotiation and network monitoring, analysis, and adaptation. The underlying technology is bundle-based delay/disruption-tolerant networking (DTN). The SEANM scheme allows a system to adaptively reconfigure its network elements based on awareness of network conditions, policies, and mission requirements. Although SEANM is generically applicable to any radio network, for validation purposes it has been prototyped and evaluated on two specific networks: a commercial off-the-shelf hardware test-bed using Institute of Electrical and Electronics Engineers (IEEE) 802.11 Wi-Fi devices and a military hardware test-bed using AN/PRC-154 Rifleman Radio platforms. Testing has demonstrated that SEANM provides autonomous network management resulting in reliable communications in delay/disruptive-prone environments.

  7. Applications of the soft computing in the automated history matching

    Energy Technology Data Exchange (ETDEWEB)

    Silva, P.C.; Maschio, C.; Schiozer, D.J. [Unicamp (Brazil)

    2006-07-01

    Reservoir management is a research field in petroleum engineering that optimizes reservoir performance based on environmental, political, economic and technological criteria. Reservoir simulation is based on geological models that simulate fluid flow. Models must be constantly corrected to yield the observed production behaviour. The process of history matching is controlled by the comparison of production data, well test data and measured data from simulations. Parametrization, objective function analysis, sensitivity analysis and uncertainty analysis are important steps in history matching. One of the main challenges facing automated history matching is to develop algorithms that find the optimal solution in multidimensional search spaces. Optimization algorithms can be either global optimizers that work with noisy multi-modal functions, or local optimizers that cannot work with noisy multi-modal functions. The problem with global optimizers is the very large number of function calls, which is an inconvenience due to the long reservoir simulation time. For that reason, techniques such as least squares, thin-plate splines, kriging and artificial neural networks (ANN) have been used as substitutes for reservoir simulators. This paper described the use of optimization algorithms to find the optimal solution in automated history matching. Several ANNs were used, including the generalized regression neural network, a fuzzy system with subtractive clustering and a radial basis network. The UNIPAR soft computing method was used along with a modified Hooke-Jeeves optimization method. Two case studies with synthetic and real reservoirs are examined. It was concluded that the combination of global and local optimization has the potential to improve the history matching process and that the use of substitute models can reduce computational efforts. 15 refs., 11 figs.

  8. An Improved Tarpit for Network Deception

    Science.gov (United States)

    2016-03-25

    …pollute network measurement studies, as well as note the negative impact that even small blocks of tarpit address space have on automated scanners… to ping Greasy and LaBrea hosts: a Mac OS X Version 10.10.5 machine on a home residential network in California, the CentOS Linux release 7.2.1511…

  9. Semi-automated digital image analysis of patellofemoral joint space width from lateral knee radiographs

    Energy Technology Data Exchange (ETDEWEB)

    Grochowski, S.J. [Mayo Clinic, Department of Orthopedic Surgery, Rochester (United States); Amrami, K.K. [Mayo Clinic, Department of Radiology, Rochester (United States); Kaufman, K. [Mayo Clinic, Department of Orthopedic Surgery, Rochester (United States); Mayo Clinic/Foundation, Biomechanics Laboratory, Department of Orthopedic Surgery, Charlton North L-110L, Rochester (United States)

    2005-10-01

    To design a semi-automated program to measure minimum patellofemoral joint space width (JSW) using standing lateral view radiographs. Lateral patellofemoral knee radiographs were obtained from 35 asymptomatic subjects. The radiographs were analyzed to report both the repeatability of the image analysis program and the reproducibility of JSW measurements within a 2 week period. The results were also compared with manual measurements done by an experienced musculoskeletal radiologist. The image analysis program was shown to have an excellent coefficient of repeatability of 0.18 and 0.23 mm for intra- and inter-observer measurements respectively. The manual method measured a greater minimum JSW than the automated method. Reproducibility between days was comparable to other published results, but was less satisfactory for both manual and semi-automated measurements. The image analysis program had an inter-day coefficient of repeatability of 1.24 mm, which was lower than 1.66 mm for the manual method. A repeatable semi-automated method for measurement of the patellofemoral JSW from radiographs has been developed. The method is more accurate than manual measurements. However, the between-day reproducibility is higher than the intra-day reproducibility. Further investigation of the protocol for obtaining sequential lateral knee radiographs is needed in order to reduce the between-day variability. (orig.)

  10. Artificial Neural Network Analysis System

    Science.gov (United States)

    2001-02-27

    Contract No. DASG60-00-M-0201; purchase request no.: Foot in the Door-01. Title: Artificial Neural Network Analysis System. Company: Atlantic... Author: Powell, Bruce C. Report date: 27-02-2001; dates covered: 28-10-2000 to 27-02-2001.

  11. Evaluation of an Automated Analysis Tool for Prostate Cancer Prediction Using Multiparametric Magnetic Resonance Imaging.

    Directory of Open Access Journals (Sweden)

    Matthias C Roethke

    Full Text Available To evaluate the diagnostic performance of an automated analysis tool for the assessment of prostate cancer based on multiparametric magnetic resonance imaging (mpMRI) of the prostate. A fully automated analysis tool was used for a retrospective analysis of mpMRI sets (T2-weighted, T1-weighted dynamic contrast-enhanced, and diffusion-weighted sequences). The software provided a malignancy prediction value for each image pixel, defined as the Malignancy Attention Index (MAI), that can be depicted as a colour map overlay on the original images. The malignancy maps were compared to histopathology derived from a combination of MRI-targeted and systematic transperineal MRI/TRUS-fusion biopsies. In total, mpMRI data of 45 patients were evaluated. With a sensitivity of 85.7% (95% CI 65.4-95.0), a specificity of 87.5% (95% CI 69.0-95.7) and a diagnostic accuracy of 86.7% (95% CI 73.8-93.8) for detection of prostate cancer, the automated analysis results corresponded well with the diagnostic accuracies reported for human readers based on the PI-RADS system in the current literature. The study revealed comparable diagnostic accuracies for the detection of prostate cancer of a user-independent MAI-based automated analysis tool and PI-RADS-scoring-based human reader analysis of mpMRI. Thus, the analysis tool could serve as a detection support system for less experienced readers. The results of the study also suggest the potential of MAI-based analysis for advanced lesion assessments, such as cancer extent and staging prediction.

  12. Reasoning and Knowledge Acquisition Framework for 5G Network Analytics

    Science.gov (United States)

    2017-01-01

    Autonomic self-management is a key challenge for next-generation networks. This paper proposes an automated analysis framework to infer knowledge in 5G networks with the aim to understand the network status and to predict potential situations that might disrupt the network operability. The framework is based on the Endsley situational awareness model, and integrates automated capabilities for metrics discovery, pattern recognition, prediction techniques and rule-based reasoning to infer anomalous situations in the current operational context. Those situations should then be mitigated, either proactive or reactively, by a more complex decision-making process. The framework is driven by a use case methodology, where the network administrator is able to customize the knowledge inference rules and operational parameters. The proposal has also been instantiated to prove its adaptability to a real use case. To this end, a reference network traffic dataset was used to identify suspicious patterns and to predict the behavior of the monitored data volume. The preliminary results suggest a good level of accuracy on the inference of anomalous traffic volumes based on a simple configuration. PMID:29065473

  13. Reasoning and Knowledge Acquisition Framework for 5G Network Analytics.

    Science.gov (United States)

    Sotelo Monge, Marco Antonio; Maestre Vidal, Jorge; García Villalba, Luis Javier

    2017-10-21

    Autonomic self-management is a key challenge for next-generation networks. This paper proposes an automated analysis framework to infer knowledge in 5G networks with the aim to understand the network status and to predict potential situations that might disrupt the network operability. The framework is based on the Endsley situational awareness model, and integrates automated capabilities for metrics discovery, pattern recognition, prediction techniques and rule-based reasoning to infer anomalous situations in the current operational context. Those situations should then be mitigated, either proactive or reactively, by a more complex decision-making process. The framework is driven by a use case methodology, where the network administrator is able to customize the knowledge inference rules and operational parameters. The proposal has also been instantiated to prove its adaptability to a real use case. To this end, a reference network traffic dataset was used to identify suspicious patterns and to predict the behavior of the monitored data volume. The preliminary results suggest a good level of accuracy on the inference of anomalous traffic volumes based on a simple configuration.

  14. Fluorescence In Situ Hybridization (FISH) Signal Analysis Using Automated Generated Projection Images

    Directory of Open Access Journals (Sweden)

    Xingwei Wang

    2012-01-01

    Full Text Available Fluorescence in situ hybridization (FISH) tests provide promising molecular imaging biomarkers to more accurately and reliably detect and diagnose cancers and genetic disorders. Since current manual FISH signal analysis is low-efficiency and inconsistent, which limits its clinical utility, developing automated FISH image scanning systems and computer-aided detection (CAD) schemes has been attracting research interest. To acquire high-resolution FISH images in a multi-spectral scanning mode, a huge amount of image data, with stacks of multiple three-dimensional (3-D) image slices, is generated from a single specimen. Automated preprocessing of these scanned images to eliminate non-useful and redundant data is important to make automated FISH tests acceptable in clinical applications. In this study, a dual-detector fluorescence image scanning system was applied to scan four specimen slides with FISH-probed chromosome X. A CAD scheme was developed to detect analyzable interphase cells and map the FISH-probed signals recorded in the multiple imaging slices into 2-D projection images. The CAD scheme was then applied to each projection image to detect analyzable interphase cells using an adaptive multiple-threshold algorithm, identify FISH-probed signals using a top-hat transform, and compute the ratios between normal and abnormal cells. To assess CAD performance, the FISH-probed signals were also independently detected visually by an observer. The Kappa coefficients for agreement between the CAD scheme and the observer ranged from 0.69 to 1.0 in detecting/counting FISH signal spots in the four testing samples. The study demonstrated the feasibility of automated FISH signal analysis applying a CAD scheme to automatically generated 2-D projection images.
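
    The top-hat step named above is standard morphological filtering. A small illustrative sketch with scikit-image follows; the spot radius and threshold are assumed parameters, and a simple connected-component count stands in for the full CAD scheme.

      import numpy as np
      from skimage.morphology import white_tophat, disk   # footprint= needs skimage >= 0.19
      from skimage.measure import label

      def count_fish_spots(projection, spot_radius=3, rel_thresh=0.5):
          """Detect bright FISH signals in a 2-D projection image: a white
          top-hat keeps features smaller than the structuring element,
          then connected bright regions are counted."""
          enhanced = white_tophat(projection, footprint=disk(spot_radius))
          mask = enhanced > rel_thresh * enhanced.max()
          return int(label(mask).max())      # number of connected spots

      # Toy image: two Gaussian-like spots on a smooth intensity ramp
      yy, xx = np.mgrid[0:64, 0:64]
      img = 0.1 * yy \
            + np.exp(-((yy - 20) ** 2 + (xx - 20) ** 2) / 4.0) \
            + np.exp(-((yy - 40) ** 2 + (xx - 45) ** 2) / 4.0)
      print(count_fish_spots(img))   # -> 2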

  15. Automated retroillumination photography analysis for objective assessment of Fuchs Corneal Dystrophy severity

    Science.gov (United States)

    Eghrari, Allen O.; Mumtaz, Aisha A.; Garrett, Brian; Rezaei, Mahsa; Akhavan, Mina S.; Riazuddin, S. Amer; Gottsch, John D.

    2016-01-01

    Purpose: Retroillumination photography analysis (RPA) is an objective tool for assessment of the number and distribution of guttae in eyes affected with Fuchs Corneal Dystrophy (FCD). Current protocols include manual processing of images; here we assess the validity and interrater reliability of automated analysis across various levels of FCD severity. Methods: Retroillumination photographs of 97 FCD-affected corneas were acquired and total counts of guttae previously summated manually. For each cornea, a single image was loaded into ImageJ software. We reduced color variability and subtracted background noise. Reflection of light from each gutta was identified as a local area of maximum intensity and counted automatically. The noise tolerance level was titrated for each cornea by examining a small region of each image with an automated overlay to ensure appropriate coverage of individual guttae. We tested interrater reliability of automated counts of guttae across a spectrum of clinical and educational experience. Results: A set of 97 retroillumination photographs was analyzed. Clinical severity as measured by a modified Krachmer scale ranged from a severity level of 1 to 5 in the analyzed corneas. Automated counts by an ophthalmologist correlated strongly with Krachmer grading (R2=0.79) and manual counts (R2=0.88). The intraclass correlation coefficient demonstrated strong correlation, at 0.924 (95% CI, 0.870-0.958), among cases analyzed by three students, and 0.869 (95% CI, 0.797-0.918) among cases for which images were analyzed by an ophthalmologist and two students. Conclusions: Automated RPA allows for grading of FCD severity with high resolution across a spectrum of disease severity. PMID:27811565
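
    A minimal sketch of the counting step, local maxima above a noise tolerance, using scikit-image; the smoothing scales and tolerance value are assumptions of the example, not the published protocol.

      import numpy as np
      from skimage.feature import peak_local_max
      from skimage.filters import gaussian

      def count_guttae(retro_image, min_sep=3, noise_tolerance=0.05):
          """Count guttae as local intensity maxima in a retroillumination
          photograph, after light smoothing and background subtraction."""
          smoothed = gaussian(retro_image, sigma=1)
          background = gaussian(retro_image, sigma=25)  # coarse background estimate
          signal = smoothed - background
          peaks = peak_local_max(signal, min_distance=min_sep,
                                 threshold_abs=noise_tolerance)
          return len(peaks)

    Titrating the noise tolerance per cornea, as the protocol above describes, corresponds to adjusting threshold_abs until the detected maxima visibly cover the individual guttae.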

  16. Social Network Analysis and informal trade

    DEFF Research Database (Denmark)

    Walther, Olivier

    networks can be applied to better understand informal trade in developing countries, with a particular focus on Africa. The paper starts by discussing some of the fundamental concepts developed by social network analysis. Through a number of case studies, we show how social network analysis can...... illuminate the relevant causes of social patterns, the impact of social ties on economic performance, the diffusion of resources and information, and the exercise of power. The paper then examines some of the methodological challenges of social network analysis and how it can be combined with other...... approaches. The paper finally highlights some of the applications of social network analysis and their implications for trade policies....

  17. Automated conflict resolution issues

    Science.gov (United States)

    Wike, Jeffrey S.

    1991-01-01

    A discussion is presented of how conflicts for Space Network resources should be resolved in the ATDRSS era. The following topics are presented: a description of how resource conflicts are currently resolved; a description of issues associated with automated conflict resolution; present conflict resolution strategies; and topics for further discussion.

  18. Framework for network modularization and Bayesian network analysis to investigate the perturbed metabolic network

    Directory of Open Access Journals (Sweden)

    Kim Hyun

    2011-12-01

    Full Text Available Abstract Background Genome-scale metabolic network models have contributed to elucidating biological phenomena and to predicting gene targets to engineer for biotechnological applications. With their increasing importance, their precise network characterization has also become crucial for a better understanding of cellular physiology. Results We herein introduce a framework for network modularization and Bayesian network analysis (FMB) to investigate an organism's metabolism under perturbation. FMB reveals the direction of influences among metabolic modules, in which reactions with similar or positively correlated flux variation patterns are clustered, in response to a specific perturbation, using metabolic flux data. With metabolic flux data calculated by constraints-based flux analysis under both control and perturbation conditions, FMB, in essence, reveals the effects of specific perturbations on the biological system through network modularization and Bayesian network analysis at the metabolic modular level. As a demonstration, this framework was applied to the genetically perturbed Escherichia coli metabolism, an lpdA gene knockout mutant, using its genome-scale metabolic network model. Conclusions In the end, it provides alternative scenarios of metabolic flux distributions in response to the perturbation, which are complementary to the data obtained from conventionally available genome-wide high-throughput techniques or metabolic flux analysis.

  19. Framework for network modularization and Bayesian network analysis to investigate the perturbed metabolic network.

    Science.gov (United States)

    Kim, Hyun Uk; Kim, Tae Yong; Lee, Sang Yup

    2011-01-01

    Genome-scale metabolic network models have contributed to elucidating biological phenomena and to predicting gene targets to engineer for biotechnological applications. With their increasing importance, their precise network characterization has also become crucial for a better understanding of cellular physiology. We herein introduce a framework for network modularization and Bayesian network analysis (FMB) to investigate an organism's metabolism under perturbation. FMB reveals the direction of influences among metabolic modules, in which reactions with similar or positively correlated flux variation patterns are clustered, in response to a specific perturbation, using metabolic flux data. With metabolic flux data calculated by constraints-based flux analysis under both control and perturbation conditions, FMB, in essence, reveals the effects of specific perturbations on the biological system through network modularization and Bayesian network analysis at the metabolic modular level. As a demonstration, this framework was applied to the genetically perturbed Escherichia coli metabolism, an lpdA gene knockout mutant, using its genome-scale metabolic network model. In the end, it provides alternative scenarios of metabolic flux distributions in response to the perturbation, which are complementary to the data obtained from conventionally available genome-wide high-throughput techniques or metabolic flux analysis.
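    The module-building step, clustering reactions whose flux variation patterns are positively correlated, can be sketched roughly with NumPy and SciPy as follows; the flux matrix here is random stand-in data, not output of constraints-based flux analysis:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Rows = reactions, columns = flux values across perturbation conditions.
rng = np.random.default_rng(2)
fluxes = rng.normal(size=(12, 8))

# Pairwise correlation of flux variation patterns between reactions.
corr = np.corrcoef(fluxes)

# Convert correlation to a distance (uncorrelated or anti-correlated
# reactions end up far apart) and cluster hierarchically into modules.
dist = 1.0 - corr
condensed = dist[np.triu_indices_from(dist, k=1)]
modules = fcluster(linkage(condensed, method="average"), t=0.8, criterion="distance")
print("reaction -> module:", dict(enumerate(modules, start=1)))
```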

  20. AMDA: an R package for the automated microarray data analysis

    Directory of Open Access Journals (Sweden)

    Foti Maria

    2006-07-01

    Full Text Available Abstract Background Microarrays are routinely used to assess mRNA transcript levels on a genome-wide scale. Large amounts of microarray data are now available in several databases, and new experiments are constantly being performed. In spite of this, few and limited tools exist for quickly and easily analyzing the results. Microarray analysis can be challenging for researchers without the necessary training, and it can be time-consuming for service providers with many users. Results To address these problems we have developed an automated microarray data analysis (AMDA) software package, which provides scientists with an easy and integrated system for the analysis of Affymetrix microarray experiments. AMDA is free and is available as an R package. It is based on the Bioconductor project, which provides a number of powerful bioinformatics and microarray analysis tools. This automated pipeline integrates different functions available in the R and Bioconductor projects with newly developed functions. AMDA covers all of the steps, performing a full data analysis, including image analysis, quality control, normalization, selection of differentially expressed genes, clustering, correspondence analysis and functional evaluation. Finally, a LaTeX document is dynamically generated depending on the analysis steps performed. The generated report contains comments and analysis results as well as references to several files for deeper investigation. Conclusion AMDA is freely available as an R package under the GPL license. The package as well as an example analysis report can be downloaded from the Services/Bioinformatics section of the Genopolis website: http://www.genopolis.it/

  1. Automated magnification calibration in transmission electron microscopy using Fourier analysis of replica images

    International Nuclear Information System (INIS)

    Laak, Jeroen A.W.M. van der; Dijkman, Henry B.P.M.; Pahlplatz, Martin M.M.

    2006-01-01

    The magnification factor in transmission electron microscopy is not very precise, hampering, for instance, quantitative analysis of specimens. Calibration of the magnification is usually performed interactively using replica specimens containing line or grating patterns with known spacing. In the present study, a procedure is described for automated magnification calibration using digital images of a line replica. This procedure is based on analysis of the power spectrum of Fourier-transformed replica images, and is compared to interactive measurement on the same images. Images were used with magnifications ranging from 1,000x to 200,000x. The automated procedure deviated on average 0.10% from interactive measurements. Especially for catalase replicas, the coefficient of variation of automated measurement was considerably smaller (average 0.28%) than that of interactive measurement (average 3.5%). In conclusion, calibration of the magnification in digital images from transmission electron microscopy may be performed automatically, using the procedure presented here, with high precision and accuracy.
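    A toy version of the power-spectrum idea: the dominant spatial frequency of a line pattern gives the line spacing in pixels, which, together with the replica's known physical spacing, yields the calibration. The 2160 lines/mm value is an illustrative assumption for a typical line grating replica, not a value from the study:

```python
import numpy as np

# Synthetic image of a line replica: vertical lines with a 12-pixel period.
n, period = 512, 12
x = np.arange(n)
image = np.tile(np.sin(2 * np.pi * x / period), (n, 1))
image += 0.1 * np.random.default_rng(3).normal(size=(n, n))

# Average the rows and take the 1-D power spectrum of the profile.
profile = image.mean(axis=0)
power = np.abs(np.fft.rfft(profile - profile.mean())) ** 2
freqs = np.fft.rfftfreq(n)  # cycles per pixel

# The strongest peak gives the line frequency, hence the spacing in pixels.
spacing_px = 1.0 / freqs[np.argmax(power)]
lines_per_mm = 2160.0  # assumed replica specification
pixel_size_nm = (1e6 / lines_per_mm) / spacing_px
print(f"measured spacing: {spacing_px:.2f} px -> {pixel_size_nm:.1f} nm/pixel")
```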

  2. Automated microfluidic devices integrating solid-phase extraction, fluorescent labeling, and microchip electrophoresis for preterm birth biomarker analysis.

    Science.gov (United States)

    Sahore, Vishal; Sonker, Mukul; Nielsen, Anna V; Knob, Radim; Kumar, Suresh; Woolley, Adam T

    2018-01-01

    We have developed multichannel integrated microfluidic devices for automated preconcentration, labeling, purification, and separation of preterm birth (PTB) biomarkers. We fabricated multilayer poly(dimethylsiloxane)-cyclic olefin copolymer (PDMS-COC) devices that perform solid-phase extraction (SPE) and microchip electrophoresis (μCE) for automated PTB biomarker analysis. The PDMS control layer had a peristaltic pump and pneumatic valves for flow control, while the PDMS fluidic layer had five input reservoirs connected to microchannels and a μCE system. The COC layers had a reversed-phase octyl methacrylate porous polymer monolith for SPE and fluorescent labeling of PTB biomarkers. We determined μCE conditions for two PTB biomarkers, ferritin (Fer) and corticotropin-releasing factor (CRF). We used these integrated microfluidic devices to preconcentrate and purify off-chip-labeled Fer and CRF in an automated fashion. Finally, we performed a fully automated on-chip analysis of unlabeled PTB biomarkers, involving SPE, labeling, and μCE separation with 1 h total analysis time. These integrated systems have strong potential to be combined with upstream immunoaffinity extraction, offering a compact sample-to-answer biomarker analysis platform. Graphical abstract: Pressure-actuated integrated microfluidic devices have been developed for automated solid-phase extraction, fluorescent labeling, and microchip electrophoresis of preterm birth biomarkers.

  3. Automated reticle inspection data analysis for wafer fabs

    Science.gov (United States)

    Summers, Derek; Chen, Gong; Reese, Bryan; Hutchinson, Trent; Liesching, Marcus; Ying, Hai; Dover, Russell

    2009-04-01

    To minimize potential wafer yield loss due to mask defects, most wafer fabs implement some form of reticle inspection system to monitor photomask quality in high-volume wafer manufacturing environments. Traditionally, experienced operators review reticle defects found by an inspection tool and then manually classify each defect as 'pass, warn, or fail' based on its size and location. However, in the event reticle defects are suspected of causing repeating wafer defects on a completed wafer, potential defects on all associated reticles must be manually searched on a layer-by-layer basis in an effort to identify the reticle responsible for the wafer yield loss. This 'problem reticle' search process is a very tedious and time-consuming task and may cause extended manufacturing line-down situations. Oftentimes, process engineers and other team members need to manually investigate several reticle inspection reports to determine whether yield loss can be tied to a specific layer. Because of the very nature of this detailed work, calculation errors may occur, resulting in an incorrect root-cause analysis. These delays waste valuable resources that could be spent on other more productive activities. This paper examines an automated software solution for converting KLA-Tencor reticle inspection defect maps into a format compatible with KLA-Tencor's Klarity Defect(R) data analysis database. The objective is to use the graphical charting capabilities of Klarity Defect to reveal a clearer understanding of defect trends for individual reticle layers or entire mask sets. Automated analysis features include reticle defect count trend analysis and potentially stacking reticle defect maps for signature analysis against wafer inspection defect data. Other possible benefits include optimizing reticle inspection sample plans in an effort to support "lean manufacturing" initiatives for wafer fabs.
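    The defect-map stacking idea can be illustrated with a minimal sketch: defect coordinates from several reticle layers are binned onto a common grid, and cells hit in multiple layers flag a repeating signature worth comparing against wafer inspection data. The grid size and coordinates below are invented for illustration:

```python
import numpy as np

# Defect (x, y) coordinates reported per reticle layer, in mm (invented data).
layers = {
    "metal1": [(10.2, 33.1), (55.0, 7.9)],
    "via1":   [(10.3, 33.0), (21.4, 48.8)],
    "metal2": [(10.1, 33.2)],
}

# Bin each layer's defects onto a common 1 mm grid and stack the hit maps.
grid = np.zeros((100, 100), dtype=int)
for coords in layers.values():
    hit = np.zeros_like(grid, dtype=bool)
    for x, y in coords:
        hit[int(y), int(x)] = True  # each layer counts a cell at most once
    grid += hit

# Cells hit on 2+ layers are candidate repeating signatures.
for y, x in zip(*np.nonzero(grid >= 2)):
    print(f"repeating defect signature near ({x} mm, {y} mm) on {grid[y, x]} layers")
```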

  4. Flexible automated systems of real time mining operation management: concepts, architecture, models of network engineering for data transmission and processing

    Energy Technology Data Exchange (ETDEWEB)

    Markhasin, A.B.

    1987-11-01

    Since the mid-1960s, considerable effort has been invested by the mining industry, its research institutions and universities to create real-time mining management automation systems. Some of the shortcomings which still prevent such systems from realizing the efficiency they can offer are due to objective and subjective factors within and outside the management systems: the creation of the component base, automation equipment, and computer technology, on the one hand, and the organization, process, engineering, and coordination of mining work on the other. This review addresses several of these shortcomings, with recommendations for their systematic solution, and suggests methods for the implementation of microprocessors and a network of flexible data transmission and processing facilities for both surface and underground mining.

  5. Automated three-dimensional X-ray analysis using a dual-beam FIB

    International Nuclear Information System (INIS)

    Schaffer, Miroslava; Wagner, Julian; Schaffer, Bernhard; Schmied, Mario; Mulders, Hans

    2007-01-01

    We present a fully automated method for three-dimensional (3D) elemental analysis demonstrated using a ceramic sample with the chemical composition (Ca)MgTiOx. The specimen is serially sectioned by a focused ion beam (FIB) microscope, and energy-dispersive X-ray spectrometry (EDXS) is used for elemental analysis of each cross-section created. A 3D elemental model is reconstructed from the stack of two-dimensional (2D) data. This work concentrates on issues arising from process automation, the large sample volume of approximately 17×17×10 μm³, and the insulating nature of the specimen. A new routine for post-acquisition correction of different drift effects is demonstrated. Furthermore, it is shown that EDXS data may be erroneous for specimens containing voids, and that back-scattered electron images have to be used to correct for these errors.
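    A rough sketch of the post-acquisition drift correction step: consecutive slice images are registered by cross-correlation and shifted onto a common frame before the 3D model is stacked. scikit-image's phase_cross_correlation is used here as a stand-in for the authors' routine, which is not described in detail:

```python
import numpy as np
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation

# Synthetic stack of cross-section images with a slow lateral drift.
rng = np.random.default_rng(4)
base = rng.normal(size=(128, 128))
stack = [nd_shift(base, (0.7 * i, -0.4 * i)) for i in range(10)]

# Register every slice against the first and undo the measured drift.
aligned = [stack[0]]
for img in stack[1:]:
    drift, _, _ = phase_cross_correlation(stack[0], img, upsample_factor=10)
    aligned.append(nd_shift(img, drift))

volume = np.stack(aligned)  # drift-corrected 3-D volume ready for analysis
print("corrected volume shape:", volume.shape)
```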

  6. Analysis of Recurrent Analog Neural Networks

    Directory of Open Access Journals (Sweden)

    Z. Raida

    1998-06-01

    Full Text Available In this paper, an original rigorous analysis of recurrent analog neural networks, which are built from opamp neurons, is presented. The analysis, which starts from an approximate model of the operational amplifier, reveals the causes of possible non-stable states and enables determination of the convergence properties of the network. The results of the analysis are discussed in order to enable the development of original robust and fast analog networks. In the analysis, special attention is paid to the influence of real circuit elements and of the statistical parameters of the processed signals on the parameters of the network.
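    Not the paper's opamp model, but a generic continuous recurrent analog network of the same family, integrated numerically to observe convergence to a stable state; the weights, gain, and step size are arbitrary illustrative values:

```python
import numpy as np

# Continuous recurrent analog network: dx/dt = -x + W @ tanh(g*x) + b,
# a generic model of the same family as opamp-neuron networks.
rng = np.random.default_rng(5)
n = 6
W = rng.normal(scale=0.4, size=(n, n))
W = (W + W.T) / 2              # symmetric weights favour convergence
b = rng.normal(scale=0.1, size=n)
g = 2.0                        # amplifier (neuron) gain

x = rng.normal(size=n)
dt = 0.01
for step in range(5000):
    x_new = x + dt * (-x + W @ np.tanh(g * x) + b)
    if np.max(np.abs(x_new - x)) < 1e-9:
        print(f"converged after {step} steps to x = {np.round(x_new, 3)}")
        break
    x = x_new
else:
    print("no fixed point reached within 5000 steps")
```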

  7. Application of fluorescence-based semi-automated AFLP analysis in barley and wheat

    DEFF Research Database (Denmark)

    Schwarz, G.; Herz, M.; Huang, X.Q.

    2000-01-01

    of semi-automated codominant analysis for hemizygous AFLP markers in an F-2 population was too low, proposing the use of dominant allele-typing defaults. Nevertheless, the efficiency of genetic mapping, especially of complex plant genomes, will be accelerated by combining the presented genotyping......Genetic mapping and the selection of closely linked molecular markers for important agronomic traits require efficient, large-scale genotyping methods. A semi-automated multifluorophore technique was applied for genotyping AFLP marker loci in barley and wheat. In comparison to conventional P-33...

  8. Web-based automation of green building rating index and life cycle cost analysis

    Science.gov (United States)

    Shahzaib Khan, Jam; Zakaria, Rozana; Aminuddin, Eeydzah; IzieAdiana Abidin, Nur; Sahamir, Shaza Rina; Ahmad, Rosli; Nafis Abas, Darul

    2018-04-01

    The sudden decline in financial markets and the economic meltdown have slowed adoption and lowered investors' interest in green-certified buildings due to their higher initial costs. It is therefore essential to draw investors' attention towards further development of green buildings through automated tools for construction projects. However, there is a historical dearth of automation in green building rating tools, an essential gap motivating the development of an automated analog computerized programming tool. This paper presents proposed research aiming to develop an integrated web-based automated analog computerized program that applies a green building rating assessment tool, green technology and life cycle cost (LCC) analysis. It also aims to identify the variables of MyCrest and LCC to be integrated and developed in a framework, then transformed into an automated analog computerized program. A mixed methodology of qualitative and quantitative surveys and its development portrays the plan to carry the MyCrest-LCC integration to an automated level. In this study, the preliminary literature review enriches a better understanding of Green Building Rating Tools (GBRT) integration with LCC. The outcome of this research paves the way for future researchers to integrate other efficient tools and parameters that contribute towards green buildings and future agendas.

  9. Ecological Automation Design, Extending Work Domain Analysis

    NARCIS (Netherlands)

    Amelink, M.H.J.

    2010-01-01

    In high–risk domains like aviation, medicine and nuclear power plant control, automation has enabled new capabilities, increased the economy of operation and has greatly contributed to safety. However, automation increases the number of couplings in a system, which can inadvertently lead to more

  10. Data Distribution Service for Industrial Automation

    OpenAIRE

    Yang, Jinsong

    2012-01-01

    In industrial automation systems, there is usually a large volume of data which needs to be delivered to the right places at the right time. In addition, the large number of nodes in automation systems are usually distributed, which adds the complexity that more point-to-point Ethernet connections are needed in the network. Hence, it is necessary to apply data-centric design and reduce the connection complexity. Data Distribution Service for Real-Time Systems (DDS) is a data-centric middl...

  11. Comparison of manual & automated analysis methods for corneal endothelial cell density measurements by specular microscopy.

    Science.gov (United States)

    Huang, Jianyan; Maram, Jyotsna; Tepelus, Tudor C; Modak, Cristina; Marion, Ken; Sadda, SriniVas R; Chopra, Vikas; Lee, Olivia L

    2017-08-07

    To determine the reliability of corneal endothelial cell density (ECD) obtained by automated specular microscopy versus that of validated manual methods, and the factors that predict such reliability. Sharp central images from 94 control and 106 glaucomatous eyes were captured with a Konan NSP-9900 specular microscope. All images were analyzed by trained graders using Konan CellChek software, employing the fully- and semi-automated methods as well as the Center Method. Images with a low cell count (number of input cells <100) and/or guttata were compared with the Center and Flex-Center Methods. ECDs were compared and absolute error was used to assess variation. The effect on ECD of age, cell count, cell size, and cell size variation was evaluated. No significant difference was observed between the Center and Flex-Center Methods in corneas with guttata (p=0.48) or low ECD (p=0.11). No difference (p=0.32) was observed in the ECD of normal controls <40 years old between the fully-automated method and the manual Center Method. However, in older controls and glaucomatous eyes, ECD was overestimated by the fully-automated method (p=0.034) and the semi-automated method (p=0.025) as compared to the manual method. Our findings show that automated analysis significantly overestimates ECD in eyes with high polymegathism and/or large cell size, compared to the manual method. Therefore, we discourage reliance upon the fully-automated method alone when performing specular microscopy analysis, particularly if an accurate ECD value is imperative. Copyright © 2017. Published by Elsevier España, S.L.U.

  12. An automated solution enrichment system for uranium analysis

    International Nuclear Information System (INIS)

    Jones, S.A.; Sparks, R.; Sampson, T.; Parker, J.; Horley, E.; Kelly, T.

    1993-01-01

    An automated Solution Enrichment System (SES) for the analysis of uranium and U-235 isotopes in process samples has been developed through a joint effort between Los Alamos National Laboratory and Martin Marietta Energy Systems, Portsmouth Gaseous Diffusion Plant. This device features an advanced robotics system which, in conjunction with stabilized passive gamma-ray and X-ray fluorescence detectors, provides for rapid, non-destructive analysis of process samples for improved special nuclear material accountability and process control.

  13. Automated Freedom from Interference Analysis for Automotive Software

    OpenAIRE

    Leitner-Fischer , Florian; Leue , Stefan; Liu , Sirui

    2016-01-01

    Freedom from Interference for automotive software systems developed according to the ISO 26262 standard means that a fault in a less safety critical software component will not lead to a fault in a more safety critical component. It is an important concern in the realm of functional safety for automotive systems. We present an automated method for the analysis of concurrency-related interferences based on the QuantUM approach and tool that we have previously developed....

  14. Completely automated modal analysis procedure based on the combination of different OMA methods

    Science.gov (United States)

    Ripamonti, Francesco; Bussini, Alberto; Resta, Ferruccio

    2018-03-01

    In this work a completely automated output-only Modal Analysis procedure is presented and its benefits are listed. Based on the merging of different Operational Modal Analysis methods and a statistical approach, the identification process has been improved, becoming more robust and returning only the true natural frequencies, damping ratios and mode shapes of the system. The effect of temperature can be taken into account as well, leading to a better tool for automated Structural Health Monitoring. The algorithm has been developed and tested on a numerical model of a scaled three-story steel building located in the laboratories of Politecnico di Milano.

  15. User needs and home automation infrastructures in industrialized and developing countries

    NARCIS (Netherlands)

    Brink, M.; Bronswijk, van J.E.M.H.

    2010-01-01

    Purpose Home automation systems are meant to improve the performance of a dwelling. Middleware, the ICT infrastructure of a home automation system, connects the different applications (lighting, heating, smoke sensing, etc) by providing the network and defining the technologies used for

  16. Automated Clean Chemistry for Bulk Analysis of Environmental Swipe Samples - FY17 Year End Report

    Energy Technology Data Exchange (ETDEWEB)

    Ticknor, Brian W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Metzger, Shalina C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); McBay, Eddy H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hexel, Cole R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Tevepaugh, Kayron N. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bostick, Debra A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-11-30

    Sample preparation methods for mass spectrometry are being automated using commercial-off-the-shelf (COTS) equipment to shorten lengthy and costly manual chemical purification procedures. This development addresses a serious need in the International Atomic Energy Agency’s Network of Analytical Laboratories (IAEA NWAL) to increase efficiency in the Bulk Analysis of Environmental Samples for Safeguards program with a method that allows unattended, overnight operation. In collaboration with Elemental Scientific Inc., the prepFAST-MC2 was designed based on COTS equipment. It was modified for uranium/plutonium separations using renewable columns packed with Eichrom TEVA and UTEVA resins, with a chemical separation method based on the Oak Ridge National Laboratory (ORNL) NWAL chemical procedure. The newly designed prepFAST-SR has had several upgrades compared with the original prepFAST-MC2. Both systems are currently installed in the Ultra-Trace Forensics Science Center at ORNL.

  17. Automated Measurement and Signaling Systems for the Transactional Network

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Brown, Richard; Price, Phillip; Page, Janie; Granderson, Jessica; Riess, David; Czarnecki, Stephen; Ghatikar, Girish; Lanzisera, Steven

    2013-12-31

    The Transactional Network Project is a multi-lab activity funded by the US Department of Energy's Building Technologies Office. The project team included staff from Lawrence Berkeley National Laboratory, Pacific Northwest National Laboratory and Oak Ridge National Laboratory. The team designed, prototyped and tested a transactional network (TN) platform to support energy, operational and financial transactions between any networked entities (equipment, organizations, buildings, grid, etc.). PNNL was responsible for the development of the TN platform, with agents for this platform developed by each of the three labs. LBNL contributed applications to measure the whole-building electric load response to various changes in building operations, particularly energy efficiency improvements and demand response events. We also provide a demand response signaling agent and an agent for cost savings analysis. LBNL and PNNL demonstrated actual transactions between packaged rooftop units and the electric grid using the platform and selected agents. This document describes the agents and applications developed by the LBNL team, and associated tests of the applications.

  18. SAMPO 90 - High resolution interactive gamma spectrum analysis including automation with macros

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Nikkinen, M.T.; Routti, J.T.

    1991-01-01

    SAMPO 90 is a high-performance gamma spectrum analysis program for personal computers. It uses high-resolution color graphics to display calibrations, spectra, fitting results as multiplet components, and analysis results. All the analysis phases can be done either under full interactive user control or by using macros for automated measurement and analysis sequences, including the control of MCAs and sample changers. Semi-automated calibrations for peak shapes (Gaussian with exponential tails), detector efficiency, and energy are available with a possibility for user intervention through interactive graphics. Accurate peak area determination of even the most complex multiplets, of up to 32 components, is accomplished using linear, non-linear and mixed mode fitting, where the component energies and areas can be either frozen or allowed to float in arbitrary combinations. Nuclide identification is done using associated-lines techniques which allow interference correction for fully overlapping peaks. Peaked Background Subtraction can be performed and Minimum Detectable Activities calculated. Attenuation corrections can be taken into account in detector efficiency calculation. The most common PC-based MCA spectrum formats (Canberra S100, Ortec ACE, Nucleus PCA, ND AccuSpec) are supported as well as ASCII spectrum files. A gamma-line library is included together with an editor for user-configurable libraries. The analysis reports and program parameters are fully customizable. Function key macros can be used to automate the most common analysis procedures. Small batch-type modules are additionally available for routine work. SAMPO 90 is the result of over twenty man-years of programming and contains 25,000 lines of Fortran, 10,000 lines of C, and 12,000 lines of assembler.
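    The peak-shape model mentioned above, a Gaussian with an exponential tail, can be sketched as a least-squares fit; this is a generic reconstruction of that shape family, not SAMPO 90's actual fitting code, and all parameter values are illustrative:

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss_exp_tail(x, area, mu, sigma, tail):
    """Gaussian that transitions to an exponential on the low-energy side
    at mu - tail*sigma (a common gamma-spectroscopy peak model)."""
    x = np.asarray(x, dtype=float)
    norm = area / (sigma * np.sqrt(2 * np.pi))
    core = norm * np.exp(-0.5 * ((x - mu) / sigma) ** 2)
    left = x < mu - tail * sigma
    # Exponential continuation matched in value and slope at the join point.
    core[left] = norm * np.exp(tail * (x[left] - mu) / sigma + tail ** 2 / 2)
    return core

# Synthetic single peak on a channel axis with additive noise.
rng = np.random.default_rng(6)
x = np.arange(200, 300, dtype=float)
y = gauss_exp_tail(x, 5000, 250, 3.0, 1.5) + rng.normal(0, 5, x.size)

popt, _ = curve_fit(gauss_exp_tail, x, y, p0=[4000, 249, 2.5, 1.0])
print("fitted area, centroid, sigma, tail:", np.round(popt, 2))
```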

  19. Automated misfire diagnosis in engines using torsional vibration and block rotation

    International Nuclear Information System (INIS)

    Chen, J; Randall, R B; Peeters, B; Auweraer, H Van der; Desmet, W

    2012-01-01

    Even though a lot of research has gone into diagnosing misfire in IC engines, most approaches use torsional vibration of the crankshaft, and only a few use the rocking motion (roll) of the engine block. Additionally, misfire diagnosis normally requires an expert to interpret the analysis results from measured vibration signals. Artificial Neural Networks (ANNs) are potential tools for the automated misfire diagnosis of IC engines, as they can learn the patterns corresponding to various faults. This paper proposes an ANN-based automated diagnostic system which combines torsional vibration and rotation of the block for more robust misfire diagnosis. A critical issue with ANN applications is network training, and it is improbable and/or uneconomical to expect to experience a sufficient number of different faults, or to generate them in seeded tests, to obtain sufficient experimental results for network training. Therefore, new simulation models, which can simulate combustion faults in engines, were developed. The simulation models are based on the thermodynamic and mechanical principles of IC engines, and the proposed misfire diagnostic system can therefore in principle be adapted for any engine. During the building of the models, based on a particular engine, some mechanical and physical parameters, for example the inertial properties of the engine parts and the parameters of the engine mounts, were first measured and calculated. A series of experiments was then carried out to capture the vibration signals for both the normal condition and a range of faults. The simulation models were updated and evaluated against the experimental results. Following signal processing of the experimental and simulated signals, the best features were selected as the inputs to the ANNs. The automated diagnostic system comprises three stages: misfire detection, misfire localization and severity identification. Multi-layer Perceptron (MLP) and Probabilistic Neural Networks were
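    A minimal stand-in for the classification stage, training an MLP on synthetic vibration-derived features with scikit-learn; the feature construction and labels are invented for illustration and do not reproduce the paper's engine models:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Invented features per engine cycle: e.g. torsional-vibration harmonic
# amplitudes and a block-roll amplitude. Class 0 = normal, 1 = misfire.
rng = np.random.default_rng(7)
normal = rng.normal([1.0, 0.2, 0.1], 0.05, size=(300, 3))
misfire = rng.normal([0.6, 0.8, 0.4], 0.05, size=(300, 3))
X = np.vstack([normal, misfire])
y = np.array([0] * 300 + [1] * 300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print(f"misfire detection accuracy: {clf.score(X_te, y_te):.2f}")
```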

  20. Test and Evaluation of a Prototyped Sensor-Camera Network for Persistent Intelligence, Surveillance, and Reconnaissance in Support of Tactical Coalition Networking Environments

    Science.gov (United States)

    2006-06-01

    networks is home automation. Wireless sensor networks can be employed in a home environment similar to the ways they are deployed in environmental...and industrial settings. Home automation provides increased control of home appliances and security. Climate control and security systems are the...most common types of home automation applications. However, as technology has increased, new applications are emerging. For example

  1. An investigation and comparison on network performance analysis

    OpenAIRE

    Lanxiaopu, Mi

    2012-01-01

    This thesis is generally about network performance analysis. It contains two parts. The theory part summarizes what network performance is and introduces the methods of network performance analysis. To answer what network performance is, a study of what network services are is carried out. Based on this background research, two important network performance metrics, network delay and throughput, should be included in network performance analysis. Among the methods of network a...

  2. AUTOMATED ANALYSIS OF BREAKERS

    Directory of Open Access Journals (Sweden)

    E. M. Farhadzade

    2014-01-01

    Full Text Available Breakers are Electric Power System equipment whose reliability greatly influences the reliability of Power Plants. In particular, breakers determine the structural reliability of the switchgear circuits of Power Stations and network substations. Failure of a breaker to switch off a short circuit, followed by failure of the reserve unit or of the long-distance protection system, quite often leads to a system emergency. The problem of improving breakers' reliability and reducing maintenance expenses is becoming ever more urgent as the maintenance and repair costs of oil circuit and air-break circuit breakers systematically increase. The main direction for solving this problem is the improvement of diagnostic control methods and the organization of on-condition maintenance. But this demands a great amount of statistical information about the nameplate data of breakers and their operating conditions, about their failures, testing and repair, advanced software developments, and a specific automated information system (AIS). A new AIS with the AISV logo was developed at the department "Reliability of power equipment" of AzRDSI of Energy. The main features of AISV are: to provide data-base security and accuracy; to carry out systematic control of breakers' conformity with operating conditions; to estimate individual reliability values and the characteristics of their change for a given combination of characteristics; and to provide the personnel responsible for the technical maintenance of breakers not only with information but also with methodological support, including recommendations for solving the given problem and advanced methods for their realization.

  3. Semi-automated volumetric analysis of artificial lymph nodes in a phantom study

    International Nuclear Information System (INIS)

    Fabel, M.; Biederer, J.; Jochens, A.; Bornemann, L.; Soza, G.; Heller, M.; Bolte, H.

    2011-01-01

    Purpose: Quantification of tumour burden in oncology requires accurate and reproducible image evaluation. The current standard is one-dimensional measurement (e.g. RECIST) with inherent disadvantages. Volumetric analysis is discussed as an alternative for therapy monitoring of lung and liver metastases. The aim of this study was to investigate the accuracy of semi-automated volumetric analysis of artificial lymph node metastases in a phantom study. Materials and methods: Fifty artificial lymph nodes were produced in a size range from 10 to 55 mm, some of them enhanced using iodine contrast media. All nodules were placed in an artificial chest phantom (artiCHEST®) within different surrounding tissues. MDCT was performed using different collimations (1–5 mm) with varying reconstruction kernels (B20f, B40f, B60f). Volume and RECIST measurements were performed using Oncology Software (Siemens Healthcare, Forchheim, Germany) and were compared to the reference volume and diameter by calculating absolute percentage errors. Results: The software performance allowed a robust volumetric analysis in a phantom setting. Unsatisfactory segmentation results were frequently found for native nodules within surrounding muscle. The absolute percentage error (APE) for volumetric analysis varied between 0.01 and 225%. No significant differences were seen between different reconstruction kernels. The most unsatisfactory segmentation results occurred at higher slice thicknesses (4 and 5 mm). Contrast-enhanced lymph nodes tended to show better segmentation results. Conclusion: The semi-automated 3D-volumetric analysis software tool allows a reliable and convenient segmentation of artificial lymph nodes in a phantom setting. Lymph nodes adjacent to tissue of similar density cause segmentation problems. For volumetric analysis of lymph node metastases in clinical routine a slice thickness of ≤3 mm and a medium-soft reconstruction kernel (e.g. B40f for Siemens scan systems) may be a suitable

  4. Automated visual inspection system based on HAVNET architecture

    Science.gov (United States)

    Burkett, K.; Ozbayoglu, Murat A.; Dagli, Cihan H.

    1994-10-01

    In this study, the HAusdorff-Voronoi NETwork (HAVNET) developed at the UMR Smart Engineering Systems Lab is tested on the recognition of mounted circuit components commonly used in printed circuit board assembly systems. The automated visual inspection system used consists of a CCD camera, neural-network-based image processing software and a data acquisition card connected to a PC. The experiments were run in the Smart Engineering Systems Lab in the Engineering Management Dept. of the University of Missouri-Rolla. The performance analysis shows that the vision system is capable of recognizing different components under uncontrolled lighting conditions without being affected by rotation or scale differences. The results obtained are promising and the system can be used in real manufacturing environments. Currently the system is being customized for a specific manufacturing application.

  5. Quantifying biodiversity using digital cameras and automated image analysis.

    Science.gov (United States)

    Roadknight, C. M.; Rose, R. J.; Barber, M. L.; Price, M. C.; Marshall, I. W.

    2009-04-01

    Monitoring the effects of extensive grazing on biodiversity in complex semi-natural habitats is labour intensive. There are also concerns about the standardization of semi-quantitative data collection. We have chosen to focus initially on automating the most time-consuming aspect: the image analysis. The advent of cheaper and more sophisticated digital camera technology has led to a sudden increase in the number of habitat monitoring images and the information that is being collected. We report on the use of automated trail cameras (designed for the game hunting market) to continuously capture images of grazer activity in a variety of habitats at Moor House National Nature Reserve, which is situated in the North of England at an average altitude of over 600 m. Rainfall is high, and in most areas the soil consists of deep peat (1 m to 3 m), populated by a mix of heather, mosses and sedges. The cameras have been continuously in operation over a 6-month period; daylight images are in full colour and night images (IR flash) are black and white. We have developed artificial-intelligence-based methods to assist in the analysis of the large number of images collected, generating alert states for new or unusual image conditions. This paper describes the data collection techniques, outlines the quantitative and qualitative data collected and proposes online and offline systems that can reduce the manpower overheads and increase focus on important subsets of the collected data. By converting digital image data into statistical composite data it can be handled in a similar way to other biodiversity statistics, thus improving the scalability of monitoring experiments. Unsupervised feature detection methods and supervised neural methods were tested and offered solutions for simplifying the process. Accurate (85 to 95%) categorization of faunal content can be obtained, requiring human intervention for only those images containing rare animals or unusual (undecidable) conditions, and

  6. Northern emporia and maritime networks. Modelling past communication using archaeological network analysis

    DEFF Research Database (Denmark)

    Sindbæk, Søren Michael

    2015-01-01

    preserve patterns of this interaction. Formal network analysis and modelling holds the potential to identify and demonstrate such patterns, where traditional methods often prove inadequate. The archaeological study of communication networks in the past, however, calls for radically different analytical...... this is not a problem of network analysis, but network synthesis: the classic problem of cracking codes or reconstructing black-box circuits. It is proposed that archaeological approaches to network synthesis must involve a contextual reading of network data: observations arising from individual contexts, morphologies...

  7. Optimizing transformations for automated, high throughput analysis of flow cytometry data.

    Science.gov (United States)

    Finak, Greg; Perez, Juan-Manuel; Weng, Andrew; Gottardo, Raphael

    2010-11-04

    In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter-optimized transformations improve visualization, reduce

  8. Optimizing transformations for automated, high throughput analysis of flow cytometry data

    Directory of Open Access Journals (Sweden)

    Weng Andrew

    2010-11-01

    Full Text Available Abstract Background In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. Results We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter
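    A toy illustration of the parameter-optimization idea for one of these families, the hyperbolic arcsine transform x -> asinh(x/b): choose the cofactor b by maximum likelihood under a simple normality assumption on the transformed data, with the Jacobian term included. This is a simplified stand-in for the criteria developed in the paper, and the data are synthetic:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(8)
# Synthetic fluorescence-like data: lognormal bulk plus near-zero noise.
data = np.concatenate([rng.lognormal(5, 1, 5000), rng.normal(0, 20, 1000)])

def neg_log_likelihood(log_b):
    b = np.exp(log_b)
    t = np.arcsinh(data / b)
    # Normal log-likelihood of the transformed data plus log|Jacobian| of
    # the transform, d/dx asinh(x/b) = 1/sqrt(x^2 + b^2).
    ll = -0.5 * np.sum(((t - t.mean()) / t.std()) ** 2) - data.size * np.log(t.std())
    ll += np.sum(-0.5 * np.log(data ** 2 + b ** 2))
    return -ll

res = minimize_scalar(neg_log_likelihood, bounds=(-2, 10), method="bounded")
print(f"optimized arcsinh cofactor b ~ {np.exp(res.x):.1f}")
```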

  9. Automated analysis of instructional text

    Energy Technology Data Exchange (ETDEWEB)

    Norton, L.M.

    1983-05-01

    The development of a capability for automated processing of natural language text is a long-range goal of artificial intelligence. This paper discusses an investigation into the issues involved in the comprehension of descriptive, as opposed to illustrative, textual material. The comprehension process is viewed as the conversion of knowledge from one representation into another. The proposed target representation consists of statements of the Prolog language, which can be interpreted both declaratively and procedurally, much like production rules. A computer program has been written to model in detail some ideas about this process. The program successfully analyzes several heavily edited paragraphs adapted from an elementary textbook on programming, automatically synthesizing as a result of the analysis a working Prolog program which, when executed, can parse and interpret LET commands in the BASIC language. The paper discusses the motivations and philosophy of the project, the many kinds of prerequisite knowledge which are necessary, and the structure of the text analysis program. A sentence-by-sentence account of the analysis of the sample text is presented, describing the syntactic and semantic processing involved. The paper closes with a discussion of lessons learned from the project, possible alternative approaches, and possible extensions for future work. The entire project is presented as illustrative of the nature and complexity of the text analysis process, rather than as providing definitive or optimal solutions to any aspects of the task. 12 references.

  10. A comprehensive probabilistic analysis model of oil pipelines network based on Bayesian network

    Science.gov (United States)

    Zhang, C.; Qin, T. X.; Jiang, B.; Huang, C.

    2018-02-01

    Oil pipeline networks are among the most important energy transportation facilities, but accidents in them may result in serious disasters. Analysis models for these accidents have been established mainly using three methods: event trees, accident simulation, and Bayesian networks. Among these methods, the Bayesian network is suitable for probabilistic analysis, but not all the important influencing factors have been considered, and no deployment rule for the factors has been established. This paper proposes a probabilistic analysis model of oil pipeline networks based on a Bayesian network. Most of the important influencing factors, including key environmental conditions and emergency response, are considered in this model. Moreover, the paper also introduces a deployment rule for these factors. The model can be used in probabilistic analysis and sensitivity analysis of oil pipeline network accidents.
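    A self-contained miniature of the approach: a small Bayesian network (corrosion -> leak, with emergency response quality moderating whether a leak escalates), with the posterior computed by brute-force enumeration. The structure and probabilities are invented for illustration, not taken from the paper:

```python
from itertools import product

# P(corrosion), P(leak | corrosion), P(disaster | leak, response_good)
p_corr = {True: 0.1, False: 0.9}
p_leak = {True: {True: 0.3, False: 0.7}, False: {True: 0.01, False: 0.99}}
p_disaster = {
    (True, True): 0.05,   # leak, good emergency response
    (True, False): 0.40,  # leak, poor emergency response
    (False, True): 0.0,
    (False, False): 0.0,
}
p_response_good = {True: 0.8, False: 0.2}

def joint(corr, leak, resp, dis):
    p = p_corr[corr] * p_leak[corr][leak] * p_response_good[resp]
    pd = p_disaster[(leak, resp)]
    return p * (pd if dis else 1 - pd)

# P(disaster) by summing the joint distribution over all other variables.
num = sum(joint(c, l, r, True) for c, l, r in product([True, False], repeat=3))
print(f"P(disaster) = {num:.4f}")

# Sensitivity: how much does poor emergency response raise the risk?
def p_dis_given_resp(resp):
    n = sum(joint(c, l, resp, True) for c, l in product([True, False], repeat=2))
    d = sum(joint(c, l, resp, dis) for c, l, dis in product([True, False], repeat=3))
    return n / d

print(f"P(disaster | good response) = {p_dis_given_resp(True):.4f}")
print(f"P(disaster | poor response) = {p_dis_given_resp(False):.4f}")
```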

  11. 2013 Chinese Intelligent Automation Conference

    CERN Document Server

    Deng, Zhidong

    2013-01-01

    Proceedings of the 2013 Chinese Intelligent Automation Conference presents selected research papers from the CIAC’13, held in Yangzhou, China. The topics include e.g. adaptive control, fuzzy control, neural network based control, knowledge based control, hybrid intelligent control, learning control, evolutionary mechanism based control, multi-sensor integration, failure diagnosis, and reconfigurable control. Engineers and researchers from academia, industry, and government can gain an inside view of new solutions combining ideas from multiple disciplines in the field of intelligent automation. Zengqi Sun and Zhidong Deng are professors at the Department of Computer Science, Tsinghua University, China.

  13. Capacity Analysis of Wireless Mesh Networks

    Directory of Open Access Journals (Sweden)

    M. I. Gumel

    2012-06-01

    Full Text Available Next-generation wireless networks have seen great development with the emergence of wireless mesh networks (WMNs), which can be regarded as a realistic solution for providing wireless broadband access. The limited available bandwidth makes capacity analysis of the network essential. While the network offers broadband wireless access to community and enterprise users, the problems that limit network capacity must be addressed to achieve optimum network performance. The wireless mesh network capacity analysis shows that the throughput of each mesh node degrades in the order of 1/n with an increasing number of nodes (n) in a linear topology. The degradation is found to be higher in a fully meshed network as a result of increased interference and MAC layer contention in the network.

  14. Isochronous wireless network for real-time communication in industrial automation

    CERN Document Server

    Trsek, Henning

    2016-01-01

    This dissertation proposes and investigates an isochronous wireless network for industrial control applications with guaranteed latencies and jitter. Based on a requirements analysis of real industrial applications and the characterisation of the wireless channel, the solution approach is developed. It consists of a TDMA-based medium access control, a dynamic resource allocation and the provision of a global time base for the wired and the wireless network. Due to the global time base, the solution approach allows a seamless and synchronous integration into existing wired Real-time Ethernet systems.
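    A toy sketch of the TDMA idea underlying such a design: a fixed superframe is divided into slots, each station transmits only in its assigned slot relative to the shared global time base, which bounds media-access latency and jitter by construction. The cycle and slot durations and station names are invented values:

```python
# TDMA superframe: each station owns one slot per cycle, so its worst-case
# media-access delay is bounded by one cycle and jitter by clock error only.
CYCLE_US = 1000          # superframe length in microseconds (illustrative)
SLOT_US = 100            # slot length (illustrative)
stations = ["plc", "drive1", "drive2", "sensor_a", "sensor_b"]

schedule = {name: i * SLOT_US for i, name in enumerate(stations)}

def next_tx_time(station: str, now_us: int) -> int:
    """Next transmit instant for a station, given the global time base."""
    offset = schedule[station]
    cycle_start = (now_us // CYCLE_US) * CYCLE_US
    t = cycle_start + offset
    return t if t >= now_us else t + CYCLE_US

now = 1234
for s in stations:
    print(f"{s:9s} -> next slot at t = {next_tx_time(s, now)} us")
```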

  15. Spectral Analysis of Rich Network Topology in Social Networks

    Science.gov (United States)

    Wu, Leting

    2013-01-01

    Social networks have received much attention these days. Researchers have developed different methods to study the structure and characteristics of the network topology. Our focus is on spectral analysis of the adjacency matrix of the underlying network. Recent work showed good properties in the adjacency spectral space but there are few…
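    A small sketch of adjacency spectral analysis: embed nodes using the leading eigenvectors of the adjacency matrix, where nodes in the same community land close together in the spectral space. The two-block toy graph is invented for illustration:

```python
import numpy as np

# Toy graph: two dense 5-node communities joined by a single bridge edge.
rng = np.random.default_rng(9)
A = np.zeros((10, 10))
for block in (range(0, 5), range(5, 10)):
    for i in block:
        for j in block:
            if i < j and rng.random() < 0.8:
                A[i, j] = A[j, i] = 1
A[4, 5] = A[5, 4] = 1  # bridge edge

# Spectral embedding: coordinates from the top-2 adjacency eigenvectors.
vals, vecs = np.linalg.eigh(A)   # ascending eigenvalues for symmetric A
embedding = vecs[:, -2:]         # two leading eigenvectors

for node, coords in enumerate(embedding):
    print(f"node {node}: {np.round(coords, 3)}")
```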

  16. Automated screening for retinopathy

    Directory of Open Access Journals (Sweden)

    A. S. Rodin

    2014-07-01

    Full Text Available Retinal pathology is a common cause of irreversible loss of central vision among the senior population. Detection of the earliest signs of retinal disease can be facilitated by viewing retinal images available from telemedicine networks. To facilitate the processing of retinal images, screening software applications based on image recognition technology are currently at various stages of development.Purpose: To develop and implement computerized image recognition software that can be used as a decision support technology for retinal image screening for various types of retinopathies.Methods: The software application for retinal image recognition was developed in the C++ language. It was tested on a dataset of 70 images with various types of pathological features (age-related macular degeneration, chorioretinitis, central serous chorioretinopathy and diabetic retinopathy).Results: It was shown that the system can achieve a sensitivity of 73% and a specificity of 72%.Conclusion: Automated detection of macular lesions using the proposed software can significantly reduce the manual grading workload. In addition, automated detection of retinal lesions can be implemented as a clinical decision support system for telemedicine screening. It is anticipated that further development of this technology can become part of a diagnostic image analysis system for electronic health records.

  17. Automated imaging system for single molecules

    Science.gov (United States)

    Schwartz, David Charles; Runnheim, Rodney; Forrest, Daniel

    2012-09-18

    There is provided a high throughput automated single molecule image collection and processing system that requires minimal initial user input. The unique features embodied in the present disclosure allow automated collection and initial processing of optical images of single molecules and their assemblies. Correct focus may be automatically maintained while images are collected. Uneven illumination in fluorescence microscopy is accounted for, and an overall robust imaging operation is provided yielding individual images prepared for further processing in external systems. Embodiments described herein are useful in studies of any macromolecules such as DNA, RNA, peptides and proteins. The automated image collection and processing system and method of same may be implemented and deployed over a computer network, and may be ergonomically optimized to facilitate user interaction.

  18. Automated derivation of failure symptoms for diagnosis of nuclear plant

    International Nuclear Information System (INIS)

    Washio, T.; Kitamura, M.; Kotajima, K.; Sugiyama, K.

    1986-01-01

    A method of automated derivation of failure symptoms was developed as an approach to computer-aided failure diagnosis in a nuclear power plant. The automated derivation is realized using a knowledge representation called the semantic network (S-net). The purpose of this paper is to demonstrate the applicability of the S-net representation as a basic tool for deriving failure symptoms. If one can generate symptoms automatically, the computer-aided plant safety analysis and diagnosis can be performed easily by evaluating the influence of the failures on the whole plant. A specific description format called a 'network list' was introduced to implement the knowledge of the structure of the plant. The failure symptoms are derived automatically, based on the knowledge of the structure of the plant, using a PROLOG-based database handling system. This approach allows us to derive the failure symptoms of the plant without using conventional event-chain models (e.g. a cause-consequence tree) which are subject to human errors in their design and implementation. Applicability of this method was evaluated with a simulation model of the dynamics of the secondary system of a PWR. (author)
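    A toy reconstruction of the idea in Python (the original used a PROLOG-based system): the plant structure is encoded as a directed network of influences, and the symptoms of a failure are derived by traversing everything reachable downstream of the failed component. The mini plant model below is invented for illustration:

```python
from collections import deque

# Directed "influences" edges of a miniature plant structure (invented).
influences = {
    "feedwater_pump": ["steam_generator_level"],
    "steam_generator_level": ["steam_flow", "level_alarm"],
    "steam_flow": ["turbine_power"],
    "turbine_power": ["generator_output"],
    "level_alarm": [],
    "generator_output": [],
}

def derive_symptoms(failed: str) -> list[str]:
    """All plant variables reachable from the failed component, i.e. the
    observable symptoms the failure can produce."""
    seen, queue = set(), deque(influences.get(failed, []))
    while queue:
        node = queue.popleft()
        if node not in seen:
            seen.add(node)
            queue.extend(influences.get(node, []))
    return sorted(seen)

print("symptoms of feedwater_pump failure:", derive_symptoms("feedwater_pump"))
```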

  19. INVESTIGATION OF NEURAL NETWORK ALGORITHM FOR DETECTION OF NETWORK HOST ANOMALIES IN THE AUTOMATED SEARCH FOR XSS VULNERABILITIES AND SQL INJECTIONS

    Directory of Open Access Journals (Sweden)

    Y. D. Shabalin

    2016-03-01

    Full Text Available The problem of aberrant behavior detection for a network-communicating computer is discussed. A novel approach based on the dynamic response of the computer is introduced. The computer is treated as a multiple-input multiple-output (MIMO) plant. To characterize the dynamic response of the computer to incoming requests, a correlation between the input data rate and the observed output response (outgoing data rate and performance metrics) is used. To distinguish normal and aberrant behavior of the computer, a one-class neural network classifier is used. The general idea of the algorithm is described briefly. The configuration of the network testbed for experiments with real attacks and their detection is presented (the automated search for XSS vulnerabilities and SQL injections). Real XSS and SQL injection attack software was used to model the intrusion scenario. One would expect aberrant behavior of the server to reveal itself through an instantaneous correlation response significantly different from any of the normal ones. It is evident that the correlation picture of attacks from running different malware, the overriding of the site homepage on the server (so-called defacing), and hardware and software failures will differ from the correlation picture of normal functioning. The intrusion detection algorithm is investigated to estimate the false positive and false negative rates in relation to the algorithm parameters. The importance of the selection of the correlation width and threshold values is emphasized. The false positive rate was estimated along the time series of experimental data. Some ideas for enhancing the quality and robustness of the algorithm are mentioned.
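    A rough analogue of the one-class detection stage, substituting scikit-learn's OneClassSVM for the paper's one-class neural network: the model is fit on correlation features from normal traffic only and then flags deviating samples. The features and values are synthetic stand-ins:

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(10)
# Correlation-style features from normal operation, e.g. correlation of
# outgoing rate and CPU load with the incoming request rate (synthetic).
normal = rng.normal([0.9, 0.7], 0.05, size=(500, 2))

clf = OneClassSVM(nu=0.05, gamma="scale").fit(normal)

# New observations: two normal-looking samples and one aberrant one.
samples = np.array([[0.88, 0.71], [0.93, 0.66], [0.30, 0.95]])
for s, label in zip(samples, clf.predict(samples)):
    status = "normal" if label == 1 else "aberrant"
    print(f"features {s} -> {status}")
```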

  20. Enhancing the Current Automated Teller Machine (ATM) in Nigerian ...

    African Journals Online (AJOL)

This is going to be achieved by creating another input device that collects the money into the ATM system, reads its denomination and either saves it or transfers it to the required customer ... Keywords: Automated Teller Machine (ATM), Interswitch, Local Area Network (LAN), Wide Area Network (WAN), Telecommunication.

  1. Automated analysis of organic particles using cluster SIMS

    Energy Technology Data Exchange (ETDEWEB)

    Gillen, Greg; Zeissler, Cindy; Mahoney, Christine; Lindstrom, Abigail; Fletcher, Robert; Chi, Peter; Verkouteren, Jennifer; Bright, David; Lareau, Richard T.; Boldman, Mike

    2004-06-15

    Cluster primary ion bombardment combined with secondary ion imaging is used on an ion microscope secondary ion mass spectrometer for the spatially resolved analysis of organic particles on various surfaces. Compared to the use of monoatomic primary ion beam bombardment, the use of a cluster primary ion beam (SF{sub 5}{sup +} or C{sub 8}{sup -}) provides significant improvement in molecular ion yields and a reduction in beam-induced degradation of the analyte molecules. These characteristics of cluster bombardment, along with automated sample stage control and custom image analysis software are utilized to rapidly characterize the spatial distribution of trace explosive particles, narcotics and inkjet-printed microarrays on a variety of surfaces.

  2. Analysis of Network Parameters Influencing Performance of Hybrid Multimedia Networks

    Directory of Open Access Journals (Sweden)

    Dominik Kovac

    2013-10-01

Full Text Available Multimedia networking is an emerging subject that currently attracts the attention of the research and industrial communities. This environment provides new entertainment services and business opportunities merged with all well-known network services like VoIP calls or file transfers. Such a heterogeneous system has to be able to satisfy all network and end-user requirements, which are increasing constantly. Therefore, simulation tools enabling deep analysis are highly needed in order to find the key performance indicators and the factors that most influence the overall quality of a specific network service. This paper provides a study of network parameters like communication technology, routing protocol, QoS mechanism, etc., and their effect on the performance of a hybrid multimedia network. The analysis was performed in the OPNET Modeler environment and the most interesting results are discussed at the end of this paper.

  3. Automation of the Analysis and Classification of the Line Material

    Directory of Open Access Journals (Sweden)

    A. A. Machuev

    2011-03-01

Full Text Available The work is devoted to automating the process of analyzing and verifying various formats of data presentation, for which special software has been developed. The special software was developed and tested on example files with typical extensions whose structural features are known in advance.

  4. Social network analysis and supply chain management

    Directory of Open Access Journals (Sweden)

    Raúl Rodríguez Rodríguez

    2016-01-01

Full Text Available This paper deals with social network analysis and how it could be integrated within supply chain management from a decision-making point of view. Even though the benefits of using social network analysis are widely accepted in both academic and industry/services contexts, there is still a lack of solid frameworks that allow decision-makers to connect the usage and results of social network analysis – mainly information and knowledge flows and derived results – with supply chain management objectives and goals. This paper gives an overview of social network analysis and the main social network analysis metrics, discusses supply chain performance and, finally, identifies how future frameworks could close the gap and link the results of social network analysis with supply chain management decision-making processes.
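
    The metrics mentioned above are straightforward to compute with standard tooling; a minimal sketch using networkx on an invented supply-chain graph (the node names are illustrative only):

```python
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("supplier_A", "manufacturer"), ("supplier_B", "manufacturer"),
    ("manufacturer", "distributor"), ("distributor", "retailer_1"),
    ("distributor", "retailer_2"),
])

# Degree centrality: how directly connected each actor is.
print(nx.degree_centrality(G))

# Betweenness centrality: which actors broker information flows
# between otherwise separated parts of the chain.
print(nx.betweenness_centrality(G))
```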

  5. An independent evaluation of a new method for automated interpretation of lung scintigrams using artificial neural networks

    International Nuclear Information System (INIS)

    Holst, H.; Jaerund, A.; Evander, E.; Taegil, K.; Edenbrandt, L.; Maare, K.; Aastroem, K.; Ohlsson, M.

    2001-01-01

The purpose of this study was to evaluate a new automated method for the interpretation of lung perfusion scintigrams using patients from a hospital other than that where the method was developed, and then to compare the performance of the technique against that of experienced physicians. A total of 1,087 scintigrams from patients with suspected pulmonary embolism comprised the training group. The test group consisted of scintigrams from 140 patients collected in a hospital different to that from which the training group had been drawn. An artificial neural network was trained using 18 automatically obtained features from each set of perfusion scintigrams. The image processing techniques included alignment to templates, construction of quotient images based on the perfusion/template images, and finally calculation of features describing segmental perfusion defects in the quotient images. The templates represented lungs of normal size and shape without any pathological changes. The performance of the neural network was compared with that of three experienced physicians who read the same test scintigrams according to the modified PIOPED criteria using, in addition to perfusion images, ventilation images when available and chest radiographs for all patients. Performances were measured as the area under the receiver operating characteristic curve. The performance of the neural network evaluated in the test group was 0.88 (95% confidence limits 0.81-0.94). The performance of the three experienced experts was in the range 0.87-0.93 when using the perfusion images, chest radiographs and ventilation images when available. Perfusion scintigrams can thus be interpreted for the diagnosis of pulmonary embolism by an automated method even in a hospital other than the one where the method was developed. The performance of this method is similar to that of experienced physicians, even though the physicians, in addition to perfusion images, also had access to ventilation images for
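
    Performance here is the area under the ROC curve; as a minimal worked example of that computation (with made-up labels and network outputs, not the study's data):

```python
from sklearn.metrics import roc_auc_score

y_true = [0, 0, 1, 1, 0, 1, 0, 1]                    # 1 = pulmonary embolism
y_score = [0.1, 0.4, 0.8, 0.7, 0.2, 0.9, 0.3, 0.6]   # network outputs
print(roc_auc_score(y_true, y_score))                # 1.0 for this toy data
```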

  6. Building Automation Networks for Smart Grids

    Directory of Open Access Journals (Sweden)

    Peizhong Yi

    2011-01-01

    Full Text Available Smart grid, as an intelligent power generation, distribution, and control system, needs various communication systems to meet its requirements. The ability to communicate seamlessly across multiple networks and domains is an open issue which is yet to be adequately addressed in smart grid architectures. In this paper, we present a framework for end-to-end interoperability in home and building area networks within smart grids. 6LoWPAN and the compact application protocol are utilized to facilitate the use of IPv6 and Zigbee application profiles such as Zigbee smart energy for network and application layer interoperability, respectively. A differential service medium access control scheme enables end-to-end connectivity between 802.15.4 and IP networks while providing quality of service guarantees for Zigbee traffic over Wi-Fi. We also address several issues including interference mitigation, load scheduling, and security and propose solutions to them.

  7. VMware vSphere PowerCLI Reference Automating vSphere Administration

    CERN Document Server

    Dekens, Luc; Sizemore, Glenn; van Lieshout, Arnim; Medd, Jonathan

    2011-01-01

    Your One-Stop Reference for VMware vSphere Automation If you manage vSphere in a Windows environment, automating routine tasks can save you time and increase efficiency. VMware vSphere PowerCLI is a set of pre-built commands based on Windows PowerShell that is designed to help you automate vSphere processes involving virtual machines, datacenters, storage, networks, and more. This detailed guide-using a practical, task-based approach and real-world examples-shows you how to get the most out of PowerCLI's handy cmdlets. Learn how to: Automate vCenter Server and ESX/ESX(i) Server deployment and

  8. An Automated Approach to Syntax-based Analysis of Classical Latin

    Directory of Open Access Journals (Sweden)

    Anjalie Field

    2016-12-01

    Full Text Available The goal of this study is to present an automated method for analyzing the style of Latin authors. Many of the common automated methods in stylistic analysis are based on lexical measures, which do not work well with Latin because of the language’s high degree of inflection and free word order. In contrast, this study focuses on analysis at a syntax level by examining two constructions, the ablative absolute and the cum clause. These constructions are often interchangeable, which suggests an author’s choice of construction is typically more stylistic than functional. We first identified these constructions in hand-annotated texts. Next we developed a method for identifying the constructions in unannotated texts, using probabilistic morphological tagging. Our methods identified constructions with enough accuracy to distinguish among different genres and different authors. In particular, we were able to determine which book of Caesar’s Commentarii de Bello Gallico was not written by Caesar. Furthermore, the usage of ablative absolutes and cum clauses observed in this study is consistent with the usage scholars have observed when analyzing these texts by hand. The proposed methods for an automatic syntax-based analysis are shown to be valuable for the study of classical literature.

  9. Automated gamma spectrometry and data analysis on radiometric neutron dosimeters

    International Nuclear Information System (INIS)

    Matsumoto, W.Y.

    1983-01-01

An automated gamma-ray spectrometry system was designed and implemented by the Westinghouse Hanford Company at the Hanford Engineering Development Laboratory (HEDL) to analyze radiometric neutron dosimeters. Unattended, automatic, 24 hour/day, 7 day/week operation with online data analysis and mainframe-computer compatible magnetic tape output are system features. The system was used to analyze most of the 4000-plus radiometric monitors (RM's) from extensive reactor characterization tests during startup and initial operation of the Fast Flux Test Facility (FFTF). The FFTF, operated by HEDL for the Department of Energy, incorporates a 400 MW(th) sodium-cooled fast reactor. Automated system hardware consists of a high purity germanium detector, a computerized multichannel analyzer data acquisition system (Nuclear Data, Inc. Model 6620) with two dual 2.5 Mbyte magnetic disk drives plus two 10.5 inch reel magnetic tape units for mass storage of programs/data, and an automated Sample Changer-Positioner (ASC-P) run with a programmable controller. The ASC-P has a 200 sample capacity and 12 calibrated counting (analysis) positions ranging from 6 inches (15 cm) to more than 20 feet (6.1 m) from the detector. The system software was programmed in Fortran at HEDL, except for the Nuclear Data, Inc. Peak Search and Analysis Program and Disk Operating System (MIDAS+)

  10. An Analysis of Database Replication Technologies with Regard to Deep Space Network Application Requirements

    Science.gov (United States)

    Connell, Andrea M.

    2011-01-01

    The Deep Space Network (DSN) has three communication facilities which handle telemetry, commands, and other data relating to spacecraft missions. The network requires these three sites to share data with each other and with the Jet Propulsion Laboratory for processing and distribution. Many database management systems have replication capabilities built in, which means that data updates made at one location will be automatically propagated to other locations. This project examines multiple replication solutions, looking for stability, automation, flexibility, performance, and cost. After comparing these features, Oracle Streams is chosen for closer analysis. Two Streams environments are configured - one with a Master/Slave architecture, in which a single server is the source for all data updates, and the second with a Multi-Master architecture, in which updates originating from any of the servers will be propagated to all of the others. These environments are tested for data type support, conflict resolution, performance, changes to the data structure, and behavior during and after network or server outages. Through this experimentation, it is determined which requirements of the DSN can be met by Oracle Streams and which cannot.

  11. Network meta-analysis: an introduction for pharmacists.

    Science.gov (United States)

    Xu, Yina; Amiche, Mohamed Amine; Tadrous, Mina

    2018-05-21

    Network meta-analysis is a new tool used to summarize and compare studies for multiple interventions, irrespective of whether these interventions have been directly evaluated against each other. Network meta-analysis is quickly becoming the standard in conducting therapeutic reviews and clinical guideline development. However, little guidance is available to help pharmacists review network meta-analysis studies in their practice. Major institutions such as the Cochrane Collaboration, Agency for Healthcare Research and Quality, Canadian Agency for Drugs and Technologies in Health, and National Institute for Health and Care Excellence Decision Support Unit have endorsed utilizing network meta-analysis to establish therapeutic evidence and inform decision making. Our objective is to introduce this novel technique to pharmacy practitioners, and highlight key assumptions behind network meta-analysis studies.

  12. Programmable automation systems in PSA

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1997-06-01

The Finnish safety authority (STUK) requires plant-specific PSAs, and quantitative safety goals are set on different levels. The reliability analysis is more problematic when critical safety functions are realized by applying programmable automation systems. Conventional modeling techniques do not necessarily apply to the analysis of these systems, and the quantification seems to be impossible. However, it is important to analyze the contribution of programmable automation systems to plant safety, and PSA is the only method with a system-analytical view over safety. This report discusses the applicability of PSA methodology (fault tree analyses, failure modes and effects analyses) in the analysis of programmable automation systems. The problem of how to decompose programmable automation systems for reliability modeling purposes is discussed. In addition to the qualitative analysis and structural reliability modeling issues, the possibility to evaluate failure probabilities of programmable automation systems is considered. One solution to the quantification issue is the use of expert judgements, and the principles for applying expert judgements are discussed in the paper. A framework to apply expert judgements is outlined. Further, the impacts of subjective estimates on the interpretation of PSA results are discussed. (orig.) (13 refs.)
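
    As a toy illustration of the quantification problem discussed above, the following sketch propagates basic-event probabilities through AND/OR gates of a hypothetical fault tree, assuming independent events; the probabilities are invented (e.g. the kind of figure expert judgement might supply).

```python
def and_gate(*p):
    """All inputs must fail (independent events)."""
    out = 1.0
    for pi in p:
        out *= pi
    return out

def or_gate(*p):
    """Any single input failing suffices."""
    out = 1.0
    for pi in p:
        out *= (1.0 - pi)
    return 1.0 - out

p_software = 1e-4              # hypothetical, e.g. from expert judgement
p_hw_a, p_hw_b = 1e-3, 1e-3    # redundant hardware channels

# Safety function fails if the software fails OR both channels fail.
print(or_gate(p_software, and_gate(p_hw_a, p_hw_b)))   # ~1.01e-4
```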

  13. Social network analysis community detection and evolution

    CERN Document Server

    Missaoui, Rokia

    2015-01-01

    This book is devoted to recent progress in social network analysis with a high focus on community detection and evolution. The eleven chapters cover the identification of cohesive groups, core components and key players either in static or dynamic networks of different kinds and levels of heterogeneity. Other important topics in social network analysis such as influential detection and maximization, information propagation, user behavior analysis, as well as network modeling and visualization are also presented. Many studies are validated through real social networks such as Twitter. This edit

  14. Design and Demonstration of Automated Data Analysis Algorithms for Ultrasonic Inspection of Complex Composite Panels with Bonds

    Science.gov (United States)

    2016-02-01

To reduce the data review burden and improve the reliability of the ultrasonic inspection of large composite structures, automated data analysis (ADA) algorithms were designed and demonstrated, working from time-of-flight indications and thickness and backwall C-scan images. All of the ADA-called indications are sorted into three groups: true positives (TP), missed calls (MC) and false calls (FC); an indication position error is also noted. Subject terms: automated data analysis (ADA) algorithms; time-of-flight indications; backwall amplitude dropout.

  15. Knowledge-Based Aircraft Automation: Managers Guide on the use of Artificial Intelligence for Aircraft Automation and Verification and Validation Approach for a Neural-Based Flight Controller

    Science.gov (United States)

    Broderick, Ron

    1997-01-01

    The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network

  16. Analysis of computer networks

    CERN Document Server

    Gebali, Fayez

    2015-01-01

This textbook presents the mathematical theory and techniques necessary for analyzing and modeling high-performance global networks, such as the Internet. The three main building blocks of high-performance networks are links, switching equipment connecting the links together, and software employed at the end nodes and intermediate switches. This book provides the basic techniques for modeling and analyzing these last two components. Topics covered include, but are not limited to: Markov chains and queuing analysis, traffic modeling, interconnection networks, and switch architectures and buffering strategies. The book provides techniques for modeling and analysis of network software and switching equipment; discusses design options used to build efficient switching equipment; includes many worked examples of the application of discrete-time Markov chains to communication systems; and covers the mathematical theory and techniques necessary for ana...
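
    As a worked example in the spirit of the book's topics, the steady-state distribution of a small discrete-time Markov chain can be read off the eigenvector of the transition matrix for eigenvalue 1; the three-state chain below is invented for illustration.

```python
import numpy as np

# Row-stochastic transition matrix for a 3-state buffer (empty/busy/full).
P = np.array([[0.7, 0.3, 0.0],
              [0.2, 0.6, 0.2],
              [0.0, 0.4, 0.6]])

# Solve pi P = pi with sum(pi) = 1 via the eigenvector for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
pi /= pi.sum()
print(pi)   # long-run fraction of time spent in each state
```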

  17. Automated processing of measuring information and control processes of eutrophication in water for household purpose, based on artificial neural networks

    Directory of Open Access Journals (Sweden)

    О.М. Безвесільна

    2006-04-01

Full Text Available The possibilities of applying information and computer technologies to the automated processing of measurement information on the development of algae (eutrophication) in household reservoirs are considered. The input data for the study of eutrophication processes are video images of water samples, which are used to determine the geometric characteristics, number and biomass of the algae. For processing the measurement information, methods of digital video image processing and the mathematical tools of artificial neural networks are proposed.

  18. Automated analysis of free speech predicts psychosis onset in high-risk youths

    Science.gov (United States)

    Bedi, Gillinder; Carrillo, Facundo; Cecchi, Guillermo A; Slezak, Diego Fernández; Sigman, Mariano; Mota, Natália B; Ribeiro, Sidarta; Javitt, Daniel C; Copelli, Mauro; Corcoran, Cheryl M

    2015-01-01

    Background/Objectives: Psychiatry lacks the objective clinical tests routinely used in other specializations. Novel computerized methods to characterize complex behaviors such as speech could be used to identify and predict psychiatric illness in individuals. AIMS: In this proof-of-principle study, our aim was to test automated speech analyses combined with Machine Learning to predict later psychosis onset in youths at clinical high-risk (CHR) for psychosis. Methods: Thirty-four CHR youths (11 females) had baseline interviews and were assessed quarterly for up to 2.5 years; five transitioned to psychosis. Using automated analysis, transcripts of interviews were evaluated for semantic and syntactic features predicting later psychosis onset. Speech features were fed into a convex hull classification algorithm with leave-one-subject-out cross-validation to assess their predictive value for psychosis outcome. The canonical correlation between the speech features and prodromal symptom ratings was computed. Results: Derived speech features included a Latent Semantic Analysis measure of semantic coherence and two syntactic markers of speech complexity: maximum phrase length and use of determiners (e.g., which). These speech features predicted later psychosis development with 100% accuracy, outperforming classification from clinical interviews. Speech features were significantly correlated with prodromal symptoms. Conclusions: Findings support the utility of automated speech analysis to measure subtle, clinically relevant mental state changes in emergent psychosis. Recent developments in computer science, including natural language processing, could provide the foundation for future development of objective clinical tests for psychiatry. PMID:27336038
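
    A hedged sketch of one of the derived features: "semantic coherence" can be approximated as the cosine similarity between consecutive sentences in an LSA space. The pipeline below (TF-IDF plus truncated SVD in scikit-learn, on invented sentences) illustrates the concept only and is not the authors' implementation.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

sentences = [
    "I went to the store this morning.",
    "The store was crowded so I left quickly.",
    "Quantum fields vibrate beneath the harvest moon.",  # tangential
]

# Embed sentences in a low-dimensional latent semantic space.
X = TfidfVectorizer().fit_transform(sentences)
Z = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)

# Coherence of each sentence-to-sentence transition.
sims = [cosine_similarity(Z[i:i + 1], Z[i + 1:i + 2])[0, 0]
        for i in range(len(sentences) - 1)]
print(np.min(sims))   # a low minimum coherence flags a derailed narrative
```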

  19. Automated Image Analysis of Offshore Infrastructure Marine Biofouling

    Directory of Open Access Journals (Sweden)

    Kate Gormley

    2018-01-01

Full Text Available In the UK, some of the oldest oil and gas installations have been in the water for over 40 years and have considerable colonisation by marine organisms, which may lead to both industry challenges and/or potential biodiversity benefits (e.g., artificial reefs). The project objective was to test the use of an automated image analysis software (CoralNet) on images of marine biofouling from offshore platforms on the UK continental shelf, with the aim of (i) training the software to identify the main marine biofouling organisms on UK platforms; (ii) testing the software performance on 3 platforms under 3 different analysis criteria (methods A–C); (iii) calculating the percentage cover of marine biofouling organisms and (iv) providing recommendations to industry. Following software training with 857 images, and testing of three platforms, results showed that the diversity of the three platforms ranged from low (in the central North Sea) to moderate (in the northern North Sea). The two central North Sea platforms were dominated by the plumose anemone Metridium dianthus; the northern North Sea platform showed less obvious species domination. Three different analysis criteria were created, where the method of selecting points, the number of points assessed and the confidence level threshold (CT) varied: (method A) random selection of 20 points with CT 80%; (method B) stratified random selection of 50 points with CT 90%; and (method C) a grid approach of 100 points with CT 90%. Performed across the three platforms, the results showed no significant differences across the majority of species and comparison pairs. No significant difference (across all species) was noted between confirmed annotation methods (A, B and C). The software was considered to perform well for the classification of the main fouling species in the North Sea. Overall, the study showed that the use of automated image analysis software may enable a more efficient and consistent

  20. Performance Analysis of IEEE 802.15.4 Compliant Wireless Devices for Heterogeneous Indoor Home Automation Environments

    Directory of Open Access Journals (Sweden)

    Juan Antonio Nazabal

    2012-01-01

Full Text Available The influence of the topology and morphology of complex indoor scenarios on the deployment of wireless sensor networks and wireless systems applied to home and building automation is analyzed. The existence of loss mechanisms such as material absorption (walls, furniture, etc.) and strong multipath components, as well as the increase in the number of wireless sensors within indoor scenarios, increases the relevance of the configuration of heterogeneous wireless systems. Simulation results obtained by means of empirical models are compared with results from an in-house 3D ray launching code as well as with measurements from wireless sensor networks, illustrating the strong influence of the indoor scenario on the overall performance. The use of adequate radioplanning strategies leads to optimal wireless network deployments in terms of capacity, quality of service, and reduced power consumption.

  1. Automated analysis for nitrate by hydrazine reduction

    Energy Technology Data Exchange (ETDEWEB)

    Kamphake, L J; Hannah, S A; Cohen, J M

    1967-01-01

    An automated procedure for the simultaneous determinations of nitrate and nitrite in water is presented. Nitrite initially present in the sample is determined by a conventional diazotization-coupling reaction. Nitrate in another portion of sample is quantitatively reduced with hydrazine sulfate to nitrite which is then determined by the same diazotization-coupling reaction. Subtracting the nitrite initially present in the sample from that after reduction yields nitrite equivalent to nitrate initially in the sample. The rate of analysis is 20 samples/hr. Applicable range of the described method is 0.05-10 mg/l nitrite or nitrate nitrogen; however, increased sensitivity can be obtained by suitable modifications.
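
    The arithmetic at the heart of the method is a simple difference; a minimal sketch (the example values are invented):

```python
def nitrate_mg_per_l(nitrite_after_reduction, nitrite_initial):
    """Both inputs in mg/l nitrogen, measured by the same
    diazotization-coupling reaction before and after hydrazine
    reduction of nitrate to nitrite."""
    return nitrite_after_reduction - nitrite_initial

print(nitrate_mg_per_l(2.35, 0.40))   # 1.95 mg/l nitrate nitrogen
```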

  2. UAV : Warnings From Multiple Automated Static Analysis Tools At A Glance

    NARCIS (Netherlands)

    Buckers, T.B.; Cao, C.S.; Doesburg, M.S.; Gong, Boning; Wang, Sunwei; Beller, M.M.; Zaidman, A.E.; Pinzger, Martin; Bavota, Gabriele; Marcus, Andrian

    2017-01-01

    Automated Static Analysis Tools (ASATs) are an integral part of today’s software quality assurance practices. At present, a plethora of ASATs exist, each with different strengths. However, there is little guidance for developers on which of these ASATs to choose and combine for a project. As a

  3. An overview of the first decade of PollyNET: an emerging network of automated Raman-polarization lidars for continuous aerosol profiling

    Directory of Open Access Journals (Sweden)

    H. Baars

    2016-04-01

Full Text Available A global vertically resolved aerosol data set covering more than 10 years of observations at more than 20 measurement sites distributed from 63° N to 52° S and 72° W to 124° E has been achieved within the Raman and polarization lidar network PollyNET. This network consists of portable, remote-controlled multiwavelength-polarization-Raman lidars (Polly) for automated and continuous 24/7 observations of clouds and aerosols. PollyNET is an independent, voluntary, and scientific network. All Polly lidars feature a standardized instrument design with different capabilities ranging from single wavelength to multiwavelength systems, and now apply unified calibration, quality control, and data analysis. The observations are processed in near-real time without manual intervention, and are presented online at http://polly.tropos.de/. The paper gives an overview of the observations on four continents and two research vessels obtained with eight Polly systems. The specific aerosol types at these locations (mineral dust, smoke, dust-smoke and other dusty mixtures, urban haze, and volcanic ash) are identified by their Ångström exponent, lidar ratio, and depolarization ratio. The vertical aerosol distribution at the PollyNET locations is discussed on the basis of more than 55 000 automatically retrieved 30 min particle backscatter coefficient profiles at 532 nm as this operating wavelength is available for all Polly lidar systems. A seasonal analysis of measurements at selected sites revealed typical and extraordinary aerosol conditions as well as seasonal differences. These studies show the potential of PollyNET to support the establishment of a global aerosol climatology that covers the entire troposphere.

  4. Automated analysis of damages for radiation in plastics surfaces

    International Nuclear Information System (INIS)

    Andrade, C.; Camacho M, E.; Tavera, L.; Balcazar, M.

    1990-02-01

Analysis of damage caused by radiation in a polymer characterized by the optical properties of polished surfaces, uniformity and chemical resistance, such as acrylic, which is resistant up to temperatures of 150 degrees centigrade and weighs approximately half as much as glass. The objective of this work is the development of a method that analyzes, in automated form, the surface damage induced by radiation in plastic materials by means of an image analyzer. (Author)

  5. Networks and Bargaining in Policy Analysis

    DEFF Research Database (Denmark)

    Bogason, Peter

    2006-01-01

A discussion of the fight between proponents of rationalistic policy analysis and more political interaction models for policy analysis. The latter group is the foundation for the many network models of policy analysis of today.

  6. Control and Automation Systems at the TSO/DSO interface

    DEFF Research Database (Denmark)

    Silvestro, F.; Pilo, F.; Mauri, G.

    2017-01-01

DNOs (Distribution Network Operators) have to assure secure, reliable and good power quality, without taking into consideration any real-time operation of the active components present in their systems. In order to accomplish their missions, DNOs will have to exploit the support of control and automation systems and protection systems, but also “external inputs” coming from the Transmission Networks (operated by the Transmission System Operator) and the forthcoming “smart world” (i.e. smart cities, smart transports, smart industries, smart customers, etc.). The processing of all such inputs will still have to be subordinated to the possibility for Distribution Companies to operate their network under their ultimate responsibility (DSO – Distribution System Operators). This paper presents an overview of the activities of the CIGRE C6.25 Joint Working Group (JWG), focusing on the control and automation systems for the future...

  7. NMRNet: A deep learning approach to automated peak picking of protein NMR spectra.

    Science.gov (United States)

    Klukowski, Piotr; Augoff, Michal; Zieba, Maciej; Drwal, Maciej; Gonczarek, Adam; Walczak, Michal J

    2018-03-14

Automated selection of signals in protein NMR spectra, known as peak picking, has been studied for over 20 years; nevertheless, existing peak picking methods are still largely deficient. Accurate and precise automated peak picking would accelerate structure calculation and the analysis of dynamics and interactions of macromolecules. Recent advancements in handling big data, together with an outburst of machine learning techniques, offer an opportunity to tackle the peak picking problem substantially faster than manual picking and on par with human accuracy. In particular, deep learning has proven to systematically achieve human-level performance in various recognition tasks, and thus emerges as an ideal tool to address automated identification of NMR signals. We have applied a convolutional neural network for visual analysis of multidimensional NMR spectra. A comprehensive test on 31 manually-annotated spectra has demonstrated top-tier average precision (AP) of 0.9596, 0.9058 and 0.8271 for backbone, side-chain and NOESY spectra, respectively. Furthermore, a combination of extracted peak lists with the automated assignment routine FLYA outperformed other methods, including the manual one, and led to correct resonance assignment at the levels of 90.40%, 89.90% and 90.20% for three benchmark proteins. The proposed model is part of the Dumpling software (a platform for protein NMR data analysis), and is available at https://dumpling.bio/. Contact: michaljerzywalczak@gmail.com, piotr.klukowski@pwr.edu.pl. Supplementary data are available at Bioinformatics online.

  8. Recent developments in the dissolution and automated analysis of plutonium and uranium for safeguards measurements

    International Nuclear Information System (INIS)

    Jackson, D.D.; Marsh, S.F.; Rein, J.E.; Waterbury, G.R.

    1975-01-01

    The status of a program to develop assay methods for plutonium and uranium for safeguards purposes is presented. The current effort is directed more toward analyses of scrap-type material with an end goal of precise automated methods that also will be applicable to product materials. A guiding philosophy for the analysis of scrap-type materials, characterized by heterogeneity and difficult dissolution, is relatively fast dissolution treatment to effect 90 percent or more solubilization of the uranium and plutonium, analysis of the soluble fraction by precise automated methods, and gamma-counting assay of any residue fraction using simple techniques. A Teflon-container metal-shell apparatus provides acid dissolutions of typical fuel cycle materials at temperatures to 275 0 C and pressures to 340 atm. Gas--solid reactions at elevated temperatures separate uranium from refractory materials by the formation of volatile uranium compounds. The condensed compounds then are dissolved in acid for subsequent analysis. An automated spectrophotometer is used for the determination of uranium and plutonium. The measurement range is 1 to 14 mg of either element with a relative standard deviation of 0.5 percent over most of the range. The throughput rate is 5 min per sample. A second-generation automated instrument is being developed for the determination of plutonium. A precise and specific electroanalytical method is used as its operational basis. (auth)

  9. Taiwan Automated Telescope Network

    Directory of Open Access Journals (Sweden)

    Dean-Yi Chou

    2010-01-01

Each telescope in the network can be operated either interactively or fully automatically. In the interactive mode, it can be controlled through the Internet. In the fully automatic mode, the telescope operates with preset parameters without any human care, including taking dark frames and flat frames. The network can also be used for studies that require continuous observations of selected objects.

  10. Superpixel-based and boundary-sensitive convolutional neural network for automated liver segmentation

    Science.gov (United States)

    Qin, Wenjian; Wu, Jia; Han, Fei; Yuan, Yixuan; Zhao, Wei; Ibragimov, Bulat; Gu, Jia; Xing, Lei

    2018-05-01

    Segmentation of liver in abdominal computed tomography (CT) is an important step for radiation therapy planning of hepatocellular carcinoma. Practically, a fully automatic segmentation of liver remains challenging because of low soft tissue contrast between liver and its surrounding organs, and its highly deformable shape. The purpose of this work is to develop a novel superpixel-based and boundary sensitive convolutional neural network (SBBS-CNN) pipeline for automated liver segmentation. The entire CT images were first partitioned into superpixel regions, where nearby pixels with similar CT number were aggregated. Secondly, we converted the conventional binary segmentation into a multinomial classification by labeling the superpixels into three classes: interior liver, liver boundary, and non-liver background. By doing this, the boundary region of the liver was explicitly identified and highlighted for the subsequent classification. Thirdly, we computed an entropy-based saliency map for each CT volume, and leveraged this map to guide the sampling of image patches over the superpixels. In this way, more patches were extracted from informative regions (e.g. the liver boundary with irregular changes) and fewer patches were extracted from homogeneous regions. Finally, deep CNN pipeline was built and trained to predict the probability map of the liver boundary. We tested the proposed algorithm in a cohort of 100 patients. With 10-fold cross validation, the SBBS-CNN achieved mean Dice similarity coefficients of 97.31  ±  0.36% and average symmetric surface distance of 1.77  ±  0.49 mm. Moreover, it showed superior performance in comparison with state-of-art methods, including U-Net, pixel-based CNN, active contour, level-sets and graph-cut algorithms. SBBS-CNN provides an accurate and effective tool for automated liver segmentation. It is also envisioned that the proposed framework is directly applicable in other medical image segmentation scenarios.
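
    The first pipeline stage, superpixel partitioning, can be sketched with scikit-image's SLIC implementation (assuming scikit-image ≥ 0.19); the synthetic image and the parameters below are stand-ins, not the paper's own data or settings.

```python
import numpy as np
from skimage.segmentation import slic

slice_2d = np.random.rand(128, 128)        # stand-in for a 2-D CT slice
labels = slic(slice_2d, n_segments=200, compactness=0.1,
              channel_axis=None)           # channel_axis=None: grayscale
print(labels.shape, labels.max() + 1)      # label map, ~200 superpixels
```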

  11. Google matrix analysis of directed networks

    Science.gov (United States)

    Ermann, Leonardo; Frahm, Klaus M.; Shepelyansky, Dima L.

    2015-10-01

    In the past decade modern societies have developed enormous communication and social networks. Their classification and information retrieval processing has become a formidable task for the society. Because of the rapid growth of the World Wide Web, and social and communication networks, new mathematical methods have been invented to characterize the properties of these networks in a more detailed and precise way. Various search engines extensively use such methods. It is highly important to develop new tools to classify and rank a massive amount of network information in a way that is adapted to internal network structures and characteristics. This review describes the Google matrix analysis of directed complex networks demonstrating its efficiency using various examples including the World Wide Web, Wikipedia, software architectures, world trade, social and citation networks, brain neural networks, DNA sequences, and Ulam networks. The analytical and numerical matrix methods used in this analysis originate from the fields of Markov chains, quantum chaos, and random matrix theory.
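
    A minimal construction of the Google matrix and a PageRank computation for a toy directed network, using only numpy and the standard damping factor α = 0.85 (the adjacency matrix is invented):

```python
import numpy as np

# Adjacency: A[i, j] = 1 if node j links to node i.
A = np.array([[0, 0, 1, 0],
              [1, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 1, 0, 0]], dtype=float)

S = A / A.sum(axis=0)            # column-stochastic link matrix
alpha, n = 0.85, len(A)
G = alpha * S + (1 - alpha) / n  # Google matrix

p = np.full(n, 1.0 / n)          # power iteration for the PageRank vector
for _ in range(100):
    p = G @ p
print(p / p.sum())
```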

  12. An automated image analysis system to measure and count organisms in laboratory microcosms.

    Directory of Open Access Journals (Sweden)

    François Mallard

Full Text Available 1. Because of recent technological improvements in the performance of computers and digital cameras, the potential use of imaging for contributing to the study of communities, populations or individuals in laboratory microcosms has risen enormously. However, its use has remained limited owing to difficulties in the automation of image analysis. 2. We present an accurate and flexible method of image analysis for detecting, counting and measuring moving particles on a fixed but heterogeneous substrate. This method has been specifically designed to follow individuals, or entire populations, in experimental laboratory microcosms, and it can be used in other applications. 3. The method consists in comparing multiple pictures of the same experimental microcosm in order to generate an image of the fixed background. This background is then used to extract, measure and count the moving organisms, leaving out the fixed background and the motionless or dead individuals. 4. We provide different examples (springtails, ants, nematodes, daphnia) to show that this non-intrusive method is efficient at detecting organisms under a wide variety of conditions, even on faintly contrasted and heterogeneous substrates. 5. The repeatability and reliability of this method have been assessed using experimental populations of the Collembola Folsomia candida. 6. We present an ImageJ plugin to automate the analysis of digital pictures of laboratory microcosms. The plugin automates the successive steps of the analysis and recursively analyses multiple sets of images, rapidly producing measurements from a large number of replicated microcosms.
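
    A hedged sketch of the core idea, assuming a stack of grayscale frames of the same microcosm: the per-pixel median across frames estimates the fixed background, and deviations from it in any single frame are the moving organisms. The threshold and the random stand-in data below are invented for illustration.

```python
import numpy as np
from scipy import ndimage

frames = np.random.rand(10, 64, 64)           # stand-in image stack
background = np.median(frames, axis=0)        # fixed-substrate estimate

moving = np.abs(frames[0] - background) > 0.3    # arbitrary threshold
labels, n_organisms = ndimage.label(moving)      # connected components
sizes = ndimage.sum(moving, labels, range(1, n_organisms + 1))
print(n_organisms, sizes[:5])                    # counts and pixel areas
```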

  13. The use of numerical technology for protection, control and automation - a concept description

    International Nuclear Information System (INIS)

    Gjerde, Oddbjoern; Langdal, Bjoern Inge; Kjoelle, Gerd H.; Aaboe, Yngve

    2005-06-01

In today's regulation of network companies there is a strong focus on supply reliability, and the demands on the utilization of existing networks are increasing. Protection, control and automation (PCA) have a central role with respect to both supply reliability and network utilization. Supply reliability also depends on the installed equipment and its condition, as does the use of the network, which largely depends on the operational margins. This implies that the network company's costs will depend on disruptions, maintenance and investments, and will be influenced by the chosen protection, control and automation solutions. In the report, the concept and the ideas for the use of information from numerical technology in connection with maintenance and the handling of supply reliability are described. In operation, the focus is on supply reliability and the handling of operational disruptions through 1) preventing faults to avoid disruptions and 2) reducing the consequences of operational disruptions. Examples show that costs in the central network could be reduced by about 60%. In a selection of regional and distribution networks the costs could be reduced by about 70%. The maintenance of the primary plant and the PCA equipment may be implemented based on system and risk considerations. A consequence analysis with respect to economy, availability in the system, the system conditions, the equipment condition and the consequences for customers is included. Examples show that the methodology works best when applied to networks with little redundancy and that the method is applicable in new construction or refinancing. It is then believed that the network (technical solutions) may be simplified and that the necessary security may be obtained through risk-based maintenance rather than by a high degree of redundancy.

  14. Network value and optimum analysis on the mode of networked marketing in TV media

    Directory of Open Access Journals (Sweden)

    Xiao Dongpo

    2012-12-01

Full Text Available Purpose: With the development of networked marketing in TV media, it is important to research network value and optimization in this field. Design/methodology/approach: According to research on the mode of networked marketing in TV media and correlation theory, the essence of media marketing is creating, spreading and transferring value. The participants in marketing value activities form a network, and value activities proceed in networked form. Network capability is important to TV media marketing activities. Findings: This article sets out a direction for research on the analysis and optimization of networks based on the mode of networked marketing in TV media, by studying the TV media marketing development mechanism, network analysis and network value structure.

  15. Automated mammographic breast density estimation using a fully convolutional network.

    Science.gov (United States)

    Lee, Juhun; Nishikawa, Robert M

    2018-03-01

    The purpose of this study was to develop a fully automated algorithm for mammographic breast density estimation using deep learning. Our algorithm used a fully convolutional network, which is a deep learning framework for image segmentation, to segment both the breast and the dense fibroglandular areas on mammographic images. Using the segmented breast and dense areas, our algorithm computed the breast percent density (PD), which is the faction of dense area in a breast. Our dataset included full-field digital screening mammograms of 604 women, which included 1208 mediolateral oblique (MLO) and 1208 craniocaudal (CC) views. We allocated 455, 58, and 91 of 604 women and their exams into training, testing, and validation datasets, respectively. We established ground truth for the breast and the dense fibroglandular areas via manual segmentation and segmentation using a simple thresholding based on BI-RADS density assessments by radiologists, respectively. Using the mammograms and ground truth, we fine-tuned a pretrained deep learning network to train the network to segment both the breast and the fibroglandular areas. Using the validation dataset, we evaluated the performance of the proposed algorithm against radiologists' BI-RADS density assessments. Specifically, we conducted a correlation analysis between a BI-RADS density assessment of a given breast and its corresponding PD estimate by the proposed algorithm. In addition, we evaluated our algorithm in terms of its ability to classify the BI-RADS density using PD estimates, and its ability to provide consistent PD estimates for the left and the right breast and the MLO and CC views of the same women. To show the effectiveness of our algorithm, we compared the performance of our algorithm against a state of the art algorithm, laboratory for individualized breast radiodensity assessment (LIBRA). The PD estimated by our algorithm correlated well with BI-RADS density ratings by radiologists. Pearson's rho values of
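
    The central quantity, percent density, is the fraction of dense area within the breast; a minimal computation on two hypothetical binary masks of the kind the segmentation network would produce:

```python
import numpy as np

breast_mask = np.zeros((100, 100), dtype=bool)
breast_mask[20:90, 10:80] = True               # stand-in breast region
dense_mask = np.zeros_like(breast_mask)
dense_mask[40:70, 30:60] = True                # stand-in dense tissue

percent_density = 100.0 * dense_mask.sum() / breast_mask.sum()
print(f"PD = {percent_density:.1f}%")          # fraction of dense area
```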

  16. Automating the conflict resolution process

    Science.gov (United States)

    Wike, Jeffrey S.

    1991-01-01

The purpose is to initiate a discussion of how the conflict resolution process at the Network Control Center can be made more efficient. This paper describes how resource conflicts are currently resolved, as well as the impacts of automating conflict resolution in the ATDRSS era. A variety of conflict resolution strategies are presented.

  17. Wireless Android Based Home Automation System

    OpenAIRE

    Muhammad Tanveer Riaz; Eman Manzoor Ahmed; Fariha Durrani; Muhammad Asim Mond

    2017-01-01

This manuscript presents a prototype and design implementation of an advanced home automation system that uses Wi-Fi technology as the network infrastructure connecting its parts. The proposed system consists of two main components. The first part is the server, which represents the system core that manages and controls the user's home; users and the system administrator can manage and control the system locally (via the Local Area Network) or remotely (via the Internet). The second part is the hardware interface module, which p...

  18. A machine learning approach to automated structural network analysis: application to neonatal encephalopathy.

    Directory of Open Access Journals (Sweden)

    Etay Ziv

    Full Text Available Neonatal encephalopathy represents a heterogeneous group of conditions associated with life-long developmental disabilities and neurological deficits. Clinical measures and current anatomic brain imaging remain inadequate predictors of outcome in children with neonatal encephalopathy. Some studies have suggested that brain development and, therefore, brain connectivity may be altered in the subgroup of patients who subsequently go on to develop clinically significant neurological abnormalities. Large-scale structural brain connectivity networks constructed using diffusion tractography have been posited to reflect organizational differences in white matter architecture at the mesoscale, and thus offer a unique tool for characterizing brain development in patients with neonatal encephalopathy. In this manuscript we use diffusion tractography to construct structural networks for a cohort of patients with neonatal encephalopathy. We systematically map these networks to a high-dimensional space and then apply standard machine learning algorithms to predict neurological outcome in the cohort. Using nested cross-validation we demonstrate high prediction accuracy that is both statistically significant and robust over a broad range of thresholds. Our algorithm offers a novel tool to evaluate neonates at risk for developing neurological deficit. The described approach can be applied to any brain pathology that affects structural connectivity.
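
    Nested cross-validation, as used above, wraps a hyperparameter-tuning loop inside an outer evaluation loop so that the reported accuracy is not biased by the tuning; a generic scikit-learn sketch on synthetic stand-in data (not the study's features or classifier):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=60, n_features=30, random_state=0)

inner = GridSearchCV(SVC(), {"C": [0.1, 1, 10]}, cv=3)   # tuning loop
outer_scores = cross_val_score(inner, X, y, cv=5)        # unbiased estimate
print(outer_scores.mean())
```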

  19. Comparative analysis of automation of production process with industrial robots in Asia/Australia and Europe

    Directory of Open Access Journals (Sweden)

    I. Karabegović

    2017-01-01

Full Text Available The term "INDUSTRY 4.0" or "fourth industrial revolution" was first introduced at the fair in 2011 in Hannover. It comes from the high-tech strategy of the German Federal Government that promotes automation-computerization up to complete smart automation, meaning the introduction of methods of self-automation, self-configuration, self-diagnosis and problem fixing, knowledge, and intelligent decision-making. No automation, including smart automation, can be imagined without industrial robots. Along with the fourth industrial revolution, a 'robotic revolution' is taking place in Japan. The robotic revolution refers to the development and research of robotic technology with the aim of using robots in all production processes, and of using robots in real life to be of service to humans in daily life. With these facts in mind, an analysis was conducted of the use of industrial robots in the production processes of the two continents of Europe and Asia/Australia, along with research into whether industry is ready for the introduction of intelligent automation with the goal of establishing future smart factories. The paper presents the automation of production processes in Europe and Asia/Australia, with predictions for the future.

  20. Automated Design and Analysis Tool for CEV Structural and TPS Components, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation of the proposed effort is a unique automated process for the analysis, design, and sizing of CEV structures and TPS. This developed process will...

  1. Automation and Robotics for Space-Based Systems, 1991

    Science.gov (United States)

    Williams, Robert L., II (Editor)

    1992-01-01

    The purpose of this in-house workshop was to assess the state-of-the-art of automation and robotics for space operations from an LaRC perspective and to identify areas of opportunity for future research. Over half of the presentations came from the Automation Technology Branch, covering telerobotic control, extravehicular activity (EVA) and intra-vehicular activity (IVA) robotics, hand controllers for teleoperation, sensors, neural networks, and automated structural assembly, all applied to space missions. Other talks covered the Remote Manipulator System (RMS) active damping augmentation, space crane work, modeling, simulation, and control of large, flexible space manipulators, and virtual passive controller designs for space robots.

  2. Communication Network Analysis Methods.

    Science.gov (United States)

    Farace, Richard V.; Mabee, Timothy

    This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…

  3. Sharing Feelings Online: Studying Emotional Well-Being via Automated Text Analysis of Facebook Posts

    Directory of Open Access Journals (Sweden)

    Michele eSettanni

    2015-07-01

Full Text Available Digital traces of activity on social network sites represent a vast source of ecological data with potential connections with individual behavioral and psychological characteristics. The present study investigates the relationship between user-generated textual content shared on Facebook and emotional well-being. Self-report measures of depression, anxiety and stress were collected from 201 adult Facebook users from North Italy. Emotion-related textual indicators, including emoticon use, were extracted from users' Facebook posts via automated text analysis. Correlation analyses revealed that individuals with higher levels of depression and anxiety expressed negative emotions on Facebook more frequently. In addition, use of emoticons expressing positive emotions correlated negatively with stress level. When comparing age groups, younger users reported higher frequency of both emotion-related words and emoticon use in their posts. Also, the relationship between online emotional expression and self-reported emotional well-being was generally stronger in the younger group. Overall, findings support the feasibility and validity of studying individual emotional well-being by means of examination of Facebook profiles. Implications for online screening purposes and future research directions are discussed.

  4. Sharing feelings online: studying emotional well-being via automated text analysis of Facebook posts.

    Science.gov (United States)

    Settanni, Michele; Marengo, Davide

    2015-01-01

Digital traces of activity on social network sites represent a vast source of ecological data with potential connections with individual behavioral and psychological characteristics. The present study investigates the relationship between user-generated textual content shared on Facebook and emotional well-being. Self-report measures of depression, anxiety, and stress were collected from 201 adult Facebook users from North Italy. Emotion-related textual indicators, including emoticon use, were extracted from users' Facebook posts via automated text analysis. Correlation analyses revealed that individuals with higher levels of depression and anxiety expressed negative emotions on Facebook more frequently. In addition, use of emoticons expressing positive emotions correlated negatively with stress level. When comparing age groups, younger users reported higher frequency of both emotion-related words and emoticon use in their posts. Also, the relationship between online emotional expression and self-reported emotional well-being was generally stronger in the younger group. Overall, findings support the feasibility and validity of studying individual emotional well-being by means of examination of Facebook profiles. Implications for online screening purposes and future research directions are discussed.
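
    A minimal sketch of the kind of indicator extraction described in these two records, assuming lists of post texts and matching self-report scores; the word list, emoticon pattern, and data are all invented for illustration.

```python
import re
from scipy.stats import pearsonr

NEGATIVE_WORDS = {"sad", "angry", "alone", "tired", "worried"}
POS_EMOTICONS = re.compile(r"[:;]-?[)D]")       # matches :) ;-) :D etc.

posts_per_user = [
    ["feeling sad and alone today", "so tired"],
    ["great day :)", "lunch with friends :D"],
    ["worried about tomorrow", "angry and tired"],
    ["new bike :)", "sunny morning ;-)"],
]
stress_scores = [28, 9, 31, 7]                  # hypothetical self-reports

# Per-user counts of negative words and positive emoticons.
neg_counts = [sum(w in NEGATIVE_WORDS for p in posts for w in p.split())
              for posts in posts_per_user]
pos_emoticon_counts = [sum(len(POS_EMOTICONS.findall(p)) for p in posts)
                       for posts in posts_per_user]

# With real data one would correlate the indicators with the scales:
r, pval = pearsonr(neg_counts, stress_scores)
print(neg_counts, pos_emoticon_counts, round(r, 2))
```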

  5. Automated identification of intergranular corrosion in X-ray CT images

    International Nuclear Information System (INIS)

    Howell, Patricia A.; Winfree, William P.

    2003-01-01

Characterization of a material or structure by computed tomography results in the acquisition of large quantities of data that must be tediously examined to determine the location and size of damage. Since the computed tomography images are digital, there is significant potential for reducing the human effort involved in this process by digitally processing the data to enhance the signatures of flaws and perform automated identification of suspected flaws. Techniques are presented that enhance the contrast between corroded and uncorroded regions to simplify the analysis and improve the quality of flaw identification. Algorithms developed in part for computer vision, such as anisotropic diffusion and edge detection techniques, are applied to the data. Anisotropic diffusion techniques are shown to significantly reduce image noise while maintaining the contrast between intergranular corrosion and uncorroded regions and preserving the important features of the flaw. Edge detection techniques are shown to enable rapid location of regions requiring further analysis. In regions identified by the edge detection technique, neural network techniques are applied to automate defect detection of the intergranular corrosion.
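
    Anisotropic diffusion of the Perona-Malik type, the class of technique named above, can be sketched in a few lines of numpy: it smooths noise while leaving strong edges (large gradients) largely untouched. The parameters and the periodic boundary handling via np.roll are simplifications for the example, not the paper's settings.

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=0.5, step=0.2):
    img = img.astype(float).copy()
    for _ in range(n_iter):
        # Finite-difference gradients toward the four neighbours.
        dn = np.roll(img, -1, axis=0) - img
        ds = np.roll(img, 1, axis=0) - img
        de = np.roll(img, -1, axis=1) - img
        dw = np.roll(img, 1, axis=1) - img
        # Edge-stopping conductance: small where gradients are large,
        # so edges diffuse slowly while flat noisy regions smooth out.
        img += step * sum(np.exp(-(d / kappa) ** 2) * d
                          for d in (dn, ds, de, dw))
    return img

noisy = np.random.rand(64, 64)
smoothed = anisotropic_diffusion(noisy)
print(noisy.std(), smoothed.std())   # diffusion lowers the standard deviation
```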

  6. Semi-automated volumetric analysis of lymph node metastases in patients with malignant melanoma stage III/IV-A feasibility study

    International Nuclear Information System (INIS)

    Fabel, M.; Tengg-Kobligk, H. von; Giesel, F.L.; Delorme, S.; Kauczor, H.-U.; Bornemann, L.; Dicken, V.; Kopp-Schneider, A.; Moser, C.

    2008-01-01

    Therapy monitoring in oncological patient care requires accurate and reliable imaging and post-processing methods. RECIST criteria are the current standard, with inherent disadvantages. The aim of this study was to investigate the feasibility of semi-automated volumetric analysis of lymph node metastases in patients with malignant melanoma compared to manual volumetric analysis and RECIST. Multislice CT was performed in 47 patients, covering the chest, abdomen and pelvis. In total, 227 suspicious, enlarged lymph nodes were evaluated retrospectively by two radiologists regarding diameters (RECIST), manually measured volume by placement of ROIs, and semi-automated volumetric analysis. Volume (ml), quality of segmentation (++/-) and time effort (s) were evaluated in the study. The semi-automated volumetric analysis software tool was rated acceptable to excellent in 81% of all cases (reader 1) and 79% (reader 2). Median time for the entire segmentation process and necessary corrections was shorter with the semi-automated software than with manual segmentation. Bland-Altman plots showed a significantly lower interobserver variability for semi-automated volumetric than for RECIST measurements. The study demonstrated the feasibility of volumetric analysis of lymph node metastases. The software allows fast and robust segmentation in up to 80% of all cases. Ease of use and the time needed are acceptable for application in clinical routine. Variability and interuser bias were reduced to about one-third of the values found for RECIST measurements. (orig.)
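
    The Bland-Altman comparison reduces to a bias and 95% limits of agreement over paired volume readings; a small sketch, assuming approximately normal differences:

        import numpy as np

        def bland_altman(vol_semi, vol_manual):
            """Bias and 95% limits of agreement between two readings,
            e.g. semi-automated vs. manual lymph node volumes (ml)."""
            diff = np.asarray(vol_semi, float) - np.asarray(vol_manual, float)
            bias = diff.mean()
            sd = diff.std(ddof=1)
            return bias, bias - 1.96 * sd, bias + 1.96 * sd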

  7. Procedure automation: the effect of automated procedure execution on situation awareness and human performance

    International Nuclear Information System (INIS)

    Andresen, Gisle; Svengren, Haakan; Heimdal, Jan O.; Nilsen, Svein; Hulsund, John-Einar; Bisio, Rossella; Debroise, Xavier

    2004-04-01

    As advised by the procedure workshop convened in Halden in 2000, the Halden Project conducted an experiment on the effect of automation of Computerised Procedure Systems (CPS) on situation awareness and human performance. The expected outcome of the study was to provide input for guidance on CPS design and to support the Halden Project's ongoing research on human reliability analysis. The experiment was performed in HAMMLAB using the HAMBO BWR simulator and the COPMA-III CPS. Eight crews of operators from Forsmark 3 and Oskarshamn 3 participated. Three research questions were investigated: 1) Does procedure automation create Out-Of-The-Loop (OOTL) performance problems? 2) Does procedure automation affect situation awareness? 3) Does procedure automation affect crew performance? The independent variable, 'procedure configuration', had four levels: paper procedures, manual CPS, automation with breaks, and full automation. The results showed that the operators experienced OOTL problems in full automation, but that situation awareness and crew performance (response time) were not affected. One possible explanation for this is that the operators monitored the automated procedure execution conscientiously, which may have prevented the OOTL problems from having negative effects on situation awareness and crew performance. In a debriefing session, the operators clearly expressed their dislike for the full automation condition, but noted that automation with breaks could be suitable for some tasks. The main reason why the operators did not like the full automation was that they did not feel in control. A qualitative analysis addressing factors contributing to response-time delays revealed that OOTL problems did not seem to cause delays, but that some delays could be explained by the operators having problems with the freeze function of the CPS. Other factors, such as teamwork and operator tendencies, were also of importance. Several design implications were drawn.

  8. An automated robotic platform for rapid profiling oligosaccharide analysis of monoclonal antibodies directly from cell culture.

    Science.gov (United States)

    Doherty, Margaret; Bones, Jonathan; McLoughlin, Niaobh; Telford, Jayne E; Harmon, Bryan; DeFelippis, Michael R; Rudd, Pauline M

    2013-11-01

    Oligosaccharides attached to Asn297 in each of the CH2 domains of monoclonal antibodies play an important role in antibody effector functions by modulating the affinity of interaction with Fc receptors displayed on cells of the innate immune system. Rapid, detailed, and quantitative N-glycan analysis is required at all stages of bioprocess development to ensure the safety and efficacy of the therapeutic. The high sample numbers generated during quality by design (QbD) and process analytical technology (PAT) create a demand for high-performance, high-throughput analytical technologies for comprehensive oligosaccharide analysis. We have developed an automated 96-well plate-based sample preparation platform for high-throughput N-glycan analysis using a liquid handling robotic system. Complete process automation includes monoclonal antibody (mAb) purification directly from bioreactor media, glycan release, fluorescent labeling, purification, and subsequent ultra-performance liquid chromatography (UPLC) analysis. The entire sample preparation and commencement of analysis is achieved within a 5-h timeframe. The automated sample preparation platform can easily be interfaced with other downstream analytical technologies, including mass spectrometry (MS) and capillary electrophoresis (CE), for rapid characterization of oligosaccharides present on therapeutic antibodies. Copyright © 2013 Elsevier Inc. All rights reserved.

  9. Correlation of the UV-induced mutational spectra and the DNA damage distribution of the human HPRT gene: Automating the analysis

    International Nuclear Information System (INIS)

    Kotturi, G.; Erfle, H.; Koop, B.F.; Boer, J.G. de; Glickman, B.W.

    1994-01-01

    Automated DNA sequencers can be readily adapted for various types of sequence-based nucleic acid analysis; more recently, the distribution of UV photoproducts in the E. coli lacI gene was determined using techniques developed for automated fluorescence-based analysis. We have been working to improve this automated approach to damage distribution analysis. Our current method is more rigorous: new software integrates the area under the individual peaks, rather than measuring the height of the curve, and we now employ an internal standard. The analysis can also be partially automated. Detection limits for both major types of UV photoproducts (cyclobutane dimers and pyrimidine (6-4) pyrimidone photoproducts) are reported. The UV-induced damage distribution in the hprt gene is compared to the mutational spectra in human and rodent cells.

  10. Automated quantitative analysis of in-situ NaI measured spectra in the marine environment using a wavelet-based smoothing technique

    International Nuclear Information System (INIS)

    Tsabaris, Christos; Prospathopoulos, Aristides

    2011-01-01

    An algorithm for automated analysis of in-situ NaI γ-ray spectra in the marine environment is presented. A standard wavelet denoising technique is implemented for obtaining a smoothed spectrum, while the stability of the energy spectrum is achieved by taking advantage of the permanent presence of two energy lines in the marine environment. The automated analysis provides peak detection, net area calculation, energy autocalibration, radionuclide identification and activity calculation. The results of the algorithm performance, presented for two different cases, show that analysis of short-term spectra with poor statistical information is considerably improved and that incorporation of further advancements could allow the use of the algorithm in early-warning marine radioactivity systems. - Highlights: → Algorithm for automated analysis of in-situ NaI γ-ray marine spectra. → Wavelet denoising technique provides smoothed spectra even at parts of the energy spectrum that exhibit strong statistical fluctuations. → Automated analysis provides peak detection, net area calculation, energy autocalibration, radionuclide identification and activity calculation. → Analysis of short-term spectra with poor statistical information is considerably improved.
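
    A sketch of the denoise-then-detect chain using PyWavelets and SciPy; the wavelet family, decomposition level, and universal soft-threshold rule are common defaults assumed here, not necessarily the paper's exact choices:

        import numpy as np
        import pywt                          # PyWavelets
        from scipy.signal import find_peaks

        def denoise_and_find_peaks(spectrum, wavelet="db4", level=4):
            """Wavelet-denoise a NaI spectrum, then locate candidate peaks."""
            coeffs = pywt.wavedec(spectrum, wavelet, level=level)
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745  # noise estimate
            thr = sigma * np.sqrt(2 * np.log(len(spectrum)))
            coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft")
                                    for c in coeffs[1:]]
            smooth = pywt.waverec(coeffs, wavelet)[:len(spectrum)]
            peaks, _ = find_peaks(smooth, prominence=smooth.max() * 0.02)
            return smooth, peaks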

  11. On the emergence of pervasive home automation

    DEFF Research Database (Denmark)

    Torbensen, Rune Sonnich

    2012-01-01

    bridges and expandable via USB modules so that new data-communication technologies can be connected to cover the whole residence. An ’IHAP ready’ flexible communication software framework that supports wireless end-device development is proposed. The idea is to bootstrap this new market with open source...... end-devices via the Open Device Service Description Language, which employs abstract service types to describe transformations to the simple end-device application protocols used in home automation. A security system called Trusted Domain grants access for remote control of home automation devices...... by smartphones, M2M applications, and service providers. These Internet nodes and home gateways become the trusted members of a home network capable of spanning several network segments. To include new members, both locally and remotely, a new user-friendly method for establishing trust between remote devices...

  12. Automated Inadvertent Intruder Application

    International Nuclear Information System (INIS)

    Koffman, Larry D.; Lee, Patricia L.; Cook, James R.; Wilhite, Elmer L.

    2008-01-01

    The Environmental Analysis and Performance Modeling group of Savannah River National Laboratory (SRNL) conducts performance assessments of the Savannah River Site (SRS) low-level waste facilities to meet the requirements of DOE Order 435.1. These performance assessments, which result in limits on the amounts of radiological substances that can be placed in the waste disposal facilities, consider numerous potential exposure pathways that could occur in the future. One set of exposure scenarios, known as inadvertent intruder analysis, considers the impact on hypothetical individuals who are assumed to inadvertently intrude onto the waste disposal site. Inadvertent intruder analysis considers three distinct scenarios for exposure referred to as the agriculture scenario, the resident scenario, and the post-drilling scenario. Each of these scenarios has specific exposure pathways that contribute to the overall dose for the scenario. For the inadvertent intruder analysis, the calculation of dose for the exposure pathways is a relatively straightforward algebraic calculation that utilizes dose conversion factors. Prior to 2004, these calculations were performed using an Excel spreadsheet. However, design checks of the spreadsheet calculations revealed that errors could be introduced inadvertently when copying spreadsheet formulas cell by cell and finding these errors was tedious and time consuming. This weakness led to the specification of functional requirements to create a software application that would automate the calculations for inadvertent intruder analysis using a controlled source of input parameters. This software application, named the Automated Inadvertent Intruder Application, has undergone rigorous testing of the internal calculations and meets software QA requirements. The Automated Inadvertent Intruder Application was intended to replace the previous spreadsheet analyses with an automated application that was verified to produce the same calculations and

  13. COalitions in COOperation Networks (COCOON): Social Network Analysis and Game Theory to Enhance Cooperation Networks

    NARCIS (Netherlands)

    Sie, Rory

    2012-01-01

    Sie, R. L. L. (2012). COalitions in COOperation Networks (COCOON): Social Network Analysis and Game Theory to Enhance Cooperation Networks (Unpublished doctoral dissertation). September, 28, 2012, Open Universiteit in the Netherlands (CELSTEC), Heerlen, The Netherlands.

  14. Analysis of complex networks using aggressive abstraction.

    Energy Technology Data Exchange (ETDEWEB)

    Colbaugh, Richard; Glass, Kristin.; Willard, Gerald

    2008-10-01

    This paper presents a new methodology for analyzing complex networks in which the network of interest is first abstracted to a much simpler (but equivalent) representation, the required analysis is performed using the abstraction, and analytic conclusions are then mapped back to the original network and interpreted there. We begin by identifying a broad and important class of complex networks which admit abstractions that are simultaneously dramatically simplifying and property preserving -- we call these aggressive abstractions -- and which can therefore be analyzed using the proposed approach. We then introduce and develop two forms of aggressive abstraction: 1.) finite state abstraction, in which dynamical networks with uncountable state spaces are modeled using finite state systems, and 2.) one-dimensional abstraction, whereby high-dimensional network dynamics are captured in a meaningful way using a single scalar variable. In each case, the property-preserving nature of the abstraction process is rigorously established and efficient algorithms are presented for computing the abstraction. The considerable potential of the proposed approach to complex network analysis is illustrated through case studies involving vulnerability analysis of technological networks and predictive analysis for social processes.

  15. Deep convolutional neural networks for detection of rail surface defects

    NARCIS (Netherlands)

    Faghih Roohi, S.; Hajizadeh, S.; Nunez Vicencio, Alfredo; Babuska, R.; De Schutter, B.H.K.; Estevez, Pablo A.; Angelov, Plamen P.; Del Moral Hernandez, Emilio

    2016-01-01

    In this paper, we propose a deep convolutional neural network solution to the analysis of image data for the detection of rail surface defects. The images are obtained from many hours of automated video recordings. This huge amount of data makes it impossible to manually inspect the images and

  16. Experience based ageing analysis of NPP protection automation in Finland

    International Nuclear Information System (INIS)

    Simola, K.

    2000-01-01

    This paper describes three successive studies on the ageing of protection automation in nuclear power plants. These studies were aimed at developing a methodology for experience-based ageing analysis, and at applying it to identify the components most critical from the ageing and safety points of view. The analyses also resulted in suggestions for improving data collection systems for the purpose of further ageing analyses. (author)

  17. An Analysis of Information Assurance Relating to the Department of Defense Radio Frequency Identification (RFID) Passive Network

    National Research Council Canada - National Science Library

    Giovannetti, Robert G

    2005-01-01

    .... Despite the many conveniences to automate and improve asset tracking this technology offers, consumer groups have obstinately opposed this adoption due to the perceived weaknesses in security and privacy of the network...

  18. Delineated Analysis of Robotic Process Automation Tools

    OpenAIRE

    Ruchi Isaac; Riya Muni; Kenali Desai

    2017-01-01

    In an age when celerity is expected of every sector, the speed of execution of various processes, and hence efficiency, becomes a prominent factor. To meet the speed demands of these diverse platforms, Robotic Process Automation (RPA) is used. Robotic Process Automation can expedite back-office tasks in commercial industries, remote management tasks in IT industries and conservation of resources in multiple sectors. To implement RPA, many software ...

  19. Network analysis for the visualization and analysis of qualitative data.

    Science.gov (United States)

    Pokorny, Jennifer J; Norman, Alex; Zanesco, Anthony P; Bauer-Wu, Susan; Sahdra, Baljinder K; Saron, Clifford D

    2018-03-01

    We present a novel manner in which to visualize the coding of qualitative data that enables representation and analysis of connections between codes using graph theory and network analysis. Network graphs are created from codes applied to a transcript or audio file using the code names and their chronological location. The resulting network is a representation of the coding data that characterizes the interrelations of codes. This approach enables quantification of qualitative codes using network analysis and facilitates examination of associations of network indices with other quantitative variables using common statistical procedures. Here, as a proof of concept, we applied this method to a set of interview transcripts that had been coded in 2 different ways and the resultant network graphs were examined. The creation of network graphs allows researchers an opportunity to view and share their qualitative data in an innovative way that may provide new insights and enhance transparency of the analytical process by which they reach their conclusions. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
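
    A sketch of the graph construction with NetworkX: codes in chronological order become nodes, consecutive codes are joined by weighted edges, and standard network indices follow; the code names are invented examples:

        import networkx as nx

        def code_network(codes_in_order):
            """Graph of qualitative codes; edge weight counts how often
            two codes were applied consecutively in the transcript."""
            G = nx.Graph()
            for a, b in zip(codes_in_order, codes_in_order[1:]):
                if G.has_edge(a, b):
                    G[a][b]["weight"] += 1
                else:
                    G.add_edge(a, b, weight=1)
            return G

        G = code_network(["coping", "stress", "family", "stress", "coping"])
        print(nx.degree_centrality(G))  # quantitative indices from the codes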

  20. Automated analysis in generic groups

    Science.gov (United States)

    Fagerholm, Edvard

    This thesis studies automated methods for analyzing hardness assumptions in generic group models, following ideas of symbolic cryptography. We define a broad class of generic and symbolic group models for different settings --- symmetric or asymmetric (leveled) k-linear groups --- and prove "computational soundness" theorems for the symbolic models. Based on this result, we formulate a master theorem that relates the hardness of an assumption to solving problems in polynomial algebra. We systematically analyze these problems identifying different classes of assumptions and obtain decidability and undecidability results. Then, we develop automated procedures for verifying the conditions of our master theorems, and thus the validity of hardness assumptions in generic group models. The concrete outcome is an automated tool, the Generic Group Analyzer, which takes as input the statement of an assumption, and outputs either a proof of its generic hardness or shows an algebraic attack against the assumption. Structure-preserving signatures are signature schemes defined over bilinear groups in which messages, public keys and signatures are group elements, and the verification algorithm consists of evaluating "pairing-product equations". Recent work on structure-preserving signatures studies optimality of these schemes in terms of the number of group elements needed in the verification key and the signature, and the number of pairing-product equations in the verification algorithm. While the size of keys and signatures is crucial for many applications, another aspect of performance is the time it takes to verify a signature. The most expensive operation during verification is the computation of pairings. However, the concrete number of pairings is not captured by the number of pairing-product equations considered in earlier work. We consider the question of what is the minimal number of pairing computations needed to verify structure-preserving signatures. We build an

  1. Distribution system analysis and automation

    CERN Document Server

    Gers, Juan

    2013-01-01

    A comprehensive guide to techniques that allow engineers to simulate, analyse and optimise power distribution systems which combined with automation, underpin the emerging concept of the "smart grid". This book is supported by theoretical concepts with real-world applications and MATLAB exercises.

  2. A tool for the security configuration of sensor networks

    International Nuclear Information System (INIS)

    Cionca, V; Newe, T; Dadarlat, V

    2009-01-01

    It is difficult to select a set of protocols that provides the appropriate level of security for a given application. Doing so requires in-depth analysis of the application and extensive knowledge of both security and sensor networks, which will generally not be available to non-expert users like network deployers or clients. We present a method to configure security using only parameters taken from the application space, and a tool that implements this method, thus automating the process of security configuration for non-expert users.

  3. A tool for the security configuration of sensor networks

    Energy Technology Data Exchange (ETDEWEB)

    Cionca, V; Newe, T [Electronic and Computer Engineering, University of Limerick (Ireland); Dadarlat, V, E-mail: Victor.Cionca@ul.i [Computer Science, Technical University of Cluj-Napoca (Romania)

    2009-07-01

    It is difficult to select a set of protocols that provides the appropriate level of security for a given application. Doing so requires in-depth analysis of the application and extensive knowledge of both security and sensor networks, which will generally not be available to non-expert users like network deployers or clients. We present a method to configure security using only parameters taken from the application space, and a tool that implements this method, thus automating the process of security configuration for non-expert users.

  4. Deep Learning Neural Networks and Bayesian Neural Networks in Data Analysis

    Directory of Open Access Journals (Sweden)

    Chernoded Andrey

    2017-01-01

    Full Text Available Most modern analyses in high energy physics use signal-versus-background classification techniques from machine learning, and neural networks in particular. The deep learning neural network is the most promising modern technique for separating signal and background, and nowadays it can be widely and successfully implemented as part of a physics analysis. In this article we compare the application of deep learning and Bayesian neural networks as classifiers in an instance of top quark analysis.

  5. Elemental misinterpretation in automated analysis of LIBS spectra.

    Science.gov (United States)

    Hübert, Waldemar; Ankerhold, Georg

    2011-07-01

    In this work, the Stark effect is shown to be mainly responsible for wrong elemental allocation by automated laser-induced breakdown spectroscopy (LIBS) software solutions. Due to broadening and shift of an elemental emission line affected by the Stark effect, its measured spectral position might interfere with the line positions of several other elements. The micro-plasma is generated by focusing a frequency-doubled 200 mJ pulsed Nd:YAG laser on an aluminum target and furthermore on a brass sample in air at atmospheric pressure. After laser pulse excitation, we have measured the temporal evolution of the Al(II) ion line at 281.6 nm (4s(1)S-3p(1)P) during the decay of the laser-induced plasma. Depending on laser pulse power, the center of the measured line is red-shifted by 130 pm (490 GHz) with respect to the exact line position. In this case, the well-known spectral line positions of two moderate and strong lines of other elements coincide with the actual shifted position of the Al(II) line. Consequently, a time-resolving software analysis can lead to an elemental misinterpretation. To avoid a wrong interpretation of LIBS spectra in automated analysis software for a given LIBS system, we recommend using larger gate delays, incorporating Stark broadening parameters, and using a range of tolerance which is non-symmetric around the measured line center. These suggestions may help to improve time-resolving LIBS software, promising a smaller probability of wrong elemental identification and making LIBS more attractive for industrial applications.
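
    The suggested non-symmetric tolerance window is easy to sketch; the tolerance values and candidate lines below are purely illustrative, not recommended settings:

        def match_line(measured_nm, candidates, blue_tol=0.03, red_tol=0.15):
            """Identify an emission line, allowing a larger tolerance on
            the red side because Stark-affected lines shift to the red."""
            return [label for label, ref in candidates.items()
                    if -blue_tol <= measured_nm - ref <= red_tol]

        lines = {"Al II 281.62": 281.62, "Mn I 281.70": 281.70}
        print(match_line(281.75, lines))  # both lines are candidates here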

  6. Network Analysis in Community Psychology: Looking Back, Looking Forward

    OpenAIRE

    Neal, Zachary P.; Neal, Jennifer Watling

    2017-01-01

    Highlights: • Network analysis is ideally suited for community psychology research because it focuses on context. • Use of network analysis in community psychology is growing. • Network analysis in community psychology has employed some potentially problematic practices. • Recommended practices are identified to improve network analysis in community psychology.

  7. International Conference Automation : Challenges in Automation, Robotics and Measurement Techniques

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2016-01-01

    This book presents the set of papers accepted for presentation at the International Conference Automation, held in Warsaw, 2-4 March of 2016. It presents the research results presented by top experts in the fields of industrial automation, control, robotics and measurement techniques. Each chapter presents a thorough analysis of a specific technical problem which is usually followed by numerical analysis, simulation, and description of results of implementation of the solution of a real world problem. The presented theoretical results, practical solutions and guidelines will be valuable for both researchers working in the area of engineering sciences and for practitioners solving industrial problems.

  8. Automated uncertainty analysis methods in the FRAP computer codes

    International Nuclear Information System (INIS)

    Peck, S.O.

    1980-01-01

    A user-oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in computed outputs as a function of known uncertainties in input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts.
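
    Generic Monte Carlo propagation illustrates the idea of estimating output uncertainty from known input uncertainties; a sketch with a toy stand-in model (the FRAP codes' actual uncertainty machinery may differ):

        import numpy as np

        rng = np.random.default_rng(0)

        def propagate(model, nominal, rel_sigma, n=1000):
            """Perturb inputs around nominal values, run the model,
            and collect output statistics."""
            outputs = []
            for _ in range(n):
                sample = {k: rng.normal(v, rel_sigma[k] * abs(v))
                          for k, v in nominal.items()}
                outputs.append(model(**sample))
            outputs = np.array(outputs)
            return outputs.mean(), outputs.std(ddof=1)

        # hypothetical fuel-rod response, for illustration only
        mean, sd = propagate(lambda power, gap: power / gap,
                             {"power": 20.0, "gap": 0.1},
                             {"power": 0.05, "gap": 0.10})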

  9. Functional Module Analysis for Gene Coexpression Networks with Network Integration.

    Science.gov (United States)

    Zhang, Shuqin; Zhao, Hongyu; Ng, Michael K

    2015-01-01

    Network has been a general tool for studying the complex interactions between different genes, proteins, and other small molecules. Module as a fundamental property of many biological networks has been widely studied and many computational methods have been proposed to identify the modules in an individual network. However, in many cases, a single network is insufficient for module analysis due to the noise in the data or the tuning of parameters when building the biological network. The availability of a large amount of biological networks makes network integration study possible. By integrating such networks, more informative modules for some specific disease can be derived from the networks constructed from different tissues, and consistent factors for different diseases can be inferred. In this paper, we have developed an effective method for module identification from multiple networks under different conditions. The problem is formulated as an optimization model, which combines the module identification in each individual network and alignment of the modules from different networks together. An approximation algorithm based on eigenvector computation is proposed. Our method outperforms the existing methods, especially when the underlying modules in multiple networks are different in simulation studies. We also applied our method to two groups of gene coexpression networks for humans, which include one for three different cancers, and one for three tissues from the morbidly obese patients. We identified 13 modules with three complete subgraphs, and 11 modules with two complete subgraphs, respectively. The modules were validated through Gene Ontology enrichment and KEGG pathway enrichment analysis. We also showed that the main functions of most modules for the corresponding disease have been addressed by other researchers, which may provide the theoretical basis for further studying the modules experimentally.

  10. Network performance analysis

    CERN Document Server

    Bonald, Thomas

    2013-01-01

    The book presents some key mathematical tools for the performance analysis of communication networks and computer systems. Communication networks and computer systems have become extremely complex. The statistical resource sharing induced by the random behavior of users and the underlying protocols and algorithms may affect Quality of Service. This book introduces the main results of queuing theory that are useful for analyzing the performance of these systems. These mathematical tools are key to the development of robust dimensioning rules and engineering methods. A number of examples i

  11. The International Trade Network: weighted network analysis and modelling

    International Nuclear Information System (INIS)

    Bhattacharya, K; Mukherjee, G; Manna, S S; Saramäki, J; Kaski, K

    2008-01-01

    Tools of the theory of critical phenomena, namely the scaling analysis and universality, are argued to be applicable to large complex web-like network structures. Using a detailed analysis of the real data of the International Trade Network we argue that the scaled link weight distribution has an approximate log-normal distribution which remains robust over a period of 53 years. Another universal feature is observed in the power-law growth of the trade strength with gross domestic product, the exponent being similar for all countries. Using the 'rich-club' coefficient measure of the weighted networks it has been shown that the size of the rich-club controlling half of the world's trade is actually shrinking. While the gravity law is known to describe well the social interactions in the static networks of population migration, international trade, etc, here for the first time we studied a non-conservative dynamical model based on the gravity law which excellently reproduced many empirical features of the ITN
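
    A sketch of the scaled-weight fit with SciPy: normalize link weights by their mean, fit a log-normal with the location fixed at zero, and check the fit; variable names are illustrative:

        import numpy as np
        from scipy import stats

        def fit_scaled_weights(link_weights):
            """Fit a log-normal to mean-scaled link weights, the form the
            record reports as stable across 53 years of ITN data."""
            w = np.asarray(link_weights, float)
            w = w[w > 0]
            w /= w.mean()
            shape, _, scale = stats.lognorm.fit(w, floc=0)
            ks = stats.kstest(w, "lognorm", args=(shape, 0, scale))
            return shape, scale, ks.pvalue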

  12. Design Automation in Synthetic Biology.

    Science.gov (United States)

    Appleton, Evan; Madsen, Curtis; Roehner, Nicholas; Densmore, Douglas

    2017-04-03

    Design automation refers to a category of software tools for designing systems that work together in a workflow for designing, building, testing, and analyzing systems with a target behavior. In synthetic biology, these tools are called bio-design automation (BDA) tools. In this review, we discuss the BDA tool areas (specify, design, build, test, and learn) and introduce the existing software tools designed to solve problems in these areas. We then detail the functionality of some of these tools and show how they can be used together to create the desired behavior of two types of modern synthetic genetic regulatory networks. Copyright © 2017 Cold Spring Harbor Laboratory Press; all rights reserved.

  13. Investigating changes in brain network properties in HIV-associated neurocognitive disease (HAND) using mutual connectivity analysis (MCA)

    Science.gov (United States)

    Abidin, Anas Zainul; D'Souza, Adora M.; Nagarajan, Mahesh B.; Wismüller, Axel

    2016-03-01

    About 50% of subjects infected with HIV present deficits in cognitive domains, which are known collectively as HIV associated neurocognitive disorder (HAND). The underlying synaptodendritic damage can be captured using resting state functional MRI, as has been demonstrated by a few earlier studies. Such damage may induce topological changes of brain connectivity networks. We test this hypothesis by capturing the functional interdependence of 90 brain network nodes using a Mutual Connectivity Analysis (MCA) framework with non-linear time series modeling based on Generalized Radial Basis Function (GRBF) neural networks. The network nodes are selected based on the regions defined in the Automated Anatomic Labeling (AAL) atlas. Each node is represented by the average time series of the voxels of that region. The resulting networks are then characterized using graph-theoretic measures that quantify various network topology properties at a global as well as at a local level. We tested for differences in these properties in network graphs obtained for 10 subjects (6 male and 4 female, 5 HIV+ and 5 HIV-). Global network properties captured some differences between these subject cohorts, though significant differences were seen only with the clustering coefficient measure. Local network properties, such as local efficiency and the degree of connections, captured significant differences in regions of the frontal lobe, precentral and cingulate cortex amongst a few others. These results suggest that our method can be used to effectively capture differences occurring in brain network connectivity properties revealed by resting-state functional MRI in neurological disease states, such as HAND.
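
    A sketch of the graph-theoretic step with NetworkX: threshold a region-by-region connectivity matrix (here 90 AAL nodes from any interdependence measure, MCA included) and compute the kinds of properties compared in the study; the threshold is an illustrative free parameter:

        import numpy as np
        import networkx as nx

        def network_properties(connectivity, threshold=0.3):
            """Binarize a node-by-node connectivity matrix and compute
            global and local graph measures."""
            A = np.asarray(connectivity)
            G = nx.from_numpy_array((A > threshold).astype(int))
            G.remove_edges_from(nx.selfloop_edges(G))
            return {
                "clustering": nx.average_clustering(G),
                "local_efficiency": nx.local_efficiency(G),
                "degree": dict(G.degree()),
            }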

  14. Research on Fault Prediction of Distribution Network Based on Large Data

    Directory of Open Access Journals (Sweden)

    Jinglong Zhou

    2017-01-01

    Full Text Available With the continuous development of information technology and the improvement of distribution automation, the amount of on-line monitoring and statistical data is increasing, and big data techniques are being applied to the distribution system. This paper describes the technology for collecting, analyzing, and processing data in the distribution system, and investigates artificial neural network mining algorithms combined with big data for fault diagnosis and prediction in the distribution network.

  15. A guide to the automation body of knowledge

    CERN Document Server

    2006-01-01

    Edited by Vernon L. Trevathan, with contributions from more than 35 leading experts from all aspects of automation, this book is a key resource for anyone who is studying for the ISA Certified Automation Professional® (CAP®), ISA Certified Control Systems Technician® (CCST®), and/or Control Systems Engineer (CSE) exams. The book defines the most important automation concepts and processes, while also describing the technical skills required to implement them in today's industrial environment. This edition provides comprehensive information about all major topics in the broad field of automation including: Process and analytical instrumentation ; Continuous and batch control ; Control valves and final control elements ; Basic discrete, sequencing, and manufacturing control ; Advanced control ; Digital and analog communications ; Data management and system software ; Networking and security ; Safety and reliability ; System checkout, testing, startup, and troubleshooting ; Project management. Whether you ar...

  16. 4th International Conference in Network Analysis

    CERN Document Server

    Koldanov, Petr; Pardalos, Panos

    2016-01-01

    The contributions in this volume cover a broad range of topics including maximum cliques, graph coloring, data mining, brain networks, Steiner forest, logistic and supply chain networks. Network algorithms and their applications to market graphs, manufacturing problems, internet networks and social networks are highlighted. The "Fourth International Conference in Network Analysis," held at the Higher School of Economics, Nizhny Novgorod in May 2014, initiated joint research between scientists, engineers and researchers from academia, industry and government; the major results of conference participants have been reviewed and collected in this Work. Researchers and students in mathematics, economics, statistics, computer science and engineering will find this collection a valuable resource filled with the latest research in network analysis.

  17. Complex Network Analysis of Guangzhou Metro

    OpenAIRE

    Yasir Tariq Mohmand; Fahad Mehmood; Fahd Amjad; Nedim Makarevic

    2015-01-01

    The structure and properties of public transportation networks can provide suggestions for urban planning and public policies. This study contributes a complex network analysis of the Guangzhou metro. The metro network has 236 kilometers of track and is the 6th busiest metro system of the world. In this paper topological properties of the network are explored. We observed that the network displays small world properties and is assortative in nature. The network possesses a high average degree...

  18. Automated Item Generation with Recurrent Neural Networks.

    Science.gov (United States)

    von Davier, Matthias

    2018-03-12

    Utilizing technology for automated item generation is not a new idea. However, test items used in commercial testing programs or in research are still predominantly written by humans, in most cases by content experts or professional item writers. Human experts are a limited resource and testing agencies incur high costs in the process of continuous renewal of item banks to sustain testing programs. Using algorithms instead holds the promise of providing unlimited resources for this crucial part of assessment development. The approach presented here deviates in several ways from previous attempts to solve this problem. In the past, automatic item generation relied either on generating clones of narrowly defined item types such as those found in language-free intelligence tests (e.g., Raven's Progressive Matrices) or on an extensive analysis of task components and derivation of schemata to produce items with pre-specified variability that are hoped to have predictable levels of difficulty. It is somewhat unlikely that researchers utilizing these previous approaches would look at the proposed approach with favor; however, recent applications of machine learning show success in solving tasks that seemed impossible for machines not too long ago. The proposed approach uses deep learning to implement probabilistic language models, not unlike what Google Brain and Amazon Alexa use for language processing and generation.

  19. An Intelligent Automation Platform for Rapid Bioprocess Design.

    Science.gov (United States)

    Wu, Tianyi; Zhou, Yuhong

    2014-08-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. © 2013 Society for Laboratory Automation and Screening.

  20. Pareto distance for multi-layer network analysis

    DEFF Research Database (Denmark)

    Magnani, Matteo; Rossi, Luca

    2013-01-01

    services, e.g., Facebook, Twitter, LinkedIn and Foursquare. As a result, the analysis of on-line social networks requires a wider scope and, more technically speaking, models for the representation of this fragmented scenario. The recent introduction of more realistic layered models has however determined......Social Network Analysis has been historically applied to single networks, e.g., interaction networks between co-workers. However, the advent of on-line social network sites has emphasized the stratified structure of our social experience. Individuals usually spread their identities over multiple...

  1. UMA/GAN network architecture analysis

    Science.gov (United States)

    Yang, Liang; Li, Wensheng; Deng, Chunjian; Lv, Yi

    2009-07-01

    This paper critically analyzes the architecture of UMA, one of the Fixed Mobile Convergence (FMC) solutions, which is also included by the Third Generation Partnership Project (3GPP). In the UMA/GAN network architecture, the UMA Network Controller (UNC) is the key equipment connecting the cellular core network with the mobile station (MS). The UMA network can be easily integrated into existing cellular networks without affecting the mobile core network, and provides high-quality mobile services with preferentially priced indoor voice and data usage, which helps to improve the subscriber's experience. On the other hand, the UMA/GAN architecture helps to integrate other radio techniques, including WiFi, Bluetooth, WiMax and so on, into the cellular network. This offers traditional mobile operators an opportunity to integrate WiMax into their cellular networks. At the end of this article, we also analyze the potential influence of the UMA network on cellular core networks.

  2. Home Network Security

    NARCIS (Netherlands)

    Scholten, Hans; van Dijk, Hylke

    2008-01-01

    Service discovery and secure and safe service usage are essential elements in the deployment of home and personal networks. Because no system administrator is present, setup and daily operation of such a network has to be automated as much as possible with a high degree of user friendliness. To

  3. Decision-making in irrigation networks: Selecting appropriate canal structures using multi-attribute decision analysis.

    Science.gov (United States)

    Hosseinzade, Zeinab; Pagsuyoin, Sheree A; Ponnambalam, Kumaraswamy; Monem, Mohammad J

    2017-12-01

    The stiff competition for water between agriculture and non-agricultural production sectors makes it necessary to have effective management of irrigation networks in farms. However, the process of selecting flow control structures in irrigation networks is highly complex and involves different levels of decision makers. In this paper, we apply multi-attribute decision making (MADM) methodology to develop a decision analysis (DA) framework for evaluating, ranking and selecting check and intake structures for irrigation canals. The DA framework consists of identifying relevant attributes for canal structures, developing a robust scoring system for alternatives, identifying a procedure for data quality control, and identifying a MADM model for the decision analysis. An application is illustrated through an analysis for automation purposes of the Qazvin irrigation network, one of the oldest and most complex irrigation networks in Iran. A survey questionnaire designed based on the decision framework was distributed to experts, managers, and operators of the Qazvin network and to experts from the Ministry of Power in Iran. Five check structures and four intake structures were evaluated. A decision matrix was generated from the average scores collected from the survey, and was subsequently solved using TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution) method. To identify the most critical structure attributes for the selection process, optimal attribute weights were calculated using Entropy method. For check structures, results show that the duckbill weir is the preferred structure while the pivot weir is the least preferred. Use of the duckbill weir can potentially address the problem with existing Amil gates where manual intervention is required to regulate water levels during periods of flow extremes. For intake structures, the Neyrpic® gate and constant head orifice are the most and least preferred alternatives, respectively. Some advantages
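
    TOPSIS ranking with Entropy-derived weights is standard enough to sketch directly; the decision matrix below is invented, standing in for the averaged survey scores:

        import numpy as np

        def entropy_weights(X):
            """Objective weights from the Shannon entropy of the
            column-normalized decision matrix (rows = alternatives)."""
            P = X / X.sum(axis=0)
            E = -(P * np.log(P + 1e-12)).sum(axis=0) / np.log(len(X))
            d = 1.0 - E
            return d / d.sum()

        def topsis(X, w, benefit):
            """Closeness of each alternative to the ideal solution;
            benefit[j] is True when higher is better for attribute j."""
            V = w * X / np.linalg.norm(X, axis=0)
            ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
            worst = np.where(benefit, V.min(axis=0), V.max(axis=0))
            d_pos = np.linalg.norm(V - ideal, axis=1)
            d_neg = np.linalg.norm(V - worst, axis=1)
            return d_neg / (d_pos + d_neg)  # higher = preferred

        X = np.array([[7.0, 3.0], [9.0, 6.0], [6.0, 2.0]])  # invented scores
        print(topsis(X, entropy_weights(X), benefit=np.array([True, False])))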

  4. Integrated Markov-neural reliability computation method: A case for multiple automated guided vehicle system

    International Nuclear Information System (INIS)

    Fazlollahtabar, Hamed; Saidi-Mehrabad, Mohammad; Balakrishnan, Jaydeep

    2015-01-01

    This paper proposes integrated Markovian and back-propagation neural network approaches to compute the reliability of a system. Since the states in which failures occur are significant elements for accurate reliability computation, a Markovian-based reliability assessment method is designed. Because the Markovian model shows drawbacks for steady-state reliability computation and the neural network requires an initial training pattern, an integrated approach, called Markov-neural, is developed and evaluated. To show the efficiency of the proposed approach, comparative analyses are performed. For managerial implications, an application case for multiple automated guided vehicles (AGVs) in manufacturing networks is also conducted. - Highlights: • Integrated Markovian and back-propagation neural network approach to compute reliability. • Markovian-based reliability assessment method. • Managerial implication is shown in an application case for multiple automated guided vehicles (AGVs) in manufacturing networks.
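
    For a discrete-time model, the Markovian half of the integration reduces to a stationary-distribution computation; a sketch with an invented three-state AGV chain:

        import numpy as np

        def steady_state(P):
            """Stationary distribution pi of a transition matrix P
            (rows sum to 1): solve pi P = pi with sum(pi) = 1."""
            n = P.shape[0]
            A = np.vstack([P.T - np.eye(n), np.ones(n)])
            b = np.concatenate([np.zeros(n), [1.0]])
            pi, *_ = np.linalg.lstsq(A, b, rcond=None)
            return pi

        # hypothetical AGV states: operating, degraded, failed
        P = np.array([[0.95, 0.04, 0.01],
                      [0.10, 0.85, 0.05],
                      [0.50, 0.00, 0.50]])
        pi = steady_state(P)
        availability = pi[0] + pi[1]  # long-run fraction in working states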

  5. Estimating and suppressing background in Raman spectra with an artificial neural network

    DEFF Research Database (Denmark)

    Sigurdsson, Sigurdur; Larsen, Jan; Philipsen, Peter Alshede

    2003-01-01

    In this report we address the problem of skin fluorescence in feature extraction from Raman spectra of skin lesions. We apply a highly automated neural network method for suppressing skin fluorescence from Raman spectra of skin lesions before dimension reduction with principal components analysis.

  6. GapCoder automates the use of indel characters in phylogenetic analysis.

    Science.gov (United States)

    Young, Nelson D; Healy, John

    2003-02-19

    Several ways of incorporating indels into phylogenetic analysis have been suggested. Simple indel coding has two strengths: (1) biological realism and (2) efficiency of analysis. In this method, each indel with different start and/or end positions is considered to be a separate character. The presence/absence of these indel characters is then added to the data set. We have written a program, GapCoder, to automate this procedure. The program can input PIR-format aligned datasets, find the indels and add the indel-based characters. The output is a NEXUS-format file, which includes a table showing which region each indel character is based on. If regions are excluded from analysis, this table makes it easy to identify the corresponding indel characters for exclusion. Manual implementation of the simple indel coding method can be very time-consuming, especially in data sets where indels are numerous and/or overlapping. GapCoder automates this method and is therefore particularly useful during procedures where phylogenetic analyses need to be repeated many times, such as when different alignments are being explored or when various taxon or character sets are being explored. GapCoder is currently available for Windows from http://www.home.duq.edu/~youngnd/GapCoder.
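
    Simple indel coding is short enough to sketch; this minimal version scores presence/absence only, whereas GapCoder additionally handles subsumed and overlapping indels and writes NEXUS output:

        def simple_indel_coding(alignment):
            """One presence/absence character per distinct (start, end)
            gap in a dict of aligned sequences ('-' marks gaps)."""
            def gaps(seq):
                found, start = set(), None
                for i, ch in enumerate(seq + "X"):   # sentinel ends runs
                    if ch == "-" and start is None:
                        start = i
                    elif ch != "-" and start is not None:
                        found.add((start, i))
                        start = None
                return found

            per_seq = {name: gaps(seq) for name, seq in alignment.items()}
            characters = sorted(set().union(*per_seq.values()))
            matrix = {name: ["1" if c in g else "0" for c in characters]
                      for name, g in per_seq.items()}
            return matrix, characters

        aln = {"A": "ACGT--GT", "B": "ACGTTTGT", "C": "AC----GT"}
        matrix, chars = simple_indel_coding(aln)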

  7. Approach to analysis of single nucleotide polymorphisms by automated constant denaturant capillary electrophoresis

    International Nuclear Information System (INIS)

    Bjoerheim, Jens; Abrahamsen, Torveig Weum; Kristensen, Annette Torgunrud; Gaudernack, Gustav; Ekstroem, Per O.

    2003-01-01

    Melting gel techniques have proven to be amenable and powerful tools in point mutation and single nucleotide polymorphism (SNP) analysis. With the introduction of commercially available capillary electrophoresis instruments, a partly automated platform for denaturant capillary electrophoresis with potential for routine screening of selected target sequences has been established. The aim of this article is to demonstrate the use of automated constant denaturant capillary electrophoresis (ACDCE) in single nucleotide polymorphism analysis of various target sequences. Optimal analysis conditions for different single nucleotide polymorphisms on ACDCE are evaluated with the Poland algorithm. Laboratory procedures include only PCR and electrophoresis. For direct genotyping of individual SNPs, the samples are analyzed with an internal standard and the alleles are identified by co-migration of sample and standard peaks. In conclusion, SNPs suitable for melting gel analysis based on theoretical thermodynamics were separated by ACDCE under appropriate conditions. With this instrumentation (ABI 310 Genetic Analyzer), 48 samples could be analyzed without any intervention. Several institutions have capillary instrumentation in-house, thus making this SNP analysis method accessible to large groups of researchers without any need for instrument modification

  8. Automated data acquisition technology development:Automated modeling and control development

    Science.gov (United States)

    Romine, Peter L.

    1995-01-01

    This report documents the completion of, and improvements made to, the software developed for automated data acquisition and automated modeling and control development on the Texas Micro rack-mounted PCs. This research was initiated because the Metal Processing Branch of NASA Marshall Space Flight Center identified a need for a mobile data acquisition and data analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument strictly for data acquisition and data analysis. In addition to the data acquisition functions described in this thesis, WMS also supports many functions associated with process control. The hardware and software requirements for an automated acquisition system for welding process parameters, welding equipment checkout, and welding process modeling were determined in 1992. From these recommendations, NASA purchased the necessary hardware and software. The new welding acquisition system is designed to collect welding parameter data and perform analysis to determine the voltage versus current arc-length relationship for VPPA welding. Once the results of this analysis are obtained, they can then be used to develop a RAIL function to control welding startup and shutdown without torch crashing.

  9. Improving automated multiple sclerosis lesion segmentation with a cascaded 3D convolutional neural network approach.

    Science.gov (United States)

    Valverde, Sergi; Cabezas, Mariano; Roura, Eloy; González-Villà, Sandra; Pareto, Deborah; Vilanova, Joan C; Ramió-Torrentà, Lluís; Rovira, Àlex; Oliver, Arnau; Lladó, Xavier

    2017-07-15

    In this paper, we present a novel automated method for White Matter (WM) lesion segmentation of Multiple Sclerosis (MS) patient images. Our approach is based on a cascade of two 3D patch-wise convolutional neural networks (CNN). The first network is trained to be more sensitive, revealing possible candidate lesion voxels, while the second network is trained to reduce the number of misclassified voxels coming from the first network. This cascaded CNN architecture tends to learn well from a small (n≤35) set of labeled data of the same MRI contrast, which can be very interesting in practice, given the difficulty of obtaining manual label annotations and the large amount of available unlabeled Magnetic Resonance Imaging (MRI) data. We evaluate the accuracy of the proposed method on the public MS lesion segmentation challenge MICCAI2008 dataset, comparing it with other state-of-the-art MS lesion segmentation tools. Furthermore, the proposed method is also evaluated on two private MS clinical datasets, where its performance is compared with recent publicly available state-of-the-art MS lesion segmentation methods. At the time of writing this paper, our method is the best-ranked approach on the MICCAI2008 challenge, outperforming the other 60 participant methods when using all the available input modalities (T1-w, T2-w and FLAIR), while still in the top rank (3rd position) when using only T1-w and FLAIR modalities. On clinical MS data, our approach exhibits a significant increase in the accuracy of WM lesion segmentation when compared with the rest of the evaluated methods, also correlating highly (r≥0.97) with the expected lesion volume. Copyright © 2017 Elsevier Inc. All rights reserved.
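
    A sketch of one stage of such a cascade in PyTorch; the patch size, channel counts, and layer sizes below are illustrative, not the paper's exact architecture:

        import torch
        import torch.nn as nn

        class PatchCNN(nn.Module):
            """Small 3D patch classifier; two instances form the cascade:
            the first screens candidate voxels, the second prunes the
            first stage's false positives."""
            def __init__(self, in_channels=3):        # e.g. T1, T2, FLAIR
                super().__init__()
                self.net = nn.Sequential(
                    nn.Conv3d(in_channels, 32, 3), nn.ReLU(),
                    nn.MaxPool3d(2),
                    nn.Conv3d(32, 64, 3), nn.ReLU(),
                    nn.Flatten(),
                    nn.Linear(64 * 2 * 2 * 2, 2),     # lesion / non-lesion
                )

            def forward(self, x):
                return self.net(x)

        stage1, stage2 = PatchCNN(), PatchCNN()
        logits = stage1(torch.randn(8, 3, 11, 11, 11))  # 11^3 voxel patches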

  10. Extending and automating a Systems-Theoretic hazard analysis for requirements generation and analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, John (Massachusetts Institute of Technology)

    2012-05-01

    Systems Theoretic Process Analysis (STPA) is a powerful new hazard analysis method designed to go beyond traditional safety techniques - such as Fault Tree Analysis (FTA) - that overlook important causes of accidents like flawed requirements, dysfunctional component interactions, and software errors. While proving to be very effective on real systems, no formal structure has been defined for STPA and its application has been ad-hoc with no rigorous procedures or model-based design tools. This report defines a formal mathematical structure underlying STPA and describes a procedure for systematically performing an STPA analysis based on that structure. A method for using the results of the hazard analysis to generate formal safety-critical, model-based system and software requirements is also presented. Techniques to automate both the analysis and the requirements generation are introduced, as well as a method to detect conflicts between the safety and other functional model-based requirements during early development of the system.

  11. Automated Slide Scanning and Segmentation in Fluorescently-labeled Tissues Using a Widefield High-content Analysis System.

    Science.gov (United States)

    Poon, Candice C; Ebacher, Vincent; Liu, Katherine; Yong, Voon Wee; Kelly, John James Patrick

    2018-05-03

    Automated slide scanning and segmentation of fluorescently-labeled tissues is the most efficient way to analyze whole slides or large tissue sections. Unfortunately, many researchers spend large amounts of time and resources developing and optimizing workflows that are only relevant to their own experiments. In this article, we describe a protocol that can be used by those with access to a widefield high-content analysis system (WHCAS) to image any slide-mounted tissue, with options for customization within pre-built modules found in the associated software. Although the WHCAS was not originally intended for slide scanning, the steps detailed in this article make it possible to acquire slide-scanning images in the WHCAS which can be imported into the associated software. In this example, the automated segmentation of brain tumor slides is demonstrated, but the automated segmentation of any fluorescently-labeled nuclear or cytoplasmic marker is possible. Furthermore, there are a variety of other quantitative software modules, including assays for protein localization/translocation, cellular proliferation/viability/apoptosis, and angiogenesis, that can be run. This technique will save researchers time and effort and create an automated protocol for slide analysis.
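
    A hedged stand-in for the system's pre-built segmentation modules, sketched with scikit-image: threshold a fluorescence channel, remove debris, and label objects; the minimum object size is an illustrative parameter:

        import numpy as np
        from skimage import filters, measure, morphology

        def segment_channel(channel, min_size=50):
            """Otsu threshold, small-object removal, and connected-
            component labelling of a fluorescently-labeled channel."""
            mask = channel > filters.threshold_otsu(channel)
            mask = morphology.remove_small_objects(mask, min_size=min_size)
            labels = measure.label(mask)
            areas = [r.area for r in measure.regionprops(labels)]
            return labels, areas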

  12. A unified plant information network

    International Nuclear Information System (INIS)

    Niederauer, G.F.

    1986-01-01

    Technology is bringing power plants fully into the age of computerization. Microcomputers, data base managers, networking, and friendly, expert software are the principal technology factors. Monitoring will improve, and the number and power of computers are increasing. The huge information flow will cause computers to be integrated into a communication network. The total plant operating triangle includes process, engineering, and management systems. The total network will integrate all of these into a Total Unified Plant Information Network (TUPIN). Software will extend the type of information available beyond monitored data. Analysis will improve through direct access to logical, physical, and procedural models by end users. Information management will improve through widespread use of hierarchical, relational, and expert data base managers. Expert systems will aid in diagnostics and interpretation. The goal is to automate plant operations to enhance safety and performance and to reduce cost by making both the plants and the personnel more expert.

  13. Historical Network Analysis of the Web

    DEFF Research Database (Denmark)

    Brügger, Niels

    2013-01-01

    This article discusses some of the fundamental methodological challenges related to doing historical network analyses of the web based on material in web archives. Since the late 1990s many countries have established extensive national web archives, and software-supported network analysis of the online web has for a number of years gained currency within Internet studies. However, the combination of these two phenomena—historical network analysis of material in web archives—can at best be characterized as an emerging new area of study. Most of the methodological challenges within this new area revolve around the specific nature of archived web material. On the basis of an introduction to the processes involved in web archiving, as well as of the characteristics of archived web material, the article outlines and scrutinizes some of the major challenges which may arise when doing network analysis of archived web material.

  14. Automated analysis of invadopodia dynamics in live cells

    Directory of Open Access Journals (Sweden)

    Matthew E. Berginski

    2014-07-01

    Multiple cell types form specialized protein complexes that are used by the cell to actively degrade the surrounding extracellular matrix. These structures are called podosomes or invadopodia and are collectively referred to as invadosomes. Due to their potential importance in both healthy physiology and in pathological conditions such as cancer, the characterization of these structures has been of increasing interest. Following early descriptions of invadopodia, assays were developed which labelled the matrix underneath metastatic cancer cells, allowing assessment of invadopodia activity in motile cells. However, characterization of invadopodia using these methods has traditionally been done manually with time-consuming and potentially biased quantification methods, limiting the number of experiments and the quantity of data that can be analysed. We have developed a system to automate the segmentation, tracking and quantification of invadopodia in time-lapse fluorescence image sets, at both the single-invadopodium level and the whole-cell level. We rigorously tested the ability of the method to detect changes in invadopodia formation and dynamics through the use of well-characterized small molecule inhibitors with known effects on invadopodia. Our results demonstrate the ability of this analysis method to quantify changes in invadopodia formation from live cell imaging data in a high throughput, automated manner.
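    The paper's pipeline itself is not reproduced here; as a simplified sketch of the frame-to-frame linking such a tracker must perform, the following greedy nearest-centroid matcher (a toy rule; production trackers also handle appearances, disappearances and merges) links segmented objects across two frames:

```python
# Toy nearest-centroid tracker linking segmented invadopodia across frames.
# 'prev' and 'curr' are lists of (x, y) centroids from a segmentation step;
# the matching tolerance max_dist is a free parameter.
import math

def link_frames(prev, curr, max_dist=10.0):
    """Greedily match each current centroid to the nearest unused previous one."""
    links = []
    used = set()
    for j, c in enumerate(curr):
        best, best_d = None, max_dist
        for i, p in enumerate(prev):
            d = math.dist(p, c)
            if d < best_d and i not in used:
                best, best_d = i, d
        if best is not None:
            used.add(best)
            links.append((best, j))
    return links  # pairs (index in previous frame, index in current frame)

print(link_frames([(0, 0), (5, 5)], [(1, 0), (20, 20)]))  # -> [(0, 0)]
```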

  15. Automated SEM Modal Analysis Applied to the Diogenites

    Science.gov (United States)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.

  16. Integrated workflows for spiking neuronal network simulations

    Directory of Open Access Journals (Sweden)

    Ján Antolík

    2013-12-01

    The increasing availability of computational resources is enabling more detailed, realistic modelling in computational neuroscience, resulting in a shift towards more heterogeneous models of neuronal circuits and the employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeller's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modellers either to handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualisation into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organised configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualisation stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modelling studies by relieving the user from manual handling of the flow of metadata between the individual workflow components.
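    Mozaik's own API is not reproduced here; as a rough sketch of the PyNN layer it builds on (assuming the NEST backend is installed), a small spiking network can be specified and run as follows:

```python
# Minimal PyNN sketch of the simulator layer Mozaik orchestrates.
# This is not Mozaik's API; Mozaik adds experiment/stimulus specification,
# data storage and automated analysis on top of runs like this one.
import pyNN.nest as sim

sim.setup(timestep=0.1)                        # ms
pop = sim.Population(100, sim.IF_cond_exp())   # 100 conductance-based LIF cells
noise = sim.Population(100, sim.SpikeSourcePoisson(rate=20.0))
sim.Projection(noise, pop, sim.OneToOneConnector(),
               synapse_type=sim.StaticSynapse(weight=0.01, delay=1.0))
pop.record("spikes")
sim.run(500.0)                                 # ms
data = pop.get_data().segments[0]              # Neo segment with spike trains
print("spike trains recorded:", len(data.spiketrains))
sim.end()
```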

  17. Sample tracking in an automated cytogenetic biodosimetry laboratory for radiation mass casualties

    International Nuclear Information System (INIS)

    Martin, P.R.; Berdychevski, R.E.; Subramanian, U.; Blakely, W.F.; Prasanna, P.G.S.

    2007-01-01

    Chromosome-aberration-based dicentric assay is expected to be used after mass-casualty life-threatening radiation exposures to assess radiation dose to individuals. This will require processing of a large number of samples for individual dose assessment and clinical triage to aid treatment decisions. We have established an automated, high-throughput, cytogenetic biodosimetry laboratory to process a large number of samples for conducting the dicentric assay using peripheral blood from exposed individuals according to internationally accepted laboratory protocols (i.e., within days following radiation exposures). The components of an automated cytogenetic biodosimetry laboratory include blood collection kits for sample shipment, a cell viability analyzer, a robotic liquid handler, an automated metaphase harvester, a metaphase spreader, high-throughput slide stainer and coverslipper, a high-throughput metaphase finder, multiple satellite chromosome-aberration analysis systems, and a computerized sample-tracking system. Laboratory automation using commercially available, off-the-shelf technologies, customized technology integration, and implementation of a laboratory information management system (LIMS) for cytogenetic analysis will significantly increase throughput. This paper focuses on our efforts to eliminate data-transcription errors, increase efficiency, and maintain samples' positive chain-of-custody by sample tracking during sample processing and data analysis. This sample-tracking system represents a 'beta' version, which can be modeled elsewhere in a cytogenetic biodosimetry laboratory, and includes a customized LIMS with a central server, personal computer workstations, barcode printers, fixed station and wireless hand-held devices to scan barcodes at various critical steps, and data transmission over a private intra-laboratory computer network. Our studies will improve diagnostic biodosimetry response, aid confirmation of clinical triage, and support medical management of radiation casualties.
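    The paper's LIMS is custom built; purely as a generic sketch of the barcode-driven chain-of-custody idea it describes (hypothetical schema and step names, Python standard library only):

```python
# Toy chain-of-custody log: every barcode scan at a processing step is
# appended as an event row. Schema and step names are hypothetical.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE custody (
    barcode TEXT, step TEXT, operator TEXT, scanned_at TEXT)""")

def scan(barcode: str, step: str, operator: str) -> None:
    conn.execute("INSERT INTO custody VALUES (?, ?, ?, ?)",
                 (barcode, step, operator,
                  datetime.now(timezone.utc).isoformat()))
    conn.commit()

scan("BD-000123", "blood_receipt", "tech_a")
scan("BD-000123", "metaphase_harvest", "tech_b")

# Reconstruct the sample's full processing history in scan order.
for row in conn.execute(
        "SELECT step, operator, scanned_at FROM custody "
        "WHERE barcode = ? ORDER BY scanned_at", ("BD-000123",)):
    print(row)
```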

  19. A Novel Secure IoT-Based Smart Home Automation System Using a Wireless Sensor Network

    Science.gov (United States)

    Pirbhulal, Sandeep; Zhang, Heye; E Alahi, Md Eshrat; Ghayvat, Hemant; Mukhopadhyay, Subhas Chandra; Zhang, Yuan-Ting; Wu, Wanqing

    2016-01-01

    Wireless sensor networks (WSNs) provide noteworthy benefits over traditional approaches for several applications, including smart homes, healthcare, environmental monitoring, and homeland security. WSNs are integrated with the Internet Protocol (IP) to develop the Internet of Things (IoT) for connecting everyday life objects to the internet. Hence, major challenges of WSNs include: (i) how to efficiently utilize small, low-power nodes to implement security during data transmission among several sensor nodes; (ii) how to resolve security issues associated with harsh and complex environmental conditions during data transmission over a long coverage range. In this study, a secure IoT-based smart home automation system was developed. To facilitate energy-efficient data encryption, a method named the Triangle Based Security Algorithm (TBSA), based on an efficient key generation mechanism, was proposed. The proposed TBSA, integrated with low-power Wi-Fi, was included in WSNs with the Internet to develop a novel IoT-based smart home which could provide secure data transmission among the associated sensor nodes in the network over a long coverage range. The developed IoT-based system showed outstanding performance, fulfilling all the necessary security requirements. The experimental results showed that the proposed TBSA algorithm consumed less energy in comparison with some existing methods. PMID:28042831
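    The abstract gives no algorithmic detail for TBSA, so no attempt is made to reproduce it; purely as a generic illustration of authenticated symmetric encryption of a sensor payload, using the third-party cryptography package (an assumption, not the paper's method):

```python
# Generic illustration of encrypting a sensor reading; NOT the TBSA
# algorithm from the paper, whose details are not given in the abstract.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)  # shared key; distribution out of scope
aesgcm = AESGCM(key)
nonce = os.urandom(12)                     # must be unique per message

payload = b"temp=21.5;hum=40"              # hypothetical sensor reading
ciphertext = aesgcm.encrypt(nonce, payload, None)
assert aesgcm.decrypt(nonce, ciphertext, None) == payload
```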

  2. Performance analysis of automated evaluation of Crithidia luciliae-based indirect immunofluorescence tests in a routine setting - strengths and weaknesses.

    Science.gov (United States)

    Hormann, Wymke; Hahn, Melanie; Gerlach, Stefan; Hochstrate, Nicola; Affeldt, Kai; Giesen, Joyce; Fechner, Kai; Damoiseaux, Jan G M C

    2017-11-27

    Antibodies directed against dsDNA are a highly specific diagnostic marker for the presence of systemic lupus erythematosus and of particular importance in its diagnosis. To assess anti-dsDNA antibodies, the Crithidia luciliae-based indirect immunofluorescence test (CLIFT) is considered one of the best choices. To overcome the drawback of subjective result interpretation inherent in indirect immunofluorescence assays in general, automated systems have been introduced into the market in recent years. Among these systems is the EUROPattern Suite, an advanced automated fluorescence microscope equipped with different software packages, capable of automated pattern interpretation and result suggestion for ANA, ANCA and CLIFT analysis. We analyzed the performance of the EUROPattern Suite with its automated fluorescence interpretation for CLIFT in a routine setting, reflecting the everyday life of a diagnostic laboratory. Three hundred and twelve consecutive samples were collected, sent to the Central Diagnostic Laboratory of the Maastricht University Medical Centre with a request for anti-dsDNA analysis, over a period of 7 months. Agreement between EUROPattern assay analysis and the visual read was 93.3%. Sensitivity and specificity were 94.1% and 93.2%, respectively. The EUROPattern Suite performed reliably and greatly supported result interpretation. Automated image acquisition is readily performed and automated image classification gives the operator a reliable recommendation for assay evaluation. The EUROPattern Suite optimizes workflow and contributes to standardization between different operators or laboratories.
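    For reference, the agreement figures quoted above follow from a standard confusion-matrix calculation; a minimal sketch (the counts are illustrative placeholders, not the study's raw data):

```python
# Sensitivity/specificity/agreement from a 2x2 confusion matrix, with the
# visual read as reference. Counts are illustrative placeholders only.
def performance(tp: int, fp: int, tn: int, fn: int):
    sensitivity = tp / (tp + fn)        # positives correctly flagged
    specificity = tn / (tn + fp)        # negatives correctly cleared
    agreement = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, agreement

sens, spec, agree = performance(tp=48, fp=17, tn=233, fn=3)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} agreement={agree:.1%}")
```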

  3. Automated electron microprobe

    International Nuclear Information System (INIS)

    Thompson, K.A.; Walker, L.R.

    1986-01-01

    The Plant Laboratory at the Oak Ridge Y-12 Plant has recently obtained a Cameca MBX electron microprobe with a Tracor Northern TN5500 automation system. This allows full stage and spectrometer automation and digital beam control. The capabilities of the system include qualitative and quantitative elemental microanalysis for all elements above and including boron in atomic number, high- and low-magnification imaging and processing, elemental mapping and enhancement, and particle size, shape, and composition analyses. Very low-magnification, quantitative elemental mapping using stage control (which is of particular interest) has been accomplished, along with automated size, shape, and composition analysis over a large relative area.

  4. Internet of things and automation of imaging: beyond representationalism

    Directory of Open Access Journals (Sweden)

    2016-09-01

    There is no doubt that the production of digital imagery calls for a major update of our theoretical apparatus: what until now was perceived solely or primarily as a stable representation of the world gives way to the image understood in terms of “the continuous actualization of networked data” or the “networked terminal.” In this article I argue that analysis of this new visual environment should not be limited to the procedures of data processing. What also invites serious investigation is the reliance of contemporary media ecology on wireless communication, which according to Adrian Mackenzie functions as “prepositions (‘at,’ ‘in,’ ‘with,’ ‘by,’ ‘between,’ ‘near,’ etc.) in the grammar of contemporary media.” This seems especially important in the case of the imagery accompanying some instances of the internet of things, where a considerable part of the networked imagery is produced in a fully automated, machinic way. One example is a crowdsourced air pollution monitoring platform consisting of networked sensors transmitting signals and data, which are then visualized as graphs and maps through the IoT service provider Xively.

  5. Qualitative analysis and control of complex neural networks with delays

    CERN Document Server

    Wang, Zhanshan; Zheng, Chengde

    2016-01-01

    This book focuses on the stability of dynamical neural systems, the synchronization of coupled neural systems and their applications in automation control and electrical engineering. Redefined concepts of stability, synchronization and consensus are adopted to provide a better explanation of complex neural networks. Researchers in the fields of dynamical systems, computer science, electrical engineering and mathematics will benefit from the discussions on complex systems. The book will also help readers to better understand the theory behind the control techniques and their design.

  6. Flory-Stockmayer analysis on reprocessable polymer networks

    Science.gov (United States)

    Li, Lingqiao; Chen, Xi; Jin, Kailong; Torkelson, John

    Reprocessable polymer networks can undergo structural rearrangement through dynamic chemistries under appropriate conditions, making them a promising candidate for recyclable crosslinked materials, e.g. tires. Research in this field has focused on various chemistries; however, it has lacked an essential physical theory explaining the relationship between the abundance of dynamic linkages and reprocessability. Based on the classical Flory-Stockmayer analysis of network gelation, we developed a similar analysis of reprocessable polymer networks to quantitatively predict the critical condition for reprocessability. Our theory indicates that it is unnecessary for all bonds to be dynamic to make the resulting network reprocessable. As long as there is no percolated permanent network in the system, the material can fully rearrange. To experimentally validate our theory, we used a thiol-epoxy network model system with various dynamic linkage compositions. The stress relaxation behavior of the resulting materials supports our theoretical prediction: only 50% of the linkages between crosslinks need to be dynamic for a tri-arm network to be reprocessable. This analysis therefore provides the first fundamental theoretical platform for designing and evaluating reprocessable polymer networks. We thank the McCormick Research Catalyst Award Fund and an ISEN cluster fellowship (L. L.) for funding support.
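    On a Flory-Stockmayer picture, a permanent network percolates when the permanent-bond fraction exceeds 1/(f - 1) for crosslink functionality f, which gives the 50% figure quoted above for f = 3; a minimal sketch of this criterion (a reading of the abstract, not the authors' code):

```python
# Flory-Stockmayer-style criterion: no percolated permanent network
# => fully rearrangeable. Permanent bond fraction p = 1 - d percolates
# when p * (f - 1) >= 1, so the critical dynamic fraction is 1 - 1/(f - 1).
def critical_dynamic_fraction(f: int) -> float:
    """Minimum fraction of dynamic linkages for reprocessability,
    for crosslinks of functionality f (f >= 3)."""
    return 1.0 - 1.0 / (f - 1)

def is_reprocessable(dynamic_fraction: float, f: int) -> bool:
    return dynamic_fraction >= critical_dynamic_fraction(f)

print(critical_dynamic_fraction(3))          # 0.5 -> matches the tri-arm result
print(is_reprocessable(0.4, 3), is_reprocessable(0.6, 3))
```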

  7. Systems integration (automation system). System integration (automation system)

    Energy Technology Data Exchange (ETDEWEB)

    Fujii, K; Komori, T; Fukuma, Y; Oikawa, M [Nippon Steel Corp., Tokyo (Japan)]

    1991-09-26

    This paper introduces the automation systems integration (SI) business started by the company in July 1988, and describes the SI concepts. The business activities include, with CIM (computer-integrated manufacturing) and AMENITY (living environment) as the mainstays, single-responsibility construction ranging from consultation on structuring optimal systems for the processing and assembly industries and for intelligent buildings, to system design, installation and after-sales services. Giving highest priority to SI from the user's standpoint, the business starts from planning and consultation under close coordination. It is based on the concept of structuring optimal systems using the company's extensive know-how and tools, adapted and applied across multi-vendor environments, open networks, and centralized and distributed systems; it is promoted with accumulated technologies capable of realizing artificial intelligence and neural networks, and is supported by highly valuable past business results. 10 figs., 1 tab.

  8. Applications of social media and social network analysis

    CERN Document Server

    Kazienko, Przemyslaw

    2015-01-01

    This collection of contributed chapters demonstrates a wide range of applications within two overlapping research domains: social media analysis and social network analysis. Various methodologies were utilized in the twelve individual chapters, including static, dynamic and real-time approaches to graph, textual and multimedia data analysis. The topics apply to reputation computation, emotion detection, topic evolution, rumor propagation, evaluation of textual opinions, friend ranking, analysis of public transportation networks, diffusion in dynamic networks, and analysis of contributors to communities.

  9. Automated general temperature correction method for dielectric soil moisture sensors

    Science.gov (United States)

    Kapilaratne, R. G. C. Jeewantinie; Lu, Minjiao

    2017-08-01

    An effective temperature correction method for dielectric sensors is important to ensure the accuracy of soil water content (SWC) measurements in local- to regional-scale soil moisture monitoring networks. These networks make extensive use of highly temperature-sensitive dielectric sensors due to their low cost, ease of use and low power consumption. Yet there is no general temperature correction method for dielectric sensors; instead, sensor- or site-dependent correction algorithms are employed. Such methods become ineffective in soil moisture monitoring networks with different sensor setups and in those that cover diverse climatic conditions and soil types. This study attempted to develop a general temperature correction method for dielectric sensors which can be used regardless of differences in sensor type, climatic conditions and soil type, and without rainfall data. In this work an automated general temperature correction method was developed by adapting temperature correction algorithms previously developed for time domain reflectometry (TDR) measurements to ThetaProbe ML2X, Stevens Hydra Probe II and Decagon Devices EC-TM sensor measurements. The procedure for removing rainy-day effects from SWC data was automated by incorporating a statistical inference technique into the temperature correction algorithms. The temperature correction method was evaluated using 34 stations from the International Soil Moisture Monitoring Network and another nine stations from a local soil moisture monitoring network in Mongolia. The soil moisture monitoring networks used in this study cover four major climates and six major soil types. Results indicated that the automated temperature correction algorithms developed in this study can successfully eliminate temperature effects from dielectric sensor measurements even without on-site rainfall data. Furthermore, it was found that the actual daily average SWC is altered by the temperature effects of dielectric sensors.
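    The paper's specific algorithms are not reproduced here; as a generic sketch of the underlying idea (estimate a linear temperature sensitivity from paired readings and subtract it, referenced to a chosen temperature), with all names and the 25 °C reference being assumptions:

```python
# Generic linear temperature-effect removal for a dielectric SWC sensor:
# fit SWC against sensor temperature over a window where true moisture is
# roughly constant, then report SWC referenced to t_ref. Illustrative only.
import numpy as np

def correct_swc(swc: np.ndarray, temp: np.ndarray, t_ref: float = 25.0):
    slope, intercept = np.polyfit(temp, swc, 1)   # d(SWC)/dT over the window
    return swc - slope * (temp - t_ref), slope

# Hypothetical diel cycle: constant true moisture, temperature artifact only.
temp = np.array([12.0, 18.0, 24.0, 30.0, 24.0, 18.0])
swc = 0.25 + 0.002 * (temp - 25.0)                # synthetic sensor output
corrected, slope = correct_swc(swc, temp)
print(np.round(corrected, 4), round(slope, 4))    # ~0.25 throughout, ~0.002
```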

  10. Automated video surveillance: teaching an old dog new tricks

    Science.gov (United States)

    McLeod, Alastair

    1993-12-01

    The automated video surveillance market is booming with new players, new systems, new hardware and software, and an extended range of applications. This paper reviews available technology and describes the features required for a good automated surveillance system. Both hardware and software are discussed. An overview of typical applications is also given. A shift towards PC-based hybrid systems, the use of parallel processing, neural networks, and the exploitation of modern telecomms are introduced, highlighting the evolution of modern video surveillance systems.

  11. Validation of network communicability metrics for the analysis of brain structural networks.

    Directory of Open Access Journals (Sweden)

    Jennifer Andreotti

    Computational network analysis provides new methods to analyze the brain's structural organization based on diffusion imaging tractography data. Networks are characterized by global and local metrics that have recently given promising insights into diagnosis and the further understanding of psychiatric and neurologic disorders. Most of these metrics are based on the idea that information in a network flows along the shortest paths. In contrast to this notion, communicability is a broader measure of connectivity which assumes that information could flow along all possible paths between two nodes. In our work, the features of network metrics related to communicability were explored for the first time in the healthy structural brain network. In addition, the sensitivity of such metrics was analysed using simulated lesions to specific nodes and network connections. Results showed advantages of communicability over conventional metrics in detecting densely connected nodes as well as subsets of nodes vulnerable to lesions. In addition, communicability centrality was shown to be widely affected by the lesions, and the changes were negatively correlated with the distance from the lesion site. In summary, our analysis suggests that communicability metrics may provide insight into the integrative properties of the structural brain network and that these metrics may be useful for the analysis of brain networks in the presence of lesions. Nevertheless, the interpretation of communicability is not straightforward; hence these metrics should be used as a supplement to the more standard connectivity network metrics.
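    Communicability is conventionally defined via the matrix exponential of the adjacency matrix, so that a walk of length k between two nodes contributes with weight 1/k!; a minimal sketch on a toy graph, including a crude node-lesion simulation:

```python
# Communicability G = exp(A): sums contributions from walks of every length,
# each length-k walk down-weighted by 1/k!, unlike shortest-path metrics.
import numpy as np
from scipy.linalg import expm

# Toy undirected 4-node network (adjacency matrix).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

G = expm(A)
print("communicability 0<->3:", round(G[0, 3], 3))

# Crude lesion simulation: remove node 2 and recompute.
keep = [0, 1, 3]
G_lesioned = expm(A[np.ix_(keep, keep)])
print("after lesioning node 2:", round(G_lesioned[0, 2], 3))  # 0<->3 drops to 0
```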

  12. Evaluation of damping estimates by automated Operational Modal Analysis for offshore wind turbine tower vibrations

    DEFF Research Database (Denmark)

    Bajrić, Anela; Høgsberg, Jan Becker; Rüdinger, Finn

    2018-01-01

    Reliable predictions of the lifetime of offshore wind turbine structures are influenced by the limited knowledge concerning the inherent level of damping during downtime. Error measures and an automated procedure for covariance-driven Operational Modal Analysis (OMA) techniques have been proposed. In order to obtain algorithm-independent answers, three identification techniques are compared: the Eigensystem Realization Algorithm (ERA), covariance-driven Stochastic Subspace Identification (COV-SSI) and Enhanced Frequency Domain Decomposition (EFDD). Discrepancies between automated identification techniques are discussed and illustrated with respect to signal noise, measurement time, vibration amplitudes and stationarity of the ambient response. The best bias-variance error trade-off of damping estimates is obtained by the COV-SSI. The proposed automated procedure is validated with real vibration measurements from an offshore wind turbine tower.

  13. Automated analysis of small animal PET studies through deformable registration to an atlas

    NARCIS (Netherlands)

    Gutierrez, Daniel F.; Zaidi, Habib

    This work aims to develop a methodology for automated atlas-guided analysis of small animal positron emission tomography (PET) data through deformable registration to an anatomical mouse model. A non-rigid registration technique is used to put relevant anatomical regions of the atlas and the acquired PET data into correspondence.

  14. PyPathway: Python Package for Biological Network Analysis and Visualization.

    Science.gov (United States)

    Xu, Yang; Luo, Xiao-Chun

    2018-05-01

    Life science studies represent one of the biggest generators of large data sets, mainly because of rapid advances in sequencing technology. Biological networks, including interaction networks and human-curated pathways, are essential to understanding these high-throughput data sets. Biological network analysis offers a method to explore systematically not only the molecular complexity of a particular disease but also the molecular relationships among apparently distinct phenotypes. Several packages for the Python community have been developed, such as BioPython and Goatools; however, tools to perform comprehensive network analysis and visualization are still needed. Here, we have developed PyPathway, an extensible, free and open source Python package for functional enrichment analysis, network modeling, and network visualization. The network process module supports various interaction network and pathway databases such as Reactome, WikiPathway, STRING, and BioGRID. The network analysis module implements overrepresentation analysis, gene set enrichment analysis, network-based enrichment, and de novo network modeling. Finally, the visualization and data publishing modules enable users to share their analyses via an easy web application. For package availability, see the first reference.
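    Overrepresentation analysis, one of the methods listed above, reduces to a hypergeometric tail test; a minimal generic sketch with SciPy (illustrative numbers, not PyPathway's own API):

```python
# Overrepresentation analysis (ORA) as a hypergeometric test: given N genes
# in the background, K of them in a pathway, and a study list of n genes of
# which k hit the pathway, how surprising is k? Generic sketch, not PyPathway.
from scipy.stats import hypergeom

def ora_pvalue(N: int, K: int, n: int, k: int) -> float:
    """P(X >= k) for X ~ Hypergeom(population N, successes K, draws n)."""
    return hypergeom.sf(k - 1, N, K, n)

# Hypothetical numbers: 20000 background genes, 150-gene pathway,
# 300-gene study list with 12 pathway hits.
print(f"p = {ora_pvalue(20000, 150, 300, 12):.2e}")
```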

  15. Automated PCB Inspection System

    Directory of Open Access Journals (Sweden)

    Syed Usama BUKHARI

    2017-05-01

    Development of an automated PCB inspection system meeting the needs of industry is a challenging task. In this paper a case study is presented to exhibit a proposed system for the migration from a manual PCB inspection process to an automated PCB inspection system, with minimal intervention in the existing production flow, for a leading automotive manufacturing company. A detailed design of the system, based on computer vision and followed by testing and analysis, is proposed in order to aid the manufacturer in the process of automation.
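    The abstract does not detail the proposed design; a common baseline for vision-based PCB inspection is comparison against a golden reference image, sketched here with OpenCV (file names, pre-alignment and thresholds are assumptions):

```python
# Baseline reference-comparison inspection: subtract a golden-board image
# from the test image and flag large difference regions as defect candidates.
# This is a generic technique, not necessarily the system proposed above.
import cv2

golden = cv2.imread("golden_board.png", cv2.IMREAD_GRAYSCALE)  # hypothetical files
test = cv2.imread("test_board.png", cv2.IMREAD_GRAYSCALE)      # assumed pre-aligned

diff = cv2.absdiff(golden, test)
_, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

defects = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 25]
print(f"{len(defects)} defect candidate(s):", defects)
```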

  16. An Intelligent Automation Platform for Rapid Bioprocess Design

    Science.gov (United States)

    Wu, Tianyi

    2014-01-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user’s inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. PMID:24088579

  17. User’s manual for the Automated Data Assurance and Management application developed for quality control of Everglades Depth Estimation Network water-level data

    Science.gov (United States)

    Petkewich, Matthew D.; Daamen, Ruby C.; Roehl, Edwin A.; Conrads, Paul

    2016-09-29

    The generation of Everglades Depth Estimation Network (EDEN) daily water-level and water-depth maps is dependent on high-quality real-time data from over 240 water-level stations. To increase the accuracy of the daily water-surface maps, the Automated Data Assurance and Management (ADAM) tool was created by the U.S. Geological Survey as part of Greater Everglades Priority Ecosystems Science. The ADAM tool is used to provide accurate quality-assurance review of the real-time data from the EDEN network and allows estimation or replacement of missing or erroneous data. This user's manual describes how to install and operate the ADAM software. The file structure and operation of the ADAM software are explained using examples.

  18. A Meta-Analysis of Factors Influencing the Development of Trust in Automation: Implications for Understanding Autonomy in Future Systems.

    Science.gov (United States)

    Schaefer, Kristin E; Chen, Jessie Y C; Szalma, James L; Hancock, P A

    2016-05-01

    We used meta-analysis to assess research concerning human trust in automation to understand the foundation upon which future autonomous systems can be built. Trust is increasingly important in the growing need for synergistic human-machine teaming. Thus, we expand on our previous meta-analytic foundation in the field of human-robot interaction to include all of automation interaction. We used meta-analysis to assess trust in automation. Thirty studies provided 164 pairwise effect sizes, and 16 studies provided 63 correlational effect sizes. The overall effect size of all factors on trust development was ḡ = +0.48, and the correlational effect was r̄ = +0.34, each of which represented medium effects. Moderator effects were observed for the human-related (ḡ = +0.49; r̄ = +0.16) and automation-related (ḡ = +0.53; r̄ = +0.41) factors. Moderator effects specific to environmental factors proved insufficient in number to calculate at this time. Findings provide a quantitative representation of factors influencing the development of trust in automation as well as identify additional areas of needed empirical research. This work has important implications for the enhancement of current and future human-automation interaction, especially in high-risk or extreme performance environments. © 2016, Human Factors and Ergonomics Society.

  19. The Nature and Variability of Automated Practice Alerts Derived from Electronic Health Records in a U.S. Nationwide Critical Care Research Network.

    Science.gov (United States)

    Benthin, Cody; Pannu, Sonal; Khan, Akram; Gong, Michelle

    2016-10-01

    The nature, variability, and extent of early warning clinical practice alerts derived from automated query of electronic health records (e-alerts) currently used in acute care settings for clinical care or research are unknown. To describe e-alerts in current use in acute care settings at medical centers participating in a nationwide critical care research network, we surveyed investigators at 38 institutions involved in the National Institutes of Health-funded Clinical Trials Network for the Prevention and Early Treatment of Acute Lung Injury (PETAL) for quantitative and qualitative analysis. Thirty sites completed the survey (79% response rate). All sites used electronic health record systems. Epic Systems was used at 56% of sites; the others used alternate commercially available vendors or homegrown systems. Respondents at 57% of sites represented in this survey used e-alerts. All but 1 of these 17 sites used an e-alert for early detection of sepsis-related syndromes, and 35% used an e-alert for pneumonia. E-alerts were triggered by abnormal laboratory values (37%), vital signs (37%), or radiology reports (15%) and were used about equally for clinical decision support and research. Only 59% of sites with e-alerts have evaluated them either for accuracy or for validity. A majority of the research network sites participating in this survey use e-alerts for early notification of potential threats to hospitalized patients; however, there was significant variability in the nature of e-alerts between institutions. Use of one common electronic health record vendor at more than half of the participating sites suggests that it may be possible to standardize e-alerts across multiple sites in research networks, particularly among sites using the same medical record platform.

  20. Scoring of radiation-induced micronuclei in cytokinesis-blocked human lymphocytes by automated image analysis

    International Nuclear Information System (INIS)

    Verhaegen, F.; Seuntjens, J.; Thierens, H.

    1994-01-01

    The micronucleus assay in human lymphocytes is, at present, frequently used to assess chromosomal damage caused by ionizing radiation or mutagens. Manual scoring of micronuclei (MN) by trained personnel is very time-consuming, tiring work, and the results depend on subjective interpretation of scoring criteria. More objective scoring can be accomplished only if the test can be automated. Furthermore, an automated system allows scoring of large numbers of cells, thereby increasing the statistical significance of the results. This is of special importance for screening programs for low doses of chromosome-damaging agents. In this paper, the first results of our effort to automate the micronucleus assay with an image-analysis system are presented. The method we used is described in detail, and the results are compared to those of other groups. Our system is able to detect 88% of the binucleated lymphocytes on the slides. The procedure consists of a fully automated localization of binucleated cells and counting of the MN within these cells, followed by a simple and fast manual operation in which the false positives are removed. Preliminary measurements for blood samples irradiated with a dose of 1 Gy X-rays indicate that the automated system can find 89% ± 12% of the micronuclei within the binucleated cells compared to a manual screening. 18 refs., 8 figs., 1 tab

  1. AUTOMATED PROCESS MONITORING: APPLYING PROVEN AUTOMATION TECHNIQUES TO INTERNATIONAL SAFEGUARDS NEEDS

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Durst, Philip C.; Grate, Jay W.; Devol, Timothy A.; Egorov, Oleg; Clements, John P.

    2008-01-01

    Identification and quantification of specific alpha- and beta-emitting radionuclides in complex liquid matrices is highly challenging, and is typically accomplished through laborious wet chemical sample preparation and separations followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, gas proportional counting, alpha energy analysis, mass spectrometry). Analytical results may take days or weeks to report. Chains of custody and sample security measures may also complicate or slow the analytical process. When an industrial process-scale plant requires the monitoring of specific radionuclides as an indication of the composition of its feed stream or of plant performance, radiochemical measurements must be fast, accurate, and reliable. Scientists at Pacific Northwest National Laboratory have assembled a fully automated prototype Process Monitor instrument capable of a variety of tasks: automated sampling directly from a feed stream, sample digestion/analyte redox adjustment, chemical separations, radiochemical detection and data analysis/reporting. The system is compact, its components are fluidically inter-linked, and analytical results could be immediately transmitted to on- or off-site locations. The development of a rapid radiochemical Process Monitor for 99Tc in Hanford tank waste processing streams, capable of performing several measurements per hour, is discussed in detail. More recently, the automated platform was modified to perform measurements of 90Sr in Hanford tank waste simulant. The system exemplifies how automation could be integrated into reprocessing facilities to support international nuclear safeguards needs.

  2. Models as Tools of Analysis of a Network Organisation

    Directory of Open Access Journals (Sweden)

    Wojciech Pająk

    2013-06-01

    The paper presents models which may be applied as tools for the analysis of a network organisation. The starting point of the discussion is a definition of the following terms: supply chain and network organisation. Further parts of the paper present the basic assumptions of the analysis of a network organisation. The study then characterises the best-known models utilised in the analysis of a network organisation. The purpose of the article is to define the notion and the essence of network organisations and to present the models used for their analysis.

  3. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    International Nuclear Information System (INIS)

    Hunt, M.A.; Klatt, L.N.; Thompson, D.H.

    1998-02-01

    The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objective of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine polychlorinated biphenyl (PCB) content in soil matrices. A demonstration and development testing of this system was conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs included a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and delivers the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, management of system resources, and the coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI, and information related to the sample and the system status is presented to the analyst via graphical icons.

  4. Identifying changes in the support networks of end-of-life carers using social network analysis.

    Science.gov (United States)

    Leonard, Rosemary; Horsfall, Debbie; Noonan, Kerrie

    2015-06-01

    End-of-life caring is often associated with reduced social networks for both the dying person and the carer. However, those adopting a community participation and development approach see the potential for the expansion and strengthening of networks. This paper uses Knox, Savage and Harvey's definitions of three generations of social network analysis to analyse the caring networks of people with a terminal illness who are being cared for at home, and identifies changes in these caring networks that occurred over the period of caring. Participatory network mapping of initial and current networks was used in nine focus groups. The analysis used key concepts from social network analysis (size, density, transitivity, betweenness and local clustering) together with qualitative analyses of the groups' reflections on the maps. The results showed an increase in the size of the networks and a strengthening of ties between the original members of the network. The qualitative data revealed the important distinction between core and peripheral network members and the diverse contributions of the network members. The research supports the value of third-generation social network analysis and the potential for end-of-life caring to build social capital. Published by the BMJ Publishing Group Limited.
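    The structural measures named above (size, density, transitivity, betweenness, local clustering) are all standard; a minimal sketch computing them for a toy caring network with NetworkX:

```python
# Standard whole-network and node-level SNA measures on a toy caring network.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("carer", "nurse"), ("carer", "neighbour"), ("carer", "sibling"),
    ("nurse", "gp"), ("neighbour", "sibling"),
])

print("size (nodes):", G.number_of_nodes())
print("density:", round(nx.density(G), 3))
print("transitivity:", round(nx.transitivity(G), 3))
print("betweenness:", {n: round(v, 2)
                       for n, v in nx.betweenness_centrality(G).items()})
print("local clustering of carer:", round(nx.clustering(G, "carer"), 3))
```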

  5. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  6. Request-Driven Schedule Automation for the Deep Space Network

    Science.gov (United States)

    Johnston, Mark D.; Tran, Daniel; Arroyo, Belinda; Call, Jared; Mercado, Marisol

    2010-01-01

    The DSN Scheduling Engine (DSE) has been developed to increase the level of automated scheduling support available to users of NASA's Deep Space Network (DSN). We have adopted a request-driven approach to DSN scheduling, in contrast to the activity-oriented approach used up to now. Scheduling requests allow users to declaratively specify patterns and conditions on their DSN service allocations, including timing, resource requirements, gaps, overlaps, time linkages among services, repetition, priorities, and a wide range of additional factors and preferences. The DSE incorporates a model of the key constraints and preferences of the DSN scheduling domain, along with algorithms to expand scheduling requests into valid resource allocations, to resolve schedule conflicts, and to repair unsatisfied requests. We use time-bounded systematic search with constraint relaxation to return nearby solutions if exact ones cannot be found, where the relaxation options and order are under user control. To explore the usability aspects of our approach we have developed a graphical user interface incorporating some crucial features to make it easier to work with complex scheduling requests. Among these are: progressive revelation of relevant detail, immediate propagation and visual feedback from a user's decisions, and a meeting-calendar metaphor for repeated patterns of requests. Even as a prototype, the DSE has been deployed and adopted as the initial step in building the operational DSN schedule, thus representing an important initial validation of our overall approach. The DSE is a core element of the DSN Service Scheduling Software (S³), a web-based collaborative scheduling system now under development for deployment to all DSN users.
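    The DSE itself is not public here; to illustrate the relaxation idea (return a nearby solution when the strict request cannot be met, relaxing in a user-chosen order), a toy single-antenna sketch:

```python
# Toy request scheduler with ordered constraint relaxation: try the strict
# request first, then progressively relaxed variants, returning the first
# variant that fits on a single, hypothetical antenna timeline.
def fits(slot, busy):
    return all(slot[1] <= s or slot[0] >= e for s, e in busy)

def schedule(request, busy, relaxations):
    """request: (start, end); relaxations: ordered list of (start, end) fallbacks."""
    for candidate in [request, *relaxations]:
        if fits(candidate, busy):
            return candidate
    return None  # unsatisfied; would be queued for repair

busy = [(2, 5), (7, 9)]
# Strict 3-hour track at t=4 conflicts; user allows shifting, then shortening.
print(schedule((4, 7), busy, relaxations=[(5, 8), (5, 7)]))  # -> (5, 7)
```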

  7. Network Analysis in Community Psychology: Looking Back, Looking Forward.

    Science.gov (United States)

    Neal, Zachary P; Neal, Jennifer Watling

    2017-09-01

    Network analysis holds promise for community psychology given the field's aim to understand the interplay between individuals and their social contexts. Indeed, because network analysis focuses explicitly on patterns of relationships between actors, its theories and methods are inherently extra-individual in nature and particularly well suited to characterizing social contexts. But to what extent has community psychology taken advantage of network analysis as a tool for capturing context? To answer this question, this study provides a review of the use of network analysis in articles published in the American Journal of Community Psychology. Looking back, we describe and summarize the ways that network analysis has been employed in community psychology research to understand the range of ways community psychologists have found the technique helpful. Looking forward, and paying particular attention to analytic issues identified in past applications, we provide some recommendations drawn from the network analysis literature to facilitate future applications of network analysis in community psychology. © 2017 The Authors. American Journal of Community Psychology published by Wiley Periodicals, Inc. on behalf of Society for Community Research and Action.

  8. Policy-based Network Management in Home Area Networks: Interim Test Results

    OpenAIRE

    Ibrahim Rana, Annie; Ó Foghlú, Mícheál

    2009-01-01

    This paper argues that Home Area Networks (HANs) are a good candidate for advanced network management automation techniques, such as Policy-Based Network Management (PBNM). What is proposed is a simple use of policy-based network management to introduce some level of Quality of Service (QoS) and security management in the HAN, whilst hiding this complexity from the home user. In this paper we present the interim test results of our research experiments (based on a scenario) using the HAN.

  9. Automated analysis of heterogeneous carbon nanostructures by high-resolution electron microscopy and on-line image processing

    International Nuclear Information System (INIS)

    Toth, P.; Farrer, J.K.; Palotas, A.B.; Lighty, J.S.; Eddings, E.G.

    2013-01-01

    High-resolution electron microscopy is an efficient tool for characterizing heterogeneous nanostructures; however, currently the analysis is a laborious and time-consuming manual process. In order to be able to accurately and robustly quantify heterostructures, one must obtain a statistically high number of micrographs showing images of the appropriate sub-structures. The second step of analysis is usually the application of digital image processing techniques in order to extract meaningful structural descriptors from the acquired images. In this paper it will be shown that by applying on-line image processing and basic machine vision algorithms, it is possible to fully automate the image acquisition step; therefore, the number of acquired images in a given time can be increased drastically without the need for additional human labor. The proposed automation technique works by computing fields of structural descriptors in situ and thus outputs sets of the desired structural descriptors in real-time. The merits of the method are demonstrated by using combustion-generated black carbon samples. - Highlights: ► The HRTEM analysis of heterogeneous nanostructures is a tedious manual process. ► Automatic HRTEM image acquisition and analysis can improve data quantity and quality. ► We propose a method based on on-line image analysis for the automation of HRTEM image acquisition. ► The proposed method is demonstrated using HRTEM images of soot particles

  10. Automation process for morphometric analysis of volumetric CT data from pulmonary vasculature in rats.

    Science.gov (United States)

    Shingrani, Rahul; Krenz, Gary; Molthen, Robert

    2010-01-01

    With advances in medical imaging scanners, it has become commonplace to generate large multidimensional datasets. These datasets require tools for a rapid, thorough analysis. To address this need, we have developed an automated algorithm for morphometric analysis incorporating A Visualization Workshop computational and image processing libraries for three-dimensional segmentation, vascular tree generation and structural hierarchical ordering with a two-stage numeric optimization procedure for estimating vessel diameters. We combine this new technique with our mathematical models of pulmonary vascular morphology to quantify structural and functional attributes of lung arterial trees. Our physiological studies require repeated measurements of vascular structure to determine differences in vessel biomechanical properties between animal models of pulmonary disease. Automation provides many advantages including significantly improved speed and minimized operator interaction and biasing. The results are validated by comparison with previously published rat pulmonary arterial micro-CT data analysis techniques, in which vessels were manually mapped and measured using intense operator intervention. Published by Elsevier Ireland Ltd.

  11. Industrial entrepreneurial network: Structural and functional analysis

    Science.gov (United States)

    Medvedeva, M. A.; Davletbaev, R. H.; Berg, D. B.; Nazarova, J. J.; Parusheva, S. S.

    2016-12-01

    The structure and functioning of two model industrial entrepreneurial networks are investigated in the present paper. One of these networks forms during the implementation of an integrated project and consists of eight agents which interact with each other and with the external environment. The other is drawn from a municipal economy and is based on a set of 12 real business entities. Analysis of the networks is carried out on the basis of a matrix of mutual payments aggregated over a certain time period; the matrix is created by the methods of experimental economics. Social Network Analysis (SNA) methods and instruments were used in the present research. A set of basic structural characteristics was investigated: quantitative parameters such as density, diameter, clustering coefficient, and different kinds of centrality. These were compared with random Bernoulli graphs of corresponding size and density. The observed differences between the random and entrepreneurial network structures are explained by the peculiarities of how agents function in a production network. Separately, the closed exchange circuits (cyclically closed contours of the graph) forming an autopoietic (self-replicating) network pattern were identified. The purpose of the functional analysis was to identify the contribution of the autopoietic network pattern to the network's gross product; this contribution was found to be more than 20%. Such a value supports the use of a complementary currency to stimulate the economic activity of network agents.
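    The closed exchange circuits described above are directed cycles in the payment graph; a minimal sketch finding them and the share of payment volume they carry, using NetworkX on a toy payment matrix (not the study's data):

```python
# Find closed exchange circuits (directed cycles) in a mutual-payment graph
# and the share of total payment volume carried on cycle edges. Toy data.
import networkx as nx

payments = {("A", "B"): 100, ("B", "C"): 80, ("C", "A"): 60, ("A", "D"): 40}
G = nx.DiGraph()
for (src, dst), amount in payments.items():
    G.add_edge(src, dst, weight=amount)

cycles = list(nx.simple_cycles(G))
cycle_edges = {(c[i], c[(i + 1) % len(c)]) for c in cycles for i in range(len(c))}
share = sum(payments[e] for e in cycle_edges) / sum(payments.values())

print("closed circuits:", cycles)            # [['A', 'B', 'C']] (rotation may vary)
print(f"share of turnover on circuits: {share:.0%}")
```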

  12. Social network analysis: Presenting an underused method for nursing research.

    Science.gov (United States)

    Parnell, James Michael; Robinson, Jennifer C

    2018-06-01

    This paper introduces social network analysis as a versatile method with many applications in nursing research. Social networks have been studied for years in many social science fields. The methods continue to advance but remain unknown to most nursing scholars. Discussion paper. English-language and interpreted literature was searched in Ovid Healthstar, CINAHL, PubMed Central, Scopus and hard-copy texts from 1965-2017. Social network analysis first emerged in the nursing literature in 1995 and has appeared only minimally through the present day. To convey the versatility and applicability of social network analysis in nursing, hypothetical scenarios are presented. The scenarios illustrate three approaches to social network analysis and include key elements of social network research design. The methods of social network analysis are underused in nursing research, primarily because they are unknown to most scholars. However, they offer methodological flexibility and epistemological versatility capable of supporting quantitative and qualitative research. The analytic techniques of social network analysis can add new insight into many areas of nursing inquiry, especially those influenced by cultural norms. Furthermore, visualization techniques associated with social network analysis can be used to generate new hypotheses. Social network analysis can potentially uncover findings not accessible through methods commonly used in nursing research. Social networks can be analysed based on individual-level attributes, whole networks and subgroups within networks. Computations derived from social network analysis may stand alone to answer a research question or be incorporated as variables into robust statistical models. © 2018 John Wiley & Sons Ltd.
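
    The three levels of analysis mentioned above map directly onto standard SNA computations. The following Python sketch, using a made-up nurse advice network, shows one metric at each level; the roster and ties are invented for illustration.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical advice network on one nursing unit: who asks whom for advice.
edges = [("Ann", "Beth"), ("Beth", "Cara"), ("Cara", "Ann"),
         ("Dev", "Ella"), ("Ella", "Beth"), ("Fay", "Ella"), ("Fay", "Dev")]
G = nx.Graph(edges)

# Individual level: a centrality score per nurse, usable as a model variable.
print("degree centrality:", nx.degree_centrality(G))

# Whole-network level: a single summary statistic for the unit.
print("density:", nx.density(G))

# Subgroup level: cohesive clusters within the unit.
print("communities:", [sorted(c) for c in greedy_modularity_communities(G)])
```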

  13. THE EFFECT OF AUTOMATED TELLER MACHINES ON BANKS ...

    African Journals Online (AJOL)

    reduces the number of humans deployed by banks, thereby reducing the cost of operations. ... United States (PLUS and CIRRUS) dropped their long-standing opposition to allowing ..... Automated teller machine network pricing – A review of the.

  14. US Department of Energy Automated Transportation Management System

    International Nuclear Information System (INIS)

    Portsmouth, J.H.

    1994-01-01

    The U.S. Department of Energy (DOE) Transportation Management Division (TMD) is responsible for managing its various programs via a diverse combination of Government-Owned/Contractor-Operated facilities. TMD is seeking to update its automation capabilities for capturing and processing DOE transportation information. TMD's Transportation Information Network (TIN) is an attempt to bring together transportation management, shipment tracking, research activities and software products in various stages of development. TMD's Automated Transportation Management System (ATMS) proposes to assist the DOE and its contractors in performing their daily transportation management activities and to assist the DOE Environmental Management Division in its waste management responsibilities throughout the DOE complex. The ATMS centers on the storage, handling and documentation involved in the environmental clean-up of DOE sites. Waste shipments will be moved to approved Treatment, Storage and Disposal (TSD) facilities and/or nuclear material repositories. Shipping samples to analytical laboratories additionally involves packaging and documentation according to all applicable U.S. Department of Transportation (DOT) and International Air Transport Association (IATA) regulations. The most immediate goal for effective management of DOE transportation functions during the 1990's is to increase the automation capabilities of the DOE and its contractors. Subject-matter experts from various DOE site locations will be brought together to develop and refine these capabilities through the maximum use of computer applications. A major part of this effort will be the identification of the most economical modes of transportation and enhanced management reporting capabilities for transportation analysis. The ATMS will also provide for increased strategic and shipment analysis during the 1990's and beyond in support of the DOE environmental mission.

  15. A flood-based information flow analysis and network minimization method for gene regulatory networks.

    Science.gov (United States)

    Pavlogiannis, Andreas; Mozhayskiy, Vadim; Tagkopoulos, Ilias

    2013-04-24

    Biological networks tend to have high interconnectivity, complex topologies and multiple types of interactions. This makes it difficult to identify sub-networks involved in condition-specific responses. In addition, we generally lack scalable methods that can reveal the information flow in gene regulatory and biochemical pathways; doing so would help us identify key participants and paths under specific environmental and cellular contexts. This paper introduces the theory of network flooding, which aims to address the problem of network minimization and regulatory information flow in gene regulatory networks. Given a regulatory biological network, a set of source (input) nodes and optionally a set of sink (output) nodes, the task is to find (a) the minimal sub-network that encodes the regulatory program involving all input and output nodes and (b) the information flow from the source to the sink nodes of the network. Here, we describe a novel, scalable network traversal algorithm and assess its potential to achieve significant network size reduction in both synthetic and E. coli networks. Scalability and sensitivity analysis show that the proposed method scales well with the size of the network and is robust to noise and missing data. The method of network flooding proves to be a useful, practical approach towards information flow analysis in gene regulatory networks. Further extension of the proposed theory has the potential to lead to a unifying framework for simultaneous network minimization and information flow analysis across various "omics" levels.
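
    Part (a) of the task, minimal sub-network extraction, can be approximated with plain reachability analysis: keep exactly the nodes that lie on some directed path from a source to a sink. The Python sketch below implements that reduction on a toy regulatory graph; it is a simplification for illustration, not the authors' flooding algorithm, and the gene names are invented.

```python
import networkx as nx

def minimal_subnetwork(G, sources, sinks):
    """Keep only nodes on some directed path from a source to a sink:
    the intersection of forward reachability from the sources and
    backward reachability from the sinks."""
    fwd = set(sources)
    for s in sources:
        fwd |= nx.descendants(G, s)
    bwd = set(sinks)
    for t in sinks:
        bwd |= nx.ancestors(G, t)
    return G.subgraph(fwd & bwd).copy()

# Toy regulatory network: geneA/geneB are inputs, geneF is the output.
G = nx.DiGraph([("geneA", "geneC"), ("geneB", "geneC"), ("geneC", "geneF"),
                ("geneD", "geneE"),              # irrelevant branch
                ("geneF", "geneG")])             # downstream of the sink
H = minimal_subnetwork(G, ["geneA", "geneB"], ["geneF"])
print(sorted(H.nodes))   # ['geneA', 'geneB', 'geneC', 'geneF']
```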

  16. Efficient Reactive Power Compensation Algorithm for Distribution Network

    Directory of Open Access Journals (Sweden)

    J. Jerome

    2017-12-01

    The use of automation and energy-efficient equipment with electronic control would greatly improve industrial production. These new devices are more sensitive to supply voltage deviations, and characteristics of the power system that were previously ignored are now very important. Hence the benefits of distribution automation have been widely acknowledged in recent years. This paper proposes an efficient load flow solution technique, extended to find optimal locations for reactive power compensation and network reconfiguration, for the planning and day-to-day operation of distribution networks. This is required as part of the distribution automation system (DAS) for taking various control and operation decisions. The method exploits the radial nature of the network and uses a forward and backward propagation technique to calculate branch currents and node voltages. The proposed method has been tested on several practical distribution networks of various voltage levels, including networks with high R/X ratios.
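
    The forward/backward propagation scheme for radial feeders is compact enough to sketch directly. The Python example below runs a backward sweep (accumulating branch currents from the leaves toward the substation) and a forward sweep (updating voltages outward) until convergence; the four-bus feeder, impedances and loads are invented per-unit values, not data from the paper.

```python
import numpy as np

# Four-bus radial feeder in per-unit; bus 0 is the substation source.
parent = [None, 0, 1, 1]                      # branches: 0-1, 1-2, 1-3
Z = [None, 0.02 + 0.04j, 0.03 + 0.05j, 0.025 + 0.045j]   # branch impedances
S = [0.0, 0.5 + 0.2j, 0.4 + 0.15j, 0.3 + 0.1j]           # bus loads (P + jQ)
n = len(parent)

V = np.ones(n, dtype=complex)                 # flat start at 1.0 pu
for _ in range(50):
    # Backward sweep: load currents, then accumulate toward the substation.
    I = np.array([np.conj(S[i] / V[i]) for i in range(n)])
    for i in range(n - 1, 0, -1):             # children are numbered after parents
        I[parent[i]] += I[i]                  # I[i] becomes the branch current into bus i
    # Forward sweep: update voltages outward from the substation.
    V_new = V.copy()
    V_new[0] = 1.0
    for i in range(1, n):
        V_new[i] = V_new[parent[i]] - Z[i] * I[i]
    delta = np.max(np.abs(V_new - V))
    V = V_new
    if delta < 1e-9:
        break

print("bus voltage magnitudes (pu):", np.round(np.abs(V), 4))
```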

  17. 3rd International Conference on Network Analysis

    CERN Document Server

    Kalyagin, Valery; Pardalos, Panos

    2014-01-01

    This volume compiles the major results of conference participants from the "Third International Conference in Network Analysis," held at the Higher School of Economics, Nizhny Novgorod, in May 2013, with the aim of initiating further joint research among different groups. The contributions in this book cover a broad range of topics relevant to the theory and practice of network analysis, including the reliability of complex networks, software, theory, methodology, and applications. Network analysis has become a major research topic over the last several years. The broad range of applications that can be described and analyzed by means of a network has brought together researchers and practitioners from numerous fields such as operations research, computer science, transportation, energy, biomedicine, computational neuroscience and the social sciences. In addition, new approaches and computer environments such as parallel computing, grid computing, cloud computing, and quantum computing have helped to solve large scale...

  18. NAPS: Network Analysis of Protein Structures

    Science.gov (United States)

    Chakrabarty, Broto; Parekh, Nita

    2016-01-01

    Traditionally, protein structures have been analysed in terms of secondary structure architecture and fold arrangement. An alternative approach that has shown promise is modelling proteins as networks of non-covalent interactions between amino acid residues. The network representation of proteins provides a systems approach to the topological analysis of complex three-dimensional structures, irrespective of secondary structure and fold type, and provides insights into structure-function relationships. We have developed a web server for network-based analysis of protein structures, NAPS, that facilitates quantitative and qualitative (visual) analysis of residue–residue interactions in single chains, protein complexes, modelled protein structures and trajectories (e.g. from molecular dynamics simulations). The user can specify the atom type for network construction, the distance range (in Å) and the minimal amino acid separation along the sequence. NAPS lets users select nodes and their neighbourhoods based on centrality measures, physicochemical properties of amino acids or clusters of well-connected residues (k-cliques) for further analysis. Visual analysis of interacting domains and protein chains, and shortest path lengths between pairs of residues, are additional features that aid functional analysis. NAPS supports various analyses and visualization views for identifying functional residues, and provides insight into mechanisms of protein folding, domain-domain and protein–protein interactions for understanding communication within and between proteins. URL:http://bioinf.iiit.ac.in/NAPS/. PMID:27151201
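
    The construction NAPS describes, contact edges between residues within a distance cutoff subject to a minimum sequence separation, is easy to prototype. The Python sketch below builds such a network from a fake C-alpha trace and runs two of the analyses mentioned above (centrality ranking and shortest paths); the coordinates, cutoff and separation are placeholder choices, not NAPS defaults.

```python
import numpy as np
import networkx as nx

# Fake C-alpha trace (a 3D random walk) stands in for real PDB coordinates.
rng = np.random.default_rng(2)
coords = np.cumsum(rng.normal(0, 2.5, (60, 3)), axis=0)

cutoff, min_seq_sep = 7.0, 3          # distance cutoff (Å), sequence separation
G = nx.Graph()
G.add_nodes_from(range(len(coords)))
for i in range(len(coords)):
    for j in range(i + min_seq_sep, len(coords)):
        if np.linalg.norm(coords[i] - coords[j]) < cutoff:
            G.add_edge(i, j)

# Centrality-based selection of candidate functional residues.
bc = nx.betweenness_centrality(G)
print("top residues by betweenness:", sorted(bc, key=bc.get, reverse=True)[:5])

# Shortest path between a residue pair, as in NAPS's path analysis.
last = len(coords) - 1
if nx.has_path(G, 0, last):
    print(f"path 0 -> {last}:", nx.shortest_path(G, 0, last))
```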

  19. NET-2 Network Analysis Program

    International Nuclear Information System (INIS)

    Malmberg, A.F.

    1974-01-01

    The NET-2 Network Analysis Program is a general purpose digital computer program which solves the nonlinear time domain response and the linearized small signal frequency domain response of an arbitrary network of interconnected components. NET-2 is capable of handling a variety of components and has been applied to problems in several engineering fields, including electronic circuit design and analysis, missile flight simulation, control systems, heat flow, fluid flow, mechanical systems, structural dynamics, digital logic, communications network design, solid state device physics, fluidic systems, and nuclear vulnerability due to blast, thermal, gamma radiation, neutron damage, and EMP effects. Network components may be selected from a repertoire of built-in models or they may be constructed by the user through appropriate combinations of mathematical, empirical, and topological functions. Higher-level components may be defined by subnetworks composed of any combination of user-defined components and built-in models. The program provides a modeling capability to represent and intermix system components on many levels, e.g., from hole and electron spatial charge distributions in solid state devices through discrete and integrated electronic components to functional system blocks. NET-2 is capable of simultaneous computation in both the time and frequency domain, and has statistical and optimization capability. Network topology may be controlled as a function of the network solution. (U.S.)
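
    As a minimal taste of what the frequency-domain side of such a solver does, the Python sketch below assembles the complex nodal admittance matrix of a small linear RC network and solves it over a frequency sweep. The circuit, component values and Norton source treatment are textbook illustrations chosen for this sketch, not NET-2 input.

```python
import numpy as np

# Two-node RC ladder: 1 V source -> R1 -> node 1 -> R2 -> node 2 -> C -> ground.
R1, R2, C = 1e3, 1e3, 1e-6

for f in (10.0, 100.0, 1000.0):
    w = 2 * np.pi * f
    # Nodal admittance matrix; the voltage source enters as a Norton
    # equivalent current injection at node 1.
    Y = np.array([[1/R1 + 1/R2, -1/R2],
                  [-1/R2,       1/R2 + 1j*w*C]], dtype=complex)
    Ibus = np.array([1.0/R1, 0.0], dtype=complex)
    V = np.linalg.solve(Y, Ibus)
    print(f"{f:7.1f} Hz   |V(node 2)| = {abs(V[1]):.3f} V")
```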

  20. Artificial neural networks for plasma spectroscopy analysis

    International Nuclear Information System (INIS)

    Morgan, W.L.; Larsen, J.T.; Goldstein, W.H.

    1992-01-01

    Artificial neural networks have been applied to a variety of signal processing and image recognition problems. Of the several common neural models, the feed-forward, back-propagation network is well suited for the analysis of scientific laboratory data, which can be viewed as a pattern recognition problem. The authors present a discussion of basic neural network concepts and illustrate their potential for the analysis of experiments by applying them to the spectra of laser-produced plasmas in order to obtain estimates of electron temperatures and densities. Although these are high-temperature, high-density plasmas, the neural network technique may be of interest in the analysis of the low-temperature, low-density plasmas characteristic of experiments and devices in gaseous electronics.
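
    A feed-forward, back-propagation regressor of the kind described is only a few dozen lines of numpy. The sketch below trains a one-hidden-layer network to recover two plasma parameters (stand-ins for Te and ne) from synthetic spectra; the toy spectral model, network size and training schedule are illustrative assumptions, not the authors' setup.

```python
import numpy as np

rng = np.random.default_rng(3)
wl = np.linspace(0.0, 1.0, 32)                 # 32-channel "spectrometer"

def toy_spectrum(te, ne):
    # Line-plus-continuum toy: position/width driven by "Te", amplitude by "ne".
    return ne * np.exp(-((wl - 0.3 - 0.4 * te) ** 2) / (0.01 + 0.02 * te)) + 0.1 * te

X = rng.random((500, 2))                       # (Te, ne) pairs scaled to [0, 1]
S = np.array([toy_spectrum(t, n) for t, n in X])
S += rng.normal(0, 0.01, S.shape)              # measurement noise

# One hidden tanh layer trained by plain full-batch gradient descent.
W1 = rng.normal(0, 0.5, (32, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 2));  b2 = np.zeros(2)
lr = 0.05
for _ in range(2000):
    H = np.tanh(S @ W1 + b1)                   # forward pass
    Y = H @ W2 + b2
    err = Y - X                                # back-propagate squared error
    dW2 = H.T @ err / len(S); db2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H ** 2)
    dW1 = S.T @ dH / len(S); db1 = dH.mean(0)
    W2 -= lr * dW2; b2 -= lr * db2; W1 -= lr * dW1; b1 -= lr * db1

test = toy_spectrum(0.6, 0.8)
pred = np.tanh(test @ W1 + b1) @ W2 + b2
print("true (Te, ne) = (0.60, 0.80), predicted =", np.round(pred, 2))
```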